WTF ~ ?

Today, Smon asked why the ~ operator isn’t defined for the byte type. The bitshift operators (>> and <<) have exactly the same problem. What’s the official rationale? That it doesn’t make sense: the processor is faster doing these operations on 32-bit types than on 8-bit ones. Put differently, the processor would convert these values to 32-bit anyway.
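To see the promotion in action: the post is about the CLR, but Java has the same rule, so here is a minimal Java sketch. Applying ~ to a byte first widens the value to a 32-bit int, so the result can only go back into a byte with an explicit cast:

```java
public class ByteComplement {
    public static void main(String[] args) {
        byte b = 0b0000_1010; // 10
        // ~b promotes b to a 32-bit int before complementing,
        // so the result is an int, not a byte:
        // byte bad = ~b;    // compile error: possible lossy conversion
        byte ok = (byte) ~b; // cast the 32-bit result back down to 8 bits
        System.out.println(ok); // -11 (bitwise complement of 10)
    }
}
```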

The only way to define these at the language level is either to let the compiler analyze the code and insert the up- and down-casts itself, or to let the JIT treat it as a special case. My guess is that hiding the slowness of byte operations behind implicit casts would prevent the developer from realizing that what he’s doing has performance implications. A warning at the compiler level could resolve this in a clean way. What do you think? Should we always be honest with the developer?
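What those implicit casts would hide can be written out by hand. Again a Java sketch under the same-promotion-rules assumption: a shift on a byte is really a widen–shift–narrow sequence, which is exactly the cost a compiler warning would keep visible:

```java
public class ByteShift {
    public static void main(String[] args) {
        byte b = (byte) 0b1000_0000; // -128 as a signed byte
        // b >> 1 first sign-extends b to a 32-bit int (0xFFFFFF80),
        // shifts the 32-bit value, and only the final cast narrows
        // the result back to 8 bits.
        byte shifted = (byte) (b >> 1);
        System.out.println(shifted); // -64
    }
}
```

Forcing the developer to write the cast (or warning when the compiler inserts one) makes the widen–shift–narrow sequence explicit rather than silently paid for.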