All primitive types (boolean, char, short, int, ...) are really just arrays of bits in memory. The type of a variable determines only the range of values that variable can hold:
- boolean: 1 bit [range 0 to 1]
- char: 16 bits [range 0 to 2^16 - 1, i.e. \u0000 to \uFFFF]
- byte: 8 bits [range -128 to 127]
- short: 16 bits [range -32768 to 32767]
- int: 32 bits [range -2147483648 to 2147483647]
- ...
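For instance, a minimal sketch (using the standard wrapper-class constants) that prints some of these ranges:

```java
public class Ranges {
    public static void main(String[] args) {
        // The wrapper classes expose the range limits of each primitive type.
        System.out.println("byte:  " + Byte.MIN_VALUE  + " to " + Byte.MAX_VALUE);   // -128 to 127
        System.out.println("short: " + Short.MIN_VALUE + " to " + Short.MAX_VALUE);  // -32768 to 32767
        System.out.println("int:   " + Integer.MIN_VALUE + " to " + Integer.MAX_VALUE);
        // char is an unsigned 16-bit type; its limits are characters, so cast to int to see the numbers.
        System.out.println("char:  " + (int) Character.MIN_VALUE + " to " + (int) Character.MAX_VALUE); // 0 to 65535
    }
}
```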
A char is stored as bits too; whether you write the value as a hexadecimal, decimal, or octal number does not matter. That is why you can assign a number to a char: the char then holds the character whose Unicode code point is that number.
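A small sketch illustrating this (the variable names are just for illustration):

```java
public class CharDemo {
    public static void main(String[] args) {
        char fromHex = 0x41;        // hexadecimal literal
        char fromDec = 65;          // decimal literal
        char fromEscape = '\u0041'; // Unicode escape
        // All three hold the same 16 bits: the code point of 'A'.
        System.out.println(fromHex + " " + fromDec + " " + fromEscape); // prints: A A A
    }
}
```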
darijan Jun 18 '13 at 10:26