Why does the following code not generate an error?
System.out.println((char) 2147483647);
According to the Oracle documentation on primitive data types, the maximum value for a `char` is 65,535.
- char: The char data type is a single 16-bit Unicode character. It has a minimum value of '\u0000' (or 0) and a maximum value of '\uffff' (or 65,535 inclusive).
`2147483647` is not a `char` but an `int`. You're not assigning an invalid value to a `char`; you're casting a valid `int` to `char`, so the Narrowing Primitive Conversion rules apply. See the Java Language Specification, §5.1.3. In short, you keep the lowest 16 bits of the original integer: "A narrowing conversion of a signed integer to an integral type T simply discards all but the n lowest order bits, where n is the number of bits used to represent type T."
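Here is a minimal sketch of that rule in action (the class and variable names are just for illustration): the cast keeps only the lowest 16 bits of the `int`, so `Integer.MAX_VALUE` becomes `'\uffff'` (65,535).

```java
public class NarrowingDemo {
    public static void main(String[] args) {
        int value = 2147483647;        // Integer.MAX_VALUE, 0x7FFFFFFF

        // Narrowing int -> char (JLS §5.1.3): discard all but the lowest 16 bits.
        char narrowed = (char) value;  // 0x7FFFFFFF & 0xFFFF == 0xFFFF

        System.out.println((int) narrowed);                 // 65535
        System.out.println(Integer.toHexString(narrowed));  // ffff
        System.out.println(narrowed == '\uffff');           // true
    }
}
```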
Because it's not an error; it's well-defined behavior.