I'm using OpenJDK 17 and am perplexed as to why `Integer.parseInt("10000000000000000000000000000000", 2)` throws a `NumberFormatException`, since the argument is the binary representation of `Integer.MIN_VALUE` (-2147483648 in decimal).
Is this expected behavior, or some known edge case/quirk?
I attempted to pass the binary representation of Java's `Integer.MIN_VALUE` to `Integer.parseInt` in order to get back -2147483648, but instead got a `NumberFormatException`.
Note: I'm not trying to achieve anything in particular, just fiddling around on the JVM, and came across this.
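For reference, a minimal reproduction (the class and method scaffolding are just for illustration; the failing call is the single `parseInt` line):

```java
public class MinValueParse {
    public static void main(String[] args) {
        // Binary string for the bit pattern of Integer.MIN_VALUE (1 followed by 31 zeros)
        int value = Integer.parseInt("10000000000000000000000000000000", 2);
        // Expected: -2147483648, but the line above throws:
        // java.lang.NumberFormatException: For input string: "10000000000000000000000000000000" under radix 2
        System.out.println(value);
    }
}
```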
The `Integer.parseInt()` method with a radix of `2` does not convert the two's-complement bit pattern of an `int` into its value. Instead, it "just" parses the string as a signed integer in the given radix. That means `"10000000000000000000000000000000"` in binary is 2147483648 in decimal. Such a large value cannot be stored in an `int`, because the maximum `int` value in Java is 2147483647 (`Integer.MAX_VALUE`). That's why you get a `NumberFormatException`. The result will never be a negative value like the `Integer.MIN_VALUE` you expected; it is only negative when the first character of the string is a `-` character, as mentioned in the javadoc of `Integer.parseInt()`:
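If the goal is to recover `Integer.MIN_VALUE` from that bit string anyway, here is a minimal sketch of two ways to do it (assuming Java 8+, since it relies on `Integer.parseUnsignedInt`):

```java
public class BitPatternToInt {
    public static void main(String[] args) {
        String bits = "10000000000000000000000000000000";

        // Option 1: parse the string as an *unsigned* 32-bit value. 2147483648
        // fits in the unsigned range, and the resulting int carries the same
        // bit pattern, i.e. Integer.MIN_VALUE.
        int viaUnsigned = Integer.parseUnsignedInt(bits, 2);

        // Option 2: add an explicit sign. parseInt accepts -2^31 because
        // -2147483648 is in range, even though +2147483648 is not.
        int viaSign = Integer.parseInt("-" + bits, 2);

        System.out.println(viaUnsigned); // -2147483648
        System.out.println(viaSign);     // -2147483648
    }
}
```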