I am trying to understand the following code:
#include <stdio.h>

int main()
{
    int a = 2147483647, b = 1;
    printf("%d\n", a + b);
    printf("%d\n", a + b > a);
    return 0;
}
Since a is the maximum int value, a + b should be negative. But the last line prints 1 (true).
If I change the second printf to printf("%d\n", 2147483647+1 > 2147483647); there is a warning and it prints 0 (false), which is what I expected. Is there any compiler setting that leads to this result?
Integer overflow on signed types is Undefined Behaviour (UB), and the compiler is free to do whatever it wants - this is exactly what you observe. The compiler is not even comparing the numbers. UB means that the program behaviour is undefined, i.e. it cannot be predicted from the C point of view.
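Because the compiler may assume signed overflow never happens, it can fold a + b > a with b = 1 straight to 1 without performing any addition. If you actually need to detect overflow, test before you add. A minimal sketch in standard C (the message wording is mine):

#include <limits.h>
#include <stdio.h>

int main(void)
{
    int a = INT_MAX, b = 1;

    /* Check *before* adding, so the overflowing addition never executes
       and no UB occurs. */
    if (b > 0 && a > INT_MAX - b)
        printf("a + b would overflow\n");
    else
        printf("a + b = %d\n", a + b);

    return 0;
}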
Using GCC you can force the arithmetic to actually happen (for example with -fwrapv) and see what your hardware does with it (though signed overflow is still undefined in the C language):
https://godbolt.org/z/6fjh8obfY
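If you want to try it locally instead of on godbolt, a sketch like the following should show the difference (assuming GCC; -fwrapv is a GCC extension that makes signed arithmetic wrap in two's complement, which is still not standard C):

/* overflow.c */
#include <stdio.h>

int main(void)
{
    int a = 2147483647, b = 1;

    /* gcc -O2 overflow.c         : typically prints 1 - the compiler
       assumes no overflow and folds the comparison to "true".         */
    /* gcc -O2 -fwrapv overflow.c : prints 0 - a + b wraps to INT_MIN,
       which is less than a.                                           */
    printf("%d\n", a + b > a);
    return 0;
}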