For some reason in C++, the expressions if(!(n & 1)) and if(n & 1 == 0) seem to not be equivalent.
Can someone please explain why this happens?
if(!(n & 1)) will evaluate to true if the least significant bit of n is 0, i.e., if n is even.
if(n & 1 == 0) is equivalent to if(n & (1 == 0)), which will become if (n & 0), which is always false.
Check the operator precedence table and you will see that == has higher precedence than &.
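To see the difference concretely, here is a minimal sketch (assuming n is an int) that exercises both forms with an even value:

    #include <iostream>

    int main() {
        int n = 4;  // even: least significant bit is 0

        // Intended test: (n & 1) is grouped explicitly, then negated.
        if (!(n & 1))
            std::cout << "!(n & 1) is true, so " << n << " is even\n";

        // Buggy test: == binds tighter than &, so this parses as
        // n & (1 == 0), i.e. n & 0, which is 0 (false) for every n.
        if (n & 1 == 0)
            std::cout << "this line is never reached\n";
    }

GCC and Clang flag the second condition when -Wparentheses (included in -Wall) is enabled, which is worth turning on for exactly this kind of bug.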
Normally, for (n & 1), suppose n is an integer, say 2. It is first represented as a 32-bit integer, so 2 = 00000000000000000000000000000010, while 1 = 00000000000000000000000000000001. Applying the & operator ANDs each pair of corresponding bits (for single bits, this is the same as multiplying them), which here returns 0. If we instead take 3 = 00000000000000000000000000000011, the output is 1.
So basically we use n & 1 to check whether the last bit is set, which is the same as checking whether the number is odd or even.
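As a small illustration (a sketch, not part of the original answer), printing n & 1 for a few values shows the even/odd pattern:

    #include <iostream>

    int main() {
        for (int n = 0; n < 5; ++n) {
            // n & 1 isolates the least significant bit:
            // 0 for even numbers, 1 for odd numbers.
            std::cout << n << " & 1 = " << (n & 1)
                      << ((n & 1) ? " (odd)\n" : " (even)\n");
        }
    }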
Now, in your question: in if(!(n & 1)), n & 1 is evaluated first, and whatever value it returns is then logically negated by !.
Because of operator precedence.
n & 1 == 0 is parsed as n & (1 == 0), not (n & 1) == 0.
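This grouping can even be checked at compile time. A minimal sketch using static_assert with n = 2, an even value for which the two readings disagree:

    // With n = 2: 2 & (1 == 0) collapses to 2 & 0, which is 0 (false),
    // while (2 & 1) == 0 is 0 == 0, which is true.
    static_assert((2 & 1 == 0) == (2 & (1 == 0)), "parses as n & (1 == 0)");
    static_assert((2 & 1 == 0) != ((2 & 1) == 0), "not as (n & 1) == 0");

Both assertions pass, confirming the parse described above.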