Is there any way to make the compiler give a character constant such as 'a' the type char instead of int? That would make both of the following expressions true:
    sizeof('a') == 1
    _Generic('a', char : true, default : false)
In gcc, _Generic('a', char : true, default : false) evaluates to false.
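
For reference, a minimal test program (assuming a C11 compiler, since _Generic requires it) that demonstrates both results:

    #include <stdio.h>
    #include <stdbool.h>

    int main(void)
    {
        /* 'a' has type int in C, so this prints sizeof(int), typically 4. */
        printf("%zu\n", sizeof('a'));

        /* The controlling expression has type int, so the default branch
           is selected and this prints 0. */
        printf("%d\n", _Generic('a', char : true, default : false));
        return 0;
    }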
In C, a character constant has type int by definition (unlike C++, where it has type char), so you can't change the type of the constant itself. You can, however, use (char)'a' instead of 'a'.
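
A minimal sketch of how the cast changes both results (again assuming C11; the CHAR macro below is only a hypothetical convenience wrapper, not a standard facility):

    #include <stdio.h>
    #include <stdbool.h>

    /* Hypothetical helper to keep call sites readable. */
    #define CHAR(c) ((char)(c))

    int main(void)
    {
        /* The cast gives the expression type char, so sizeof yields 1 ... */
        printf("%zu\n", sizeof((char)'a'));

        /* ... and _Generic now selects the char branch, printing 1. */
        printf("%d\n", _Generic((char)'a', char : true, default : false));
        printf("%d\n", _Generic(CHAR('a'), char : true, default : false));
        return 0;
    }

Note that the cast only changes the type of that one expression; the constant itself is still translated with type int.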