I found a strange bug in my code. My code should be cross-compatible with Windows and Linux. I found that when I compile my program on Windows with MinGW, it adds 6 bytes to a specific data type/variable when getting its length.
I don't know how to explain it much further, so here's what I found:
int paddingSize = strlen((char*)cipherText);
I defined cipherText as an unsigned char *, which points to an array of raw AES-256 encrypted bytes. When the snippet above is compiled on Linux, if the array size is 16 bytes, the int will be 16. When compiled on Windows, however, there is an offset of exactly +6 bytes every time. To work around this I was using the snippet below:
int paddingSize = strlen((char*)cipherText) - 6;
I was wondering why this could be. Is this a compiler issue, such as MinGW vs. g++ (GNU)? Or is it an architecture problem? Could it also be a data type problem where the type cast is breaking it on one OS but not the other?
I have tried different things, but they all came out with confusing answers.
strlen counts the bytes until it finds a '\0' character. If your pointer points to an array of chars with a length of 16, the result of strlen simply gives you the number of chars until the first '\0' is found, completely independent of how long your array is! It is also possible that your program simply crashes because you access memory past the end of the array. You can't get the length of an array from a pointer to char.
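For raw ciphertext this bites in both directions: a 0x00 byte inside the ciphertext makes strlen stop early, and the absence of one makes it read past the buffer. A minimal sketch (the byte values here are made up purely for illustration):

#include <stdio.h>
#include <string.h>

int main(void)
{
    /* Fake "ciphertext" bytes for illustration only; note the embedded 0x00. */
    unsigned char cipherText[16] = {
        0x3a, 0x9f, 0x00, 0x41, 0x7c, 0xd2, 0x55, 0x08,
        0xee, 0x12, 0xb7, 0x90, 0x6d, 0x24, 0xf1, 0xcc
    };

    /* strlen stops at the first 0x00 byte: prints 2, not 16. If there were
       no 0x00 byte at all, it would read past the array (undefined behavior). */
    printf("strlen says: %zu\n", strlen((char *)cipherText));

    /* The block length has to be tracked separately. sizeof works here
       only because cipherText is a real array in this scope, not a pointer. */
    printf("actual length: %zu\n", sizeof cipherText);

    return 0;
}

The different +6 offsets you see per platform are just whatever happens to sit in memory after the array before a zero byte turns up, which is why the result looks compiler- or OS-dependent.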
The size of a variable can be obtained with sizeof(). In the case of a pointer, it returns the size of the pointer itself, which is typically 8 bytes, depending on the machine you are running on. You should avoid typecasts altogether.
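A small sketch of that difference (the name buffer is just for illustration):

#include <stdio.h>

int main(void)
{
    unsigned char buffer[16];
    unsigned char *p = buffer;

    /* sizeof on the array itself gives the full array size: 16. */
    printf("sizeof buffer: %zu\n", sizeof buffer);

    /* sizeof on the pointer gives the size of the pointer, typically 8 on a 64-bit build. */
    printf("sizeof p:      %zu\n", sizeof p);

    return 0;
}

This is also why the ciphertext length has to be passed around explicitly alongside the pointer: once only the pointer is left, neither strlen nor sizeof can recover it.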