This is my code:
#include <iostream>

template <typename T> void printlnbits(T v) {
    const int value_size = sizeof(v) * 8;
    long long int* ch = (long long int*)&v;
    int j = 0;
    for (int i = value_size - 1; i >= 0; --i)
    {
        extractBit(*ch, i);
        j++;
        if (j == 8)
        {
            std::cout << " ";
            j = 0;
        }
    }
    std::cout << "\t" << value_size << std::endl;
}
and
void extractBit(long long int ch, int ith) {
    std::cout << (ch & (1 << ith) ? 1 : 0);
}
When passing in the argument:
const long long unsigned e = 1LLU << 40;
the output is:
10000000 00000000 00000000 00000000 10000000 00000000 00000000 00000000 64
Now 1LLU << 40 = 1099511627776
Shouldn't the output be:
00000000 00000000 00000001 00000000 00000000 00000000 00000000 00000000 64
Am I missing something?
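For reference, here is a minimal std::bitset-based cross-check (separate from my code above; std::bitset is only used here to spell out the pattern I expect for 1LLU << 40):

#include <bitset>
#include <cstddef>
#include <iostream>
#include <string>

int main() {
    const unsigned long long e = 1LLU << 40;
    // std::bitset prints the most significant bit first, the same order as my loop.
    const std::string bits = std::bitset<64>(e).to_string();
    for (std::size_t i = 0; i < bits.size(); i += 8)
        std::cout << bits.substr(i, 8) << ' ';
    std::cout << '\n';
    // prints: 00000000 00000000 00000001 00000000 00000000 00000000 00000000 00000000
}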
Struct as argument:
When I pass a struct as the argument, how do the member variables inside the struct get combined and stored as a value that then gets output as a bit representation?
struct foo {
    int a = 2;
    char b = -1;
    unsigned long long int x = 1LLU << 63;
};
outputs:
00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 11001100 11001100 11001100 11111111 00000000 00000000 00000000 00000010 128
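To make the question more concrete, here is a minimal sketch (separate from my code above; it only assumes the foo definition and standard sizeof/offsetof) that prints the size of foo and the byte offset of each member:

#include <cstddef>   // offsetof, std::size_t
#include <iostream>

struct foo {
    int a = 2;
    char b = -1;
    unsigned long long int x = 1LLU << 63;
};

int main() {
    // Print the overall size of foo and where each member starts inside it.
    std::cout << "sizeof(foo)     = " << sizeof(foo) << '\n';
    std::cout << "offsetof(foo,a) = " << offsetof(foo, a) << '\n';
    std::cout << "offsetof(foo,b) = " << offsetof(foo, b) << '\n';
    std::cout << "offsetof(foo,x) = " << offsetof(foo, x) << '\n';
    // The exact numbers (and any padding between b and x) depend on the compiler/ABI.
}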