For the project I'm working on, a "word" is defined as being 10 bits long, and as my program runs I need to update specific bit groups in this word with binary values (up to the limit each group can hold). My problem is that I don't know how to set these bits, and afterwards how to read them back.
For example, a "word" might be laid out like this:
bits 0-1 - representing something A - values 0-3.
bits 2-3 - representing something B - values 0-3.
bits 4-5 - representing something C - values 0-3.
bits 6-9 - representing something D - values 0-15.
As my program runs, I need to decide what to put in each group of bits. Then, once my word is completely full, I need to analyze the result, meaning go over the full word and work out from bits 0-1 what A represents, from bits 2-3 what B represents, and so on.
Another problem is that bit number 9 is the most significant bit, which means the word fills up from bits 6-9, then 4-5, then 2-3, then 0-1, and is later printed from bit 9 down to bit 0, not like a regular array.
I tried to do it with a struct of bit-fields, but the problem is that while a "word" is always 10 bits long, the sub-division mentioned above is only one example of a "word". It can also be that bits 0-1 represent one thing, and bits 2-9 something else.
I'm a bit lost and don't know how to do it, and I'd be glad if someone could help me with that. Thanks!
Just model a "word" as a `uint16_t`, and set the appropriate bits. Something like this:
... and so on.