I am trying to generate 1024-bit random numbers with this code:
var bytes = RandomNumberGenerator.GetBytes(128);
var number = new BigInteger(bytes);
But when I inspect the number with BigInteger's methods, its bit length is different every time.
For example, when running this code:
for (int i = 0; i < 5; i++)
{
    var bi = new BigInteger(RandomNumberGenerator.GetBytes(128));
    Console.WriteLine($"{i}: {bi.GetBitLength()} bits | {bi.GetByteCount()} bytes");
}
I am getting this output:
0: 1021 bits | 128 bytes
1: 1023 bits | 128 bytes
2: 1022 bits | 128 bytes
3: 1021 bits | 128 bytes
4: 1022 bits | 128 bytes
What am I doing wrong / not understanding here?
GetBitLength is documented as:

"Gets the number of bits required for shortest two's complement representation of the current instance without the sign bit."

And also, helpfully:

"For positive integers the return value is equal to the ordinary binary representation string length."
Now, consider two (very small) BigInteger values - 1 and 3. It only takes 1 bit to represent the integer value 1, whereas it takes 2 bits to represent the integer 3. Effectively, any leading 0 bits which might have been passed in the original byte array aren't included in GetBitLength().
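For example, a quick check (assuming .NET 5 or later, where GetBitLength() is available):

Console.WriteLine(new BigInteger(1).GetBitLength()); // 1
Console.WriteLine(new BigInteger(3).GetBitLength()); // 2
// Leading zero bytes don't increase the bit length:
Console.WriteLine(new BigInteger(new byte[] { 3, 0, 0, 0 }).GetBitLength()); // still 2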
I suspect that's absolutely fine for you, but it will definitely depend on what you're planning to do with the result. To put it another way: if you only wanted 4-bit numbers, would you want values between 0 and 15 inclusive, or between 8 and 15 inclusive? Because requiring GetBitLength() to return 4 would make you generate the latter... (That's effectively "3 random bits and a leading bit that's always 1".)
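If you do want GetBitLength() to always return 1024, the generalization of that last point is to force the most significant bit to 1. A minimal sketch - I'm assuming here that's actually your goal, and note that it restricts the range to values from 2^1023 to 2^1024 - 1:

byte[] bytes = RandomNumberGenerator.GetBytes(128);
// BigInteger treats the array as little-endian, so bytes[127] is the
// most significant byte; forcing its top bit on means the value always
// needs exactly 1024 bits.
bytes[127] |= 0x80;
var number = new BigInteger(bytes, isUnsigned: true);
Console.WriteLine(number.GetBitLength()); // always 1024

The isUnsigned: true argument matters: without it, setting the top bit would make the two's complement interpretation negative.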