I am implementing Huffman encoding. The Huffman algorithm itself works correctly; the problem is this:
When I try to convert the BitArray to a byte[] using
BitArray encoded = huffmanTree.Encode(input); // The encoding works well
byte[] bytes = new byte[encoded.Length / 8 + (encoded.Length % 8 == 0 ? 0 : 1 )];
encoded.CopyTo(bytes, 0);
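A minimal stand-alone sketch of just this conversion step (only System.Collections, no Huffman code; the five bits 10110 mirror the example output below). BitArray.CopyTo packs bit i into position i % 8 of byte i / 8, least significant bit first, and any unused positions in the final byte are left zero:

```csharp
using System;
using System.Collections;

class PaddingDemo
{
    static void Main()
    {
        // The five encoded bits: 1 0 1 1 0.
        var encoded = new BitArray(new[] { true, false, true, true, false });

        // Round up to a whole number of bytes, as in the question.
        byte[] bytes = new byte[encoded.Length / 8 + (encoded.Length % 8 == 0 ? 0 : 1)];
        encoded.CopyTo(bytes, 0);

        // Bits are stored least-significant-first: 1 + 4 + 8 = 13,
        // and the three unused high bits stay zero-filled.
        Console.WriteLine(bytes[0]); // 13
    }
}
```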
the conversion comes out wrong. To verify, I wrote the byte array to a file, read it back, and converted it to a BitArray again to check whether it is the same:
// Displaying the BitArray
Console.Write("Encoded: ");
foreach (bool bit in encoded)
{
    Console.Write(bit ? 1 : 0);
}
Console.WriteLine();
// Writing to a file
File.WriteAllBytes(@"C:\Users\vvvoh\source\repos\Laba1Alg\Laba1Alg\Encoded", bytes);
// Reading from file
byte[] bytes1 = File.ReadAllBytes(@"C:\Users\vvvoh\source\repos\Laba1Alg\Laba1Alg\Encoded");
// To BitArray
var bits = new BitArray(bytes1);
// Displaying the BitArray after reading it back from the file
Console.Write("Encoded again: ");
foreach (bool bit in bits)
{
    Console.Write(bit ? 1 : 0);
}
Console.WriteLine();
Here is the output. It shows the code table, the BitArray before the conversion, and the BitArray after reading it back:
Origin: ABC
Code:
A - 10
B - 11
C - 0
Encoded: 10110
Encoded again: 10110000
Final: ABCCCC
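For what it is worth, the extra bits look like zero padding: rebuilding a BitArray from bytes always yields a multiple of 8 bits, and since C's code is 0, each pad bit decodes as an extra C. A sketch of truncating back to the original count (the value 5 is hypothetical here; the real bit count would have to be stored alongside the bytes):

```csharp
using System;
using System.Collections;

class TruncateDemo
{
    static void Main()
    {
        var encoded = new BitArray(new[] { true, false, true, true, false }); // 10110
        byte[] bytes = new byte[encoded.Length / 8 + (encoded.Length % 8 == 0 ? 0 : 1)];
        encoded.CopyTo(bytes, 0);

        // Rebuilding from whole bytes gives 8 bits: "10110000".
        var bits = new BitArray(bytes);
        Console.WriteLine(bits.Length); // 8

        // Shrinking Length drops the trailing pad bits (5 is hypothetical;
        // the original bit count must be persisted along with the data).
        bits.Length = 5;
        foreach (bool bit in bits)
            Console.Write(bit ? 1 : 0); // prints 10110
        Console.WriteLine();
    }
}
```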
Please let me know where my mistake is and help me understand where I am going wrong.