Converting Decimal to Octal Using Bitwise Operators


The function below is intended to convert its parameter, an integer, from decimal to octal.

#include <string>

std::string dec_to_oct(int num) {
    std::string output;
    for(int i=10; i>=0; --i) {
        output += std::to_string( (num >> i*3) & 0b111 );
    }
    return output;
}

It works for any positive input. However, for num = -1 it returns 77777777777 when it should return 37777777777, so the first digit needs to be a 3 instead of a 7. Why is this happening? The function appears to be incorrect for all negative input. How can I adjust the algorithm so that it returns the correct result for negative numbers?

Note: this is a CS assignment so I'd appreciate hints/tips.


Accepted answer

This happens because right-shifting a signed integer performs an arithmetic shift, which preserves the sign: for a negative value, the vacated high bits are filled with 1s. To overcome this, cast the input to the equivalent unsigned type first, so the shift fills with zeros instead.

(((unsigned int)num) >> 3*i) & 7
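Plugged back into the original function, the fix is a one-line change (this sketch assumes the usual 32-bit int, hence 11 octal digits):

```cpp
#include <string>

std::string dec_to_oct(int num) {
    std::string output;
    for (int i = 10; i >= 0; --i) {
        // Cast to unsigned so the right shift fills with zero bits
        // rather than copies of the sign bit.
        output += std::to_string((static_cast<unsigned int>(num) >> i * 3) & 0b111);
    }
    return output;
}
```

With this change, dec_to_oct(-1) produces 37777777777 as expected, while positive inputs are unaffected.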


Going further, you can make the function a template and reinterpret the input through a uint8_t* pointer, using sizeof to calculate the number of octal digits (as suggested by DanielH). That approach is a bit more involved, though, since the bits of a single octal digit may straddle two bytes.
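A simpler generic sketch (my own variation, not from the answer: it uses std::make_unsigned instead of the uint8_t* cast, which sidesteps the byte-straddling problem entirely):

```cpp
#include <cstdint>
#include <string>
#include <type_traits>

// Generic octal conversion for any signed or unsigned integer type.
template <typename T>
std::string to_octal(T num) {
    using U = typename std::make_unsigned<T>::type;
    U u = static_cast<U>(num);  // reinterpret the bit pattern as unsigned
    // Enough octal digits (3 bits each) to cover every bit of T.
    const int digits = static_cast<int>(sizeof(T) * 8 + 2) / 3;
    std::string output;
    for (int i = digits - 1; i >= 0; --i) {
        output += std::to_string((u >> (i * 3)) & 7u);
    }
    return output;
}
```

For example, to_octal(static_cast<signed char>(-1)) yields "377", and a 32-bit -1 yields "37777777777".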

Another answer

Copied from the documentation:

ios_base& oct (ios_base& str);

Use octal base: sets the basefield format flag for the str stream to oct.

Example

// modify basefield
#include <iostream>     // std::cout, std::dec, std::hex, std::oct

int main () {
  int n = 70;
  std::cout << std::dec << n << '\n';
  std::cout << std::hex << n << '\n';
  std::cout << std::oct << n << '\n';
  return 0;
}

Output:

70
46
106

So, bottom line: you are reinventing the wheel.