I was learning about one's complement and how one's complement uses the most significant bit as a sign indicator.
If you interpret the 3-bit binary number 110 in one's complement, you get -1. But, obviously, 110 is equivalent to 6 in decimal if you read it as unsigned. This is where I got confused.
How does the computer know whether you intend -1 or 6? Thanks.
The computer doesn't attach any inherent meaning to a string of bits. It's up to your programming language of choice to associate a type with a value, and that type determines how the bits are interpreted (for instance, whether an integer is signed or unsigned).
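As a minimal sketch of this idea in C: the same byte, 0xFE (binary 1111 1110), prints as two different numbers depending only on the type it's read through. Note that typical modern hardware uses two's complement rather than one's complement, so the signed reading here is -2, but the principle is the same: the bits don't change, only the interpretation does.

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* One byte holding the bit pattern 1111 1110. */
    uint8_t bits = 0xFE;

    /* Read as an unsigned 8-bit integer: 254. */
    unsigned int as_unsigned = bits;

    /* Reinterpreted as a signed 8-bit integer: -2 on a typical
       two's-complement machine (one's complement would call it -1). */
    int8_t as_signed = (int8_t)bits;

    printf("same bits, unsigned: %u\n", as_unsigned); /* 254 */
    printf("same bits, signed:   %d\n", as_signed);   /* -2  */
    return 0;
}
```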