I've recently taken an interest in making/emulating text-based RPGs from the '80s, like Rogue and its derivatives, which feature graphics made out of extended ASCII characters. So when it comes to creating and printing the graphics for these games to the console, I figure that I should do the following:

1) Design the levels and whatnot in a text editor like Notepad;
2) Save those files as Unicode-encoded txt files, since they contain extended ASCII;
3) Have my game program read the graphics from these files and print them, verbatim, to the console.

This seems like a fine plan to me, except there is one problem.
For the life of me, I can't get the program to output the extended ASCII characters properly. What generally happens is that the program seems to read each single char from the file as a pair of ASCII chars; for example, the char '☺' gets output as something like "&;".
In C++ and/or C#, how can I properly read extended ASCII chars from Unicode-encoded txt files, line by line, into a program and output those lines to the console window?
(I mean, I suppose I could write a translator function that takes the corrupted char-pair, like "&;", and converts it back to the single character, like '☺', by way of a big ol' if-then statement or some cleverly-deduced mathematical formula, but not only am I quite lazy, I'd also very much like to know how C++/C# handle file I/O with non-ANSI-encoded txt files, if they indeed have such mechanisms implemented!)
Since you control both sides (writing the text file and reading it back), things are very easy:
.NET's StreamWriter and StreamReader use UTF-8 encoding by default. If you write the file with a StreamWriter, you can read it back with a StreamReader and all characters will survive the round trip unaltered.
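For example, a minimal round-trip sketch in C# (the file name "level.txt" is just a placeholder):

    using System;
    using System.IO;

    class RoundTripDemo
    {
        static void Main()
        {
            // StreamWriter writes UTF-8 by default, so the non-ASCII
            // characters are encoded losslessly.
            string original = "┌─☺─┐";
            using (var writer = new StreamWriter("level.txt"))
            {
                writer.WriteLine(original);
            }

            // StreamReader also assumes UTF-8 by default, so the same
            // characters come back unaltered.
            string roundTripped;
            using (var reader = new StreamReader("level.txt"))
            {
                roundTripped = reader.ReadLine();
            }

            Console.WriteLine(roundTripped == original); // True
        }
    }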
Now the trick for you: if you want to edit such a file with an external editor, make sure the editor can read and write UTF-8. Notepad++ will do the job.
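Once a map has been saved as UTF-8 (by your program or by Notepad++), reading it line by line and printing it looks roughly like the sketch below. Here "level01.txt" is only a placeholder name; depending on your console's code page you may also need to set Console.OutputEncoding, and the console font must contain the glyphs you want to show.

    using System;
    using System.IO;
    using System.Text;

    class LevelPrinter
    {
        static void Main()
        {
            // Make the console interpret the program's output as UTF-8;
            // otherwise some terminals replace unknown characters with '?'.
            Console.OutputEncoding = Encoding.UTF8;

            // File.ReadLines also assumes UTF-8 by default.
            foreach (string line in File.ReadLines("level01.txt"))
            {
                Console.WriteLine(line);
            }
        }
    }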