I need to implement a couple of app-specific character format options in a WinAPI C++ program that uses the Rich Edit control. The control is created with the MSFTEDIT_CLASS window-class macro, which evaluates to L"RICHEDIT50W", and I register that class by explicitly loading Msftedit.dll. This is on a Windows 10 system.
From a naive reading of the docs, the most straightforward way to add custom formats would seem to be treating the dwCookie member of CHARFORMAT2 as a bitmask of my own attribute flags. I tried that, however, and can't get it to work. As a test, I reduced my code to the following:
CHARFORMAT2W cf;
ZeroMemory(&cf, sizeof(cf));
cf.cbSize = sizeof(cf);
cf.dwMask = CFM_COOKIE;
cf.dwCookie = 0x02;
SendDlgItemMessageW(GetWindowHandle(), GetControlID(), EM_SETCHARFORMAT, SCF_SELECTION, (LPARAM)&cf); // Apply to the current selection (SCF_DEFAULT is 0, so OR'ing it in does nothing)
CHARFORMAT2W test;
ZeroMemory(&test, sizeof(test));
test.cbSize = sizeof(test);
SendDlgItemMessageW(GetWindowHandle(), GetControlID(), EM_GETCHARFORMAT, SCF_SELECTION, (LPARAM)&test);
The information returned in test correctly identifies the font, size, and standard styles, but the dwCookie value coming back is always zero, even though I explicitly set it to 0x02 above. Am I missing something about how dwCookie works? The documentation says very little about how it is meant to be used.
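One diagnostic that may be worth adding before going further: EM_SETCHARFORMAT returns nonzero on success, and EM_GETCHARFORMAT returns the dwMask of attributes the control considers valid and consistent over the queried range, so you can test whether CFM_COOKIE ever comes back set at all. A tiny sketch of that check; the CFM_COOKIE value is copied here from my reading of richedit.h so the snippet stands alone, and should be verified against your SDK headers:

```cpp
#include <cstdint>

// CFM_COOKIE as I read it in richedit.h -- verify against your SDK copy.
constexpr uint32_t kCFM_COOKIE = 0x01000000;

// True if the mask returned by EM_GETCHARFORMAT reports that the cookie
// attribute is present and consistent over the queried range.
inline bool CookieRoundTripped(uint32_t returnedMask) {
    return (returnedMask & kCFM_COOKIE) != 0;
}
```

In the test code above that would be `CookieRoundTripped((DWORD)SendDlgItemMessageW(GetWindowHandle(), GetControlID(), EM_GETCHARFORMAT, SCF_SELECTION, (LPARAM)&test))`; if it returns false, the control never stored the cookie in the first place, as opposed to storing it and zeroing it on read.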
Is there a better way to implement my attributes? They are "hidden" (rendered grayed out to the user of the control, not actually invisible the way CFE_HIDDEN text is) and "absolute" (meaning the text will not scale along with other elements in the program). I will probably render "absolute" text at a slightly larger font size in the Rich Edit control, once I have a way of storing and retrieving the attribute.
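For what it's worth, one fallback I am considering if dwCookie never round-trips is to keep the two custom attributes outside the control entirely, in a parallel structure keyed by character position. A minimal sketch of that idea; all the names here (AttrStore, the ATTR_* bits) are my own invention, not anything in the RichEdit API:

```cpp
#include <cstdint>
#include <map>

// App-specific attribute bits (my own values, not RichEdit flags).
constexpr uint32_t ATTR_GRAYED   = 0x01; // shown grayed out, still visible
constexpr uint32_t ATTR_ABSOLUTE = 0x02; // exempt from program-wide scaling

// Parallel store: maps a character index to the attribute bits in effect
// from that index up to the next entry. The control keeps the standard
// formatting; these bits live beside it, keyed by character position (>= 0).
class AttrStore {
    std::map<long, uint32_t> runs_{{0, 0}}; // run start -> attribute bits
public:
    // OR 'bits' into every position in [first, last).
    void Apply(long first, long last, uint32_t bits) {
        if (first >= last) return;
        uint32_t atFirst = BitsAt(first); // bits currently at the range start
        uint32_t after = BitsAt(last);    // bits just past the range, to restore
        runs_[first] = atFirst | bits;    // ensure a run boundary at 'first'
        for (auto it = runs_.lower_bound(first);
             it != runs_.end() && it->first < last; ++it)
            it->second |= bits;           // cover every run inside the range
        runs_[last] = after;              // close the range
    }
    // Attribute bits in effect at character position 'pos'.
    uint32_t BitsAt(long pos) const {
        auto it = runs_.upper_bound(pos); // first run starting after 'pos'
        --it;                             // safe: runs_ always holds key 0
        return it->second;
    }
};
```

The obvious cost is that the run boundaries have to be shifted on every insertion and deletion (e.g. from an EN_CHANGE handler), which the dwCookie approach would have given me for free.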