I have some C# code, taken from http://bytes.com/topic/c-sharp/answers/572657-net-clipboard-metafiles, that copies an Excel cell range to the clipboard as a metafile under either of the two Copy Picture settings:
- As shown on screen,
- As shown when printed.
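The relevant part boils down to the two CopyPicture appearance values. A sketch, under some assumptions: the Excel interop assembly is referenced, and the Win32 clipboard/metafile retrieval from the linked post is omitted:

```csharp
using System;
using Excel = Microsoft.Office.Interop.Excel;

class CopyPictureDemo
{
    [STAThread]
    static void Main()
    {
        var app = new Excel.Application();
        var book = app.Workbooks.Add();
        var sheet = (Excel.Worksheet)book.Worksheets[1];
        var range = sheet.Range["A1", "C3"];

        // "As shown on screen":
        range.CopyPicture(Excel.XlPictureAppearance.xlScreen,
                          Excel.XlCopyPictureFormat.xlPicture);

        // "As shown when printed":
        // range.CopyPicture(Excel.XlPictureAppearance.xlPrinter,
        //                   Excel.XlCopyPictureFormat.xlPicture);

        // Pulling the enhanced metafile off the clipboard needs the
        // interop code from the linked post; once you have a Metafile,
        // HorizontalResolution / VerticalResolution are the values in
        // question.

        book.Close(false);
        app.Quit();
    }
}
```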
When I look at the resulting metafile's resolution (documented as "Gets the resolution, in pixels-per-inch, of this Image object"), I get different values depending on the copy method.
With the As shown when printed option, the resolution is 600, which I believe corresponds to the DPI settings I have in Excel.
With the As shown on screen setting, it spits out something like Metafile.VerticalResolution = 72.08107 and Metafile.HorizontalResolution = 71.95952. On other machines, I've seen these values vary a lot (values around 111, 130, etc.).
Zoom level doesn't seem to affect this. From what I've observed, the values stay consistent on a single machine, but may differ from machine to machine.
Can anyone explain the logic Excel follows when computing the resolution of the metafile in As shown on screen mode?
After changing the Windows display resolution and measuring the metafile resolutions, here is the table I generated (hope it looks formatted correctly):
Width  Height  HorizontalResolution  VerticalResolution
1680   1050    71.95952              72.08107
1600   1024    72.05672              72.04874
1600   900     72.05672              71.88678
1360   768     71.96666              71.98228
1280   1024    71.9292               72.04874
1280   960     71.9292               71.9292
1280   800     71.9292               72.05672
1280   768     71.9292               71.98228
1280   720     71.9292               71.99999
1152   864     72.07093              71.95278
1088   612     71.96666              71.96666
1024   768     72.04874              71.98228
960    600     71.9292               71.88678
800    600     72.05672              71.88678
After running a similar procedure on a virtual machine (on the same physical host), these are the results. They're a bit more volatile than on the physical machine itself. This data might not be useful, but I thought I'd provide it anyway.
Width  Height  HorizontalResolution  VerticalResolution
1680   1050    133.35                111.125
1280   800     101.6                 84.66666
1024   768     81.27999              81.27999
800    600     63.5                  63.5
My Hypothesis
I believe this has to do with running a non-native resolution on an LCD monitor. In "the old days", CRTs didn't have a native resolution per se, so the computer was unaware of any preferred resolution for a given monitor size or aspect ratio. With newer digital displays (LCDs), the computer knows the preferred resolution and aspect ratio for your display, if it is properly installed. My Windows 7 machine shows "recommended" next to my LCD's native resolution and then shows the other two equal-aspect-ratio resolutions in black, with the remaining "mismatches" being unlabelled but selectable (resulting in the squished or stretched look that I hate to see on other people's computers!).
The default DPIs of 96 and 120 in Windows were established back in the CRT days. My Windows 7 machine doesn't even say DPI anymore; it just says 'smaller', 'medium', and 'larger'.
Either way, when you buy an LCD monitor that is, say, 1920x1080 or 1920x1200 but set the display resolution to something smaller, you end up with a conversion factor. In the case of the mismatched horizontal and vertical resolutions close to 72, your non-native display resolution may not be scaled by exactly the same factor vertically as it is horizontally, resulting in this small discrepancy.
How To Test My Hypothesis
On each of your test machines, record the operating system's configured resolution and the display's native resolution. See if the ratio between these two is close to the ratio between your metafile's 'as shown on screen' resolution and either 96 or 120 DPI. I would prefer you conduct this test on physical machines, simply to rule out further scaling by remote desktop or virtual machine display drivers.
If the answer isn't immediately apparent, take it a step further and record the operating system's control panel settings for DPI, or 'smaller', 'medium', and 'larger'. Windows XP may behave differently than Windows Vista/Windows 7.
You could also re-run the test on the same physical machine several times, adjusting the configured display resolution between tests, and observe any changes. If my hypothesis is correct, you should see a different metafile resolution for each configured display resolution on the same physical machine/display combination, and the result should be predictable and repeatable (returning to the first display resolution should return the same metafile resolution).
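To make that comparison concrete, the check is just arithmetic. The numbers below are placeholders, not measurements; substitute your own:

```csharp
using System;

class RatioCheck
{
    static void Main()
    {
        // Placeholder values - replace with your own measurements.
        double configuredWidth = 1280.0;  // OS-configured resolution (pixels)
        double nativeWidth     = 1920.0;  // panel's native resolution (pixels)
        double metafileDpi     = 72.0;    // Metafile.HorizontalResolution
        double logicalDpi      = 96.0;    // Windows logical DPI

        double resolutionRatio = configuredWidth / nativeWidth;
        double dpiRatio        = metafileDpi / logicalDpi;

        // If the hypothesis holds, these two ratios should be close.
        Console.WriteLine("resolution ratio = {0:F3}", resolutionRatio);
        Console.WriteLine("DPI ratio        = {0:F3}", dpiRatio);
    }
}
```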
EDIT #1
I found an excellent article that discusses physical DPI vs logical DPI. Have a read of this: Where does 96 DPI come from in Windows?
So now the next test I'd recommend is changing displays! Do you have a different brand/size/resolution LCD monitor available for testing? You don't need quite so many lines as in your first test above, as we've established that the various resolutions tend to produce very similar DPI values for the same display. Just test a couple of common resolutions, including the display's native resolution and 1024x768 as a "baseline" for comparison.
Also, while poking around in Windows 7, I did find the "Set Custom Text Size (DPI)" link in Control Panel->Display, which includes a "Use Windows XP style DPI scaling" option. While I don't think this is the primary concern, curiosity makes me interested in its effect, so I thought I'd mention it.
EDIT #2 - SOLVED!
The resolution you are seeing in your metafiles is the physical DPI of your monitor. I'll post some C# code here for you to test yourself:
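A minimal sketch: it reads the logical DPI, then derives the physical DPI from the pixel and millimetre dimensions that GetDeviceCaps reports for the screen (the index constants are from wingdi.h):

```csharp
using System;
using System.Runtime.InteropServices;

class DpiProbe
{
    [DllImport("user32.dll")]
    static extern IntPtr GetDC(IntPtr hWnd);

    [DllImport("user32.dll")]
    static extern int ReleaseDC(IntPtr hWnd, IntPtr hDC);

    [DllImport("gdi32.dll")]
    static extern int GetDeviceCaps(IntPtr hdc, int index);

    const int HORZSIZE   = 4;   // physical width, in millimetres
    const int VERTSIZE   = 6;   // physical height, in millimetres
    const int HORZRES    = 8;   // width, in pixels
    const int VERTRES    = 10;  // height, in pixels
    const int LOGPIXELSX = 88;  // logical DPI, horizontal
    const int LOGPIXELSY = 90;  // logical DPI, vertical

    static void Main()
    {
        IntPtr hdc = GetDC(IntPtr.Zero);  // device context for the whole screen
        try
        {
            int logX = GetDeviceCaps(hdc, LOGPIXELSX);
            int logY = GetDeviceCaps(hdc, LOGPIXELSY);

            // physical DPI = pixels / (millimetres / 25.4)
            double physX = GetDeviceCaps(hdc, HORZRES) * 25.4 / GetDeviceCaps(hdc, HORZSIZE);
            double physY = GetDeviceCaps(hdc, VERTRES) * 25.4 / GetDeviceCaps(hdc, VERTSIZE);

            Console.WriteLine("Logical DPI:  {0} x {1}", logX, logY);
            Console.WriteLine("Physical DPI: {0:F5} x {1:F5}", physX, physY);
        }
        finally
        {
            ReleaseDC(IntPtr.Zero, hdc);
        }
    }
}
```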
This code on my display outputs the following:
Notice both the logical and physical DPI? Does that physical DPI look familiar? It all makes sense after reading that article about 72 DPI reflecting 1pt = 1px. Give this code a try on your various test machines and let me know how it goes! (FYI, I ran this code in a C# WinForms app; a console app should be able to get the screen device context, but maybe not...)