What is the difference between the following three scenarios (in the app.config file of an exe)?
Scenario 1:

    <startup>
      <supportedRuntime version="v4.0" />
      <supportedRuntime version="v2.0" />
    </startup>

Scenario 2:

    <startup useLegacyV2RuntimeActivationPolicy="true">
      <supportedRuntime version="v4.0" />
      <supportedRuntime version="v2.0" />
    </startup>

Scenario 3:

    <startup useLegacyV2RuntimeActivationPolicy="true">
      <supportedRuntime version="v4.0" />
    </startup>
I've read the MS documentation on it and some blogs, but it still isn't clear to me exactly what happens in each case and when to use which.
EDIT
I have a situation where a third-party application was compiled against CLR 2 (and also uses legacy COM), while the extensions I've written for it are compiled against CLR 4. Recompiling the application is not an option for me, so I just need to know the impact of the three scenarios.
The useLegacyV2RuntimeActivationPolicy attribute is a bit of a cop-out. Setting it to true allows a .NET 4 program to load mixed-mode (C++/CLI) or [ComVisible] .NET assemblies that expressly state, in their header or in the registry, that they need version 2.0.50727 of the runtime. It makes no difference if you don't have such assemblies; they are fairly rare. The sane thing to do is to not use it: you'll get an error when it is required, a FileLoadException whose message looks like "Mixed mode assembly is built against version 'v2.0.50727' of the runtime and cannot be loaded in the 4.0 runtime without additional configuration information."
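If you do hit that exception, the usual fix is essentially your third scenario. A sketch of the full app.config (the sku attribute is optional for an exe and is extra detail not in your snippets):

```xml
<!-- app.config: force the v4 CLR and let it host legacy v2
     mixed-mode/COM assemblies in the same process -->
<configuration>
  <startup useLegacyV2RuntimeActivationPolicy="true">
    <supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.0" />
  </startup>
</configuration>
```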
The next sane thing to do is to rebuild such assemblies to target .NET 4. Using the attribute is the last resort.
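Retargeting is normally just a project-file change. A sketch of the relevant MSBuild property (your .csproj or .vcxproj will of course contain much more than this):

```xml
<!-- project file fragment: retarget the assembly to the .NET 4 runtime -->
<PropertyGroup>
  <TargetFrameworkVersion>v4.0</TargetFrameworkVersion>
</PropertyGroup>
```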
If you offer more than one version of the CLR, as in the first two snippets, then you'll get the one that the EXE asks for in its manifest. The last snippet forces the v4 version. The implication is that you may run code that was only ever tested on CLR v2 on a different .NET runtime. This will almost always come to a good end; v4 is very compatible with v2. But the opportunity was taken to fix bugs in v4, so you could accidentally be depending on the buggy behavior. Very rare, of course.
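For completeness, when multiple supportedRuntime elements are listed, the order expresses preference: the first listed version that is installed on the machine is used. A sketch for a CLR 2 exe (note that v2 must be spelled out as v2.0.50727 in this element):

```xml
<!-- app.config: prefer the runtime the exe was built against,
     fall back to v4 on machines without .NET 2 -->
<startup>
  <supportedRuntime version="v2.0.50727" />
  <supportedRuntime version="v4.0" />
</startup>
```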