I'm pretty experienced in .NET development, but today I had to wrap my head around something I had never thought about before:
How do the installed .NET Framework, the target .NET Framework selected in Visual Studio, and the C# compiler work together?
Case study: System.dll contains the System.Net.SecurityProtocolType enumeration. In .NET 4.5, this enumeration contains the members Ssl3, Tls, Tls11 and Tls12. With .NET 4.7, a SystemDefault member has been added.
So, targeting .NET 4.7.x, this code compiles fine:
var p = SecurityProtocolType.SystemDefault;
However, when I target .NET 4.5.x, this code does not compile (as you would expect).
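For completeness, here is a minimal self-contained repro (a sketch, assuming a plain console project where only the target framework changes between the two builds):

using System;
using System.Net;

class Program
{
    static void Main()
    {
        // Builds when the project targets .NET 4.7.x; when it targets .NET 4.5.x,
        // the same line fails with CS0117 ('SecurityProtocolType' does not contain
        // a definition for 'SystemDefault').
        var p = SecurityProtocolType.SystemDefault;
        Console.WriteLine(p);
    }
}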
What puzzles me is why this works at all, considering that .NET 4.7 is an in-place upgrade of .NET 4.5 (i.e. installing .NET 4.7 replaces the .NET 4.5 System.dll with the .NET 4.7 version).
How does the compiler know that I cannot use SystemDefault when targeting .NET 4.5, but can use it when targeting 4.7? Is this done through some API description file known to the compiler?
Side fact: When I target .NET 4.5 on a machine with .NET 4.7 installed, calling Enum.GetValues(typeof(SecurityProtocolType)) does include SecurityProtocolType.SystemDefault. So I'm sure my .NET 4.5 application runs against the .NET 4.7 System.dll.
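To illustrate that side fact, this is roughly the check I ran (a minimal sketch, assuming a console project targeting .NET 4.5 on a machine that has .NET 4.7 installed):

using System;
using System.Net;

class EnumCheck
{
    static void Main()
    {
        // Enumerate whatever members the System.dll loaded at runtime actually exposes.
        foreach (SecurityProtocolType value in Enum.GetValues(typeof(SecurityProtocolType)))
        {
            Console.WriteLine(value);
        }
        // With .NET 4.7 installed, this also prints SystemDefault,
        // even though the project targets .NET 4.5 and the name won't compile there.
    }
}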
Sebastian krysmanski