What is the relationship between a .NET assembly and "bitness", i.e. the processor architecture? I thought that .NET programs were compiled to CIL (bytecode), so they could run on different architectures and would be compiled just-in-time automatically. Thus, there should be no "bitness" for a .NET assembly.
But the real world does not seem to be that simple. I test programs on a 64-bit machine and often run into bitness compatibility problems between, say, my program and another .NET library. So my questions are:
- Can a .NET binary somehow have a processor architecture baked into it?
- Under what conditions does this happen?
- On a 64-bit machine, what chain of events causes .NET code to run in a 32-bit environment?
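For context, here is a minimal C# sketch of how I inspect which bitness a process actually ends up with at run time (these are standard BCL members, available since .NET 4.0):

```csharp
using System;

class BitnessCheck
{
    static void Main()
    {
        // IntPtr.Size is 8 in a 64-bit process and 4 in a 32-bit one.
        Console.WriteLine($"IntPtr.Size:            {IntPtr.Size}");
        // True only if the *process* itself is 64-bit.
        Console.WriteLine($"Is64BitProcess:         {Environment.Is64BitProcess}");
        // True if the *OS* is 64-bit, even when this process runs 32-bit under WOW64.
        Console.WriteLine($"Is64BitOperatingSystem: {Environment.Is64BitOperatingSystem}");
    }
}
```

Running this shows that the same CIL can come up as a 32-bit process on my 64-bit machine, which is exactly what confuses me.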
BTW: this question (The differences between 32-bit and 64-bit .NET applications) and its top answer take it for granted that .NET assemblies do have a bitness. But I do not see where to configure the architecture in Visual Studio.
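The only place I have found that looks related is the project file itself. A sketch of the relevant csproj fragment, assuming the standard MSBuild property names apply here:

```xml
<PropertyGroup>
  <!-- AnyCPU (the default), x86, x64, or ARM; x86 would force 32-bit -->
  <PlatformTarget>x86</PlatformTarget>
  <!-- For AnyCPU executables on .NET 4.5+, this makes them start as a
       32-bit process even on 64-bit Windows -->
  <Prefer32Bit>true</Prefer32Bit>
</PropertyGroup>
```

Is this the mechanism behind the bitness that the linked question describes?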