Because .NET, like the Java platform, is a JIT environment. All high-level .NET code is compiled into bytecode in the Microsoft Intermediate Language (MSIL, now called CIL).
To run your program, this bytecode still has to be compiled, or translated, into machine code for your particular machine. So compiled .NET binaries are not stored as native machine code, but as intermediate bytecode for a virtual machine.
On the first run that JIT compilation happens, which is where the extra time goes. Subsequent runs don't need to repeat all of that work, because native code can be fetched from the JIT cache, so they should be faster.
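You can see the same effect within a single process. Below is a minimal C# sketch (the method name and workload are made up for illustration): the first call to a method includes the time to JIT-compile its IL, while the second call reuses the native code produced by the first. Exact numbers vary by machine and runtime, but the first call is typically noticeably slower.

```csharp
using System;
using System.Diagnostics;

class JitWarmup
{
    // Illustrative workload; any method shows the same pattern.
    static long SumOfSquares(int n)
    {
        long total = 0;
        for (int i = 0; i < n; i++)
            total += (long)i * i;
        return total;
    }

    static void Main()
    {
        var sw = Stopwatch.StartNew();
        SumOfSquares(1000);   // first call: the IL is JIT-compiled, then executed
        sw.Stop();
        Console.WriteLine($"First call  (JIT + run): {sw.Elapsed.TotalMilliseconds:F3} ms");

        sw.Restart();
        SumOfSquares(1000);   // second call: the native code already exists, only execution remains
        sw.Stop();
        Console.WriteLine($"Second call (run only) : {sw.Elapsed.TotalMilliseconds:F3} ms");
    }
}
```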
Did your application keep running between the subsequent runs? Then the second reason is also VM-related (VM 1 = virtual machine, VM 2 = virtual memory). All modern general-purpose operating systems run their processes in virtual memory, which is mapped onto real memory so that the operating system can govern and optimize the use of the system's resources. Less frequently used pages tend to be swapped out to disk so that other processes get optimal use of those resources.
The first time, your process was not yet in memory, so it had to bear the overhead of being brought in. On the subsequent runs your process was near the top of the most-recently-used list (or, equivalently, near the bottom of the least-recently-used list), so it had not yet been swapped out to disk.
In addition, the OS hands resources to your process on an as-needed basis. So on the first run your process had to go through the whole push-the-envelope negotiation with the OS to grow the boundaries of its resources.
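A rough way to watch this "as needed" behavior from inside a .NET program is to read the process working set (the physical memory currently mapped into the process) at a few points. The sketch below is illustrative and its numbers depend on the OS and garbage collector, but on typical systems allocating a large array barely moves the working set, while actually touching its pages forces the OS to hand over physical memory.

```csharp
using System;
using System.Diagnostics;

class WorkingSetDemo
{
    static void PrintWorkingSet(string label)
    {
        var p = Process.GetCurrentProcess();
        p.Refresh();   // re-read the process counters
        Console.WriteLine($"{label}: {p.WorkingSet64 / (1024 * 1024)} MB resident");
    }

    static void Main()
    {
        PrintWorkingSet("At startup          ");

        // Reserve 256 MB; on most systems this alone does not bring physical pages in.
        byte[] buffer = new byte[256 * 1024 * 1024];
        PrintWorkingSet("After allocation    ");

        // Touch every 4 KB page so the OS actually has to map physical memory.
        for (int i = 0; i < buffer.Length; i += 4096)
            buffer[i] = 1;
        PrintWorkingSet("After touching pages");

        GC.KeepAlive(buffer);
    }
}
```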
The virtual machine lets .NET and Java abstract most of a program's functionality to a machine-independent level, dividing the work so that fewer problems are left to be solved at the machine-dependent level. Although Microsoft Windows runs on fairly uniform x86 hardware, there are enough differences between OS versions and CPU models to justify an abstracting virtual machine that gives .NET programmers and users a consistent view.