I am not kidding. I have a C# application and a C++ application. They do the same thing, in the same amount of code, and the C# one is faster. Not just faster, but 10 times faster.
This seemed strange to me because, first, I ran the C# application in a debugger, which should slow it down. Second, C# compiles to MSIL bytecode and runs on .NET with a lot of extra machinery and overhead, which should also slow it down, whereas C++ is just pure machine code.
Here is the C # code:
static void Main() { ulong i = 0; while (i < 100000000000) { Console.WriteLine(i); i++; } }
And here is the C++ code:
int main() { unsigned long long i = 0; while (i < 100000000000ULL) { cout << i << endl; i++; } return 0; }
They just count up and print the number. When C++ has reached 1000, C# is already at 7000 (7 times faster).
I even compiled both of them and ran them without a debugger from the command line, using: cplusplus.exe && csharp.exe
Yes, I know this question might be "off-topic" :P, or "unclear what you're asking" :/ But please, someone explain this to me.
If it matters, I'm using this processor: Intel i7, 2.5 GHz.
EDIT: I tried cout << i << "\n"; and also std::ios_base::sync_with_stdio(false); without any luck or change in the results.
EDIT 2: I tried C's printf() and it was much faster: 3 times faster than C#.
People told me that iostream is very slow, so I tried both programs without writing to the console, and without console output C++ is indeed significantly faster than C#.
In conclusion, Writeline () is much faster than cout, and printf () is much faster than both. Therefore, writing to the console is the only thing that slows down the work.
TL;DR: printf() wins, and console output is the bottleneck.