OK, first of all, I don't want to start any flamewar here. My question is mostly theoretical, and I'll include a few examples.
So, as I wrote, I can't understand how an interpreted language can be even remotely efficient. I'll take Java as an example, going back to the time of its creation.
Let's go back to the days when there were no JIT compilers. Java has its own virtual machine, which is basically its hardware: you write the code, it gets compiled into bytecode, and the virtual machine executes it. But considering how complex even a RISC instruction set is at the hardware level, I can't figure out how to do this on software-emulated hardware.
I have no experience writing virtual machines, so I don't know how to do it efficiently, but I can't think of anything faster than testing each instruction for a match and then doing the corresponding action. You know, something like: if(instruction=="something") { /* do it */ } else if(instruction=="something_different") { /* do it */ }, etc.
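For what it's worth, real interpreters usually dispatch on numeric opcodes with a switch (which compiles down to a jump table) rather than comparing strings. Here's a toy sketch of the idea; the opcodes are made up for illustration and are not real JVM bytecode:

```java
// Toy stack-machine interpreter: dispatch via switch on an int opcode.
// Opcode values are invented for this sketch, not the JVM's.
public class TinyVm {
    static final int PUSH = 0, ADD = 1, MUL = 2, HALT = 3;

    static int run(int[] code) {
        int[] stack = new int[16];
        int sp = 0; // stack pointer
        int pc = 0; // program counter
        while (true) {
            switch (code[pc++]) {
                case PUSH: stack[sp++] = code[pc++]; break;       // push literal
                case ADD:  sp--; stack[sp - 1] += stack[sp]; break; // pop 2, push sum
                case MUL:  sp--; stack[sp - 1] *= stack[sp]; break; // pop 2, push product
                case HALT: return stack[sp - 1];                   // result on top
            }
        }
    }

    public static void main(String[] args) {
        // Computes (2 + 3) * 4
        int[] program = {PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL, HALT};
        System.out.println(run(program)); // prints 20
    }
}
```

Even with a jump table, each bytecode still costs a fetch, a dispatch branch, and the operation itself, which is exactly the overhead the question is about.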
But that should be terribly slow. And yet, even though there are articles saying Java was slow before JIT compilers, people still claim it wasn't that slow. But emulation requires several clock cycles of real hardware to execute a single bytecode instruction.
And yet entire platforms are based on Java. Take Android: the first versions of Android had no JIT compiler, everything was interpreted. Shouldn't Android have been terribly slow then? And yet it wasn't. I know that when you call an API function from the Android libraries, it's written in machine code, so it's efficient, and that helps.
But imagine you write your own game engine from scratch, using the API only for displaying images. You will need lots of array-copy operations, and many calculations would be very slow when emulated.
And now some examples, as I promised. Since I mainly work with MCUs, I found a JVM for the Atmel AVR MCU. They state that an 8 MHz MCU can execute 20K Java opcodes per second. But since the AVR can execute most instructions in one or two cycles, let's say 6,000,000 native instructions per second on average. That means a JVM without a JIT compiler is about 300 times slower than machine code. So why did Java become so popular without a JIT compiler? Isn't that too big a performance loss? I just don't understand. Thanks.
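The back-of-the-envelope math above can be spelled out like this (the 6,000,000 figure is my own averaging assumption from the 1-2 cycle claim, the other numbers are from the JVM-on-AVR description):

```java
public class SlowdownEstimate {
    public static void main(String[] args) {
        // Assumption: ~6M native instructions/sec on an 8 MHz AVR,
        // averaging the 1-2 cycles most instructions take.
        long nativeInstrPerSec = 6_000_000;
        // Claimed interpreter throughput for the AVR JVM.
        long javaOpcodesPerSec = 20_000;
        long slowdown = nativeInstrPerSec / javaOpcodesPerSec;
        System.out.println("~" + slowdown + "x slower than native"); // prints ~300x slower than native
    }
}
```

So roughly 300 host instructions are spent per interpreted opcode, which bundles the fetch/decode/dispatch overhead plus whatever the opcode itself does.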