I really wanted to switch to Java 7 (for my own selfish coding reasons). However, we have users who are very latency-sensitive (everything must be sub-millisecond). I ran a simple performance comparison between three different JVMs by pushing a few simple messages through our application. It is a low-load latency test that pushes one message every few seconds. The results were (in microseconds):
- Hotspot 6 (build 24): msgs=23, avg=902
- JRockit 6 (R28 build 29): msgs=23, avg=481
- Hotspot 7 (build 04): msgs=34, avg=1130
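For context, here is a minimal sketch of how the per-message latency could be captured and averaged. This is not the actual test code; `processMessage()` is a hypothetical stand-in for the real read/process/write path, and the only assumption is that each message is timed with `System.nanoTime()`:

```java
import java.util.concurrent.TimeUnit;

/**
 * Minimal per-message latency harness (a sketch; processMessage() is a
 * hypothetical stand-in for the application's real message path).
 */
public class LatencyProbe {

    private long totalMicros = 0;
    private int messages = 0;

    public void onMessage(byte[] msg) {
        long start = System.nanoTime();
        processMessage(msg);                              // business logic under test
        long micros = TimeUnit.NANOSECONDS.toMicros(System.nanoTime() - start);
        totalMicros += micros;
        messages++;
    }

    public long averageMicros() {
        return messages == 0 ? 0 : totalMicros / messages;
    }

    private void processMessage(byte[] msg) {
        // placeholder for the application's business logic
    }
}
```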
Oracle's strategy is to merge JRockit and Hotspot starting with Java 7 (which is why JRockit 6 is the latest JRockit available). Does anyone have any ideas why the performance is so much worse? (It should be noted that the code was compiled under Java 1.6. I am not sure whether that explains it...)
UPDATE: I voted to close my own question, because from the comments I can see that I cannot provide enough information to make this a constructive question. Thanks to everyone who commented.
UPDATE: Now that I am back, I thought I would provide more information. The test is always run after a fresh start. All factors are identical for each test; the only thing that changes is the JVM. Repeating the test several times gives consistent results. There was no GC during any test iteration.
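One way the "no GC" claim can be checked (a sketch, not part of the original test) is to sample the collection counts exposed by the standard `java.lang.management` API before and after the measured window; running with `-verbose:gc` and confirming the log stays quiet is an equivalent alternative:

```java
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;

/**
 * Sketch: verify that no GC ran during a test iteration by sampling the
 * collection counts before and after the measured window.
 */
public class GcCheck {

    public static long totalCollections() {
        long total = 0;
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            total += gc.getCollectionCount();
        }
        return total;
    }

    public static void main(String[] args) {
        long before = totalCollections();
        // ... run the measured test iteration here ...
        long after = totalCollections();
        System.out.println("GC cycles during test: " + (after - before));
    }
}
```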
Below are the plotted values from one of the test runs. For JRockit and Hotspot 7, the very first latency value stands out. JRockit has a huge first value, but then optimizes very quickly and settles around its average. Hotspot 7 takes longer to optimize and never drops down to the JRockit average. Each data point represents the microseconds taken to read a message from a TCP/IP socket, execute the business logic, and write a message to another socket. Every message is identical, and no new code paths are exercised for any message.
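Since those first values are dominated by JIT warm-up, one common approach (a suggestion, not something the original test did) is to push a batch of unmeasured warm-up messages through the same path before recording latencies; running with `-XX:+PrintCompilation` shows when compilation activity settles. The names `WARMUP_MESSAGES`, `MessageHandler`, and `handle()` below are illustrative, not the real code:

```java
/**
 * Warm-up sketch: drive the same message path a number of times before
 * measurement so the JIT has a chance to compile the hot methods.
 */
public class WarmupRunner {

    private static final int WARMUP_MESSAGES = 10000;

    public static void warmUp(MessageHandler handler, byte[] sampleMessage) {
        for (int i = 0; i < WARMUP_MESSAGES; i++) {
            handler.handle(sampleMessage);   // results discarded, only warms the JIT
        }
    }

    /** Hypothetical handler interface standing in for the real message path. */
    public interface MessageHandler {
        void handle(byte[] message);
    }
}
```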