Best way to measure runtime in automated regression tests

I have some code whose runtime I want to measure continuously as part of automated regression tests. The purpose is to warn me when a change to the code has a negative impact on performance.

In pseudo code, I want something like this:

    cpuTimer.start
    runTest
    cpuTimer.stop
    diff = cpuTimer.getDuration
    if diff > prevDiff   // perhaps to within a tolerance
        failTest

I am considering ThreadMXBean#getCurrentThreadCpuTime() for this, but the key problem is that the automated tests will run on a range of different developers' machines and will also run automatically on test servers with a range of different hardware and capabilities.
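As a rough illustration of the pseudocode above, here is a minimal sketch of what I have in mind. The runTest() method, the baseline, and the tolerance are placeholders; in practice the baseline would come from a previous run.

    import java.lang.management.ManagementFactory;
    import java.lang.management.ThreadMXBean;

    public class CpuTimeRegressionCheck {

        // Hypothetical baseline (nanoseconds) and tolerance; in practice these
        // would come from a previous run or a per-machine calibration.
        private static final long BASELINE_NANOS = 500_000_000L;
        private static final double TOLERANCE = 0.20;

        public static void main(String[] args) {
            ThreadMXBean bean = ManagementFactory.getThreadMXBean();
            if (!bean.isCurrentThreadCpuTimeSupported()) {
                throw new IllegalStateException("Thread CPU time not supported on this JVM");
            }

            long start = bean.getCurrentThreadCpuTime();
            runTest();                                            // the code under test (placeholder)
            long diff = bean.getCurrentThreadCpuTime() - start;   // CPU time used, in nanoseconds

            if (diff > BASELINE_NANOS * (1 + TOLERANCE)) {
                throw new AssertionError("Performance regression: took " + diff + " ns");
            }
        }

        private static void runTest() {
            // Placeholder workload.
            long sum = 0;
            for (int i = 0; i < 10_000_000; i++) {
                sum += i;
            }
        }
    }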

Will this work, or will the numbers be wrong?

How should I solve this problem? Is there a better way? Is there a standard tool for this kind of caper?

java performance testing automated-tests cpu-usage
2 answers

You could take a look at Perf4J. I haven't used it yet, but it's on my list of things to investigate.

From the home page:

    Perf4J is to System.currentTimeMillis() as log4j is to System.out.println()

The developer guide is a great introduction.
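For reference, the basic pattern from the developer guide looks roughly like this (a minimal sketch; the tag name and the timed block are placeholders):

    import org.perf4j.StopWatch;
    import org.perf4j.LoggingStopWatch;

    public class Perf4jExample {
        public static void main(String[] args) {
            // LoggingStopWatch writes the timing result when stop() is called
            // (to System.err by default, or to a configured logger).
            StopWatch stopWatch = new LoggingStopWatch();

            runTest();   // placeholder for the code being timed

            // The tag groups related timings so they can be aggregated later.
            stopWatch.stop("regressionTest.runTest");
        }

        private static void runTest() {
            // ... code under test ...
        }
    }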


One approach is to parse the standard XML report files that JUnit produces; these are generated under both Maven and Ant and contain all the timing data you need. There are tools for generating reports from them, but I'm not aware of any that aggregate results across builds or watch for regressions.

If you put the results in a database table with hostname, testname, testtime, and buildNumber columns, you can accomplish most of what you need.
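A minimal sketch of the parsing step, assuming the standard TEST-*.xml report format (testcase elements with classname, name, and time attributes); the report path is a placeholder, and the database insert is left as a comment:

    import java.io.File;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;

    public class JUnitReportParser {
        public static void main(String[] args) throws Exception {
            // Path to a JUnit XML report (placeholder); Maven's Surefire plugin
            // writes these to target/surefire-reports, Ant to the dir given to <junit>.
            File report = new File("target/surefire-reports/TEST-com.example.MyTest.xml");

            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(report);

            NodeList testCases = doc.getElementsByTagName("testcase");
            for (int i = 0; i < testCases.getLength(); i++) {
                Element testCase = (Element) testCases.item(i);
                String testName = testCase.getAttribute("classname") + "." + testCase.getAttribute("name");
                double seconds = Double.parseDouble(testCase.getAttribute("time"));

                // Here you would insert (hostname, testName, seconds, buildNumber)
                // into the database table described above.
                System.out.printf("%s took %.3f s%n", testName, seconds);
            }
        }
    }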

