
8.1 Introduction

Performance issues inevitably sneak into a project, and tracking them down is difficult without the proper tools. Commercial performance-monitoring tools, such as JProbe or OptimizeIt, help pinpoint performance problems. They excel at providing performance metrics, but they typically require expert human intervention to run and to interpret the results, and they are not designed to execute automatically as part of a continuous integration process. This is where JUnitPerf enters the picture.

JUnitPerf, available from http://www.clarkware.com/software/JUnitPerf.html, is a tool for continuous performance testing. JUnitPerf transparently wraps, or decorates, existing JUnit tests without affecting the original test.[1] Remember that JUnit tests should execute quickly; JUnitPerf builds on this by imposing explicit timing constraints on those tests. Figure 8-1 shows the UML diagram for the JUnitPerf TimedTest.

[1] For more information on the decorator pattern, see Design Patterns: Elements of Reusable Object-Oriented Software (Addison-Wesley) by Erich Gamma et al.
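As a quick sketch of the decoration idea, an existing test can be wrapped without changing the test itself. ExampleTest and its testResponse method are hypothetical placeholders; the TimedTest constructor takes the test to decorate and a maximum elapsed time in milliseconds:

    import com.clarkware.junitperf.TimedTest;
    import junit.framework.Test;

    public class ExampleTimedTest {
        public static Test suite() {
            // ExampleTest is a hypothetical existing TestCase subclass.
            // The decorated test fails if it runs longer than 2000 ms.
            return new TimedTest(new ExampleTest("testResponse"), 2000);
        }
    }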

JUnitPerf tests can (and should) be executed separately from normal JUnit tests. This approach ensures that the overall execution of JUnit tests isn't hindered by the additional time spent executing JUnitPerf tests.
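One way to keep them separate is a dedicated suite containing only the decorated tests. This sketch reuses the hypothetical ExampleTest from above; the suite can then be run on its own with the standard JUnit text runner:

    import com.clarkware.junitperf.TimedTest;
    import junit.framework.Test;
    import junit.framework.TestSuite;

    public class PerformanceTests {
        public static Test suite() {
            TestSuite suite = new TestSuite("JUnitPerf tests");
            // Each decorated test must finish within its own time limit.
            suite.addTest(new TimedTest(new ExampleTest("testResponse"), 2000));
            suite.addTest(new TimedTest(new ExampleTest("testSearch"), 5000));
            return suite;
        }

        public static void main(String[] args) {
            // Run apart from the main unit-test suite.
            junit.textui.TestRunner.run(suite());
        }
    }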

Figure 8-1. JUnitPerf UML diagram [figs/jexp_0801.gif]

Here's a quick overview of how a JUnitPerf timed test works. The following occurs when the JUnitPerf TimedTest.run(TestResult) method is invoked (a simplified code sketch follows these steps):

  1. Retrieve the current time (before JUnit test execution).

  2. Call super.run(TestResult) to run the JUnit test, where super refers to the JUnit TestDecorator.

  3. Retrieve the current time (after JUnit test execution).

  4. If the elapsed time exceeds the maximum allowed time, the test fails with a junit.framework.AssertionFailedError(String). Otherwise, the test passes.
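Pictured as code, the sequence looks roughly like the following. This is a simplified sketch, not the actual JUnitPerf source (the real TimedTest, for instance, also lets you choose whether to wait for the test to complete before checking the time); SimpleTimedTest is a hypothetical name, and the failure is reported through the TestResult, which is how JUnit records failures:

    import junit.extensions.TestDecorator;
    import junit.framework.AssertionFailedError;
    import junit.framework.Test;
    import junit.framework.TestResult;

    public class SimpleTimedTest extends TestDecorator {
        private final long maxElapsedTime;

        public SimpleTimedTest(Test test, long maxElapsedTime) {
            super(test);
            this.maxElapsedTime = maxElapsedTime;
        }

        public void run(TestResult result) {
            long begin = System.currentTimeMillis();            // 1. time before execution
            super.run(result);                                  // 2. run the decorated test
            long elapsed = System.currentTimeMillis() - begin;  // 3. time after execution
            if (elapsed > maxElapsedTime) {                     // 4. enforce the limit
                result.addFailure(getTest(), new AssertionFailedError(
                    "Maximum elapsed time exceeded: " + elapsed + " ms"));
            }
        }
    }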
