We are getting close enough to the 3.6 release that I’m able to talk more about the new features – so look for more posts in the next few days. I blogged previously about the improved performance goal features, but at the time, I could not detail how they would be used to improve Load Tester’s analysis reports.
The ability to set global page and transaction goals and override them individually for each page or transaction is a nice feature, as is the appearance of the goals on the performance charts. But the real value comes from Load Tester automatically analyzing the test results against the performance goals and using that analysis to highlight performance problems and estimate system capacity. I will blog about estimating system capacity next week – today I want to show you some of the options available for analyzing performance against the goals.
First, here is an example of the results of the analysis for a single page:
In this example, the goals were analyzed using four methods: Average Page Duration, Average Wait Time, Maximum Duration, and 95th Percentile. As you might expect, goal failures are reported at different load levels depending on the analysis method used. This gives the tester the flexibility to specify a wide variety of requirements and still get accurate results. There are more analysis methods available in the report settings:
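To give a rough sense of why different methods can fail at different load levels, here is a minimal sketch of this kind of analysis. It is not Load Tester's actual implementation – the function names, data shapes, and threshold are all illustrative assumptions. Given page durations grouped by load level and a goal in seconds, it reports the first level at which each method exceeds the goal:

```python
import math

def percentile(samples, pct):
    """Nearest-rank percentile of a list of numbers."""
    ordered = sorted(samples)
    k = max(0, min(len(ordered) - 1, math.ceil(pct / 100 * len(ordered)) - 1))
    return ordered[k]

def analyze_goal(durations_by_level, goal_seconds):
    """Return the first load level at which each analysis method fails the goal.

    durations_by_level: {user_level: [page durations in seconds]} (hypothetical shape)
    """
    methods = {
        "Average Duration": lambda d: sum(d) / len(d),
        "Maximum Duration": max,
        "95th Percentile": lambda d: percentile(d, 95),
    }
    failures = {}
    for name, measure in methods.items():
        for level in sorted(durations_by_level):
            if measure(durations_by_level[level]) > goal_seconds:
                failures[name] = level  # first level where this method fails
                break
    return failures

# Illustrative data: one slow outlier at 50 users trips the max and 95th
# percentile checks before the average does.
durations = {25: [1.0, 1.2, 1.1], 50: [2.0, 2.5, 2.2], 75: [4.0, 5.0, 4.5]}
result = analyze_goal(durations, goal_seconds=2.3)
# → {'Average Duration': 75, 'Maximum Duration': 50, '95th Percentile': 50}
```

The point of the sketch is the divergence: an average-based goal tolerates occasional slow hits, while maximum- or percentile-based goals flag them earlier.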
We know your time is valuable and you don't want to wade through every page of the report looking for the pages that failed, so we added a Performance Goals section to the report. It shows which pages failed their goals at each user level, as seen in this excerpt:
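Conceptually, a summary like this is just an inversion of the per-page results: instead of asking "at what level did this page fail?", it asks "which pages failed at this level?". A minimal sketch, with hypothetical page names and data:

```python
def failures_by_level(page_failures):
    """Invert {page: first failing user level} into {user level: [pages]}.

    page_failures is an assumed data shape, not Load Tester's actual model.
    """
    summary = {}
    for page, level in sorted(page_failures.items()):
        summary.setdefault(level, []).append(page)
    return summary

report = failures_by_level({"Login": 50, "Search": 75, "Checkout": 50})
# → {50: ['Checkout', 'Login'], 75: ['Search']}
```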
Load Tester is the first product to offer this level of analysis capability, and it has already proven very useful in our load testing services. We are very excited to make it available in the Load Tester 3.6 release.
Until then, you can see these in action and learn more about them in our Performance Goals screencast.
When his dad brought home a Commodore PET computer, Chris was drawn to computers. Seven years later, after finishing his degree in Computer and Electrical Engineering at Purdue University, he found himself writing software for industrial control systems. His first foray into testing software resulted in an innovative control system for testing lubricants in automotive engines. The Internet grabbed his attention and he became one of the first Sun Certified Java Developers. His focus then locked on performance testing of websites. As Chief Engineer for Web Performance since 2001, Chris now spends his time turning real-world testing challenges into new features for the Load Tester product.