Session Title: Interpreting and Reporting Performance Test Results
Speaker: Dan Downing, Principal Consultant, Mentora Group
Date: Tuesday, December 11, 2012
Time: 12:30 PM – 1:30 PM PT
You’ve worked hard to define, develop, and execute a performance test on a new application to determine its behavior under load. Your initial test results have filled a couple of 55-gallon drums with numbers. What next? Crank out a standard report from your testing tool, send it out, and call yourself done? Not a chance. Results interpretation is where a performance tester earns their stripes.
During this session we’ll start by looking at results from actual projects and together puzzle out the essential message in each. We will form hypotheses, draw tentative conclusions, determine what further information we need to confirm them, and identify the key target graphs that give us the best insight into system performance and bottlenecks.
Next, we will codify the analytic steps we went through, confirm our hypotheses, and finally report the results. The process is best summarized in the CAVIAR approach to collecting and evaluating performance test results.
We will also discuss an approach for reporting results in a clear and compelling manner, with data-supported observations, conclusions drawn from these observations, and actionable recommendations. A link will be provided to the reporting template that you can adopt or adapt to your own context.
About Dan Downing
Dan Downing is co-founder and Principal Consultant at Mentora Group, Inc. (www.mentora.com), a testing and managed hosting company. Dan is the author of the 5 Steps of Load Testing course, which he taught at Mercury Education Centers, and of numerous presentations, white papers, and articles on performance testing. He teaches load testing, and over the past 13 years he has led hundreds of performance projects on applications ranging from eCommerce to ERP, for companies ranging from startups to global enterprises. He is a regular presenter at STAR, HP Software Universe, and Software Test Professionals conferences, and is one of the organizers of the Workshop on Performance and Reliability (WOPR).