F12 – 303: Interpreting and Reporting Performance Test Results (Part 1)

Track: Performance Testing

You’ve worked hard to define, develop, and execute a performance test on a new application to determine its behavior under load. Your initial test results have filled a couple of 55-gallon drums with numbers. What next? Crank out a standard report from your testing tool, send it out, and call yourself done? NOT. Results interpretation is where a performance tester really earns their stripes.

In the first half of this double session we’ll start by looking at results from actual projects and together puzzle out the essential message in each. This will be a highly interactive session: I will display a graph, provide a little context, and ask, “What do you see here?” We will form hypotheses, draw tentative conclusions, determine what further information we need to confirm them, and identify the key graphs that give the best insight into system performance and bottlenecks. Feel free to bring your own sample results (on a thumb drive so I can load and display them) for the group to interpret together!


Download Presentation
Please note: The presentations are intended for attendees only. The presentations page is password-protected; contact info@softwaretestpro.com for verification of attendance and the password to access the presentations.


Session Speaker:

Dan Downing – Principal Consultant, Mentora Group
Dan Downing is co-founder and Principal Consultant at Mentora Group, Inc. (http://www.mentora.com), a testing and managed hosting company. Dan is the author of the 5-Steps of Load Testing, which he taught at Mercury Education Centers, and of numerous presentations, white papers, and articles on performance testing. He teaches load testing, and over the past 13 years he has led hundreds of performance projects on applications ranging from eCommerce to ERP, for companies ranging from startups to global enterprises. He is a regular presenter at STAR, HP Software Universe, and Software Test Professionals conferences, and is one of the organizers of the Workshop on Performance and Reliability (WOPR).