F15-10/6: WORKSHOP: Interpreting and Reporting Performance Test Results

Track/s: Performance

You’ve worked hard to define, develop, and execute a performance test on a new application to determine its behavior under load. You have barrels full of numbers. What’s next? The answer is definitely not to generate and send a canned report from your testing tool. Results interpretation and reporting are where a performance tester earns their stripes.

In the first half of this workshop we’ll start by looking at some results from actual projects and together puzzle out the essential message in each. This will be a highly interactive session where we will display a graph, provide a little context, and ask “what do you see here?” We will form hypotheses, draw tentative conclusions, determine what further information we need to confirm them, and identify key target graphs that give us the best insight into system performance and bottlenecks.

Please bring your own sample results (on a thumb drive so that we can load and display) and we’ll interpret them together!

In the second half of this session, we’ll try to codify the analytic steps we went through in the first half, and consider a CAVIAR approach for collecting and evaluating test results: Collecting, Aggregating, Visualizing, Interpreting, Analyzing, and Reporting.
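To give a flavor of the Aggregating step, here is a minimal sketch (not part of the workshop materials) that rolls raw per-request samples up into per-transaction response-time percentiles, the kind of aggregate behind the graphs we’ll examine in the first half. The CSV file name and column names (transaction, response_time_ms) are assumptions made purely for illustration.

    # Illustrative sketch only: aggregate raw samples from a hypothetical
    # results.csv with columns "transaction" and "response_time_ms".
    import csv
    from collections import defaultdict
    from statistics import mean, quantiles

    def aggregate(path):
        """Collect raw samples and summarize them per transaction."""
        samples = defaultdict(list)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                samples[row["transaction"]].append(float(row["response_time_ms"]))

        summary = {}
        for txn, times in samples.items():
            pct = quantiles(times, n=100)      # 1st..99th percentile cut points
            summary[txn] = {
                "count": len(times),
                "avg_ms": round(mean(times), 1),
                "p90_ms": round(pct[89], 1),   # 90th percentile
                "p95_ms": round(pct[94], 1),   # 95th percentile
                "max_ms": round(max(times), 1),
            }
        return summary

    if __name__ == "__main__":
        for txn, stats in aggregate("results.csv").items():
            print(txn, stats)

The point of the sketch is the shape of the analysis, not the tooling: whatever your load tool exports, the Interpreting and Analyzing steps start from per-transaction aggregates like these rather than from raw samples or a canned report.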

Workshop Takeaways:

  • Training in interpreting results: Data + Analysis = Information
  • Examples of telling performance test graphs
  • Advice on Reporting: compel action with your information
  • Interpretation of your results – please bring them along!


Download Presentation

Please Note: The presentations are intended for attendees only. The presentations page is password protected – contact info@softwaretestpro.com for verification of attendance and the password to access the presentation.


Session Speaker:

Dan Downing – Performance Testing Consultant, Independent
A pioneer in performance testing when it was an arcane science, Dan developed the 5-Steps of Load Testing, which was taught at Mercury Interactive (creator of LoadRunner) education centers. He is known for having turned performance testing into a business, co-founding Mentora in 2001; the company was acquired by a $1B IT services company (Forsythe) in 2012, from which he recently retired as a full-time employee. Dan has led hundreds of performance testing projects for startups and large enterprises, has published numerous papers and articles on performance testing, and is recognized as a dynamic presenter and workshop leader at testing conferences such as STAR, STP, and Dynatrace. He is also one of the organizers of the Workshop on Performance and Reliability (WOPR).

Speaker Details:

Dan Downing – Performance Testing Consultant, Independent
LinkedIn: Dan Downing
Past Events: STPCon


Eric Proegler – Performance Engineer, SOASTA
Eric Proegler has worked in testing for 16 years and has specialized in performance and reliability testing for 13. He works for SOASTA in Mountain View, California.

Eric is an organizer for WOPR, the Workshop on Performance and Reliability, and a Community Advisory Board member for STPCon. He’s presented and facilitated at Agile2015, CAST, WOPR, PNSQC, STPCon, and STiFS. Eric has recently started engaging with tester certification and testing standards to help our community best respond to these economic tactics.

In his free time, Eric spends time with family, reads, sees a lot of stand-up comedy, seeks out street food from all over, plays video games, and follows professional basketball.

Speaker Details:

Eric Proegler – Performance Engineer, SOASTA
Twitter: @ericproegler
LinkedIn: Eric Proegler
Website: www.SOASTA.com
Blog: ContextDrivenPerformanceTesting.com
Past Events: STPCon, WOPR, CAST