Track: Performance Testing
Operating a scripting/playback tool is only one step in performance testing. The real value comes from analyzing and interpreting the barrels of numbers collected during a test. Analysis turns data into information; skillful interpretation of that information yields hypotheses to test, conclusions to draw, and results you can act on. These are the real value of the performance test, and of the tester. Producing these results requires insightful advance planning and test design, the critical business-analyst side of performance testing that makes it so challenging.
In this two-part session we will start with results: what they mean and where they come from. Focusing on the patterns in results that reveal the information we are testing for, we will describe how to generate, collect, correlate, and interpret actionable information before and during performance tests.
We’ll also share with you an approach to reporting results in a clear and compelling manner, with data-supported observations, conclusions drawn from these observations, and actionable recommendations.
Based on feedback from earlier versions of this presentation-workshop, we feel certain that you will take home at least one new idea that will make you a more effective performance tester. Come prepared to participate actively!
Please Note: The presentations are intended for attendees only. The presentations page is password-protected; contact email@example.com for verification of attendance and the password to access the presentation.
Dan Downing – Principal Consultant, Mentora Group
Dan Downing is co-founder and Principal Consultant at Mentora Group, Inc. (www.mentora.com), a testing and managed hosting company. Dan is the author of the 5-Steps of Load Testing, which he taught at Mercury Education Centers, as well as numerous presentations, white papers, and articles on performance testing. He teaches load testing and over the past 13 years has led hundreds of performance projects, on applications ranging from eCommerce to ERP and for companies ranging from startups to global enterprises. He is a regular presenter at STAR, HP Software Universe, and Software Test Professionals conferences, and is one of the organizers of the Workshop on Performance and Reliability (WOPR).
Eric Proegler – Senior Performance Engineer, Mentora Group, Inc.
Eric Proegler is a Senior Performance Engineer at Mentora Group, Inc. (www.mentora.com), a testing and managed hosting company. Before that, he led performance engineering for a software vendor, worked in software testing, and consulted on performance testing, hardware sizing, and other aspects of engineering and deploying software solutions.