Performance testing is often forgotten or forsaken by both project teams and business partners. The primary reason is that, traditionally, performance was never a concern until something happened, such as a poor customer experience. Couple this with the fact that it has traditionally been difficult to performance test an application until it is stable, and software often isn't stable until after it is released. These factors, among others, diminish the perceived value of performance testing. Most large-scale enterprise organizations have performance testing practices, while small and mid-size companies rarely engage in performance testing at all; even then, most companies only embark on it after encountering a significant performance issue in production. This afterthought approach is pervasive in the testing community, and it creates challenges for properly implementing performance testing objectives, and even for gaining management's acceptance of the need for performance testing. In this session we will discuss how to build a business case to support the implementation or expansion of performance testing in your organization. Along the way, we will cover aspects of performance testing that make it a first-class citizen within the world of testing.
Through real-world case studies, we will demonstrate:
- The difference between performance testing and performance engineering
- Why performance issues are worse than production outages
- How performance testing positively impacts revenue, operational expenses, and employee productivity
- Typical user expectations of application performance
Please note: The presentations are intended for attendees only. The presentations page is password protected; contact firstname.lastname@example.org to verify attendance and receive the password to access the presentation.
Mark Tomlinson – President, West Evergreen Consulting, LLC
Mark Tomlinson is a performance engineering and software testing consultant. His career began in 1992 with a comprehensive two-year test for a life-critical transportation system, a project that captured his interest in software testing, quality assurance, and test automation. That first test project sought to prevent trains from running into each other, and Mark has metaphorically been preventing "train wrecks" for his customers for the past 20 years. He has broad experience with real-world scenario testing of large and complex systems and is regarded as a leading expert in software testing automation with a specific emphasis on performance. For the majority of Mark's career, he has worked for companies as a testing practitioner and consultant using the leading products for performance testing, profiling, and measurement, and he has consistently established close ties with the major vendors who create these tools. Mark worked for six years at Microsoft Corporation as a performance consultant and engineer in the Microsoft Services Labs, in the Enterprise Engineering Center, and in the SQL Server labs. His efforts to foster the success of Microsoft's top-tier enterprise customers were focused on their early adoption of Microsoft products as part of mission-critical operations. In 2008, as the LoadRunner Product Manager at HP Software, Mark led the team to deliver leading innovations for performance testing and engineering as part of HP's suite of performance validation and management products. Mark now offers coaching, training, and consulting to help customers adopt modern performance testing and engineering strategies, practices, and behaviors for better-performing technology systems.