Alex Podelko

Twitter: @apodelko

LinkedIn: alexander

Website: alexanderpodelko.com

Blog: alexanderpodelko/blog

Alex Podelko has specialized in performance since 1997, working as a performance engineer and architect for several companies. Currently he is a Consulting Member of Technical Staff at Oracle, responsible for performance testing and optimization of Enterprise Performance Management and Business Intelligence (a.k.a. Hyperion) products. Alex periodically talks and writes about performance-related topics, advocating for tearing down the silo walls between different groups of performance professionals. His collection of performance-related links and documents (including his recent papers and presentations) can be found at alexanderpodelko.com.

Alex currently serves as a director for the Computer Measurement Group, an organization of performance and capacity planning professionals.


Session - Reinventing Performance Testing

July 31, 12-1 PM

The industry is transforming before our eyes, and performance testing should change with it. The stereotypical last-minute performance validation in a test lab, using an expensive record-and-playback load testing tool, is no longer enough. Many factors are driving the change, such as:

• Cloud has practically eliminated the lack of appropriate hardware as a reason for not doing load testing, decreased the cost of large-scale tests, and significantly increased the number of configuration options.

• Agile development eliminates the primary problem of traditional development: needing a complete working system before you can test it. With agile development we get a major “shift left”, allowing us to start performance testing early. But that blessing brings the need for both automated and exploratory performance testing, both requiring a new mindset and another set of skills and tools [and not many are available yet].

• Continuous Integration brings the need for performance testing to be interwoven with the whole development process. There are no easy answers that fit every situation, and integration of performance testing with CI is just taking its first steps (a minimal sketch of one possible CI performance gate follows this list).

• New Architectures - dynamic, auto-scaling architectures, often heavily using third-party components - are changing the way we must design tests and collect and analyze performance information.

• New Technologies may require other ways to generate load. In addition to traditional protocol-level record/playback tools, we may use UI-level record/playback or programming (a small example of the programming approach also follows this list).
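
As an illustration of what such a CI performance gate could look like, here is a minimal sketch: a short burst of requests against a freshly deployed build, with the build failing if the 95th-percentile response time exceeds a budget. The endpoint, request counts, and budget below are hypothetical placeholders, and nothing beyond the Python standard library is used.

    # Sketch of a CI performance gate: run a short smoke load and fail the
    # build (non-zero exit code) if the 95th-percentile response time is
    # over budget. All values below are hypothetical placeholders.
    import statistics
    import sys
    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    TARGET_URL = "http://localhost:8080/health"  # endpoint of the deployed build
    REQUESTS = 200                               # a smoke load, not a full test
    CONCURRENCY = 10
    P95_BUDGET_SECONDS = 0.5                     # agreed performance budget

    def timed_request(_):
        # Issue one request and return its response time in seconds.
        start = time.perf_counter()
        with urllib.request.urlopen(TARGET_URL, timeout=10) as resp:
            resp.read()
        return time.perf_counter() - start

    def main():
        with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
            timings = list(pool.map(timed_request, range(REQUESTS)))
        p95 = statistics.quantiles(timings, n=20)[18]  # 95th percentile
        print(f"p95={p95:.3f}s budget={P95_BUDGET_SECONDS:.3f}s")
        # The exit code decides whether the CI pipeline proceeds.
        sys.exit(0 if p95 <= P95_BUDGET_SECONDS else 1)

    if __name__ == "__main__":
        main()

In a pipeline this would run as one step after deploying the build to a test environment; a failing exit code stops the pipeline just like a failing functional test.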
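
And as an illustration of the programming option for load generation, here is a sketch of a scripted user scenario, using the open-source Locust tool as one possible example; the scenario, endpoints, weights, and pacing are hypothetical.

    # Load generation by programming rather than record/playback: the virtual
    # user's behavior is expressed directly in code (Locust-style sketch;
    # the scenario, endpoints, and weights are hypothetical).
    from locust import HttpUser, task, between

    class ReportViewer(HttpUser):
        # Simulated think time between actions of one virtual user.
        wait_time = between(1, 5)

        @task(3)
        def open_dashboard(self):
            # Weighted 3x: most of the time a user is looking at the dashboard.
            self.client.get("/dashboard")

        @task(1)
        def run_report(self):
            # In a real script the report parameters would be varied per user.
            self.client.post("/reports/run", json={"report_id": 42})

Such a script can be run headless from the command line, ramping up to the desired number of virtual users for a set duration, which also makes it easy to call from a CI job.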

Performance testing is trailing behind these changes and, unfortunately, there is no simple way to catch up. I am going to analyze all the factors above in more depth, illustrating the challenges they pose and possible ways to address them. I don’t have ready recipes here, and it may be a long while before we get any. We practically need to re-invent performance testing to match the challenges: traditional approaches alone no longer cover the need. And the need is still there: in many areas performance testing remains the only way to assure the performance and reliability of a system [before you run into problems in production].
