Josiah: All right. Today, I'm joined by Andy Grabner, a technology strategist at Compuware APM. Andy, thank you very much for joining us today.
Andreas: You're welcome. I'm happy to be here.
Josiah: Great! First off, could you tell us just a bit about your experience in the industry?
Andreas: Sure. I've been in the performance industry for the last fifteen years. I may not look that old, but I actually started my career with a company called Segue Software. Back then, we built load-testing tools like Silk Performer, which a lot of people may know.
I then switched over to other products like Silk Test, which was on the functional testing side. Seven years ago, I joined a company called DynaTrace, which is now part of Compuware APM and which I'm still working for. Basically, the problem we try to solve is this: instead of breaking applications with load-testing tools, we wanted to figure out what's wrong in those applications when they break.
Actually, I followed one of my former Segue colleagues, who went on to found DynaTrace. I followed him because he built DynaTrace and I thought it would be a perfect fit. We had been breaking applications for so long, and we had built up a lot of expertise in knowing which metrics to look at and how to do load testing right.
Then we wanted to know: what do we need to look at, and what do we need to tell our customers to look for within the application? Why might it break, and what did they get wrong in their architectural decisions?
Overall, I've been in the industry for almost fifteen years now and have held many different roles. I started as a tester, and I've been an architect, a developer, and a product manager. Now I'm working as what we call a technology strategist: I try to share my knowledge with people out there through my blog and at the conferences where I'm speaking this year. We also work with customers to make them successful when it comes to performance management.
Josiah: You'll be talking at the upcoming STARWEST event in Anaheim. Your talk is called "Checking Performance along Your Build Pipeline," and a lot of what you'll be discussing is the impact of small changes along the process.
Why do you think people tend to ignore the impact small changes can have on performance and scalability?
Andreas: Developers like to build new features. Obviously, they're under big pressure to come out with new features more frequently. I think there's still this thought of, "Hey! In the end, there's somebody who is testing my software anyway. I'd rather spend my time focusing on what I'm paid for, which is basically building new features, and somebody later down the pipe will take care of load testing and tell me if anything is wrong."
Unfortunately, with the way we develop software, the longer you wait, the longer it takes until you actually get into the testing phase, and more of these small changes add up to a bigger problem.
The real reason is this mindset of, "Hey, my role is developing, and I'm developing new features. Your role is testing, so you're going to test it and then tell me what's wrong." I think that's a mentality we need to change, and we need to educate people more. It's not that testing happens only at the end. Testing needs to be continuous, and it has to keep up with development.