Automation in Sports Video Game Testing: An Interview with Fazeel Gareeboo


In video game development, the pressure is on to add more features to sell more games. Getting buy-in to automate testing can be a challenge. Fazeel discusses the sports video game development environment and how to convince product owners to spend precious resources on automated testing.

Fazeel Gareeboo will present "Game On: Automating Sports Video Game Testing" at STAREAST 2014, which will take place May 4–9, 2014.

About "Game On: Automating Sports Video Game Testing":

Sports video games are generally on a short cycle time—tied to the start of a particular sport’s season. Like all video games, the pressure is always on to add more features to sell more games, and the list of "cool" features is endless. Getting buy-in to implement automated testing in this environment can be a challenge. And once you get that buy-in, your next challenge is to ensure it provides significant value to the game team. Fazeel Gareeboo shares the lessons they learned at EA Sports—lessons you can take back to your project. Fazeel describes the sports video game development environment and discusses how you can overcome the reluctance of product owners to spend precious resources on implementing automated testing. Fazeel addresses how to get buy-in from the team and finally how to make your product—game or not—more successful using automated testing.


Cameron Philipp-Edmonds: So today we have Fazeel Gareeboo, and he will be speaking at STAREAST 2014, which is May 4 through 9. He will be doing a presentation titled "Game On: Automating Sports Video Game Testing." Fazeel is a software development director at EA Sports Tiburon. Fazeel manages a team that provides automated game testing for the studio. Before moving into management, he developed computer-aided design software and wrote device drivers for Windows and MicroStation. Fazeel has a special interest in automating any repetitive work and in creating great teams for the kinds of work that computers cannot do yet. He grew up on the island of Mauritius and worked in Europe before settling in the United States. Anything to add to that?

Fazeel Gareeboo: No, that sounds good.

Cameron: OK. Because you’re doing a session titled "Game On: Automating Sports Video Game Testing," which covers the world of testing as it relates to video games, I would like to ask you some related questions. So, the first question is: What makes testing sports video games different than testing any other video games, or any other software, for that matter?

Fazeel: I would say the big difference between testing games and most other software is that with most other software, there is a fixed set of functionality you're writing to, while with games, it's generally speculative. What you add to the game is your judgment of what is going to make the product sell. Most teams tend to put in as many features, as many bells and whistles, as they can, because they obviously want to sell the game. So the bias tends to be toward more features and more bells and whistles. And I think that creates a challenge for testing, because you always have a limited amount of time. So the focus is, "Well, let's focus on getting the features in, because that's what's going to sell the game."

Cameron: Right. So it's more about how it looks and the appearance of everything as opposed to how it actually runs.

Fazeel: And the functionality. The focus is pushed toward "Let's make the game. Let's have the features. Let's make it something fun," before anyone looks at testing. Testing tends to get pushed toward the end, unlike the traditional "build it, test it, release it" model.

Cameron: And I imagine that with sports video games, because of the way sports seasons run, there would be a stricter timeline for testing.

Fazeel: Yeah, that's exactly right, and it's another constraint added to sports video games. It's typically about a nine-month turnaround time. We don't really have twelve months, because we're dealing with a release, then you have some downtime, and then you're ready for the next one. So it's more like nine months of development time that you have. And, yeah, you typically can't push the release date.

Cameron: And you're a key proponent of automated testing, and you even discuss in your session how to overcome the reluctance of product owners to spend precious resources on implementing automated testing. So, what about automated testing is so appealing to you?
