David Dang, with Zenergy Technologies Inc., led the engaging session "Mobile Testing Tools 101." There were some technical issues at the start and the screen wasn't working, but David jumped right in and began his presentation without the aid of his slides. He was quite funny and asked the audience several questions, requesting that people raise their hands or offer answers and opinions, and the attendees seemed to appreciate being involved. His speaking style made the session feel more like a one-on-one conversation, as if you were having a fun chat with someone who happens to know a lot about mobile testing tools.
David started by talking about the challenges of mobile testing, including verifying data, functionality, location-based features, and user experience. He also stressed how much diligent testing matters: David said the regression failure rate from a best-practices standpoint is less than 1 percent, so trying to achieve that rate requires lots of diligent testing. One major challenge discussed is that the mobile realm uses so many different operating systems. Depending on what mobile device you're testing for, you may have to deal with iOS, Android, Windows, or Symbian, each of which will require different test cases. On top of that, each operating system has many versions and upgrades.
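The OS-and-version fragmentation David describes is often managed with a test matrix that expands every combination to be covered. Here's a minimal sketch in Python; the platforms, version strings, and test-case names are illustrative placeholders, not anything from the session:

```python
from itertools import product

# Hypothetical platform/version coverage; in practice this would be
# driven by the devices and OS versions your users actually run.
operating_systems = {
    "iOS": ["16.4", "17.0"],
    "Android": ["13", "14"],
    "Windows": ["10 Mobile"],
}

test_cases = ["login", "search", "checkout"]

def build_matrix(os_versions, cases):
    """Expand every (OS, version, test case) combination into a run list."""
    runs = []
    for os_name, versions in os_versions.items():
        for version, case in product(versions, cases):
            runs.append((os_name, version, case))
    return runs

matrix = build_matrix(operating_systems, test_cases)
print(len(matrix))  # 5 OS versions x 3 test cases = 15 runs
```

Even this toy example shows why the fragmentation hurts: adding one OS version multiplies the run count by the number of test cases.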
David's first tip is to establish manual testing strategies. He recommends using agile methods, and when he polled the audience, most of those who raised their hands to indicate that they work in mobile testing also raised their hands to show that they use agile practices.
He also highlighted the test cases that are required with mobile devices that you never need to test for with desktop or laptop computers. For instance, the smartphones and tablets we use have touchscreen capabilities that need to be checked, such as the swiping, scrolling, and pinching functions. Mobile devices also need to be turned 90 degrees to test how images render in landscape rather than portrait orientation. (Try doing that with a desktop computer.)
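A rotation test case like the one David mentions boils down to checking that the rendered dimensions swap when the device turns 90 degrees. A minimal sketch, with a hypothetical `Screen` class standing in for a real device or simulator API:

```python
from dataclasses import dataclass

@dataclass
class Screen:
    width: int
    height: int

    def rotate(self) -> "Screen":
        """Simulate a 90-degree rotation: width and height swap."""
        return Screen(self.height, self.width)

def is_landscape(screen: Screen) -> bool:
    """Landscape orientation means the width exceeds the height."""
    return screen.width > screen.height

portrait = Screen(1080, 2340)   # a typical phone held upright
landscape = portrait.rotate()

print(is_landscape(portrait))   # False
print(is_landscape(landscape))  # True
```

On a real project this check would come from a device-automation framework rather than a dataclass, but the assertion is the same: after rotation, images should render to the swapped dimensions.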
Speaking of picking stuff up and turning it over, David polled the audience again with the question, "Which is better, testing with the real device or with a simulator?" Answer: Both. Each testing method has its advantages, with the physical device tests letting you try out those touchscreen functions and the simulated tests letting you replicate and repeat test cases. He said if you use agile practices, you definitely need to use both methods of testing.
Another tip: If you're in mobile testing and aren't tracking analytics, you should start. David said the analytics for one company he did work for showed that of the mobile users going to their website, about 70 percent were Android users, and about 30 percent were iOS users. The company then wanted to do more testing and put more resources toward its Android functions. But after studying the second stage of analytics, it was discovered that of the mobile users actually buying things off the company's website, about 80 percent were iOS users and about 20 percent were Android users. So study all your analytics.
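The point of the anecdote is that the same platforms rank very differently depending on which metric you compute shares from. A quick sketch of that arithmetic, using made-up absolute counts consistent with the percentages David quoted:

```python
# Illustrative counts only: Android dominates traffic,
# iOS dominates purchases, so the two shares diverge.
visits = {"Android": 7000, "iOS": 3000}
purchases = {"Android": 200, "iOS": 800}

def share(counts):
    """Return each platform's percentage share of the total."""
    total = sum(counts.values())
    return {platform: round(100 * n / total) for platform, n in counts.items()}

traffic_share = share(visits)      # {'Android': 70, 'iOS': 30}
purchase_share = share(purchases)  # {'Android': 20, 'iOS': 80}
print(traffic_share, purchase_share)
```

Stopping at the first-stage numbers would have steered testing resources toward Android; the purchase-stage numbers point the other way, which is exactly why David says to study all your analytics.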
And finally, a tip offered in response to a delegate's question at the end of the session: When it comes to choosing a testing tool, David recommends trying it out with the salesperson, but with your app, not theirs. He said whatever app the salesperson shows you will always test perfectly and easily, so you should try it with something of your own to see how much work you'll realistically have to do day to day.
P.S. Be on the lookout for the tech company David's going to start whenever he decides to go into business for himself. He said he's going to call it Dang IT!, and his tagline is the best: "If you find yourself saying 'Dang it,' call Dang IT!"