Mobile Development and Aggressive Testing: An Interview with Josh Michaels

[interview]

JM: The first is that as I build I test aggressively, meaning that I try not to leave a feature until I've really fully evaluated it. Because I'm writing the code and testing it, I'm able to think concretely about the testing matrix and where there are likely to be problems and where there aren't. When I build a feature, I try as aggressively as I can to test it right then, as thoroughly as I possibly can as one individual.

Now, the challenge is that very often I'll release this stuff and people use it in ways that I don't even think about. Maybe they'll set a couple of settings and I'm like, why would you ever want setting A this way but settings B and C that way? Sure enough, people come up with a reason to do it, and that's where I can't test it all, because there are going to be lots of different ways to set things up that I don't even think of. That's where I really depend upon the fans and customers who really love the product and who function as a beta-testing group.

When people contact me who are like, "Hey, I love your app," I always like to offer up, "Hey, do you want to join the beta testing group?" Then when I've got a new release coming out, I'm able to contact them and say, "Hey, you can try it before anyone else." Fans love that. Fans love getting a chance to try it before anyone else. What that does for me is get a lot of people using it in different ways that aren't the ways I'm going to think about, that aren't the ways that are in my list of tests I have to do before I ship.

I would say I rely very heavily on the fans of the product, to whom I give very early copies of the app and who give me feedback on where there are problems, both from a technical point of view (bugs, things that aren't working) and from a usability point of view: "I couldn't figure this out." "You said this feature is there and I don't know where it is. Where did you put this feature?"

JV: Were these ever disgruntled fans, maybe, who had specific problems, like, "How dare this thing not show up at this point?"

JM: I would say half the time these interactions start with disgruntled fans who are writing in to complain about something, and it's something that I can't necessarily fix. AirPlay is a perfect example of that. When I am able to make improvements, or when I come up with, "Oh crap, I didn't think about it. I could do it this way and that would be a little bit better," I always like to jump on those and say, "Hey, try this out. It's not what you asked for, but it's a little bit closer." As fans of the product, they're always enthusiastic to see any amount of progress toward what they want.

It took me a little while to realize that when somebody's angry about the product, that passion is just as much love as anything else. It's just they're frustrated, but the fact that they're that into it shows that they really care.

JV: It's like voting. The act of voting shows that there's some caring.

JM: The dude showed up and made a complaint. That's doing a lot.

JV: This goes back to when I first heard you speak at the Mobile Web Development conference here in San Francisco. You were talking a lot about reviews and how they affect you, so this is a good segue into that. It's confronting the people who are having the negative reactions immediately, but then trying to get them to help you and join your side.

JM: At the end of the day, when somebody contacts me who's angry, my end goal is not only to have them leave happy but also to have them leave more of a fan than when they came in. If I can get them on the way out the door to go write a review, they're going to write a glowing review, because they just had a great personal interaction with me, and that's really, really hard to top.

If you go and look through the reviews for Magic Window (not for Ow My Balls!; nobody should be subjected to reading those), you'll see there are a lot of them that say, "Great customer service." "Developer responded super fast." "Developer implemented the feature I suggested." You see that stuff in the comments and in the reviews, and that's what I love to see. I think that's what other potential customers love to see, too, because it shows that as a developer I'm going to be there to help them if something goes wrong.

JV: They're more willing to spend some money on the product knowing that they're going to get some service back.

JM: When you look at an app that you want to buy and you're looking at the reviews, if you see a couple of reviews that are like, "This didn't work and the developer didn't even respond," that's a big warning sign. Maybe it's some edge case that doesn't work, maybe it's people who use Gmail in a particular way, but maybe I'm that dude who uses Gmail in a particular way. If it doesn't work, I want to know that there's going to be someone there who's respectable and responsive when I contact them to solve it.

JV: I want to end this interview with a really nice story that you shared at the conference, and it involves the word Beelzebub. Can you explain that?

JM: I'll try to make a long story short there without actually reading the support mail. I received a support mail from a customer who was concerned because their child was playing Ow My Balls!, which they didn't have a problem with.

JV: They didn't, okay.

About the author

Jonathan Vanian

Jonathan Vanian is an online editor who edits, writes, interviews, and helps turn the many cranks at StickyMinds, TechWell, AgileConnection, and CMCrossroads. He has worked for newspapers, websites, and a magazine, and is not as scared of the demise of the written word as others may appear to be. Software and high technology never cease to amaze him.
