"Investigators aren't sure" is a phrase that frequently pops up in the media. Information systems workers seem to share this uncertainty. So, what's the secret to success in this "aren't sure" world?
Recently, a British Airways B777 crashed just short of the runway at Heathrow Airport near London. Due to the skills of the pilots and the proximity to the airport, tragedy was averted. Britain’s Air Accidents Investigation Branch announced, “The autothrottle demanded an increase in thrust from the two engines but the engines did not respond.” The announcement continued, “Investigators aren’t sure of the reason.”
As I read, “Investigators aren’t sure,” the phrase echoed in my brain. Haven’t I read that before? So I Googled “investigators aren’t sure” and got more than 3,300 hits. Examples include:
- Investigators aren’t sure what instigated an attack.
- Investigators aren’t sure whether it was intentional.
- Investigators aren’t sure how long the body was in the area.
- Investigators aren’t sure if the same crew is responsible for all of the break-ins.
- Investigators aren’t sure what kind of gun was used.
- Investigators aren’t sure how long they will look for evidence.
- Investigators aren’t sure what she (Britney) is wearing now.
- Investigators aren’t sure if that’s the whole story.
- Investigators aren’t sure they believe it either.
Now, this “investigator” job sounds like it’s right up my alley. Apparently you don’t have to be sure of anything, and the pay is good. But as I was reading the Google search results, it occurred to me that this sounds a lot like information systems work. In our business, we are the investigators, and we aren’t sure of a number of things:
- Business analysts aren’t sure they understand the totality of what the stakeholders expect the system to do. In fact, they aren’t sure the stakeholders themselves understand the totality of what they expect the system to do.
- Stakeholders aren’t sure business analysts know enough about the details of their business to truly understand it.
- Developers aren’t sure why they’re required to write code before the requirements are complete, correct, and consistent. They’re not sure why those who are least technical have the authority to make such decisions.
- Testers aren’t sure about either the documented requirements or the system under test. They aren’t sure what they can safely assume about the requirements. They aren’t sure which are the most important parts of a system to test. They aren’t sure if they have created test cases with the capacity to detect hidden defects. They aren’t sure if they have created sufficiently varied data. They aren’t sure why all the people on the front end of the project get paid more than they do.
- Project managers aren’t sure if they’ll be given the resources they really need to carry their project to a successful conclusion.
All aren’t sure why the others don’t recognize the intellectual creativity and prowess required to do what they do.
We live in an “aren’t sure” world, so I’m constantly surprised at how many people are “totally sure” of their positions on anything and everything. In software development, people are sure of this elicitation process, this requirements notation, this programming language, this test technique, and this project management approach.
Giving us fair warning about our “aren’t sure” world, Mark Twain wrote, “It ain’t what people don’t know that hurts them, it’s what they know that ain’t so.”
But “aren’t sure” applies even to this oft-quoted saying—some attribute it instead to Josh Billings, while others claim it was Will Rogers who penned it.
So what’s the secret to success in an “aren’t sure” world? It’s no secret. It’s the Plan-Do-Check-Act (PDCA) process cycle invented by Walter Shewhart, popularized by W. Edwards Deming, and incorporated into every iterative/incremental software development process model today.
Plan and do are ubiquitous. Everyone does