(c) 2001 Esther Derby, www.estherderby.com
This article originally appeared in STQE, March/April 2001.
Let me tell you a little story, a true story, about how our beliefs influence what we see in the world and affect our ability to solve problems.
Two years ago my friend Julia, who was forty-four and a bit portly at the time, started experiencing troubling physical symptoms. She was fatigued, depressed, and generally uncomfortable. After several weeks, she went to the doctor. The doctor didn’t find anything specifically wrong.
Julia was sent home with a vague diagnosis and a prescription for Prozac. After a while her mood lifted and she felt less tired, but the discomfort continued. Finally, after several months and several more visits, her doctor determined she had a fibroid tumor that was increasing in size. He decided to remove the tumor.
Julia wasn’t happy to be facing surgery, but was relieved that after seven months of discomfort there was a diagnosis and a concrete plan. Two days before surgery, Julia went in for an ultrasound to precisely locate the tumor.
Based on the ultrasound results, Julia’s surgery was canceled. Julia was sent home to prepare for the birth of her daughter–who arrived, full-term, two months later.
Now I’m willing to bet that you guessed the end of this story by the middle of the second paragraph. It’s obvious…if you don’t already have any particular beliefs about Julia.
Julia and her doctor, however, did have a belief, built up over years, that Julia would never become pregnant. And over the course of six months of office visits and medical exams, no one ever suggested pregnancy as the cause of Julia’s symptoms.
We could say that the medical staff were incompetent, but I would say they suffered from a belief problem. Their belief caused them to overlook information that was readily available–and also limited their application of the information they were using as they diagnosed the cause of Julia’s symptoms.
What does this have to do with software?
We all have beliefs about the world and other filters that affect what information we take in. Our beliefs, built up through education and experience, form the internal maps that help us navigate the world we live in. Our internal maps enable us to recognize and categorize the vast flood of sensory inputs and to think quickly. And often they are very helpful as general models of how the world works.
Other times, our beliefs keep us from seeing what is blindingly obvious to someone with a different set of eyes. It’s “as plain as the nose on your face” to someone looking at it without our particular set of blinders.
Take, for instance, Tom the test manager, assigned to a team that had always operated on participative, consensus-based decision making. Tom’s framework for managing relied on his belief that, as a manager, he should entertain input from the group but make all final decisions on his own.
Soon after Tom was assigned to the group, the team was asked to finish an evaluation of testing tools. Tom read the reports and listened to the group discussion, then closed his office door and decided which tool he favored. At the next team meeting, as he discussed his decision, he reminded the group that “we decided this at our last meeting.” Tom didn’t notice that most of the other heads in the room were shaking back and forth, indicating “no, we didn’t.”
Was Tom a bad manager? Maybe, but it’s hard to say based on one incident. What we see is that because of his belief about how decisions should be made, Tom didn’t ask questions that might have given him direct information about how the group operated, and he also filtered out valuable non-verbal information that would have given him additional clues. As a result, he was far less effective in working with the team than he could have been…at least until he became aware that his map didn’t match the territory.
We often don’t consciously account for the existence of our internal maps, which makes them more likely to trip us up–just as Julia and her doctor, and Tom, our test manager, stumbled when their maps didn’t show all the ups and downs of the territory.
Our thinking process happens so fast that it’s extremely difficult to pause the process in the middle and ask, “What unconscious beliefs, filters, or maps are influencing me right now?” The challenge is to pause between the time we reach an initial conclusion and the time we act on that conclusion–kind of like how we test a piece of software before we ship it, to understand the quality of the product and the risk associated with releasing it in its current state. I have four questions I use for this pause in the mental process:
1. What have I seen or heard that led me to this belief?
This question reminds me to really look at what data my response is based on. If I hear myself saying something like “because it’s always been like that,” I send up a tiny little internal red flag.
2. Am I willing to consider that my belief or conclusion may be mistaken?
If I’m not willing to consider that I might be wrong, it’s a sign that I’m reacting out of a belief I’m pretty attached to–and it’s a clear sign I need to go to the next question.
3. What are three other possible interpretations of this situation?
If I can’t think of any other interpretations, it’s time to get some help shaking up my assumptions. I find a colleague I trust and we brainstorm as many different interpretations as we can.
4. What would I do differently if one of these other interpretations were true?
This gives me a wider range of responses to choose from, and increases the chance I’ll choose one that will help solve the problem.

When I start to test my conclusions, I can surface and examine my beliefs–my assumptions–about the situation. If I’m willing to admit that my initial interpretation might be inadequate, I can gather more information and represent the situation more accurately. And when I do that, I open up the possibility of making better decisions, working more effectively with people, and–coincidentally–building better software.