Standard economic theory is based on the assumption that people are perfectly rational. In other words, people rationally weigh the costs, benefits, and risks before making decisions. Aside from my wife, I have never met a perfectly rational person. (I love you, honey.)
A new generation of behavioral economists has shown that people make irrational decisions driven by biases that can be anticipated. One such economist, Dan Ariely, summarizes this in his book "Predictably Irrational: The Hidden Forces That Shape Our Decisions":
"(We assume) that we are rational... But, as the results presented in this book (and others) show, we are far less rational in our decision making... Our irrational behaviors are neither random nor senseless -- they are systematic and predictable. We all make the same types of mistakes over and over, because of the basic wiring of our brains."
These irrational choices not only affect the economy, they infect every aspect of our lives. In studying the root causes of major business “black swan” events, from oil spills to worldwide automotive recalls, we have identified several human factors that must be taken into account when designing your company’s operational excellence program. Organizational structures, policies and procedures, and underlying technology tools must all recognize that humans are not always rational, and must build in checks and balances that account for human biases.
Groupthink occurs when the momentum of a group drives acceptance of a decision or course of action that individual members might not have reached on their own. In groupthink, an individual may be hesitant to go against the group for fear of looking dumb in front of the crowd. They may be thinking: the whole group seems so sure, so I must be wrong.
In your company, traditional brainstorming exercises can leave people especially prone to groupthink. That's one reason Trenegy uses the ACE Method (accelerate, collaborate, execute) for meeting facilitation. This approach alternates between convergence (group brainstorming) and divergence (outside party review) cycles to mitigate the effects of groupthink.
Confirmation bias is our tendency to view evidence through the lens of what we already believe to be true. It explains why the same economic data can be presented by the government in power as proof the economy is good, while opposing factions use it to prove the economy is bad. Each group sees the evidence through its belief that its own position is correct.
Your company’s internal reporting can easily be influenced by confirmation bias. If the definition of the data being presented is not clearly understood, and its connection to the company’s strategy is not effectively outlined, people will tend to draw conclusions based on their own interpretations. One way to mitigate confirmation bias is to seek out data that contradicts the popular beliefs in your organization and dig into why. For example, if the conviction in your organization is that you maintain a safe workplace, reporting only lost time incidents may simply confirm that opinion, since those incidents are few and far between. Deliberately reporting near-miss events and first aid injuries may reveal that you are not working as safely as you thought.
Normalization of deviance occurs when unacceptable practices gradually become acceptable because the deviant behavior repeatedly escapes negative consequences. Some normalization of deviance is harmless. The rise of business casual dress is a good example. In the 1970s, President Carter called for thermostats to be raised to save energy during the energy crisis. Business people began leaving their ties and jackets at home, and when there were no repercussions, acceptable office dress continued to drift toward what we now call business casual.
In some cases, though, normalization of deviance can have catastrophic results. The original design parameters of the Space Shuttle’s solid rocket booster O-rings anticipated no blow-through of hot gases past the O-rings. But early in the shuttle program, some blow-through was observed after launches. Although minor modifications were made to the design, over time some blow-through became acceptable to the shuttle team. In 1986, blow-through of gases past the O-rings in one of Challenger’s solid rocket boosters resulted in the loss of the vehicle and the deaths of the crew on board.
The design of safety-critical and reliability systems (organizational, procedural, or technological) must include barriers that prevent normalization of deviance.
Optimism bias is the human tendency to be overly optimistic and develop a this-won't-happen-to-me mentality. Dan Ariely found that 95% of drivers believe they have above-average driving skills. For those who are mathematically challenged, only about half of all drivers can, by definition, be better than the median. This tendency can have undesirable effects. People may put off diagnostic tests such as colonoscopies, or skip wearing a helmet because they believe they are unlikely to be in a motorcycle accident.
In companies, decision making needs to guard against optimism bias, particularly in the project management process. A pervasive tendency to think this won’t happen to us while planning projects contributes to 64% of energy megaprojects going over budget and 73% running behind schedule, according to a study by EY. A robust risk identification and mitigation process can help counter this tendency to be overly optimistic.
Expectation bias occurs when we hear or see what we expect rather than what's actually happening. Most of us have experienced this as kids. For example, maybe your sister was always “the good kid,” and when your mom saw the broken lamp, she automatically assumed you broke it and didn't believe you when you said you didn't.
While that example is harmless, expectation bias can have tragic consequences. In the 2010 Macondo blowout in the Gulf of Mexico, the rig crews were incorrectly told that a critical well test had passed. Because they believed the well was safely secured when it wasn't, they failed to see the indicators of a blowout in the data they were receiving. The resulting catastrophe took 11 lives and spilled roughly five million barrels of oil.
When designing your company’s operational excellence programs, proper attention must be paid to these predictable flaws in the way our minds work. Organizations must mitigate groupthink and confirmation bias by building cross-functional interaction into how decisions are reviewed. Policies and procedures must guard against exceptions so normalization of deviance doesn't set in. A robust risk management process must force the organization to acknowledge that, despite optimism bias, bad things can happen. And technology must ensure the right information is driving decisions so the organization isn't blinded by expectation bias.