I Have Never Met a Perfect Person: Dealing with an Imperfect World
On October 14, 2016 / By Alan Quintero



Standard economic theory is based on the assumption that people are perfectly rational.  In other words, people rationally weigh the costs, benefits and risks before making decisions.  But, except for my wife, I have never met a perfect person.  (I love you, honey.)

A new wave of behavioral economists has shown that people make irrational decisions driven by biases that can be anticipated.  One such economist, Dan Ariely, summarizes this in his book Predictably Irrational: The Hidden Forces That Shape Our Decisions:

(We assume) that we are rational… But, as the results presented in this book (and others) show, we are far less rational in our decision making… Our irrational behaviors are neither random nor senseless — they are systematic and predictable. We all make the same types of mistakes over and over, because of the basic wiring of our brains.

These irrational choices not only affect the economy, they infect every aspect of our lives.  In studying the root causes of major business “black swan” events – from major oil spills to worldwide automotive recalls – we have identified several human factors that must be taken into consideration when designing your company’s operational excellence program.  Organizational structures, policies and procedures, and underlying technology tools must all recognize that humans are not always rational and must build in checks and balances that account for these biases.

Groupthink

Groupthink occurs when the momentum of a group drives acceptance of a decision or course of action that the individual members might not have reached on their own.  In groupthink, an individual may be hesitant to go against the group for fear of looking dumb in front of the crowd, thinking, “the whole group seems so sure, so I must be wrong.”

In your company, traditional brainstorming exercises may be especially prone to the effects of groupthink.  That is one reason why Trenegy uses the ACE™ Methodology for meeting facilitation.  This approach alternates between convergence (group brainstorming) and divergence (outside party review) cycles to ensure the consequences of groupthink are mitigated.

Confirmation Bias

Confirmation bias is our tendency to view the evidence presented to us through the lens of what we already believe to be true.  It explains why the party in power can point to a set of economic data as proof the economy is strong, while the opposition uses the same data to prove the economy is weak.  Each group sees the evidence through its belief that its own position is correct.

Your company’s internal reporting can easily be influenced by confirmation bias.  If the definition of the data presented is not clearly understood, and its relationship to the company’s strategy is not effectively outlined, people will tend to draw conclusions based on their own interpretations.  One way to mitigate confirmation bias is to seek out data that contradicts the popular beliefs in your organization and investigate why.  For example, if the conviction in your organization is that you have a safe workplace, reporting only lost time incidents may simply confirm that opinion (since those incidents are few and far between).  Conscious reporting of near misses and first aid injuries may show that you are not working as safely as you thought.

Normalization of Deviance

Normalization of deviance occurs when unacceptable practices gradually become acceptable because the unacceptable behavior avoids negative consequences.  Some normalization of deviance is harmless.  The rise of “business casual” dress is a good example.  In the 1970s, President Carter called for thermostats to be raised to save energy during the energy crisis.  This led businesspeople to leave their ties and jackets at home.  When there were no repercussions, acceptable office dress continued to drift toward what we now call “business casual.”

In some cases, though, normalization of deviance can have catastrophic results.  The original design parameters of the Space Shuttle’s Solid Rocket Booster O-rings anticipated no blow-by of hot gases past the O-rings.  But early in the Shuttle program, some blow-by was observed after launches.  Although minor modifications were made to the design, over time some blow-by became acceptable to the Shuttle team.  In 1986, hot gases escaping past the Solid Rocket Booster O-rings on Challenger resulted in the loss of the vehicle and the deaths of the seven crew members on board.

The design of safety-critical and reliability systems (organizational, procedural, or technological) must include barriers that prevent normalization of deviance from occurring.

Optimism Bias

Optimism bias is the human tendency to be overly optimistic and to develop a “this won’t happen to me” mentality.  Dan Ariely found that 95% of drivers believe they have above-average driving skills (for those of you who are mathematically challenged, only about half of all drivers can be better than the median driver).  This tendency can have undesirable effects.  People may put off diagnostic tests such as colonoscopies, or skip wearing a helmet because they believe they are unlikely to be in a motorcycle accident.

In companies, decision-making needs to guard against optimism bias.  This is particularly true in the project management process.  A pervasive tendency to think “this won’t happen to me” while planning projects contributes to 64% of energy megaprojects going over budget and 73% being delayed (according to a study by EY).  A robust risk identification and mitigation process can help fight this tendency to be overly optimistic.

Expectation Bias

Expectation bias occurs when we hear or see what we expect rather than what is actually happening.  Most of us experienced this as kids (e.g., your sister was always “the good kid,” so when your mother saw the broken lamp she automatically assumed you broke it even though your sister was the culprit, and she still does not believe you when you tell her so).

Although this example is harmless, expectation bias can have tragic consequences.  In the 2010 Macondo blowout in the Gulf of Mexico, the rig crews were incorrectly told that a critical test had passed successfully.  Because they believed the well was safely secured (even though it wasn’t), they failed to see the warning signs of a blowout in the data they were receiving, contributing to a catastrophe that took 11 lives and spilled nearly five million barrels of oil.

When designing your company’s operational excellence programs, proper attention must be paid to these predictable faults in the way our minds work.  For example, organizations must mitigate groupthink and confirmation bias by encouraging cross-functional interactions.  Policies and procedures must guard against exceptions so that normalization of deviance does not set in.  A robust risk management process must force the organization to recognize that, despite optimism bias, bad things can happen.  And technology must be in place to ensure the right information is driving decisions so that the organization is not blinded by expectation bias.

Trenegy helps companies successfully manage any aspect of their operational excellence program using proprietary methodologies tailored to our clients’ needs.  We help our clients get value out of their new systems quickly and relatively painlessly.  This is the fifth in a series of articles on operational excellence.