GUEST COLUMNIST With biases all around, how can you ensure sound preparedness decisions?

(CIDRAP Business Source Osterholm Briefing) – Our response to the 2009 H1N1 pandemic has moved from urgent to watch-and-see. This is the ideal time to take stock of what happened during the last 14 months and to decide how we prepare for and deal with what comes next. We know that pandemics are a fact of life; another will emerge at some unknown point. We simply don't yet have the ability to banish infectious disease threats from our list of business risks. But we do have the ability to examine our response to this pandemic in as unbiased a way as possible, if we so choose.

Fortunately, reports of lessons learned from the H1N1 experience in a wide range of areas are coming out almost daily, so we have a wealth of data to consider. Unfortunately, we face an uphill battle synthesizing so much data into pearls we can use to make sound decisions. If you've invested the time to do a thorough analysis of your pandemic response, you want to be confident in your findings. Just as your organization implements programs to improve the quality and lessen the variability of its products and services, this column aims to provide you with tools to identify and counteract cognitive and decision-making biases so you can improve the quality of your most important preparedness decisions.

I'd like to recommend a simple tool to help you do just that: a framework for spotting the biases that can undermine our best efforts to draw sound conclusions. In this column, I will:

  • Describe five biases that you are likely to encounter as you evaluate and plan for the future
  • Offer an illustration of each bias
  • Give a brief "reality check"
  • Provide simple, practical ways to steer clear of these biases

1. Hindsight bias: 'Predicting' the past

Hindsight bias creeps in when we begin to believe that we "knew it would happen that way all along." In addition to potentially annoying colleagues, this bias can cause more substantive problems if we rely on these supposed predictive abilities. Coupled with the biases described below, it may lead to unwarranted confidence in the quality of decisions based on your perceived powers of prediction.

Example: "From the start I just knew this whole novel H1N1 pandemic threat was overblown."

Reality check: If your intuition told you last May that this pandemic was not going to be as bad as that of 1918, you were certainly right. But correctly guessing a coin flip once does not mean you will be correct the next time. The lesson here is to be frank with yourself now so that you don't deceive yourself in the future. Business leaders seek every day to make the best decisions possible based on the best available information. Make sure you don't check that practice at the door when making decisions related to infectious diseases. Be honest with yourself and your colleagues about what is known and what is not. Allowing hindsight bias to convert speculations and hunches into "facts" is no recipe for positioning your organization for a favorable outcome.
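
To see how weak that evidence is, consider a quick simulation. This is a sketch of my own in Python, with an invented pool of 10,000 forecasters who have no genuine foresight; it simply shows how often pure guessers look prescient by luck alone.

    import random

    random.seed(1)  # reproducible illustration

    N = 10_000  # hypothetical forecasters guessing at random (no real skill)

    def lucky_streak_rate(k: int) -> float:
        """Fraction of pure guessers correct on k 50/50 calls in a row.

        The expected value is 0.5 ** k.
        """
        hits = sum(
            all(random.random() < 0.5 for _ in range(k))
            for _ in range(N)
        )
        return hits / N

    for k in (1, 3, 5):
        print(f"Right {k} call(s) in a row by luck alone: {lucky_streak_rate(k):.1%}")

By chance alone, roughly half the crowd "calls" any single yes-or-no outcome, so one correct hunch about the pandemic's severity says almost nothing about anyone's predictive skill.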

What you can do to keep this bias in check:

Don't be afraid not to know the answer. When an answer does not exist, don't invent one to make yourself and others feel better. Instead, describe the known and relevant facts and the spectrum of likely scenarios. As Mark Twain once observed, and as Dr. Osterholm highlighted in a recent column, "It ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so."

2. Confirmation bias: Confirming you're right (whether or not you are)

When we suffer from confirmation bias, we seek out information that confirms our preexisting perceptions. If we don't find what we are looking for, we may interpret new information in such a way as to support our original perception.

Example: On a recent Sunday afternoon I was approached at a stoplight in downtown Minneapolis by a woman trying to find the city's farmers' market. She asked if the direction she was headed was correct. It was not. I shared with her that there was another farmers' market in the general direction she was heading, but that it was not open on Sunday, and the one she was looking for was in a different location. She promptly returned to her companions and confirmed that, yes, they were, in fact, heading in the right direction.

Reality check: Even if your initial perception is "correct," failing to seek and truly hear alternative views leaves you with a less nuanced understanding of your own position, of others' positions, and of how your position might be improved with minor adjustments and modifications. Heading to a farmers' market that isn't open on Sunday is folly, but the consequences in that case are hardly dire. Unfortunately, this type of bias does not naturally recede when the stakes rise.

What you can do to keep this bias in check:

  • Focus on listening after you ask a question to test the validity of your view. Don't ask leading questions, such as: "Stockpiling antivirals is really the only way to go, isn't it?"
  • Verbalize the answer you receive to confirm that you are hearing what was said, which may not necessarily be what you wanted to hear.
  • Seek out information that may be contrary to your current view, and then evaluate the merits of that information.

3. Common-information bias: Focusing on what we all know, instead of what we all need to know

When we meet someone new, our first instinct is to identify and discuss common information, experiences, and backgrounds. While this strategy is an effective way to "break the ice" and build working relationships, it can lead to problems if it shapes decisions of consequence. Groups tend to focus on common information that members are already aware of instead of probing for the unique pieces of information each person can contribute.

Example: When assessing the potential impact of the H1N1 pandemic on organizations, meetings often centered on news reports that everyone had read, a shared source of information. But overemphasizing that information, to the exclusion of data known only to certain group members, means missed opportunities to inform the discussion of what course of action to take. For example, did you get the full perspective that an employee whose spouse worked in an intensive care unit may have had to offer?

Reality check: When the answer is apparent, the need for discussion is low; however, when tough decisions combine with uncertain direction, make sure you truly are basing your decision on the best information available. And that may not necessarily be the most commonly known data.

What you can do to keep this bias in check:

  • When meeting with a group, define the issue as a problem that needs to be solved rather than as courses of action that need to be judged. Changing the tenor of the group in this way can elicit important new facts and alternatives that might otherwise be withheld by colleagues who think they're expected to make quick judgments.
  • Request that all relevant options be discussed before the decision is made or a vote is cast.
  • Actively search out unique information by asking individuals in the group to "wear a different hat" (ie, role) than the one to which they're typically accustomed. During the conversation, for example, assign such roles as naysayer, "just the facts," "only sees the positive," and "thinks outside the box." Rotate the "hats" each person wears until you surface as much information or as many new perspectives as you can.
  • Encourage people to not make assumptions about what other group members already know. It's impossible for everyone at the table to know what knowledge is, in fact, common. If individuals self-censor because they think they'll be redundant, invariably they will withhold pertinent and unique information as well. If information shared is common knowledge in the group, simply note it and move on.

4. Omission bias: 'Darned if you do, less darned if you don't'

In the face of ambiguous threats, doing nothing or taking a minor action that is unlikely to have a material impact feels "safer" than taking bold action because, all things being equal, we tend to judge action that results in an adverse outcome more harshly than inaction that results in an adverse outcome. This is where leadership comes in. True leaders are able to set aside the greater personal risk associated with action when action is the best course.

Example: Changing your human resources policy on absenteeism is no small task. For the professionals facing that choice 2 years ago, avoiding the issue was safer than modifying the policy for an influenza pandemic that might never materialize on their watch.

Reality check: Every day, each of us must weigh how much personal risk we are willing to absorb when choosing between action and inaction. Practically speaking, the increased risk of being judged harshly may not seem worth the effort or pushback of championing warranted action. When threats to mission-critical activities and to the health and welfare of a workforce are at stake, however, we must be honest with ourselves about our true rationale. Is an investment in pandemic preparedness a poor idea because it is a waste of money, or is it a poor idea because, if the money is viewed as wasted, we will face more criticism than if we had recommended no investment and something happened?
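
One way to keep that rationale honest is to write the trade-off down as an expected-cost comparison. Here is a minimal sketch; every number in it is invented for illustration, and real inputs would come from your own risk assessment.

    # Hypothetical expected-cost comparison for a preparedness investment.
    # All figures are made up for illustration only.

    INVESTMENT = 250_000          # assumed upfront cost of preparedness
    P_PANDEMIC = 0.04             # assumed annual chance of a severe pandemic
    LOSS_UNPREPARED = 20_000_000  # assumed business loss if caught unprepared
    LOSS_PREPARED = 5_000_000     # assumed loss if prepared

    cost_if_invest = INVESTMENT + P_PANDEMIC * LOSS_PREPARED
    cost_if_skip = P_PANDEMIC * LOSS_UNPREPARED

    print(f"Expected cost if we invest: ${cost_if_invest:,.0f}")
    print(f"Expected cost if we don't:  ${cost_if_skip:,.0f}")

With inputs like these, investing is the better bet in expectation even though, in most years, the money will look "wasted" in hindsight, which is exactly the appearance that feeds omission bias.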

What you can do to keep this bias in check:

  • Question your motives. Are you holding off because inaction is the prudent choice, or because it carries less risk of critical judgment and its associated consequences? If you choose inaction because the personal risk of acting outweighs a course you believe is better for the organization, at least be intellectually honest with yourself about your true rationale.
  • When evaluating people who have made decisions to act, consider the risk of inaction and whether inaction would have been a better choice given the best information available at the time. If it would not have been, be careful not to judge on outcome alone. Results matter, of course, but in the face of uncertainty, even the decisions of the best and brightest can produce a suboptimal outcome.

5. Escalation of commitment: Justifying suboptimal decisions by amplifying them

Continued investment in a particular course of action can appear to validate the original decision to pursue it. Escalation of commitment occurs when we keep investing in order to justify decisions that are leading to undesirable outcomes.

Example: Some organizations chose to implement screening measures (eg, scanners that measure body temperature) to keep sick people out of the workplace. But suppose the evidence indicated that this measure was not reducing the incidence of influenza-like illness (ILI) in the workplace, and the organization opted to double the number of screening stations anyway. The evaluation should ask whether doubling screening capacity was likely to make a dent in the incidence of ILI, or whether evidence was ignored in favor of justifying the original decision.
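
That evaluation can start with something as plain as a before-and-after comparison. The sketch below is hypothetical; the workforce size and weekly case counts are invented solely to illustrate the check.

    # Hypothetical before/after check on influenza-like illness (ILI)
    # incidence around a doubling of screening stations. Invented data.

    WORKFORCE = 2_000  # hypothetical headcount

    before = [18, 22, 19, 25]  # weekly ILI cases before doubling stations
    after = [20, 21, 24, 19]   # weekly ILI cases after doubling stations

    def rate_per_1000(cases: list[int], headcount: int) -> float:
        """Average weekly ILI cases per 1,000 employees."""
        return sum(cases) / len(cases) / headcount * 1_000

    r_before = rate_per_1000(before, WORKFORCE)
    r_after = rate_per_1000(after, WORKFORCE)

    print(f"Before: {r_before:.1f} cases per 1,000 employees per week")
    print(f"After:  {r_after:.1f} cases per 1,000 employees per week")

    # A flat rate, as here, argues against buying more stations just to
    # vindicate the original purchase (the 10% threshold is arbitrary).
    if r_after > r_before * 0.9:
        print("No clear reduction; revisit the original decision first.")

If the rate hasn't moved, the honest question is whether more stations will fix the problem or merely defend the first purchase.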

Reality check: Some courses of action cannot be easily corrected, such as significant investments in fixed assets that prove unproductive. Sometimes, in these cases, additional investments must be made to make a bad situation less bad. When that happens, it is critical to acknowledge the poor outcome of the original decision and to make clear that the additional investments do not further the original faulty strategy but are needed to correct for its poor outcome. If the additional investments are instead viewed as an escalation of commitment to an unsound strategy, you and your organization will lose credibility and buy-in.

What you can do to keep this bias in check:

  • Seek counsel from an uninvolved person you can trust. Ask this person to evaluate the decision so you can compare your thinking with that of someone who is not burdened with any commitment to the initial decision. Ask how he or she approached the evaluation and consider whether any biases factored into your decision-making process.
  • Foster an environment of iterative improvement, not one of make-or-break decisions. As a leader within your organization, focus on how improvements can be made from lessons learned from the 2009 pandemic. Allow your colleagues the freedom to make course corrections instead of burying their missteps through an escalation of commitment to a futile cause. When things go awry, emphasize how to improve the process instead of who was to blame.

Bottom line for organizations

As you continue to seek the best courses of action in preparing your organization for infectious disease threats, be aware of these cognitive and decision-making biases. Consider how you can minimize their impact in your own problem solving and how these biases influence the thinking of others in your organization. We know one thing for sure: Ignoring these tendencies will not make them go away.

Aaron Desmond, MBA, is director of business preparedness at the Center for Infectious Disease Research and Policy.
