The extremely complex human brain is like a powerful computer, storing memories and controlling how we think, feel, and react. Yet there are limits to how much information the brain can process. To make decision-making less taxing on our minds, we have evolved mental shortcuts called heuristics, which include common-sense judgments, educated guesses, and intuition.
In dangerous situations, this kind of quick thinking comes in handy; however, when we settle for quick heuristics in our everyday decisions, we can be led astray. Our personal lens produces errors of social attribution, lapses of memory, and miscalculations, such as misjudging risk, all of which affect how we assimilate information. These mental mistakes, or cognitive biases, interfere with our desire to appreciate others and treat people equally, and they greatly compromise our ability to make sound decisions.
There are many ways in which humans can be influenced or biased. Companies and universities require overt biases to be disclosed when researchers report study findings or even when they provide education. However, we also need to be cognizant of the covert cognitive biases that influence our daily decisions. Even though most errors in our thinking occur unconsciously, we can learn behaviors that de-bias our decisions.
The Top Three Cognitive Biases
Confirmation bias
Who doesn't want the best person for the job? The fact is, we form most of our judgments about a candidate in the first few seconds of meeting the person. Confirmation bias occurs when we make assumptions about people and then interpret their traits or actions in ways that confirm our preconceived ideas. Confirmation bias can quickly lead to hiring the wrong person for the job and create an uneven playing field for other applicants. The strength of a handshake, where the candidate went to college, whether or not the person smiles, how he or she dresses, and his or her sex and race are all factors that can influence who gets a job even before the individual's qualifications are considered.
Interviews are a time to recognize subtle, unconscious judgments that our minds might make. Is it possible that the candidate with the weak handshake has a disability affecting his hand? Why would graduation from your rival college make this candidate unsuitable for any particular position? We owe it to our patients to ensure we acknowledge our own reactions as well as the reactions of others, and to listen for clues that the candidates possess the expertise and experience that make them the best fit for the job.
Bandwagon bias
Bandwagon bias, also known as herd behavior, is a common bias that can impact labs in many ways. For example, consider requests from multiple departments to have all samples prioritized as stat. A surgeon hears that the emergency department (ED) is sending all of its samples in red bags so that lab staff can prioritize and process those samples more quickly. Well, what is good for the ED must be good for the operating room and intensive care unit, and pretty soon the laboratory ends up with a stat priority designation for the majority of in-house samples. Stat becomes the routine, and patients truly in need of stat priority are now at higher risk of being overlooked. While these departments had good intentions of getting their samples prioritized and processed quickly, their herd behavior overloaded the system.
We love to follow the crowd—it satisfies our desire to fit in. Yet the crowd doesn’t have to be a large department. It can be a small group of co-workers, banding together to make changes without careful consideration of the consequences.
Framing effect
The framing effect occurs when people react to a choice differently depending on whether it is presented as a loss or a gain. Political ads, opinion polls, and advertisements are often framed to elicit a desired response. Sales teams constantly bombard labs with framed solutions to their problems, and labs may only recognize the framing effect later, when those solutions turn out to create problems of their own.
For example, healthcare administrators considering budget requests may be subject to framing effect biases. An administrator may react more or less favorably depending on how a request is justified (framed) and the project's perceived importance. A request framed as a large percentage reduction in errors or costs is more likely to be viewed favorably than one presented as a small absolute number of errors or costs, even though the two describe the same outcome. For instance, cutting mislabeled specimens from four to two per month can be framed as a 50% reduction in errors or as two fewer errors each month; the numbers are identical, but the first framing sounds more compelling.
Laboratory leaders asking for budget justifications should expect to be presented with biased views of the requests. Leaders need to consider what may be missing in order to gain an accurate understanding of the proposed sale or request. In summary, whether we admit it or not, bias lives in all of us. Overcoming bias involves acknowledging its presence and then practicing some simple interventions to counteract it.
Overcoming Bias
Many situations facing laboratorians are likely to involve bias. Consider the following tips for reducing the effect of bias on decision making:
- Recognize that bias is a factor in decision making. Remind yourself of a bias-based error that you have made in the past.
- Make key decisions, such as hiring, technology selection, and budget approvals, deliberately. Include a pause period to see whether a decision marinates well over time before proceeding with it. Many times a colleague will come into your office expecting you to make a decision immediately, but the situation is not urgent. Have the courage to take some time to reflect before making a final decision. Let people know that your pause is not a delaying maneuver, but rather a tactic for making a good decision.
- Use people outside your herd to help with your decision process. They can bring fresh eyes and ears to it, and may not be so easily led by the herd.
- Ask an outsider to look critically at your decision, with an eye for bias-based weaknesses in the decision.
Examples of common heuristics in the clinical laboratory and how they can cause patient safety problems
- Assuming that a coworker has completed a task, such as communicating a lab result, because that coworker is usually reliable.
- Taking a work break at a time when the emergency department is usually not busy, but without confirming that it actually is quiet.
- Hiring a person who has many similarities to the group, but who turns out to be disruptive to the lab’s workflow.
CLN's Patient Safety Focus is sponsored by ARUP Laboratories.
