The Bias Field Guide: A Roadmap for Success
Bias is all around us. It creeps into our thoughts and infiltrates our decision-making. Sometimes we’re aware of it and sometimes we aren’t. In our feature section, we identify the many biases built into everyday life, how to spot them, and how to use them to your advantage for business success.

Part 1: Built to be Biased

We are programmed to make thousands of decisions a day. How can we know we are being objective about the ones that really matter?

Take a moment to think about the following questions:

  • Which would you choose: 4 ounces of popcorn for $3, 9 ounces for $6.50, or 10 ounces for $7?
  • What is more likely: getting killed by a cow or a shark?
  • Which media channel is less biased: MSNBC or Fox News?
  • If the roulette wheel has come up red nine of the last ten times, which do you choose: red or black?
  • When markets are down, do you check your account balance more or less often?

And most importantly, what does any of this have to do with day-to-day life and work?

You may be surprised that how we answer these questions affects each of us every day. They involve the concept of “decision-making bias.” We’ll look at why decision-making biases exist, provide examples familiar to both your personal and professional lives, and give tips on noticing these biases and mitigating their effects.

Background of Decision-Making Biases

Decision-making is the process of selecting a logical choice from the available options. Many variables affect the decisions we make every day. Over the past several decades, there has been a lot of research into decision-making and the evolution of how humans make decisions. It is considered a part of psychology, behavioral economics, and strategy research. While the academic world now offers a lot of information on what influences decisions, many are unaware of strategies that can improve decision-making.

Picture this scenario: You have come into the office early to wrap up a project for your boss that is due first thing this morning. Lo and behold, your boss is also using this morning to catch up on other things and is about to walk right by your cubicle! You begin to feel overly stressed. Should you hide or run? Should you pop out of the cube and immediately start asking about her weekend to keep her distracted? She is seconds away; what do you do?

In a business workplace, we do not have the same stressors and threats people faced in previous centuries. Safety and order in our society have improved considerably and, except for a few extreme occupations, most of us do not fear injury or death at work. However, the “fight or flight” instinct still exists, as do many inherent biases made necessary as humans evolved.

Nearly all decisions are biased. For the most part, it is good to make subjective decisions. Biased decision-making enables you to make many decisions in a day, from choosing what you want for dinner to pressing the brakes when you approach a red light. We are wired to make these decisions to save time and, even more importantly, to save our lives. This allows us to spend time on our higher priorities. However, some of the biases inherent to how we developed our civilization can actually hinder us in the workplace.

Love at First Sight: Anchoring Bias

Imagine you are preparing for a long-awaited interview. As you walk in, you are greeted by the candidate with a hello, a name, a handshake, and a resume. As you sit, your brain already begins processing information about this person: “Do I like her? Should I hire her?” How quickly do you think your brain has made a decision about this candidate through the interview process? When it is over, or maybe even halfway through?

According to studies recounted by Malcolm Gladwell in his book Blink, you make a hiring decision within the first two to three seconds of meeting the candidate. That means that by the time you shook hands and made eye contact, you had already anchored to a decision. If you read her resume or LinkedIn profile, you may have made a decision before she even stepped into the room. You will end up spending the remaining time in the interview confirming or disproving your initial decision.

This is an example of an anchoring bias: overvaluing the initial set of information received toward a decision. This bias explains why you may think you are getting a great deal on a new pair of jeans (40 percent off “retail”); why curb appeal matters for home values more than we would like to admit; and why many of you probably opted for that huge $7 bucket of popcorn!
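A quick per-ounce check shows how the anchor works. The short Python sketch below (offered only as an illustration, using the prices from the opening questions) prints the unit cost of each option; the mid-size bucket exists mainly to make the largest one look like the obvious deal.

```python
# Per-ounce prices for the popcorn options from the opening questions.
# The 9 oz option anchors you: one more ounce for only 50 cents more
# makes the 10 oz bucket feel like the smart buy.
options = {"4 oz": (4, 3.00), "9 oz": (9, 6.50), "10 oz": (10, 7.00)}

for name, (ounces, price) in options.items():
    print(f"{name}: ${price / ounces:.3f} per ounce")

# Output:
# 4 oz: $0.750 per ounce
# 9 oz: $0.722 per ounce
# 10 oz: $0.700 per ounce
```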

In the workplace, one example of anchoring involves project estimation. In general, people are terrible at estimating. How often does a project you worked on deliver less scope than originally expected, run longer than anticipated, or exceed the budget? This often happens because a team is given a target date to complete the project. Instead of working out sound estimates and giving a realistic depiction of what can be completed within that time frame—or what a realistic timeline would look like—the team works to fit its estimates into the time frame it was given.

Types of Biases

Ambiguity Effect

Tendency to avoid or rule out options that have missing information.

A 1990 study showed that people were reluctant to vaccinate a child if the vaccination could cause death, even if deaths from the vaccination were extremely rare compared to deaths caused by the disease itself.1

Anchoring Bias

Tendency to fixate on initial information and devalue subsequent information.

In an experiment run by Dan Ariely (author of Predictably Irrational) at MIT, only 32 percent of students chose the combined print-and-web subscription to The Economist when two options were offered (web only at $59; print and web for $125). When a third, decoy option was introduced (print only for $125), 84 percent of students chose the print-and-web option.2

Availability Bias

Tendency for recent or impactful events to weigh more heavily on the decision-making process; also, the tendency to value information at hand over information that may be more difficult to gather.

People buy lottery tickets because the media and commercials so often show the winners, which makes winning look more common and accessible than it is. In fact, the odds of winning any prize in the Mega Millions lottery are less than 7 percent, and the odds of winning the jackpot are 1 in 258,890,850.3
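Those jackpot odds fall straight out of the game’s combinatorics under its pre-2017 format (pick 5 of 75 numbers plus 1 of 15 Mega Balls); the small Python check below, included purely as an illustration, reproduces the 1-in-258,890,850 figure.

```python
from math import comb  # Python 3.8+

# Pre-2017 Mega Millions format: choose 5 of 75 white balls and 1 of 15 Mega Balls.
jackpot_combinations = comb(75, 5) * 15

print(f"Jackpot odds: 1 in {jackpot_combinations:,}")          # 1 in 258,890,850
print(f"Jackpot probability: {1 / jackpot_combinations:.2e}")  # about 3.9e-09
```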


The More You Know: Availability Bias

You are getting ready to launch into a new market. Your costs are well within what was estimated and expected, the demand exists for the service, and your partners are ready to distribute as soon as you say “go.” Do you go ahead with the price that worked well for the last launch, use your current pricing, or wait for your business partners to provide their input from a competitive pricing analysis? Waiting could cost money, but so could setting the wrong price. Will that pricing analysis really turn up much new information?

We call that initial urge—the comfort of charging ahead with existing information—availability bias. People often overemphasize the value of the information they have and downplay the value of gathering and incorporating information they do not have. In January 1986, NASA had a major decision to make. Launch day at Cape Canaveral was an unseasonably cold 31 degrees. Flight engineers had lots of checks and procedures to follow before launch, but none of that information gave a good sense of what would happen at a temperature that cold. Lost in the sea of charts, diagrams, and procedures was a readout on the rocket boosters’ O-rings—the gaskets between adjoining sections of the boosters. The data was presented in a way that was difficult to interpret. The rest is history: 73 seconds into the launch, Space Shuttle Challenger exploded, killing seven crew members. Availability bias does not usually play out in life-or-death situations, but it can lead us to think that fatal shark attacks are more common than death by cows because we hear about them in the news more often (cows do not tend to be nearly as newsworthy).

Types of Biases

Blind Spot Bias

Tendency to think you are less biased than other people in similar situations.

In a 2002 study, even after having unconscious biases explained to them, 63 percent of participants still rated their self-assessments as accurate and objective.4

Confirmation Bias

Tendency to look for information that supports a preconceived notion.

In 2015, fivethirtyeight.com (run by Nate Silver) published an interactive feature called “Science Isn’t Broken.” Depending on which variables about the economy you choose, you can prove with statistical significance that the Democratic and Republican parties both improve or degrade the economy while in office.5

Framing Effect

Presenting the same information in different ways to try to get an audience to have a certain reaction.

When the same ground beef was described as either 75 percent lean or 25 percent fat, participants in a 1988 study favored the meat labeled 75 percent lean.6

They Are Who We Thought They Were: Confirmation Bias

Tomorrow is the deadline to deliver the operational analysis for which your business unit lead asked. He had a hunch there were significant inefficiencies in the upstream sales organization, but wanted you to validate the hypothesis. He sent a note to an analyst to pull some raw data for you and gave you a list of people to interview. Sure enough, the data showed sales were at an all-time low. None of the stakeholders you reached had anything positive to say about the sales team. Analysis completed…or not?

This happens all too often in the workplace. Someone has a hunch or hypothesis that something is or should be occurring, then finds the data and people to back up the theory. We call this a confirmation bias. It occurs when someone selectively uses data or information to support a position and downplays information that does not support the case. How easy was it to justify the last impulse purchase you made? It is much easier to build a story and then find the data to support it than it is to allow the objective data to guide you down the right path. Depending on your political stance, MSNBC and Fox News are equally biased and as wrong or right as a media outlet could be!

Types of Biases

Gambler’s Fallacy

Thought that future probabilities are influenced by past events when, in fact, the probabilities are unchanged.

In 1913, gamblers lost millions of francs betting against a roulette wheel that landed on a black number 26 times in a row. Gamblers still often follow this mentality today.7

Groupthink

The inability or reluctance of people in a group setting to disagree with the consensus view of the group.

Enron, thought at the time to be a very prosperous and growing company, operated under a groupthink philosophy. That mentality resulted in one of the largest business failures in U.S. history.

Loss Aversion

The utility lost by giving up an item is greater than the utility gained from acquiring the same item.

Investors frequently focus on one investment losing money when the rest of their portfolio is gaining value.

Sixty Percent of the Time, It Works Every Time: Outcome Bias

It is two weeks from the next release launch, and today is your go/no-go meeting. Even though you can delay a release if necessary, there is a ton of pressure to get these new features out the door. The focus groups and beta testers loved it. The media is buzzing about it. The new marketing campaign is snazzy, and finance has built some impressive projections for the revenue the launch will bring. However, that QA team always seems to be behind. They estimate another three weeks to test everything, which would delay the product launch a week.

Should you delay? You have a top-notch development team. No release thus far has ever turned up a major bug—not even the releases where they skipped testing and saved a ton of money. Surely that means there will not be any issues this time, right?

As you might imagine, skipping tests for a major release is a recipe for disaster. We call this an outcome bias. People tend to think the better idea is the one with the better outcome, regardless of the intent or the likelihood of something happening. The team may not feel testing is needed based on the past, but they are forgetting why testing is part of the process in the first place. Because of outcome bias, gamblers tend to flock to the “hot” blackjack table or bet on red to come up again at the roulette table. In fact, the best investment option is to walk away: Red and black each have a 47.4 percent chance of winning (18 of the 38 pockets on an American wheel), no matter what the previous outcomes were.
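The 47.4 percent figure is simply the 18 red (or 18 black) pockets out of 38 on an American wheel, and a short simulation makes the independence point concrete: the chance of red on the next spin is the same whether or not the last few spins came up red. The sketch below is illustrative only.

```python
import random

RED, TOTAL = 18, 38  # American wheel: 18 red, 18 black, plus 0 and 00
print(f"Single-spin chance of red: {RED / TOTAL:.1%}")  # 47.4%

random.seed(42)
spins = [random.randrange(TOTAL) < RED for _ in range(1_000_000)]

# Compare the overall red rate with the red rate right after three reds in a row.
after_streak = [spins[i] for i in range(3, len(spins)) if all(spins[i - 3:i])]
print(f"Red rate overall:                   {sum(spins) / len(spins):.1%}")
print(f"Red rate after three reds in a row: {sum(after_streak) / len(after_streak):.1%}")
# Both hover around 47 percent; the wheel has no memory of previous outcomes.
```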


If I Ignore It, It Will Go Away: Ostrich Effect

The annual employee survey came out almost a month ago, and there it is, sitting right at the top of your inbox. “My employees love me,” you tell yourself as you continue to ignore the document and move right along into the day’s tasks.

As you continue to work through the emails, you see it: One of your best and brightest has put in her two weeks’ notice. You do not understand; how could she leave? You love this company and the work you do; doesn’t your team?

Annual surveys, customer feedback, yellow/red status updates, and financial results are just a few of the things we ignore because of the ostrich effect. Our desire to enjoy life and stay happy can lead us to impulsively ignore negative information.

Have you ever avoided looking at your latest bank statement, yet couldn’t wait to open your child’s report card? We are drawn to positivity, but for things we know, either subconsciously or consciously, are going to be bad, we often drag our feet. According to research by Loewenstein and Seppi, investors tend to review their holdings 50–80 percent less often in a down market versus a rising or flat market. If you are avoiding your portfolio, you are not alone.

Types of Biases

Ostrich Effect

Tendency to avoid negative information

Research by George Loewenstein and Duane Seppi determined that people in Scandinavia looked up the value of their investments 50–80 percent less often during bad markets.8

Outcome Bias

Tendency to judge a decision solely by its outcome rather than by the quality of the decision at the time it was made

Gamblers continue to play in casinos because they think they could win a large sum of money.

Overconfidence

Tendency to be more confident in your answer to a given question than your actual accuracy warrants

Blockbuster was presented with the opportunity to buy Netflix for $50 million in 2000 but decided not to purchase it (Netflix now has a market cap of more than $50 billion). The decision was based on the movie-rental market of the time, without the vision to see the changing landscape.9

Identification and Mitigation Techniques

For each of these biases (and many more), there are a few ways to both notice you are succumbing to them and mitigate their effects.

Understand the problem you are trying to solve.

Take time to reflect on the situation at hand and try not to be reactive. Remember that what you initially see may not always be what it seems.

Collect diverse input.

Reach out to those who might be able to play devil’s advocate or could provide a different angle than you or your team can. Incorporate that feedback into your decision.

Think forward to what consequences a decision will bring.

Instead of reacting and focusing on the decision that needs to be made, think forward to the potential consequences of your decision—and whether they’re good or bad. If possible, compare those consequences to those of alternative decisions.

Be willing to compromise.

In some cases, there are clear-cut decisions; in others, the solutions’ outcomes are not as black and white. Be open to suggestions and modifications to ensure buy-in for the decision.

Let multiple hypotheses drive the analysis, not the desired outcome.

For the decision at hand, build multiple hypotheses and try to prove or disprove those with the data available. If conflicting hypotheses can be proven, you may not have enough information at hand to make the decision.

Embrace curiosity.

Do not be afraid to ask questions. The adage, “Better to keep your mouth closed and let people think you are a fool than to open it and remove all doubt” could not be further from the truth when trying to solve a tricky problem.

Reference and learn from experiences and situations.

As we mentioned, many biases are good. Leverage your knowledge and experience to guide you and your team to an effective decision.

Individually, none of these mitigation techniques will solve your problem. In addition, we often fall victim to multiple biases concurrently. Leverage these techniques in combination with one another to effectively mitigate biases you regularly encounter in the workplace and your personal life.

We encounter a great deal of decision-making bias daily. For most decisions, moving forward with a biased decision is appropriate. For the small subset of important decisions requiring more objectivity, we should be mindful to recognize and avoid biased decisions. At the least, it is nice to know the beach is a safer family vacation option than the farm!

Types of Biases

Reactance

Tendency to make a choice opposed to the guidance received because of the perceived loss of freedom of choice

Frequently seen with parent/child relationships; kids tend to want to go to their friends’ houses more when their parents tell them they are not allowed to go.

Stereotyping

Basing one’s perception of another on a generalization of that person’s gender, ethnicity, personality, etc., without having actual information about that person

Participants in a study by Greenwald and Banaji were asked to pick out famous names from a list. When they mistakenly chose fictitious names as famous, they chose male names by a 2-to-1 ratio.10

Zero-risk Bias

Tendency to prefer complete elimination of smaller risks over an alternative that produces a much greater risk reduction but doesn’t fully eliminate the risk

The Delaney Clause, part of the Food Additives Amendment, outlaws cancer-causing additives in foods, regardless of how small the actual risk is or whether other ingredients pose greater risks.11

References

  1. Ritov, I., & Baron, J. (1990). Reluctance to vaccinate: omission bias and ambiguity. Journal of Behavioral Decision Making, 3, 263-277
  2. Ariely, D. (2008). Predictably Irrational; see also Ariely’s TED Talk, “Are we in control of our own decisions?”
  3. https://www.durangobill.com/MegaMillionsOdds.html
  4. Pronin et al. (2002)
  5. fivethirtyeight.com
  6. Levin and Gaeth (1988)
  7. The Universal Book of Mathematics: From Abracadabra to Zeno’s Paradoxes
  8. Zweig, Jason (September 13, 2008). “Should You Fear the Ostrich Effect?” The Wall Street Journal. pp. B1
  9. https://variety.com/2013/biz/news/epic-fail-how-blockbuster-could-have-owned-netflix-1200823443/
  10. Greenwald and Banaji (1994)
  11. https://medical-dictionary.thefreedictionary.com/Delaney+clause

Part 2: Planning for Bias

Understanding the variety of biases that can affect our decision-making in process design.

Think back to the year 2000. Most of us felt good about surviving Y2K and entered a new millennium with a strong U.S. economy riding the tech bubble. At the time, if you wanted to watch a movie, you either went to a theater or rented a videotape from a brick-and-mortar video rental store—likely Blockbuster. As the clear market leader in a well-established industry, Blockbuster sat atop the video rental world and expected to keep shaping the industry’s future, as it had for the past decade. Or so it thought.

In its market-leading position, Blockbuster benefited from a process that was not customer-centric—and did nothing to change it. The video rental behemoth profited from late fees. The company attributed 15 to 20 percent of its revenue to late fees in a given year.1 Feeling invincible, Blockbuster saw no reason to change its model or listen to startups pitching new business models.

In the spring of 2000, one specific startup pitched its revolutionary business model: a delivery subscription video rental service with no late fees. The startup’s executive team nearly got laughed out of the Blockbuster CEO’s office for proposing, at a $50 million valuation, to integrate with Blockbuster’s enormous footprint and create a “click-and-mortar” video rental model.2 You may have heard of this startup. It’s called Netflix, and it now has a market cap of about $160 billion.3

Why would Blockbuster’s leadership not realize that a better, more customer-centric model existed? As the existing state of business processes becomes ingrained in leadership’s thinking, it is easy for leaders to become near-sighted about competition, strategy, and how processes affect customers.

Count this among the many reasons Blockbuster ultimately filed for bankruptcy and liquidated its assets: Its leadership was unable to empathize with its customers and recognize how its processes undermined customer loyalty to the Blockbuster brand.

A big cause of this oversight was bias in its processes. Among the many biases Blockbuster executives showed was overconfidence bias: They thought they could do anything their competition could do, but better.

Biased? Who, Me?

To set the stage for identifying bias in process design, we must first provide background by defining and classifying decision-making biases (lest we fall victim to optimism bias: thinking each of you has read our previous article on decision-making biases).

Nearly every decision we make is biased. In most cases, this is a good thing. Instead of analysis-paralysis, our instincts and experiences help us narrow down the choices to a manageable set of options. What shapes our final set of options or choices are cognitive biases.

In some cases, biases are helpful. Think about the last time you went to a networking event. You either approached or were approached by a stranger, and both of you started to ask general questions: “How are you?” “What do you do?” “Where did you go to school?” Through the small talk, you formed an opinion about that person and began to ask leading questions—questions to confirm the assumptions you’ve made about that person.

You might also subconsciously start to mirror the language and actions of the other person, even if those traits are not normal for you. These may seem irrational (or you may not even be aware you do this at all), but you employ the confirmation bias and mirroring to improve the chances that you build a strong social connection with the other person.

In other words, biases help you gain an acquaintance and potentially a business partner or friend in the future.

In other cases, biases are unhelpful. They can limit information or reduce options in certain situations, which can lead to terrible results much further on. Take our introductory example: Overconfidence bias in Blockbuster’s business model led it down a path of irrelevancy and, ultimately, liquidation. We must acknowledge, however, that assuming Blockbuster could easily have decided to purchase Netflix is an example of hindsight bias.

While the effect of this example is larger than most process decisions in your daily life and business, you never fully know how the bias you use now will affect future results and outcomes. However, employing basic tactics to avoid as many biases as possible should improve your company’s ability to operate.

What Are We Doing Here?

Let’s set the stage with an example we’ll draw upon throughout the article: We want you to imagine that you’ve agreed to plan the next family trip. Before you and your family get to departure day, you must consider a significant number of variables.

The process potentially started years ago. Where have you been in the past that you’ve enjoyed? What about the rest of your family? Are there places you’ve been that they haven’t had a chance to experience? Or is your family a collective creature of habit? Are you looking for time to just “get away,” or does the family need to spend some time together bonding, outside of the normal routine?

Depending on the goal of the trip, what you do, where you go, and how long you stay could vary widely.

You must also think through the logistics of getting there. Are you driving, flying, or taking another mode of transportation? Which is the bigger concern: cost or time? Must everyone leave and arrive at the same time, or is simply getting there the only concern?

Additionally, you must consider what to take with you. What’s your destination’s climate that time of year? Does your transportation mode constrain what you can bring? Can you buy what you need when you get there, or does your preparation make or break the trip?

As you can see, there is a process that you follow each time you take a trip. Next, we break down the stages of process design and discuss how bias might influence your decision.

Process Design

Goal Setting and Scoping

As we do with nearly any type of project, we must start with the end in mind. Too often, we forget to ask the most important question: Why? We know something can be improved, but without a goal, the improvements could actually undermine the overall process.

Our first bias crops up here: the curse of knowledge. We tend to think others have the same information we do when making decisions. This is especially important when setting goals. Because we think others have the same information, we assume their “rational” choice is to have the same goals for a project’s outcome as we do.

Also, for whom are we designing the process? Without considering the end customer, we could design a process that’s great for the operator and terrible for the customer.

In our vacation example, we must set a goal for designing the best process we can: Are we looking for efficiency first, or cost savings? What about enjoyment? Are we able to cope with intense stress throughout the trip, or do we want to enjoy the ride? Are we traveling with others and need to consider them, or are we going solo?

By starting with the outcome we’re trying to achieve, we purposely employ another bias in our favor: outcome bias.

Suppose we have a family of four and we all agreed that our high-level goal is to reduce everyone’s stress as a part of this trip. We’ve agreed we have about a week to get away and that flying will be much less stressful than driving to our destination.

Now that we’ve determined our goal, how much process and planning should we apply to the details of the trip? Do we want to change how we plan each day, or do we want to tackle a pain point from the past? We could go all out, covering everything from luggage choices and security pre-screening needs to unpacking at the destination—and packing back up for the trip home afterward.

We could also just choose one small portion of that process. We could easily get caught up by including too much scope—or leaving too much out—but scoping is another important step to ensure you and others focus your energy on the critical decisions.

In our example, we’ve traveled enough to know that we do well once we’re on the plane, en route to our destination. We also do really well managing our time during the vacation. What stresses us out, however, is what happens before we board the plane to our destination.

Scoping also allows us to avoid the analysis-paralysis that inhibits many improvement or design projects. When scoping, it’s important to capture not just the boundaries of the process you’re solving for, but also the information you’ll need to solve it.

Seeking too much information is called information bias: the belief that the more information you have, the better the decision. Sometimes, having too much information (especially irrelevant information) can slow down your ability to make a decision, or even lead to a sub-optimal decision.

When planning trips (and in other situations), we often fall victim to this bias. In our example, we might think a place that sleeps 12 is a better value than one that sleeps eight, but with a family of four, does it really matter? Why do we care whether our vacation spot has high-speed Wi-Fi if we don’t plan on bringing our computers?

To limit the amount of information we need to make the most optimal decision, we must determine and prioritize the factors that truly influence the outcome we want.

If our goal is to reduce everyone’s stress, we may opt to take the time to complete the TSA pre-check enrollment interview, take our time with packing, and drive back roads to the airport.

If our goal is to minimize the time for packing and going through security, we may stuff everything into as few bags as possible and check our luggage through to our final destination. If our goal is to reduce stress, the airline baggage fees may be irrelevant. If the goal is reducing time, who cares how many of our toiletries we can bring with us?


Process Definition

Once we’ve determined our goal for the overall trip and scoped what we want to improve, the next step of the process design is to define the process.

First, we must map out the high-level process as we, the end customers, experience it. Too often, processes are mapped out with egocentric bias; in other words, the process design and definition are weighted too heavily toward the person defining them and not toward the person at the center of the outcome.

In our example, we’re concerned only with reducing stress. What happens with our luggage once we drop it off is irrelevant for our process design and outside of our control.

For our scope, we start with the high-level steps of:

  • Pack luggage
  • Get to airport
  • Check in
  • Clear security
  • Board plane

At this level, we can judge whether we have enough detail to pinpoint where to focus our improvement efforts—or whether we need more. To avoid information bias, we each assign a stress value to each step of the high-level process. After doing so, we find that getting to the airport and clearing security generate most of our stress.
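As a rough illustration of that scoring step, the sketch below uses hypothetical 1-to-5 stress ratings (the step names come from the list above; the numbers are invented for the example) and sorts the high-level steps by the family’s combined stress.

```python
# Hypothetical 1-5 stress ratings from each of the four family members per step.
ratings = {
    "Pack luggage":   [2, 3, 2, 2],
    "Get to airport": [5, 4, 4, 5],
    "Check in":       [2, 2, 1, 2],
    "Clear security": [4, 5, 5, 4],
    "Board plane":    [1, 2, 1, 1],
}

# Rank the steps by total stress so the redesign focuses on the worst offenders first.
for step, scores in sorted(ratings.items(), key=lambda kv: sum(kv[1]), reverse=True):
    print(f"{step:<15} total stress = {sum(scores)}")
# "Get to airport" and "Clear security" come out on top, matching the scoping above.
```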

From here, we more thoroughly define the stress-inducing areas of the process. For getting to the airport through clearing security, we walk the process and find that today, we follow these steps:

Get to airport
  • Get luggage in family car
  • Get in family car and drive to airport
  • Park at the airport
  • Leave car and board airport parking shuttle
  • Arrive at airport terminal
Check in
  • Download airline mobile app
  • Check in before arriving at the airport to reduce steps required once there
  • Use curbside check-in to avoid crowds at check-in area inside airport
  • Ensure baggage reference numbers show up in mobile app
  • Proceed to security
Security
  • Stand in line
  • Find ID and boarding pass
  • Present ID/pass to TSA agent
  • Get to security conveyor
  • Take off jacket, belt, and shoes
  • Unpack toiletries
  • Load all items on conveyor
  • Walk through security checkpoint
  • Retrieve and put on jacket, belt, and shoes
  • Repack toiletries
  • Leave conveyor and head toward gate

Building more detail into these steps allows us to evaluate more deeply where the stress points of the process are. It also produces improvement opportunities. At the higher level, we cannot change the fact that we have to pack our luggage and clear airport security (unless we buy everything at our destination or avoid the airport altogether—options we ruled out during scoping). But we can start to change things at this level.


Analysis and Design

Now that we’ve added more definition to our process, we should add data to the steps. We know our goal is to reduce stress in the process. Now, we want to avoid what is called shared information bias: the tendency for a group to spend more time on information every member already knows than on information only some members hold.

We could spend time talking about why we have to go through security at the airport in the first place, but that’s neither new nor relevant to our outcome. After some discussion, we each identify the major stressors in the process at this level:

  • Fitting everything and everyone into the family car
  • The uncertainty about when the airport parking shuttle arrives
  • Having to take off your belt and shoes
  • Having to unpack and repack your toiletries at the security checkpoint

Now that we’ve identified our major stressors, we can just solve for them one by one, correct? There are four of us. Suppose one person solves for each stressor and shares the result back to the group. How often are we asked in our day-to-day world to divide an issue, solve it, and share our solution with our team members?

If we go about process analysis and design in this fashion, we’re bound to suffer from two more biases: availability bias and false consensus bias.

Availability bias, or the tendency to focus only on the information available to you, is problematic when teams are asked to divide and conquer. Instead of creating solutions that benefit the entire process and outcome, solutions can become localized to the point of irrelevancy. For example, if we focus only on the uncertainty of airport parking shuttle arrivals, we may settle for a localized improvement that routes us to a different airport parking facility. With a broader perspective, we might see solutions that aren’t apparent when solving only for the shuttle issue.

On the other hand, false consensus bias would have us think something that works for us will work for everyone. If we’re tasked with reducing stress over removing belts and shoes, we may ask everyone to wear gym clothes and flip-flops to the airport. That may work for some in the group, but may not be acceptable travel attire for others.

Instead, we should attack each of these issues as a group and determine its root cause. If we do this, we might find we can eliminate two major stressors (packing the car and waiting for the shuttle) by hiring a rideshare service to take us to the airport (root cause of both issues: using our own car). We could eliminate the other two issues by applying for TSA pre-check (root cause: going through the standard security process).

As we walked through planning a trip, you can see the number of decisions required. More importantly, you can see how biases significantly affect the efficiency and effectiveness of the process. We only scratched the surface with the decision-making biases introduced here. Knowing which biases you are susceptible to, and the impact of each, allows for better process design from the beginning and less rework on the back end.

Our family trip example is a process that affects our personal lives. The same process design framework and biases apply to business decisions, from small ones that affect only a few people, to large ones that affect the longevity of a business, such as Blockbuster’s decision to spurn Netflix.

Sources

  1. Anderson, Mae and Liedtke, Michael, “Hubris—and late fees—doomed Blockbuster,” NBC News, 23 Sept. 2010, https://www.nbcnews.com/id/39332696/ns/business-retail/t/hubris-late-fees-doomed-blockbuster/#.WwhI50gvzD4
  2. Graser, Marc, “Epic Fail: How Blockbuster could have owned Netflix,” Variety, 12 Nov. 2013, https://variety.com.
  3. “Netflix, Inc.” Google Finance, 25 May 2018, https://www.google.com/search…nflx…financte.