Bad Inferences – Fallacies and Biases

Overview: What better way to avoid relying on weak inferences than knowing what they look like? This page lists common fallacies and biases.

We are never going to have complete certainty when choosing a causal or predictive inference. We just have to go with the strongest ones we can generate. Unfortunately, our brains are not set up to take the time to generate and evaluate lots of inferences. Instead, we are programmed to think quickly, so we tend to generate one (or maybe two) and act on these without much analysis.

Weak inferences tend to follow patterns, which can be lumped into a larger category of logical fallacies. These include:

1. Hasty Generalization – Making a claim about a large group based on a small sample. You infer that all dogs bite because one dog bit you or that all New Yorkers are mean because one spat on you. This is the root fallacy of racism and prejudice, among other ugly shortcuts that our brains make to save time.

2. Post Hoc Ergo Propter Hoc (Correlation is not Causation) – This is simply a weak causal inference: it assumes that two events are connected merely because they happened at around the same time. Betty walked into the restaurant; ten minutes later, the building was engulfed in flames. Therefore, Betty must have burned it down. This fallacy is the root of superstition – you base your beliefs on a weak causal link. Despite understanding this fallacy, I continue to secretly believe that several of my own wardrobe and consumer choices have helped the Boise State football team flourish over the last decade.

3. False Dichotomy (Either/Or Fallacy) – You assume that there are only two inferences available when there are, in fact, many more. If you hear a bump in the night, you might infer it was either a ghost or a burglar, even though there are many far stronger inferences available. If you are a salesperson, you can take advantage of this by presenting a buyer with a terrible product followed by whatever product you actually want to sell. It doesn’t matter that there are thousands of products out there. The buyer may very well assume that these two options are the only two available.

4. Slippery Slope – This fallacy is a predictive inference claiming that one action will unleash a flood of terrible events, even though we have safeguards in place to ensure that this doesn’t happen. This fallacy might also be described as a string of weak predictive inferences: If I leave my desk, someone will steal my computer, which means that I won’t be able to do my work, which means I will be fired, which means I won’t have enough money to buy food, which means I’ll die alone in the gutter.

In addition to these fallacies, a collection of biases pull us towards weak inferences:

1. Availability Bias (or Availability Heuristic) – We make inferences with the information that is readily available to our brains. Not only is this information a small fraction of what’s out there, but our brains gravitate to information that is sensational, weird, or otherwise emotionally compelling. Fear of flying is largely based on this bias. Planes are outrageously safe, but we don’t hear about the thousands upon thousands of flights that land safely every hour. Instead, we hear disproportionately about troubled flights. A good way to conquer a fear of flying is to hang out at the airport and watch planes take off and land all day. Fear of sharks follows a similar pattern. The movie Jaws ruined the ocean for many of us.

2. Confirmation Bias – This is a tendency to generate and positively evaluate claims and inferences that support a previously held view of the world. If you feel strongly that flying is dangerous, you will notice more information that seems to verify your belief. Similarly, I seek out articles establishing Boise State football as the greatest program ever, and I quickly dismiss any dissonant or contradictory reports. This bias is significantly more dangerous now that the internet allows us to feed exclusively on information that supports our political, religious, social, and scientific points of view.

3. Suggestion Bias (or Priming Effect) – This is an exceptionally powerful and sneaky inclination to be unduly influenced by recently acquired information. Much of this happens unconsciously, based on our associations with the sights, sounds, and general context of the moment. We generate and favor inferences that jibe with what we just experienced. Young people tend to move more slowly after being primed with words that they associate with old people. (This is called the “Florida Effect.”) A slick salesperson can use this to his or her advantage. There are many important priming effects for education, perhaps none more important than Stereotype Threat.

4. Sunk Cost Fallacy – We tend to stick with inferences and ideas that we have somehow invested in. We can invest our time and energy, or we can simply be emotionally tied to one inference over another. For example, if I pay for a study on the relationship between bee deaths and cell phone towers, I will favor that inference over others, even if the evidence isn’t as compelling. Similarly, if I have spent a lot of time explaining how much I hate LeBron James, I will continue to point to his negative attributes despite the flood of evidence that he is the greatest basketball player of all time.

5. Perspective Gap – We don’t know what it’s like to be other people, so our inferences about them are based on our own needs and preferences. If Pat is stressed out about work, I may insist that he join me on a weekend fishing trip to help him relax. Because I find fishing relaxing, I assume that he feels the same way. In fact, Pat hates to fish, and he infers that I am taking him out to the woods to rob him and leave him for dead.

6. Correspondence Bias – We infer that a person’s mind, including his or her intentions and abilities, matches his or her actions. While it’s true that this is often the case (I take a drink of water because I’m thirsty, and I’m smart enough to know that this will do the trick), any action can be motivated by a wide variety of intentions, actions are perceived differently by different people, and people often succeed and fail for reasons that don’t reflect their skills. For example, when I accidentally drove through a red light, other drivers inferred that I was arrogant and in a rush, pedestrians inferred that I was a crazed lunatic, and I inferred that this was just a little bad luck because I’m normally a wonderful and charming driver (see Fundamental Attribution Error).

7. Fundamental Attribution Error – When we mess up, we prefer causal inferences that incorporate the special circumstances of the situation. When others mess up, we prefer causal inferences that point to flaws in their respective characters. I failed that math test because I had to stay up all night with my sick puppy. Pat failed because he’s an idiot and never does anything right.

8. Transparency Effect – We weakly infer that our own inner thoughts and feelings are clearly communicated through our expressions, words, and actions. In fact, people really don’t have any idea what you are thinking and feeling. Other people are mostly attending to their own thoughts rather than the thoughts of others. Plus, others happen to be pretty bad at inferring inner thoughts from outer actions (see Correspondence Bias).

9. Spotlight Effect – We think everyone is watching us, but they aren’t. We overestimate how much others notice what we are doing, and many of our self-conscious inferences (Everyone is laughing at my bad haircut!) are far weaker than we think.

10. Impostor Syndrome – We think that we don’t belong or don’t deserve the success we have achieved despite the fact that we’ve worked really hard and are perfectly competent. This is a bias born of excess humility. It plagues new students, new employees, and lots of really nice people.

11. Anchoring Effect – We greatly over-value the first piece of information we get about a person, place, idea, or any other subject. If Brad sneezes on me the first time I meet him, I will feel a sense of disgust thereafter when I see him, and I will interpret all of his actions as signs of his messiness. (He’s washing his hands? Gross. They are probably full of snot.)

12. “Mere Exposure” Effect – We tend to value people and things that we are familiar with, even slightly familiar with, more than people and things that are new. Familiarity implies safety in many cases, which is enough for us to prefer the known over the unknown.

13. Halo Effect – The tendency to infer from one trait a matching value for other traits. If we notice one good or bad quality of a person, we tend to extend that good or bad judgment across every aspect of that person’s character. For example, it’s good to be charming, so I might infer that a charming voice on the phone is being delivered by a good-looking, virtuous person. This is likely why we are so easily fooled by charming liars.

14. Essentialist Bias – Our deep drive to classify and categorize the world leads us to infer deep commonalities between people or things based on one shared trait. Perhaps the most damaging instance of this is the belief that the members of a human group (like a “race” of people) share some invisible essence. See Hasty Generalization above.

This article from Business Insider provides a great overview of these and several more cognitive biases.

Note: Daniel Kahneman’s Thinking, Fast and Slow digs deep into these biases and effects. My goal is to make much of this material digestible for students. Regrettably, some important distinctions and nuances get lost in the simplification process.
