13 Mental Traps You Need to Avoid


If you want to make good decisions, or at least less wrong ones, it’s important to avoid these common mental traps.

In almost all situations, the best way to reach the most beneficial option in a tough decision is solid, rational thought. There’s certainly something to be said for going with your gut at times, particularly in situations where an immediate decision is required to get you out of danger. For bigger, less immediate decisions, though, taking a long, objective look at things gives you the best vantage point from which to decide.

The problem is, in a lot of ways our brains suck at rational, objective thought.

We suffer from a host of cognitive biases that disrupt our ability to make good, rational decisions. These likely conferred an evolutionary advantage in the past, when focusing on the negative or overemphasizing imagined patterns made you more likely to survive to reproductive age and less likely to get eaten by a Smilodon. In modern times, they tend to just get in the way and encourage us to make bad decisions.

Thankfully we can fight their influence once we know what to look out for. Here are thirteen of the more common ones and some easy ways to counteract them.

The Common Cognitive Biases

There are definitely more than thirteen cognitive biases in total, but these are the ones that seem to pop up the most and have the potential to cause the biggest problems on a day-to-day basis. In a lot of respects, just knowing about these biases and our tendency to default to them can help you avoid them – if you’re aware of the trap, you can tell when you’re about to walk right into it.

The Anchoring Effect / Focalism

Focalism is a cognitive bias rooted in our tendency to fixate on a specific number and then base all of our further calculations on that value. In a negotiation, for example, if you set the initial price higher and then ask people how much they think the item is actually worth, they will tend to guess higher. It also explains our tendency to evaluate sale items in terms of the money saved off the original price rather than by the final price itself.

If you intend to spend $200 and someone offers you a $500 item on sale for $350 or an item at full price for $190, you’re more likely to evaluate the choice based on the reduction in price rather than on the fact that the final price of the sale item is still more than you intended to spend. In other words, you’ll likely pick the item on sale regardless of whether it’s the best choice. This also leads to our tendency to pick the middle option when given a set to choose from.

If you offer people items at $100, $300 and $1,000, the high anchor of the $1,000 option makes the $300 option look more attractive than it would if only the $100 and $300 options were on the table.
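To make the arithmetic concrete, here’s a minimal sketch in Python of the two ways of evaluating that sale example. The prices come straight from the scenario above; everything else is illustrative scaffolding, not a claim about how any real pricing decision works:

```python
# Toy illustration of the sale example above (hypothetical prices).
options = [
    {"name": "sale item", "original": 500, "price": 350},
    {"name": "full-price item", "original": 190, "price": 190},
]
budget = 200

# Anchored evaluation: chase the biggest discount off the original price.
anchored = max(options, key=lambda o: o["original"] - o["price"])

# Budget-based evaluation: the cheapest option that actually fits the budget.
affordable = [o for o in options if o["price"] <= budget]
rational = min(affordable, key=lambda o: o["price"]) if affordable else None

print(anchored["name"])   # sale item: $150 "saved", yet $150 over budget
print(rational["name"] if rational else "nothing fits")  # full-price item
```

Same two options, two different winners – the only thing that changed is which number you let anchor the comparison.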

Unfortunately, the Anchoring Effect is one of the hardest biases to counteract; being aware of it doesn’t always actually help. There is some evidence suggesting expertise in the relevant field can help, but it’s inconclusive. As it stands, the best defense is to recognize that when you’re faced with a variety of options, your estimate will be skewed by the most extreme anchor point.

Negativity Bias

We tend to fixate more on negative news than on positive news. This isn’t just a general observation, either: the amygdala (one of the parts of the brain responsible for creating long-term memories) is specifically primed to seek out negative experiences and turn them into long-term memories first. Our limbic (emotional processing) system also gives negative information and stimuli strong precedence over positive ones.

From an evolutionary perspective, this was probably useful for keeping us alive in the past. Knowing that fire will hurt you is more important to your survival than knowing that hugs feel good. In modern times, though, it can encourage us to focus too much on the negative. That can make us excessively risk-averse and interfere with the way we accept criticism and praise.

The best way to combat this bias is to make a concerted effort to be mindful of all the positive things that happen. Don’t swing too far into optimism and begin overestimating the positives, but be aware of them. Also recognize that you’re more affected by negative input like criticism than you are by positive input like praise.

Neglect of Probability

Humans suck at intuitive estimations of probability.

Even worse, when the math has been done for us and we know the statistics, we still tend to ignore probabilities altogether. Take most people’s fears, for example. A lot of people are afraid of dying in a plane crash or being killed in a terrorist attack. At the same time, they don’t think twice about hopping in their car, running down their stairs or eating three pounds of fast food a day.

The fact is, though, you’re far more likely to die in a car crash, a fall down your stairs or a heart attack than in any of those other scenarios. More people have been shot and killed in this country by toddlers this year than have been killed by terrorist attacks. The lifetime odds of dying in a car accident are around 1 in 84, while the odds of dying in a plane crash are at most 1 in 5,000. More people are shot and killed in a year in the U.S. than are killed by terrorist attacks around the entire world.

The problem is, even when you give people these statistics, their behavior doesn’t change. You can tell someone who is 200 lbs. overweight that heart disease is the number one killer in the U.S., and they’ll still be more scared of someone breaking in and murdering them than of eating crappy food.

The best way to get over this bias is to actually allow statistical information to inform your behavior. When you see a statistic like a 1 in 84 lifetime chance of dying in a car accident, become a more careful and aware driver as a result of it. Worry less about the things that are statistically unlikely and more about the things that are actually probable.
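If it helps to see the comparison spelled out, here’s a minimal sketch using the approximate odds quoted above (rough, illustrative figures, not authoritative statistics):

```python
# Approximate lifetime odds quoted above (illustrative, not authoritative).
lifetime_odds = {"car accident": 84, "plane crash": 5000}

# Rank the risks from most to least likely.
for cause, n in sorted(lifetime_odds.items(), key=lambda kv: kv[1]):
    print(f"{cause}: 1 in {n} (~{1 / n:.3%} lifetime probability)")

# Relative risk: roughly 60x more likely to die in a car than in a plane.
print(lifetime_odds["plane crash"] / lifetime_odds["car accident"])  # ~59.5
```

Seeing the sixty-fold gap as a plain number makes it much harder for the scarier-sounding risk to hijack your attention.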

Ingroup Bias

The Ingroup Bias is our tendency to give preference to those we consider part of our own ingroup – our general circles of association. It’s a kind of automatic tribalism that encourages us to treat members of our own group better and members of outside groups worse.

This is reflected in the sense of ‘other’ that the Ingroup Bias often exaggerates. We show favoritism toward groups we consider ourselves to belong to, in treatment and in the allocation of resources, and we overestimate the abilities and positive features of those groups.

In its milder forms, this can lead to unnecessary competition and the exclusion of people who would actually be helpful to your goal. In its more extreme forms, it can lead to racism and genocide. Whenever you start to feel competitive (outside of a genuine competition, of course) or start to think in terms of ‘them’ vs. ‘us’, it’s important to recognize that the people in the ‘them’ group are likely not all that dissimilar to you.

The Gambler’s Bias

The Gambler’s Bias, also called the Gambler’s Fallacy, is the tendency to think that past outcomes affect future results in genuinely randomized systems. In other words, people who are doing well at a dice game might say they’re “on a hot streak,” while someone who’s been losing consistently may say they’re “due for some good luck.”

In reality, in a random system, past outcomes have no effect on future outcomes. If you flip a penny and get 10 heads in a row, the odds of landing tails are exactly the same on the 11th flip as they were on the 1st. What makes this confusing is that the odds of landing 10 heads in a row do differ from those of landing most other outcomes – but only because any specific sequence of 10 flips is equally rare; a streak of heads just looks more special than a jumble.
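Here’s a minimal sketch in Python that puts numbers on both claims – the rarity of the streak and the independence of the next flip:

```python
import random

# Any *specific* sequence of 10 fair flips has probability (1/2)^10:
print(0.5 ** 10)  # 0.0009765625, i.e. 1 in 1,024 -- HHHHHHHHHH and HTHHTTHTHT alike

# Simulate: among runs whose first 10 flips are all heads,
# how often is the 11th flip also heads?
streaks = 0
heads_on_11th = 0
while streaks < 1000:
    flips = [random.random() < 0.5 for _ in range(11)]
    if all(flips[:10]):          # first ten flips came up heads
        streaks += 1
        heads_on_11th += flips[10]

print(heads_on_11th / streaks)   # ~0.5 -- the streak tells you nothing about flip 11
```

(Expect the simulation to churn through about a million runs to find a thousand streaks; that rarity is exactly the 1-in-1,024 figure at work.)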

Where this gets people in trouble is that they think they’re ‘lucky’ or, in the opposite case, ‘overdue’ for a win. That encourages them to keep betting past the point where it’s prudent to do so. It also encourages people to overestimate the odds of a positive outcome in all sorts of situations.

The best way to avoid this fallacy is to understand that the concept of ‘luck’ – an invisible or indeterminate force that tilts probabilities in favor of or against specific individuals – is as imaginary as fairies or Santa Claus. In a randomized system, no matter what your past wins or losses were, you are no more guaranteed a win than when you first started.

The Dunning-Kruger Effect

The Dunning-Kruger Effect is basically the tendency for incompetent people to overestimate their ability and for skilled people to underestimate their ability.

Put another way, people who are unskilled or non-proficient at something are much more likely to rate their ability in it as above average. This is mostly because they’re not proficient enough to recognize their own lack of skill – a kind of everyday anosognosia. Conversely, most skilled people assume everyone else is equally proficient and, as a result, fail to accurately estimate their own skill level relative to others.

Always reevaluate and test your assumptions about how skilled you actually are as time goes on, and don’t just assume your initial assessment is correct. In general, if you think you’re better than average at something, you may not be, and if you think you’re at or below average, you may be better than you think. Don’t assume, though, that thinking you’re bad at something means you’re actually good at it – test yourself and compare against others.

Selection Bias

Selection Bias is the tendency to pick out the examples of something that fit a certain pattern while unconsciously disregarding examples that contradict it. It’s similar to the Baader-Meinhof Phenomenon, where you learn of something new – maybe a new word, or a new song – and suddenly it starts popping up everywhere you go, seemingly by coincidence.

In the same manner, Selection Bias causes us to pay attention to the things we’re primed to notice, to the exclusion of other things. This causes us to erroneously perceive patterns that don’t necessarily exist. In the case of the Baader-Meinhof Phenomenon, the thing you’re now noticing everywhere was generally there all along; you just never paid attention to it until you were primed to.

This isn’t a terrible bias in terms of causing problems, but it can make people believe or do odd things based on ‘patterns’ that just aren’t there. It also causes us to pick out things that reinforce our current beliefs at the cost of blindness to examples that contradict them. If you start seeing patterns pop up, do some objective analysis and see whether there’s anything actually going on before you start coming up with crazy ideas.

Confirmation Bias

Confirmation Bias is kind of the big brother of Selection Bias in that it causes people to fixate on things that confirm their opinions and beliefs and ignore things that run contrary to them.

Conservatives will watch FOX News and liberals will watch MSNBC. You’ll also tend to associate with people who hold the same views rather than those who hold differing ones. In more extreme cases, you may blind yourself to good evidence against your position. A person who thinks dreams foretell the future, for example, will remember the one time they dreamed about a car accident and then had one, and completely ignore the thousands of dreams that never came true.

Confirmation Bias can be dangerous because it blinds us to the truth for the sake of feeling good about our opinions. A good exercise for working on it is to regularly expose yourself to material that contradicts your beliefs. Approach it honestly, though; going in with the attitude of ‘debunking’ it is just another expression of confirmation bias.

Change Aversion

People are terrified of change.

This may tie into the fact that humans experience losses much more forcefully than equivalent gains – in other words, we’re extremely loss-averse. Whatever the cause, people are almost always more willing to stick with the status quo than to change things up.

This may not sound so bad at first, but the habit can be detrimental because it encourages us to disregard options that may be objectively, empirically better just so we can avoid the discomfort of change. We will willfully stick with the worse situation we have now rather than change things for the better.

This self-destructive homeostasis can stop us from improving our lives and the lives of others. Whenever you’re thinking of making a change, make an objective pro-con list. Tally up the relative scores and let them tell you whether you really should keep things the same or whether changing things up would be more beneficial, without the usual extra weight the status quo gets.
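As a trivial sketch of that tally (the items and the 1-5 importance weights below are hypothetical placeholders, not a formula – the point is just to force both sides onto the same scale):

```python
# Hypothetical pro/con list for a change, each item weighted 1-5 by importance.
pros = {"better pay": 4, "shorter commute": 2}
cons = {"less job security": 3, "leaving friends": 2}

score = sum(pros.values()) - sum(cons.values())  # 6 - 5 = +1
print("make the change" if score > 0 else "keep the status quo")
```

Writing the weights down before you compare totals makes it harder to quietly inflate the ‘keep things the same’ side after the fact.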

Herd Behavior / The Bandwagon Effect

People also really want to fit in.

They want to fit in so much, in fact, that as the Asch conformity experiments showed decades ago, people are even willing to override their own judgment in order to follow the influence of the group. Most people are far more willing to go with the flow and conform, at the expense of their own conscience and free agency, than to exercise them in defiance of the herd.

Most people are at least aware of mob mentality or herd behavior, but that isn’t always enough to fight its effects. For important decisions or positions on important topics, stop every now and again and closely examine why you think the way you do. Can you rationally defend your position or choice with hard evidence? If not, you may need to take a closer look at it, particularly if you’re on the side of the majority, since you may just be following the group.

Post-Purchase Rationalization

Post-purchase rationalization is pretty much exactly what the name says: the rationalization of the decision to purchase a specific product after it’s been purchased. It’s usually most prevalent with more expensive purchases, because those tend to involve more pre-purchase research and deliberation and carry a lot more emotional investment.

The danger of post-purchase rationalization lies in its ability to blind us to the times we’ve made emotional decisions or errors in reasoning, which prevents us from correcting them in the future. If you bought something and it turned out to be a huge waste of money, just admit that you made a mistake and move on. Spending additional effort to paper over the cognitive dissonance between your pre-purchase expectations and emotional investment and the post-purchase reality and disappointment is just a further waste of your time.

If you want to take this a step further, apply that reasoning to the rest of your life and stop rationalizing your bad decisions away. Recognize why they were faulty and resolve not to repeat them.

Projection Bias

There are two biases frequently labeled ‘projection bias’, so I’m going to lump them together, since they boil down to the same problem: in general, we’re very bad at imagining a mind substantially different from our own and, as a result, tend to project our current mind onto every conceptual model of a mind we work with.

What does that mean?

First, it means we naturally assume everyone else thinks in a manner very much like our own. It is currently impossible for you to directly experience any mind other than your own. You can interact with other people and, through that, develop models and understand that they too have minds, but you can never completely verify it through direct experience.

This is where ‘brain-in-a-jar’, Matrix-esque solipsism arguments come from. I can’t prove to you, at least not completely, that I’m not a very well-constructed figment of your imagination.

This is a problem because in practice people often think very, very differently. Not just in terms of conclusions but in the methods used to arrive at those conclusions. Assuming that everyone else thinks the same as you is only going to cause difficulty.

Second, it means we tend to be unable to imagine our own future states of mind as anything different from our current ones. Basically, we assume our minds will never change.

Again, in practice, this is almost never the case. Our tastes, preferences and opinions are constantly changing. Something we want now we may not want in the future, and our future wants may be for things we can’t even fathom now. That makes it very hard to make truly informed decisions about things that will have a large effect on your life far down the road, like career choices.

So what’s the best way to overcome this bias? In my opinion, reading a great deal of fiction can help. Fiction is the closest proxy we have to occupying another person’s head for a while. It exposes you to alien thought processes and reasoning and helps you develop a much better theory of mind – the ability to place yourself in another’s shoes.

Immediacy Bias

Anyone who’s ever lost weight will be keenly familiar with immediacy bias.

Immediacy bias is the tendency to choose things that offer gratification immediately, even at a net detriment, over things that provide gratification in the future, even at a net benefit. In other words, you’re much more likely to choose the thing that makes you happy now (candy, procrastination, etc.) over the thing that will make you happy in the future (healthy food, work, exercise, etc.).

It’s obvious why this is a bad thing: we are more than happy to destroy our futures for a little immediate pleasure rather than hold out for exponentially more pleasure on a delayed timescale. The immediacy bias is like rocket fuel for self-destructive behaviors.

One way to mitigate the effects of immediacy bias is to cultivate your willpower. Willpower is a finite resource, though; even if you build Batman levels of will, you’re going to run out eventually.

To assist your willpower, it’s best to limit your own agency in situations where you know you’re likely to give in to temptation. Like Odysseus having himself bound to the mast so he could hear the Sirens without drowning himself, placing obstructions in your way in advance, when you know you’ll be tempted into irrational decisions, takes your willpower out of the equation – or at least gives it a big boost.

These may be the most common, but there are a lot of other cognitive biases that lead people into bad decisions. Being aware of our human tendency to make irrational decisions for bad reasons is one of the best first steps in making not quite so bad decisions.

Do you have any good tricks for overcoming some of these mental traps? Any other that you think should be included? Leave a comment and let us know!

Photo Credit: Stefano Corso

Adam is a former English teacher turned personal trainer and writer. He’s addicted to learning, parkour and martial arts. In addition to being a voracious bibliophile Adam’s fascinated by anything related to health, fitness and language. When not studying or training he can usually be found curled up with a good piece of fiction. You can e-mail Adam at Adam@RoadtoEpic.com