Why We Think We’re Right

In late September of 2013, Congress was trying to come to an agreement on the budget for the 2014 fiscal year. Lawmakers couldn’t agree on funding for the Affordable Care Act: Republicans voted to defund it while Democrats resisted. By the end of September, they still hadn’t reached a deal, so no spending plan could be passed. Without that plan, the government couldn’t run, and so it came to a halt.

From October 1–16, 2013, the government was shut down.

This led to approximately 800,000 federal employees being furloughed indefinitely, while another 1.3 million were required to keep working without knowing when they would be paid again. People who had nothing to do with the decision-making process were affected by other people’s inability to come to an agreement and make a decision.

When making decisions, people often don’t think about how they reached them. They don’t stop to wonder why they’ve chosen one side of an argument, or why that side is right and the other wrong. They know what they think is right, and they leave it at that.

In general, this process isn’t examined any further. But some people have made a career out of studying it. Kerri Pickel, a psychology professor at Ball State University, studies cognition, or how the mind takes in and processes information. She explained that our brains naturally sort things into categories to help organize information, and this happens in everyone.

However, some people take this a step further and develop biases; she would argue most do, to an extent. A bias is a departure from rationality, but no one can fully explain why biases come about.

People have all kinds of cognitive biases hardwired into their brains that shape how they make decisions and choose sides, which is part of the answer to that earlier “why.” These biases vary from person to person, both in type and in how prevalent they are.

Pickel explained that these biases don’t show up the same way in everyone. People don’t have the same experiences or make the same choices, so, logically, their biases won’t be the same either.

For example, Pickel says one person could dislike someone they’ve just met because of a past experience with a similar person. Another person, who didn’t have that experience, might not feel the same dislike. No two people’s experiences are identical, so their biases tend to show up differently.

These biases aren’t always bad or problematic, though. They work in conjunction with one another to help individuals assure themselves that they are making the right decision, or have the correct opinion. It becomes a problem when this isn’t the case—when someone is wrong and doesn’t want to be wrong, or when a person uses their bias to justify discriminatory behavior.

A Group Mindset

Nora Hopf sat on the curb outside of an abortion clinic in Louisville, Kentucky. She could see women walking to and from the clinic, but she wasn’t allowed to talk to them; the attendants wouldn’t permit it. She wanted to make sure they knew they had options, that they were loved and could talk to her, or anyone, about what else they could do. But she, along with a group of pro-life students from her hometown of Jasper, Indiana, was prevented from doing that.

They had been taken there as part of the group Teens For Life, which was sponsored by her church; the advisors wanted the teens to see what an abortion clinic was like. Nora was overwhelmed by how much she wanted to help the women there, and that experience is one of the reasons she decided to make this her own, personal cause.

Nora has been pro-life for as long as she can remember, and she says most of the people in her life support rather than oppose her views. She first got involved when she heard about the group from people at her church, and she has stayed with a similar set of people since coming to Ball State. She joined the Ball State Students For Life group as a freshman and became president as a sophomore, a position she still holds.

One reason Nora may hold these opinions is a cognitive bias called the false-consensus effect. Obviously, no single factor entirely determines how opinions are formed, but who people surround themselves with does shape which ideas and thoughts they are exposed to.

Often, the people an individual is closest to, such as family and friends, feel similarly about important topics like politics or religion. A person sticks to those who are like-minded, people who affirm his or her beliefs, and because those people hold the same opinions, the person comes to believe that opinion is the correct way to think.

This is the false-consensus effect in a nutshell.

When an individual regularly associates mostly with people who feel the same way, that individual begins to see those views as the norm. Because the people nearby keep confirming that mentality, the individual assumes that everyone feels the same.

Pickel says that when this bias is present, we go through a process of determining what we should believe and what’s the right opinion. We start by thinking of our own belief and what we personally feel about a subject. For example, in Nora’s case, she thinks that women should consider all of their options before having an abortion.

Then we move on to what we know others believe. But we can’t know what every person believes, so we go with what’s available: the opinions of people we know. We tally up their beliefs and compare them to ours. Are they pro-life or pro-choice?

Usually, according to Pickel, the people we know share our beliefs. This is mostly true in Nora’s case. She grew up in a conservative town, and the people she chose to associate with were also pro-life. So when a person goes through this process, he or she concludes that most of the people close by feel the same way, and therefore that this opinion must be the correct one.

Yet the false-consensus effect isn’t the only mind trick at play here. There are other ways we convince ourselves that we are right and our way is the way.

An Irrational Feeling

Emily Skelton was in high school when a friend introduced her to someone they knew and thought she would get along with. The person was a straight, white, cisgender male, and Emily was immediately scared of him. She had no reason to be; she had just met him, and he had never done anything to give her that impression or make her afraid. But because of a traumatic past experience with a man who was similar to the one she was being introduced to, Emily was afraid he was going to hurt her, like the previous man had.

She avoided him for a while before finally realizing that she was acting irrationally, and in a biased way.

This happens every time Emily meets a man who shares the same characteristics as the one from her past, and she can’t necessarily control it. She can only recognize and attempt to counteract it.

Emily’s experience demonstrates confirmation bias.

Confirmation bias is a fairly individualized cognitive process, meaning it doesn’t necessarily depend on other people to form. Instead, it draws on past experiences. Under its influence, a person sticks to his or her own beliefs and rejects others, actively refusing to take in information that contradicts those beliefs while seeking out information that supports them and treating that support as proof of being right. The person confirms preexisting beliefs and rejects or ignores everything else.

Pickel said that when this bias is at play, people want to be right and want to support their stance on the topic, so they won’t pay attention to information that threatens the validity of their point of view. For Emily, this means that at first, every man who reminds her of the past experience is a threat, and not much can convince her that he isn’t trying to hurt her.

The process starts with forming a hypothesis, or a thought, about a certain topic. In Emily’s case, her experiences left her with an aversion to straight, white, cisgender men. Because she feels that this type of person is inherently bad, she notices only the information that seems to prove that feeling. When a person is experiencing confirmation bias, like Emily, they take information that validates their views as proof that their thoughts about the topic are correct, without considering the other side. Essentially, this boils down to finding a fact and taking it at face value, never questioning its accuracy.

Pickel said this is problematic because it’s not the proper way to test a hypothesis, or in this case, a thought about a topic. It can mean that people defend their beliefs without ever really considering that they may be wrong, or that there may be another way. Scientifically speaking, the conclusion they’ve reached can’t be trusted, because it was never tested against the alternative.

In other words, sometimes we have no idea what we’re talking about.

An Emotional Defense

Right before the presidential debate in September 2016, Josie Weaver and Alex Abasi went to a friend’s house to watch it together. When the debate was about to start, a disagreement broke out over which channel to watch it on.

One roommate suggested CNN. Josie got upset and said, “You would put it on CNN because you agree with them.”

Alex said something similar about Fox News. Each felt the other’s preferred channel would misrepresent the debate because its commentary would not be objective.

Even though every news outlet was broadcasting the exact same live stream of the debate, each continued to think the channel they favored somehow offered a more impartial version.

Another, slightly more involved, type of cognitive bias is called motivated reasoning. It takes confirmation bias a step further and actively puts down opposing sides.

Motivated reasoning is driven largely by emotion. People get defensive when they discover they may not be right, so instead of accepting it, they bring out the guns and shoot down the other side. If information supports their opinion even the smallest bit, it gets treated as definitive, even when, deep down, the person knows it isn’t.

We want to be right, at any cost, because the alternative is something we don’t like to handle—defeat.

When Josie and Alex were trying to get their friends to watch the debate on their preferred outlets, they had no basis for thinking the other channels would be biased. They also had no proof that the outlets they favored would be more accurate; it was, after all, the same live debate. They had an opinion, and they wanted to be right, so they said the other side was wrong.

This can also occur when a person discovers information and defends it with a goal already in mind: proving that they are, in fact, right and everyone else is wrong. Rather than approaching the search for information objectively, they may look only for information that proves the point they want to make and, likewise, disproves the other side.

This can be dangerous, because some people cannot be swayed, no matter how much information is presented to them. And this causes a rift. Sometimes, it doesn’t matter.

But sometimes it does.

An Inability to Agree

On September 27, 2016, a group of Ball State students crowded around a man at the Scramble Light, yelling and crying, trying to drown out his words. The man was a representative of Cincinnati’s Official Street Preachers organization and was there to spread his beliefs, which consisted mostly of an anti-gay message, along with other messages the students found offensive.

“You’re all going to Hell!” the preacher shouted at students as they waited for the chirping sounds that meant they could cross safely. “You believe it’s okay to be a homosexual, and so you’re all sinners, doomed to Hell!”

Students were outraged, believing he had no right to spread what they considered hateful messages. They crowded around him and a second preacher who joined later in the afternoon, holding signs that countered his, such as “Love never fails” and other messages they felt were positive. Clearly upset by the preachers’ presence, they argued with them relentlessly, and the standoff lasted well into the afternoon, until the preachers finally left.

Both sides, the preachers and the students, were spewing information at one another, trying to prove that they were right and the other wrong. They exchanged Bible verses and different interpretations of what those meant, neither side giving in to the other or accepting what they had to say. The students were angry, some in tears, and both the preachers and the students felt like they were correct in their views.

No sensible, calm conversation could be had, because both were too set in their opinions to attempt a rational exchange.

This encounter didn’t end in violence; it was simply a standoff between strong-willed people butting heads. But it shows that arguing over opinions and refusing to budge from personal views can be problematic, creating the kind of rift in society mentioned earlier.

The cognitive biases behind our stubbornness in forming opinions exist for a reason: making decisions and sticking to them has been good for us and for humanity’s survival, and constant indecisiveness would be its own problem. But inflexibility causes problems too, dividing us on issues where unity might serve us better.

No one is sure why or how we develop these cognitive biases. Pickel’s best guess is that it’s a byproduct of the way our cognitive system works. We’re naturally inclined to sort things into categories in our mind, because if we didn’t, our brains would get overwhelmed. This includes people, as well as things. It’s not wrong or bad; it’s just what we do.

However, taking the next step, using those biases to treat people differently or to assume one opinion is the correct one, is not something we have to do to keep our minds from being overwhelmed.

So the presence of cognitive biases is inherently human. The way that they’re used, however, is a result of experiences and upbringing, as well as some things that still haven’t been completely figured out. It’s a tough concept to grasp, which is why most people don’t give it a second thought. We think the way we do because we just do. And that’s that.

Except that it’s not. Because if we want to have a chance to get over these biases, to realize that we’re not always right, and we’re not always going to be, then Pickel says that we need to recognize that these biases exist.

Simply increasing our knowledge of these biases, and of how they can affect our decision making, can help us examine our own actions and make changes. Just stopping to think about why we believe what we believe, and entertaining the thought that we might not be right about it, is enough to partly counteract the problem. This is what Emily does to reduce her bias against straight, white, cisgender men. She wants to be a therapist, and she knows there isn’t room for biases in her future profession. Because she is able to acknowledge that her biases exist, she is also able to overcome them.

If you’re convinced that you’re right, logically, you aren’t going to be searching for reasons why you might not be. It just doesn’t make sense. Which is why so many of us are in this situation of thinking we’re right when we might not be. And also why addressing the problem might be necessary.

However, there is no cure-all for this problem. There are so many different factors at play that it would be impossible to change the way we all think, according to Pickel. We’ll probably never be able to stop everyone from thinking that their way is the only way, but it might be possible to change a few minds. And that’s how progress can be made.