Why Humans Are NOT Reasonable

Thirty-two days before his death, on January 17th, 1546, Martin Luther, the greatest reformer in modern Christian history, gave this sermon: https://drive.google.com/file/d/1Ub2mxo47UjBH7_6SLYWidVIZeYkrsqhz/view?usp=sharing
It was his last sermon at Wittenberg, Germany.

In this sermon, Luther said, “Reason is by nature a harmful whore.”
To this day, the statement is cited as an anti-reason comment, especially because it came from a religious man.

If you look around you, you’ll see that everybody believes that they’re reasonable and logical, and that they make fact-based decisions.

But wait!

Everybody also believes that the other person isn’t reasonable.

For example, a Democrat might think that Trump followers are unreasonable.

People who believe the Moon landing was faked might also believe that the rest of the world is irrational, and atheists often accuse religious people of being “irrational,” or unintelligent.

The problem with accusing other people of being irrational is that you’re as irrational as they are; you just don’t see it. That’s what this video is all about.

On page 239 of his 2008 book, Predictably Irrational, Dan Ariely wrote:
“Standard economics assumes that we are rational… But… we are far less rational in our decision-making.”

The Pride of Logic

The idea that reason is what distinguishes humans from other animals is generally traced back to the ancient Greek philosopher Aristotle.

In fact, in his 1781 book, Critique of Pure Reason, the German philosopher Immanuel Kant stated that, with Aristotle, logic had reached its completion.

Then came the 17th-century French philosopher René Descartes.

In his 1637 book, Discourse on the Method, Descartes recounted how, in the cold autumn of 1619, at age 23, he found himself in what is now southern Germany with time to spend and nobody around he deemed worth talking to.

There, in a stove-heated room, Descartes formed the stunningly ambitious project of ridding himself of all opinions, all ideas learned from others, and of rebuilding his knowledge from scratch, step by step.

In other words, Descartes wanted to avoid human lies, groupthink, and group deception.
He wanted to use his own reasoning to figure out everything.

The problem with this idea is that… IT’S IMPOSSIBLE.

As humans, it’s impossible to live by the truth. Instead, we live by the group.

The only truths we believe, and the only facts we accept, are those that our tribe accepts, even if those truths are in fact lies.
More on this later in the video.

For now, let’s assume for a moment that we’re all reasonable beings who come to conclusions based on reasoning.

This would then mean that facts, evidence, and science determine what we believe, right?

Unfortunately, that was never the case.

In 1975, researchers at Stanford invited a group of undergraduates to take part in a study about suicide.

They were presented with pairs of suicide notes. In each pair, one note had been composed by a random individual, the other by a person who had subsequently taken his own life.

The students were then asked to distinguish between genuine notes and fake ones.

Some students discovered that they had a genius for the task. Out of twenty-five pairs of notes, they correctly identified the real one twenty-four times.

Others discovered that they were hopeless. They identified the real note in only ten instances.

As is often the case with psychological studies, the whole setup was a put-on.

Though half the notes were indeed genuine—they’d been obtained from the Los Angeles County coroner’s office—the scores were fictitious.

The students who’d been told they were almost always right were, on average, no more discerning than those who had been told they were mostly wrong.

In the second phase of the study, the deception was revealed.

The students were told that the real point of the experiment was to gauge their responses to thinking they were right or wrong. (This, it turned out, was also a deception.)

Finally, the students were asked to estimate how many suicide notes they had actually categorized correctly, and how many they thought an average student would get right.

At this point, something curious happened.

The students in the high-score group said that they thought they had, in fact, done quite well—significantly better than the average student—even though, as they’d just been told, they had zero grounds for believing this.

Conversely, those who’d been assigned to the low-score group said that they thought they had done significantly worse than the average student—an equally unfounded conclusion.

“Once formed,” the researchers observed dryly, “impressions are remarkably perseverant.”

Think about this for a moment.

If we were the reasonable, logical beings we always want to believe we are, our conclusions would be influenced by facts.

We would be able to change our minds when confronted with facts.

Unfortunately, that’s not the case.

For example, in April of 2014, Brendan Nyhan, a professor of political science at Dartmouth, published the results of this study:
It was a study of 2,000 parents, conducted by a team of pediatricians and political scientists, to test a simple question: Could parents change their attitudes toward vaccines?

These parents were shown facts from the Centers for Disease Control and Prevention.

They were also shown images as evidence that some kids do suffer from diseases that vaccines could prevent, but NOTHING CHANGED THEIR MINDS.

In fact, instead of a positive response, the first leaflet actually decreased the intent to vaccinate among parents who held the most negative attitudes toward vaccines… a phenomenon known as the backfire effect.

Wait a moment and think about what happened in this three-year-long study.

It’s like someone coming to tell you that having unprotected sex with strangers could lead to an STD (which is a fact), but because they dared to tell you that, you decide to throw away the few condoms you have.

Or someone tells you that staying in a crowded place could give you COVID-19 if you’ve not been vaccinated, so you decide to go to an anti-vaccine rally to prove a point.

The author of this study said, “It’s depressing,” and it is.

But Why Don’t Facts Change Our Minds?

In their 2017 book, The Enigma of Reason, the cognitive scientists Hugo Mercier and Dan Sperber wrote:
“How come humans are not better at reasoning, not able to come, through reasoning, to the nearly universal agreement among themselves?”

In other words, if humans are truly reasonable beings who base their decisions on logic, how come we can’t agree on anything?

According to Hugo Mercier and Dan Sperber, reason is an evolved trait that emerged on the savannas of Africa, with the sole purpose of keeping us together so that we could remain alive.

Let me repeat that another way:

Our capacity to reason evolved primarily not to reach logical conclusions but to keep us alive, and for millions of years, we could only remain alive if we were faithful to our tribe, group, and community.

In his 2018 book, Atomic Habits, James Clear wrote:
“Humans are herd animals. We want to fit in, bond with others, and earn the respect and approval of our peers. Such inclinations are essential to our survival. For most of our evolutionary history, our ancestors lived in tribes. Becoming separated from the tribe—or worse, being cast out—was a death sentence.”

What this means is that, as humans, remaining faithful to our tribe, our faith, our beliefs, and our previous conclusions is far more important than truth or facts, which is why we’re all irrational beings who disregard facts.
Now, even though our species is no longer under threat from this guy…

This guy…

Or even this guy…

Even though most of us no longer fear invasion by another tribe or kingdom, we’re still very much a people of tribes.

In today’s world, our faiths, opinions, and beliefs are our tribes, and we’ll do anything to remain faithful to those tribes… no matter what the facts or science say.

Whether you talk about QAnon, the Taliban, Catholicism, Pentecostalism, Islam, Atheism, Trumpism, CNNism, FOXism, Feminism, or even Veganism, you’re talking about tribes, which we can also call cults.

Whether you talk about being male, female, Black, white, or Asian, you’re talking about tribes, or even cults.

Even ordinary things like being a scientist, an accountant, or a YouTuber, preferring coffee to tea, having black as your favorite color, or driving a Tesla can easily become a tribe, a community, or even a cult in our minds.

If you think I’m exaggerating this, think about Apple users.

According to cultural historian and NYU professor Erica Robles-Anderson, Apple users perfectly qualify for the title of a cult.

Like most people, I use Apple products, so I’m not here to make you feel bad if you’re in the cult of Apple.

The point I’m making here is that, as humans, we’re not as reasonable or as rational as we want to believe.

Our tribes, groups, and communities form the primary basis for what we believe.

The ideas or beliefs formed by our tribes then become the status quo, and any fact, truth, or evidence outside this status quo is easily discarded.

In Nudge, the 2008 book he co-authored with Cass Sunstein, Richard H. Thaler wrote:
“People have a strong tendency to go along with the status quo or default option.”

Now, let’s see how we create the status quo and disregard facts or truth.

Living by Alternative Facts

During a Meet the Press interview on January 22, 2017, U.S. Counselor to the President Kellyanne Conway defended White House Press Secretary Sean Spicer’s false statement about the attendance numbers at Donald Trump’s inauguration as President of the United States.

When pressed during the interview with Chuck Todd to explain why Spicer would “utter a provable falsehood”, Conway stated that Spicer was giving “alternative facts”.

As humans, we all have alternative facts and believe those lies to be true.

To convince ourselves that our lies are the truth, we read information that supports our views, listen to people who tell us what we want to hear, and search Google for whatever we wish to see.

This is what psychologists call confirmation bias.
Confirmation bias is a person’s tendency to accept information that confirms their views or prejudices while ignoring or rejecting contradicting information.

According to this 2015 Psychology Today article by Dr. Shahram Heshmat:
“Once we have formed a view, we embrace information that confirms that view while ignoring, or rejecting, information that casts doubt on it. We don’t perceive circumstances objectively. We pick out those bits of data that make us feel good because they confirm our prejudices.”

In their book The Enigma of Reason, Hugo Mercier and Dan Sperber refer to this as “myside bias.”

According to them, rationality is less about making decisions based on logic than about providing justifications for decisions one has already made.

The Illusion of Explanatory Depth

Another reason why beliefs are so hard to change is the illusion of explanatory depth.

According to this concept, people think they understand an issue well enough to have an opinion about it. The only time they become aware of their ignorance is when they are asked to explain it and fumble in doing so.

This 2006 study was conducted by U.K. researcher Rebecca Lawson.

Lawson asked a group of psychology students from the University of Liverpool to rate their knowledge of how bicycles work and then draw the pedals, chain, and missing parts of the frame onto an incomplete sketch of a bicycle. There was also a multiple-choice task that required them to identify the usual position of the frame, pedals, and chain.

The study found that over 40% of the participants who were not experts in bicycles made at least one mistake in the drawing task or the multiple-choice task. This is despite the fact that almost all participants had learned how to ride a bike, that almost half of them owned a bicycle, and that bicycles are common, everyday objects.

One striking comment from a participant was, “I never knew how little I knew about things until I had to draw them.”

Thus, the results suggest that people have a vague, incomplete, and often inaccurate understanding of how everyday objects function.

In his 2001 book, Fooled by Randomness, Nassim Nicholas Taleb wrote:
“People overvalue their knowledge and underestimate the probability of their being wrong.”

This should remind you of the Dunning-Kruger effect.
Since we are mostly ignorant about many things and tend to exaggerate how much we actually know, it’s easy to understand why we won’t change our minds.

Avoidance of Complexity

In their 2016 book, Denying to the Grave, Sara and Jack Gorman point out that one of the causes of science denial is that making decisions based on science is complicated and requires a great deal of mental energy.

Intimidated by difficult concepts they struggle to understand, people resort to simplistic explanations, even though those explanations may not be accurate.

According to behavioral researchers Daniel Kahneman and Amos Tversky, the more primitive parts of the brain, like the amygdala, cannot process complicated information.

Conclusion

The idea that we’re all irrational, illogical, and biased isn’t something anyone likes to accept, but that’s the truth.

While you think you’re rational, I’m sure you’re seeing other people’s irrationality all the time.

But they’re seeing yours too.

I think it’s just like your body.

Other people know how you look, but you don’t.

In his 2015 book, Misbehaving, Richard H. Thaler wrote:
“What makes the bias particularly pernicious is that we all recognize this bias in others but not in ourselves.”
I hope this video makes you humble, just as making it made me humble.

Thanks for watching!
