After almost two years of this mess, I decided I needed a break and wanted to do some traveling. I booked all the tickets, got the paperwork done, and was all set to go. And then I noticed, in the corner of the screen, that the plane I was about to fly - not once, but twice - was the 737 MAX 8.
I am no aviation expert, but I do read the news from time to time, and I knew the MAX 8 had crashed. Twice. The route I was flying had no alternatives, so I kinda had to fly that plane. Of course, I made it to the other side, and here I am talking to you. But up until the moment the plane came to a halt on the tarmac for the second and final time, my heart rate was elevated, and all I could think about was whether I would become the next headline on news websites for all the wrong reasons.
It didn’t matter to me that the MAX 8 was only cleared to fly after thorough revisions and independent authorizations from the US, the EU, and each airline that subsequently chose to fly it - all of them institutions we would have trusted had we boarded any other plane. It didn’t matter that after being grounded for nearly two years, the plane had been back in the sky for nearly a year before I boarded one. It also didn’t occur to me that after landing, I casually got into a taxi to go to the hotel - an act nearly 100 times more likely to result in my death.
This is just one of the ways in which the things we believe violate the rules of rational, scientific thinking. You and I hold many such beliefs. We all know this to some extent, and yet when someone sitting across the table displays this very tendency, on just the right topic, we can’t help but be enraged at how clearly we see the facts while they don’t.
Whether it’s vaccination, climate change, gun ownership, or any other heavily debated topic of today, we have all found ourselves on one side of a debate or the other, desperately trying to change someone else’s opinion. Almost always, regardless of the nature of the debate, neither party changes their stance. In fact, most people feel a defensive need to double down and hold the position they already had even more firmly.
The result of all this is the ever-increasing polarization of our social and political climate.
Take climate change, for instance. One could theorize that the more educated a person is, the more likely they are to believe that climate change is happening and that humans are largely responsible for it. Here, “educated” refers to institutional education - people with bachelor’s and master’s degrees should, in theory, find climate change a no-brainer. But studies do not reflect that. Level of education is, in fact, a very poor predictor of whether you believe in climate change, or in any other contested issue, really. A better predictor, unfortunately, is how you identify politically. As a matter of fact, more educated people tend to hold even more polarized views on this and other issues than less educated people.
Contrary to what you and I may think, people who don’t believe in climate change don’t necessarily disbelieve because they are stupid. Far from it. Their refusal to accept scientific evidence stems from how bias-driven the human thought process is.
Once an initial bias has developed, it is extremely difficult to change our minds. This has been demonstrated by numerous experiments in which scientists presented participants with a made-up study about climate change. The participants were not told the study was fabricated. Those who already believed in climate change used it to reinforce their beliefs, while those who did not simply dismissed the study as bogus.
Fake studies that argue against climate change have the same effect: people who already believe in climate change dismiss them with ease, and those who don’t embrace them. Presented with a new “fact,” both groups essentially become more polarized than they were to begin with. The more you feed into it, the worse it gets.

Another experiment, conducted in 2005, revealed what is known as “choice blindness.” Participants were shown two photos and asked to pick the more attractive one, then to explain what about the photo led them to their decision. After they chose, the researchers secretly swapped the photo for the one they had not chosen. Some participants noticed the switch, of course, but most did not - and, remarkably, they went ahead and justified a choice they hadn’t even made. This goes to show that once the initial choice was made through gut feeling, emotion, or some other impulse, all the participants really cared about was convincing themselves that they had made the most rational decision. It reiterates that our positions often have very little to do with evidence.
Then what do they have to do with? And why do we end up defending them? Well, it has to do with our evolutionary desire to belong. Straying from the herd does your chances of survival in the wild no good. Sure, you want to discover which new fruit you can eat or where else you might find food, but that curiosity must not overwhelm the need for survival. And even though we no longer risk being eaten, the tendency has very much remained in place. While curiosity and the search for truth often excite us, the need to belong to whatever group we find ourselves in seems to overwhelm everything else. Today, when you see relatives arguing about politics over Thanksgiving dinner, what they are really doing is signaling to everyone else which group or political ideology they belong to and how proud they are of it.
Another reason people defend these often ill-formed opinions is how much they think they know about the topics. This is known as the “illusion of explanatory depth,” and it’s a mistake we have all made. It’s easy to start a discussion with a friend, veer off into some uncharted territory, and carry on making things up as you go so as not to seem uninformed. Despite our better access to information, it’s especially prevalent today, when tweets are cheap but thoughts are not. In an interesting 2014 experiment illustrating this tendency, participants were asked whether they would recommend US military intervention in the Crimea crisis. They were also asked to point out Ukraine on a map. The median guess as to where Ukraine was was off by nearly 1,800 miles, and the farther off a participant was, the more likely they were to recommend military intervention.
The researchers, Sloman and Fernbach, summed up this result with the following line: “As a rule, strong feelings about issues do not emerge from deep understanding.”
These examples should make it clear that the conspiracy theorists people love to bash may not actually be as ideologically far from the rest of us as we think. Their reflexive need to belong is just as strong as ours, if not stronger. The only difference, perhaps, is the initial set of assumptions. In fact, conspiracy theories are often initiated on factual grounds; it’s only later that they diverge into unrealistic and unfounded realms. Case in point - a few months ago, there were reports of people licking handrails and doorbells because they thought it would strengthen their immune systems.
Before I say anything else, I just want to say - please don’t lick handrails or doorbells.
Now, the idea these people have in mind is that your body responds to certain stresses by rebuilding itself better and stronger. This is a scientifically sound idea, and it’s called hormesis.
In fact, a hormetic response takes place in your body every time you eat broccoli. Where these people go wrong, however, is that hormesis isn’t triggered by just any stress - the stress has to be very mild. While exposure to germs in small doses does induce a hormetic response, the surface of a doorbell carries far too many bacteria to be even remotely safe. For reference, the surface of a mobile phone is more contaminated than that of a toilet seat - and your phone is only used by you. Imagine a surface touched all day, every day, by multiple people.
Then there are caveats to how knowledge is produced in our world that don’t inspire confidence in everyone. The scientific method, for example, relies on the idea of uncertainty, and uncertainty is something human beings are not comfortable with. Not all uncertainties are equal, of course, but even the slightest shred of doubt is enough to convince critics of a piece of evidence that they were right all along.
Why pay attention to these unsure scientists when you can just listen to confident Uncle Tom?
This is, unfortunately, an aspect that tobacco lobbyists took advantage of when the public wasn’t yet sure about smoking’s harms, and one now exploited by people who don’t believe climate change is happening or being caused by humans. Anything less than total, absolute consensus is, in their minds, not just insufficient but incriminating evidence that they have been lied to this whole time. Yet the insistence on acknowledging uncertainty is perhaps the greatest gift of the scientific method. After all, it quantifies, as best we can, how far we are from the actual truth and what we might need to get there.
So, how do we try to change someone’s mind when we think they are wrong? Well, we can start by assessing our proposition. We should only propose changes that are relatively minor compared to the position the person already holds. It can help to think in terms of ideological scales. For example, on a scale of 1 to 10, if a person believes an idea that is a 7 and our ultimate goal is to convince them of a position that is a 2, we are better off proposing a 5 first. Of course, over time we can expose them to other ideas, but we ought to start gently, because a drastic ideological shift is not very realistic and is more likely to encourage polarization.
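If you like, the stepping idea above can be written down as a toy calculation. This is only a sketch of the heuristic as described - the function name and the cap of two points per proposal are my own illustrative choices, not anything from the research:

```python
def next_proposal(current, target, max_shift=2):
    """Suggest the next position to propose on a 1-to-10 ideological scale.

    Moves toward the target, but never by more than max_shift points at once,
    since a drastic shift tends to backfire and polarize. The cap of 2 is an
    illustrative assumption, not an empirically derived value.
    """
    if abs(target - current) <= max_shift:
        return target
    step = max_shift if target > current else -max_shift
    return current + step

# Starting from a 7 and aiming for a 2, propose positions gradually:
position = 7
while position != 2:
    position = next_proposal(position, 2)
    print(position)  # proposals: 5, then 3, then 2
```

Run as a script, it walks the 7 down through 5 and 3 before finally landing on 2 - gentle steps rather than one drastic jump.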
Before you tell them that they’re wrong, tell them they’re right.
People usually come at an argument from a certain angle. If we can recognize that angle instead of arbitrarily laying out facts - allowing them to discover the argument from their own point of view rather than be embarrassed by it - we stand a better chance of changing their minds.
Another way to have a fruitful and informative discussion is to introduce an aggressive moderator. Now, of course, having a moderator in an informal debate is not exactly routine, but it might just keep your family together. Hear me out. As we said earlier, the antagonism between two debaters - or two relatives - arguing over a hot topic arises because they feel they belong to different groups. Introduce a moderator, and an aggressive one at that, and the focus quickly shifts from defending themselves from each other to defending each other from the moderator. This can have a unifying effect that forces the debaters to see more reason in their opponent’s argument.
Another mistake people make when arguing for or against a government’s position is to forget that the person they are arguing with is not the government, or even a representative of it. If, say, you are debating Brazilian foreign policy with a Brazilian citizen, chances are the person you are talking to had no hand in formulating that policy, whether or not they voted for the government that did. It is also likely that they know very little about the policy and are defending it for the same old reason - to feel like they belong. In this situation it is far too easy to attribute bad faith to the person in front of you, but it is important to remind yourself that they are probably as clueless as you are, and that, in reality, you are arguing to no end.
As much as we would like to believe otherwise, facts don’t change minds. We have more access to information than ever before, and yet we are more polarized than ever. In this seemingly irreparable climate, it’s important to recognize why people believe the things they do, and how we are never far from having believed them too. We can always be kinder in our effort to understand the people across the table; only then do we stand a chance of bridging our ideological gaps.
After all, facts don’t change minds.