
When You Argue with Yourself, You Win

This Facebook post (6/29/20) by David McRaney of You Are Not So Smart feels super important. It’s a condensed excerpt from his upcoming, timely book How Minds Change.

"Research keeps coming out showing that people are more scientifically literate, more trusting of science, more politically engaged, and more savvy consumers of information than ever before.

Also, fake news has almost zero impact on public opinion, and conspiracy theories are about as common and influential as they've always been.

In other words, we are better critical thinkers, more educated, and more likely to apply both to the values we care about within our institutions and governments than ever before in our history as a species.

I hear you though — what about...WAVES HAND AT EVERYTHING!? Well, just because we are smarter doesn't mean we AGREE. The research on this is clear. The more intelligent you are, and the more educated, the better you become at rationalizing and justifying your existing beliefs and attitudes, regardless of their accuracy or harmfulness.

In addition, reason ain’t logic.

We feel first and reason second.

Reasoning is literally coming up with reasons for the things you feel, for the sake of justifying those feelings to others, along with the behavior (or plans to behave) they generate.

Add these two things together, put them on the internet where people can create communities around what they believe and feel, and you get...WAVES HAND AT EVERYTHING.

We are primates, and primates evolved sociality to survive and thrive. We evolved the ability to form and maintain groups, and since we are smart and organized, the greatest threat to our wellbeing soon became other groups like ourselves. That makes us tribal by nature, and we will use just about anything to identify ourselves to others as either US or THEM.

Photo by Ketut Subiyanto from Pexels

Once things like wearing masks, believing in climate change, or eating meat became signals of allegiance to our tribes, welp, that was that. When something becomes politicized, it becomes more important to pursue belonging goals than accuracy goals, because signaling to your group that you are a good member matters more than being factually correct about an issue.

One person being wrong about an issue will likely not affect that person all that much. But getting ostracized will have a huge impact. And you can easily lose your social support network via public shaming.

The result: you feel strongly about issues that signal your loyalty to your group, and then you employ your reasoning to justify those feelings, usually by searching the internet for articles and videos that agree with you and bolster your claims.

And anything can become politicized. Literally anything. Research shows that, given the incentive, and knowing nothing else about you, people will use how many dots you estimate are on a scatterplot as a clue to your loyalties. So it's not surprising that evidence-based issues like evolution or GMOs or vaccines or the roundness of the Earth can leave the realm of evidence and enter the realm of tribal psychology; all it takes is for a community to form around those beliefs.

By the way, this is also how cults work, and politics most of the time. People form cultural norms and then use ambiguously written sacred texts to justify those norms. They may defend their attitude with the text, but it's not the source of their feelings, just a justification. But I digress.

Is this good or bad? Different researchers tell me different things, but honestly, a lot of them say it's very, very good. It's how we built...WAVES HAND AT EVERYTHING. Which means it's also how we change those things. It's just that we are more aware than ever of the process and the disagreement it produces. We are more likely than ever to engage with people with whom we disagree.

Here's why it's good. People tend to argue from a biased position, sure, but that's a feature, not a bug. That's because we evaluate the arguments of others from a much more nuanced stance than the one we use to produce our own. It's a dual-process system: ask a person to produce arguments, and they'll reason themselves into extreme attitudes.

In studies where researchers use sleight of hand to convince people they took a position they didn't, subjects will argue for that position as if it were their own. But ask people to evaluate other people's arguments, and they tend to shift their views away from the extremes. In fact, when people are manipulated into believing that their own arguments are actually those of others, they tend to find all the flaws in their own reasoning and solve logic problems that escaped them when they took those same arguments to be their own.

Their reasoning isn't flawed or irrational; it's biased and lazy. We save cognitive labor by offloading it to group discussions. Each person contributes hastily drawn, biased conclusions, but then we get to arguing and start sorting out the wheat from the chaff.

This system evolved because, as primates in groups, one person can only see and experience a limited range of inputs. One person's perspective is literally that: a viewpoint that reveals only one side of the reality in question. Add a view from another angle, and then another, and another, and each brain is more than happy to update its model of reality. As a group, everyone updates together, and then the consensus reality changes after a few rounds of arguing. That's why we don't believe the sun revolves around the Earth or burn witches at the stake. But it's also why we DID do those things. The same system that creates reality also breaks it.

Coronavirus: you feel X or Y about masks, then you go looking in your head for reasons to justify that feeling, then you produce an argument for your position, and if challenged, you search for information, preferably from people who agree with you, and make another argument.

You repeat this process until you can walk away feeling secure in your reasoning, or until you can’t justify yourself and begin to doubt it. Research on a moderately political issue showed that about 13% of the information you encounter during this process must be counter-attitudinal to generate doubt. But you don’t update right then; you just stop counterarguing. In that same study, about 30% of the information received from the arguments of others (or through self-directed search) had to be counter-attitudinal before people started updating their priors, that is, before they changed their minds.
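
To make those two thresholds concrete, here is a minimal Python sketch of the counterargue, doubt, then update arc just described. The 13% and 30% constants come straight from the figures quoted above; everything else, including the function names and the assumption that a biased search surfaces counter-attitudinal items only 20% of the time, is an illustrative assumption, not a model from the research.

```python
import random

# Thresholds taken from the figures quoted above; the rest of this
# sketch is an assumed toy model, not the study's actual model.
DOUBT_THRESHOLD = 0.13   # ~13% counter-attitudinal info: doubt begins
UPDATE_THRESHOLD = 0.30  # ~30% counter-attitudinal info: minds change

def reasoning_state(seen, counter_attitudinal):
    """Classify where a reasoner sits in the counterargue -> doubt -> update arc."""
    fraction = counter_attitudinal / seen
    if fraction < DOUBT_THRESHOLD:
        return "counterarguing"  # still generating justifications
    if fraction < UPDATE_THRESHOLD:
        return "doubting"        # stopped counterarguing, not yet updated
    return "updated"             # priors revised

# Simulate a search biased toward agreeable sources: each item found
# is counter-attitudinal with probability 0.2 (an assumed bias rate).
random.seed(1)
seen = counter = 0
for _ in range(50):
    seen += 1
    counter += random.random() < 0.20

print(f"{counter}/{seen} counter-attitudinal -> {reasoning_state(seen, counter)}")
```

Notice the gap the numbers create: a search biased enough that only about one item in five challenges you will typically push you past the doubt threshold but leave you short of the update threshold, which is exactly the doubting-without-yet-changing-your-mind limbo described above.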

So, even though we are motivated reasoners (more motivated by tribal loyalty than by accuracy), arguments and facts do work on us; it just takes a lot. And we actively avoid them when threatened, which makes getting to that [X]% tough when taking a new position is risky, as in the case of politics and cultural norms and values.

Oh, and people have higher or lower thresholds depending on the salient risks and rewards of updating from one conclusion to the next: your opinion on clouds probably has a low threshold, but if you are a meteorologist, it's likely high.

We are divided and arguing because we are social animals first and individual reasoners second, a system built on top of another system, biologically via evolution.

We depend on our groups to come to a consensus via deliberation and argumentation before we pursue goals or settle on attitudes. Reasoning’s role is to produce biased arguments for your own perspective. Basically, all culture is 12 Angry Men at scale.

Almost every cognitive bias, flawed heuristic, and logical fallacy I've written about for more than a decade plummets in its impact on decision-making when people reason in groups, but only if those groups are allowed to argue freely, without social costs for dissent or subversion.

A lot of arguing on the internet doesn't work that way. People retreat into like-minded enclaves where it seems like they are arguing, but it's mostly just people affirming one another that they chose the right group. What usually happens in those communities is that people who think of themselves as moderates will realize that the extreme is much farther along the spectrum than they thought, so to be a true moderate, they must shift their attitudes in the direction of the extreme, dragging their beliefs with them. If everyone is doing that in turn, after a few rounds, the whole group radicalizes.
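
That re-anchoring loop is easy to see in a toy simulation. Everything below, including the pull factor and the starting attitudes, is an illustrative assumption, not a model or data from the post:

```python
# Toy enclave model (illustrative assumptions only): each round, every
# member shifts part of the way toward the most extreme visible attitude,
# because being "moderate" is judged relative to the group, not the issue.
def radicalize(attitudes, pull=0.3, rounds=5):
    """Shift each attitude a fraction `pull` toward the current extreme."""
    for r in range(rounds):
        extreme = max(attitudes)  # the farthest-out voice sets the reference point
        attitudes = [a + pull * (extreme - a) for a in attitudes]
        print(f"round {r + 1}: group mean = {sum(attitudes) / len(attitudes):.2f}")
    return attitudes

# A like-minded enclave: four self-described moderates and one extremist.
radicalize([0.4, 0.45, 0.5, 0.55, 0.9])
```

Run it and the group mean climbs from 0.56 toward 0.9 in a handful of rounds: nobody ever argues for the extreme position directly, yet the whole distribution drifts there, which is the radicalization-by-relative-moderation dynamic described in the paragraph above.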

This is how cults and political and conspiracy theory communities get catalyzed by the internet. It seems to them like they are arguing together while alone, but they are really arguing alone while together. It's a community of people arguing with themselves, coming up with reasons for their own feelings without contest, and when you argue with yourself, you win.

So if we want better reasoning, we must create better groups and better platforms for people to meet up and argue. Arguing is an evolutionary gift that allows us to change our minds faster than our genes could.

Other animals have to spend generations developing new behaviors, especially at the group level, whereas we can coordinate in minutes toward shared goals. That's because when the right thing becomes the wrong thing (factually, morally, or otherwise), we can update our priors. And the research says that, as a group, it works best when done face-to-face, regularly, and peacefully, without the threat of social costs for dissent. Also, someone should take good notes. Otherwise, we will be biased by reputation management and use motivated reasoning to arrive at inaccurate conclusions and potentially harmful attitudes.

There's also about 70 years of research into what makes for the most persuasive argument, and another 30 on how ideas spread from small groups to entire nations in less than a decade, but I've written so much that Facebook is taking 15 seconds to register each keystroke. Anyway, this is all I think about, so welcome to my obsession.”

Header photo by Headway on Unsplash