The Limits of Political Reasoning

Introduction

Most of us assume that we hold correct beliefs, can formulate rational arguments, and typically make logical decisions and judgements. Often, this is indeed the case. Sometimes, however, we fall victim to cognitive biases and heuristics (i.e., mental shortcuts meant to reduce the use of cognitive resources). And when this happens, we are often completely unaware of the fact that we are doing so. Displaying these biases and heuristics can give others the impression that we lack the information or intelligence needed to arrive at the correct conclusion. Some may even believe that—when we are suffering from them—we have become completely irrational, however unlikely that is.

Given the importance of objective and rational argumentation in political discourse, my goal with this piece is to draw attention to a particular heuristic that can lead to unnecessary disagreements. I will also discuss a related phenomenon, which suggests that our beliefs are shaped by something else entirely: tribal loyalty to our peer groups.

The Affect Heuristic

The affect heuristic is an excellent example of how irrational attitudes influence judgements. The term, coined by the psychologist Paul Slovic and his colleagues, describes how our subjective, emotional interpretation of a stimulus (e.g., a certain topic, statement, belief, technology, law, or regulation) can affect our eventual judgement of that stimulus. For instance, a 1978 study by Slovic and his co-researchers revealed that people generally exhibit a positive attitude towards non-nuclear electric power—in contrast to its nuclear counterpart. This attitude leads most people to view non-nuclear electric power as highly beneficial and relatively low in risk. Because of our greater familiarity with non-nuclear electric power, most people perceive it as the “Abel” of electrical power types—the favored sibling—without taking into account the actual risks it holds. Due to a relative lack of familiarity with nuclear power, participants in the study overestimated its risk.

The affect heuristic rests primarily on the work of the eminent social psychologist Robert Zajonc. In particular, it is Zajonc’s mere exposure theory that set the stage for further research and, ultimately, gave rise to the affect heuristic. Zajonc’s theory suggests that frequent exposure to a stimulus enhances the extent to which someone likes or prefers it. In other words, mere exposure can alter our attitude toward a stimulus after repeated encounters, which in turn can alter our judgements about it. As with most heuristics and biases, we usually do not notice when we apply the affect heuristic to our thoughts and arguments. This makes it, as Slovic and his colleagues describe, “frightening in its dependency upon context and experience, allowing us to be led astray or manipulated—inadvertently or intentionally—silently and invisibly.”

Involving emotions while formulating arguments is, in and of itself, nothing to worry about. We frequently formulate such arguments at home with our families, at work with colleagues, or when out with friends. After all, we cannot expect people to be perfectly rational—because we aren’t. Our innate (yet infrequent) irrationality does not mean, however, that we have nothing to worry about. In Thinking, Fast and Slow, the psychologist and Nobel laureate in economics Daniel Kahneman describes how the affect heuristic is frequently relied upon when formulating political arguments. Because of this visceral approach to political argumentation, people often assess the benefits and risks of a stimulus not by the accompanying data and statistics but, rather, by the extent to which they like or dislike that particular stimulus.

As I previously mentioned, the primary goal of a heuristic is to reduce the use of cognitive resources. Questions about how one feels about a certain topic are much easier to answer (and consume fewer cognitive resources) than difficult questions—for example, about rating the risks and benefits of certain activities. Kahneman calls this process substitution: “the operation of answering one question [the heuristic question] in place of another [the target question].” Asked to rate the risks of nuclear power, for instance, we may quietly answer the easier question of how we feel about nuclear power. Regrettably, the use of heuristics and our predilection for a particular political argument come at a cost: namely, the rationality of our responses.

Identity-Protective Reasoning

A somewhat similar phenomenon—regarding our reaction to politically charged stimuli—has been studied by the legal scholar Dan Kahan and his colleagues. Kahan and his co-authors are concerned with the fact that disagreement about certain topics persists despite the vast amount of available scientific evidence. To explain this, they propose two theses. The first holds that the conflict results from people lacking the essential knowledge and reasoning abilities (the “Science Comprehension Thesis,” or SCT). The second holds that the conflict arises from our political or cultural predispositions and the manner in which we engage with “decision-relevant science.” This latter thesis, the “Identity-protective Cognition Thesis” (ICT), will be my primary focus. The ICT argues that politically charged information can compromise our objectivity: as a sign of loyalty, we hold opinions and beliefs that line up with the political stances of our peers. If one’s peers, for instance, believe that safe spaces are an ideal policy for colleges and universities, one might readily adhere to this view instead of taking into account the available science that contradicts it.

In their 2017 paper entitled “Motivated Numeracy and Enlightened Self-Government,” Kahan and his colleagues describe how they challenged people of different political viewpoints to solve a mathematical reasoning task. They compared results on a politically neutral version of the task (about a skin rash) with results on a politically charged version (about gun control); apart from the topic, the two versions were identical. The neutral task was not easy; only 41% of participants arrived at the correct answer. But even though the politically charged task was mathematically the same, its results came out polarized—especially among participants who scored high on numeracy (competence in dealing with quantitative information). Their answers depended more on their political predispositions than on a politically neutral evaluation of the problem in question. Or, as Kahan described it: “In other words, higher numeracy improved subjects’ performance in detecting covariance only in the ‘gun control’ condition in which the correct response was congenial to the subjects’ political outlooks.”
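
To get a feel for the task, consider figures of the kind the researchers used (the numbers here are my own illustration, not the study’s): suppose that of the patients who used a new skin cream, 200 improved and 75 got worse, while of those who did not use it, 80 improved and 20 got worse. Comparing raw counts (200 improvers versus 80) suggests the cream works. The correct comparison, however, is between proportions: about 73 percent of cream users improved (200 of 275), against 80 percent of non-users (80 of 100)—so patients actually fared better without the cream. Swapping skin cream for gun control changes nothing mathematically; yet it was on that version that even the most numerate participants stumbled along partisan lines.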

The paper’s authors note that people usually conform to the best scientific evidence available—except when “a policy-relevant fact does become suffused with culturally divisive meanings,” in which case “the pressure to form group-congruent beliefs will often dominate whatever incentives individuals have to ‘get the right answer’ from an empirical standpoint.” The cognitive psychologist Steven Pinker, analyzing Kahan’s findings, pointed out that someone’s agreement or disagreement with a scientific claim often tells others “who they are” rather than “what they know.” Akin to the affect heuristic, our liking or disliking of a political stimulus—our agreement or disagreement with it—can negatively influence our mathematical reasoning and decision-making. Altogether, this makes objective discussion rather challenging once personal opinions get involved.

“A given belief,” Pinker argues in Enlightenment Now, “can become a touchstone, password, motto, shibboleth, sacred value, or oath of allegiance to one of these tribes.” However, both Kahan and Pinker assert that this way of approaching politically charged arguments is—contrary to the affect heuristic—actually rational. Kahan, for instance, argues that from an “individual-welfare perspective,” a single individual’s opinion about a topic like climate change is, on average, too insignificant to make an impact on broader society. By comparison, the repercussions of holding stances that go against those of our social groups loom far larger. To avoid negative feedback from our peers, we would rather choose the political side that aligns with theirs than deviate from it—even if their arguments are unscientific, or just totally bogus.

Antidote to Our Imperfections

Both of these cognitive imperfections—as I would call them—are part of a larger group of biases, heuristics, and fallacies (e.g., the availability heuristic, confirmation bias, and coverage bias) that obstruct efforts to achieve consensus on important scientific and political issues. Apart from educating ourselves about the effects of heuristics, a 2017 study by Jason Moser and his colleagues suggests that talking to ourselves in the third person might reduce emotional involvement during stressful activities—such as a political debate. Whether teaching others about the theories and workings of human behavior changes anything is still up for debate. As Daniel Kahneman put it in Thinking, Fast and Slow, after suggesting that teaching psychology is “mostly a waste of time”:

“People who are taught surprising [psychological] facts about human behaviour may be impressed to the point of telling their friends about what they have heard, but this does not mean that their understanding of the world has really changed.”

With regard to the Identity-protective Cognition Thesis, Kahan and his colleagues suggest that simply informing the public about scientific matters and teaching critical-reasoning skills will not be enough “[t]o dissipate persistent public conflict over decision-relevant science.” The ICT, as mentioned, proposes that we are, in a sense, too rational: keenly aware of our social standing, we prefer to adhere to the political attitudes of the groups we are affiliated with. Kahan therefore recommends we remove the very source that drives this mode of engagement: “The conditions that generate symbolic associations between positions on risk and like facts, on the one hand, and cultural identities, on the other, must be neutralized in order to assure that citizens make use of their capacity for science comprehension.”

In a revision of their paper, however, Kahan and his colleagues made clear that they do not rule out that cognitive biases may also play a role in our proclivity to reason in an “identity-protective fashion.” The degree to which such inclinations exist remains, for the most part, an open question. During an appearance on the Joe Rogan podcast, Pinker suggests we focus on the challenge of aligning beliefs “more with truth, less with tribal loyalty.” Whatever the difficulty of that challenge, Pinker offers a bit of hope in pointing out that positions on many issues—including climate change and gay marriage—have, over the years, either changed political sides or shifted across the political spectrum.

Conclusion

It would not hurt to assume—the next time one encounters a friend yelling at the television during a speech by President Donald Trump, or faces a debate opponent who firmly rejects the available climate science—that emotional involvement or ideologically motivated reasoning is affecting his stances on the relevant scientific issues. This is more likely than that his reactions stem from a lack of knowledge or understanding. We might also conclude that—whatever scientific discoveries are made—the processes by which we engage with politically polarized topics, such as global warming or income inequality, are innate and will in all likelihood not leave us any time soon. Still, irrespective of that innate character, we should keep in mind that, “if you believe something ’cause you’re on the right or on the left,” as Pinker puts it, “then you’re an idiot.”

Alessandro van den Berg is an economics teacher in the Netherlands.
