Monday, December 5, 2016

Politically Uninformed and Unaware Of It: The Dunning-Kruger Effect

Given the current degree of interest in the pervasiveness of political misinformation and the role that it may have played in this year's election campaign, an important question to ask is whether individuals have the ability to recognize that they're misinformed.

As it turns out, as part of the survey I conducted earlier this year, I set out to examine that very question. Simply put, my research question was this:

Do uninformed people realize just how uninformed they are? And if they don't, can their lack of awareness be explained in any way?

In 1999, two Cornell University psychologists, David Dunning and Justin Kruger, documented the phenomenon wherein individuals who lack certain skills also tend to lack the ability to recognize that they lack those skills. They presented their findings in an article in the Journal of Personality and Social Psychology and discussed the phenomenon that now bears their name: the "Dunning-Kruger Effect."

Their study was fairly straightforward: they presented their subjects with a series of tests assessing their ability in a number of areas. Afterwards, they asked each subject to assess how well they'd done on these tests. In each instance, the subjects who scored lower consistently overestimated their actual abilities. Dunning and Kruger took this to mean that people who performed poorly lacked the metacognitive skills to realize it.

Simply put: "incompetent" people lacked the ability to recognize their actual level of "incompetence."

Using this research as a starting point, I employed the same strategy in a survey I administered this summer to a national sample of American adults. The ability test I used was a standard battery of ten factual questions about politics (e.g., "Do you happen to know which political party currently has the most members in the United States House of Representatives?" "How many times under current laws can someone be elected President of the United States?" "What political office does Paul Ryan currently hold?"). Each respondent thus received a Political Information score ranging from 0 to 10 based on the number of correct responses they gave to these questions.

I then had them assess their own abilities in two ways. First, after each question I asked them how certain they were, on a scale from 1 to 4, that they'd answered it correctly, where 1 indicated they weren't certain at all and 4 indicated they were very certain. In addition, at the end of the "test" I asked them to estimate how many of the questions they'd gotten correct.
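
For concreteness, here's a minimal sketch of how these two measures could be scored from raw responses. The question keys, answers, and field names are hypothetical illustrations, not the actual survey instrument:

```python
# Minimal scoring sketch; the answer key and field names are hypothetical,
# not the actual survey instrument, which had ten factual questions.
ANSWER_KEY = {
    "house_majority": "Republican",              # party with most House seats
    "presidential_terms": "2",                   # elections to the presidency allowed
    "paul_ryan_office": "Speaker of the House",
    # ... seven more items in the real battery
}

def score_respondent(responses, certainty, estimate):
    """responses: question id -> answer given
    certainty:  question id -> 1-4 rating given right after that question
    estimate:   the respondent's end-of-test guess at their number correct
    """
    info_score = sum(responses.get(q) == ans for q, ans in ANSWER_KEY.items())
    return {
        "info_score": info_score,  # Political Information score, 0-10
        "estimate": estimate,      # self-assessed number correct, 0-10
        "certainty": certainty,    # per-question confidence ratings
    }
```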

The results I obtained are consistent with those found by Professors Dunning and Kruger. The individuals with lower Political Information scores significantly overestimated how much they actually knew. Simply put, they were politically uninformed and unaware of it.

The graph below shows the basic pattern demonstrating this:


[Figure: average estimated number of correct answers (red line) plotted against actual Political Information score]

The horizontal axis represents people's Political Information scores: the further to the right, the more questions they got correct. The red line represents the average estimate of the number of answers respondents believed they got correct, arranged according to how many they actually got correct. For example, people who got none of the questions correct believed, on average, that they had gotten between 3 and 4 correct. Those who got four or fewer correct pretty consistently overestimated the number of correct responses they actually gave. Those who got five or more correct were significantly more likely to estimate their number of correct responses accurately, although those at the highest end of the scale tended to slightly underestimate their ability.

Of course, it could be that people with lower levels of Political Information simply aren't any good at guessing the number of correct responses they gave. That's where the additional self-assessment measure comes into play. By asking them how confident they were that they'd answered each question correctly, we're not just measuring their ability to guess the number of correct answers, but their level of confidence in what they believe they know about politics.

For each respondent, I calculated an overall confidence score by adding up their ten 4-point confidence ratings. A person who was "very certain" they'd answered all ten questions correctly would receive a confidence score of 40, the maximum; a person who was "not certain at all" about any of their answers would receive a score of 10, the minimum.
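
In code, the overall confidence score is just the sum of the ten ratings; here's a sketch, reusing the hypothetical names from the earlier snippet:

```python
def confidence_score(certainty):
    """Sum of the ten 1-4 certainty ratings: 40 means "very certain" on
    every question; 10, the floor, means "not certain at all" on all ten."""
    return sum(certainty.values())
```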

The results in the graph below are similar to those presented above and confirm the interpretation: those with low levels of information weren't just bad at guessing their number of correct responses; they truly believed they knew more than they actually did. In many instances they were "very certain" they knew something was a "fact" when it wasn't a fact at all. Indeed, the gap between their level of confidence and the level they would have been justified in having was even greater than their simple overestimation of correct responses suggested.

[Figure: overall confidence score plotted against respondents' actual number of correct answers]

So, not only were they politically uninformed and unaware of it, they were also significantly overconfident in their own abilities. They weren't just uninformed; they were misinformed.

I'm sure many of you think you know a person like this. And I'm fairly certain of the next question many of you will want to ask: Which candidate's supporters are more likely to arrogantly believe they know more than they actually do?

Who Are The Overconfident Overestimators?

To determine this I calculated two additional scores: overestimation and overconfidence. Overestimation is simply the difference between the number of responses a person thought they got correct and the number they actually got correct. Overconfidence is the sum of their certainty ratings on the questions they got wrong. This differentiates a person who got a question wrong but wasn't certain about it from one who got it wrong but believed they hadn't. In short, it highlights the difference between being simply uninformed and being misinformed.
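
As a sketch, the two scores might be computed like this; the names are hypothetical, and wrong_questions stands in for the set of question ids a respondent missed:

```python
def overestimation(estimate, info_score):
    """Estimated minus actual number correct; positive = overestimation."""
    return estimate - info_score

def overconfidence(certainty, wrong_questions):
    """Sum of the certainty ratings on the questions answered incorrectly,
    so a confidently wrong answer adds more than a hesitantly wrong one."""
    return sum(certainty[q] for q in wrong_questions)
```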

Despite what I'm sure many of you might want to be true, the results showed that neither candidate's supporters were more likely to overestimate or to be overconfident. There was no significant difference between Clinton and Trump supporters in their likelihood of overestimating their level of information or of being overconfident in their misinformation.

However, what I did find was quite interesting: while there was no relationship between the direction of a person's political leanings and their level of overestimation or overconfidence, there was a significant pattern associated with the extremity of their views.

I had people place themselves on a standard 7-point political ideology scale where 1 meant "extremely liberal" and 7 meant "extremely conservative"; the middle of the scale, 4, represents "moderate." Neither side of the scale showed a greater likelihood to overestimate their level of information or to be overconfident in it. But when I folded the scale in half to measure not whether people were liberal or conservative but how extreme they were in their ideological views, a very clear pattern emerged: extreme liberals and extreme conservatives were both more likely than those at the ideological center to overestimate their level of information and to be overconfident in their misinformation.
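
Folding the scale is a one-line transformation; here's a sketch (the function name is mine):

```python
def extremity(ideology):
    """Fold the 7-point ideology scale at its midpoint: 4 ("moderate")
    maps to 0, while 1 ("extremely liberal") and 7 ("extremely
    conservative") both map to 3, the maximum extremity."""
    return abs(ideology - 4)
```

So extremity(1) == extremity(7) == 3, putting the two ideological poles on equal footing, while extremity(4) == 0 for moderates.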

The effect was not huge, but it was significant. Those on the ideological extremes overestimated their number of correct answers by roughly one more question than self-identified moderates did, and their level of overconfidence was slightly higher as well. Simply put: with extremity comes certainty, and some of that certainty is clearly misplaced.

In addition, I found that overestimation and overconfidence were both significantly correlated with one of the dimensions of anti-intellectualism I talked about in my previous post: both were higher among those demonstrating higher levels of unreflective instrumentalism, an outlook that questions the value of education beyond its function of simply providing job training.

So, not only are such people less informed and unaware of it, they also tend to hold a certain animosity toward the very thing that could make them less uninformed.

The Challenge of Combatting Misinformation

To be sure, there's more to being politically informed than knowing a handful of facts, but these findings may give us insight into why it seems so challenging to "correct" a person's misinformation. When people firmly believe something to be true, it's that much more difficult to convince them otherwise, and those who are more extreme in their views are that much more likely to hold their beliefs with firm conviction.

People want to feel justified in their beliefs. They'll seek out information that reaffirms what they already believe in order to become even more confident in their position, and they'll vigorously resist any attempt to challenge it.

Misinformation spreads because people want it to be true and, therefore, often don't do the due diligence to determine whether it actually is. The only real solution is to point out misinformation when you see it, though there's no guarantee that doing so will work. The real challenge is finding a way to do it that is effective and doesn't cause people to erect and reinforce their cognitive barriers of resistance.

There are no easy answers for doing that, but one thing is fairly certain: insulting someone while trying to tell them they're wrong is probably not going to be very effective. All too often that's what I see happening in online comment threads, where "discussions" devolve into counterproductive name-calling and insults. (I put "discussions" in quotes because they're often not really about exchanging ideas and developing understanding, but about scoring "burn" points.)

It may give the insulter an emotional payoff, but it's not likely to do much to make the uninformed insultee less uninformed. This may be a less than satisfying answer for many, but politics isn't simple; why would you expect that fixing people's misconceptions about it would be?

Comments:

  1. Nice work. Not particularly surprising. "Conservative" and "extremely conservative" may, however, be misleading categories. It depends on what people want to conserve, and the term may not translate well for Catholic socialists in some cases. There's no good single term; an actual test of beliefs might pick up some conservative lefties. After all, "conservative" has been used in other countries to describe Russians and Chinese on the left.

    Replies
    1. The measure of ideology I use here is the standard measure that the American National Election Study survey has used for decades. It may be problematic for a variety of reasons, but none of them is very relevant in this case. With respect to your specific comment: since my focus is on the extremity of people's views rather than their direction, the problem of what "conservative" means doesn't really matter, especially given that this isn't cross-national data.