Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Thursday, May 2, 2013

Foxes, hedgehogs, and extreme politics

As if we needed anything else to make us less confident about what goes on inside our skulls, an article in e! Science News appeared on Monday, entitled, "Extreme Political Attitudes May Stem From an Illusion of Understanding."

The study's principal author, Philip Fernbach of the University of Colorado, explained that the study came out of an observation that people who loudly expressed views on politics often seemed not to have much in the way of factual knowledge about the topic upon which they were expounding.

"We wanted to know how it's possible that people can maintain such strong positions on issues that are so complex -- such as macroeconomics, health care, foreign relations -- and yet seem to be so ill-informed about those issues,"  Fernbach said.

What the study did was to ask a group of test subjects to rate how well they understood six different political issues, including instituting merit pay for teachers, raising the retirement age for Social Security, and enacting a flat tax.  The subjects were then asked to explain two of the policies, including their own position and why they held it, and were questioned by the researchers on their understanding of the facts behind each policy.  Afterwards, they were asked to re-rate their level of comprehension.

Across the board, self-assessment scores went down on the subjects they were asked to explain.  More importantly, their positions shifted -- there was a distinct movement toward the center that occurred regardless of the political affiliation of the participant.  Further, the worse the person's explanation had been -- i.e., the more their ignorance of the facts had been uncovered -- the further toward the center they shifted.

This seems to be further evidence for the Dunning-Kruger effect -- a cognitive bias in which people, and especially the least knowledgeable, tend to overestimate their own knowledge and skill.  (It also brings to mind Dave Barry's comment, "Everyone thinks they're an above-average driver.")

I'm also reminded of Philip Tetlock's brilliant work Expert Political Judgment, which is summarized here but which anyone who is a student of politics or sociology should read in its entirety.  In the research for his book, he analyzed the political pronouncements of hundreds of individuals, comparing the predictions of experts in a variety of fields with the actual outcomes in the real world, and used this information to draw some fascinating conclusions about human social behavior.  The relevant part of his argument, for our purposes here, is that humans exhibit two basic "cognitive styles," which he calls "the fox and the hedgehog" (the metaphor comes from a fragment by the ancient Greek poet Archilochus, made famous by Isaiah Berlin's essay "The Hedgehog and the Fox").

Foxes, Tetlock says, tend to be able to see multiple viewpoints, and have a high tolerance for ambiguity (in the interest of conciseness, quotes are taken from the summary, not from the original book):
Experts who think in the 'Fox' cognitive style are suspicious of a commitment to any one way of seeing the issue, and prefer a loose insight that is nonetheless calibrated from many different perspectives.  They use quantification of uncertain events more as calibration, as a metaphor, than as a prediction.  They are tolerant of dissonance within a model - for example, that an 'enemy' regime might have redeeming qualities - and relatively ready to recalibrate their view when unexpected events cast doubt on what they had previously believed to be true.
Hedgehogs, on the other hand, like certainty, closure, and definite answers:
In contrast to this, Hedgehogs work hard to exclude dissonance from their models. They prefer to treat events which contradict their expectations as exceptions, and to re-interpret events in such a way as to allocate exceptions to external events. For example, positive aspects of an enemy regime may be assigned to propaganda, either on the part of the regime or through its sympathizers...  Hedgehogs tend to flourish and excel in environments in which uncertainty and ambiguity have been excluded, either by actual or artificial means. The mantra of "targets and accountability" was made by and for Hedgehogs.
The differences, Tetlock said, hold irrespective of political leaning; there are conservative and liberal foxes, and conservative and liberal hedgehogs.  But, most importantly, the foxes' tolerance of many viewpoints, and their awareness of their own ignorance, gives them the appearance of knowing less than they actually do, and lessens their influence on policy and society.  The hedgehogs' certainty, and their clear, concise answers to complex problems, gives them the appearance of knowing more than they actually do, and increases their influence.

Hedgehogs, Tetlock found, were more often wrong in their assessment of political situations, but their views achieved wide impact.  Foxes were more often right -- but no one listened.

So, anyway, I read all of this with a vague sense of unease.  Having a blog, after all, implies some level of arrogance -- that you believe your views to be important, intelligent, and interesting enough that people, many of them total strangers, will want to read what you have to say.  Given Fernbach's study, not to mention the Dunning-Kruger effect and the conclusions of Tetlock's research, it does leave me with a bit of a chill.  Would my views on topics become less extreme if I were forced to reconsider the facts of the situation?  Do I really think I'm more knowledgeable than I actually am?  Worst of all (for a blogger), am I a simplistic thinker who is often wrong but whose views have wide social impact, or a complex thinker whom no one pays attention to?

Oy.  I'm not sure I, um, want to reevaluate all this.  I think I'll just go have breakfast.  That sounds like a definitive solution to the problem, right?

Of course right.

4 comments:

  1. I guess I have some new reading to chew on. While I have seen this behavior all of my life, I had no idea what to call it. This sums up so much!

  2. I've seen that you're open to changing your opinion based on evidence, and the fact that you spend time worrying about it is a pretty good indication that you're "foxy."

    This study (when compared with others that show that when you give people contrary information they often become even more set in their dumbass views) also seems to confirm an opinion of mine -- that the best way to convince people of something is not to tell them things, but to ask them things.

  3. I always thought Ayn Rand looked like a Hedgehog... Now I know why.

  4. A good look at two different ways of human thinking (or not thinking)! Mark Twain once said:

    "Sane and intelligent human beings are like all other human beings, and carefully and cautiously and diligently conceal their private real opinions from the world and give out fictitious ones in their stead for general consumption."

    We should strive to change our inner-opinions so we don't revert back to the hedgehog mindset all too often. It is not enough just to say strongly, we must train ourselves to think strongly, which should fix the problem altogether. Easier said than done for the average human, harder for the upper and lower 50%...if you get my drift :)
