Beyond Labels

A 360° Discussion of Foreign, National and Local Policy Issues


Tribal Affiliation, Scientific Literacy, and Climate Change

Surveys show that fewer conservatives than liberals believe that climate change is a global threat caused mainly by humans releasing greenhouse gases.

Is that because they don’t understand science? If they did, would they change their minds? Are liberals convinced of the threat because they understand the underlying science?  Can research help us understand which of these statements are likely to be true?

I think that the answer to the last question is: yes.

And the research says that the answers to the others are no, no, and no.

The details are in the paper "The Tragedy of the Risk-Perception Commons: Culture Conflict, Rationality Conflict, and Climate Change," linked through the website of the Cultural Cognition Project at Yale University. There's a lot of good stuff at the site beyond this article, including some interesting studies on tribal affiliation, scientific literacy and evolution, nuclear energy, and other controversial topics.

Here's the CliffsNotes version of the referenced article:

If you are a liberal, the more science you know, the more you are likely to be convinced climate change is mainly caused by humans and a big problem.

If you are a conservative, the more science you know, the less you believe that. Really. That’s what the research says.

How do you do research that tells you that?

First, you develop a test that measures general understanding of science. The test itself has been well studied and vetted.

Give a bunch of people the test. These studies typically include 1,500 to 2,000 people. Have the people give their views on climate change and their political orientation.

Analyze the results.

If it's true that the more people understand science, the more they are convinced of climate change, you'd expect to see scientific literacy correlated with climate concern and scientific illiteracy correlated with climate skepticism.

In that case you'd expect to see the climate skeptics (mainly conservatives) clustered at the low end of the science-knowledge scale and the climate-concerned (mostly liberals) at the other end.

It turns out that that’s not the way it comes out.

The range of science knowledge among liberals and conservatives is roughly the same. There is no significant correlation between science knowledge and political orientation.

You discover that for liberals, science knowledge is correlated with climate concern. The more science a self-identified liberal knows, the more likely they are to be concerned about the climate. That's not surprising.

But you discover that for conservatives, science knowledge is correlated with climate skepticism. The more science that a conservative knows, the less convinced they are. That’s surprising to a lot of people.

Indeed, if you look just at the most scientifically literate and numerate people, you find that they are slightly less likely than average to see climate change as a serious threat.
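To make the shape of that analysis concrete, here is a minimal sketch in Python. It is not the authors' code; the respondents, scores, and data layout are invented purely for illustration. It just shows how a literacy/concern correlation could be computed separately for each political group.

```python
from statistics import correlation  # available in Python 3.10+

# Hypothetical respondents: (science-literacy score, climate-concern rating, orientation).
# The numbers are made up for illustration and are not taken from the paper.
respondents = [
    (3, 4, "liberal"), (5, 6, "liberal"), (7, 8, "liberal"), (9, 9, "liberal"),
    (2, 6, "conservative"), (4, 4, "conservative"),
    (6, 5, "conservative"), (8, 3, "conservative"),
]

for group in ("liberal", "conservative"):
    literacy = [lit for lit, _, who in respondents if who == group]
    concern = [con for _, con, who in respondents if who == group]
    print(group, round(correlation(literacy, concern), 2))

# A positive correlation for liberals and a negative one for conservatives would
# match the pattern reported in the paper; a single positive correlation for
# everyone would match the "more knowledge, more concern" hypothesis instead.
```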

This research does not tell us what conclusions to draw about climate change, or how concerned to be.

It does tell us, though, that tribal membership influences opinions far more than our knowledge of science does.

Whatever your opinion, it’s worth keeping that in mind.

Can We Be Rational Citizens?

Never mind the other guys (whatever "the other guys" might mean to you on any particular topic): are we (meaning you, me, and the people we find ourselves agreeing with on that topic) being rational?

This is a big subject, and the more I think about and research it, the bigger it gets. So here are some subtopics/questions for discussion.

  1. What does it mean to be a rational person?
  2. What does it mean to be a rational citizen/voter?
  3. Is it even a good idea to try to be one?
  4. Do you think that you vote in a rational way all the time? Most of the time? If so, what definition of rationality are you using?
  5. Are you (and the people you agree with on a particular topic) rational citizens?
  6. Do you think that the people who disagree with you are rational citizens?
  7. Is there something that you do to help you improve your level of rational discourse?
  8. Is there something that you know that you are not doing, but which you could do?

Here's some food for thought on the topic by a blogger whose nom de blog is "Scott Alexander." His real name (or at least the name he uses elsewhere) is Scott Siskind. Either way, he seems to call himself Scott, which prejudices me (and perhaps at least one other of us) in his favor.

Scott writes long, long, very well thought out, and very well documented posts. If you're new to his writing, you probably won't be able to wade through his build-up to the main argument in the first post. So let me give you a teaser from the post and suggest that, if you find it interesting, you start reading around Section VI or VII and then work backward, bit by bit, if you like it.

To understand the teaser you've got to know about the Implicit Association Test. This is a well-studied, time-tested, cleverly constructed psychological test that tries to measure the strength of associations between examples from one concept area, for example:

Good: Joy, Love, Peace, Wonderful, Pleasure, Glorious, Laughter, Happy
Bad: Agony, Terrible, Horrible, Nasty, Evil, Awful, Failure, Hurt

and from the domain being tested, for example pictures of black people and of white people.

If you have an easier time associating "good" things with pictures of white people than with pictures of black people, and an easier time associating "bad" things with black people than with white people, the test concludes that you have a measurable prejudice in favor of white people and against black people.

If you believe that you hold no prejudice and your responses show this effect nonetheless, then the theory holds that you are still prejudiced, but unconscious of the prejudice.   (Wikipedia article)
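If it helps to make that concrete, here is a much-simplified sketch in Python of the idea behind the test's "d-score." This is not the actual scoring algorithm (the real procedure adds error penalties, trial filtering, and other refinements), and the reaction times are invented:

```python
from statistics import mean, stdev

# Invented reaction times in milliseconds for one hypothetical test-taker.
compatible = [650, 700, 620, 680, 710]    # block pairing, e.g., white+good / black+bad
incompatible = [820, 790, 860, 840, 800]  # block pairing, e.g., white+bad / black+good

# Difference in average response time, scaled by the spread of all responses.
d_score = (mean(incompatible) - mean(compatible)) / stdev(compatible + incompatible)
print(round(d_score, 2))  # bigger gaps between the two pairings give bigger scores
```

The larger the gap between how quickly you respond under the two pairings, relative to how variable your responses are overall, the larger the score.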

You can try it yourself here. You can take the test as a guest, or log in or register.

Here's the promised teaser from Scott's article:

Anyway, three months ago, someone finally had the bright idea of doing an Implicit Association Test with political parties, and they found that people's unconscious partisan biases were half again as strong as their unconscious racial biases (h/t Bloomberg). For example, if you are a white Democrat, your unconscious bias against blacks (as measured by something called a d-score) is 0.16, but your unconscious bias against Republicans will be 0.23. The Cohen's d for racial bias was 0.61, by the book a "moderate" effect size; for party it was 0.95, a "large" effect size.
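For anyone unfamiliar with Cohen's d: it is just the difference between two group means divided by their pooled standard deviation, with roughly 0.2 conventionally read as "small," 0.5 as "moderate," and 0.8 as "large." Here is a minimal sketch of the arithmetic in Python, using made-up numbers rather than anything from the study Scott cites:

```python
from math import sqrt
from statistics import mean, stdev

def cohens_d(a, b):
    """Difference in group means divided by the pooled standard deviation."""
    pooled_sd = sqrt(((len(a) - 1) * stdev(a) ** 2 + (len(b) - 1) * stdev(b) ** 2)
                     / (len(a) + len(b) - 2))
    return (mean(a) - mean(b)) / pooled_sd

# Invented bias scores for two hypothetical groups, purely to show the arithmetic.
group_a = [0.3, 0.1, 0.4, 0.0, 0.2]
group_b = [0.2, 0.0, 0.3, -0.1, 0.1]
print(round(cohens_d(group_a, group_b), 2))  # about 0.63, a "moderate" effect by convention
```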

Read the whole thing here: “I Can Tolerate Anything Except The Outgroup”

The other post (also long) is called "Five Case Studies in Politicization" and applies some of the theory from the first article to real situations. Scott provides links to news stories illustrating his points, just in case you don't believe that an article like "Fat Lesbians Got All The Ebola Dollars, But Blame The GOP" could possibly exist. Apparently it does. Or did when I clicked on it. This post is easier to get into but, like many of his posts, long. And thought-provoking.

 
