
Does Global Warming Exist, and Other Questions We Want Answered

This week, I asked 101 people on Amazon Mechanical Turk both whether global temperatures have been increasing due to human activity AND what percentage of other people on Amazon Mechanical Turk would say yes to the first question. 78% said yes to the first question. Here are the answers to the second, broken down by whether the respondent did or did not believe in man-made global warming:

Question: What percentage of other people on Amazon Mechanical Turk believe global temperatures have been increasing due to human activity?

                     Average        1st-3rd Quartile
Believers              72%              60%-84%
Denialists             58%              50%-74%
Correct answer         78%                ---

Notice that those who believe global warming is caused by human activity are much better at estimating how many other people will agree than are those who do not. Interestingly, the denialists' answer is much closer to the percentage of believers among all Americans than to the percentage among Turkers (who are mostly but not exclusively American, and are certainly a non-random sample).
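For concreteness, here is a minimal sketch (in Python) of how a table like the one above could be computed. The `responses` list and its values are hypothetical stand-ins, not the actual survey data; the quartile calculation is deliberately crude.

    # Each record is (believes_in_man_made_warming, estimated_percent_of_Turkers_who_believe).
    # These values are hypothetical placeholders, not the real survey responses.
    responses = [
        (True, 80), (True, 65), (False, 50), (False, 60), (True, 75),
    ]

    def summarize(group):
        """Average and rough 1st/3rd quartiles of the estimates in one group."""
        estimates = sorted(est for _, est in group)
        n = len(estimates)
        average = sum(estimates) / n
        q1 = estimates[n // 4]            # crude quartiles, fine for a sketch
        q3 = estimates[(3 * n) // 4]
        return average, q1, q3

    believers = [r for r in responses if r[0]]
    denialists = [r for r in responses if not r[0]]
    correct = 100 * len(believers) / len(responses)  # actual percentage answering yes

    print("Believers:     ", summarize(believers))
    print("Denialists:    ", summarize(denialists))
    print("Correct answer:", correct)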

So what?

Why should we care? More importantly, why did I do this experiment? A major problem in science/life/everything is that people disagree about the answers to questions, and we have to decide who to believe. A common-sense strategy is to go with whatever the majority of experts says. There are two problems, though: first, it's not always easy to identify an expert, and second, the majority of experts can be wrong.

For instance, you might ask a group of Americans what the capital of Illinois or New York is. Although in theory, Americans should be experts in such matters (it's usually part of the high school curriculum), in fact the majority answer in both cases is likely to be incorrect (Chicago and New York City, rather than Springfield and Albany). This was even true in a recent study of, for instance, MIT or Princeton undergraduates, who in theory are smart and well-educated.

[Image: Which of these guys should you believe?]

So how should we decide which experts to listen to, if we can't just go with "majority rules"? A long chain of research suggests an option: ask each of the experts to predict what the other experts would say. It turns out that the people who are best at estimating what other people's answers will be are also the most likely to be correct. (I'd love to cite papers here, but this introduction comes from a talk I attended earlier in the week, and I don't have the citations in my notes.) In essence, this is an old trick: ask people two questions, one of which you know the answer to and one of which you don't. Then trust the answers on the second question that come from the people who got the first question right.
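One simple way to operationalize this idea (a sketch only, not the specific method from the talk or the literature) is to score each respondent by how close their prediction of the group's answer was to the group's actual answer, and then weight their own answer by that score. The `respondents` records below are hypothetical.

    # Hypothetical records: each respondent gives an answer (True/False) plus a
    # prediction of what percentage of the whole group will answer True.
    respondents = [
        {"answer": True,  "predicted_percent_true": 75},
        {"answer": True,  "predicted_percent_true": 70},
        {"answer": False, "predicted_percent_true": 55},
    ]

    # The actual percentage of the group answering True.
    actual_percent_true = 100 * sum(r["answer"] for r in respondents) / len(respondents)

    def weight(r):
        """Give more say to respondents whose prediction of the group was closer to reality."""
        error = abs(r["predicted_percent_true"] - actual_percent_true)
        return 1.0 / (1.0 + error)

    weighted_yes = sum(weight(r) for r in respondents if r["answer"])
    weighted_no = sum(weight(r) for r in respondents if not r["answer"])

    print("Prediction-weighted answer:", weighted_yes > weighted_no)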

This method has been tested on a number of questions and works well. It was actually tested on the state capital problem described above, where it does much better than a simple "majority rules" approach. The speaker at the talk I attended argued that this is because people who are better able to estimate the average answer simply know more and are thus more reliable. Another way of looking at it (which the speaker also mentioned) is that someone who thinks Chicago is the capital of Illinois probably isn't considering any other possibilities, so when asked what other people will say, they guess "Chicago." The person who knows that Springfield is in fact the capital probably also knows that many people will be misled by Chicago being the best-known city in Illinois, and so will correctly guess that lots of people will say Chicago but that some will also say Springfield.
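That Springfield/Chicago asymmetry suggests a closely related decision rule, sometimes called the "surprisingly popular" answer in later work by Prelec and colleagues (a sketch of the general idea, not necessarily what the speaker described): pick the answer that gets more votes than respondents, on average, predicted it would. In this toy example the records are hypothetical; Springfield wins because it outperforms expectations even though Chicago wins the raw vote.

    from collections import defaultdict

    # Hypothetical records for "What is the capital of Illinois?": each respondent gives
    # an answer plus predicted vote shares (in %) for each candidate answer.
    respondents = [
        {"answer": "Chicago",     "predicted": {"Chicago": 80, "Springfield": 20}},
        {"answer": "Chicago",     "predicted": {"Chicago": 90, "Springfield": 10}},
        {"answer": "Springfield", "predicted": {"Chicago": 70, "Springfield": 30}},
    ]

    n = len(respondents)
    actual = defaultdict(float)     # actual vote share of each answer
    predicted = defaultdict(float)  # mean predicted share of each answer
    for r in respondents:
        actual[r["answer"]] += 100.0 / n
        for option, share in r["predicted"].items():
            predicted[option] += share / n

    # Pick the answer given more often than people predicted it would be.
    best = max(actual, key=lambda option: actual[option] - predicted[option])
    print("Surprisingly popular answer:", best)  # Springfield, despite Chicago winning the raw vote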

Harder Questions

I wondered, then, how well it would work for a question where everybody knows there are two possible answers. So I surveyed Turkers about global warming. Believers were much better at estimating how many believers there are on Turk than denialists were.

Obviously, there are a few ways of interpreting this. Perhaps denialists underestimate both the proportion of climate scientists who believe in global warming (~100%) and the percentage of ordinary people who believe in global warming, and thus they think the evidence is weaker than it is. Alternatively, denialists don't believe in global warming, have trouble accepting that other people do, and therefore lower their estimates. The latter proposal, though, would suggest that believers should overestimate the percentage of people who believe in global warming, which is not in fact the case.

Will this method work in general? In some cases, it won't. If you asked expert physicists in 1530 about quantum mechanics, presumably none of them would believe it, and all would correctly predict that none of the others would believe it. In other cases, it's irrelevant (nearly 100% of climatologists believe in man-made global warming, and I expect they all know that they all believe in it). More importantly, the method may work well for some types of questions and not others. I heard in this talk that researchers have started using the method to predict product sales and outcomes of sports matches, and it actually does quite well. I haven't seen any of the data yet, though.


------

1 comment:

Tim said...

I think you're talking about the Bayesian Truth Serum. Here's the link to the paper:

http://www.sciencemag.org/cgi/content/abstract/306/5695/462?etoc

and the reference is:

Prelec, D. (2004). A Bayesian Truth Serum for Subjective Data. Science, 306(5695), 462-466.