
Citizen Science Works!

Earlier this month, we presented the results of the pilot phase of VerbCorner -- our citizen science project probing the nature of linguistic structure -- at a scientific conference (the Workshop on Events in Language and Cognition). You can see the poster describing the work here.

For those who don't know or don't remember, in VerbCorner we're trying to work out the grammar rules that apply to verbs. Why do you say Agnes looked at the wall but not Agnes saw at the wall? Why do you say Bart filled the glass with water but not Bart poured the glass with water? Many -- but not all -- linguists believe that these grammatical idiosyncrasies are explained by the meanings of the verbs, but the evidence is sketchy. Volunteers have been visiting our website to help analyze the meanings of verbs so we can find out.

High-Quality Analyses by Volunteers

Our initial work -- the pilot for the pilot, if you will -- suggested that we could get high-quality analyses from volunteers. But that was based on a very small sample. As of late February, over 10,000 volunteers had contributed over 525,000 analyses. In general, agreement between different volunteers was quite high -- which is a good sign. Just as importantly, we had a smaller set of 'test' items for which we knew what professional linguists would say. When we combine the analyses of different volunteers for the same sentence to get a 'final answer', the results match the analyses of professional linguists very well. This shows that we can trust these results.
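To give a flavor of what that combination step involves, here is a minimal sketch of one way volunteer judgments could be pooled and checked against expert answers. It assumes a simple majority vote and uses made-up example data; the sentences, judgments, and voting rule are illustrative, not the project's actual procedure.

```python
from collections import Counter

# Hypothetical volunteer judgments for one meaning question
# (e.g. "does anything apply physical force to anything else?").
volunteer_judgments = {
    "Agnes pushed Bart.":    ["yes", "yes", "yes", "no", "yes"],
    "Agnes looked at Bart.": ["no", "no", "yes", "no", "no"],
}

# Gold-standard answers from professional linguists for the 'test' items.
expert_answers = {
    "Agnes pushed Bart.":    "yes",
    "Agnes looked at Bart.": "no",
}

def combine(judgments):
    """Combine several volunteers' answers into one 'final answer' by majority vote."""
    answer, count = Counter(judgments).most_common(1)[0]
    return answer, count / len(judgments)  # final answer and level of agreement

matches = 0
for sentence, judgments in volunteer_judgments.items():
    answer, agreement = combine(judgments)
    matches += answer == expert_answers[sentence]
    print(f"{sentence} -> {answer} (agreement: {agreement:.0%})")

print(f"Final answers matching the experts: {matches} of {len(expert_answers)}")
```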

Where Quantity Becomes Quality

Just as importantly, we were able to analyze a lot of sentences. In the VerbCorner project, we are trying to determine which of a specific set of aspects of meaning each sentence involves. One aspect is whether the sentence involves something changing physical form (for example, Agnes broke the vase as opposed to Agnes touched the vase). Another is whether the sentence involves anything applying physical force to anything else (for example, Agnes pushed Bart as opposed to Agnes looked at Bart).

For purposes of bookkeeping, let's call one aspect of meaning for one sentence an 'item.' After combining across different volunteers, the results were clear enough to definitively code 31,429 items. This makes VerbCorner by far the largest study of its kind. (A typical study might only look at a few hundred items.)
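To spell out that bookkeeping, here is a small sketch of how items could be tallied, assuming an item is a (sentence, meaning-aspect) pair and that an item counts as definitively coded once the combined volunteer answers are sufficiently lopsided. The data and the 80% cutoff are illustrative assumptions, not the project's actual criteria.

```python
from collections import Counter

# Hypothetical responses keyed by item: one aspect of meaning asked about one sentence.
responses = {
    ("Agnes broke the vase.",   "something changes physical form"): ["yes"] * 9 + ["no"],
    ("Agnes touched the vase.", "something changes physical form"): ["no"] * 6 + ["yes"] * 4,
}

def is_definitive(answers, threshold=0.8):
    """Treat an item as definitively coded when the majority answer is lopsided enough."""
    _, count = Counter(answers).most_common(1)[0]
    return count / len(answers) >= threshold

coded = [item for item, answers in responses.items() if is_definitive(answers)]
print(f"Definitively coded items: {len(coded)} of {len(responses)}")
```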

This quantity makes a big difference. Because studies are usually so small, they can only look at one tiny corner of the language. The problem is that that corner might not be representative. Imagine studying what Americans are like by surveying only people in Brooklyn. This tends to lead to disagreements between studies: one linguist studies "Brooklyn" and another studies "Omaha", and they come to very different conclusions! Unfortunately, language is so complex and so vast that one person can only analyze one corner. This is why we are recruiting large numbers of volunteers to help!

The Results

One major question we had was how much the rules of verb argument structure (that is, the kinds of grammatical rules described above) depend on meaning. Some linguists think they depend entirely on meaning: If you know the meaning of a verb, you know what its grammar will be like. Others think meaning has very little role to play. Most linguists are probably somewhere in the middle.

The results suggest that the first group is right: these rules depend almost entirely on meaning. Or perhaps even entirely; the results are so close that it is hard to tell.

The reason I say "suggest," however, is that while we have the biggest study of its kind, it still only covers about 1% of English. So we've gone from studying Brooklyn to studying all of NYC. It's an improvement, but not yet enough. 

This is why I called this first phase a "pilot". We wanted to see if we could get high-quality, clearly-interpretable results from working with volunteers. Many researchers thought this would be impossible. After all, linguists have to go through a lot of schooling to learn how to analyze sentences. But a key finding of the Citizen Science movement is that there are a lot of smart enthusiasts out there who may not be professionals but can very much contribute to science.

The Next Phase

We have set a goal of reaching 50,000 completed items by July 1st. That will require upping our game and nearly quadrupling the rate at which we're analyzing items. But the beauty of Citizen Science is that this does not really require much work on anyone's part: if 3,000 volunteers each spend about one hour contributing to the project, we'll more than hit that goal.
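As a rough back-of-envelope check on that claim, here is the arithmetic using only the figures quoted above, and assuming the pilot's ratio of analyses to definitively coded items carries over (a rough approximation, not an official projection):

```python
# Back-of-envelope check using the figures quoted above.
analyses_so_far = 525_000     # volunteer analyses as of late February
items_so_far = 31_429         # items definitively coded so far
goal = 50_000                 # target number of items by July 1st

analyses_per_item = analyses_so_far / items_so_far   # roughly 17 analyses per coded item
items_needed = goal - items_so_far                    # roughly 18,600 more items
analyses_needed = items_needed * analyses_per_item    # roughly 310,000 more analyses

volunteers = 3_000
per_volunteer = analyses_needed / volunteers          # roughly 100 analyses each
print(f"About {per_volunteer:.0f} quick judgments per volunteer would get us there.")
```

That works out to roughly one quick judgment every half a minute over the course of an hour. So please help out, and please tell your friends. You can contribute here.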