Computers and language
I just dictated the following note to Siri:

"Many of our best computer systems treat words as essentially meaningless symbols that need to be moved around."

Here's what she wrote:

"Many of our best computer system street words is essentially meaningless symbols that need to be moved around."

I rest my case.
The problem of meaning
I don't know for sure how Siri works, but her mistake is emblematic of how a lot of language software works. "Computer systems treat" and "computer system street" sound approximately the same, but that's not something most humans would even notice, because the first interpretation makes sense and the second one doesn't.
Decades of research show that human language comprehension is heavily guided by plausibility: when there are two possible interpretations of what you just heard, go for the one that makes sense. This happens in speech recognition, as in the example above, and it plays a key role in understanding ambiguous words. If you want to throw Google Translate for a loop, give it the following:
"John was already in his swimsuit as we reached the watering hole. 'I hope the tire swing is still there,' John said as he headed to the bank."

Although the most plausible interpretation of bank here is side of a river, Google Translate will translate it into the word for "financial institution" in whatever language you are translating into, because that's the most common meaning of the English word bank.
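To make the idea concrete, here is a toy word-sense disambiguator in the spirit of the classic Lesk algorithm: pick the sense whose gloss words best match the words in the sentence. The glosses and the crude prefix-matching stemmer below are invented for this sketch; this is not how Google Translate or Siri actually works.

```python
import re

# Toy glosses for two senses of "bank", invented for this sketch.
SENSES = {
    "financial institution": "institution deposit money loan account teller",
    "side of a river": "river water swim shore land stream",
}

def tokens(text):
    return re.findall(r"[a-z]+", text.lower())

def related(a, b):
    # Crude stemming: count words as related if they share their
    # first four letters ("swimsuit" ~ "swim", "watering" ~ "water").
    return len(a) >= 4 and len(b) >= 4 and a[:4] == b[:4]

def disambiguate(sentence):
    context = tokens(sentence)
    def score(sense):
        gloss = tokens(SENSES[sense])
        # Count context words related to any gloss word for this sense.
        return sum(any(related(c, g) for g in gloss) for c in context)
    return max(SENSES, key=score)

sentence = ('John was already in his swimsuit as we reached the watering '
            'hole. "I hope the tire swing is still there," John said as '
            'he headed to the bank.')
print(disambiguate(sentence))  # -> side of a river
```

Even this crude overlap lets "swimsuit" and "watering" pull bank toward the river sense. The hard part is doing this reliably, for every ambiguous word, across an entire vocabulary.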
So what's the problem?
I assume that this limitation is not lost on the people at Google or at Apple. And, in fact, there are computer systems that try to incorporate meaning. The problem there is not so much the computer science as the linguistic science.** Dictionaries notwithstanding, scientists really do not know very much about what words mean, and it is hard to program a computer to know what a word means when you do not actually know yourself.
(Dictionaries are useful, but as an exercise, pick a* definition from a dictionary and come up with a counterexample. It is not hard.)
One of the limitations is scope. Language is huge. There are a lot of words, so scientists typically work on the meanings of only a small number of them. That is helpful, but a computer that knows only a few words is pretty limited. We want to know the meanings of all words.
Solving the problem
We've launched a new section of the website, VerbCorner. There, you can answer questions about what verbs mean. Rather than try to work out the meaning of a word all at once, we have broken up the problem into a series of different questions, each of which tries to pinpoint a specific component of meaning. Of course, there are many nuances to meaning, but research has shown that certain aspects are more important than others, and we will be focusing on those.
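As a rough sketch of the decomposition idea, a verb's meaning can be recorded as a bundle of yes/no answers to specific questions. The feature names below are invented for illustration; VerbCorner's actual questions differ.

```python
# Meaning as a bundle of answers to component questions, rather than
# one monolithic definition. Feature names are invented for this sketch.
FEATURES = ["physical_contact", "causes_change_of_state",
            "force_applied", "agent_moves"]

VERBS = {
    "hit":   dict(physical_contact=True,  causes_change_of_state=False,
                  force_applied=True,  agent_moves=True),
    "break": dict(physical_contact=True,  causes_change_of_state=True,
                  force_applied=True,  agent_moves=False),
    "know":  dict(physical_contact=False, causes_change_of_state=False,
                  force_applied=False, agent_moves=False),
}

def shared_components(v1, v2):
    """Count how many meaning components two verbs agree on."""
    return sum(VERBS[v1][f] == VERBS[v2][f] for f in FEATURES)

print(shared_components("hit", "break"))  # 2
print(shared_components("hit", "know"))   # 1
```

Each answer fills in one small cell of a table like this, which is what makes the problem something many volunteers can chip away at together.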
Over the coming weeks, I will be writing a lot more about this project: its goals, the science behind it, and the impact we expect it to have. In the meantime, please check it out.
----
*Dragon Dictate originally transcribed this as "pickled", which I did not catch on proofreading. More evidence that we need computer programs that understand what words mean.
**Dragon Dictate made spaghetti out of this sentence, too.
4 comments:
What's most puzzling about the "systems treat" vs. "system street" error is that it almost looks as if Siri isn't using even an n-gram language model, let alone anything that incorporates semantic plausibility. The frequencies of these two bigrams are obviously hugely different:
http://books.google.com/ngrams/graph?content=systems+treat%2Csystem+street&year_start=1800&year_end=2000&corpus=15&smoothing=3&share=
Mark Liberman has made a similar point on Language Log recently:
http://languagelog.ldc.upenn.edu/nll/?p=4610
I wondered about that, too. On the other hand, “street words” is much more common than “treat words”. So I suppose it matters how exactly they weight overlapping N-grams.
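Here's a toy version of that trade-off, with made-up counts; a real recognizer would use smoothed log-probabilities estimated from a huge corpus.

```python
from math import log

# Made-up bigram counts for illustration only.
COUNTS = {
    ("computer", "systems"): 9000, ("computer", "system"): 12000,
    ("systems", "treat"): 2000,    ("system", "street"): 5,
    ("treat", "words"): 50,        ("street", "words"): 400,
}

def score(words):
    # Sum log(count + 1) over every overlapping bigram in the hypothesis.
    return sum(log(COUNTS.get(bg, 0) + 1) for bg in zip(words, words[1:]))

h1 = "computer systems treat words".split()
h2 = "computer system street words".split()
print(" ".join(max([h1, h2], key=score)))  # computer systems treat words
```

With these counts, the rare "system street" bigram sinks the second hypothesis even though "street words" beats "treat words", which is why how the overlapping n-grams are weighted matters.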
"Dictionaries are useful, but as an exercise, pickled definition from a dictionary and come up with a counterexample."
I assume examples like this in the text (there are several) are not Siricisms but are intended to make a point about human comprehension?
Interesting.
Ironically, those were due to Dragon Dictate transcription errors. I think I've fixed them all, and I've updated the post with that fact. Thanks for pointing them out.