Field of Science

Nature, Nurture, and Bayes

I generally have very little good to say about the grant application process, but it does force me to catch up on my reading. I just finished several papers by Amy Perfors, who I think does some of the more interesting computational models of language out there.*

A strange sociological fact about language research is that people generally come in two camps: a) those who don't (really) believe language is properly characterized by hierarchical phrase structure and also don't believe in much innate structure but do believe in powerful innate learning mechanisms, and b) those who believe language is properly characterized by *innate* hierarchical phrase structure and who don't put much emphasis on learning mechanisms. But there's no logically necessary connection between being a Nativist and believing in hierarchical phrase structure or being an Empiricist and believing in relatively simple syntactic forms. In the last few years, Perfors has been staking out some of that (largely) unclaimed territory where hierarchical phrase structure and Empiricism meet.

In "The learnability of abstract syntactic principles," she and her colleagues consider the claim by (some) Nativists that children must have an innate expectation that language be something like a hierarchical context-free grammar because there isn't enough data in the input to rule out alternative grammars. (Empiricists often buck the whole question by saying language is no such thing.) Perfors et al. show that, in fact, with some relatively simple assumptions and a powerful (Bayesian) learning device, the learner would conclude that the most likely representation of English is a hierarchical context-free grammar, based on relatively little input (reproducing what happened in linguistics, where linguists came to the same conclusion). You do have to assume that children have the innate capacity to represent such grammars, but you don't need to assume that they prefer such grammars.

"Joint acquisition of word order and word reference" presents some interesting data bearing on a number of questions, but following the theme above, she notes that her model does not require very much data to conclude that the typical word-order in English is subject-verb-object. She and her colleagues note: "The fact that word order can be acquired quickly from so [little data] despite the lack of bias [for a particular word order] may suggest no need to hypothesize that children are born with strong innate constraints on word ordering to explain their rapid acquisition."

I'm sympathetic to all these points, and I think they bring an important perspective to the question of language learning (one that is not, I should say, unique to Perfors, but certainly a minority perspective). What I can't help wondering is this: she (and others) show that you could learn the structure of language based on the input without (certain) innate assumptions that the input will be of a particular sort. Fine. But why is the input of that particular sort across (most? all?) languages? One thing the Nativist positions Perfors argues against have going for them is that they give a (more or less) principled explanation. Empiricists (typically) do not. (I am aware that some try to give explanations in terms of optimal information structure. What I have seen of this work has not struck me as overwhelmingly convincing, but I admit I haven't read enough of it and that I am willing to be convinced, though my prior on this line of argumentation is fairly low).


*My quasi-journalistic training always makes me want to disclose when I know personally the people I am writing about. But psycholinguistics is a small world. It would be safe for the reader to assume that I know *all* of the people I write about to one degree or another.

*********
Perfors, A., Tenenbaum, J. B., & Regier, T. (2010). The learnability of abstract syntactic principles. Cognition. PMID: 21186021

Maurits, L., Perfors, A., & Navarro, D. (2009). Joint acquisition of word order and word reference. Proceedings of the 31st Annual Conference of the Cognitive Science Society, 1728-1733.

9 comments:

Anonymous said...

ok, maybe you can help me understand this, because I've heard your question at the end, but I'm not sure I understand it.

You are not really asking, "why is the world not random?" right?

Are you asking, "why is language input not random?"

or maybe something more like, "why is language input optimalish?"

none of the above? =)

-ecl.

GamesWithWords said...

None of the above.

There are patterns in the input. Perfors shows, under certain assumptions, those patterns can be learned. But why do those particular patterns exist, rather than some other patterns?

"The world is non-random" is not an explanation, any more than it is an explanation of why apples fall or light behaves sometimes like a particle and sometimes like a wave.

Anonymous said...

I wasn't suggesting "the world is not random" as an explanation. I was suggesting it as a rephrasing of the question.

Joshua Zev Levin, Ph.D. said...

What is remarkable here is the poor grammar exhibited by the original author. The word "data" is the Latin plural of the word "datum", which means "that which is given". In formal or scientific writing, "data" should always be used as a plural.

Thus, instead of "very much data", it should be "very many data", and "little data" should be "few data".

Also, different languages have different preferred word orders. I learned in my Bar-Mitzvah lessons that ancient Hebrew used verb-subject-preposition-object (although modern Hebrew uses a more English-like word order); and I have read that in heavily inflected languages, such as Latin, you can mix up the words and the sentence will still make sense.

I also have a problem with Bayesian analysis. I remember my frustration with the then-popular search engine Alta Vista. I would start a search, and Alta Vista would go off on a tangent and give me results that did not match my needs. I would then reframe the search, using more specific terms, but Alta Vista would be stuck in some kind of Bayesian hysteresis and give me results reflecting its previous misunderstanding of what I was trying to ask. Then Google came to the rescue, and Alta Vista is now part of Yahoo.

Matthew Harvey said...

What are your thoughts on people like Edward S. Reed who argue (see his 2007 paper, The Ecological Approach to Language Development: A Radical Solution to Chomsky's and Quine's Problems) that the Nativist/Empiricist debate is founded on the wrong presuppositions, and that we should instead adopt an ecological perspective?

The idea basically appeals to the notion of a developmental system, wherein there is no "innate" anything, just the combination of factors (i.e., genetic, environmental (biochemical), social, etc.) that results in particular developmental patterns. By this view, language is not a unique, distinct faculty or capacity but rather a non-separable part of a complex information-sharing and processing system, one that most humans end up developing in pretty similar ways because we all get raised in pretty similar ways.

I am an undergrad studying linguistics and neuroscience, and I would be delighted to hear your thoughts on the matter.

Matthew Harvey said...

oy. Reed's paper is 1997, not 2007. Sorry. Wasn't thinking.

GamesWithWords said...

@Matt: I haven't read this particular paper, though I've heard similar arguments. I'm not sure I completely understand them, and I suspect people may be talking past one another. For instance, take the claim that there is nothing innate, only genetic, environmental, and social factors (among others). What definition of "innate" is being used such that genetics isn't innate?

Matthew Harvey said...

Well, it's fortuitous that you picked that particular question, as it's one I think I have the ability to address. The idea is that we should do away entirely with "innate" as a word, because there's no particular reason to say, "This is innate, that is not." Genes are something that a human zygote (and later, infant) has - but their expression is modified, altered, controlled, and influenced by a literally endless list of environmental factors, from nutrient densities to social environment later in life. Basically, it says, nothing exists in isolation.

So in a developmental system, genes don't work according to the Central Dogma of Crick and Watson. There's nothing like: genes (genotype) -> transcription, translation -> amino acids -> proteins -> phenotype (the way the organism is). Instead, genes are just one of many things that contribute to the phenotype - along with social cues (why are you blushing? did your genes just change?), electrochemical cues, etc. etc.

While this is mostly a theory, there is strong, consistent evidence that epigenetic modifications (non-genetic factors that change gene expression) are stable and heritable. So genes, while you're born with a certain set, produce one outcome in one environment and a different outcome in another (search for anything on methylation).

So, to extend this to language, while exaggerating as little as possible: we aren't born with "innate" anything, nor do we "learn" everything as we go along. There is a massively complex, interdependent system of factors ranging from genes to physical environment to linguistic and social environment that allows children to learn to make use of a complex method of information transfer, which is different from, say, pheromone-level communication only cosmetically. This, if you take it seriously, applies Occam's razor swollen to the size of a battering ram to the innate-learning-mechanism-versus-innate-hierarchical-phrase-structure issue. Both are irrelevant, because language is not a faculty, or a single contiguous ability, or even a single entity in any sense. Body language, smells, immediate circumstances, tone, vocal contour, and so on are not auxiliary to a phrase, they are part of it. But that's a different argument. Basically, language is construed not as a regularized series of rules and symbols and so on, but as a dynamic method of communication. Words and phrases are only part of the process. After all, you can never achieve fluency learning rules and exceptions, not with a human brain (which makes room for C-3PO, thank god).

Sorry I've been so verbose. Hopefully you skipped to the end and are not horribly bored.

GamesWithWords said...

@Matt: Your comment got held up in my spam filter for some reason, and so I just saw it.

I wonder if perhaps we mean different things by "innate". Would you be comfortable with stating that each organism has a starting state, from which some end points can be reached and others cannot? That is, the starting states for the human and chimpanzee embryos are different, with the human embryo capable of developing into things that the chimpanzee embryo cannot develop into, and vice versa.

This is all I -- and most people, I think -- mean by saying something is innate: it's a prior constraint on where the system can go. Without that concept or something very similar to it, it seems that it is impossible to ask what is different between different species of animal. We know that it's not just the environment (cf. the Nim Chimpsky experiments).