Defining a word is notoriously difficult. Try to explain the difference between hatred and enmity, or define chair in such a way that includes bean bag chairs but excludes stools.
This is an annoyance for lexicographers and a real headache for philosophers and psychologists. Several centuries ago, British philosophers like Hobbes worked out what seemed like a very reasonable theory that explained human knowledge and how we acquire it. However, this system is based on the idea that all words can be defined in terms of other words, except for a few basic words (like blue) which are defined in terms of sensations.
This difficulty led at least one well-known philosopher, Jerry Fodor, to declare that words cannot be defined in terms of other words, because word meaning does not decompose into parts the way a motorcycle can be disassembled and reassembled. You can't define chair as an artifact with legs and a back made for sitting in, because chair is not the sum of such parts. The problem with this theory is that it makes learning impossible. Fodor readily acknowledges that if he is correct, babies must be born with the concepts airplane and video tape; in fact, every baby who has ever been born was born with every concept that has ever existed or ever will.
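Fodor's point about leaky definitions can be made concrete with a toy sketch (my own illustration, not anything from Fodor or Jackendoff; the feature names are invented for the example). A dictionary-style definition of chair, treated as a checklist of parts, correctly rules out stools but also wrongly rules out bean bag chairs:

```python
# Toy illustration (invented features, not a real semantic theory):
# treat a dictionary-style definition as a checklist of required features.

def is_chair(features):
    # "An artifact with legs and a back, made for sitting" --
    # exactly the kind of phrase-definition Fodor argues must leak.
    return {"artifact", "for_sitting", "legs", "back"} <= features

stool = {"artifact", "for_sitting", "legs"}            # no back
beanbag_chair = {"artifact", "for_sitting", "soft"}    # no legs, no back

print(is_chair(stool))          # False: correctly excluded
print(is_chair(beanbag_chair))  # False: a chair, but wrongly excluded
```

The second result is the problem: any such phrase-definition seems to exclude some genuine chairs or admit some non-chairs.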
This seems unlikely, but Fodor is taken seriously partly because his arguments against definitions have been pretty convincing.
Ray Jackendoff, a linguist at Tufts University, argued in his recent Foundations of Language that words do in fact have definitions. However, those definitions are not themselves made up of words composed into sentences.
Observing (correctly) that one usually cannot find airtight definitions that work all of the time, Fodor concludes that word meanings cannot be decomposed. However, his notion of definition is the standard dictionary sort: a phrase that elucidates a word meaning. So what he has actually shown is that word meanings cannot be built by combining other word meanings, using the principles that also combine words into phrases (p. 335). That is, there are ways that words can be combined in sentences to achieve a meaning that is greater than the sum of the meanings of the words (compare dog bites man to man bites dog). This is called phrasal semantics. Although linguists still haven't worked out all the rules of phrasal semantics, we know that there are rules, and that they allow certain combinations and not others.
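The dog bites man / man bites dog contrast can be sketched as a toy composition rule (my own illustration, not a serious formalism): the sentence meaning depends on word order plus word meanings, not on word meanings alone.

```python
# Toy illustration (invented representation): phrasal semantics builds a
# sentence meaning from word meanings *plus* structure. The same three
# words yield different meanings because the composition rule assigns
# the agent and patient roles by position.

def compose(subject, verb, obj):
    """Simple subject-verb-object rule: first noun is agent, second is patient."""
    return {"predicate": verb, "agent": subject, "patient": obj}

print(compose("dog", "bite", "man"))  # the dog does the biting
print(compose("man", "bite", "dog"))  # the man does the biting
```

The two outputs differ even though the ingredient meanings are identical, which is the sense in which a sentence's meaning exceeds the sum of its words.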
Jackendoff has proposed that a very different system (lexical semantics), with its own rules, is at work when we learn the meanings of new words: it combines little bits of meaning that may not themselves map directly onto any words.
I think this is a very attractive theory, in that it explains why definitions have been so hard to formulate: we were trying to use phrasal semantics, which is simply not equipped for the task. However, Jackendoff hasn't yet proven that words do have definitions in terms of lexical semantics. He has the sketch of a theory, but it is not yet complete.