In the history of science, one recurring theme running through some of the big breakthroughs has been that we humans are not special. We are not at the center of the solar system or the universe, there is no biological reason to believe that we were created in any other way than through evolution, and so on. So, I was surprised when I was young to learn of Noam Chomsky’s views on language. Whereas I assumed that we humans had language simply because we were smarter than the other animals, and that they would have language too if they were to become as smart as we are, Chomsky insists that we humans are special. We are specially designed to understand language. (Some people go a lot further and insist that animals are mere automatons that never think or have feelings.) At the time, he didn’t seem to have any details, but now he or his fans are saying that human language began about 50,000 years ago with an important mutation. Before then, there was no language.
Strictly speaking, there was no human language, because what existed were animal languages (and maybe proto-languages among humans). Chomsky characterizes animal “languages” as coming in two varieties. One consists of the simple commands we give to dogs, in which one word or phrase is associated with one command. The other is exemplified by the language of bees, in which there are an infinite number of “words” corresponding to an infinite number of meanings. In this case, the words tell how far away the pollen can be found. So, either there are a finite number of words with a finite number of meanings, or an infinite number of words with an infinite number of meanings. But only human languages can have a finite number of words with an infinite number of meanings. So says Chomsky, anyway.
Now we can have an infinite number of meanings because of one feature of our languages: recursion. For example, “he is an old man” can be turned into “he is an old, old man.” And in fact we can add in any number of instances of the word “old” that we like. Or we can have an infinite number of clauses, such as “this is the cat that killed the rat that ate the malt that lay in the house that Jack built.” That of course doesn’t have an actual infinity of clauses, but it is part of a nursery rhyme that keeps going with many more clauses and which could theoretically go to infinity.
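Purely as an illustration (a programming analogy, not a claim about how grammar works in the mind), a short Python sketch shows how a finite vocabulary plus a recursive rule yields an unbounded number of distinct sentences. The function names here are my own inventions:

```python
# A toy illustration: a finite vocabulary plus a recursive rule
# produces unboundedly many distinct sentences.

def old_man(n):
    """Repeat the adjective "old" n times: "he is an old, old, ... man"."""
    return "he is an " + ", ".join(["old"] * n) + " man"

def jack(clauses):
    """Nest relative clauses, in reading order, around the base phrase."""
    sentence = "the house that Jack built"
    for clause in reversed(clauses):
        sentence = clause + " " + sentence
    return "this is " + sentence

print(old_man(3))  # he is an old, old, old man
print(jack(["the cat that killed", "the rat that ate", "the malt that lay in"]))
```

Every value of `n`, and every longer list of clauses, yields a new grammatical sentence, which is the sense in which finite means generate infinite meanings.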
Anyway, how can young children learn a language, given that there is such a bewildering array of data they have to deal with? And how can they learn which combinations of words are grammatically correct and which are not, given that the data they encounter isn’t ample enough for them to learn this? Chomsky answered this question by assuming we have innate knowledge. He quotes with approval a passage from Descartes about how the mind, when it first encounters triangles in infancy, would find them too confusing to grasp unless we had innate knowledge of triangles to help us along. Not many people still believe this, but Chomsky was undaunted. We must have innate knowledge to help us learn language when we are young, and that knowledge of course could not be of any particular language, so it must be of general grammatical rules that would help us learn every language. As far as I know, these general rules have never been spelled out in their entirety.
But what I know is this: pluralization is dealt with in such a wide variety of ways in the world’s languages that I doubt if there could be any innate rule other than “expect just about anything.” Some languages don’t use plurals, others do and do it simply, while still others do it in a complicated way. Some languages have not just singular and plural, but singular, dual (for two things), and plural. And some have “trial,” that is, grammatical forms for three things. Supposedly, no language has forms for four things (see here), though of course we can’t be sure there was never a language that had it. If we could be sure, we could enshrine this as a rule of universal grammar:
Don’t expect any language to have special grammatical forms for four things, much less for any number greater than four.
Needless to say, this rule isn’t that helpful.
More to the point, the argument of the rationalists is always the same and always fails for the same reason: reality is too confused, they say, for us to understand without innate knowledge to help us. Except that even with innate knowledge things are still very confused. The innate knowledge may reduce the confusion a little, but one still needs to figure out which sense impressions to apply that knowledge to. Take Descartes’s example of innate knowledge of a triangle. He claims the first time I see two triangles, I can’t manage to figure out that they are two instances of the same type of entity because they are so imperfect. So, I need innate knowledge of a triangle to help me along. But how exactly does my mind, using this innate knowledge, figure out that the imperfect triangle I now see should have the concept triangle applied to it rather than some other concept entirely? And if my mind can do that, then it can probably also figure out that two triangles are the same type of shape without needing any innate knowledge.
Anyway, starting from the quasi-religious view that we humans are special, Chomsky has to believe that human languages are qualitatively different from anything animals use, and the big difference he latched onto was recursion. Recursion allows us to create an infinite number of sentences from a finite number of words, thus differentiating human languages from animal languages. So, it no doubt came as a big shock to Chomsky and his fans when another linguist, Daniel L. Everett, announced that there was a language used in the Amazon jungle that didn’t use recursion.
The language in question uses very simple sentences, and its speakers rely heavily on context to figure out what is meant. For example, we know from context that a sign at the door of a restaurant saying, “no shoes, no shirt, no service” means “if you aren’t wearing both shoes and a shirt, you won’t be served.” The grammar of such languages is very simple, of the “me Tarzan, you Jane” variety. Everett’s claim is that simple languages like this, with no recursion and very simple grammar, are used today. He also claims that they were used in the very distant past by Homo erectus. If he is right, then language goes back well over a million years instead of a mere 50,000.
His evidence for this comes from a lot of discoveries and anthropological claims that I have to take on faith, since I know nothing of this area. He claims, for example, that certain islands were settled by Homo erectus, that reaching those islands required knowing how to build and sail boats, and that since building and sailing a boat requires lots of coordinated activity, the settlers must have had language to achieve this. In effect, he is saying that there was once a species different from us, and probably less intelligent (judging by brain size), that used language, though presumably a language not as sophisticated as the recursive ones Chomsky talks about.
I find this quite exciting, actually. The frustrating thing about contemplating the development of language is that we humans use language, whereas animals don’t. There just aren’t any animals that sort of use language, that can say hi to us and ask us for food, say they are sick or afraid, but can’t say much beyond that. They just don’t use it at all (except for some specially-trained chimps). So, it’s exciting to think there were human ancestors who had language, though languages without the sophisticated grammar of English, French, etc. They represent a missing link, a missing linguistic link.
The basic purpose of Everett’s book is to present a plausible story about how language developed. It starts, he says, not with grammar, but with the use of what he calls icons, namely entities that resemble something else. In Egyptian hieroglyphics, for example, the earlier forms are more pictorial. As time goes on, eventually people (or Homo erectus or whoever) figure out that they don’t need icons and can get by with symbols, that is, arbitrary things standing for something else. And this is what happened with hieroglyphics: they got simplified as time went on. And so we get to the level of language Everett found in the Amazon.
Later on, sophisticated grammar develops as a way we can communicate to others without needing a lot of contextual information to help us out since that information gets encapsulated in the grammar (think of the longer version of “no shoes, no shirt, no service”). In fact, while Everett doesn’t mention this, it seems that humanity went overboard inventing grammar. Proto-Indo-European had eight or nine cases for nouns (here), ancient Greek had five, German has four, while English has mostly dispensed with cases. In Swedish, more and more speakers in the twentieth century decided to eliminate plural forms for verbs (something my grandmother, a native Swedish speaker who learned the language here in America, never knew about). So, it looks like some people went way overboard with grammar and then backed off as they realized all those distinctions were not as necessary as had been thought.
All this would make for a fairly short book, except that Everett has to show that Chomsky is wrong about a lot of things that, if Chomsky weren’t so important, wouldn’t even need to be said. No, there is no part of the brain devoted entirely to language. No, disabilities related to language don’t show that a part of the brain is devoted entirely to language, because such disabilities always involve other disabilities as well. No, the limited vocal capabilities of Homo erectus don’t mean they didn’t have a language using sounds; they just had fewer of them than we do. No, the failure to find any explicit evidence of language among Homo erectus doesn’t mean they didn’t have language, for if these Amazonian Indians were investigated 100,000 years from now, no one would know they used language, either, since none of their language is written down. No, the point of language isn’t whatever Chomsky says it is (which I will get to shortly), but communication, and communication doesn’t always need a sophisticated grammar. No, languages don’t need recursion. And so on.
Face it, Chomsky’s views are peculiar. If he were right, then when humanity developed this mutation that allowed us to learn a language, there were no languages around that could be learned with it, which seems to suggest the mutation would just disappear, never to be seen again (p. 71). What survival benefits would it bring, after all? None, apparently. Also, if he were right, a species ten times as intelligent as we are could never develop language if they didn’t have the right mutation, and a species with half our intelligence could develop language if they did. A species with a slightly different mutation would apparently have different rules for universal grammar, which means their languages and ours would be mutually unintelligible, I guess. Or at any rate, a human raised among such a species would never learn their language as a first language. I think. Frankly, I find Chomsky’s views so alien that I find it hard to develop the consequences.
Then there is something I learned only while reading this book, that Chomsky doesn’t see language’s purpose as communication but something else (clarifying one’s thoughts, perhaps?). Everett quotes John Searle describing Chomsky’s views (p. 226): “The syntactical structures of human languages are the products of innate features of the human mind, and they have no significant connection with communication, though, of course, people do use them for, among other purposes, communication. The essential thing about languages, their defining trait, is their structure.” Also from Searle, “It is important to emphasize how peculiar and eccentric Chomsky’s overall approach to language is.” Yup. And this brings up another problem. If syntactical structures were everything, then American students who take a few years of a foreign language, French say, and learn a lot of grammar should be able to go to France and have no problem. In fact, they have big problems, which is why some schools are now changing direction radically and offering immersion instead.
To give Chomsky the benefit of the doubt, his theory is answering the question of how children learn a language given that they have (allegedly) too little data to do it, while Everett is answering the question how language began. Those are two different questions, and it could be that they are both right. It could be that Everett is right that language began with very little grammar, but that at some point as grammar developed, there was a mutation that allowed children to learn a language’s grammar much more easily than they otherwise would have learned it. I don’t think this is true, because I think that what Chomsky sees as a problem is not really a problem.
I'm not going to get into that because I've already gone on too long already. A lot more could be said on this whole topic, but I’m assuming that Everett’s view of language development will prevail, and that after another century of work in linguistics and other areas, Chomsky will be forgotten. Saying that humans are special just hasn’t worked out very well in science.
"I'm not going to get into that because I've already gone on too long already."
Not to mention repetition :)
Posted by: TheBigHenry | 11/13/2017 at 09:08 AM
What you report here is similar to what I am reading in Daniel C. Dennett's "From Bacteria to Bach and Back: The Evolution of Minds", which also mentions both Chomsky and Searle, a thinker who "insists that there can be no genuine comprehension without consciousness." Dennett's book is an extended argument involving evolutionary theory. I hope I can follow the argument with just a 140-character attention span.
Posted by: Mark Spahn (West Seneca, NY) | 11/13/2017 at 01:42 PM
Are you familiar with Julian Jaynes's book "The Origin of Consciousness in the Breakdown of the Bicameral Mind"? It's all highly speculative, but one thing he refers to is "aptic structure." It's an inherent capability or potential one has, but it must be developed by learning. Humans have sufficient brain development (are smart enough) to understand language and humans have vocal organs suitable for language -- those are aptic structures. But language itself is an accretion of knowledge developed and passed on over generations. Language involves the development of concepts, symbols for concepts, and rules or patterns for organizing those symbols to generate new concepts.
Humans have this. I'll not try to prove it, but other species can handle at least some human language; they just haven't the aptic structure for vocalizing it. If so, this refutes Chomsky's "mutation" theory.
http://ngm.nationalgeographic.com/2008/03/animal-minds/virginia-morell-text
The hard thing about all this is conceptualization, not language. Can an individual, human or not human, grasp a concept? IMO, there's spectrum of ability to handle abstractions, and non-humans exhibit varying degrees of being able to grasp human concepts, with some members of some species proving more adept than one might presuppose. They have the brains for it, to one extent or another (that particular aptic structure).
But for concept formation, language provides a tool for forming more concepts, and more complicated concepts, and passing them on. I don't mean anything like the Whorf hypothesis, where language defines the limits to one's ability to form concepts, but language is an accelerant for concept formation. Other species don't have the aptic structure -- our ability to vocalize, and later, to write -- for that.
Posted by: Charles N.Steele | 11/16/2017 at 04:37 PM
No, I haven't heard of Jaynes' book. Everett talks about how we can use sign language as quickly as we can use our vocal cords, and that this means that animals aren't dependent on vocal cords for using language. I don't know if or how that fits into these aptic structures you are talking about.
Posted by: John Pepple | 11/21/2017 at 12:05 PM
I should read Everett.
It strikes me that our hands are remarkably dexterous, giving us options for sign language that most animals lack. Dogs have sign language with tails, but it seems to me it would be difficult to build a complex language using tail positions and motions. Hence while I suspect dogs can learn a great deal more of our language than is commonly realized, they have no good way of communicating with it and cannot pass it on or develop their own complex concepts.
My point is that humans *are* special, but not in the way Chomsky thinks. It's not our brains, it's our opposable thumbs and our ability to vocalize.
Posted by: Charles N.Steele | 11/21/2017 at 05:16 PM