That, at least, is the hope. But a comparative study of linguistic traits published online today (M. Dunn et al. Nature doi:10.1038/nature09923; 2011) supplies a reality check. Russell Gray at the University of Auckland, New Zealand, and his colleagues consider the evolution of grammars in the light of two previous attempts to find universality in language.
The most famous of these efforts was initiated by Noam Chomsky, who postulated that humans are born with an innate language-acquisition capacity — a brain module or modules specialized for language — that dictates a universal grammar. A few generative rules are then sufficient to unfold the entire fundamental structure of a language, which is why children can learn it so quickly. Languages would diversify through changes to the 'parameter settings' of the generative rules.
The second, by Joseph Greenberg, takes a more empirical approach to universality, identifying traits (particularly in word order) shared by many languages, which are considered to represent biases that result from cognitive constraints. Chomsky's and Greenberg's are not the only theories on the table for how languages evolve, but they make the strongest predictions about universals.
Gray and his colleagues have put them to the test using phylogenetic methods to examine four family trees that between them represent more than 2,000 languages. A generative grammar should show patterns of language change that are independent of the family tree or the pathway tracked through it, whereas Greenbergian universality predicts strong co-dependencies between particular types of word-order relations (and not others). Neither of these patterns is borne out by the analysis, suggesting that the structures of the languages are lineage-specific and not governed by universals.
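The difference between a universal co-dependency and a lineage-specific one can be illustrated with a toy simulation. This is not the authors' actual Bayesian phylogenetic method; the function names, parameters, and the star-phylogeny simplification here are invented for illustration. Two binary word-order traits evolve from a common ancestor in each of two families; in one family trait B is biased to agree with trait A, in the other the traits drift independently. A Greenbergian universal would predict the same agreement pattern in both families:

```python
import random

def evolve_family(n_languages, p_flip, dependency, seed):
    """Toy model: n_languages descend independently from one
    ancestor (a star phylogeny -- a big simplification of a real
    family tree). Each language carries two binary word-order
    traits, e.g. A = verb-object order, B = adposition order.
    `dependency` is the probability that trait B is pulled into
    agreement with trait A (a Greenberg-style co-dependency);
    0.0 means the two traits drift independently.
    """
    rng = random.Random(seed)
    ancestor_a, ancestor_b = 0, 0
    languages = []
    for _ in range(n_languages):
        a = ancestor_a ^ (rng.random() < p_flip)  # trait A drifts freely
        if rng.random() < dependency:
            b = a                                 # B dragged into line with A
        else:
            b = ancestor_b ^ (rng.random() < p_flip)
        languages.append((a, b))
    return languages

def agreement(languages):
    """Fraction of languages whose two traits have the same value."""
    return sum(a == b for a, b in languages) / len(languages)

# One family where the traits co-evolve, one where they do not.
coupled = evolve_family(200, p_flip=0.5, dependency=0.8, seed=1)
independent = evolve_family(200, p_flip=0.5, dependency=0.0, seed=2)

print(round(agreement(coupled), 2))      # high: the traits travel together
print(round(agreement(independent), 2))  # near 0.5: only chance agreement
```

In the toy data, agreement between the traits shows up only in the family where the dependency was built in; pooling the two families would dilute the signal rather than reveal a universal. Dunn et al. make the analogous comparison with models of correlated trait evolution fitted to each real family tree.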
Nature (2011) doi:10.1038/nature09923
Evolved structure of language shows lineage-specific trends in word-order universals
Michael Dunn et al.
Languages vary widely but not without limit. The central goal of linguistics is to describe the diversity of human languages and explain the constraints on that diversity. Generative linguists following Chomsky have claimed that linguistic diversity must be constrained by innate parameters that are set as a child learns a language1, 2. In contrast, other linguists following Greenberg have claimed that there are statistical tendencies for co-occurrence of traits reflecting universal systems biases3, 4, 5, rather than absolute constraints or parametric variation. Here we use computational phylogenetic methods to address the nature of constraints on linguistic diversity in an evolutionary framework6. First, contrary to the generative account of parameter setting, we show that the evolution of only a few word-order features of languages are strongly correlated. Second, contrary to the Greenbergian generalizations, we show that most observed functional dependencies between traits are lineage-specific rather than universal tendencies. These findings support the view that—at least with respect to word order—cultural evolution is the primary factor that determines linguistic structure, with the current state of a linguistic system shaping and constraining future states.
It's important to distinguish between Chomsky's and Greenberg's approaches to universals. Chomsky took a deductive approach: he thought that language universals result from a specialized language-acquisition device in the brain, so universals on this view are constraints imposed by the structure of that device. Greenberg, by contrast, surveyed languages to look for correlations between features that might be universal. His surveys were limited by the failure of linguists to use a common system of description. He did report some tendencies, but they were not very sharp; he regarded them as signals of deeper mechanisms that had not yet been uncovered.
"The second, by Joshua Greenberg..."

It's Joseph or Joe Greenberg. Pretty embarrassing for Nature to allow such a mistake.
LINGUISTIC NON SEQUITURS
(1) The Dunn et al. article in Nature is not about language evolution (in the Darwinian sense); it is about language history.
(2) Universal grammar (UG) is a complex set of rules, discovered by Chomsky and his co-workers. UG turns out to be universal (i.e., all known languages are governed by its rules) and its rules turn out to be unlearnable on the basis of what the child says and hears, so they must be inborn in the human brain and genome.
(3) Although UG itself is universal, it has some free parameters that are set by learning. Word order (subject-object vs. object-subject) is one of those learned parameters. The parameter settings themselves differ for different language families, and are hence, of course, not universal, but cultural.
(4) Hence the Dunn et al. results on the history of word order are not, as claimed, refutations of UG.
Harnad, S. (2008) Why and How the Problem of the Evolution of Universal Grammar (UG) is Hard. Behavioral and Brain Sciences 31: 524-525