That, at least, is the hope. But a comparative study of linguistic traits published online today (M. Dunn et al. Nature doi:10.1038/nature09923; 2011) supplies a reality check. Russell Gray at the University of Auckland, New Zealand, and his colleagues consider the evolution of grammars in the light of two previous attempts to find universality in language.
The most famous of these efforts was initiated by Noam Chomsky, who postulated that humans are born with an innate language-acquisition capacity — a brain module or modules specialized for language — that dictates a universal grammar. A few generative rules are then sufficient to unfold the entire fundamental structure of a language, which is why children can learn it so quickly. Languages would diversify through changes to the 'parameter settings' of the generative rules.
The second effort, by Joseph Greenberg, takes a more empirical approach to universality, identifying traits (particularly in word order) shared by many languages, which are considered to represent biases that result from cognitive constraints. Chomsky's and Greenberg's are not the only theories on the table for how languages evolve, but they make the strongest predictions about universals.
Gray and his colleagues have put them to the test using phylogenetic methods to examine four family trees that between them represent more than 2,000 languages. A generative grammar should show patterns of language change that are independent of the family tree or the pathway tracked through it, whereas Greenbergian universality predicts strong co-dependencies between particular types of word-order relations (and not others). Neither of these patterns is borne out by the analysis, suggesting that the structures of the languages are lineage-specific and not governed by universals.
Michael Dunn et al. Evolved structure of language shows lineage-specific trends in word-order universals. Nature (2011) doi:10.1038/nature09923
Languages vary widely but not without limit. The central goal of linguistics is to describe the diversity of human languages and explain the constraints on that diversity. Generative linguists following Chomsky have claimed that linguistic diversity must be constrained by innate parameters that are set as a child learns a language [1, 2]. In contrast, other linguists following Greenberg have claimed that there are statistical tendencies for co-occurrence of traits reflecting universal systems biases [3, 4, 5], rather than absolute constraints or parametric variation. Here we use computational phylogenetic methods to address the nature of constraints on linguistic diversity in an evolutionary framework [6]. First, contrary to the generative account of parameter setting, we show that the evolution of only a few word-order features of languages is strongly correlated. Second, contrary to the Greenbergian generalizations, we show that most observed functional dependencies between traits are lineage-specific rather than universal tendencies. These findings support the view that, at least with respect to word order, cultural evolution is the primary factor that determines linguistic structure, with the current state of a linguistic system shaping and constraining future states.
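The need to control for family trees reflects a classic pitfall in cross-linguistic statistics, often called Galton's problem: traits inherited together from a common ancestor can mimic a universal association when languages are simply pooled. A toy sketch with invented counts (not the paper's data — the authors use likelihood-based models of correlated trait evolution on phylogenies, which this does not reproduce) shows how a pooled correlation between two binary word-order traits can vanish once it is computed within each family:

```python
import math

def phi(rows):
    """Phi coefficient (correlation) between two binary traits.

    Each row is a (trait_a, trait_b) pair of booleans.
    """
    n11 = sum(1 for a, b in rows if a and b)
    n10 = sum(1 for a, b in rows if a and not b)
    n01 = sum(1 for a, b in rows if not a and b)
    n00 = sum(1 for a, b in rows if not a and not b)
    denom = math.sqrt((n11 + n10) * (n01 + n00) * (n11 + n01) * (n10 + n00))
    return (n11 * n00 - n10 * n01) / denom if denom else 0.0

# Hypothetical families with invented counts. Within each family the two
# traits (say, object-verb order and postpositions) vary independently,
# but family A leans towards both and family B away from both.
fam_a = ([(True, True)] * 16 + [(True, False)] * 4
         + [(False, True)] * 4 + [(False, False)] * 1)
fam_b = ([(False, False)] * 16 + [(False, True)] * 4
         + [(True, False)] * 4 + [(True, True)] * 1)

print(phi(fam_a))          # 0.0  -- no association within family A
print(phi(fam_b))          # 0.0  -- no association within family B
print(phi(fam_a + fam_b))  # 0.36 -- spurious 'universal' from pooling
```

The pooled sample shows a positive association purely because the two families differ in both traits at once; within each lineage the link disappears. This is the kind of confound that phylogenetic methods are designed to remove, and why lineage-specific dependencies can masquerade as Greenbergian universals in naive cross-linguistic counts.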