Linguists traditionally view language as the product of an innate language instinct that evolved via natural selection for some social function. This paper argues instead that language evolves by cultural transmission: each new generation learns its linguistic competence from the linguistic performance of the previous generation, ad infinitum.
The paper introduces both a mathematical model and a neural-net model of this iterated learning process. Simulations of both models suggest that compositionality, a key property of language, evolves when the environment is structured and there is a transmission bottleneck: a large mismatch between the number of sentences possible in the language and the number the learner actually hears. Since this condition probably also holds in more elaborate and realistic models of language evolution by cultural transmission, yet another nail is driven into the coffin of the innate language instinct.
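The iterated learning loop described above can be sketched in a few lines of code. This is a minimal illustrative toy, not the paper's actual mathematical or neural-net model: here meanings are assumed to be structured pairs (a, b), signals are two-character strings, and each learner sees only a bottlenecked sample of its parent's language before generalizing. The function names (`learn`, `transmit`) and all parameter values are invented for illustration.

```python
import random

VALUES = range(5)
MEANINGS = [(a, b) for a in VALUES for b in VALUES]  # structured meaning space
SYLLABLES = "bcdfg"

def random_signal():
    """Invent a holistic (unanalyzed) two-character signal."""
    return random.choice(SYLLABLES) + random.choice(SYLLABLES)

def learn(observed):
    """Induce one morpheme per feature value from a partial sample of the
    parent language, then generalize to the whole meaning space, inventing
    morphemes for feature values never observed."""
    prefix, suffix = {}, {}
    for (a, b), signal in observed.items():
        prefix.setdefault(a, signal[0])  # first character read as encoding a
        suffix.setdefault(b, signal[1])  # second character read as encoding b
    # Fill in feature values that fell outside the bottleneck sample.
    for a in VALUES:
        prefix.setdefault(a, random.choice(SYLLABLES))
    for b in VALUES:
        suffix.setdefault(b, random.choice(SYLLABLES))
    return {(a, b): prefix[a] + suffix[b] for (a, b) in MEANINGS}

def transmit(generations=30, bottleneck=8):
    """Iterate learning: each generation observes only `bottleneck` of the
    25 possible meaning-signal pairs produced by the previous one."""
    lang = {m: random_signal() for m in MEANINGS}  # initial holistic language
    for _ in range(generations):
        sample = random.sample(MEANINGS, bottleneck)
        lang = learn({m: lang[m] for m in sample})
    return lang

random.seed(0)
final = transmit()
# The transmitted language is compositional: the first character depends
# only on feature a, the second only on feature b.
assert all(final[(a, b)][0] == final[(a, 0)][0] for (a, b) in MEANINGS)
assert all(final[(a, b)][1] == final[(0, b)][1] for (a, b) in MEANINGS)
```

Because compositional mappings can be reconstructed from a small sample while holistic ones cannot, only the compositional structure survives repeated passage through the bottleneck, which is the intuition behind the paper's result.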