Open Access

A Few Misconceptions about Cultural Evolution



Two Kinds of Evolution

Alex Mesoudi makes several excellent points in his essay in this issue of the Journal of Cultural Science but none so important as this: If “Darwinian evolution” is defined minimally as a system of variation, selection, and inheritance, irrespective of the details of how those principles operate, then it should be noncontroversial to argue that culture constitutes a Darwinian evolutionary process. This kind of statement with respect to cultural evolution has been made repeatedly in the social-science literature for decades by a small but growing number of archaeologists, anthropologists, psychologists, ethologists, and others, yet even now, at the beginning of the second decade of the twenty-first century, it is fair to say that there remains a wide gulf between those who view culture as a Darwinian process and those who view it as something else. I agree with Mesoudi that an evolutionary approach to culture change is viewed with suspicion or downright hostility by the rank-and-file members of the social sciences, but I would add a caveat: Not all cultural evolution is viewed with suspicion, only Darwinian evolution.

Plenty of social scientists, and we can include humanists as well, will readily admit that culture is an evolutionary process, but instead of viewing it as a Darwinian evolutionary process, they see it as another kind of evolution—the “something else” referred to above. That “something else” can take several forms, sometimes depending on discipline and other times on personal predilection, but the one thing the various forms share is a deep rooting in the cultural evolutionary writings of the mid- to late nineteenth century, all of which emphasized the notion of human progress. If one wishes to trace the origins of this notion of human progress, a starting point would be the writings of Enlightenment scholars—Locke, Diderot, Rousseau, Voltaire, Montesquieu—where progress was rendered in terms of “betterment” or increased “complexity” and usually subdivided for analytical purposes into stages or phases (O’Brien and Lyman 2002). Comte, for example, proposed a three-phase system of human development—theological, metaphysical, and positivist. Similarly, Montesquieu divided early mankind into savages and barbarians, and Turgot proposed a three-phase system of hunting, pastoralism, and agriculture. In some cases technological advancement was identified as the proximate cause of mankind, or a portion thereof, progressing from one phase or stage to the next higher plateau. Similarly, the environment, both physical and social, often was invoked as the proximate cause of developmental stasis, or a period of no change (O’Brien and Lyman 2002).

The notion of progress was nowhere stated so clearly as in the writings of Herbert Spencer, who wrote in Social Statics (1851:80) that:

Progress, therefore, is not an accident, but a necessity. Instead of civilization being artificial, it is a part of nature; all of a piece with the development of the embryo or the unfolding of a flower. The modifications mankind have undergone, and are still undergoing, result from a law underlying the whole organic creation; and provided the human race continues, and the constitution of things remains the same, those modifications must end in completeness. . . . [S]o surely must man become perfect.

For Spencer, perfection was the result of mankind’s long struggle out of a series of lower stages, propelled along its way by underlying laws, with the caveat that not all peoples were equally imbued with the capacity to raise themselves to higher levels. This racial determinism manifests itself throughout the late nineteenth century in the writings of numerous social scientists, perhaps most evident in the works of Edward B. Tylor and Lewis Henry Morgan.

For Tylor (1881:74), it was inescapable that there:

seems to be in mankind inbred temperament and inbred capacity of mind. History points the great lesson that some races have marched on in civilization while others have stood still or fallen back, and we should partly look for an explanation of this in differences of intellectual and moral powers between such tribes as the Native Americans and Africans, and the Old World nations who overmatch and subdue them.

Morgan thought likewise, noting, for example, “the Indian and European are at opposite poles in their physiological conditions. In the former there is very little animal passion, while with the latter it is super-abundant” (Morgan 1870:207). Morgan went further, using comparative data to conclude in his lengthy treatise Ancient Society that “the experience of mankind has run in nearly uniform channels; that human necessities in similar conditions have been substantially the same” (Morgan 1877:8). Morgan’s tripartite evolutionary scheme—savagery, barbarism, and civilization—was an attempt to pigeonhole ethnic groups, often referred to as “tribes,” on the basis of the presence or absence of specified cultural traits, and although the scheme appears naive and racially deterministic, it called attention to cultural differences. More important to the discussion here, Leacock (1963:lxi) pointed out that “in spite of the disfavor into which Morgan’s work fell, his general sequence of stages has been written into our understanding of prehistory and interpretation of archaeological remains, as a glance at any introductory anthropology text will indicate.”

This brand of evolution became a topic of considerable debate and restatement in the 1940s and 1950s, primarily through the work of two anthropologists, Leslie White (e.g., 1949, 1959a, 1959b) and Julian Steward (e.g., 1953, 1955). To Steward, White’s brand of evolution was “unilinear” and traced its roots directly back to the nineteenth-century notions of Tylor and Morgan, but in reality Steward, despite his use of the term “multilinear” for his own evolutionism, was as much a “unilinear” evolutionist as White, if not more so. Steward (1953:15) suggested that the use of cultural evolution as an explanatory model demanded two “vitally important assumptions. First, it [assumes] that genuine parallels of form and function develop in historically independent sequences or cultural traditions. Second, it explains those parallels by independent operation of identical causality in each case.” White expressed a similar outlook, noting that the cultural evolutionary process was lawlike (1949, 1959b) and that the sequence of stages was inevitable in the sense that all societies would eventually represent civilizations, whether or not they all were at one time something else (1947, 1959b). Despite White’s (1943:339) disclaimer that he was not saying that “man deliberately set about to improve his culture,” close reading of what he said indicates that he strongly believed all organisms, including humans, had an “urge” to improve and that this was the “motive force as well as the means of cultural evolution.” White (1947:177) also regularly indicated that he and other cultural evolutionists “did not identify evolution with progress [and that they] did not believe progress was inevitable.”

But by default White’s cultural evolution was synonymous with progress: “[B]y and large, in the history of human culture, progress and evolution have gone hand in hand” (White 1943:339). In White’s view the key evolutionary mechanism—urge or necessity as a motive force—demanded absolutely no reference either to a source of variation or to natural selection. Humans thus invent new tools as necessary, and the tools are always better than the preceding ones because they allow the procurement or exploitation of additional energy. That acquisition is what “allows” or, if one is more deterministically minded, “propels” humans up the ladder of cultural progress. The social sciences today are founded in large part on the notion of progress and its partner, economic rationality (Bentley, Earls, and O’Brien 2011).

Macro- versus Microevolution

Despite their reluctance to view culture as a Darwinian process, most social scientists probably are willing to accept that the evolution described by Darwin is the cause of organismic (biological) change over time. Social scientists, especially anthropologists and archaeologists, have little problem with the fact that some 5–6 million years ago the line that produced chimpanzees diverged from the line that produced hominins and eventually members of the genus Homo. When we see fossils lined up in a certain way, and we can see the profound changes that hominins have gone through during the last 5–6 million years, we ask ourselves, what else but Darwinian evolution could have caused such large-scale change? Almost everyone would agree. But what about change over the last 100,000 years? Can we see enough morphological change to indicate evolution has taken place? Sometimes yes, although it is more difficult to see the cumulative changes in phenotypes separated by 100,000 years than it is in phenotypes separated by 5–6 million years. Why? Because various evolutionary processes have had 50–60 times longer to effect change in the latter sample than in the former. This means that the effects are much more evident than they are when a shorter period of time is involved.

Suppose we shorten the period to 10,000 years. Do we see any large-scale change? Not very often. Does this mean that evolution has stopped operating on humans? No, it means simply that in the vast majority of cases the time span is too short even to begin to see the large-scale changes that we customarily associate with evolution (O’Brien 1996; Lyman and O’Brien 1998, 2001). But critics (e.g., Bamforth 2002) want to see these large-scale changes so that they can feel assured that evolution—both biological and cultural—has taken place. To them, anything less than large-scale change is not evolution, or at least it is not worth studying. Critics, however, would profit from reading Jonathan Weiner’s (1994) The Beak of the Finch or Peter Grant’s (1999) Ecology and Evolution of Darwin’s Finches (O’Brien, Lyman, and Leonard 2003). Both books make it plain that once in a while evolution can be seen empirically in successive generations of organisms, and we do not need to reach the molecular level to see it.

Social scientists are not alone in failing to recognize the complementarity of micro- and macroevolutionary perspectives when it comes to human evolution, regardless of whether it is cultural or biological evolution (O’Brien and Lyman 2000, 2002). Several prominent evolutionary biologists and paleontologists (e.g., Gould 1996; Huxley 1956; Mayr 1982; Simpson 1949; Tëmkin and Eldredge 2007) have also stated that humans are a single, largely unchanging species. Under this view, evolutionary processes such as selection and drift do not operate on humans because our capacity for culture has decoupled us from evolution. If such is the case, and culture and its attendant features have created a gulf between humans and evolutionary processes, then a Darwinian perspective is inapplicable to the vast majority of human tenure on Earth. I contend, however, that culture is simply one adaptive response that a particular lineage of organisms evolved (O’Brien and Lyman 2000). As such, it in no sense exempts its bearers from evolutionary processes.

Invoking culture as a decoupling agent locates cause in the wrong place (Lyman and O’Brien 1998; Rindos 1984). As Mesoudi points out in his essay (this volume), culture is a different mode of transmission than genes are, but the difference does not lead to the inescapable conclusion that humans as organisms have evolved the means to stop evolving. Do these differences indicate that selection and drift play at best minimal roles in reshuffling both somatic and nonsomatic characters? No. Humans today are no more immune to evolutionary processes than they were 30,000 years ago. Here is what one evolutionary biologist said about culture: It merely altered “the components of fitness [and the] directional changes” prompted by selection. “What has happened is that the [selective] environment, the adjudicator of which genotypes are fit, has been altered” (Lerner 1959:181).

Our growing understanding of gene–culture interactions strongly suggests that there is every reason not to uncouple genes and culture (Durham 1991; Richerson, Boyd, and Henrich 2010). For example, Laland, Odling-Smee, and Myles (2010) collate 27 separate genes known to have been subject to recent selection and for which the inferred cultural selection pressure is a change in diet associated with the advent of agriculture. The development of lactose tolerance and the evolved ability of West Africans to withstand the ravages of malaria are the most familiar of these, but the list also includes genes expressed in the metabolism of carbohydrates, starch, proteins, lipids, phosphates, plant secondary compounds and alcohol, as well as in jaw-muscle fibre and tooth-enamel thickness. Laland and colleagues (2010) also collate a further 30 cases of genes that provide some immunity from, or resistance to, disease or pathogens thought to have been promoted by agriculture or other farming practices.

There are several other categories of genes, expressed in energy metabolism, heat or cold tolerance, skeletal development, the nervous system, brain function, the externally visible phenotype, and more that are documented to have been subject to recent selection and for which the inferred selection pathway is a cultural practice (Laland, Odling-Smee, and Myles 2010). Such genetic analyses are not foolproof, and each example will need to be followed up with detailed empirical research before gene–culture co-evolution can be proven. However, the data are sufficiently tantalizing to suggest that there are likely to be rich pickings here for social scientists interested in investigating such gene–culture interactions.

Ignoring the simple dichotomy between long-term, cumulative evolutionary results and short-term aspects of evolution is responsible for the question that bothers critics: “Where’s the evolution?” Skeptics are looking for the big results and missing the point that those large-scale, cumulative results are the end products of countless small-scale changes that took place over a very long time period. Paleontologists do not have access to the fine detail that archaeologists can see, but they do not doubt that their macroscale picture comprises literally millions of tiny structures and routine processes that went on day after day, century after century, millennium after millennium. They accept such detail as axiomatic, just as they accept that genetic change was behind some of the change they see. Conversely, archaeologists rarely have access to anything approaching the evolutionary big picture, but we should not get so lost in detail that we forget that it is those details that cumulatively are evolution.

A Final Thought

Returning to biologists and paleobiologists who have problems with human evolution, both cultural and biological, let’s briefly consider a few comments by Stephen Jay Gould and Ernst Mayr. Gould was a very good conchologist, and Mayr was an excellent ornithologist, but when the conversation turned to anything remotely “cultural,” they were both essentially helpless. Here, for example, is what Mayr said during an interview with Natalie Angier (1997), who asked about further human evolution:

There’s absolutely no chance of the human species evolving. First of all, we can never speciate. We cover every niche, every spot on the earth, so there’s no opportunity for isolation. Moreover, I do not feel there’s any natural selection in any positive sense going on right now. . . .

Theoretically we could have cultural evolution and develop higher and better concepts. But if you have no basis for a change in genes, then unfortunately you can only develop through cultural evolution.

In other words, “development” but not “evolution.” Gould (1996) said similar things in Full House, claiming that all cultural evolution is Lamarckian and that humans can no longer evolve. By “Lamarckian” he meant that in his view cultural evolution is based solely on the transmission of acquired characteristics—the accumulation of favourable innovations.

Without doubt, a certain kind of cultural transmission—“guided variation” (Boyd and Richerson 1985)—is Lamarckian (Mesoudi 2011). The transmission component of this process is called “unbiased” (Boyd and Richerson 1985; Henrich 2001) because, at the population level, it simply replicates the distribution of behaviours from the previous generation. After acquiring a behaviour or tool, an individual can obtain environmental information about the relative payoffs of alternative skills or tools. If the difference in payoffs is clear, then the individual adopts the behaviour indicated by the environmental information. If not, then the individual sticks with the behaviour acquired through unbiased cultural transmission. Unbiased transmission creates what anthropologists refer to as “traditions.” To suggest, however, that simply because there are traditions in culture—vertically transmitted ways of doing things—all cultural evolution is Lamarckian shows a complete lack of understanding of culture and how it operates.
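The decision rule just described—copy at random, then switch only when environmental information clearly favours an alternative—can be sketched as a toy simulation. This is a minimal illustration in the spirit of Boyd and Richerson’s guided-variation model, not their actual formulation; the behaviour names, payoff values, and threshold rule are illustrative assumptions.

```python
import random

random.seed(42)  # fixed seed so the illustrative runs are repeatable

def guided_variation(parent_behaviours, payoffs, threshold, generations):
    """Toy guided-variation model. Each generation, every individual copies
    a behaviour from a random member of the previous generation (unbiased
    transmission), then compares payoffs: if some alternative clearly beats
    the copied behaviour (payoff gap exceeds `threshold`), the individual
    switches; otherwise it keeps what it copied, preserving the tradition."""
    population = list(parent_behaviours)
    best = max(payoffs, key=payoffs.get)  # behaviour individual learning favours
    for _ in range(generations):
        new_population = []
        for _ in population:
            copied = random.choice(population)  # unbiased cultural transmission
            if payoffs[best] - payoffs[copied] > threshold:
                new_population.append(best)     # clear payoff gap: switch
            else:
                new_population.append(copied)   # ambiguous gap: keep tradition
        population = new_population
    return population

pop = ["bow"] * 95 + ["atlatl"] * 5

# Clear payoff gap: individual learning rapidly drives the better tool to fixation.
spread = guided_variation(pop, {"bow": 1.0, "atlatl": 2.0}, threshold=0.5, generations=20)

# Ambiguous payoffs: unbiased copying alone operates, and the majority tradition persists.
tradition = guided_variation(pop, {"bow": 1.0, "atlatl": 1.1}, threshold=0.5, generations=20)
```

The contrast between the two runs mirrors the paragraph above: when payoff information is decisive, change is directional and cumulative (the Lamarckian-looking case), but when it is not, transmission merely replicates the parental distribution and a tradition endures, subject only to drift.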

Mayr and Gould examined culture and found it to be everything Mother Nature isn’t. They and others of a similar bent did what countless social scientists have done for well over a century: set humans aside as being something special because of their extrasomatic means of adaptation—culture. To those of us who view culture as a Darwinian process, this locates cause in the wrong place. Yes, the mode of transmission is different when culture is involved—although the difference is more quantitative than qualitative in light of what we know of animal behaviour—and there can be no doubt that the tempo of cultural transmission differs significantly from that of genetic transmission. But do these differences lead to the inescapable conclusion that humans have stopped evolving—that they somehow are beyond the reach of selection? Do these differences indicate that other evolutionary processes such as drift play at best minimal roles in reshuffling both somatic and nonsomatic characters? In my opinion the answer to both questions is an emphatic “no.” Humans today are no more immune to evolutionary processes than they were ten thousand or fifty thousand years ago. The game of life has perhaps become more sophisticated, at least as seen through modern eyes, and there certainly are more players, but the rules of the game haven’t changed. And I seriously doubt they ever will. But unless we begin viewing culture and genes as part of a co-evolving system, we will continue to retard research in the social sciences.