The problem with environmentalists, Lynn Margulis used to say, is that they think conservation has something to do with biology.
A researcher who specialized in cells and microorganisms, Margulis was one of the most important biologists in the last half century — she literally helped to reorder the tree of life, convincing her colleagues that it did not consist of two kingdoms (plants and animals), but five or even six (plants, animals, fungi, protists, and two types of bacteria).
Until her death in 2011, she lived in my town, and I would bump into her on the street from time to time. She knew I was interested in ecology, and she liked to needle me. “Hey Charles,” she would call out, “are you still all worked up about protecting endangered species?”
Margulis was no apologist for unthinking destruction. Still, she couldn’t help regarding conservationists’ preoccupation with the fate of birds, mammals and plants as evidence of their ignorance about the greatest source of evolutionary creativity: the microworld of bacteria, fungi and protists. More than 90 percent of the living matter on earth consists of microorganisms and viruses, she liked to point out. Heck, the bacterial cells in our bodies outnumber our human cells ten to one!
Bacteria and protists can do things undreamed of by clumsy mammals like us: form giant supercolonies, reproduce either asexually or by swapping genes with others, routinely incorporate DNA from entirely unrelated species, merge into symbiotic beings—the list is as endless as it is amazing. Microorganisms have changed the face of the earth, crumbling stone and even giving rise to the oxygen we breathe. Compared to this power and diversity, Margulis liked to tell me, pandas and polar bears were biological epiphenomena—interesting and fun, perhaps, but not actually significant.
Does that apply to human beings, too? I once asked her, feeling like someone whining to Copernicus about why he couldn’t move the Earth a little closer to the center of the universe. Aren’t we special at all?
This was just chitchat on the street, so I didn’t write anything down. But as I recall it, she answered that Homo sapiens actually might be interesting—for a mammal, anyway. For one thing, she said, we’re unusually successful.
Seeing my face brighten, she added, “Of course, the fate of every successful species is to wipe itself out.”
THE STRUGGLE FOR EXISTENCE
Because I had spoken to Margulis before, I understood what she meant by this gnomic utterance. She was talking about one of her scientific heroes, the Russian microbiologist Georgii Gause. As a University of Moscow student in the 1920s, Gause spent years trying to drum up support from the Rockefeller Foundation, then the most prominent funding source for non-American scientists who wished to work in the United States. Hoping to impress the foundation, Gause decided to perform some nifty experiments and record the results in his grant application.
Gause wasn’t flying blind. In 1920, two Johns Hopkins biologists, Raymond Pearl and Lowell Reed, tried to come up with a mathematical formula that described the rate at which the population of the United States grew over time. Their argument was completely theoretical. They imagined what the rate of growth should have looked like, given their knowledge of biology, and sought to match their hypothetical curve to the actual population of the United States as recorded in census data. The two matched well enough that Pearl and Reed believed they might be on to something. In the next few years, Pearl developed his ideas further, publishing them in The Biology of Population Growth (1925). Pearl’s book was quickly recognized as a classic. But it was almost entirely theoretical. To prove his qualifications, Gause decided to test Pearl’s ideas in the laboratory.
By today’s standards, his methodology was simplicity itself. Gause placed half a gram—that is, just a pinch—of oatmeal in 100 cubic centimeters (about three ounces) of water, boiled the results for ten minutes to create a broth, strained the liquid portion of the broth into a container, diluted the mixture by adding water, and then decanted the contents into small, flat-bottomed test tubes. Into each he dripped five Paramecium caudatum or Stylonychia mytilus, both single-celled protozoans, one species per tube. All of Gause’s test tubes were pocket ecosystems, food webs with a single node. He stored the tubes in a warm place for a week and observed the results. The conclusions appeared in a 163-page book, The Struggle for Existence, published in 1934.
Today The Struggle for Existence is viewed as a scientific landmark, one of the first successful marriages of experiment and theory in ecology. But it was not enough to get Gause a fellowship. Perhaps because his results merely validated Raymond Pearl’s equations, the Rockefeller Foundation decided that the 24-year-old student was insufficiently eminent. Gause was unable to visit the United States for another twenty years, by which time he had indeed become eminent. But he had also left microbial ecology and become an antibiotics researcher.
What Gause saw in his test tubes is often depicted in a graph, time on the horizontal axis, the number of protozoa on the vertical. The line on the graph is like a distorted bell curve, with its left side twisted and stretched into a kind of flattened S. At first the number of protozoans grows slowly, and the graph line slowly ascends to the right. But then the line hits an inflection point, and suddenly rockets upward—a frenzy of exponential growth. The mad rise continues until the organism begins to run out of food, at which point there is a second inflection point, and the growth curve levels off again as the protozoans begin to die. Eventually the line descends, and the population falls toward zero.
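The S-shaped portion of that curve is the logistic function Pearl and Reed proposed and Gause confirmed. A minimal sketch of its shape, using arbitrary illustrative parameters rather than Gause’s measured values:

```python
import math

def logistic(t, carrying_capacity=1000.0, growth_rate=0.8, initial=5.0):
    """Pearl-Reed logistic curve: slow start, exponential middle, plateau.

    All parameter values here are illustrative placeholders, not data
    from Gause's experiments.
    """
    ratio = (carrying_capacity - initial) / initial
    return carrying_capacity / (1.0 + ratio * math.exp(-growth_rate * t))

# Print the population at a few time steps to see the flattened S emerge.
for t in range(0, 16, 3):
    print(t, round(logistic(t)))
```

Note that the pure logistic only levels off at the carrying capacity; the final die-off Gause observed once the broth was exhausted lies outside this simple formula.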
Years ago I watched Margulis demonstrate Gause’s conclusions to one of her classes with a time-lapse video of Proteus vulgaris, a bacterium that resides in the intestinal tract. To humans, she said, P. vulgaris is mainly notable as an occasional cause of hospital infections. Left alone, it divides about every 15 minutes, producing two individuals where before there had been one. Margulis switched on the projector. Onscreen was a tiny dot — P. vulgaris — in a shallow, circular glass container: a Petri dish, its bottom covered with a layer of reddish nutrient goo. The students gasped. In the time-lapse video, the colony seemed to pulse, doubling in size every few seconds, rippling outward until the mass of bacteria filled the screen. In just 36 hours, she said, this single bacterium could cover the entire planet in a foot-deep layer of single-celled ooze. Twelve hours after that, it would create a living ball of bacteria the size of the Earth.
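The outline of Margulis’s planetary arithmetic is easy to check. A sketch of pure, unchecked doubling, assuming the 15-minute division time cited above and ignoring every real-world limit:

```python
DOUBLING_MINUTES = 15  # division time for P. vulgaris, as cited in the text

def cells_after(hours, start=1):
    """Cell count after `hours` of unchecked doubling from `start` cells."""
    doublings = (hours * 60) // DOUBLING_MINUTES
    return start * 2 ** doublings

# 36 hours = 144 doublings: roughly 2.2 x 10^43 cells from one ancestor.
print(f"{cells_after(36):.2e}")
```

Whether that many cells would in fact blanket the planet a foot deep depends on the volume one assumes per cell; the point, as Margulis stressed, is that exponential doubling outruns any such quibble within a day or two.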
Such a calamity cannot happen, Margulis said, because rival organisms and lack of resources prevent the vast majority of P. vulgaris from reproducing. This is natural selection, Darwin’s great insight. All living creatures have the same purpose: to make more of themselves, ensuring their biological future by the only means available. And all living creatures have a maximum reproduction rate: the greatest number of offspring they can generate in a lifetime. (For people, she said, the maximum reproduction rate is about 20 children per couple per generation. The potential maximum for dachshunds is around 330: 11 pups per litter, three litters a year, for roughly ten years.) Natural selection ensures that only a few members of each generation manage to reach this rate. Most do not reproduce at all; blocked, they fall by the wayside. “Differential survival is really all there is to natural selection,” Margulis said. In the human body, P. vulgaris is checked by the size of its habitat (portions of the human gut), the limits to its supply of nourishment (food proteins), and other, competing microbes. Thus constrained, its population remains roughly steady.
From P. vulgaris’s point of view, the Petri dish initially seems limitless, a boundless ocean of breakfast, no storm on the horizon, no competition for sustenance. The bacterium eats and divides, eats and divides. Racing across the nutrient goo, it hits the first inflection point and rockets up the left side of the curve. But then its colonies slam into the second inflection point: the edge of the dish. When the dish’s food supply is exhausted, P. vulgaris experiences a mini-apocalypse.
By luck or superior adaptation, a few species manage to escape their limits, at least for a while. Nature’s success stories, they are like Gause’s protozoans; the world is their Petri dish. Their populations grow exponentially; they take over large areas, engulfing their environment as if no force opposed them. Then they hit a barrier. They drown in their own wastes. They starve from lack of food. Something figures out how to eat them.
When I lived in New York City, zebra mussels invaded the lower Hudson River, the western boundary of the island of Manhattan. An inch to two inches long, their shells patterned with wriggly bands of brown and white, zebra mussels are capable of spitting out a million eggs per year apiece. The species originated in the Azov, Black and Caspian Seas on Europe’s Russian and Turkic-speaking periphery. Globalization has been good to it. Mussel larvae are microscopic and can be carried unnoticed in ship bilges and ballast water. Adults have a knack for attaching themselves to anchors and anchor cables. Escaping their native waters, zebra mussels hitchhiked around the world. They have been recorded in Europe since the 18th century. The Hudson first saw them in 1991. Within a year zebra mussels constituted half the mass of living creatures in the river. In some places tens of thousands carpeted every square foot. They covered boat bottoms, blocked intake tubes, literally smothered other species of mussel with a chitinous blanket of striped shell. Zebra mussels were rocketing up the S-shaped curve.
Bust followed boom; the population collapsed. In 2011, two decades after the mussel was first sighted in the Hudson, its survival rates were “1% or less of those in the early years of the invasion” (the quote is from one long-range study). Unlike Gause’s bacteria, the mussels had not run into a physical wall—the physical world is always more complex than a test tube. They did exhaust their food supply, but they also were attacked by a local predator, the blue crab, which had learned to eat the newcomers. Their S-shaped curve wiggled more than those in Gause’s book, but the result was the same. Fifteen years ago, when I went to a park at the edge of the Hudson, I couldn’t step into the river—the sharp edges of open mussel shells were too thick underfoot. Nowadays at the park the creatures are mostly gone. Children splash happily in the shallows. Crumbled shells lie in the sediment, testament to the collapse of a briefly successful species.
All to be expected, Margulis would have said. The implication of modern biology is that human beings are just one among many of Earth’s creatures, merely another weed on the tangled bank of evolution, subject to the same laws as all the rest. After she taught her class, we went out for coffee. At the most fundamental level, she told me then, there is no reason to believe that people are different from P. vulgaris. In the video of the Petri dish, she saw humanity’s future, dark and inevitable. To her, Homo sapiens looked like just another briefly successful species.
CHILDREN OF TOBA
About 75,000 years ago, a huge volcano exploded on the island of Sumatra. The biggest blast for several million years, the eruption created Lake Toba, a 50-mile-long hole that is the biggest lake in Southeast Asia, and ejected as much as 700 cubic miles of rock, enough to cover the District of Columbia in a layer of magma and ash that would reach to the stratosphere. A gigantic plume spread west, enveloping southern Asia in tephra (rock, ash, and dust). Drifts in Pakistan and India reached as high as 20 feet. Smaller tephra beds blanketed the Middle East and East Africa. Great rafts of pumice filled the sea and drifted almost to Antarctica.
In the long run, the eruption raised Asian soil fertility. In the short term, it was catastrophic. Dust hid the sun for as much as a decade, plunging the earth into a years-long winter accompanied by widespread drought. A vegetation collapse was followed by a collapse in the species that depended on vegetation, followed by a collapse in the species that depended on the species that depended on vegetation. Temperatures may have remained colder than normal for several centuries. Orangutans, tigers, chimpanzees, cheetahs—all were pushed to the verge of extinction.
The reconstruction of Toba’s effects I have just given is speculative and controversial. Some researchers argue that the temperature drop was not quite as severe, long, and universal as early estimates proposed; regions along the Indian and African coast, for instance, may have been spared the worst of Toba’s wrath. Although genetic studies indicate that the animals’ population plummeted around the time of Toba, nobody can be certain that the volcano was the cause. Toba was a calamity, the critics say, but not a game-changer.
Disputed most is Toba’s effect on humankind. About the time of the eruption, many geneticists believe, Homo sapiens’ numbers shrank dramatically, perhaps to a few thousand people—the size of a big high school. Or even fewer: one team of researchers calculated that the total number of women of childbearing age in the world could have fallen to as low as 40.
The clearest evidence of this bottleneck is also its main legacy: humankind’s remarkable genetic uniformity. Countless people have viewed the differences between races as worth killing for, but compared to other primates—even compared to most other mammals—human beings are almost indistinguishable, genetically speaking. DNA consists of long, string-like molecules. Each string in turn is composed of two chains that are twisted around themselves to create the famous double helix. Individual links in the chains are called “nucleotides” or “bases.” Roughly speaking, about one out of every thousand bases differs between one person and the next. An equivalent figure for two Escherichia coli, the most common bacterium in the human gut, might be one out of 50. The bacteria in our intestines, by this measure, have twenty times more innate variability than their hosts—evidence, researchers say, that our species is descended from a small group of founders. In 1993, Stanley Ambrose, a University of Illinois paleoanthropologist (researcher into ancient humans), proposed that Toba caused this bottleneck by blasting our young species almost to extinction. All of our genetic resources today came from the few survivors.
Uniformity is hardly the only effect of a bottleneck. When a species shrinks in number, chance can alter its genetic makeup with astonishing rapidity. New mutations can arise and spread; a snippet of scrambled DNA in a single member of the isolated group that populated Ice-Age Europe apparently led to the blue eyes that predominate in most of Scandinavia. Or genetic variants that may have already been in existence—arrays of genes that confer better planning skills, for example—can suddenly become more common, effectively reshaping the species within a few generations as once-unusual traits become widespread.
Did Toba, as theorists like Richard Dawkins have argued, cause an evolutionary bottleneck that set off the creation of behaviorally modern people, perhaps by helping previously rare genes—Neanderthal DNA or an opportune mutation—spread through our species? Or did the volcanic blast simply clear away other human species that had previously blocked H. sapiens’ expansion? Or was the volcano irrelevant to the deeper story of human change?
Some researchers quickly backed Ambrose’s hypothesis; others attacked it with equal rapidity. It was a classic scientific combat, the subject of careful back-and-forth in refereed journals and heated argument in faculty lounges. Much dust has been kicked up. From the outside, the only thing that seems clear is that at about the time of Toba our species took a fateful step.
One way to illustrate the impact of this change is to consider Solenopsis invicta, the red imported fire ant. Geneticists believe that S. invicta originated in northern Argentina, an area with many rivers and frequent floods. The floods constantly wipe out ant nests. Over the millennia, these small, furiously active creatures have acquired the ability to respond to rising water by coalescing into huge, floating, pullulating balls—workers on the outside, queen in the center—that drift to the edge of the flood. Once the waters recede, colonies swarm back into previously flooded land so rapidly that S. invicta actually can use the devastation to increase its range. Like criminal gangs, fire ants thrive on chaos.
In the 1930s, Solenopsis invicta was transported to the United States, probably in ship ballast, which often consists of haphazardly loaded soil and gravel. As an adolescent bug enthusiast, Edward O. Wilson, the famed biologist, spotted the first colonies, in the port of Mobile, Alabama. He saw some very happy fire ants. From the ant’s point of view, it had been dumped into an empty, recently flooded expanse. S. invicta took off, never looking back.
More than likely, the initial incursion watched by Wilson was just a few thousand individuals—a number small enough to hint that random, bottleneck-style genetic change played a role in the species’ subsequent history in this country. (The evidence is not yet conclusive.) In its natal Argentina, fire-ant colonies constantly fight each other, reducing their numbers and creating space for other types of ant. In the United States, by contrast, the species often forms cooperative super-colonies, linked clusters of nests that can spread for hundreds of miles. Systematically exploiting the landscape, these super-colonies monopolize every useful resource, wiping out other insect species along the way—models of zeal and rapacity. Transformed by chance and opportunity, new-model S. invicta needed just a few decades to conquer most of the southern United States.
Homo sapiens did something similar in the wake of Toba. Our species first appears in the archaeological record about 200,000 years ago. For the next 125,000 years or so—the majority of our existence on earth—humankind was restricted to East Africa, though occasionally we seem to have made unsuccessful forays into the rest of the world. Around the time of Toba, everything changed. People raced across the continents like so many imported fire ants. Men and women charged so fast into the tephra that human footprints appeared in Australia within as few as ten thousand years, perhaps within four or five thousand. Stay-at-home Homo sapiens 1.0, a wallflower that would never have interested Lynn Margulis, had been replaced by aggressively expansive Homo sapiens 2.0. Something happened, for better and worse, and we were born.
No more than a few hundred people initially escaped Africa, if geneticists are correct. But they emerged into landscapes that by today’s standards were as rich as Eden. Cool mountains, tropical wetlands, closed forests—all were teeming with food. Fish in the sea, birds in the air, fruit on the trees: the buffet was all-you-can-eat, the tables unoccupied. People settled in.
For a long time after our successful colonization of Europe, Asia and, eventually, the Americas, humankind remained thin on the ground. As recently as ten thousand years ago we numbered barely five million, about one human being for every hundred square kilometers of the Earth’s surface. Homo sapiens was a scarcely noticeable dusting on the surface of a planet dominated by microbes.
At about this time—10,000 years ago, give or take a millennium—our species rocketed around the first inflection point, with the invention of agriculture. The wild ancestors of cereal crops like wheat, barley, rice and sorghum have been part of the human diet for almost as long as there have been humans to eat them. (The earliest known evidence comes from Mozambique, where researchers found tiny bits of 105,000-year-old sorghum on ancient scrapers and grinders.) In some cases people may have watched over patches of wild grain, returning to them year after year. Yet despite the effort and care, the plants were not domesticated. As botanists put it, wild cereals “shatter” — individual grain kernels fall off as they ripen, scattering grain haphazardly, making it impossible to harvest the plants systematically. Only when an unknown genius discovered naturally mutated grain plants that did not shatter—and purposefully selected, protected, and cultivated them — did true agriculture begin. Planting great expanses of these mutated crops, first in southern Turkey, later in half a dozen other places, early farmers created landscapes that, so to speak, waited for hands to harvest them.
Farming converted most of the habitable world into a Petri dish. Foragers manipulated their environment with fire, burning areas to kill insects and encourage the growth of useful species—plants we liked to eat, plants that attracted the other creatures we liked to eat. Nonetheless, their diets were largely restricted to what nature happened to provide in any given time and season. Agriculture gave humanity the whip hand. Instead of natural ecosystems with their haphazard mix of species (so many useless organisms guzzling up resources!), farms are taut, disciplined communities dedicated to the maintenance of a single species: us. Before agriculture, Ukraine, the American Middle West and the lower Yangzi had been barely hospitable food deserts, sparsely populated domains of insects and grass; they became breadbaskets, as people scythed away suites of species that used soil and water we wanted to dominate and replaced them with wheat, maize and rice. To one of Margulis’s beloved bacteria, a Petri dish is a uniform expanse of nutrients, all of which it can seize and consume. For Homo sapiens, agriculture transformed much of the planet into something similar.
As in a time-lapse movie, we divided and multiplied across the newly opened land. It had taken Homo sapiens 2.0, aggressively modern humans, not even 50,000 years to reach the farthest corners of the globe. Homo sapiens 2.0A—A for agriculture—took a tenth of that time to conquer the planet.
The growth curve grew steeper in the early 19th century, after a German chemist, Justus von Liebig, discovered that plant growth was tied to the supply of nitrogen. Without nitrogen, neither plants nor the mammals that eat plants can create proteins, or for that matter the DNA and RNA that direct the production of proteins. Pure nitrogen gas (N2, in chemical notation) is plentiful in the air but plants are unable to absorb it, because the two nitrogen atoms in N2 are welded together so tightly that plants cannot split them apart for use. Instead, plants take in nitrogen only when it is mixed with oxygen and other elements, chemical combinations that are easier to break apart. To replenish the nitrogen in the soil, farmers traditionally grew peas, beans, lentils and other pulses, which we now know host symbiotic microorganisms that take in gaseous nitrogen and convert it into forms usable by plants. Because fields given over to pulses could not grow more-productive cereals, most arable land could not support more than two or three people per acre.
In the early 20th century, two more German chemists, Fritz Haber and Carl Bosch, discovered the key steps to making synthetic fertilizer from fossil fuels. (The process involves turning natural gas into ammonia, which is then combined into plant-friendly nitrogenous compounds.) Haber and Bosch are not nearly as well-known as they should be; their discoveries, linked into what is called the Haber-Bosch process, have literally changed the chemical composition of the Earth. Farmers have injected so much synthetic fertilizer into the soil that soil and groundwater nitrogen levels have risen worldwide. Today, roughly a third of the weight of all the crops consumed by humankind is nitrogen derived from synthetic fertilizer. Another way of putting this is to say that Haber and Bosch enabled Homo sapiens to extract more than two billion people’s worth of food from the same land.
Synthetic fertilizer is not alone in its impact. The improved wheat, rice and (to a lesser extent) corn varieties developed by plant breeders in the 1950s and 1960s are often said to have prevented a billion deaths from starvation. Antibiotics, vaccines and water-treatment plants pushed back humankind’s bacterial, viral and fungal enemies. All allowed humankind ever-freer access to the planet.
Rocketing up the growth curve, human beings every year take ever more of the Earth’s richness — “about 40% of the present net primary production in terrestrial ecosystems.” This figure, a famous estimate by a team of Stanford biologists, dates from 1986. Ten years later, a second Stanford team calculated that the “fraction of the land’s biological production that is used or dominated” by our species had risen to “39 to 50%.” In 2000, the chemist Paul Crutzen and the biologist Eugene Stoermer awarded a name to our time: the “Anthropocene,” the era in which Homo sapiens became a force operating on a planetary scale. In that year, observed the environmental historian J.R. McNeill, almost 40 percent of the world’s available freshwater was appropriated by human beings.
Lynn Margulis, it seems safe to say, would have rolled her eyes at these last assessments, which in every case that I am aware of do not take into account the enormous impact of the microworld. But she would not have disputed the central idea: Homo sapiens had become a successful species.
As any biologist would predict, this success led to an increase in human numbers—slow at first, then rapid, tracing Gause’s oddly shaped curve. We hit the steepest part of the slope in the 16th or 17th century. If we follow Gause’s pattern, growth will continue at delirious speed until the second inflection point, when we have exhausted the global Petri dish. After that, human life will be, briefly, a Hobbesian nightmare, the living overwhelmed by the dead. When the king falls, so do his minions; it is possible that our fall might also take down most mammals and many plants. Possibly sooner, quite likely later, in this scenario, the Earth will again be a choir of bacteria, fungi and insects, as it has been through most of its history.
It would be foolish to expect anything else, Margulis thought. More than that, it would be strange. To avoid destroying itself, the human race would have to do something deeply unnatural, something no other species has ever done or could ever do: constrain its own growth (at least in some ways). Brown tree snakes in Guam, water hyacinth in African rivers, gypsy moths in the U.S. northeast, rabbits in Australia, Burmese pythons in Florida—all these successful species have overrun their environments, heedlessly wiping out other creatures. Like Gause’s protozoans, they are racing to the limits of their Petri dish. Not one has voluntarily turned back. When the zebra mussels in the Hudson River began to run out of food, they did not stop reproducing. When fire ants relentlessly expand their range, no inner voices warn them to consider the future. Why should we expect Homo sapiens to fence itself in?
What a peculiar thing to ask! Economists talk about the “discount rate,” their term for the way humans almost always value the immediate and local over the faraway and the future. We care more about the broken stoplight up the street today than about falling-apart conditions next year in Croatia, Cambodia, or the Congo. Rightly so, evolutionists point out: Americans are far more likely to be killed at that stoplight today than in the Congo next year. Yet here we are asking governments to focus on potential planetary boundaries that may not be reached for decades or even centuries. Given the discount rate, nothing could be more understandable than the U.S. Congress’s failure to grapple with, say, climate change. From this perspective, is there any reason to imagine that Homo sapiens, unlike mussels, snakes and moths, can exempt itself from the fate of all successful species?
To a biologist like Margulis, who spent her career arguing that humans are simply another species, the answer should be clear. All life is similar at base, she and other biologists say, subject to the same biological rules and imperatives. All species seek without pause to make more of themselves—that is their goal. By multiplying till we reach our maximum possible numbers, we are following the laws of biology, even as we take out much of the planet. Eventually, in accordance with those same laws, the human enterprise will wipe itself out. From this vantage, the answer to the question, “Are we doomed to destroy ourselves?” is, “Yes.” It should be obvious. The idea that we could be some sort of magical exception—it seems ludicrously unscientific. Why should we be different? Who put us in the center of the universe?
In mulling over these questions, one could do worse than to consider Robinson Crusoe, hero of Daniel Defoe’s famous novel. Shipwrecked alone on an uninhabited island off Venezuela in 1659, Crusoe is an impressive example of fictional human resilience and drive. During his 27-year exile, he learns to catch fish, hunt rabbits and turtles, tame and pasture island goats, prune and support local citrus trees, and create “plantations” of barley and rice from seeds that he salvaged from the wreck. (Defoe apparently didn’t know that citrus and goats were not native to Venezuela and thus wouldn’t have been found on the island.) Rescue comes at last in the form of a shipful of ragged mutineers, who plan to maroon their captain on the supposedly empty island. Crusoe helps the captain recapture his ship and offers the defeated mutineers a choice: permanent exile on the island or trial in England. All choose the former. Crusoe has harnessed so much of the island’s productive power to human use that even a gaggle of inept seamen can survive there in comfort.
Robinson Crusoe’s first three chapters tell how its hero ended up on his ill-fated voyage. The youngest son of an English merchant, he has a restless spirit that leads him to defy his father and set himself up as an independent slave trader. On a voyage to Africa, his ship is captured by “a Turkish rover,” captained by a Moor from Morocco. “As his proper Prize,” Crusoe becomes the captain’s house slave. After two years of servitude, Crusoe manages to steal his master’s fishing boat and escape. He wanders, without food or water, down the West African coast and is rescued by a Portuguese slave ship bound for Brazil. There the enterprising Crusoe establishes a small tobacco plantation. But he is short of labor, and decides with some other plantation owners to obtain that labor by taking a ship to Africa and buying some slaves. The ship is wrecked on the return voyage. Except for Crusoe, all hands perish, slaves included. He ends up alone on his island.
What is striking to a modern reader is that Defoe clearly intended Crusoe to be a sympathetic character, and that he saw nothing remarkable about expecting readers to sympathize with a man in the slave trade. Indeed, Crusoe seems to have no qualms about the slave trade even after having been, most unhappily, a slave himself. Here, character is echoing author: Defoe extolled slavery as “a most Profitable, Useful, and absolutely necessary Branch of our Commerce.” Backing words with deeds, he owned shares in the Royal African Company, one of England’s first joint-stock firms, chartered for the express purpose of buying people in Africa and transporting them as slaves to the Americas. When the company was attacked in Parliament, he offered to write the equivalent of editorials in its favor. The company paid him the rough equivalent of $50,000 for his public-relations services.
Defoe was not exceptional. Three centuries ago, when he was writing Robinson Crusoe, societies from one end of the world to another depended on slave labor, as had been the case since at least the Code of Hammurabi, in ancient Babylon. Customs differed from one place to another, but slavery was sanctioned and practiced everywhere from Mauritania to Manchuria. Unfree workers existed by the million in the Ottoman Empire, Mughal India and Ming China. Slaves were less common in continental Europe, but Portugal, Spain, France, England and the Netherlands happily exploited huge numbers of them in their American colonies. In the last half of the 18th century alone, almost four million people were taken from Africa in chains. In colonies throughout the Americas at that time, in places ranging from Brazil to Barbados, from South Carolina to Suriname, slaves were so fundamental to the economy that they outnumbered masters, sometimes by ten to one.
Then, in the space of a few decades in the 19th century, slavery almost vanished.
The implausibility of this change is stunning. In 1860, slaves were the single most valuable economic asset in the United States, collectively worth more than $3 billion, an eye-popping sum at a time when the U.S. gross national product was less than $5 billion. (In terms of their impact on the economy, the slaves would be worth as much as $10 trillion in today’s money.) Rather than investing in factories like northern entrepreneurs, southern businessmen had sunk their capital into slaves. Rightly so, from their perspective—slaves had a higher return on investment than any other commodity. Enchained men and women had made the region politically powerful and had given social status to an entire class of poor whites. Slavery was the foundation of the social order. It was, thundered South Carolina Senator John C. Calhoun, “instead of an evil, a good—a positive good.” Calhoun was no fringe character; when he spoke, he had already served as Secretary of War and Vice President and would become Secretary of State. Yet despite the institution’s enormous economic value and social prestige, part of the United States set out to destroy it, wrecking much of the national economy and killing half a million citizens along the way.
Incredibly, the turn against slavery was as universal as slavery itself. Great Britain, leader of the global slave trade, banned its market in human beings in 1807. Two laws enacted in 1833 and 1838 freed all British slaves. Denmark, Sweden, the Netherlands, France, Spain and Portugal soon outlawed their slave trades, too, and after that slavery itself. Like stars winking out at the approach of dawn, cultures across the globe removed themselves from the previously universal exchange of human cargo. Anti-trafficking organizations correctly remind us that slavery continues to exist and must still be fought. Millions still work in dreadful sweatshops with little chance of escape. But in no society anywhere is slavery a legally protected institution—part of the social fabric—as it was throughout the world two centuries ago.
Historians have provided many reasons for this extraordinary transition, not least among them the fierce opposition of slaves themselves. But one of the most important is that abolitionists convinced ordinary people around the world that slavery was a moral disaster. An institution fundamental to human society for millennia was made over by ideas and a call to action, loudly repeated.
In the last few centuries, such profound changes have occurred repeatedly. Since the beginning of our species, almost every known society has been based on the subjugation of women by men. Tales of past matriarchal societies abound, but there is little archaeological evidence for their veracity, leading most researchers to believe that female-dominated cultures, if they existed, were rare. In the long view, women’s lack of liberty has been as central to the human enterprise as gravitation to the celestial order. The degree of suppression varied from time to time and place to place, but women never had an equal voice; indeed, in some places the penalty for possession of two X chromosomes increased with technological progress. Union and Confederacy clashed over slavery, but they were in accord on the status of women: in neither could women attend school, have a bank account, or, in many states, own non-personal property. Equally confining were female lives in Europe, Asia and Africa, though in different ways. Nowadays women are the majority of U.S. college students, the majority of the U.S. workforce and the majority of U.S. voters. Again, historians assign multiple causes to this shift, rapid in time, confounding in scope. But a central element was the power of ideas—the voices and actions of suffragists, who through decades of ridicule and harassment pressed their case. In recent years something similar seems to have occurred with LGBT rights: first a few lonely advocates, censured and mocked; then victories in the social and legal sphere; finally, perhaps, a slow movement to equality.
Less well known, but equally profound: the decline in violence. Ten thousand years ago, at the dawn of agriculture, societies mustered labor for the fields and controlled harvest surpluses by organizing themselves into states and empires. These promptly revealed an astonishing appetite for war. Their penchant for violence was unaffected by increasing prosperity or greater technological, cultural, and social accomplishment. When classical Athens was at its zenith in the 5th and 4th centuries B.C., it was ever at war: against Sparta (First and Second Peloponnesian Wars, Corinthian War); against Persia (Greco-Persian Wars, Wars of the Delian League); against Aegina (Aeginetan War); against Macedon (Olynthian War); against Samos (Samian War); against Chios, Rhodes and Cos (Social War). Classical Greece was nothing special—look at the ghastly histories of China, sub-Saharan Africa, or Mesoamerica. Look at early modern Europe, where wars followed each other so fast that historians simply gather them into catch-all titles like the Hundred Years’ War or the even more destructive Thirty Years’ War. The brutality of these conflicts is difficult to grasp; to cite an example from the Israeli political scientist Azar Gat, Germany lost between a fifth and a third of its population in the Thirty Years’ War — “higher than the German casualties in the First and Second World Wars combined.” The statistic is sobering: the nation lost a greater percentage of its people to violence in the 17th century than in the 20th, despite the intervening advances in the technology of slaughter, despite the fact that in the Second World War the German government was controlled by maniacs who systematically murdered millions of their fellow citizens.
As many as one out of every ten people met violent deaths in the first millennium A.D., the archaeologist Ian Morris has estimated. Ever since, violence has declined—gradually, then suddenly. In the decades after the Second World War, rates of violent death plunged to the lowest levels in millennia. Today, the average person is far less likely to be slain by another member of the species than a hundred years ago, or a thousand, or ten thousand—an extraordinary transformation that has occurred, almost unheralded, in the lifetime of many of the people reading this book. Given the murder and mayhem documented in the headlines, the idea that violence is diminishing seems absurd. Nonetheless, every independent effort to collect global statistics suggests that we seem to be winning, at least for now, what the political scientist Joshua Goldstein calls “the war on war.”
Multiple causes for this remarkable change have been suggested. But Goldstein, one of the leading scholars in this field, argues that the most important is the emergence of multinational institutions like the United Nations, an expression of the ideas of peace activists earlier in the last century. These organizations have by no means stopped all fighting, as a cursory glance at the news would make clear. But over the years, Goldstein argues, they have snuffed out, almost invisibly, conflicts in other places that in previous eras would have led to horrific brutality.
Given this record, even Lynn Margulis might pause. No European in 1800 could have imagined that in 2000 Europe would have no legal slavery, women would be able to vote and gay couples would be able to marry. No one could have guessed that a continent which had been tearing itself apart for centuries would be largely free of armed conflict, even amid terrible economic times. No one could have guessed back then, even, that Europe would have vanquished famine.
Preventing Homo sapiens from destroying itself à la Gause would require a still greater transformation, because we would be pushing against Nature itself. Success would be unprecedented, biologically speaking. But might our species be able to do exactly that? Might Margulis and her peers have got this one wrong?
Crusoe, again, is an example. Confronted with a threat to his survival, he changed his way of life, root and branch, to meet it. Working alone for years, he transformed the island, enriching its landscape. And then, to his surprise, he realized that he “might be more happy in this Solitary Condition, than I should have been in a Liberty of Society, and in all the Pleasures of the World.”
Crusoe was a fictional character, of course (though Alexander Selkirk, the castaway whose triumphant survival apparently inspired Defoe, was not). And the challenge facing the next generation is vastly larger than anything Crusoe faced. Still, removing the shackles from slaves and women has unleashed the suppressed talents of two-thirds of the human race. Drastically reducing violence has prevented the waste of countless lives and staggering amounts of resources. Which is a good thing, because we will need them.
HARA HACHI BU
The Japanese have an expression, hara hachi bu, which means, roughly speaking, “belly 80 percent full.” Hara hachi bu is shorthand for an ancient injunction to stop eating before feeling full. Nutritionally, the command makes a great deal of sense. When people eat, their stomachs produce peptides that signal fullness to the nervous system. Unfortunately, the mechanism is so slow that eaters frequently perceive satiety only after they have consumed too much—hence the all-too-common condition of feeling bloated or sick from overeating. Japan—actually, the Japanese island of Okinawa—is the only place on earth where large numbers of people are known to restrict their own calorie intake systematically and routinely. Some researchers claim that hara hachi bu is responsible for Okinawans’ notoriously long life spans. But I think of it as a metaphor for stopping before the second inflection point, voluntarily forswearing short-term consumption to obtain a long-term benefit.
By 2050, demographers predict, as many as 10 billion human beings will walk the earth, three billion more than today. Not only will more people exist than ever before, they will be richer than ever before. In the last three decades hundreds of millions in China, India and other formerly poor places have lifted themselves from destitution—arguably the most important, and certainly the most heartening, accomplishment of our time. Yet, like all human enterprises, this great success will pose great difficulties.
In the past, rising incomes have invariably prompted rising demand for goods and services. Billions more jobs, homes, cars, fancy electronics—these are things the newly prosperous will want. (Why shouldn’t they?) But the greatest challenge may be the most basic of all: feeding these extra mouths. To agronomists, the prospect is sobering. The newly affluent will not want their ancestors’ gruel. Instead they will ask for pork and beef and lamb. Salmon will sizzle on their outdoor grills. In winter, they will want strawberries, like people in New York and London, and clean Bibb lettuce from hydroponic gardens.
All of these, each and every one, require vastly more resources to produce than simple peasant agriculture. Already 35 percent of the world’s grain harvest is used to feed livestock. The process is terribly inefficient: between seven and ten kilograms of grain are required to produce one kilogram of beef. Not only will the world’s farmers have to produce enough wheat and maize to feed three billion more people, they will have to produce enough to give them all hamburgers and steaks. Given present patterns of food consumption, economists believe, we will need to produce about 40 percent more grain in 2050 than we do today.
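The arithmetic behind that projection can be sketched in a few lines. This is a back-of-the-envelope illustration only: the per-capita grain figure is a round assumption, and the population and feed-conversion numbers are the rough values given above, not forecasts.

```python
# Rough sketch of extra grain demand by 2050, using the figures in the text.
# All inputs are illustrative assumptions, not projections.

current_population = 7e9    # people today (approximate)
population_2050 = 10e9      # demographers' high-end estimate for 2050
grain_per_person = 0.33     # tonnes of grain per person per year (assumed round figure)

# Grain needed just to feed the additional people at today's per-capita level:
extra_people = population_2050 - current_population
extra_grain = extra_people * grain_per_person  # tonnes per year

# Feed conversion: 7-10 kg of grain per kg of beef means that diets richer
# in meat multiply grain demand well beyond the simple head count above,
# which is why economists project roughly 40 percent more grain overall.
feed_conversion_range = (7, 10)  # kg grain per kg beef

print(f"Extra people by 2050: {extra_people / 1e9:.0f} billion")
print(f"Extra grain from population growth alone: "
      f"~{extra_grain / 1e9:.2f} billion tonnes/year")
```

Even before richer diets are counted, population growth alone implies on the order of a billion additional tonnes of grain a year under these assumptions; the shift toward meat pushes the total higher still.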
How can we provide these things for all these new people? That is only part of the question. The full question is: How can we provide them without wrecking the natural systems on which all depend?
Scientists, activists and politicians have proposed many solutions, each from a different ideological and moral perspective. Some argue that we must drastically throttle industrial civilization. “Stop energy-intensive, chemical-based farming today!” “Eliminate fossil fuels to halt climate change!” Others claim that only intense exploitation of scientific knowledge can save us. “Plant super-productive, genetically modified crops now! Switch to nuclear power to halt climate change!” No matter which course is chosen, though, it will require radical, large-scale transformations in the human enterprise—a daunting, hideously expensive task.
Worse, the ship is too large to turn quickly. The world’s food supply cannot be decoupled rapidly from industrial agriculture, if that is seen as the answer. Aquifers cannot be recharged with a snap of the fingers. If the high-tech route is chosen, genetically modified crops cannot be bred and tested overnight. Similarly, carbon-sequestration techniques and nuclear power plants cannot be deployed instantly. Changes must be planned and executed decades in advance of the usual signals of crisis, but that’s like asking healthy, happy 16-year-olds to write living wills. It’s like asking the entire human race to adopt a policy of hara hachi bu.
Evolutionarily speaking, a species-wide implementation of hara hachi bu would be unprecedented. Thinking about it, I can picture Lynn Margulis rolling her eyes. But is it so unlikely that our species, a congeries of changelings, would be able to do exactly that before we round that fateful curve of the second inflection point and nature does it for us? I can imagine Margulis’s response: You’re imagining our species as some sort of big-brained, hyperrational, cost-benefit-calculating computer! A better analogy is the bacteria at our feet! Still, Margulis would be the first to agree that removing the shackles from women and slaves has begun to unleash the suppressed talents of two-thirds of the human race. Drastically reducing violence has prevented the waste of countless lives and staggering amounts of resources. Is it really impossible to believe that we would use those talents and those resources to draw back before the abyss?
Our record of success is not that long. In any case, past successes are no guarantee of the future. But it is terrible to suppose that we could get so many other things right and get this one wrong. To have the imagination to see our potential end, but not have the imagination to avoid it. To send humankind to the moon but fail to pay attention to the earth. To have the potential but to be unable to use it—to be, in the end, no different from the protozoa in the petri dish. It would be evidence that Lynn Margulis’s most dismissive beliefs had been right after all. For all our speed and voraciousness, our changeable sparkle and flash, we would be, in the final accounting, not an especially interesting species.