We have named the era of runaway climate change the “Anthropocene,” which tells you everything you need to know about how we understand our tragic nature. Human beings are apparently insatiable consuming machines; we are eating our way right through the biosphere. The term seems to suggest that the relentless expansion of the world economy, which the extraction and burning of fossil fuels has made possible, is hard-wired into our DNA. Seen from this perspective, attempting to reverse course on global warming is likely to be a fool’s errand. But is unending economic growth really a defining feature of what it means to be human?
For the longest part of our history, humans lived as hunter-gatherers who neither experienced economic growth nor worried about its absence. Instead of working many hours each day in order to acquire as much as possible, our nature—insofar as we have one—has been to do the minimum amount of work necessary to underwrite a good life.
This is the central claim of the South African anthropologist James Suzman’s new book, Work: A Deep History, From the Stone Age to the Age of Robots, in which he asks whether we might learn to live like our ancestors did—that is, to value free time over money. Answering that question takes him on a 300-millennium journey through humanity’s existence.
Along the way, Suzman draws amply on what he has learned since the 1990s living and dissertating among the Ju/’hoansi Bushmen of eastern Namibia, whose ancestral home is in southern Africa’s Kalahari Desert. The Ju/’hoansi are some of the world’s last remaining hunter-gatherers, although few engage in traditional forms of foraging anymore.
Suzman has less to say in Work about his years as the director of corporate citizenship and, later, the global director of public affairs at De Beers, the diamond-mining corporation. He took that job in 2007. Around the same time, in response to a public outcry after the Botswanan government evicted Bushmen from the Kalahari so that De Beers could conduct its mining operations there, the company sold its claim to a deposit to a rival firm, Gem Diamonds, which opened a mine in the Bushmen’s former hunting grounds in 2014. It later shuttered the mine and then sold it in 2019, after reportedly losing $170 million on the venture.
Suzman’s employment with De Beers—a company that has spent vast sums on advertising to convince the world’s middle classes that diamonds, one of the most common gems, are actually among the scarcest—may have left its mark on Work nonetheless. “The principal purpose” of his undertaking, Suzman explains, is “to loosen the claw-like grasp that scarcity economics has held” over our lives and thereby “diminish our corresponding and unsustainable preoccupation with economic growth.” It is an arresting intervention, although one that reveals the limits of both contemporary economics and anthropology as guides to thinking about our era of climate emergency.
For 95 percent of our 300,000-year history, human beings have lived as hunter-gatherers on diets consisting of fruits, vegetables, nuts, insects, fish, and game. Ever since Adam Smith published The Wealth of Nations in 1776, it has largely been taken for granted that staying alive was an all-consuming activity for our ancestors, as well as for the remaining hunter-gatherers who still live as they did. Latter-day foragers appeared to have been “permanently on the edge of starvation,” Suzman explains, and “plagued by constant hunger.”
This disparaging perspective on the life of the hunter-gatherer found ample support in Western travel narratives and then in ethnographic studies. Explorers treated contemporary foraging peoples as if they were living fossils, artifacts of an earlier era. In reality, these foragers were living in time, not out of it, and trying to survive as best they could under adverse historical conditions. Expanding communities of agriculturalists, and later colonial empires and post-colonial states, had violently pushed most foragers out of their ancestral homelands and into more marginal areas. Western reportage made it seem as if these dispossessed refugees were living as their ancestors had since time immemorial, when in fact their lives were typically much more difficult.
A countercurrent of thinkers has provided a consistent alternative to this largely contemptuous mainstream perspective. The 18th-century French philosopher Jean-Jacques Rousseau, for example, took the forager to be an unrealizable ideal for modern humans rather than our embarrassing origin story. In the 20th century, the anthropologists Franz Boas and Claude Lévi-Strauss continued this tradition: They countered racist, stage-based theories of human evolution by showing that foraging peoples possessed complex and intelligent cultures. These thinkers are important precursors of Suzman’s perspective, but in Work, he sets them aside.
Instead, Suzman focuses on the comparatively recent “Man the Hunter” conference, co-organized by the American anthropologist Richard Lee. That 1966 gathering marked a decisive shift in how anthropologists thought about foragers as economic actors, and this is the point that Suzman wants to emphasize. Lee had been conducting research among the !Kung Bushmen of southern Africa, a people related to the Ju/’hoansi. Lee showed that the !Kung acquired their food through only “a modest effort,” leaving them with more “free time” than people in the advanced industrial societies of the West. The same was likely true, he suggested, of human beings over the largest part of their history.
One implication of this finding is that economists since Adam Smith have been consistently wrong about what Lee’s colleague Marshall Sahlins called “stone age economics.” Using modern research methods, social scientists have confirmed that Lee and Sahlins were largely right (although they may have underestimated foragers’ average work hours). The chemical analysis of bones has demonstrated conclusively that early humans were not constantly teetering on the brink of starvation. On the contrary, they ate well despite having at their disposal only a few stone and wooden implements. What afforded these early humans existences of relative ease and comfort? According to Suzman, the turning point in the history of early hominids came with their capacity to control fire, which gave them access to a “near-limitless supply of energy” and thereby lightened their toils.
Fire predigests food. When you roast the flesh of a woolly mammoth—or, for that matter, a bunch of carrots—the process yields significantly more calories than if the food were left uncooked. The capacity to access those additional calories gave humans an evolutionary advantage over other primates. Whereas chimpanzees spend almost all of their waking hours foraging, early humans got the calories they needed with just a few hours of foraging per day.
Mastering fire thus made for a radical increase in humanity’s free time. Suzman contends that it was this free time that subsequently shaped our species’s cultural evolution. Leisure afforded long periods of hanging around with others, which led to the development of language, storytelling, and the arts. Human beings also gained the capacity to care for those who were “too old to feed themselves,” a trait we share with few other species.
The use of fire helped us become more social creatures in other ways as well. Recently unearthed evidence has demonstrated that early humans did not live in small bands for the whole of their existence, as anthropologists and archaeologists had long supposed. Where food was less abundant, people spread out, keeping enough distance from one another to ensure that everyone could forage with ease. By contrast, where food was abundant, early humans gathered into larger, albeit temporary, social formations. At Göbekli Tepe in southeastern Turkey, archaeologists uncovered a major complex of “chambers and megaliths” that had been periodically built up and reburied from around 10,000 years ago—long before the advent of settled agricultural societies.
These findings support a surprising thesis, one that reverses everything we used to believe about the deep history of humanity. It was not the hunter-gatherers who “suffered from systematic dietary deficiencies,” working themselves to the point of exhaustion yet attaining no lasting security. On the contrary, their descendants among the farming peoples were the ones who lived like that. In contrast to the hunter, the peasant eked out an existence that truly was, in Thomas Hobbes’s famous phrase, “nasty, brutish, and short.” As Suzman explains, this shift in how we understand the relative fortunes of hunter-gatherers and early agriculturalists makes the three major transitions that followed fire—for Suzman, agriculture, the city, and the factory—much harder to explain. The story of their advent cannot be told as one of humanity’s progressive climb out of economic deprivation.
To see why debates about human origins carry so much significance, you need only turn to the first page of any economics textbook. There you will discover the “scarcity postulate,” the theory that human beings have infinite needs and wants but only a limited quantity of resources. You experience the truth of this principle every time you open your banking app and discover that you can afford only a portion of what you’ve placed in your online shopping cart. This leads to an endless series of calculations: In order to have this, you must forgo that.
Economics positions itself as the study of how the choices we make under the constraints of scarcity facilitate the allocation of our productive capacities. Every gain in economic efficiency loosens those constraints just a bit, so some of us can afford to satisfy a few more of our desires without taking away from other people’s ability to meet their own needs. Why the wealthy few are able to satisfy so many of their whims before the world’s poor achieve basic levels of economic security has always been an uncomfortable question for the economics profession. But economists assure us that, in any case, the only long-term solution to global poverty is more economic growth.
That is why economists speak of our history primarily as one long story of economic expansion, as if our task as humans always has been and always will be to struggle out of penury and acquire more things. Seeing the world that way has enormous consequences for how we think about climate change, among the many other ecological threats to human well-being, such as deforestation and overfishing. If confronting these threats means making do with less, such a limitation can only appear, in the economist’s eyes, as a regression against which human nature will rebel.
The account of human nature undergirding this standard economic perspective is precisely what Suzman’s anthropological evidence allows him to reject. In reality, the scarcity postulate applies only to a limited period of humanity’s existence. For the vast majority of our history, humans have thought of their material needs as limited. Families divided up the work required to meet those needs, and when the work was done, they called it a day.
When people have found themselves in possession of an abundance of goods, they have generally seen those goods not as resources to be deployed in the service of economic expansion, but rather as so many excuses to throw gigantic parties, like the ones that presumably took place at Göbekli Tepe or, for that matter, at Stonehenge. In many cultures, giving away or even ritualistically destroying one’s possessions at festivals has been a common way to show one’s worth. That people all over the world continue to spend their meager incomes on elaborate marriage celebrations and funerals is something mainstream economists can understand only as anomalous.
For Suzman, anthropological insights into our pre-scarcity past lend support to a post-scarcity tradition in economics, which he associates with the work of John Maynard Keynes. Keynes famously argued that states should engage in deficit spending rather than balance their budgets during economic downturns. Less well known is that, in making this argument, Keynes wanted not merely to stabilize Western economies but to advance beyond them, to a post-scarcity society in which economic concerns had largely faded from human consciousness. To so much as conceive of this alternative, Keynes asserted, economists would have to reconsider the nature of economics.
If you attempted to interrogate people’s preferences to figure out why they want what they want, most neoclassical economists would laugh you out of the room. As Suzman points out, Keynes was not so hasty. His insights into the nature of human wants were anthropologically astute. He described desires as coming in two types, which he called “absolute” needs and “relative” wants. For a city dweller, for instance, absolute needs might include things like clean water, an apartment, running clothes, and an annual bus pass. Relative wants, by contrast, refer to things that connote social status, like Gucci loafers and an Ivy League education. We cannot all be upper class, just as we cannot all be above average. Unlike desires based in social status, which can be infinite, absolute needs are limited.
In fact, a long history of technological progress has made it possible to fulfill everyone’s needs in ever more resplendent ways with ever fewer hours of work. Keynes predicted that by his grandchildren’s generation, we would have at our disposal such an immense quantity of buildings, machines, and skills as to overcome any real scarcity of resources with respect to meeting our needs (including new ones like the 21st-century need for a smartphone).
Of course, many of our wants might remain unfulfilled. But in Keynes’s view, wants mostly evince desires for status rather than possessions. Giving everyone Gucci loafers won’t help, since they’re worthless as status symbols once everybody has a pair. Only reducing levels of inequality would relieve society-wide status anxieties, since each individual’s relative position would then matter much less. With enhanced production capacities and absolute needs met, Keynes argued, people would stop feeling so frustrated and striving so hard. Instead, they would “devote their further energies” to a variety of “non-economic purposes.” Keynes went on to suggest that in a future post-scarcity society, people would probably work just 15 hours a week, and then mostly for the pleasure of it.
For Suzman, Keynes’s remark on the length of the future work week is serendipitous. When Keynes “first described his economic utopia,” Suzman points out, “the study of hunter-gatherer societies was barely more than a sideshow in the newly emerging discipline of social anthropology.” It was only in the 1960s, two decades after Keynes’s death, that we began to understand that for most of our history, humans did in fact work about 15 hours a week, as hunter-gatherers. Keynes’s vision of a post-scarcity future was, as much as anything, a recovery of our species’s pre-scarcity past. Humanity’s “fundamental economic problem,” it turns out, is not scarcity at all but satiety.
What can we learn from our hunter-gatherer ancestors about how to organize our lives once the daily grind of work no longer needs to be so central to our identities? That was the motivating question of Suzman’s first book, Affluence Without Abundance, published in 2017.
Work, the sequel, concerns itself mostly with the opposite question: Why do we continue to cling so hard to our work-based identities, in spite of an inner nature that tells us not to work so much? Long after Keynes’s own metaphorical grandchildren (since he had no direct descendants) have grown up, grown old, and had children of their own, we continue to work long hours, consuming ever more and posing an ever-greater threat to the biosphere. “Humankind,” Suzman writes, is apparently “not yet ready to claim its collective pension.” So why haven’t we traded rising incomes for more free time?
John Kenneth Galbraith provided one plausible answer in The Affluent Society, his 1958 study of the postwar American economy. In it, he suggested that Keynes had underestimated the degree to which we can be manipulated into seeing our relative wants as absolute needs. Through advertising, companies like De Beers create desires in us that we didn’t have before. Then they tell us that in order to fulfill those desires, we have to buy their products. Since we purchase big-ticket items like diamonds largely to maintain or increase our status in society (in the then-popular phrase, to “keep up with the Joneses”), these goods lose their mystique once too many people have acquired them. New, harder-to-acquire gems must then take the place of the old stones that have lost their luster.
For Galbraith, writing in the 1950s, the reason we opt for this irrational, limitless politics of production was clear: The point is not really to meet people’s needs (most of which are manufactured wants in any case) but to keep workers employed and wages growing. In other words, expanding production serves as a distraction from the fraught issue of economic redistribution. As long as everyone’s income is growing, we don’t worry so much about who has more than whom.
But in an era of stagnant real wages and rising inequality, Galbraith’s explanation no longer holds much water. As Suzman explains, in the mid-1980s we began to see a “Great Decoupling”: The incomes of the rich increased at an accelerating pace, while the growth in everyone else’s earnings slowed dramatically. Rising inequality should have called into question the politics of endless growth in wealthy countries. Yet the average work week has not shrunk—in fact, in the United States, it has lengthened.
Suzman draws on the work of a fellow anthropologist, the late David Graeber, to supplement Galbraith’s account. In Bullshit Jobs, Graeber detailed the immense amount of pointless work that suffuses the economy. Button pushers, box tickers, and assorted yes-men add no real value; yet instead of weeding out this sort of work, Graeber argued, the economy seems to sow it in every corner. Graeber hypothesized that the expansion of bullshit jobs has been an indirect consequence of the financialization of the economy. As the economy has become more focused on extracting rents than on new production, society has come to look more neo-feudal than capitalist, with elites employing gigantic entourages of useless underlings as a way to display their wealth.
Suzman has his own answer for why irrational forms of make-work have proliferated across the economy, but he approaches this question from an odd direction. He says that since the agricultural revolution, we have continued to work even when we don’t have to because the physical laws of the universe compel us to do so. The answer is strange because it explains a recent trend in human societies in terms of the background conditions of life itself. Suzman essentially argues that nature has programmed us, just as it has every other creature, to deal with surpluses of energy by working those surpluses out of our systems. With lots of available energy but little to do, we make work to release the tensions building up inside of us.
Suzman appears to have reached this conclusion through the following argument: It is our nature as human beings not to work more than we need to and instead to spend our time in pursuits that make us happy—hanging around with friends, cooking and eating, singing and sleeping. If we aren’t doing that today, there must be some deeper mechanism at work within us, pushing us to labor until our hearts give out rather than directing our surplus energy toward play. For Suzman, this deeper mechanism must ultimately be located at the level of biology itself.
In a passage reminiscent of Freud’s account of the death drive, Suzman postulates that “biological systems” likely emerged spontaneously “because they more efficiently dissipate heat energy than many inorganic forms.” Life turns out to be a labor-saving device for creating entropy, or disorder, which physical systems deploy in their efforts to hasten the heat death of the universe. Suzman suggests that this deeper purpose of life—to serve as a tool of “entropy, the trickster god”—reveals itself in many ways that we are only just beginning to understand.
For example, ever since the work of Charles Darwin, we have understood the peacock’s spectacular tail feathers to be an evolutionary outcome of competition for mates. However, recent studies have demonstrated that more beautifully plumed birds gain no mating advantage over their ruffled competitors. “Energy-expensive evolutionary traits like peacock tails” serve no other function, Suzman asserts, than to “expend energy,” to get rid of an excess. Abundance breeds ostentation.
For Suzman, the same principle is at work in human life. In certain geological layers, one turns up large numbers of “Acheulean hand-axes.” Our ancestors apparently had a habit of banging on rocks long and hard enough to sharpen them to a point at one end. Early humans made and discarded large numbers of these devices all around Eurasia and Africa. The problem is that Acheulean hand-axes are useless as hand-axes. Based on an intriguing paper by the Dutch anthropologist Raymond Corbey and his collaborators, Suzman suggests that the primary purpose these axes served, much like peacock tails, was to work off excess energy. Biology has programmed us so that, like peacocks, when we have “surplus energy,” we “expend it by doing work in compliance with the law of entropy.”
The same entropic principle is at work, Suzman continues, in the origins of agriculture and, later, in the construction of “proper towns and cities.” Is it possible that our human nature, which tells us to stop working past a certain point, has been overridden by this deeper nature pushing us to work until we drop?
Suzman sees these two principles, like Freud’s Eros and Thanatos, battling it out for supremacy in the heart of humankind. On the one hand, he says, technological breakthroughs are bringing us ever closer to the full automation of production, which will make it so that most people never have to work again. That is our human side—our potential to break through to Keynes’s post-scarcity society. On the other hand, “our governments remain as fixated on economic growth and employment creation [today] as at any point in our recent history.” This fixation manifests the deeper biological force that could destroy us by generating runaway climate change.
The question that puzzles Suzman—why haven’t we arrived by now at Keynes’s post-scarcity future?—has stumped two generations of economists. But Suzman’s answer, while provocative, is ultimately unsatisfying. All of life may have to heed entropy’s command to expend surplus energy, but surely human beings could have found other ways to do that. People could organize their lives around throwing parties, for example, rather than continuing to serve as cogs in the late-capitalist work machine. Society must remain as it is for some other reason.
One could do worse than look to Keynes himself for answers. Keynes was far from seeing the 15-hour work week as a natural evolutionary outcome of capitalist development. After writing his essay on the possibilities for his grandchildren’s generation, he devoted much of the rest of his life to explaining the forces that stood in the way of humanity’s arrival at a post-scarcity future.
Keynes argued that mature capitalist societies no longer grow quickly enough to maintain a high demand for labor without government intervention, a phenomenon that his disciple Alvin Hansen termed “secular stagnation.” Long before we produce enough structures, machines, and equipment to meet the needs of all humanity, Keynes said, the rate of return on investment in these fixed assets will fall below the level required to balance out the risks for private investors. In other words, long before we reach post-scarcity, the engine of capitalist prosperity will give way. The result is not a reduced work week for all but rather underemployment for many and overwork for the rest.
When one considers the long decline in economic growth rates since the 1970s, it is easy to see why more economists are now saying that Keynes was right. With so much productive capacity already in place, the return on purchases of new plant and equipment has fallen to low levels. Private investors have become increasingly reluctant to invest in the expansion of the economy, so economic growth rates have fallen and average unemployment rates have risen.
Governments have faced enormous pressure to get our stagnant economies back on track. In order to revive economic growth rates, one country after another has tried to entice private investors to invest more by spending in excess of tax receipts, deregulating the economy, reducing taxes, and beating back the strength of organized labor. That has encouraged an increase in the number of poor-quality jobs and caused inequality to rise, but it has done little to revive the economic growth engine.
Keynes was hardly unique in thinking that stagnation would mark the end point of capitalist development. What differentiated him from other practitioners of the dismal science was that, like John Stuart Mill, Keynes saw stagnation as an opportunity rather than a tragedy. Writing in the 1840s, Mill looked forward to the end of economic growth: “Hitherto it is questionable if all the mechanical inventions yet made have lightened the day’s toil of any human being,” he observed. Once the flows of private investment had been reduced to a trickle—a condition Mill called the “stationary state”—society might finally begin to use its riches to improve the lot of average people. That would require an increase in public investment: to raise workers’ education levels, to lessen the burden of their labor, and to transform ownership structures to create a cooperative economy.
Keynes has been misrepresented as saying that the capitalist economy could be revived under conditions of stagnation through the government’s stimulation of private demand. On the contrary, as the economist James Crotty has shown, Keynes styled himself in the tradition of Mill as a “liberal socialist”: What he imagined might come after the onset of economic stagnation was a barrage of public investment, which would displace private investment as the primary engine of economic stability. This public investment would be deployed not to make private investment more attractive, but rather to improve our societies directly through the provision of public goods.
So why hasn’t this post-scarcity future come to pass? Clearly, Keynes was overly optimistic about what it would take to change the role of the government in a capitalist economy. He was an idealist in the sense that he thought the world would be transformed more by changing ideas than by material interests. Other economists in the post-scarcity tradition were less naive. Galbraith spoke of “vested interests” supporting the politics of production. Mill sounds almost like Marx when addressing the subject: “All privileged and powerful classes have used their power in the interest of their own selfishness.” Elites would never abandon the current engine of economic growth and put public powers, rather than private investors, in the driver’s seat unless they were forced to do so.
Suzman also criticizes Keynes for thinking that economic elites would lead us to the “promised land,” yet in his own account, the power of “ambitious CEOs and money-men” mostly fades into the background. Suzman has written a magisterial book that seeks to take in the entire tapestry of humanity’s economic life, yet one of Work’s major oversights is its lack of interest in how the “haves” have gained and maintained power over the “have-nots.”
Until recently, historians and anthropologists assumed that economic classes emerged in tandem with a specific technological breakthrough, such as the advent of agriculture or urban life. Suzman cites the archaeological evidence that proved this thesis incorrect. Many early agrarian and even urban societies remained “assertively egalitarian,” he writes, including the “oldest almost-urban settlement discovered so far, Çatalhöyük in Turkey.”
However, after dispensing with these explanations, Suzman goes on to argue that the emergence of an economic elite was the simple “by-product” of another technology: “the invention of writing.” As the division of labor became more complex, he suggests, scribes and merchants gained power as a result of the increasing importance of their trades.
The anthropologist James C. Scott has already explained why such writing-based accounts of the economic elite’s origin are unsatisfying. The development of written scripts could not have given birth to domination, since writing was one of domination’s main products. Conquerors developed writing systems 5,000 years ago to tally and tax the possessions of the peoples they conquered. Those taxes in turn served as the funds that allowed conquerors to free themselves from manual labor and become mini-emperors. The earliest statelets of the Fertile Crescent were fragile and prone to collapse, but over time, empires grew and conquered the globe.
Suzman lists fire, agriculture, cities, and factories as the key events in human history. But the emergence of the state is an epochal transition equal in importance to any of those four. From a deep historical perspective, the capacity of the “haves” to determine the rules of state politics, and to prevent the “have-nots” from seizing the reins of power even in representative democracies, would have to be counted among the most important forces slowing our progress toward a post-scarcity future. Lacking a theory of politics, Work ends up almost entirely sidestepping the question of how we might achieve that transition.
In the book’s final pages, Suzman gestures toward “proposals like granting a universal basic income,” “shifting the focus on taxation from income to wealth,” and “extending the fundamental rights we give to people and companies to ecosystems, rivers, and crucial habitats.” But he provides no argument for where constituencies supporting these policies might be found or how coalitions working toward them might be constructed. The absence of a politics in Work clearly connects to the way the book deals with another crucial technological transformation—not the emergence of writing or the development of the state, but the automation of production. For Suzman, automation is the key both for explaining humanity’s present-day economic troubles and for unlocking the entrance to a post-scarcity future.
At the core of Work is the theory that automation and AI have unleashed massive quantities of excess energy that need to find an outlet. In Suzman’s view, the expansion of the service sector—which employs more than 90 percent of the workforce in countries like the United States—has been “a result of the fact that wherever and whenever there has been a large, sustained energy surplus, people (and other organisms) have found creative ways to put it to work.” Suzman thinks that automation explains why inequality began to worsen starting in the 1980s: At that time, “technological expansion” was already “cannibalizing the workforce and concentrating wealth in fewer hands.” Citing a famous study by Carl Frey and Michael Osborne, Suzman claims that “47 percent of all current jobs” will be “automated out of existence by as early as 2030.”
If what Suzman is saying were true, getting to post-scarcity would require not so much a policy change as a cultural revolution. That is likely why, instead of focusing on concrete policy prescriptions, Suzman simply expresses the hope that “catalysts” like a “rapidly changing climate” and increasing popular anger, “ignited by systematic inequalities” as much as by a “viral pandemic,” will shake people to their senses.
But Suzman is wrong about automation. He fails to take heed of the limitations of Frey and Osborne’s study, which its own authors have openly acknowledged. The study does not distinguish between jobs that will be partially automated and those that will be fully automated, and it does not specify a time interval for when the jobs will be lost (assuming they will be lost at all). Follow-up studies have suggested that only 14 percent of jobs are likely to be automated out of existence in the coming decades—fewer than were fully automated in decades past.
Entropy turns out to be an equally poor explanation for the expansion of jobs in the service sector. Employment in hospitals and schools expands steadily, not as a way to work off our excess energy, but rather because these occupations have seen so little automation over time. The more health care we want to provide, the more doctors, nurses, and home health aides we will need to employ.
Given how much work remains to be done, humanity can’t simply wake up to the end of work. Getting to post-scarcity will require instead that we reorganize work so that it is more satisfying for workers and better able to meet our needs. That reorganization will necessarily be a complex political process requiring new institutions that both build trust in specialists and subject their recommendations to democratic deliberation. We aren’t going to get to post-scarcity with the push of a button on an automated control panel. Instead, we will have to coordinate across a detailed division of labor. What we can learn from our earliest ancestors in that regard is unfortunately limited.
Suzman is one of a growing number of anthropologists—including Scott, Graeber, and Graeber’s coauthor, David Wengrow—who have mustered the available evidence to demonstrate that human nature is far different from what economists have long led us to believe. We humans are capable of “moderating our personal material aspirations,” but only, as Suzman suggests, if we address currently unsustainable levels of economic and social inequality. Yet looking for inspiring examples from humanity’s rich pre-scarcity past, as Suzman does, may leave one feeling more despondent than optimistic about our chances for achieving a post-scarcity future.
After all, the foragers at the heart of Suzman’s investigations maintain their affluent lifestyle by taking what Sahlins called the “Zen road to affluence”: They limit their material possessions to what they can carry. Anything too large to keep on one’s person during a long trek across the desert is not worth having at all. Meanwhile, to maintain equality on those travels, foragers engage in “demand sharing”: Each person has the right to demand the possessions of any other and generally tries to make reasonable requests. There is simply no chance that we will return to such a nomadic way of life, nor that we will accept such intense scrutiny of our personal possessions.
Most consequentially of all, the foraging groups Suzman looks to in Work organize their lives around so-called “immediate-return” economies and do not plan for the next day, let alone the next year. (The more complex “delayed-return” foraging societies that Graeber and Wengrow describe might have more to offer by way of example but are less egalitarian.) By contrast, producing the goods we feel are essential to our flourishing, including heating, electricity, and transportation for billions of people—and doing so sustainably—will require lots of planning. If the pre-scarcity forms of life that anthropologists have documented hold the key to post-scarcity living, then it would seem likely that we are doomed. The existing tradition of post-scarcity economics similarly falls short in its efforts to model a viable future society.
The 20th century saw a number of attempts to constrain or even replace the private, profitability-based engine of economic growth with public alternatives: Think of midcentury Keynesian welfare states and Khrushchev-era Soviet socialism. Both ended up mired in secular stagnation and its attendant social crises.
Technocratic elites on either side of the Iron Curtain tried to run their increasingly complex economies from central stations, as if by remote control. Doing so made the achievement of post-scarcity impossible, as it allowed unresolved tensions to build and masses of people to become disaffected. Technocrats collected information and offered production incentives that powerful social actors manipulated or ignored. Without much say in how their lives were governed, large numbers of people disengaged from work and society or revolted. In the West, the result was inflation and strikes; in the East, shortages and widespread discontent.
Instead of trying to recover a long-lost past or aligning ourselves with the latest views on human nature, we will have to create novel institutions to facilitate our journey to new, 21st-century destinations. We should set the course not to Mars, for vacationing with Elon Musk and Jeff Bezos, but rather to a post-scarcity planet Earth on which their wealth has been confiscated and put to better ends. Getting there will require that we overcome the endemic insecurity that continues to plague nine-tenths of humanity, while also reducing and transforming the work we do.
Achieving those ends will in turn require that we transform the investment function, as Keynes suggested, but in ways that make investment not only public but also democratically controlled. Freed from the constraints of “scarcity economics,” we will then serve the “trickster god” entropy in new ways, expending excess energy not only in the hunt for efficiency gains or in making whatever Acheulean hand-axes our engineers dream up next, but also in the service of a variety of other ends, such as justice and sustainability, science and culture—and throwing parties, too.