
Ad Hoc Nation

The unmaking of the steady job.

Laura Marsh

October 25, 2018

Illustration by Tim Robinson.

What happened to the steady job? Gig-economy start-ups like to imply that it has outlived its usefulness. Americans are supposed to have rejected it, leaving behind the ornery supervisors, fixed schedules, and rigid corporate culture that come with dependable employment. Whether they are freelance writers or cab drivers, engineering contractors or couriers or cleaners, these workers, we are told, want to choose their own hours and assignments—to be their own bosses—and the rise of mobile technology has at last made that possible.

Of course, all this independence comes with more than a few drawbacks. Unlike full-time employees, temps don’t have paid sick days or vacation days, and their positions are, by their nature, short-term, which can make planning for the future difficult in basic ways. (Will the next job mean moving to a new city? Will next month bring a significant drop in income?) For a specialist who can demand generous fees, these might be minor considerations. But freelance work is now common at almost every level: 94 percent of jobs created in the last 10 years were in “nontraditional” employment, and one-third of Americans now do some form of contract work. More often than not, it is far from a liberating option: In many cases, the pay is measly—after operating costs, Uber drivers in Detroit would have made more working at Walmart—and stringing together hours can itself be a struggle.

Louis Hyman’s new book, Temp: How American Work, American Business, and the American Dream Became Temporary, shows that this shift in work did not happen on its own, and that it began long before the founding of Uber or TaskRabbit. In this persuasive and richly detailed history, Hyman traces a decades-long campaign to eliminate salaried positions and replace them with contract work. Between the emergence of the first temp agencies in the 1940s and the growing power of management consultants in the ’70s, American business adopted a new set of principles and began to squeeze not just blue-collar workers but also middle managers and top executives. The unmaking of the good job, Hyman argues, followed not from technological advances but from an organizational breakthrough, as executives at companies like Manpower Inc. and McKinsey & Co. convinced businesses to add and shed staff at a moment’s notice, with little regard for their employees’ well-being or the effects on society.

For Hyman, this leads to the conclusion that stable employment was always too fragile a thing. Overly reliant on the economic growth of a unique historical moment, it was always vulnerable to prophets of disruption. As a result, he sees little point in trying to replicate the labor relations of the postwar era; today’s conditions require a different and, in his view, more “flexible” arrangement. Yet Hyman’s history seems to suggest the opposite conclusion: Throughout Temp, it’s the actions of people that decide what work will mean, what it should offer, and who should benefit. Those people have often been executives and consultants seeking to undermine the stability and security of jobs, but they can also be workers, fighting for more stable, equitable visions of work.


To understand how the steady job began to disappear, you have to understand why it emerged in the first place in the 1940s and ’50s. The post–World War II industrial economy didn’t naturally create well-paid employment, but it did encourage companies to place a high value on stability and long-term planning. At a moment when aerospace and automobiles defined manufacturing, the construction of a new factory required a vast investment, which would take years to pay off. If, during this time, people unexpectedly bought fewer cars or the price of steel shot up or workers put down their tools and went on strike, the whole enterprise could crumble. As a result, companies prized predictability—secure supply chains, steady demand, and smooth operations.

This stability didn’t trickle down to employees as a matter of course; they had to fight for a share in it. Labor had agreed not to strike during the Second World War, but as soon as peace arrived, unions resumed the vigorous activity of the 1930s, starting with the biggest walkout in American labor history in 1945. As workers continued to organize, industrial behemoths were forced to give them a better deal, since “whatever labor cost,” Hyman writes, “it cost less than the machines going idle.” In 1950, the United Auto Workers and General Motors drew up the so-called Treaty of Detroit, a five-year agreement that granted workers cost-of-living raises, health insurance, retirement funds, and a grievance process; it was, GM admitted, good for management too, as it guaranteed a period of calm and fixed labor costs. As other large companies committed to similar arrangements in the years that followed, the standard for the good postwar job was set.

But not everyone was so enthusiastic about this new era of stable employment. An early critic was a Midwestern lawyer named Elmer Winter, who founded the temping agency Manpower Inc. in 1948. Winter recognized that every man and woman who works wants “a good job—good health and security,” but he insisted that these things were too expensive for American companies to provide. They would be more profitable if they relied more on temps, who were not eligible for benefits and didn’t expect raises. Temps were also, Winter claimed, more efficient. They didn’t require training or time to adjust to their new setting. They didn’t get distracted by office gossip. And if they made a mistake, they could simply be replaced with new temps. All of which could also make the permanent staff more productive, as they’d need to work just as relentlessly if they wanted to keep their positions.

It was a grim vision of the workplace—pitting colleagues against one another in a relentless competition—and Winter knew it would be a tough sell. Companies wouldn’t necessarily trust outsiders, and other workers wouldn’t like the idea of temps replacing them. But Manpower Inc.—along with its rivals Kelly Girl and Olsten—found a neat way around these fears: No one needed to worry about temps taking over, because their temps were women. A temp could fill in for a secretary when she went on vacation or help out with a sudden influx of paperwork, but she would never want to stay for a long period. She was just picking up a few hours here and there in order to get out of the house or to earn money for fancy clothes—or so Manpower Inc. claimed. Agencies also used the temp’s sexuality as a selling point, instructing her on dress and comportment, and sending her to assignments wearing white gloves—an alluring symbol of propriety and efficiency. If a client requested “‘a size 10 secretary’ to double as a secretary and model,” they could produce one.

The charming, feminine image that the agencies created concealed the hard realities of temping, which became increasingly clear in the decades that followed. A survey from 9to5, a working-women’s collective started in the 1970s, noted that a temp’s paycheck was often her “bread and butter, not pin money,” as her employers liked to assume. Temps supported themselves and their families, and their work was often essential to the local economies. In Boston, clerical workers made up nearly one-fifth of the workforce; they were as important to their city as autoworkers were to Detroit, one 9to5 organizer pointed out. Yet, unlike autoworkers, temps doing clerical work didn’t have the protections of a union contract. They described feeling underpaid, disrespected, and duped. The supposed variety of temping was a “bogus lure,” one respondent wrote, comparing the system to a “roulette wheel.”

The temp agencies were far from the only holdouts against the stable job in the years after World War II. Hyman devotes a large portion of his book to the labor practices of the electronics industry, a sector that never embraced the postwar vision of planning. Semiconductor manufacturers released new products and models much more quickly than, say, automobile companies, and they didn’t have time to automate the bulk of their rapidly changing production processes. Instead, they relied largely on extremely poorly paid recent immigrants and undocumented people to assemble their products by hand. They didn’t employ them directly, but through subcontracting firms, which allowed them to claim ignorance of the low safety standards and appalling working conditions. Hyman describes women screwing components onto circuit boards using their fingernails. For these people, the good postwar industrial job was never accessible.

More and more Americans would soon find themselves similarly shut out. It had been a mistake, Hyman notes, for unions and long-term employees to ignore the plight of less fortunate workers, since “the experiences of the people who were left out of the good postwar jobs became the rehearsal for most people’s jobs today.” In the new world, there would be more temps, and less security even for the permanent.


If it was the strong economy of the postwar era that supported stable companies and stable jobs, then it was the crises of the 1970s that helped undo both. As recession and stagnation threw a wrench in big corporations’ long-term planning, uncertainty crept back in. Businesses began to doubt the virtues of being big. The 1960s had seen most of the United States’ largest companies transform themselves from rigorously structured corporations into conglomerates sprawled across many different industries. For a few years, their valuations had soared. But when the conglomerate bubble burst in 1969, they found themselves in a tangled mess. In their attempts to restore order, they turned to management consultants, who finally got an opportunity to reconfigure huge organizations around the ideals of flexibility and nimbleness.

For this reason, the rise of management consultancy as a profession, and the workings of individual firms (particularly McKinsey, the Boston Consulting Group, Price Waterhouse, and Coopers & Lybrand), are central to Hyman’s story. The peculiar internal cultures of these companies, he shows, affected decision-making at the corporations they were hired to help reorganize. The ranks of these firms were filled, especially in their earliest decades, by privileged young men, handpicked for their impressive academic credentials and for their lack of real-world experience. They were also selected for social class and certain personality traits; when hiring an associate, McKinsey’s second chairman, Marvin Bower, insisted it was important for him to feel he had chosen someone he “would be glad to go on a tiger hunt with.”

In their experience of work, these consultants also differed significantly from most employees in most postwar companies. They did a lot of their work autonomously, as they visited other companies to make reports. They were expected to find “self-expression and personal fulfillment” in their duties. And they had low expectations of job security. From its early years, McKinsey enforced a Darwinian “up-or-out” policy: If an associate wasn’t promoted within a few years, he was asked to leave the company. In the 1960s, only 17 percent of post-MBA consultants made partner. For those who didn’t, the odds were that they would quickly find lucrative work elsewhere (unlike, perhaps, a machinist or middle manager laid off at their recommendation). Despite the pressures of the job, many of the firm’s alumni believed fervently in the system, evangelizing it in their own firms and in books like Thomas Peters and Robert Waterman’s In Search of Excellence.

This new generation of management consultants thought that companies should be built around workers who were like them. Corporations had grown too big and stable; they were weighed down by labor costs and long-term investments and couldn’t respond quickly to changes in the economy; their employees were complacent and unproductive. “The dinosaur skeletons in the museums remind us,” said Gilbert Clee, Bower’s successor at McKinsey, that “great size has its dangers.” To survive the economic flux of the 1970s and ’80s, corporations would need to become leaner and more agile, outsourcing most of their routine operations and retaining a core staff of only the most adaptable employees. These employees wouldn’t have traditional jobs; instead, they would jump from project to project, problem-solving in “groups of relative strangers.” Where once the ranks of company men formed an organization’s backbone, the new professional would be “part of the organization only on an individual, ad hoc basis.”

These ideas saved money, because they allowed companies to reduce their staffs; they also glistened with the prestige of forward thinking and cultural relevance. It was easy to criticize the organization man and the large bureaucracies he inhabited: The cultural critics of the 1950s blamed postwar ennui on him, and the counterculture of the ’60s rebelled against him. By contrast, business “gurus” styled themselves as visionaries, speaking the language of “creativity” and producing a steady stream of buzzwords. Warren Bennis heralded the 1970s as an age of “organizational revitalization,” while Alvin Toffler in Future Shock forecast the coming of the “adhocracy,” a system with no set structures, in which teams could regroup as needed to take on any range of new tasks.

These futuristic visions of disruption aligned perfectly with Elmer Winter’s vision of temporary staffing. Manpower Inc., which had begun by claiming that temps were never meant to replace full-time staff, now talked openly about “using staff on an ad-hocracy basis” and proposed “hiring the very best skills possible for a particular job and then terminating the people when the work is completed.” Rivals like Kelly Services (which had dropped the “Girl” from its name as the dominion of temp work expanded) similarly proposed a “core and ring” model—a core of permanent staff, surrounded by temps who came and went with fluctuations in the business cycle.

Within a decade, this model became the norm. In Search of Excellence would become one of the most influential management books of the ’80s, with its recommendation that no company needed more than 100 employees at its headquarters. Walmart believed in “empty headquarters.” So, the authors boasted, did Intel, where “all staff assignments are temporary ones given to line officers.”

By 1988, 90 percent of all businesses in America employed temp workers. Santa Clara County, the heart of Silicon Valley, alone had 180 temp agencies. Hewlett-Packard, a company once staunchly committed to job security for its employees, retooled for flexibility, creating its own internal pool of temps called Flex Force. The steady jobs that remained were everywhere becoming fewer, and the people who held them often found themselves running faster and faster, like Wile E. Coyote, over a chasm.

What Hyman shows with striking clarity is how extensively the ideal of the steady job had been undermined before the start of the 21st century. Long before computers cut down on office work, companies had been splitting up tasks and outsourcing many of them to temps. Almost everyone was replaceable; computers just made it worse.


Similarly, it is not technological brilliance, he argues, that has enabled the rapid ascendancy of companies like Uber. The main reason so many people have turned to on-demand apps to pick up work since 2008 is that good jobs are so scarce. The alternative to driving for Uber or delivering takeout for Seamless is not a union job on an assembly line or an entry-level position at a corporate headquarters; it is other precarious work, like waiting tables or stocking shelves at a Walmart. “Uber is possible because shift work, even with a W2, is so bad,” Hyman writes, listing the indignities to which low-wage workers are subjected, from bag searches to routine drug-testing.

Hyman is less than optimistic that the turn toward temping can be reversed. He describes mostly failed efforts to fight back. Some of these are cultural—he unearths zines with titles like Temp Slave! and Processed World, which, furtively printed on office-owned Xerox machines by night, chronicled the contingent worker’s plight. But the contributors’ attempts at resistance were not particularly well coordinated; in one article, the author recounts deliberately making mistakes while doing data entry for General Electric, in an act of sabotage that probably had little effect in the grand scheme of things. Other writers praise slow workers and “time thieves,” but note that to tip the scales even modestly, workers would need institutions, like labor unions or political parties, to negotiate on their behalf.

The women of 9to5 did exactly that, forming a local with SEIU and lobbying the Massachusetts Legislature for better regulation, but they enjoyed little success. Later, programmers at Microsoft sued the company for misclassifying them as contractors when they were doing the work of employees, but the $97 million settlement they received—less than half a percent of Microsoft’s annual revenue—was, Hyman judges, a “bargain” for the company. And more recently, taxi drivers’ associations have taken Uber to court, while some drivers have launched grassroots campaigns to demand better terms from the company.

These efforts all reveal clear limitations. The protections established in the mid-20th century, Hyman concludes, have long been insufficient for an age in which so much work is now temporary and precarious. Freelancers, contractors, and gig workers need expanded labor policies and new labor organizations to establish fair practices and protections. Hyman doesn’t go into detail about what these institutions might look like. Workers could, he suggests, form digital cooperatives and start their own Uber-style platforms; they should also push for changes through the political system, though again he doesn’t say what their goals should be.

Most surprising, however, is that he sees little potential for established labor unions to help win greater stability for more workers. This seems to be an oversight at a moment when precarious millennials are enthusiastic about organized labor, and when unions are making inroads among white-collar workers as never before. It is true that unions cannot solve the whole of the problem, since under current law they cannot represent contractors. But Hyman overlooks their potential to stem some of the damage by organizing permanent employees who are currently not represented by a union. With union density at just under 7 percent in the private sector, there is plenty to be done on this front.

Hyman also overlooks the role that unions can play in educating a broad range of workers about organizing strategies and labor issues. What’s vital is that new and long-standing union members develop and strengthen forms of solidarity with nonmembers, especially with contingent workers. That might mean helping misclassified permalancers gain recognition as full-time employees, or joining temps in advocating for fair treatment. Union-backed living-wage campaigns like the Fight for $15 have already found ways to organize outside of traditional shops, while groups like the National Domestic Workers Alliance bring together isolated independent contractors. Hyman seems to dismiss unions mostly because he sees them as a direct counterpart to the lumbering, bureaucratic corporation of the postwar era, with all the same flaws; but he misses a crucial difference between them—the fact that a union was always built on the principle of solidarity, and the power of raising a collective voice.

Today’s temps, permalancers, subcontractors, and underemployed do have an advantage that their predecessors didn’t: The effects of the gig economy permeate society more thoroughly and visibly than any of the downsizing and outsourcing that came before them. There are hints of disruption and quiet reminders of insecurity anywhere you care to look. You can order almost anything—cleaning, furniture assembly, food—at the touch of a button and never have to go outside or consider the effects of Uber, TaskRabbit, Seamless, and Craigslist on the industries they’ve taken over. But at the same time, as you scroll through the apps on your phone, how can you be sure your own job won’t be chopped up and posted on Upwork?

Laura Marsh is the literary editor of The New Republic.
