“Debtpocalypse” is merely the latest installment in a tragic, forty-year story of the dispossession of American workers.
Steve Fraser
This article originally appeared at TomDispatch.com.
“Debtpocalypse” looms. Depending on who wins out in Washington, we’re told, we will either free fall over the fiscal cliff or take a terrifying slide to the pit at the bottom. Grim as these scenarios might seem, there is something confected about the mise-en-scène, like an un-fun Playland. After all, there is no fiscal cliff, or at least there was none—until the two parties built it.
And yet the pit exists. It goes by the name of “austerity.” However, it didn’t just appear in time for the last election season or the lame-duck session of Congress to follow. It was dug more than a generation ago, and has been getting wider and deeper ever since. Millions of people have long made it their home. “Debtpocalypse” is merely the latest installment in a tragic, forty-year story of the dispossession of American working people.
Think of it as the archeology of decline, or a tale of two worlds. As a long generation of austerity politics hollowed out the heartland, the quants and traders and financial wizards of Wall Street gobbled up ever more of the nation's resources. It was another Great Migration—instead of people, though, trillions of dollars were being sucked out of industrial America and turned into “financial instruments” and new, exotic forms of wealth. If blue-collar Americans were the particular victims here, then high finance is what consumed them. Now, it promises to consume the rest of us.
Scenes from the Museum
In the mid-1970s, Hugh Carey, then governor of New York, was already noting the hollowing out of his part of America. New York City, after all, was threatening to go bankrupt. Plenty of other cities and states across what was then known as the “Frost Belt” were in similar shape. Yankeedom, in Carey’s words, was turning into “a great national museum” where tourists could visit “the great railroad stations where the trains used to run.”
As it happened, the tourists weren’t interested. Abandoned railroad stations might be fetching in an eerie sort of way, but the rest of the museum was filled with artifacts of recent ruination that were too depressing to be entertaining. True, a century earlier, during the first Gilded Age, the upper crust used to amuse itself by taking guided tours of the urban demi-monde, thrilling to sites of exotic depravity or ethnic strangeness. They traipsed around “rag-pickers alley” on New York’s Lower East Side or the opium dens of Chinatown, or ghoulishly watched poor children salivate over toys in store window displays they could never hope to touch.
Times have changed. The preference now is to entirely remove the unsightly. Nonetheless, the national museum of industrial homicide has, city by city, decade by decade, grown more grotesque.
Camden, New Jersey, for example, had long been a robust, diversified small industrial city. By the early 1970s, however, its reform mayor Angelo Errichetti was describing it this way:
It looked like the Vietcong had bombed us to get even. The pride of Camden…was now a rat-infested skeleton of yesterday, a visible obscenity of urban decay. The years of neglect, slumlord exploitation, tenant abuse, government bungling, indecisive and short-sighted policy had transformed the city’s housing, business, and industrial stock into a ravaged, rat-infested cancer on a sick, old industrial city.
That was forty years ago and yet, today, news stories are still being written about Camden’s never-ending decline into some bottomless abyss. Consider that a measure of how long it takes to shut down a way of life.
Once upon a time, Youngstown, Ohio, was a typical smokestack city, part of the steel belt running through Pennsylvania and Ohio. As with Camden, things there started heading south in the 1970s. From 1977 to 1987, the city lost 50,000 jobs in steel and related industries. By the late 1980s, the years of Ronald Reagan’s presidency when it was “morning again in America,” it was midnight in Youngstown: foreclosures, an epidemic of business bankruptcies, and everywhere collapsing community institutions, including churches, unions, families and the municipal government itself.
Burglaries, robberies and assaults doubled after the steel plants closed. In two years, child abuse rose by 21 percent, suicides by 70 percent. One-eighth of Mahoning County went on welfare. Streets were filled with dead storefronts and the detritus of abandoned homes: scrap metal and wood shingles, shattered glass, stripped-away home siding, canning jars and rusted swing sets. Each week, 1,500 people visited the Salvation Army’s soup line.
The Wall Street Journal called Youngstown “a necropolis,” noting miles of “silent, empty steel mills” and a pervasive sense of fear and loss. Bruce Springsteen would soon memorialize that loss in “Youngstown,” a song on his album The Ghost of Tom Joad.
If you were unfortunate enough to live in the small industrial city of Mansfield, Ohio, for the last forty years, you would have witnessed in microcosm the dystopia of destruction unfolding in similar places everywhere. For a century, workshops there had made a kaleidoscope of goods: stoves, tires, steel, machinery, refrigerators and cars. Then Mansfield’s industrial base began to shrink as one plant after another shut down: Dominion Electric in 1971, Mansfield Tire and Rubber in 1978, Hoover Plastics in 1980, National Seating in 1985, Tappan Stoves in 1986, a Westinghouse plant and Ohio Brass in 1990, Wickes Lumber in 1997, Crane Plumbing in 2003, Neer Manufacturing in 2007 and Smurfit-Stone Container in 2009. In 2010, General Motors closed its largest, most modern US stamping factory, and thanks to the Great Recession, Con-way Freight, Value City and Card Camera also shut down.
“Good times” or bad, it didn’t matter. Mansfield shrank relentlessly, becoming the urban equivalent of skin and bones. Its poverty rate is now 28 percent, its median income $11,000 below the national average of $41,994. What manufacturing remains is non-union, and $10 an hour is considered a good wage.
Midway through this industrial auto-da-fé, a journalist watching the Campbell Works of Youngstown Sheet and Tube go dark mused that “the dead steel mills stand as pathetic mausoleums to the decline of American industrial might that was once the envy of the world.” This dismal record is particularly impressive because it encompasses the “boom times” presided over by Presidents Reagan and Clinton.
The “Pit” Deepens
In 1988, in the iciest part of the Frost Belt, a Wall Street Journal reporter noted, “There are two Americas now, and they grow further apart each day.” He was referring to Eastport, Maine. Although the deepest port on the East Coast, it hosted few ships, abandoned sardine factories lined its shore, and its bars were filled with the under- and unemployed. The reporter pointed out that he had seen similar scenes from a collapsing rural economy “coast to coast, border to border”: shuttered saw mills, abandoned mines, closed schools, rutted roads, ghost airports.
Closing up, shutting down, going out of business: last one to leave please turn out the lights!
Such was the case in cities and towns around the country. Essential public services—garbage collection, policing, fire protection, schools, street maintenance, healthcare—were atrophying. So were the people who lived in those places. High blood pressure, cardiac and digestive problems, and mortality rates were generally rising, as were doubt, self-blame, guilt, anxiety and depression. The drying up of social supports, even among those who once had been friends and workmates, haunted the inhabitants of these places as much as the industrial skeletons around them.
In the 1980s, when Jack Welch, soon to be known as “Neutron Jack” for his ruthlessness, became CEO of General Electric, he set out to raise the company’s stock price by gutting the workforce. It only took him six years, but imagine what it was like in Schenectady, New York, which lost 22,000 jobs; Louisville, Kentucky, where 13,000 fewer people made appliances; Evendale, Ohio, where 12,000 no longer made lights and light fixtures; Pittsfield, Massachusetts, where 8,000 plastics makers lost their jobs; and Erie, Pennsylvania, where 6,000 locomotive workers got pink slips.
Life as it had been lived in GE’s or other one-company towns ground to a halt. Two traveling observers, Dale Maharidge and Michael Williamson, making their way through the wasteland of middle America in 1984, spoke of “medieval cities of rusting iron” and a largely invisible landscape filling up with an army of transients, moving from place to place at any hint of work. They were camped out under bridges, riding freight cars, living in makeshift tents in fetid swamps, often armed, trusting no one, selling their blood, eating out of dumpsters.
Nor was the calamity limited to the northern Rust Belt; the South and Southwest proved no more immune to this wasting disease. Empty textile mills, often originally runaways from the North, dotted the Carolinas, Georgia and elsewhere. Half the jobs lost due to plant closings or relocations occurred in the Sunbelt.
In 2008, in the Sunbelt town of Colorado Springs, Colorado, one-third of the city’s street lights were extinguished, police helicopters were sold, watering and fertilizing in the parks was eliminated from the budget, and surrounding suburbs closed down the public bus system. During the recent Great Recession one-industry towns like Dalton, Georgia (“the carpet capital of the world”), or Blakely, Georgia (“the peanut capital of the world”), or Elkhart, Indiana (“the RV capital of the world”), were closing libraries, firing police chiefs and taking other desperate measures to survive.
And no one can forget Detroit. Once, it had been a world-class city, the country’s fourth largest, full of architectural gems. In the 1950s, Detroit had a population with the highest median income and highest rate of home ownership in urban America. Now, the “motor city” haunts the national imagination as a ghost town. Home to 2 million a quarter-century ago, its decrepit hulk is now “home” to 900,000. Between 2000 and 2010 alone, the population hemorrhaged by 25 percent, nearly a quarter of a million people, almost as many as live in post-Katrina New Orleans. There and in other core industrial centers like Baltimore, “death zones” have emerged where whole neighborhoods verge on medical collapse.
One-third of Detroit, an area the size of San Francisco, is now little more than empty houses, empty factories and fields gone feral. A whole industry of demolition, waste-disposal and scrap-metal companies arose to tear down what once had been. With a jobless rate of 29 percent, some of its citizens are so poor they can’t pay for funerals, so bodies pile up at mortuaries. Plans are even afoot to let the grasslands and forests take over, or to give the city to private enterprise.
Even the public zoo has been privatized. With staff and animals reduced to the barest of minimums and living wages endangered by its new owner, an associate curator working with elephants and rhinos went in search of another job. He found it with the city—chasing down feral dogs whose population had skyrocketed as the cityscape returned to wilderness. History had, it seemed, abandoned dogs along with their human compatriots.
Looking Backward
But could this just be the familiar story of capitalism’s penchant for “creative destruction”? The usual tale of old ways disappearing, sometimes painfully, while new wonders of progress appear in their place?
Imagine for a moment the time traveler from Looking Backward, Edward Bellamy’s best-selling utopian novel of 1888, waking up in present-day America. Instead of the prosperous land filled with technological wonders and egalitarian harmony Bellamy envisioned, his protagonist would find an unnervingly familiar world of decaying cities, people growing ever poorer and sicker, bridges and roads crumbling, sweatshops a commonplace, the largest prison population on the planet, workers afraid to stand up to their bosses, schools failing, debts growing more onerous and inequalities starker than ever.
A recent grim statistic suggests just how thoroughly Bellamy’s utopian hopes have given way to an increasingly dystopian reality. For the first time in American history, the life expectancy of white people, men and women, has actually dropped. Life spans for the least educated, in particular, have fallen by about four years since 1990. The steepest decline: white women lacking a high school diploma. They, on average, lost five years of life, while white men lacking a diploma lost three years.
Unprecedented for the United States, these numbers come close to the catastrophic decline Russian men experienced in the desperate years following the collapse of the Soviet Union. Similarly, between 1985 and 2010, American women fell from fourteenth to forty-first place in the United Nations’ ranking of international life expectancy. (Among developed countries, American women now rank last.) Whatever combination of factors produced this social statistic, it may be the rawest measure of a society in the throes of economic anorexia.
One other marker of this eerie story of a developed nation undergoing underdevelopment, and a striking reproach to a cherished national faith: for the first time since the Great Depression, the social mobility of Americans is moving in reverse. In every decade from the 1970s on, fewer people have been able to move up the income ladder than in the previous ten years. Now Americans in their thirties earn 12 percent less on average than their parents’ generation did at the same age. Danes, Norwegians, Finns, Canadians, Swedes, Germans and the French now all enjoy higher rates of upward mobility than Americans. Remarkably, 42 percent of American men raised in the bottom one-fifth income cohort remain there for life, as compared to 25 percent in Denmark and 30 percent in notoriously class-stratified Great Britain.
Eating Our Own
Laments about “the vanishing middle class” have become commonplace, and little wonder. Except for those in the top 10 percent of the income pyramid, everyone is on the down escalator. The United States now has the highest percentage of low-wage workers—those who earn less than two-thirds of the median wage—of any developed nation. George Carlin once mordantly quipped, “It’s called the American Dream because you have to be asleep to believe it.” Now, that joke has become our waking reality.
During the “long nineteenth century,” wealth and poverty existed side by side. So they do again. In the first instance, when industrial capitalism was being born, it came of age by ingesting whatever was of value embedded in pre-capitalist forms of life and labor, including land, animals, human muscle power, tools and talents, know-how and the ways of organizing and distributing what got produced. Wealth accumulated in the new economy by extinguishing wealth in the older ones.
“Progress” was the result of this economic metabolism. Whatever its stark human and ecological costs, its achievements were also highly visible. America’s capacity to sustain a larger and larger population at rising levels of material well-being, education and health was its global boast for a century and a half.
Shocking statistics about life expectancy and social mobility suggest that those days are over. Wealth, great piles of it, is still being generated, and sometimes displayed so ostentatiously that no one could miss it. Technological marvels still amaze. Prosperity exists, though for an ever-shrinking cast of characters. But a new economic metabolism is visibly at work.
For the last forty years, prosperity, wealth, and “progress” have rested, at least in part, on a grotesque process of auto-cannibalism—it has also been called “dis-accumulation” by David Harvey—of a society that is devouring its own.
Traditional forms of primitive accumulation still exist abroad. Hundreds of millions of former peasants, fishermen, craftspeople, scavengers, herdsmen, tradesmen, ranchers and peddlers provide the labor power and cheap products that buoy the bottom lines of global manufacturing and retail corporations, as well as banks and agribusinesses. But here in “the homeland,” the very profitability and prosperity of privileged sectors of the economy, especially the bloated financial arena, continue to depend on slicing, dicing and stripping away what was built up over generations.
Once again a new world has been born. This time, it depends on liquidating the assets of the old one or shipping them abroad to reward speculation in “fictitious capital.” Rates of US investment in new plants, technology, and research and development began declining during the 1970s, a fall-off that only accelerated in the gilded 1980s. Manufacturing, which accounted for nearly 30 percent of the economy after the Second World War, had dropped to just over 10 percent by 2011. Since the turn of the millennium alone, 3.5 million more manufacturing jobs have vanished and 42,000 manufacturing plants have been shuttered.
Nor are we simply witnessing the passing away of relics of the nineteenth century. Today, only one American company is among the top ten in the solar power industry and the United States accounts for a mere 5.6 percent of world production of photovoltaic cells. Only GE is among the top ten companies in wind energy. In 2007, a mere 8 percent of all new semiconductor plants under construction globally were located in the United States. Of the 1.2 billion cell phones sold in 2009, none were made in the United States. The share of semiconductors, steel, cars and machine tools made in America has declined precipitously just in the last decade. Much high-end engineering design and R&D work has been offshored. Now, there are more people dealing cards in casinos than running lathes, and almost three times as many security guards as machinists.
The FIRE Next Time
Meanwhile, for more than a quarter of a century the fastest growing part of the economy has been the finance, insurance and real estate (FIRE) sector. Between 1980 and 2005, profits in the financial sector increased by 800 percent, more than three times the growth in non-financial sectors.
In those years, new creations of financial ingenuity, rare or never seen before, bred like rabbits. In the early 1990s, for example, there were a couple of hundred hedge funds; by 2007, 10,000 of them. A whole new species of mortgage broker roamed the land, supplanting old-style savings and loan or regional banks. Fifty thousand mortgage brokerages employed 400,000 brokers, more than the whole US textile industry. A hedge fund manager put it bluntly, “The money that’s made from manufacturing stuff is a pittance in comparison to the amount of money made from shuffling money around.”
For too long, these two phenomena—the eviscerating of industry and the supersizing of high finance—have been treated as if they had nothing much to do with each other, but were simply occurring coincidentally.
Here, instead, is the fable we’ve been offered: Sad as it might be for some workers, towns, cities, and regions, the end of industry is the unfortunate, yet necessary, prelude to a happier future pioneered by “financial engineers.” Equipped with the mathematical and technological know-how that can turn money into more money (while bypassing the messiness of producing anything), they are our new wizards of prosperity!
Unfortunately, this uplifting tale rests on a categorical misapprehension. The ascendancy of high finance didn’t just replace an industrial heartland in the process of being gutted; it initiated that gutting and then lived off it, particularly during its formative decades. The FIRE sector, that is, not only supplanted industry, but grew at its expense—and at the expense of the high wages it used to pay and the capital that used to flow into it.
Think back to the days of junk bonds, leveraged buy-outs, megamergers and acquisitions, and asset stripping in the 1980s and 1990s. (Think, in fact, of Bain Capital.) What was getting bought and stripped and closed up supported windfall profits in high-interest-paying junk bonds. The stupendous fees and commissions that went to those “engineering” such transactions were being picked from the carcass of a century and a half of American productive capacity. The hollowing out of the United States was well under way long before anyone dreamed up the “fiscal cliff.”
For some long time now, our political economy has been driven by investment banks, hedge funds, private equity firms, real estate developers, insurance goliaths and a whole menagerie of ancillary enterprises that service them. But high times in FIRE land have depended on the downward mobility of working people and the poor, cut adrift from more secure industrial havens and increasingly from the lifelines of public support. They have been living instead in the “pit of austerity.” Soon many more of us will join them.
Steve Fraser is the author of the just-published Mongrel Firebugs and Men of Property: Capitalism and Class Conflict in American History. He is a co-founder and co-editor of the American Empire Project.