A mere thirty-three years ago, on January 20, 1977, Jimmy Carter inaugurated his presidency by proclaiming from the Capitol steps, "Because we are free we can never be indifferent to the fate of freedom elsewhere…. Our commitment to human rights must be absolute." Most people had never heard of "human rights." Except for Franklin Delano Roosevelt in a couple of passing references, no president had really mentioned the concept, and it never had gained much traction around the world either. Carter’s words sparked an intense debate at every level of government and society, and in political capitals across the Atlantic Ocean, about what it would entail to shape a foreign policy based on the principle of human rights.
The concept of rights, including natural rights, stretches back centuries, and "the rights of man" were a centerpiece of the age of democratic revolution. But those droits de l’homme et du citoyen meant something different from today’s "human rights." For most of modern history, rights have been part and parcel of battles over the meanings and entitlements of citizenship, and therefore have been dependent on national borders for their pursuit, achievement and protection. In the beginning, they were typically invoked by a people to found a nation-state of their own, not to police someone else’s. They were a justification for state sovereignty, not a source of appeal to some authority—like international law—outside and above it.
In the United States, rights were also invoked to defend property, not simply to defend women, blacks and workers against discrimination and second-class citizenship. The New Deal assault on laissez-faire required an unstinting re-examination of the idea of natural rights, which had been closely associated with freedom of contract since the nineteenth century and routinely defended by the Supreme Court. By the 1970s, rights as a slogan for democratic revolution seemed less pressing, and few remembered the natural rights of property and contract that the New Deal had once been forced to challenge. Carter was free to invoke the concept of rights for purposes it had never before served. (Arthur Schlesinger Jr. once called on future historians to "trace the internal discussions…that culminated in the striking words of the inaugural address." No one, however, yet knows exactly how they got there.)
It looks like Carter was an exception in another sense. He inaugurated the era of human rights in this country, but now it seems to be fading. Bill Clinton dabbled in human rights while outlining a new post–cold war foreign policy, but the Democratic politician now in the White House has spurned them. Few developments seem more surprising than the fact that Barack Obama rarely mentions human rights, especially since past enthusiasts for them like Samantha Power and Anne-Marie Slaughter have major roles in his foreign policy shop. Obama has given no major speech on the subject and has subordinated the concerns associated with human rights, such as taking absolute moral stands against abusive dictators, to a wider range of pragmatic foreign policy imperatives. As his Nobel remarks made plain, Obama is a "Christian realist" inclined to treat human sin, not human rights, as the point of departure for thinking about America’s relation to the world’s many injustices and horrors.
The rise and fall of human rights as an inspirational concept may seem shocking, but perhaps it is less so on second glance. Ever since Carter put human rights on the table, Republican presidents have found uses for them too, typically by linking them to "democracy promotion" abroad. There is no denying the powerful growth of nongovernmental organizations in the United States and around the world, a growth that began slightly before Carter’s time and has continued impressively ever since. But George W. Bush, placing himself in an almost equally longstanding tradition, invoked human rights as the battle cry for the neoconservative vision of transforming the Middle East and beyond—at the point of a gun, if necessary—perhaps sullying them beyond recuperation. Obama seems to think so. If their current abeyance is surprising, perhaps it’s because of a historical mistake: the belief that human rights were deeply ingrained in American visions of the globe in the first place.
But what about the 1940s, when FDR essentially coined the phrase "human rights" and set in motion a series of events that culminated in the United Nations–sponsored Universal Declaration of Human Rights in 1948? Beginning in the 1990s, when human rights acquired a literally millennial appeal in the public discourse of the West during outbreaks of ethnic cleansing in Southeastern Europe and beyond, it became tempting to treat 1948 as a moment of annunciation, with large political consequences. Carter, and the 1970s, were rarely mentioned. It became common to assume that, ever since their birth in a moment of postgenocidal revulsion and wisdom, human rights had become embedded slowly but steadily in humane consciousness in what amounted to a revolution of moral life. In a euphoric mood, high-profile observers like Michael Ignatieff believed that secure moral guidance, born of incontestable shock about the Holocaust, was on the verge of displacing self-interest and power as the foundation of international relations. In Samantha Power’s "A Problem From Hell": America and the Age of Genocide (2002), Raphael Lemkin, who crafted the draft resolution of the 1948 Convention on the Prevention and Punishment of the Crime of Genocide, was dusted off as a human rights sage and hero, with Carter earning attention only for failing to intervene against Pol Pot’s atrocities.
In fact, when "human rights" entered the English language in the 1940s, it happened unceremoniously, even accidentally. Human rights began as a very minor part of a hopeful alternative vision to set against Adolf Hitler’s vicious and tyrannical new order. In the heat of battle and shortly thereafter, a vision of postwar collective life in which personal freedoms would coalesce with more widely circulating promises of some sort of social democracy provided the main reason to fight the war.
It’s important to enumerate what human rights, in the 1940s, were not. Ignatieff was wrong. They were not a response to the Holocaust, and not focused on the prevention of catastrophic slaughter. Though closely associated with the better life of social democracy, only rarely did they imply a departure from the persistent framework of nation-states that would have to provide it.
Above all, human rights were not even an especially prominent idea. Unlike in later decades, they were confined to international organization, in the form of the new United Nations. They didn’t take hold in popular language and they inspired no popular movement. Whether as one way to express the principles of Western postwar societies or even as an aspiration to transcend the nation-state, the concept of human rights never percolated publicly or globally during the 1940s with the fervor it would have in the ’70s and the ’90s, including during negotiations over the Universal Declaration.
What if the 1940s were cut loose from the widespread myth that they were a dry run for the post–cold war world, in which human rights began to afford a glimpse of the rule of law above the nation-state? What if the history of human rights in the 1940s were written with later events given proper credit and a radically different set of causes for the current meaning and centrality of human rights recaptured? The central conclusion could only be that, however tempting, it is misleading to describe World War II and its aftermath as the essential source of human rights as they are now understood.
From a global perspective, the brief career of human rights in the 1940s is the story of how the Allied nations elevated language about human rights as they reneged on the earlier wartime promise—made in the 1941 Atlantic Charter—of the self-determination of peoples. Global self-determination would have spelled the end of empire, but by war’s end the Allies had come around to Winston Churchill’s clarification that this promise applied only to Hitler’s empire, not empire in general (and certainly not Churchill’s). The Atlantic Charter set the world on fire, but because similar language was dropped from the Universal Declaration, human rights fell on deaf ears. It is not hard to understand why. Human rights turned out to be a substitute for what many around the world wanted: a collective entitlement to self-determination. To the extent they noticed the rhetoric of human rights at all, the subjects of empire were not wrong to view it as a consolation prize.
But even when it comes to the Anglo-American, continental European and second-tier states where human rights had at least some minor publicity, the origins of the concept need to be treated within a narrative explaining not their annunciation but their general marginality throughout the mid- to late 1940s. In the beginning, as a vague synonym for some sort of social democracy, human rights failed to address the genuinely pressing question of which kind of social democracy to bring about. Should it be a version of welfarist capitalism or a full-blown socialism? A moral language announcing standards above politics offered little at a moment in world history of decisive political choice. By 1947–48 and the crystallization of the cold war, the West had succeeded in capturing the language of human rights for its crusade against the Soviet Union; the language’s main advocates ended up being conservatives on the European continent. Having been too vague to figure in debates about what sort of social democracy to bring about in the mid-1940s, human rights proved soon after to be just another way of arguing for one side in the cold war struggle. Never at any point were they primarily understood as breaking fundamentally with the world of states that the United Nations brought together.
In considering the origins and peripheral existence of the concept of human rights, the focus should be on the formation of the United Nations, since until not long before Carter’s declaration human rights were a project of UN machinery only, along with regionalist initiatives, and had no independent meaning. Yet the founding of the United Nations, and the forging of its Universal Declaration, actually presents a very different story line from the one that actors in the drama of human rights in the 1990s would have us believe.
Recall that FDR had to be cajoled into accepting the idea of an international organization. In the Dumbarton Oaks documents, the startling outlines of a prospective international organization for the postwar era discussed by the Allies in 1944, it was clear that the wartime rhetoric that sometimes included the new phrase "human rights" masked the agendas of great-power realism. And the campaign by various individuals and groups up to and during the epoch-making San Francisco conference on the United Nations in mid-1945 to alter this tactic failed spectacularly, despite the symbolic concession of the reintroduction of the concept of human rights into the charter written there. The victorious wartime alliance had been enshrined as the security council of the new world government, as its seat of true authority, and while some minor states and private citizens attempted to resist a UN that would simply entrench and balance the power of the war’s victors, they did not succeed.
If a heroic view of human rights is familiar, it is because of two common but untenable ways of remembering the period. The first is to overstate—often drastically—the goals and effects of the campaign against the Dumbarton Oaks settlement. The second is to isolate the path toward the Universal Declaration as a road still traveled, even if the cold war temporarily erected a barrier on it. But instead of a rousing story of how the document emerged against all odds, one needs to tell an unflattering story about why no one cared about it for decades. As an early NGO chief, Moses Moskowitz, aptly observed later, the truth is that human rights "died in the process of being born." Why they were born again for our time is therefore the true puzzle.
The United States, which had helped drive the global inflation of wartime hopes, quickly retreated from the language it had helped to introduce, leaving Western Europe alone to cultivate it. Even there—especially there—the real debate in domestic politics was about how to create social freedom within the boundaries of the state. Coming after the announcement of the Truman Doctrine in March 1947, with its call for a decisive choice between two "alternative ways of life," the passage of the Universal Declaration in December 1948 offered the mere pretense of unity at a crossroads for humanity. And already by that point, European conservatives had captured the language of human rights, placing most emphasis on the right of conscience and deploying it as a synonym for the moral community that secularism (and the Soviets) threatened, while few others learned to speak it.
In any case, "human rights" meant something different in the 1940s. Despite its new international significance, its core meaning remained as compatible with the modern state as the older tradition of the domestic rights of man had been. Both were the background principles of the nations united by them. In this sense, if in few others, "human rights" preserved a memory of the "rights of man and citizen" more than summoning a utopia of supranational governance through law. The inclusion of social and economic rights in the mid-1940s very much mattered: still relevant rights to economic security and social entitlements were prominent and, unlike now, surprisingly consensual. But they were earlier products of citizenship struggles, and have still barely affected the international order.
From another view, however, the postwar moment gave the antique idea of declaring rights an altogether new cast: neither a genuine limitation of prerogative, as in the Anglo-American tradition, nor a statement of first principles, as in the French, the Universal Declaration emerged as an afterthought to the fundamentals of world government it did nothing to affect. No one registered this fact more clearly than the lone Anglo-American international lawyer still campaigning for human rights in 1948, Hersch Lauterpacht, who denounced the Universal Declaration as a humbling defeat of the ideals it grandly proclaimed.
After the 1970s, and especially after the cold war, it became usual to regard World War II as a campaign for universal justice, with the shock of the discovery of the camps prompting unprecedented commitment to a humane international order. Rather than Moskowitz’s story of death in birth, the proclamation of human rights became one of birth after death, especially Jewish death. In the postwar moment, however, across weeks of debate around the Universal Declaration in the UN General Assembly, the genocide of the Jews went unmentioned, despite the frequent invocation of other dimensions of Nazi barbarity to justify specific items for protection, or to describe the consequences of leaving human dignity without defense.
The more recent phenomenon of Holocaust memory has also encouraged a mystified understanding of the Nuremberg trials, which in reality contributed to the ignorance of the specific plight of the Jews in the recent war rather than establishing a morally familiar tradition of responding to mass atrocity. The Allies coined the new penal concept of "crimes against humanity" in the days between Hiroshima and Nagasaki, as they struggled with how to treat the defeated enemy elites. But on the rare occasion the notion referred to the Jewish tragedy, it got short shrift at Nuremberg, at a time when the West knew little and cared less about the Holocaust, and the Soviets wanted patriotic and antifascist victims rather than Jewish ones.
The concept of human rights was not prominently invoked in the proceedings. It is not at all obvious that, at the time, Nuremberg and related legal innovations like the genocide convention were conceived as part of the same enterprise as the itemization of human rights, let alone falling under their umbrella—though they are now often inaccurately described as if they were a single, though multifaceted, achievement. Lemkin, the main force behind the genocide convention, understood his campaign to be at odds with the UN’s human rights project. In any case, Lemkin’s project was even more marginal and peripheral in the public imagination than the Universal Declaration, passed by the General Assembly the day after the passage of the genocide resolution.
If there is a pressing reason to return to the history of human rights in the 1940s, it is not because of their importance at the time. The Universal Declaration was less the annunciation of a new age than a funeral wreath laid on the grave of wartime hopes. The world looked up for a moment. Then it returned to the postwar agendas that had crystallized at the same time that the United Nations emerged. A better way to think about human rights in the 1940s is to come to grips with why they had no role to play then, compared with the ideological circumstances three decades later, when they made their true breakthrough.
During that interval, two global cold war visions separated the United States and the Soviet Union, and the European continent they were splitting between themselves. The struggle for the decolonization of empire—movements for the very self-determination that had been scuttled as human rights rose—made the cold war competition global, even if some new states strove to find an exit from its rivalry to chart their own course. Whereas the American side dropped human rights, both the Soviet Union and anticolonialist forces were more committed to collective ideals of emancipation like communism and nationalism as the path into the future. They did not cherish individual rights directly, to say nothing of their enshrinement in international law. Utopian ideals were not lacking, but human rights were not one of them.
During the 1960s crisis of superpower order, the domestic consensus in the East and West around the terms of the cold war began to fracture. Without ever dying in the East, the dream of "building socialism" lost its appeal, while in the West the anxieties of the cold war and early worries about its costs drove a new generation to depart from the postwar consensus. Yet in the ensuing explosion of dissent, it was not human rights but other utopian visions that prospered. There were calls for community at home to redeem the United States from hollow consumerism; for "socialism with a human face" in the Soviet empire; for further liberation from "neocolonialism" in the third world. At the time, there were next to no nongovernmental organizations that pursued human rights; Amnesty International, a fledgling group, remained practically unknown. From the 1940s on, the few NGOs that did include human rights on their agenda worked invisibly and bureaucratically for them within the UN’s framework, but their failure over thirty years to become prominent, let alone effective, confirmed the agonizing fruitlessness of this project. As Moskowitz observed bitterly in the early ’70s, the human rights idea had "yet to arouse the curiosity of the intellectual, to stir the imagination of the social and political reformer and to evoke the emotional response of the moralist." He was right.
But within one decade, human rights would begin to be invoked across the developed world and by many more ordinary people than ever before. Instead of implying what they had come to mean at the United Nations by the 1960s—further colonial liberation—human rights were used by new forces on the ground, like NGOs, and most often meant individual protection against the state and by some authority above it. Amnesty International became visible and, as a beacon of new ideals, won the Nobel Peace Prize in 1977—in America, Carter’s year—for its work. The popularity of its mode of advocacy forever transformed the basis for agitating for humane causes, and spawned a brand and age of internationalist citizen engagement.
At the same time, Westerners left the dream of revolution behind, both for themselves and for the third world they had once ruled, and adopted other tactics, envisioning an international law of human rights as the steward of utopian norms and the mechanism of their fulfillment. Even politicians, Carter towering over them all, started to invoke human rights as the guiding rationale of the foreign policy of states; for Americans, it was a moment of recovery from Henry Kissinger’s evil as well as the foreign policy, hatched by Democrats before Kissinger took power, that had led to the Vietnam disaster. After Amnesty won a Nobel Prize, other NGOs began to sprout: Helsinki Watch—now Human Rights Watch—emerged the next year.
Most visible of all, the public relevance of human rights skyrocketed, as measured by the simple presence of the phrase in the newspaper, ushering in the recent supremacy of the notion compared with other schemes of freedom and equality. In 1977 the New York Times featured the phrase "human rights" five times more frequently than in any prior year. The moral world had changed. "People think of history in the long term," Philip Roth says in one of his novels, "but history, in fact, is a very sudden thing." Never has this been truer than when it comes to the history of human rights.
But how to explain the recent origins of what now looks like a short-lived faith? The designation of the 1940s as the era when contemporary global commitments were born is one version of a larger mistake. The roots of contemporary human rights are not to be found where pundits and professors have longed to find them: neither in Greek philosophy nor monotheistic religion, neither in European natural law nor early modern revolutions, neither in horror against American slavery nor Hitler’s Jew-killing. The temptation to ransack the past for such "sources" says far more about our own time than about the thirty years after World War II, during which human rights were stillborn and then somehow resurrected.
Human rights came to the world in a sort of gestalt switch: a cause that had once lacked partisans suddenly attracted them in droves. While accident played a role in this transformation, as it does in all human events, what mattered most was the collapse of universalistic schemes and the construction of human rights as a persuasive "moral" alternative to them. These prior universalistic schemes promised a free way of life but led to a bloody morass, or offered emancipation from capital and empire but were now felt to be dark tragedies rather than bright hopes. They were the first candidates for replacing the failed premises of the early postwar order, but they failed too. In this atmosphere, an internationalism revolving around individual rights surged. Human rights were minimal, individual and fundamentally moral, not maximal, collective and potentially bloody.
Given its role in the 1940s, the United Nations had to be bypassed as human rights’ essential institution for them to matter. The emergence of new states through decolonization, earth-shattering in other respects for the organization, changed the meaning of the very concept of human rights but left it globally peripheral. It was, instead, only in the 1970s that a genuine social movement around human rights made its appearance, seizing the foreground by transcending government institutions, especially international ones. It, too, emphasized that human rights were a moral alternative to the blind alleys of politics.
To be sure, there were a number of catalysts for the explosion: the search for a European identity outside cold war terms; the reception of Soviet and later Eastern European dissidents by Western politicians, journalists and intellectuals; and the American liberal shift in foreign policy in new, moralized terms, after the Vietnam catastrophe. Equally significant, but more neglected, were the end of formal colonialism and a new view toward the third world. Empire was foreclosed, yet romantic hopes for decolonization were also smashed and the era of "failed states" was opening.
There is a great irony in the emergence of human rights as the last utopia when others failed. The moral claim to transcend politics that led people to ignore human rights in the 1940s proved to be the cause of the revival and survival of human rights three decades later, as "ideology" died. Not surprisingly, it was then that the phrase "human rights" became common parlance. And it is from that recent moment that human rights have come to define the hopes of the present day.
Beyond myth, the true history of human rights matters most of all so that we can confront their prospects today and in the future. A few holdouts aside, progressives have fully adopted human rights into—or even as another phrase for—their politics in the past few decades. And they are correct to do so, since many specific rights, such as principles of equality and well-being, or entitlements to work and education, are those whose content they have defended across modern history. Finally, there is no gainsaying the widespread germination and ambitious agendas of NGOs in the thirty years since human rights came to the world, most of which attempt pressing changes with the most honorable of intentions. All the same, to date human rights have transformed the terrain of idealism more than they have the world itself.
Moreover, human rights have many faces and multiple possible uses. As much as they call for social concern, they anchor property—the principle of rights having been most synonymous with this protection for most of modern history. They were put to use in the name of neoconservative "democracy promotion" and have justified liberal warfare and "intervention." They serve as the brand name for diverse schemes of global governance in which vulnerability and inequality persist. Tea Party Express chair Mark Williams recently claimed that his movement "is a Human Rights Movement (by virtue of being based on the greatest expression of Human Rights ever devised by our mortal hand—the United States Constitution)." What may matter is less the idea of human rights than its partisan interpretations and applications, which are inevitable.
If so, why persist in upholding the fiction that human rights name an inviolable consensus everyone shares? Like all universalist projects, human rights are violated every time they are interpreted and transformed into a specific program. Because they promise everything to everyone, they can end up meaning anything to anyone. Human rights have become an ideology—ours—except that, as in the 1940s, it is now difficult to see how the pretense of agreement can help when there is no consensus about how, or even whether, to change the world.
This contemporary dilemma has to be faced squarely; yet history as a celebration of origins will not offer any guidance. To be sure, Obama’s "Christian realism" is dubious too, and is no alternative to the human rights mindset of his recent Democratic predecessors. Carter and Obama have been the most assiduous presidential readers of Reinhold Niebuhr. But while Carter found in the Protestant divine the courage to indict national sin, Christian realism too often allows Americans to feel like children of light alone, facing darkness abroad rather than in themselves. Yet Obama’s initially surprising caution toward human rights remains useful: it suggests that the faith in the notion may be less deeply rooted than we thought, and not at all necessary. The real question is what to do with the progressive moral energy to which human rights have been tethered in their short career. Is the order of the day to reinvest it or to redirect it?
In his recent manifesto for a reclaimed social democracy, Ill Fares the Land, my late colleague Tony Judt stirringly calls for a revival of an unfairly scuttled domestic politics of the common good. Judt argues that if the left, after a long era of market frenzy, has lost the ability to "think the state" and to focus on the ways that "government can play an enhanced role in our lives," that’s in part because the ruse of international human rights lured it away. The antipolitics of human rights "misled a generation of young activists into believing that, conventional avenues of change being hopelessly clogged, they should forsake political organization for single-issue, non-governmental groups unsullied by compromise." They gave up on political tasks, Judt worries, for the satisfying morality of Amnesty International and other human rights groups.
Whether or not this description is correct, it does not make the retreat to the state as the forum of imagination and reform any more plausible as a next step. After all, midcentury social democracy had its own global context. And today, as Judt points out, "The democratic failure transcends national boundaries." So it is definitely not a matter of choosing the state against the globe but of deciding how to connect our utopian commitments to make both more just, each goal being the condition of the other. The question is not whether to have a language and strategy for confronting a flawed world beyond our national borders; it is which language and strategy to choose.
One thing is for sure: the lesson of the actual history of human rights is that they are not so much a timeless or ancient inheritance to preserve as a recent invention to remake—or even leave behind—if their program is to be vital and relevant in what is already a very different world from the one into which they exploded. It is up to us whether another utopia should take the place of human rights, just as they emerged on the ruins of prior dreams.