
An Accelerated Grimace: On Cyber-Utopianism

Clay Shirky's Cognitive Surplus is the latest monotonous reverie about the Internet social revolution. Evgeny Morozov punctures that bubble.

Chris Lehmann

March 3, 2011

Gather round, netizens, for Clay Shirky has a story to tell. It’s a simple yet stirring saga of self-organization online, and an extension of the paean to the spontaneous formation of digital groups he delivered three years ago in his breakout book, Here Comes Everybody. But where Shirky’s earlier tract focused principally on the potential organizing power of the digital world, Cognitive Surplus asserts that the great Net social revolution has already arrived. The story goes like this: once upon a time, we used to watch a lot of television, to spend down the new leisure we acquired during the automated postwar era, and to adjust to the vaguely defined social ills associated with atomized suburban life. That was a one-way channel of passive consumption, and it was bad.

Cognitive Surplus
Creativity and Generosity in a Connected Age.
By Clay Shirky.

The Net Delusion
The Dark Side of Internet Freedom.
By Evgeny Morozov.

Now, however, we have the World Wide Web, which has leveraged our free time into an enormous potential resource. This is very, very good. With the emergence of Web 2.0–style social media (things like Facebook, Twitter and text messaging), Shirky writes, we inhabit an unprecedented social reality, “a world where public and private media blend together, where professional and amateur production blur, and where voluntary public participation has moved from nonexistent to fundamental.” This Valhalla of voluntary intellectual labor represents a stupendous crowdsourcing, or pooling, of the planet’s mental resources, hence the idea of the “cognitive surplus.” Citing one of the signature crowdsourced reference works on the Web, Shirky contends that

People who ask “Where do they find the time?” about those who work on Wikipedia don’t understand how tiny that entire project is, relative to the aggregate free time we all possess. One thing that makes the current age remarkable is that we can now treat free time as a general social asset that can be harnessed for large, communally created projects, rather than as a set of individual minutes to be whiled away one person at a time.

For Shirky, producers and consumers of digital culture are mashed up into a vast, experimental quest to test the reaches of knowledge and social utility. Does it make for a cacophony of rival monologuing voices and a rapidly expanding market for rumor, pseudo-information and unrewarded intellectual work? Yes—and so much the better! Shirky cheers—for this new Internet is not stifled by old-media publishing standards and elitist gatekeepers. Shirky asks us to consider bloggy self-publishing, which is upending the decaying one-sender-to-many-receivers model of publication: “The ability for community members to speak to one another, out loud and in public, is a huge shift,” he writes, “and one that has value even in the absence of a way to filter for quality. It has value, indeed, because there is no way to filter for quality in advance: the definition of quality becomes more variable, from one community to the next, than when there was broad consensus about mainstream writing (and music, and film, and so on).”

It’s reasonable to ask if this sort of discursive world is one any sane citizen would choose to live in. Democratic culture—indeed, cultural activity of any kind—thrives on establishing standards and drawing distinctions; they furnish the elemental terms of debate for other equally crucial distinctions in civic life, beginning with the demarcation of the public and private spheres that Shirky announces the web has transformed into a dead letter.

By contrast, to hail a cascade of unrefereed digital content as a breakthrough in creativity and critical thought is roughly akin to greeting news of a massive national egg recall by laying off the country’s food inspectors. This contradiction should be obvious in an age where the best-known persecutor of the media mainstream—excuse me, lamestream—is one Sarah Palin, who has also cannily harnessed the social media revolution to a classic one-to-many political broadcasting concern. (One might also gingerly suggest that Shirky’s own blogging output could have benefited from a healthy dose of filtration, given the sexist character of his now notorious, if forthrightly titled, blog offering “A Rant About Women.”)

The invocation, and ritual immolation, of straw-man claims gleefully culled from the venerable storehouse of old-media cliché is standard fare in digital evangelizing tracts such as Cognitive Surplus. On one level, Shirky’s new book is just the latest, monotonous installment in the sturdy tradition of exuberant web yay-saying, from the overheated ’90s boom reveries of George Gilder (Telecosm) and Jon Katz (Virtuous Reality) to the more ambitious, but no less empirically challenged, late-aughts divinations of Wired magazine digerati such as Chris Anderson (The Long Tail, Free). It’s more than a little disorienting—and not a little obscene—in a society of increasingly desperate financial distress and joblessness, to be marched one more time by a beaming missionary through the key points of the New Economy catechism, which holds that the social achievements of the web are remaking the world as we know it remorselessly for the better, abolishing all the old distinctions not merely of intellectual and cultural quality but also of social class, national identity, government regulation and the fabric of public and private life itself. Shopworn as this vision is, there’s no doubt that Shirky has continued plying it to great professional effect: he recently scored a full professorship at New York University’s Arthur L. Carter Journalism Institute, and boasts a long résumé of consulting gigs, including Nokia, News Corp., Procter & Gamble, the BBC, the US Navy and Lego.

* * *

On another level, though, Shirky’s new book is more than corporate-visionary hackwork. What’s striking is how Shirky pursues the utopian drift of the cottage industry in web apologetics to its logical conclusions—beginning with the collective time evoked in his book’s title. He estimates that this pooled global cache of time capital is a “buildup of well over a trillion hours of free time each year on the part of the world’s educated population.” Obviously, plenty of free time goes into all kinds of endeavors—from producing execrable reality television to composing crowdsourced fan fiction. To assign it all an aggregate value of potential hours of creative and generous activity is about as meaningful as computing one’s velocity on a bicycle as a fraction of the speed of light: it tells us nothing about either the public value or the opportunity cost of any given web-based activity.
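To make that analogy concrete (the cycling speed here is my own illustrative figure, not one from the book): a brisk cyclist moving at about 20 kilometers per hour covers roughly 5.6 meters per second, so the fraction in question is

\[
\frac{5.6\ \text{m/s}}{3.0 \times 10^{8}\ \text{m/s}} \approx 1.9 \times 10^{-8},
\]

a number that is perfectly computable and tells you nothing whatever about the ride.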

Why assign any special value to an hour spent online in the first place? Given the proven models of revenue on the web, it’s reasonable to assume that a good chunk of those trillion-plus online hours are devoted to gambling and downloading porn. Yes, the networked web world does produce some appreciable social goods, such as the YouTubed “It Gets Better” appeals to bullied gay teens contemplating suicide. But there’s nothing innate in the character of digital communication that favors feats of compassion and creativity; for every “It Gets Better” video that goes viral, there’s an equally robust traffic in white nationalist, birther and jihadist content online. A “cognitive surplus” has meaning only if one can assign a baseline value to all that dreary, inconvenient time we “while away” in our individual lives, and establishing that baseline is inherently a political question, one that might be better phrased as either “Surplus for what?” or “Whose surplus, white man?”

Shirky’s approach to contested public values and political organization is another example of acute web myopia. To be fair, he does recap the story of a group of activists fighting Hindu-fundamentalist attacks on women who patronize bars in the Indian city of Mangalore, who united under the banner of a Facebook group called the Association of Pub-going, Loose and Forward Women. But on the negative side of the ledger, the most baleful use of web-enabled resources he seems able to imagine is Lolcats, the signature cute-pets-with-captions of the “I Can Has Cheezburger?” franchise, which he adopts as a stand-in for “the stupidest possible creative act” perpetrated on the web, with nary a whisper about faked Obama birth certificates or the James O’Keefe YouTube videos. (O’Keefe, you may recall, produced a series of videos in which he and an associate posed as a pimp and hooker seeking legal advice at ACORN offices; using extremely selective and misleading video editing, O’Keefe made ACORN employees appear to be colluding in their scheme to evade the law.) For a man who spends his career explaining how the web works, Shirky doesn’t seem to spend much time exploring the thing.

While Shirky clearly supports the formation of the Association of Pub-going, Loose and Forward Women (it was, he notes, partially inspired by Here Comes Everybody), he cautions that such exercises in “civic intervention” are rarities, even in the hypernetworked precincts of Web 2.0. Though “it’s tempting to imagine a broad conversation about what we as a society should do with the possibilities and virtues of participation” online, Shirky claims that “such a conversation will never happen.” The reason? “If you do a Web search for ‘we as a society,’ you will find a litany of failed causes, because society isn’t the kind of unit that can have conversations, come to decisions, and take action…. It’s from groups trying new things that the most profound uses of social media have hitherto come and will come in the future.”

There you have it: the idea of public cooperation, if not social solidarity, rendered nugatory by a web search. (One can’t help wondering whether Shirky would be equally cavalier about a search using the term “We the People,” which I seem to recall has lodged a rather important model of public cooperation in the American civitas.) Shirky’s conclusion—intended to champion the dynamism of small-group models of web activism—is, in reality, redolent of Margaret Thatcher’s famous dictum, “There is no such thing as society.” The idea of society as a terminally unresponsive, nonconversant entity would certainly be news to the generations of labor and gender-equality advocates who persistently engaged the social order with demands for the ballot and the eight-hour workday. It would likewise ring strangely in the ears of the leaders of the civil rights movement, who used a concerted strategy of nonviolent protest as a means of addressing an abundance-obsessed white American public who couldn’t find the time to regard racial inequality as a pressing social concern. The explicit content of such protests, meanwhile, indicted that same white American public on the basis of the civic and political standards—or rather double standards—of equality and opportunity that fueled the nation’s chauvinist self-regard.

Skepticism, conflict and the attendant public goods that they may help to identify and redefine are mostly taboo on the Shirky-channeled web. As the subtitle of Cognitive Surplus indicates, Shirky thinks there is little about the content of our trillion-hour tidal wave of just-in-time web data that is not benign, and quite reliably melioristic. As digital media continue to spread their influence across the globe, they also come bearing the relentlessly tinkering, innovative spirit of social generosity. For techno-optimists like Shirky, most people online are donating their time to the heroic project of the digital commons, and that act alone is a sort of all-purpose social fixative—enforcing cultural standards of behavior online, rewarding crowdsourced contributions and punishing trespasses against agreed-upon digital decorum.

But what’s different and exciting about the social-minded Web 2.0, Shirky insists, is that it seems to be well on its way toward abolishing the gray, divisive conceptions of cultural authority and labor that used to reign in the now discredited age of information scarcity. He argues that behavioral economic studies such as the so-called Ultimatum Game—in which one test subject offers to split $10 with another—point to the extra-material motivations most people share beneath the surface appearance of economic self-interest. As Shirky explains, neoclassical economic theory would hold that the subject who begins the game with the $10 in hand would always proffer a nine-to-one split, because that’s the cheapest way for the divider to enlist the recipient’s support—and because the latter subject is still up a dollar in the transaction. But in test after test, the exchanges cluster in the center spread, because a 50-50 split seems more intuitively fair—even when the stakes in the experiment are raised to hundreds of dollars. The inescapable conclusion, Shirky writes, is that, contra economic theory, markets and selfish behavior correlate “in the opposite way you might expect.”

Markets support generous interactions with strangers rather than undermining them. What this means is that the less integrated market transactions are in a given society, the less generous its members will be to one another in anonymous interactions…. Exposure to market logic actually increases our willingness to transact generously with strangers, in part because that’s how markets work.
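The game’s mechanics are simple enough to spell out in a few lines of code. The sketch below is mine, not Shirky’s, and it assumes a crude fairness-threshold model of the responder: the $10 pot comes from Shirky’s account, but the 30 percent rejection cutoff is an illustrative guess, not a parameter from the experimental literature. Under that assumption, the “rational” nine-to-one split earns the proposer nothing, while fairer offers actually pay.

```python
# A minimal sketch of the Ultimatum Game dynamic Shirky cites.
# The 30 percent fairness threshold is an illustrative assumption.

def responder_accepts(offer: float, pot: float = 10.0,
                      fairness_threshold: float = 0.3) -> bool:
    """Lab subjects reject offers they read as insultingly unfair,
    even though rejection costs them the offer itself."""
    return offer >= fairness_threshold * pot

def proposer_payoff(offer: float, pot: float = 10.0) -> float:
    """The proposer keeps the remainder only if the responder accepts;
    a rejected offer leaves both players with nothing."""
    return (pot - offer) if responder_accepts(offer, pot) else 0.0

if __name__ == "__main__":
    # The $1 offer neoclassical theory predicts vs. the 50-50 split
    # experimental subjects actually converge on.
    for offer in (1.0, 3.0, 5.0):
        print(f"offer ${offer:.0f} -> proposer nets ${proposer_payoff(offer):.0f}")
```

Raising the threshold toward 0.5 reproduces the clustering around even splits that Shirky describes.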

In the manner of Malcolm Gladwell, Shirky extends this dictum across a range of didactic vignettes drawn from digital culture. He offers the standard encomiums to open-source computer-engineering collaborations such as the Linux operating system and Apache server software. But he also finds stirring samples of digitally enabled generosity in a volunteer-run charity that sprang up on a Josh Groban fan discussion board, a vast online exchange of J.K. Rowling–inspired fan fiction and PickupPal.com, a site that began life dispensing contact information for would-be Canadian carpoolers (which, wouldn’t you know, was the subject of an iron-fisted shutdown campaign from statist Ontario bus carriers).

What’s common to these parables of the information marketplace is a vision of an uncoerced social order within the reach of any suitably wired and enterprising soul inclined to donate a microsliver of that unfathomably huge surplus of time to crowdsourced tasks. This being the general drift of our social destiny, Shirky waves away the old-school leftist critique of crowdsourced content as “digital sharecropping” as so much “professional jealousy—clearly professional media makers are upset about competition from amateurs.” Such critics are also guilty of a category error, because “amateurs’ motivations differ from those of professionals.” What if the dispensers of free user-generated content “aren’t workers?” Shirky asks. “What if they really are contributors, quite specifically intending their contributions to be acts of sharing rather than production? What if their labors are labors of love?”

* * *

And if not? Consider the study that is often the touchstone of Shirky’s trippy speculations. The utility of the Ultimatum Game for a new market-enabled theory of human nature thins out considerably when one realizes that the players are bartering with unearned money. They aren’t dividing proceeds that “belong” to either player in any meaningful sense. Consult virtually any news story following up on a lottery winner’s post-windfall life—to say nothing of the well-chronicled implosion of the past decade’s market in mortgage-backed securities—and you’ll get a quick education in how playing games with other people’s money can have a deranging effect on human behavior.

In this respect, the Ultimatum Game is an all-too-apt case study to bring to bear on the digital economy—but to paraphrase Shirky, in the opposite of the way one might expect. Despite all the heady social theorizing of Shirky and the Wired set, the web has not, in fact, abolished the conventions of market value or rewritten the rules of productivity and worker reward. It has, rather, merely sent the rewards further down the fee stream to unscrupulous collectors like Chris Anderson, who plagiarized some of the content of Free, a celebration of the digital free-content revolution and its steady utopian progress toward uncompensated cultural production, from the generous crowdsourcing souls at Wikipedia. How egalitarian. It’s a sad truth that in Shirky’s idealized market order, some people’s time remains more valuable than others’, and as in that gray, old labor-based offline economy, the actual producers of content routinely get cheated, in the case of Free by the very charlatan who urges them on to ever greater feats of generosity.

As for crowdsourcing being a “labor of love” (Shirky primly reminds us that the term “amateur” “derives from the Latin amare—‘to love’”), the governing metaphor here wouldn’t seem to be digital sharecropping so much as the digital plantation. For all too transparent reasons of guilt sublimation, patrician apologists for antebellum slavery also insisted that their uncompensated workers loved their work, and likewise embraced their overseers as virtual family members. This is not, I should caution, to brand Shirky as a latter-day apologist for slavery but rather to note that it’s an exceptionally arrogant tic of privilege to tell one’s economic inferiors, online or off, what they do and do not love, and what the extra-material wellsprings of their motivation are supposed to be. To use an old-fashioned Enlightenment construct, it’s at minimum an intrusion into a digital contributor’s private life—even in the barrier-breaking world of Web 2.0 oversharing and friending. The just and proper rejoinder to any propagandist urging the virtues of uncompensated labor from an empyrean somewhere far above mere “society” is, “You try it, pal.”

There’s also the small matter of what, exactly, is being produced and exchanged in the social networks Shirky hails as the cutting edge of new-economy innovation. Services such as PickupPal and CouchSurfing—a site for tourists seeking overnight stays in the homes of natives—are mainly barter clearinghouses, enabling the informal swapping of already existing services and infrastructure support. Meanwhile, the Linux and Apache projects are the web equivalents of busmen’s holidays, places where software designers can test-drive and implement innovations that overlap with day-job or research duties for which their services are, in fact, compensated.

The one hint of possible production for exchange value in Cognitive Surplus unwittingly shows just how far this brand of web boosterism can go in studied retreat from economic reality; it involves a study by Eric von Hippel, a “scholar of user-driven innovation,” who found that a Chinese manufacturer of kites sought out a crowdsourced design from an outfit called Zeroprestige, which worked up shared kite designs using 3D software. The transaction, Shirky enthuses, meant that “the logic of outsourcing is turned on its head; it was possible only because the description of the kites, which was written in standard format for 3D software, was enough like a recipe for the manufacturer to be able to discover them online and to interpret them without help.”

That is not the logic of outsourcing “turned on its head”—it is the logic of outsourcing metastasized. Like the sort of fee-shifting exploitation of content providers that prevails in online commerce on the Anderson model—and, I should stipulate, in underpaid “content farms” operated within the orbit of my own corporate parent, Yahoo—outsourcing is a cost-cutting race to the bottom. All that’s achieved in the outreach of the Chinese kite maker is the elimination of another layer of production costs involving the successive prototypes of marketable kite designs. It’s certainly not as if those lower costs will translate into higher wages for China’s sweated, open-shop manufacturing workforce—the people who will end up making the kites in question.

But Shirky, like all true Net prophets, can’t be detained by such crude concerns. The old social contracts of labor, presumably, are like the discredited habit of “getting news from a piece of paper”—part and parcel of the “twentieth-century beliefs about who could produce and consume public messages, about who could coordinate group action and how, and about the inherent and fundamental link between intrinsic motivations and private actions,” which in his Olympian judgment turned out to be “nothing more than long-term accidents.” It’s little wonder that Shirky should show such fastidious disdain for recent history. Cognitive Surplus is already aging badly, with the WikiLeaks furor showing just how little web-based traffic in raw information, no matter how revelatory or embarrassing, has upended the lumbering agendas of the old nation-state on the global chessboard of realpolitik—a place where everything has a price, often measured in human lives. More than that, though, Shirky’s book inadvertently reminds us of the lesson we should have absorbed more fully with the 2000 collapse of the high-tech market: the utopian enthusiasms of our country’s cyber-elite exemplify not merely what the historian E.P. Thompson called “the enormous condescension of posterity” but also a dangerous species of economic and civic illiteracy.

In The Net Delusion, by Evgeny Morozov, we finally have a long-overdue market correction to cyber-utopianism, which Morozov defines as “a naïve belief in the emancipatory nature of online communication that rests on a stubborn refusal to acknowledge its downside.” Morozov, a Belarussian web activist who works with the New America Foundation, sizes up the social media web for what it is—a powerful tool for communication, which like most such tools in modern history is subject to grievous distortion and manipulation by antidemocratic regimes.

Since the remarkable popular protests that ousted Egyptian President Hosni Mubarak from power in February, Shirky’s cyber-utopian vision of crowdsourced social virtue has gone viral. US media have devoted extensive coverage to Egypt’s so-called Facebook and Twitter generation, the young anti-Mubarak activists who have been praised for using social media and cellphones to organize protesters in Tahrir Square and topple a tyrant. One activist ideally suited to this story line was 30-year-old Wael Ghonim, a Google executive detained by the Egyptian police for twelve days for acting as the anonymous administrator of a Facebook page that was facilitating the protests. Sure enough, the American media promptly adopted Ghonim as the face of Egypt’s revolt shortly after his release from detention.

Social networking mattered in Egypt, but the root causes of the uprising were scarcity, official corruption and social conflict, none of which fit the cyber-utopian narrative or flatter America’s technological vanity. The original scheduled protests of January 25 arose out of a past effort to organize an anti-Mubarak general strike, and it was the spread of the protests to the less wired workers in Egypt’s long-pinched labor economy that helped furnish the telling last blows to the Mubarak order. According to many reports from Cairo, the protests continued to gain momentum not from tweets or Facebook posts but instead from the direct spectacle of the populace congregating, We the People style, in Tahrir Square. Most Egyptians were following events on state television, which was parroting the official propaganda approved by the Mubarak regime, holding that the protests were the handiwork of foreign agitators. Not being regular blog readers, ordinary Egyptians went into the streets and saw that the state media were lying, that the protesters were their neighbors, their family members, their co-workers. The effort to coax a new political order into being grew from the power of popular witness, filtered through the evidence of citizens’ own eyes and ears.

Western cyber-utopian exuberance was disastrously projected onto the global stage during the 2009 protests over Iran’s stolen presidential election. Shirky pronounced the Twitter-aided revolt “the big one…. the first revolution that has been catapulted onto a global stage and transformed by social media.” Morozov patiently unpacks the ways that Shirky and other American Twitter champions overestimated the technology’s impact. Just over 19,000 Twitter accounts were registered in Iran before the uprising, he notes—meaning that roughly 0.027 percent of Iran’s population could have plugged into the Twitterfied protests. Many of the accounts reported on by the media belonged to sympathizers and members of the Iranian diaspora, such as the blogger oxfordgirl, who supplied indispensable updates and aggregated news roundups on the protests from her perch in the British countryside.
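The arithmetic behind that percentage is easy to check, assuming Iran’s 2009 population of roughly 70 million (a figure the review leaves unstated):

\[
\frac{19{,}000\ \text{accounts}}{7.0 \times 10^{7}\ \text{people}} \approx 2.7 \times 10^{-4} = 0.027\ \text{percent}.
\]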

Tapping into a digitally mediated experience of events in Iran felt extremely significant to Western Twitter users, so much so that an Obama administration State Department official named Jared Cohen wrote to the social media company’s executives requesting that they postpone a scheduled suspension of service for site maintenance so as to keep Iranian dissidents online at a critical juncture in the Tehran demonstrations. Leaders in rival authoritarian states didn’t need to hear anything else in order to justify their own crackdowns on social media: Twitter may not have launched the anti-Ahmadinejad rebellion, but in one fell diplomatic swoop the world’s dictators saw cause to repudiate Twitter as a tool of a meddling Obama White House. This, Morozov writes, was “globalization at its worst.”

A simple email based on the premise that Twitter mattered in Iran, sent by an American diplomat in Washington to an American company in San Francisco, triggered a worldwide Internet panic and politicized all online activity, painting it in bright revolutionary colors, and threatening to tighten online spaces and opportunities that were previously unregulated…. The pundits were right: Iran’s Twitter Revolution did have global repercussions. Those were, however, extremely ambiguous, and they often strengthened rather than undermined the authoritarian rule.

The unfortunate propensity to log on to the web and pronounce it a global revolution in the offing is what Morozov dubs “the Google Doctrine”—the overconfident conviction, inherited from the West’s cold war propaganda, that the simple transmission of information beyond the reach of state-sanctioned channels has the power to topple authoritarian regimes. But just as the Eastern bloc’s downfall had far more to do with the internal stresses besieging the dying Soviet order than with Western broadcasts beamed across the Iron Curtain, so does the Google Doctrine paper over a vast nexus of real-world causation in global affairs.

Nevertheless, the Google Doctrine remains central to American policy-making. Last year Secretary of State Hillary Clinton delivered a feverishly touted speech on the largely empty topic of “Internet freedom.” Like cold war–era pronouncements touting Western virtues for global consumption, Clinton’s broad-brush celebration of the Net’s innate democratizing thrust alternated between the vacuous—“we stand for a single Internet where all of humanity has equal access to knowledge and ideas”—and the hypocritical, with Clinton touting cyber-enabled popular revolts in Iran and Moldova while remaining conspicuously silent about a web censorship measure enacted the week before in Jordan, a vital US ally in the Middle East. As Morozov observes, “translated into policies, the very concept of Internet freedom, much like ‘the war on terror’ before it, leads to intellectual mush in the heads of its promoters and breeds excessive paranoia in the heads of their adversaries.”

Morozov’s dogged reporting on how authoritarian regimes have nimbly adapted to the Internet age underlines what an empty gesture it is to treat “Internet” and “freedom” as synonyms. Much as US policy thinkers have clung to the naïve cold war faith in data transmission as revolution by other means, they have also propped up the outmoded image of the authoritarian state as a lumbering, clueless mass bureaucracy, easily toppled or terrified into submission before a well-timed hacker attack or a heroic blog post. Instead, today’s strongmen are just as apt to be on the delivering as on the receiving end of blog outbursts and denials of service.

Tomaar, a Saudi website promoting philosophical inquiry outside the confines of Muslim orthodoxy, attracted a mass following soon after it was launched, especially as its discussion boards expanded to include questions of politics and culture in the Arab world. In short order, though, the Saudi government denied access to the site on all servers used by its citizens. When Tomaar’s webmasters devised a straightforward workaround via a third-party Internet connection, that stopped working as well—and the US-based service provider abruptly canceled the site’s contract, condemning it to a series of improvised connectivity patches. Even so, it still suffers regular denial-of-service attacks—the same tools that have been used to disable the website of Julian Assange’s WikiLeaks operation. Nothing in the battery of attacks on Tomaar points directly back to the Saudi government—another sign, in all likelihood, that authoritarian webmasters have grown as adept in covering their tracks as they are in disrupting web service. As Morozov notes, “cases like Tomaar’s are increasingly common, especially among activist and human rights organizations. Burma’s exiled media—Irrawaddy, Mizzima, and the Democratic Voice of Burma—all experienced major cyber-attacks…; ditto the Belarussian oppositional site Charter97, the Russian independent newspaper Novaya Gazeta (the one that employed the slain Russian journalist Anna Politkovskaya), the Kazakh oppositional newspaper Respublika, and even various local branches of Radio Free Europe / Radio Liberty.”

* * *

It has never been the case that authoritarians are allergic to information technologies. Quite the contrary: as pioneers in the production of mass propaganda, they love mass media, and maintain an intense interest in later-generation digital technologies, such as GPS and Twitter’s location feature, that permit them to plot the real-time whereabouts of online dissidents. Yet one never encounters these uses of digital technologies in Shirky-style broadsides on cyber-liberation; in them, digital technology by definition unleashes and pools human creativity and generosity, because that’s what we Western progenitors of these technologies like to imagine them doing.

As the Tomaar episode also shows, American Net companies—hailed in State Department speeches as the vanguard of the freedom revolution—are often fleet of foot when political controversy threatens to roil their plans for overseas market expansion. It’s not hard to see why that should be the case: their shareholders expect them to be profitable, and in many stops along the global marketplace, freedom and democratization stand directly athwart that prime directive. To take just one example, last year Facebook pulled the plug on a group maintained by an activist in Morocco named Kacem El Ghazzali, which promoted discussion about secular education in the theocratic country. When El Ghazzali e-mailed Facebook engineers in Palo Alto requesting an explanation, they deleted his profile on the site for good measure. Eventually, Facebook relented and restored the education site, once the episode got press attention in the West, but El Ghazzali was left to rebuild his Facebook profile on his own. In Egypt, as the New York Times recently reported, Facebook shut down Wael Ghonim’s page because he had violated the company’s terms of service by using a pseudonym to create a profile as one of the page’s administrators. Hence, as Morozov observes, “contrary to the expectations of many Western policymakers, Facebook is hardly ideal for promoting democracy; its own logic, driven by profits or ignorance of the increasingly global context in which it operates, is, at times, extremely antidemocratic.”

This trend, too, runs counter to common wisdom on digital globalization, which has long held that authoritarian governments can’t afford to crack down on Net freedom, because the collateral loss in trade and commerce would be prohibitive. That argument is also an extension of the classic “dollar diplomacy” that, during the cold war, was supposed to force the hands of strongmen who otherwise lacked enthusiasm for Western anticommunist initiatives. With Western companies beating a hasty retreat to the sidelines, foreign dictators can now be confident that the battle over free online expression will never be fully joined.

Meanwhile, plenty of equally unsavory nonstate actors have also adapted to the new networked web—most notoriously in the cellphone-enabled Mumbai terrorist attacks, in which jihadists used Google Maps to identify their targets. Mexican crime gangs have used Facebook to compile lists of kidnapping targets, while Indonesians can use a Craigslist-style service to arrange the sale of children’s organs. While Kenya has played host to a vital and influential site called Ushahidi, which helped modernize accurate citizen reporting of violence during the disputed 2007 elections, in that same episode ethnic leaders on both sides of the dispute used text messaging to incite violent attacks on their enemies. “The blood of innocent Kikuyus will cease to flow! We will massacre them right here in the capital,” one such message read. “In the name of justice put down the names of all the Luos and Kaleos you know from work, your property, anywhere in Nairobi, not forgetting where and how their children go to school. We will give you a number on where to text these messages.”

Morozov contends that the work of Ushahidi, while enormously valuable in certain ways, is decidedly ambiguous in others. While crowdsourcing is an indispensable good in responses to natural disasters, Ushahidi’s tracking of political violence and election monitoring inevitably involves data that, in the absence of third-party oversight, is “impossible to verify and easy to manipulate,” via false reports or rumors designed to foment panic in one camp or another. False reports are especially damaging to the documentation of human rights abuses, because just one falsified report can more or less permanently discredit an entire human rights operation. In addition, he writes, some details about such attacks should not be available online because, for example, “in many countries, there is still a significant social stigma associated with rape,” and small but telling details about an attack’s location “may reveal the victims, making their lives even more unbearable.” Ushahidi figures into Cognitive Surplus, but scrubbed of any ambiguity or unintended consequences. “Like all good stories,” Shirky chirps in a Gladwellian key, “the story of Ushahidi holds several different lessons: People want to do something to make the world a better place. They will help when they are invited to. Access to cheap, flexible tools removes many of the barriers to trying new things.”

Because Morozov is not an American web-booster, he’s especially attuned to the plank-in-the-eye hypocrisies of US Net evangelists. When Hillary Clinton was still in the Senate, she co-sponsored legislation with fellow culture scold Sam Brownback to fund government research on how Internet use could stupefy and endanger America’s youth. Such concerns never seem to arise in approving Net freedom for indiscriminate foreigners, though; as Morozov archly notes, “Chinese and Russian parents would never worry about such a thing! Or ask their governments to do something about it!” This contradiction, he adds, amounts to nothing less than Orientalism, and harms US critical thinking as much as it damages the Internet’s image abroad. “While many in the West concede that the Internet has not solved and may have only aggravated many negative aspects of political culture,” as is the case with James O’Keefe’s gotcha YouTube videos, “they are the first to proclaim that when it comes to authoritarian states, the Internet enables their citizens to see through the propaganda. Why so many of their own fellow citizens—living in a free country with no controls on freedom of expression—still believe extremely simplistic and misleading narratives when all the facts are just a Google search away is a question that Western observers should be asking more often.”

Morozov isn’t a Luddite. In his activist career in Belarus, he has witnessed the dramatic gains that democratic movements can make online. But he’s also seen—and chronicled, in this indispensable book—the many ways that the digital world mirrors the inequities, perverse outcomes and unintended consequences that have dogged human endeavors throughout nondigital history. If only we had spent the past two decades reading books like The Net Delusion instead of embracing the Clay Shirkys of the world as serious public intellectuals, we could have a far more coherent view of our new media revolution—and probably a much saner set of policy options in the bargain.

Chris Lehmann is the DC Bureau chief for The Nation and a contributing editor at The Baffler. He was formerly editor of The Baffler and The New Republic, and is the author, most recently, of The Money Cult: Capitalism, Christianity, and the Unmaking of the American Dream (Melville House, 2016).

