Social media companies say consumers’ loss of privacy is just the cost of doing business. But what would happen if they actually had to bargain with users on equal footing?
Ari Melber, Woodrow Hartzog and Evan Selinger

Facebook is on the defensive again. Members of the social networking site sued the company for co-opting their identities in online ads, and Facebook agreed to revise its “Statement of Rights and Responsibilities” and offer a $20 million settlement. The case has drawn less attention than the dorm disputes portrayed in The Social Network, but the impact is far wider. An underpublicized aspect of the dispute concerns the power of online contracts, and ultimately, whether users or corporations have more control over life online. Similar class action suits have been leveled against the popular photo-sharing application Instagram, and the mother of all platforms, Google. Fed up with the “contracts” these companies force on their customers, some people are finally striking back.
While a few of the particulars here are new—filtered photos or copyrightable tweets—the legal dilemma is actually very old. As social media users, our rights are established through non-negotiable, one-sided and deliberately opaque “terms of service” contracts. These documents are not designed to protect us. They are drafted by corporations, for corporations. There are few protections for the users—the lifeblood powering social media.
In return for driving the profits of social media companies, users get free software. But too often, the cost is unpredictable vulnerability: confusing, generic contracts that give companies control over your data, prose, pictures, personal information and even your freedom to simply quit a given website.
This is a classic example of form contract abuse—when a single, powerful party pushes a contract onto a disparate group of other parties.
Think of that cellphone contract you didn’t read, or the waiver you must sign to go river-rafting. Many companies use form contracts as a blanket waiver to protect against future liability, a trend discussed in The Fine Print, a book by Pulitzer Prize–winning author David Cay Johnston.
The problem, however, extends beyond the ruthless profitability imperative. As Margaret Jane Radin documents in Boilerplate: The Fine Print, Vanishing Rights, and the Rule of Law, a type of fine-print bullying has renegotiated the corporate-consumer relationship. Confusing contracts now degrade “traditional notions of consent, agreement, and contract,” she writes, by tricking people into forfeiting their “core rights,” such as the right to speak freely, control personal information or resort to courts for protection.
The impact of boilerplate is especially acute for minors. Teenagers contract with online companies without understanding the vulnerabilities created upon clicking “I agree.”
Today, the law protects these arrangements by assuming they were fairly negotiated, and thus reflect a “meeting of the minds” by equal parties. That is weird.
After all, these contracts are usually created through user confusion and one-sided demands. How can citizens even bargain with a standard, take-it-or-leave-it form?
Thanks to a doctrine called the Objective Theory of Contracts, the courts treat a contract as valid when people give the impression they accept or reject it, even if they were actually ignorant, or just confused. Bizarre as this legal logic might seem, it is deeply entrenched in the judicial system. It isn’t changing anytime soon.
So, what is the best way forward? How can consumer-friendly policy advance without waiting for courts to change their precedent?
The pushback against websites’ user policies has, to date, largely consisted of criticism, lawsuits and simply walking away. While these mechanisms can move public opinion, they don’t really address the one-sided contract bargaining problem.
Collective action is a promising route. The power of even a few million social network users, when coordinated, could push companies to offer more consumer rights as a good business decision. Given enough collective will, start-ups seeking a competitive advantage will offer consumer-friendly terms and established companies will compete over allegiance to user rights. To move the market in this direction, we need a common language that communicates clear priorities.
Imagine if social media users came together and developed a People’s Terms of Service Agreement—a common reference point and stamp of approval, like a Fair Trade label for the web, to govern the next photo-sharing app or responsible social network.
Initial Proposal for a People’s Terms of Service Contract
The very idea of a People’s Terms of Service agreement might initially seem strange. Right now, contract terms aren’t negotiable. You can’t ask Facebook to exempt you from Graph search. You could ask Instagram to waive its mandatory arbitration clause, in case you ever want to defend your rights before a jury, but they would say no. You forfeit those options just by using the services.
But why not try to collectively negotiate a contract that reflects some common consumer priorities? We think the best tack here is for interested users and consumer advocates to publicly debate their consensus priorities and draft them into a model contract.
There are two potential benefits: The result could be pressed on existing Internet companies, and it could also provide a model for new companies that want to compete for users who demand respect for their freedom, choice and privacy.
To be effective, the contract would use plain English, not legal jargon. It should be short enough so people can read it. (That’s a contrast to Facebook, which offers a contract almost as long as the US Constitution.) Beyond terminology itself, we propose five values worth considering for a model agreement: security, confidentiality, transparency, permanency and respect for intellectual property.
Permanency: The contract cannot be unilaterally altered, period. This seemingly obvious rule, which applies to most contracts, has been undermined by technology companies that attempt to reserve the right to alter their terms of service without meaningful consent from users. A People’s Terms of Service would require meaningful opt-in from users for any material changes. Users would also retain the right to have all their materials permanently deleted if they choose to leave the site, and that provision could never be altered.
Transparency: Companies promise to be transparent and provide meaningful notice to individuals regarding their collection, use, dissemination and maintenance of personal information.
Intellectual Property: Companies respect the value of an individual’s name or likeness and the copyrights of user information and work posted on the sites, taking only minimal licenses for administrative use or providing some profit-sharing for advertisements or other commercial use of a user’s name or likeness. (This is an important option for companies building a business model on monetizing user-generated content.)
Confidentiality: Companies promise not to disclose personal information to third parties, unless users meaningfully opt-in to such disclosure for each party. (As always, they would still respond to government and legal requests.) Facebook itself makes a similar promise. To ensure protection, companies also promise to contractually ensure that recipients of personal information are obligated to respect the anonymity of any transferred data sets, and to provide users with control over data portability so it’s easier to leave a given service.
Security: Companies promise to use appropriate, industry-standard security safeguards to protect all media against risks such as loss, unauthorized access or use, destruction, modification or unintended disclosure.
Many users already support some of these values, based on the public reactions to many terms-of-service contracts to date. Any articulation of potential consensus values, however, will be incomplete. We think a first draft is a useful way to start the conversation, and we hope others will criticize it and improve it.
Apart from any debate over an ideal contract, others may object that pooling consumer power just won’t work.
After all, Internet companies have far more incentive to shape and win this battle than millions of users with varying levels of interest in their privacy, free speech and intellectual property. They have built a profitable industry on a contractual baseline, the argument goes, and they’re not going to walk away from that.
The same pessimism greeted other participatory successes on the web, including Creative Commons, which used the collective power of artists and creators to better protect copyrighted works online. (The comparison also has its limits, since the default laws protecting intellectual property are stronger than the contract precedents discussed here.)
But even if companies don’t rush to adopt terms drafted by consumers, there is great value in advancing a common baseline for values and expectations in the online world. (Some advocacy groups have contributed to the conversation with their own suggestions, like the “Bill of Privacy Rights for Social Network Users” from the Electronic Frontier Foundation.)
Either way, there is an opening here that should be seized. We’re finally moving past the simplistic notion that one-sided corporate agreements are an unavoidable “cost” of using social media—as if every company’s corporate policy must be accepted as the automatic baseline. That’s not how we regulate BP, so why should our attitudes be more lax toward Google?
Sooner or later, we think enough people will wake up to the fact that our online lives are governed by form contracts and demand a little more control over their pictures, prose and digital selves. And if that group is vocal or organized, it could grow into something that Silicon Valley knows how to serve—a market.
Ari Melber is The Nation's Net movement correspondent, covering politics, law, public policy and new media, and a regular contributor to the magazine's blog. He received a Bachelor of Arts in Political Science from the University of Michigan at Ann Arbor and a J.D. from Cornell Law School, where he was an editor of the Cornell Journal of Law and Public Policy. Contact Ari: on Facebook, on Twitter, and at amelber@hotmail.com. Melber is also an attorney, a columnist for Politico and a contributing editor at techPresident, a nonpartisan website covering technology’s impact on democracy. During the 2008 general election, he traveled with the Obama Campaign on special assignment for The Washington Independent. He previously served as a Legislative Aide in the US Senate and as a national staff member of the 2004 John Kerry Presidential Campaign. As a commentator on public affairs, Melber frequently speaks on national television and radio, including appearances on NBC, CNBC, CNN, CNN Headline News, C-SPAN, MSNBC, Bloomberg News, FOX News, and NPR, on programs such as “The Today Show,” “American Morning,” “Washington Journal,” “Power Lunch,” "The Last Word with Lawrence O'Donnell," "The Joy Behar Show," “The Dylan Ratigan Show,” and “The Daily Rundown,” among others. Melber has also been a featured speaker at Harvard, Oxford, Yale, Columbia, NYU, The Center for American Progress and many other institutions. He has contributed chapters or essays to the books “America Now” (St. Martins, 2009), “At Issue: Affirmative Action” (Cengage, 2009), and “MoveOn’s 50 Ways to Love Your Country” (Inner Ocean Publishing, 2004). His reporting has been cited by a wide range of news organizations, academic journals and nonfiction books, including The Washington Post, The New York Times, ABC News, NBC News, CNN, FOX News, National Review Online, The New England Journal of Medicine and Boston University Law Review.
He is a member of the American Constitution Society, he serves on the advisory board of the Roosevelt Institute and lives in Manhattan.
Woodrow Hartzog is an assistant professor at Samford University’s Cumberland School of Law and an affiliate scholar at the Center for Internet and Society at Stanford Law School.
Evan Selinger is an associate professor of philosophy at Rochester Institute of Technology and a fellow at the Institute for Ethics and Emerging Technology.