“The Silicon Valley ideal,” Joy Lisi Rankin writes in A People’s History of Computing in the United States, “venerates grand men with grand ideas.” This narrative, she argues, is widely accepted as the status quo. According to an elite cabal of Bay Area billionaires and a group of sympathetic tech journalists, the birth of personal computing in the mid-1970s and the social experiences of computing in the ’90s can be neatly attributed to the ingenuity of a few brilliant men.
Rankin contends that the myth of a “digital America dependent on the work of a handful of male tech geniuses” detracts from computing’s initial democratic promise: a project made by civilians for civilians. Society’s optimism about technology has waned, but some can’t help but valorize self-made men. Tesla CEO Elon Musk is a case in point: He’s still celebrated by some as a visionary and self-made billionaire (never mind that he is said to have roamed the streets of New York with emeralds from his father’s mine in his pocket), even as his boorishness has damaged his business. Tesla’s stock price has tanked as Musk’s Twitter conniptions have led to SEC fines, and a recent, bewildering appearance on the libertarian-leaning podcast The Joe Rogan Experience left some people questioning his judgment.
Anyone who spends time in the Bay Area will begin to see how intoxicating—and damaging—the forces of tech triumphalism and boosterism can be. Last year, in San Francisco’s Mission District, where the displacement wrought by the tech boom is most acute, tenants’-rights activists blockaded tech-company shuttles with smoke bombs and a mountain of discarded, on-demand e-scooters. Still, the narrative of the plucky entrepreneur panders to our economic self-regard, as well as to preconceived notions of who drives technological change. The tech industry is hardly the first to be seduced by the myths of its own exceptionalism or of male genius. But as Rankin argues, the persistence of these myths obscures a more intriguing history of technological development in the United States.
Rankin’s book is a powerful and densely detailed account of how digital culture in the 1960s and ’70s shaped our contemporary experiences of technology as a tool for social connection. As the hat tip to Howard Zinn in the title indicates, it locates the forgotten heroes of the personal- and social-computing movement in school classrooms and the academy, while also attending to the industry’s darker side. Computing’s social and personal origins, Rankin observes, can be traced back to 1964, when the first large-scale computer time-sharing network was developed at Dartmouth.
The earliest computers were monolithic mainframes—terribly large and, at $240,000 apiece ($2 million in today’s money), out of reach for the ordinary citizen. Even “smaller” computers, such as the Librascope General Precision machine used at Dartmouth, weighed 800 pounds. A time-sharing network allowed as many as 20 terminals to be connected to a single, centralized computer system, so that those on the network could work, collaborate, and communicate with one another simultaneously. Because these computers were a shared resource, built by educators and students, they were cheaper to operate, and computing became an experience that could be physically shared with others.
Time-sharing networks also marked a profound shift in the relationship between human beings and computers. Previously, one would write a program for a mainframe by punching holes in cardboard cards, each hole representing a character or symbol. These cards were given to a computer operator, who fed them into the computer, which then spat out the results in the form of more punched cards or printouts. In contrast, “time-sharing provided a much more personal experience of computing,” Rankin writes, “connecting the individual directly with the terminal, and the terminal with the computer.” Yet it also seems that time-sharing networks were too successful for their own good. Where users were once content to share computers in social settings such as labs, classrooms, and dorms, tech firms like IBM and Apple in the late ’70s imagined a different possibility: device ownership for all. Computing’s communal character was shed in favor of something consumerist and atomized.
Rankin’s clear-eyed analysis of the skewed gender dynamics at Dartmouth makes for withering reading. (Today’s “brogrammers” aren’t without precedent.) Under the guidance of the mathematicians John Kemeny and Thomas Kurtz, computing at Dartmouth was conceived as a project for everyone. Kurtz believed that “computing access should be available to all students at no cost,” as in a public library.
Eighty percent of Dartmouth’s students and 40 percent of its faculty would use the time-sharing system. But computing was hardly the social equalizer that Kurtz and Kemeny envisioned. Dartmouth’s student body was overwhelmingly affluent, white, and male; women weren’t admitted as students until 1972. Diversity initiatives such as the ABC (“A Better Chance”) program helped to broaden Dartmouth’s demographic intake: The number of matriculating African-American students increased from just a handful in the mid-1960s to 9 percent of students by 1976. Yet the computing-center newsletter’s coverage of the ABC program, although well-intentioned, “ultimately called attention to the differences between ABC students and the white student majority,” Rankin argues. That coverage, along with a lack of diversity among the computer-center staff, contributed to a culture in which whiteness was the norm.
The frat-house machismo that prevailed on the athletic field found another home in the teletype room, where students used BASIC (an early programming language invented by Kemeny and Kurtz) to create computer art and games on the network. A random sampling included no fewer than three types of football program (Generic Football, “Dartmouth Championship” FTBALL, and GRIDIRON), as well as Battleship and simulated slot machines. The racism evident in the choice of the school team’s unofficial mascot, a caricature of a Native American brave, was visible in the code: A program created by students was given the acronym SCALP, a knowing nod to the “racialized attribution of ferocity, bravery, and savagery to Native Americans,” Rankin notes.
The culture incubated at Dartmouth rendered women invisible in the very spaces they used to dominate. In the beginning, teletype work was seen as “low-status clerical work,” Rankin observes, “firmly fixed in the realm of women’s work.” Yet the centrality of teletype technology to Dartmouth’s time-sharing network meant that those gendered associations were forcefully swept aside. Teletype work, no longer seen as “pink-collar” labor, morphed into an almost aggressively macho pursuit. The systemic eclipsing and erasure of women’s contributions to the industry were set in motion.
At Dartmouth, women worked as application programmers, computer-program coordinators, keypunch operators, and technical librarians—yet they were seen as wives and mothers above all else. Nancy Broadhead tellingly described her job at the computing center as “part-time operator/consultant and probably more appropriately housemother.” Another senior staff member, Janet Price, had a PhD and was an expert on the FORTRAN programming language, yet the campus newsletter referred to her as “Mrs. Price” while referring to her colleagues by their academic titles. Such casual sexism was a harbinger of things to come.
One of the most noteworthy developments in the shift toward social computing was the creation of the computer-assisted learning system PLATO (Programmed Logic for Automated Teaching Operations) at the University of Illinois. It perhaps most closely resembled the contemporary personal computer: Its personal terminals had individual plasma screens instead of teleprinters. First used in 1963 as an educational tool to teach nursing students how to treat heart attacks, PLATO created a simulated lab environment—in the form of an interactive quiz—for students to observe and learn from.
At the same time, the higher-education sector received a windfall from the federal government. The specter of the Cold War spawned numerous reforms that jump-started computing initiatives in the United States. In its early days, PLATO was touted as an instructional machine to attract substantial military funding. But as more institutions in the Champaign-Urbana community embraced the PLATO network and more of its users started creating their own programs, its appeal broadened significantly. One graduate student and programmer, Stuart Umpleby, recognized that PLATO wasn’t just an educational tool but a “mass communications system with feedback,” like a digital town square. Another PLATO user, Valarie Lamont, used it to create games with an activist bent. PLATO soon became a haven for civic engagement, as well as for those who wanted to play games, send messages to friends, and lurk for fun. At the same time, instances of men harassing and mocking women on the network became more prevalent. The hostility toward women ranged from mansplaining, to crank calls, to physical threats in private spaces, such as computer labs or restrooms. Abusive and obscene messages were rife among the gaming communities. These experiences are still endured by women who carve out a space and voice for themselves online. As Rankin’s analysis shows, racism and misogyny played a part in molding digital culture from its inception.
In today’s tech parlance, “community” is a slippery and overused word, mistakenly used to describe an aggregated mass of individuals rather than a group of people with shared bonds, values, and aspirations. Rankin’s chapter on the educational video game The Oregon Trail and community-driven computing in Minnesota shows that programming, combined with grassroots organizing, can create and nurture genuine physical and virtual spaces. Cooperative ventures and educational institutions mobilized local communities, ensuring that the users of these time-sharing networks weren’t merely consumers but rather active citizens in a broader social project. “Sometimes the Minnesotans improved the technology, but they always prioritized increasing access,” Rankin writes.
Technology’s true promise, she argues, is embedded in the act of using a computer rather than in owning any one device. She notes that in 1968, “public intellectuals called for computing as a public utility, comparable to electricity or water”—in other words, unsexy, essential, and subject to government regulation. Imagine how different our technological development might be if it were treated as public infrastructure, rather than as proof of industry exceptionalism.
But even the most community-minded initiatives aren’t immune to the siren song of private investment. The Minnesota Educational Computing Consortium (MECC), a key player in providing network access to schools in the state, signed a contract with Apple in 1978 to provide students with their own computers. It marked the shift from a state-subsidized digital commons to an increasingly profit-driven enterprise. By the 1990s, the transition was complete: As soon as the MECC became a revenue-generating cash cow, the government sold it off to venture capitalists. Time and time again, we’ve seen tech investors go weak at the knees at the prospect of a quick buck, which may prove far more alluring than sustained investment in public infrastructure.
You’d think the book would conclude on a despondent note, but it’s to Rankin’s credit that she ends with an optimistic call to arms, urging that more expansive and democratic histories of technology be told. “Let us overwrite the Silicon Valley mythology,” she writes, and “look beyond the narrow inspiration of our digital Founding Fathers.” Amid mounting public backlash and Senate investigations, our tech monopolies may well face some kind of reckoning—that is, if meaningful regulation is finally imposed. Taming technocapitalism won’t necessarily result in a future worth fighting for, but a civic-minded vision that prioritizes people over profits is a step in the right direction.