This article is part of The Nation’s 150th Anniversary Special Issue, featuring articles by James Baldwin, Barbara Ehrenreich, Toni Morrison, Howard Zinn and many more.
In the British Museum in London, amid the mummies and disputed marbles, there is a delicate wooden board around a foot long, inlaid with limestone and lapis lazuli. Its design hints at its purpose: twenty squares, covered in flowers and dots. One of the oldest surviving games in the world, the Royal Game of Ur seems to have been played much like modern-day backgammon, with competitors racing their pieces across the board. It comes from southern Iraq and dates to around 2600 BCE.
We know humans have played games for even longer than this: as the Dutch theorist Johan Huizinga put it in 1938, “Play is older than culture, for culture, however inadequately defined, always presupposes human society, and animals have not waited for man to teach them their playing.” He suggested that our species, Homo sapiens (the wise man), could be described with equal accuracy as Homo ludens (the playing man).
Huizinga’s work also helps us to understand why play is far from a frivolous enterprise: because it is voluntary, and not necessary to survival, how we have fun says more about our species than how we work. “Play is superfluous…it is free, is in fact freedom,” he writes. “Play is not ‘ordinary’ or ‘real’ life.” In the 1860s, just before this magazine was founded, soldiers distracted themselves from the horrors of the Civil War with pastimes such as louse-racing or ten-pin bowling with cannonballs. The Civil War Trust records that “by the last years of battle, decks of cards were hard to come by in the Southern ranks,” with Confederate soldiers reduced to taking them from Union prisoners and the bodies of the fallen. It’s not hard to imagine the effect this had on morale.
Nonetheless, Anglo-American culture has long grappled with the idea that fun can be wholesome and, in fact, necessary to happiness rather than a debauched, degenerate luxury. Perhaps that’s a hangover from the Puritans—in the seventeenth century, they were so hard on the idea of relaxation come Sunday that King James I was moved to issue a “Declaration of Sports,” which specifically permitted “leaping, vaulting, or any other such harmless recreation” on the Sabbath.
But taking games seriously, it turns out, is vital, both socially and politically: neuroscientists now acknowledge the role of imaginative play in the neural development of children, and feminist writers have long observed that women miss out on leisure time. As Rebecca Abrams’s 1997 feminist treatise The Playful Self asks: “A man has a God-given right to play football on a Sunday morning; a child cannot survive without two hours’ frenetic activity in the park. What does the woman in their life do? Make the lunch.” In 2014, Brigid Schulte’s book on work/life balance, Overwhelmed, observed that throughout history, “women’s time has been subjected to unpredictable interruptions, while men’s ability to experience blocks of unbroken time has been protected. The ‘good’ secretary and the ‘good’ wife were the ones guarding it.”
Strange as it may sound, these theoretical explorations of the concept of play provide the hidden background to 2014’s biggest story in the video-game world: Gamergate.
This months-long social-media fiesta of harassment (of women in games) and hand-wringing (over the future direction of the medium) had its roots in one fundamental fact: men used to dominate gaming, back when gaming meant big console titles that demanded hours of continuous attention. But gaming has changed. Over the last decade, there has been an explosion in “casual” games—smartphone puzzles, say, or iPad time-wasters. Meanwhile, the big console manufacturers have decided that they are close to maxing out the hard-core demographic. The next step is to capture the family market; in the words of Microsoft staffers, “to own the living room.” That means offering sports games, motion-sensitive exercise routines, and more creative titles like the blockbuster Minecraft, which appeals to everyone ages 3 to 93.
Casual games are popular with women too, perhaps precisely because they do not demand great blocks of unbroken time. They can be played while commuting, or watching the stove, or in those exhausted hours once the kids have gone to bed. Their popularity means that the gender split among video-game players is now close to even: the 2014 report by the Entertainment Software Association, the industry’s trade body, says that “women over the age of 18 represent a significantly greater portion of the game-playing population (36 percent) than boys age 18 or younger (17 percent).”
In practice, this shifting market means fewer nerd-rage simulators and macho power fantasies, and more titles with interesting roles for women and minorities—and more stories in which the primary method of interacting with others is not shooting them or running them over. Behind Gamergate’s apparent concern with “ethics in games journalism” was the fear that activists, gamers and critics were demanding an end to lazy stereotypes about race, gender and sexuality—as if having fewer games where you mow down faceless natives or bludgeon strippers to death meant banning fun itself.
Gamergate was right about one thing, though: many of the industry’s leading figures are trying to expand the medium’s appeal. In 2013, there was a spate of “dad games” like BioShock Infinite, The Last of Us and The Walking Dead, where instead of rescuing a princess from a castle or impressing a hot chick with your sniping abilities, the gamer was cast as a middle-aged man trying to protect a young girl. (Many writers speculated this was the result of game developers hitting middle age themselves—if so, look out for a spate of walker and cane simulators in about thirty years.)
The same year, Tomb Raider was rebooted—and Lara Croft got to wear trousers instead of hot pants. We now have war games that hate war (Spec Ops: The Line, This War of Mine), and games about mental health (Depression Quest), immigration (Papers, Please) and terminal illness (That Dragon, Cancer). One of my favorite games of 2014 was 80 Days, a retelling of Phileas Fogg’s journey around the world, which sought to shift the focus from Great White Men Making History to the ordinary people they meet along the way. It was written by Meg Jayanth, a woman of Indian descent living in London, who was unimpressed by the passive, objectified character of the Indian princess Aouda in Jules Verne’s original novel. She has said that her first question was: “How can I write a game which is, ostensibly, about two Victorian white guys racing around the world for a bet, that nonetheless has space for Aouda as something other than a prize for the protagonist?” (If you have $4.99 and a smartphone, you can find out how well she did.)
Inevitably, as the games become more mature, game journalism has to grow up, too. One of Gamergate’s demands was that reviews become more “objective,” meaning that games should be assessed on their technical specifications rather than criticized, as books or films are, for their ideological assumptions and messages. (“Ulysses: great font, very readable; all pages printed in correct order. A solid 7/10.”) An “objective” reviewer could then praise a game like Grand Theft Auto V for telling an interesting story—but never discuss that story’s content.
Games deserve better than that. They are both an $80-billion-per-year industry and an evolving, exciting artistic medium. They connect us to one another—despite the popular stereotype of a gamer “alone in his basement,” many of today’s blockbusters, such as Destiny and Hearthstone, are designed to be played with friends—and they also connect us to the long and winding thread of human history. If you Google “Royal Game of Ur” today, you can play the same game that entertained ancient Mesopotamians in the golden days of the Akkadian Empire. The only difference is that now you’ll play with a click of the mouse, not a throw of the dice.