Argument

Kill Screen

Is the new crop of hyperrealistic military video games driving home the reality of the Afghanistan and Iraq wars, or simply exploiting them?

Earlier this year, Electronic Arts, one of the biggest video-game publishers in the world, rented out both chambers of the Edison Ballroom in midtown Manhattan for its annual exhibition of holiday titles. Electronic Arts uses the expo to gin up excitement among gaming journalists, and by the time I got there, in the early afternoon, the ballroom was filled to capacity. I passed gaggles of bloggers playing the new Need for Speed racing franchise and the latest iteration of FIFA Soccer and headed toward a dais near the back of the building, where Craig Owens, an Electronic Arts marketing director, was demonstrating a much-anticipated first-person war shooter called Medal of Honor.

Medal of Honor is not a particularly young franchise. The first installment of the game debuted on the Sony PlayStation in November 1999 with a World War II-themed plot written in part by director Steven Spielberg. The game was bloody, vivid, and chaotic -- a kind of first-person Saving Private Ryan. But it retained enough of an arcade feel -- the enemy arrived in great, unending, and totally shootable waves -- that gamers could have fun with it, too. Dozens of sequels (Pacific Assault, European Assault, and Rising Sun among them) have followed, all featuring bygone conflicts; most have sold fairly well. But the 2010 edition of Medal of Honor was something new: a "reboot," set in contemporary Afghanistan, and starring American special operations troops.

Owens, who is in his 40s, was wearing a striped button-down and a dark beard, which I later learned he grew out as part of a "beard-a-thon" to raise money for fallen soldiers. He waved a controller in my direction. "C'mon," he said. "Sit down."

I watched as he steered his avatar on foot up a steep mountain pass, toward a Taliban-held village; his fellow soldiers fanned out alongside him. Shadows flickered over the road, and someone off-screen muttered something in a language that might have been Pashto. The village was surreally calm. In a narrow alley, hard between two low-slung houses, Owens finally came face to face with two Taliban. He squeezed the trigger on his PlayStation 3 controller; on the screen, the muzzle of his assault rifle flared, and both guerrillas disappeared in soft pink puffs of blood. "This is going to be the most realistic war game yet," Owens said with a grin.

Medal of Honor, which debuted Oct. 12, largely delivers on Owens's boast -- and for some critics, therein lies the problem. Not long after the Edison Ballroom expo, Electronic Arts began receiving complaints from an array of politicians and veterans-rights groups, ranging from former U.S. Rep. Scott McInnis (R-Colo.), who questioned the moral integrity of the Electronic Arts staff, to a spokesman for the organization AMVETS, who argued that "products like this trivialize combat." Alabama's Montgomery Advertiser newspaper published an editorial urging retailers to refuse to sell Medal of Honor and consumers to refuse to buy it. "To release a video game that shows such death in such a graphic way is shameful and an insult to the families of the men and women in uniform who have died [overseas], and will die in the future," the editorialists wrote.

The game's detractors were mostly concerned with Medal of Honor's multiplayer mode, in which players could assume the role of Taliban guerrillas and shoot at U.S. troops. "I don't see how shooting soldiers based on real Americans is entertainment while people are dying every day for this country," Karen Meredith, the mother of Ken Ballard, a U.S. Army lieutenant killed in Iraq, told the San Jose Mercury News in August. "How can they say it's OK for someone to play the Taliban? You'll have people sitting at home, drinking beer, shooting at American soldiers, maybe missing, then starting over. Well, Ken didn't have a chance to start over."

In September, in response to what it stiffly described to Stars and Stripes as "well-documented reports of depictions of Taliban fighters engaging American troops," the Army and Air Force Exchange Service announced that it would move to block sales of Medal of Honor on military bases around the country. (Some of the most avid consumers of combat-oriented video games today are American soldiers who play them when they're not preoccupied with the real thing.) Electronic Arts called the ban and the criticism "disappointing," but eventually agreed to drop the Taliban from the multiplayer mode, albeit in name only: Gamers are still invited to shoot at simulacra of U.S. soldiers, and the Taliban avatars themselves are unchanged, but they're now labeled "Opposing Force" rather than "Taliban."

It was a controversy that wouldn't have occurred even five or six years ago. For most of the medium's history, video-game studios seemed reluctant to tackle contemporary conflicts, preferring instead to crank out games based in abstracted worlds and full of abstracted enemies. The baddies were aliens, zombies, heads of multinational corporations, or some unholy combination thereof -- the video-game equivalent of Hollywood's stock villains with ambiguous Eastern European accents. Until relatively recently, there were "almost no games set in current military conflicts," Tristan Donovan, the author of Replay: The History of Video Games, told me. "If they are out there, they were very minor releases that went largely unnoticed."

This had something to do with the nature of warfare in the post-Vietnam, pre-9/11 era, in which conflicts came and went before they could even make it off the game designers' drawing boards. "The first Gulf War and the Falklands War, for example, lasted only a few months, and so there was little time for game companies to commission, create, and release games about those conflicts while they were taking place," Donovan says. Games that alluded to real-world wars came out long after the fact and approached their subjects obliquely; 1992's Sega Mega Drive game Desert Strike, for example, was inspired by the first Gulf War, but its designers set it in a second, fictional war in Iraq -- it may have been prescient, but it wasn't realistic. A handful of video games released in the early 2000s that looked back at the Vietnam War, including Shellshock: Nam '67 and Vietcong, caused minor ripples in the media, but nothing close to Medal of Honor-grade controversy.

Over the last half-decade, however, the studios' hesitance to exploit contemporary conflicts has begun to melt away. Among the first successful, semirealistic military simulations were the multiplatform SOCOM franchise, and Full Spectrum Warrior, a 2004 title for the Xbox, the PlayStation, and Windows, which put players in the midst of a heated anti-terrorism campaign in the Middle East. The weapons and uniforms in both games were meticulously modeled on those actually used in contemporary war, and there was a sense that the producers were flirting with something very real -- the country depicted in Full Spectrum Warrior was an imaginary land called Zekistan, but al Qaeda, the Taliban, and convoluted tribal politics in Pakistan all played important roles. Still, the level of graphical complexity was rudimentary enough that the game remained, well, a game: The explosions were choppy and unconvincing; gunfire sounded like small chunks of hail hitting a tin roof.

It took a perfect storm for war games to reach the level where anyone could really call them objectionable. The first step was the arrival of the Sony PlayStation 3 and the Microsoft Xbox 360, two powerful consoles capable of processing highly detailed landscapes and action. The second was a studio willing to mine a current conflict for source material: Activision, a veteran publisher with a 30-year record of coming up with blockbusters, from Pitfall! to Guitar Hero.

In 2009, Activision released Call of Duty: Modern Warfare 2, a game that includes long, intense scenes in Afghanistan and the specter of a terrorist attack on Washington. Gone was the cartoonish violence of past simulations; ushered in was a world of dizzying alleyway firefights. In part due to a scene that gave players the option of gunning down innocent civilians at a Moscow airport, the game became a target for critics, who fretted that Activision had gone too far. And yet the complaints only seemed to fuel the success of Modern Warfare 2, which went on to earn almost universal critical admiration and a staggering $310 million in sales within 24 hours of hitting store shelves -- the highest-grossing entertainment launch in history. (By comparison, the film Avatar made $232 million in its first weekend at the box office.)

But the new rules were far from clear -- as independent publisher Atomic Games discovered when it came up with Six Days in Fallujah, a game modeled on Operation Phantom Fury, the 2004 U.S. military campaign to retake the city from Sunni insurgents that proved to be one of the most intense battles of the Iraq war. Six Days in Fallujah was created with the input of several soldiers who had weathered the actual siege and was touted as the height of military shooter realism. But after details of the game emerged, Konami, the company that was set to publish the title, was deluged with complaints from veterans groups. "When our loved one's 'health meter' dropped to '0,' they didn't get to 'retry' the mission," the sister of a fallen soldier said at the time. "When they took a bullet, they didn't just get to pick up a health pack and keep 'playing' ... They suffered; they cried; they died." Konami eventually pulled out of the deal to publish Six Days in Fallujah; the game has been shelved, probably for good.

With Medal of Honor, Electronic Arts seemed to be shooting for a title that would rival Call of Duty: Modern Warfare 2 in sales, while achieving the same level of ripped-from-the-headlines realism as Fallujah. The studio consulted extensively with two so-called "Tier 1" operators -- members of units like Delta Force and SEAL Team Six -- examined hours of video from Afghanistan, and, according to the New York Times, recorded "actual weapons fire at Fort Irwin in California, in a mock Iraqi village used by the military for training." (The Atlantic's Marc Ambinder reported that neither of the Tier 1 operators who worked with Electronic Arts received official permission from U.S. Special Forces command; the precise nature of their involvement in the game remains somewhat murky.)

Five days after it hit stores, the game had overcome its pre-launch controversy -- and middling reviews in the gaming press -- to sell about 1.5 million copies. It wasn't a particularly startling first week, but it was a solid one nonetheless.

What is it about realistic video games, as opposed to movies, that inevitably stirs up so much furor? The simplest explanation is that video games are active, not passive, entertainment -- you play them rather than watch them. It is one thing, the argument goes, to watch The Hurt Locker, and another to patrol a digitized street in Iraq, with a digitized rifle in your hand, and put a digitized enemy in your cross hairs. As Meredith, the mother of the lieutenant killed in Iraq, hinted, steering an avatar through a virtual Afghanistan firefight, when somewhere in the real Afghanistan real American troops are doing the exact same thing, raises no shortage of questions. Members of Gold Star Families Speak Out, a Massachusetts-based veterans organization, have repeatedly argued that war simulations inure the American public to the hazards and realities of actual war. (Or the boredom. In a 2009 article, the satirical newspaper the Onion revealed details for Modern Warfare 3, which included extended scenes of waiting around and "filling out paperwork.") 

But how much do we actually know about the psychological impact of these games? In most cases, there simply isn't enough data available for anyone to truly understand how realistic video games affect audiences. "I bet you're expecting that this is all well-thought-out because you've heard all these strong opinions or claims about video games," says Dmitri Williams, an associate professor at the University of Southern California's Annenberg School for Communication and Journalism who studies new media. "But the fact is that a lot of research doesn't account for the differences between games or for how they've changed." Many academics studying the implications of video-game realism, he argues, have only a passing familiarity with the video-game canon. "There's not a lot of conversation about the actual content of a specific game, nor of the context of the violence in that game," he says. "It's going to happen eventually," as younger researchers climb into positions of power at campuses around the country. "But it's going to take the academy a little while to catch up."

Video-game studios, in the meantime, have taken the not-unreasonable stance that most gamers can appreciate the difference between an entertainment experience and the real thing -- as have gamers, who frequently make the point that when kids play cops and robbers, someone has to play the robber. That doesn't make the kid stuck playing robber an actual criminal. (A similar argument has long been used to defend ultraviolent games, including the open-world shoot-'em-up franchise Grand Theft Auto.)

Reps at Electronic Arts, weary of the controversy, have mostly stopped talking to the media in recent weeks and declined to comment to me on the matter. (Activision did not return emailed requests for comment about the Call of Duty games, either.) Not long after the company agreed to pull the Taliban designation from Medal of Honor, Electronic Arts CEO John Riccitiello said the whole dust-up was "more about the newspaper industry than the game industry," hinting that journalists had overhyped the conflict in an effort to drum up page views. Speaking with a reporter from the blog Joystiq in October, Owens said, "The objection was -- kind of from an older generation that doesn't understand games -- that the sound-bite was, 'Play as the Taliban and kill U.S. soldiers.' ... There still is, it seems, a group that's still a little bit leery of a game taking place around an active conflict."

Both Owens and Riccitiello have repeatedly taken the position that the controversy around Medal of Honor was media-created. But to my knowledge, neither man has directly addressed critics like Meredith. Nor have they explained why Medal of Honor should take "place around an active conflict." At the Edison Ballroom expo, Owens talked a bit about the Tier 1 operators involved in the making of the game and said that Medal of Honor was the closest most players would ever come to participating in a firefight in some far-flung Afghan village. An optimist might argue that Medal of Honor can bring home a conflict in a way that a movie never could -- by actually dropping the audience into the war zone, feet first.

But if you've spent any time with a game like Medal of Honor, you probably find it exceptionally hard to see a first-person military shooter in that light. I am a great fan of these games -- I own plenty of them, and I'm sure I will buy plenty more in coming years. There is, for my money, no greater video-game thrill than dashing across a dusty battlefield with a unit of battle-hardened, AI-controlled soldiers while enemy tracers scream overhead. But I am always conscious that I am playing for the thrill, and not for any primer on the war in Afghanistan -- or what it's actually like to be a soldier, for that matter.

There is no moral nuance at play in any of the first-person military shooters on the market today, no greater cultural lesson to be learned -- there is only the opportunity to use a cool-looking machine gun to take the head off a bad-looking dude, in a beautiful-looking environment.

When I explained why I play these games to video-game designer Ian Bogost, he laughed and grunted his assent. Bogost is an associate professor at the School of Literature, Communication and Culture at Georgia Tech but also a founding partner of the innovative independent video-game studio Persuasive Games. "What does it mean to make a video game about any historical event?" Bogost said. "It's a complicated thing. You have to want to support a range of opinions." But the video-game industry, Bogost said, has never had an interest in politics. "Studios are stuck in this weird netherworld, between Silicon Valley and Hollywood," he said. "And games are stuck in that place, too. They want to be technology, and they also want to be entertainment."

I asked Bogost if he thought that video games could eventually say something meaningful about foreign conflicts. "I'm optimistic," he said, and he explained that video games offered the right storyteller an incredible range of possibilities. "Games are great at depicting systems instead of telling stories. ... And then there's role-playing: What is it like to be someone else?" he said. "That's the missed opportunity in Medal of Honor -- what does it really mean to be the Taliban? Where are they coming from? What does that feel like? Now that doesn't mean you have to endorse the opinion, but [in a video game] you can explore something from someone else's side."

Bogost paused. Medal of Honor, he added, "was never on that track, but if it had [been], it would have been interesting and powerful."


Argument

Don't Try This Abroad

Nick Kristof is wrong. Amateurs are not the future of foreign aid.

Many globally minded, can-do Americans these days have come to believe that the world's major problems have solutions, and that these solutions are within reach. This feeling often leads to frustration: Why doesn't someone just do something about these problems? Are the NGOs and foreign aid agencies lazy, incompetent, or both? Why can't we end poverty?

Last weekend, the New York Times Magazine ran a cover story about people who have taken matters into their own hands. The piece, Nicholas Kristof's "D.I.Y. Foreign Aid Revolution: The rise of the fix-the-world-on-your-own generation," offered several aren't-they-inspiring stories about Americans who have run off to save poor people in developing countries from whatever afflicts them. A woman from Oregon begins fundraising for community work in eastern Congo, and later shifts her attentions to conflict minerals. A recent high school graduate from New Jersey uses her babysitting money to start an orphanage and school in rural Nepal. You get the idea.

The stories sound lovely. I admit to feeling a little warm and fuzzy inside reading them. After all, this is what drives me to do development work: to make the world just a little better. (I study international development at New York University.) We all want to tell ourselves the story about fighting through hardship -- each of these women made personal sacrifices for their work -- to make the world a better place.

Unfortunately, such stories don't reflect reality. Spend a little time in any community in the world, and you'll see people from that community finding ways to improve it -- not outsiders. Working in eastern Uganda last summer, I found well-organized community groups who weren't waiting for any outsider's help. I worked with an NGO that conducted business and financial skills education in rural villages, and our best trainers were Ugandans from those very villages.

Yet these sorts of people -- local community members helping their neighbors and themselves -- are absent from Kristof's stories. Instead, he gives the reader an American heroine (his stories are mostly about women) who comes to save the day. Local individuals exist as needy targets of the protagonist's benevolence. If they act on their own behalf or the behalf of their community, it's only after the American has prompted them to do so. Developing country governments and domestic civil society are barely mentioned. Saundra Schimmelpfennig, who blogs at Good Intentions are Not Enough, has dubbed this the "Whites in Shining Armor" storyline: Americans and other outsiders are uniquely positioned to bring change to a community, as if we are saviors come to deliver them from poverty.

Such implicit arrogance aside, a more fundamental problem is that Kristof's narratives make development seem simple. In his stories, the hero sees a problem and fixes it. Women are suffering from war and rape in Congo? Raise some money, build some homes, and regulate conflict minerals. Lack of affordable sanitary pads keeps women from work and girls out of school? Develop a cheaper pad. Orphaned children in Nepal? Build an orphanage. He even implies that the established foreign aid organizations "look the other way" when it comes to these problems. How could they miss such obvious opportunities for improving lives?

What Kristof misses is that even seemingly obvious solutions are more complicated than they appear. Development means change, and change is always complicated -- and often political. Being an outsider supporting development in a community raises difficult questions with both moral and strategic dimensions.

Here's one critical question: How can we ensure that the work actually serves the best interests of the beneficiaries, when the funding comes from the other side of the globe? A community's needs may be too complex for foreign staff and volunteers to understand, and too nuanced for a fundraising pitch. Outsiders in the community may see homeless children and relay the need for an orphanage to donors. With a few pictures to tug at heartstrings, the money starts flowing in. However, those children might be better off living with members of their extended families, and the same resources that built the orphanage would be better spent providing support to make this happen. Unfortunately, that work requires a deeper understanding of the community and a more complicated fundraising message.

Another question that's often overlooked: What impact do outside money and volunteers have on the local economy, political structures, and culture? Adding a wealthy outside investor can skew incentives in unexpected ways. Local businesses lose market opportunities when NGOs give donated goods away, for example. Similarly, local officials face less pressure to provide public services or cultivate a sustainable tax base when donors fund schools, health care facilities, and infrastructure. And since it is outsiders -- not the government -- providing those services, citizens have no means to hold them accountable for quality. Political and economic changes can also have unintended cultural impacts. For example, an agricultural project dividing communal land into private farming plots can weaken social ties. Even programs with intended cultural impacts have unpredictable repercussions in all spheres.

The world of aid has spent the last 50 years grappling with these questions. The development industry is by no means perfect, but it has made progress and learned valuable lessons. Newcomers often ignore those lessons, and so the same mistakes are made over and over again. Kristof nods toward this fact while breezing past it. He focuses on the passion and indignation of his heroines while downplaying their technical abilities.

I have nothing against the individuals described in Kristof's article. The concerns I expressed above apply to all development organizations -- they just happen to be especially relevant to small and new ones. Admittedly, every organization starts small and new. Muhammad Yunus spent decades developing the Grameen Bank model before winning the Nobel Prize. Paul Farmer delivered health services to rural Haitian communities for years before Partners in Health became a world-renowned organization. Books have been written about entrepreneurs like Yunus and Farmer and the years they spent understanding the communities they work in, refining their work, and building their organizations.

But other initiatives fail, and sometimes they draw massive support away from worthier projects before that happens. A recent high-profile example is the PlayPump, a merry-go-round that would let village children pump clean water as they play. An initiative to install PlayPumps across Africa received millions of dollars from the U.S. government and other donors -- until the high cost of the pumps, their potential to break down, and their basic inefficiency led to a drop in support. PlayPump's backers were lured by the mirage of a quick technical fix to a seemingly simple problem. But providing clean water was harder than it looked.

Yet Kristof's headline is: Do it yourself. Bring the same attitude you would have toward re-painting the living room or installing a new faucet. After all, how hard can it be? The developing world is like your buddy's garage. Why not just pop in, figure things out, and start hammering away?

But in this field, amateurs don't just hurt themselves. A project that misunderstands the community or mismanages that crucial relationship can undermine local leaders, ultimately doing harm to the very people it was meant to help. There are also opportunity costs: every dollar spent on PlayPumps or an unnecessary orphanage is a dollar not spent on other, better interventions in the same communities. My advice is to hire a professional. And if you want to do this work yourself, become a professional.

Despite all my complaints, I think Kristof's article does some good if it convinces more people to pursue international development as a career. We all start as amateurs. The difference is whether we seek to learn more or assume that we can just start doing something, muddling through as we go. The "DIY foreign aid" concept might spur a few people to launch ill-advised ventures that eat up scarce resources and get in the way of better efforts, but it might also convince a few others to read a couple books, go to graduate school, get jobs with professional aid organizations, and spend their whole careers making a real impact.
