In Box

War Games: A Short History

How ancient Greek amusements became an indispensable 21st-century military tool.

Ever since the first warrior picked up a wooden stick in imitation of a sword, the line between war and entertainment has been decidedly blurry. Military training in ancient Greece and chivalric Europe gave rise to the Olympics and medieval jousting tournaments; paintball guns and video games have become tools for honing the skills of today's soldiers. The realm of strategy, however, is where games have exerted the most remarkable impact on the conduct of war, serving as a tool for, as one U.S. Army general put it, "writing history in advance."

5th century B.C.
The ancient Greeks begin playing petteia, among the first board games modeled on war.

6th century A.D.
Chess is invented in northern India, spreads to Persia and then Europe, and by the late 15th century evolves into its modern form. Its original Sanskrit name, chaturanga ("four divisions"), refers to the four arms of the Gupta Empire's military: infantry, cavalry, elephants, and chariots.

15th century
Firearms, invented centuries earlier in China, spread to armies throughout Asia, Europe, and the Middle East. The new weapons mean battles can no longer be accurately simulated without killing people, forcing strategists to look to more abstract means of preparing for war.

1650
Chess enthusiasts in what is now Germany begin developing increasingly elaborate battlefield strategy games based on the original. By the late 18th century, military leaders take notice.

1811
Prussian army advisor Leopold von Reisswitz and his son Georg, an army lieutenant, begin developing an elaborate tabletop battle simulation in which two teams face off across a scale map, using dice to simulate the vagaries of war. Thirteen years later, Georg presents King Friedrich Wilhelm III with a refined version of the game, along with a published manual, Instructions for the Representation of Tactical Maneuvers under the Guise of a Wargame. The king is enthralled, and kriegsspiel, the grandfather of all modern military war games, is born.

1870
Prussia's decisive victories in the Franco-Prussian War bring international renown to the king's army and its training techniques, including the now widely imitated kriegsspiel. Militaries begin using war games to predict how future conflicts might unfold.

1887
The first American war games, modeled on kriegsspiel, are held at the Naval War College in Newport, R.I. Theodore Roosevelt becomes an avid spectator during his later tenure as assistant secretary of the Navy.

1918-1941
Governmental interest in war games peaks, notably in Germany (where actual military exercises are restricted by the Treaty of Versailles), the United States (whose Navy conducts several hundred games, most of them focused on the Pacific, between the wars), and Japan.

1927
Fourteen years before Japanese planes descend upon the U.S. naval base at Pearl Harbor, Hawaii, officers in the Imperial Navy under the leadership of Lt. Cmdr. Sokichi Takagi play out the scenario in a war game -- and find it ending badly, with the base barely damaged and U.S. forces quickly retaliating against Tokyo. Officers redo the exercise repeatedly until they arrive at the battle plan used in 1941.

1940
Three months after the invasion of Poland, Hitler's chief of the Army General Staff, Franz Halder, oversees four months of war games to plan Nazi Germany's May 1940 conquest of Belgium, France, Luxembourg, and the Netherlands. The games correctly anticipate the Allies' first response: rushing troops north into Belgium.

1950s
American strategic intellectuals like Herbert Goldhamer, Andrew Marshall, and Herman Kahn explore the implications of a nuclear apocalypse in elaborate games simulating not just military conflict but the geopolitics of the Cold War.

1958
The future of war gaming blinks to life with the Navy Electronic Warfare Simulator, a $7 million computer system that takes up three floors of a building on the Naval War College campus.

1962
Students at the Massachusetts Institute of Technology create Spacewar!, the first shooting-oriented video game.

1964
With U.S. military advisors on the ground in Vietnam, top Lyndon B. Johnson administration officials, including McGeorge Bundy and Cyrus Vance, play a pair of political-military war games, Sigma I and Sigma II, testing U.S. involvement. The games end in not just a military quagmire but also serious fallout in U.S. domestic politics.

1981
The U.S. Army opens the National Training Center, a 1,000-square-mile state-of-the-art combat-simulation facility in the Mojave Desert. It is nicknamed "Fort Atari" on account of its embrace of a new technology that first appeared as a Star Trek-themed toy: laser tag.

1983
The film WarGames, starring Matthew Broderick as a teenage hacker, brings the nuclear war gaming of Kahn, Marshall, and others to the big screen. Marshall later hires one of the movie's writers, Peter Schwartz, to do the real thing.

1995
U.S. Marines at Quantico hack the popular video game Doom II to create Marine Doom, an urban combat simulator.

2002
The summer before the Iraq invasion, a $250 million, three-week game is used to test the U.S. military's readiness for a confrontation with a major Middle Eastern country. Retired Marine Lt. Gen. Paul K. Van Riper, playing a wily Saddam-like dictator, quickly brings the U.S. military to its knees with tactics that presage the Iraq insurgency. Mortified Pentagon leaders suspend the game, then restart it under scripted constraints; Van Riper quits in protest.

March 2003
"The enemy we're fighting against is different from the one we war-gamed against," Lt. Gen. William S. Wallace, U.S. Army corps commander in Iraq, remarks as the unseating of Saddam Hussein takes longer than anticipated.

2008
The Pentagon begins developing a $130 million "scale model" of the Internet to conduct the first full-fledged cyberwar games to prepare for the next frontier of conflict. Meanwhile, U.S. Defense Secretary Robert Gates criticizes the Pentagon's "tendency toward what might be called next-war-itis," calling on military leaders to spend less time predicting the wars of the future and more on the wars at hand. Two years later, the Army decides to scale back its big spring war game for the first time in 15 years.

March 2009
In the wake of the 2008 financial collapse, the Pentagon convenes a group of hedge-fund managers, bank executives, and academics for a first-of-its-kind economic war game, designed to test the ability of other countries to wield the global economy as a weapon. The big winner? Unsurprisingly, China.

October 2010
Electronic Arts' Medal of Honor, a hyperrealistic first-person shooter video game set in the Afghanistan war, debuts to much fanfare. "We are probably in some ways back to the period before 1500, when war games were extremely popular," says military historian Martin van Creveld.

Thanks to U.S. Army Col. John F. Antal (ret.), military historian Martin van Creveld, and U.S. Army Col. Richard Sinnreich (ret.).

In Box

Millions May Die ... Or Not.

How disaster hype became a big global business.

On Sept. 28, 2001, with the U.S. invasion of Afghanistan imminent, Jeremy Hobbs, the executive director of Oxfam Community Aid Abroad, issued an urgent call for donations to the group's Afghan Refugee Crisis Appeal. "Up to 5 million innocent people face starvation and death in Afghanistan, Pakistan, and Iran," he insisted. "We must act now to prevent what could possibly be the worst humanitarian catastrophe since World War II."

Really? Worse than Biafra in 1967, worse than the Cambodian refugee crisis, worse than the Ethiopian famine of 1984-1985, and worse even than Somalia in 1991? How could a senior official at one of the world's most experienced and respected private relief agencies suggest with such confidence that the crisis in Afghanistan was likely to outstrip these tragedies?

To be fair, Hobbs left himself a grammatical out. "Up to" 5 million people might die, he said; the crisis "could possibly" be the worst humanitarian catastrophe since World War II. But this was the moral equivalent of the fine print in the contract that comes with a bank's credit card. Oxfam was not just advancing a possibility; it was issuing a warning about an event that might very well occur if an emergency response was not mounted immediately -- and it was staking its credibility on this assessment.

And it's not just Oxfam that's given to this kind of exaggeration. The world's emergency relief organizations, from other major NGOs like Care International and World Vision to the U.N.'s specialized agencies like the World Food Program and the Office of the U.N. High Commissioner for Refugees (UNHCR), are always warning the public about the never-more-dire plight of war refugees, famine victims, and the latest unfortunate souls imperiled by nature's wrath. They should count themselves lucky that we have such short memories. If people actually remembered just how often their claims have proved to be overblown, contributions would almost certainly fall off dramatically. A quick search for the "world's worst humanitarian crisis" brings up a trove of competing claims: Darfur, Congo, Pakistan, Somalia. And the list goes on. Relief agencies are constantly insisting that what is about to take place in Afghanistan or Burma, Haiti or Rwanda, is nothing short of apocalyptic, only for it to turn out that these predictions of disaster are wildly exaggerated, when not simply unfounded.

Sadly, over the course of the past few decades, exaggeration seems to have become the rule in the world of humanitarian relief. The Indian Ocean tsunami of December 2004, which is generally believed to have killed almost a quarter of a million people in 14 countries, is a stark example. In the immediate aftermath, NGOs and U.N. agencies were predicting that without massive aid, the death toll would double because of hunger, lack of clean water, and the spread of infectious disease. Their appeals were extraordinarily successful, raising more than $14 billion from governments, corporations, and a remarkably large number of private donors. And yet, there was little basis for such anxiety: The general rule in natural disasters such as tsunamis and earthquakes is that most fatalities occur in the first 24 hours. The mismatch between the vast sums of money raised globally for tsunami relief and the real needs on the ground was so extreme that Doctors Without Borders soon began returning contributions, while Oxfam diverted funds to other crises. But this did not stop the U.N. from taking credit -- on what basis, no one could quite say -- for having prevented a second wave of deaths.

The culture of shameless embellishment never seems to dissipate for long. Here is Elisabeth Byrs, the spokeswoman for the U.N.'s Office for the Coordination of Humanitarian Affairs, speaking in the immediate aftermath of the earthquake that devastated Port-au-Prince, Haiti, on Jan. 12, 2010: "This is a historic disaster," she said. "We have never been confronted with such a disaster in the U.N. memory. It is like no other." Let's be clear: This is not the compassionate rhetoric of solidarity, but advertising hype. It's bigger, sadder, worse! The fact that those who dispense such misinformation mean well does not lessen the distortion.

In war zones, the inflation of anxieties that is the handmaiden of humanitarian work has become almost as extreme. In the two years after the 1994 Rwandan genocide, when more than a million Rwandan Hutu refugees huddled in camps across the border, it was common to hear UNHCR officials insisting that forcing the refugees to return to Rwanda would leave tens of thousands dead on the march home or killed by the forces of the Tutsi-led Rwandan Patriotic Front (RPF). In 1996, though, the RPF did shut down the camps, and there were virtually no casualties among those who returned home. There were indeed terrible massacres in the forests of eastern Congo, but the victims were those who refused to return and instead tried to flee west.

The charitable interpretation of all this is that relief groups are just indulging in hyperbole, which the Oxford English Dictionary defines as "deliberate exaggeration, not meant to be taken literally." But this is hardly likely, precisely because most of the people who partake in this apocalypse-mongering know better. Clearly, the relief folks have a motive here. If they do not exaggerate, private donations and government and U.N. grants to their organizations will dry up.

And the tragic thing is that they're probably right. Doctors Without Borders puts out an annual list of the 10 most forgotten crises, largely measured by the lack of airtime and column inches devoted to these humanitarian emergencies in the international media. All relief agencies know that, where disasters are concerned, not only the media but the public as a whole practices a species of serial monogamy, focusing on one crisis to the exclusion of all others until what is sometimes called "compassion fatigue" sets in. Then, attention shifts to the next emergency.

These days, only the most extreme, most apocalyptic situations are likely to move donors in the rich world -- that is, the donors who count the most (some 90 percent of all the funding for humanitarian work still comes from the OECD countries). With donor fatigue an ever-present possibility, admitting that they might not actually know how many people have been killed or made homeless -- or acknowledging that original estimates may have been overstated -- is thought to undermine the cause of rescuing people. That's why a recent report prepared for the U.S. Agency for International Development by Haiti expert Timothy Schwartz, suggesting that the official death toll of 316,000 in the 2010 earthquake overstated matters by roughly 500 percent, caused such consternation both in Port-au-Prince and in Washington -- as if accurate figures somehow provided a license not to care.

At a time when foreign aid is even more unpopular than usual in Washington, these anxieties are understandable. But hyperbole is not just a morally questionable strategy; it's also unsustainable in practice. By continually upping the rhetorical ante, relief agencies, whatever their intentions, are sowing the seeds of future cynicism, raising the bar of compassion to the point where any disaster whose death toll cannot be counted in the hundreds of thousands, or described as the worst since World War II or as being of biblical proportions, is almost certain to seem not all that bad by comparison.

It is probably true, though, that telling the truth in all its complexity will make it that much harder to get the attention, and mobilize the concern, of people in the rich world. Whatever activists may sometimes prefer to imagine, we are human beings, not solidarity machines. But we are not indifference machines either, and presented with the facts as they are, rather than with dark nightmares, we might just do the right thing.
