Voice

Warning: Idealism Can Kill You

(And a lot of other people.)

Do our best instincts cause our worst failures?

Consider Iraq. Granted, some not-so-great instincts brought us there in 2003 -- but it was our best instincts that kept us there. Remember "you break it, you own it"? We took the Pottery Barn Rule seriously enough to stick around for the next eight years, trying earnestly to glue the shattered pieces back together.

The glue never stuck. We couldn't bring ourselves to believe gloomy predictions that looting might take place, and we convinced ourselves the Iraqis would be better off if we could just get the Baathists out of government and disband Saddam Hussein's army. But instead of bringing peace and democracy, early U.S. decisions in Iraq led to chaos, revenge killings, a government that could no longer provide the most basic services, and millions of angry, armed, and unemployed young men.

It only got worse. The continued U.S. presence sparked an insurgency and brought al Qaeda to Iraq. While the Iraq War's civilian death toll remains disputed, it was certainly high: during the eight years of U.S. military engagement, an estimated 162,000 Iraqis were killed, mostly civilians. Violence began gradually to subside after 2006, for reasons most analysts suspect had little to do with the U.S. troop surge. In 2011, we finally slunk off, tails between our legs. Behind us, we left a squabbling, barely functional government, an economy still in shambles, and a level of civil violence that remains astronomically high.

My point here is not that the Iraq War was a bad idea in the first place (though it certainly was). It's more depressing than that. My point is that this cynical, foolish, arguably illegal war might still have come right in the end if we had tried a little less hard to fix everything. I remember Iraq in summer 2003, before the bombing of the U.N. building, before suicide bombs and IEDs became near daily occurrences. There was a brief window that spring and summer, a window in which the mood in Baghdad was cautiously celebratory. People spoke freely. NGOs and human rights groups popped up out of nowhere. Saddam was out; hope was in. We should have left then.

What would have happened if the U.S. government had been less determined to fix Iraq's broken parts? What would have happened if we'd brought our troops back home in summer 2003? What if we'd quelled our national do-gooder instincts, and left the Iraqi army and all but a handful of top Baathist officials in place, offering the rest their lives, liberty, and some generous economic assistance in exchange for genuine cooperation on weapons inspections?

There's no way to know for sure, but I have an uneasy feeling that a more cynical U.S. government approach from the get-go -- an approach that never even contemplated the restoration of democracy -- might ultimately have caused less bloodshed.

Or consider Afghanistan, where we recently lost the 2,000th U.S. service member. We've been at war in Afghanistan for over a decade, struggling to keep the Taliban at bay, build up a democratic government, eliminate corruption, and create an Afghan military capable of defending the population and disinclined to prey on it. All worthy aims -- and to achieve them, Presidents Bush and Obama let U.S. troop levels creep up over the course of eleven years, going from fewer than 10,000 in late 2002 to more than 30,000 in November 2008. By mid-2009, Obama had doubled that number. By mid-2010, he had tripled it. And although Obama's promised "civilian surge" never got very surge-like, we did substantially increase the number of U.S. civilian officials tasked to help with Afghan governance and development issues.

The result? Today the Afghan government remains corrupt, insecurity remains rampant, civilian deaths directly and indirectly attributable to the U.S. presence remain high, and a 2011 poll found that 76 percent of Afghans say they feel "some fear" or "a lot of fear" when encountering international coalition forces. If the Afghan population doesn't trust us much, we now trust them even less: green-on-blue attacks have spiked, killing more than 50 Americans so far this year. Meanwhile, the Taliban have decided they can't be bothered to negotiate with us, and as in Iraq, we're now limping ignominiously towards the exit.

What if the United States had done things differently? What if we had pummeled al Qaeda's strongholds, helped the Northern Alliance oust the Taliban, and then...left? If we had left early in 2002, we could have continued to strike al Qaeda targets of opportunity as needed, using special operations forces and aerial attacks; we could have used diplomacy and foreign aid to urge governance and human rights reforms. Perhaps there wouldn't have been many reforms. But would things really be any worse than they are today?

Here again, my point isn't that the war in Afghanistan was a mistake, or that our efforts in Afghanistan have fallen badly short, an argument that has been made often and persuasively. And I'm not arguing that we're now "less safe" than we used to be, or insufficiently "more safe" -- claims that have always struck me as hard to prove one way or the other. My point is that it has often been our best instincts, not our worst, that have led us to do harm in the world. In Afghanistan and Iraq, we spent billions of dollars and suffered thousands of U.S. casualties. Worst of all, we caused untold suffering for the very populations we so earnestly intended to help.

***

I'm not suggesting that the United States is all idealism, all the time: we're capable of plenty of cynicism, and occasional acts of plain old evil. But even our most cynical moments are accompanied by idealism. We want to help, and we want to set things right. We want everyone to share in peace, justice, and the benefits of the American way -- even if it hurts.

And hurt it does. The United States and those we try to "help" are often the victims of our own idealistic commitments to democracy, human rights, and the rule of law.

This phenomenon plays out at a micro level as well as a macro level. U.S. construction and economic development projects take far too long and cost far too much, in part because we want everything to satisfy stringent U.S. and international quality standards. Our reconstruction projects are so elaborate that only those Afghans who already have wealth and power have the capacity to serve as sub-contractors; by and large, the result is that power is concentrated even more in the hands of the (often corrupt and violent) few.

Even U.S. detention facilities in Afghanistan are built to Western specifications, complete with climate control systems and electronic security. As a result, we render them virtually useless for the Afghan officials who will inherit them -- and who won't have the trained staff or the unlimited supply of electrical power to make them run. But the suggestion that the Afghans might sometimes be better off with less reeks, to us, of unacceptable double standards.

Or consider a larger and more tragic irony: by late 2009, the United States had embarked on a counterinsurgency-influenced approach to the conflict in Afghanistan. The Afghan population, we decided, was the center of gravity. Our success or failure would depend on our ability to protect the population and enable the Afghan government to provide services and thus build legitimacy. Laudable goals! But by making the Afghan population the center of gravity, we also inadvertently placed the Afghan population at the center of a big red bull's-eye. We incentivized the Taliban to combat our efforts by placing IEDs in civilian structures and targeting police, courts, governance and economic development projects. They did so, with a vengeance.

I could go on -- and on, and on -- but it's too depressing.

It's not surprising that we often fail to achieve our idealistic goals. After all, building a culture that respects human rights, democracy, and the rule of law takes time. Our own imperfect form of democracy -- rife as it still is with injustice and corruption -- took us more than two centuries to build, though we stood on the shoulders of those who drafted the Magna Carta and the English Bill of Rights. So why should we imagine that durable change could come any faster in societies that start with far less -- less wealth, less education, less tradition of democratic government, human rights, or peaceful change?

Simple failure to achieve our loftiest goals could be excused. But if our efforts to help only cause more harm, it's inexcusable.

***

Scarred by Vietnam, my parents' generation came of age with a deep distrust of American power. They suspected that American interventionism never stemmed from pure motives, and never, ever, ended well. My generation came of age at a more hopeful moment: the Berlin Wall came down while I was in college, and non-ideological U.S. engagement with the world suddenly seemed possible again.

The Rwandan genocide taught my cohort that non-intervention can be as unconscionable as meddling, and Bosnia and Kosovo taught us that U.S. military power could be a force for good. My own early career revolved around human rights work, and brought me to Uganda during the early years of the Lord's Resistance Army, Kosovo in the wake of the NATO air campaign, and Sierra Leone during that country's brutal civil war. In each case, U.S. engagement seemed urgent and necessary.

But after all the waste and bloodshed in Iraq and Afghanistan, I've lost much of my faith in our government's ability to do good. The injustice and abuse that once motivated me still do -- but I don't have much faith anymore in our ability to restore peace or bring justice.

I'd love to have someone prove me wrong. But here's my fear: the more we try to fix things, the more we end up shattering them into jagged little pieces.


National Security

No Army for Young Men

Soldiers these days need less muscle and more maturity, so why do we still focus on recruiting 18-year-olds?

Military demographics change over time. Sixty-five years ago, the United States had a segregated military, but today people of every race, color, and creed train and fight side by side. Twenty-five years ago, women were excluded from half the occupational specialties in the Army and 80 percent of Marine Corps jobs; today, women can serve in all but a few combat-related occupational specialties. Just two years ago, gay and lesbian service members risked discharge; today, they can serve openly.

But there's one thing that hasn't changed much. Each year, the overwhelming majority of new military recruits are young and male. In that sense, the American military of 2012 still looks a great deal like the American military of the 1970s, the 1940s, the 1860s, or the 1770s. For that matter, it still looks a lot like virtually every group of warriors in virtually every society during virtually every period of human history.

It's time to question the near-universal assumption that the ideal military recruit is young and male. The nature of warfare has changed dramatically in the last century, and the capabilities most needed by the military are less and less likely to be in the exclusive possession of young males. In fact, the opposite may be true: when it comes to certain key skills and qualities likely to be vital to the military in the coming decades, young males may be one of the least well-suited demographic groups.

For most of human history, having an army full of young men made lots of sense. As soldiers, young males have had two things going for them, historically speaking. First, they're usually stronger, on average, than any other demographic group: they can run fast and carry heavy loads. Second, they're (relatively) biologically expendable from a species-survival perspective: women of child-bearing age are the limiting factor in population growth. A society can lose a lot of young men without a devastating impact on overall population growth.

Today, though, these characteristics don't matter as much as they once did. Overall birthrates are much lower in modern societies than they were during earlier periods, but life expectancy is much longer. Early societies worried about sustaining their populations; today we worry less about ensuring population growth than about overburdening the planet's carrying capacity.

Simple brawn also offers far less advantage in our high-tech age. In modern warfare, brutal hand-to-hand combat is no longer the norm, and warfare is no longer a matter of sending out wave after wave of troops to overwhelm the enemy through sheer mass. Increasingly, modern warfare involves a mixture of high-tech skills and low-tech cultural knowledge rather than "fighting" in the traditional sense.

In fact, if the next few decades are anything like the last, most military personnel will never see combat. A recent McKinsey study found that the "tooth to tail" ratio in the active duty U.S. military was roughly one to three in 2008: for every service member in a combat or combat-support position, there were more than three service members in non-combat-related positions. A 2010 Defense Business Board study found that 40 percent of active duty military personnel had never even been deployed -- and that's during a decade in which the United States was at war in both Iraq and Afghanistan.

Being young, male, and strong offers no particular advantage to an Air Force remote drone pilot or an Army financial services technician. Even for service members in combat positions, the physical strength that young men are more likely to possess no longer offers as much of an advantage: today's weapons are lighter and more portable than they used to be, and even the most impressive musculature is no match for an IED.

I don't mean to suggest that the physical strength of soldiers has become militarily irrelevant. Sometimes, military personnel -- particularly infantrymen -- still find themselves doing things the old-fashioned way: hauling heavy equipment up a winding mountain trail, or slugging it out hand to hand during a raid. Specialized groups such as Navy SEALs will also continue to value strength and endurance, and that's appropriate for their mission. But for increasing numbers of military personnel, the marginal benefits of sheer physical strength have plummeted relative to earlier eras -- and this trend seems likely to continue.

Experts don't agree on what the future of warfare will look like. Perhaps the age of counterinsurgency and stability operations isn't over: perhaps, despite the best intentions of current leaders, the United States will have more Iraqs and Afghanistans. But even if we don't -- especially if we don't -- we'll continue to want to leverage the capabilities of partners and allies. To do that, we'll likely rely more and more heavily on the kind of skills honed by the Special Forces community: specifically, the ability to operate effectively in small groups in foreign cultures, keeping a low profile while working closely with host nation militaries.

Or perhaps the future of warfare will be high-tech. Perhaps we'll increasingly have to grapple with cyberattacks, unmanned technologies such as robots and drones, or high-end asymmetric threats such as anti-access and area-denial technologies. And perhaps we'll see all these things at the same time: the high end and the low end, all mixed together.

No one knows precisely what warfare will look like in the decades to come, but I'm pretty sure I know what it won't look like. It won't look like tanks sweeping across the plains of Eastern Europe. It won't look like Gettysburg, and it won't look like Homeric conflict outside the walls of Troy.

In other words, it won't be the kind of conflict that relies on mass, or favors the brawny over the brainy. It won't be the kind of conflict at which young males have traditionally excelled.

On the contrary. The skills the military is most likely to need in the future are precisely the skills that American young people in general -- and young males in particular -- are most likely to lack. The U.S. military will need people with technical experience and scientific know-how. It will also need people with foreign language and regional expertise and an anthropological cast of mind -- people who can operate comfortably and effectively surrounded by foreigners. And in the 24-7 media environment -- the era of the strategic corporal -- the military will, above all, need people with maturity and good judgment.

These, it hardly needs to be said, are not generally the qualities most closely associated with the 18-24 year-old male demographic. Don't get me wrong: I've known many 18-24 year-old men with terrific judgment and technical or cultural sophistication. But statistically, those thoughtful and sophisticated 18-24 year-old men are surrounded by a lot of not-so-mature or sophisticated peers. (Ever spent time in a frat house?)

The statistics make for gloomy reading. As David Courtwright, author of Violent Land: Young Men and Social Disorder, puts it, young men seem to have "an affinity for trouble." They're responsible for a disproportionate share of fatal auto accidents, for instance. Violent crime rates are higher among 18-24 year-old men than among any other demographic group, tapering off sharply after age 25 or so. They commit homicides at roughly twice the rate of 25-34 year-olds. Young males also commit an outsized percentage of property crimes, commit suicide at disproportionately high rates, and are disproportionately likely to have substance abuse problems.

Young men in the U.S. military aren't immune from these statistical trends. Although the military conducts psychological testing on would-be recruits and screens people out based on a wide range of risk factors (prior felonies, lack of high school diploma, and so on), miscellaneous bad behavior is still far from unheard of among young service members. Ask a master sergeant or a battalion commander how much of their time goes into dealing with the assorted messes young people -- especially young men -- manage to get into, and they'll tell you they see a seemingly unending parade of junior soldiers in trouble for driving drunk, defaulting on loans, assault, shoplifting, domestic violence, and the like.

Don't blame the boys: the fault lies not in their characters but in their neurological development. The parts of the brain responsible for impulse control and the ability to see consequences and evaluate risks seem to develop more slowly in males than in females. In males, development of the prefrontal cortex -- "the seat of sober second thought" -- isn't complete until age 25 or later.

Of course, there are plenty of young men out there who are responsible, mature, and intellectually sophisticated -- and even the most immature young men generally grow up to become responsible, sober-minded citizens. But in the meantime, why do military recruiters continue to primarily target young males? As the world grows more complex -- as the skills needed to ward off security threats become more subtle and varied -- wouldn't we do better to radically rethink military recruitment strategies?

If the military opened up more opportunities for service to older Americans -- or simply devoted far more resources to recruiting women and men over 25 -- we might find it far easier to turn the military into the agile, sophisticated machine we keep saying we want. Better still, why not reconsider the whole military career progression, creating more of a revolving door between the military and civilian world for people at all career stages -- and particularly for those with critical skills, be they linguistic or scientific?

Transforming the military personnel system is a vital project, but one that will likely take decades. For now, we can start small. How about military recruitment booths at the AARP?
