Why are all these advocacy groups aligning themselves with the military?

With less than a month of campaigning to go, Barack Obama and Mitt Romney are vying to demonstrate their love for all things military. For political candidates, this isn't so unusual: for as long as there have been soldiers, there have been politicians eager to stand beside them and soak up a bit of reflected glory. What's more unusual is how eagerly the rest of us have lined up to imitate the candidates. From human rights activists to nutritionists, everyone now seems to look to the military for some borrowed credibility.

Take human rights. During the Bush administration, human rights organizations struggled to convince Americans to oppose so-called "enhanced interrogation" (that's torture, when it's at home). In the years immediately following the 9/11 attacks, the American public appeared to have little sympathy for abstract arguments about the rights of suspected terrorists. Searching for a more effective way to change public opinion, Human Rights First assembled a group of retired generals and admirals willing to make the military case against torture. In a letter to then-President Bush, the group (which included the former commanding general of CENTCOM) asserted that the U.S. use of torture has "put American military personnel at greater risk [and] undermined U.S. intelligence gathering efforts."

The group of retired officers assembled by Human Rights First remains active today. A few weeks ago, for instance, General Charles Krulak, the former commandant of the Marine Corps, issued a statement under Human Rights First's auspices that called upon Mitt Romney to reject torture: it's "illegal [and] immoral," sure, but it also "undermines both our national security and the order and discipline of our armed forces....[I]t produces unreliable results and often impedes further intelligence collection."

It's not just human rights advocates who have sought to enhance their credibility with the American public by associating themselves with the military. With conservatives taking aim at recent efforts to reduce the caloric content of school lunches and public attention waning, health care advocates have also brought in the big guns: in their case, a group of senior officers who can frame obesity not as a health problem, but as a military recruitment and readiness problem. In a 2010 report called Too Fat to Fight, dozens of retired general and flag officers proclaimed the obesity epidemic a threat to national security. According to the report, more than a quarter of young Americans are now too fat to qualify for military service. This, obviously, is bad news for military recruiters, and for the rest of us, too -- how can a flabby bunch of couch potatoes defend America as we face off against the third world's lean, hungry masses?

Too Fat to Fight goes on to call for the kind of reforms the left generally loves and the right generally hates, such as greater attention to the relationship between poverty, hunger, and obesity; increased federal funding of school lunch programs for the poor; and more government money for "the development, testing and deployment of proven public-health interventions." In September 2012, a follow-up report (Still Too Fat to Fight) funded by foundations such as the Robert Wood Johnson Foundation called for the elimination of junk food in school vending machines -- again in the name of military readiness.

The last decade has seen similar efforts to frame everything from climate change to low-quality public education as military issues. And why not? Obesity and poor nutrition surely will hurt military recruitment and readiness, and the U.S. use of torture surely does endanger troops and produce unreliable information. Similarly, low-quality public education threatens military readiness -- illiterate and innumerate recruits are as bad as obese ones -- and climate change will certainly cause migration and conflict over resources, creating new challenges for the military.

It's more than that, though. In an era in which all military personnel have officially been labeled "heroes," former military personnel make fantastic spokespeople for causes that might otherwise languish. After all, Americans have lost faith in virtually every other profession and public institution: in Gallup's annual study of confidence in institutions, well under half of Americans surveyed in 2012 said they had "a great deal" or "quite a lot" of confidence in the presidency, newspapers, public schools, television news, banks, business, unions, the criminal justice system, the medical system or organized religion. (Congress, as usual, garnered the confidence of just 13 percent of Americans.) Only the military seems to have been exempted from this epidemic of public cynicism: 75 percent of Americans say they have "a great deal" or "quite a lot" of confidence in the military.

But though I take my hat off to the many organizations that have made creative use of the magic of military endorsements, the trend troubles me. What does it say about us, as a nation, that fewer and fewer issues can gain traction if they're not wrapped in the mantle of military effectiveness?

We see this played out on a larger scale in debates about the federal budget. Both political parties agree that the deficit needs to be brought under control, and though Republicans and Democrats differ in their views on the role of revenue collection (a.k.a. taxes), both parties assert a need for significant across-the-board federal budget cuts...for everything except defense spending, that is.

President Obama proposes slowing the rate of growth of defense spending, essentially by keeping future spending on the base defense budget at current levels, with increases to keep pace with inflation. Mitt Romney considers maintaining current levels of defense spending tantamount to stripping troops of their weapons and body armor, and proposes pegging the base defense budget at a floor of 4 percent of GDP -- essentially tossing another $2 trillion at DoD over the next decade.

Given that U.S. defense spending is already higher, in real dollars, than it has been at any time since World War II, it's a little odd that no one -- at least, no one hoping to win an election -- appears willing to contemplate the possibility of genuine cuts to the base defense budget. At least not publicly.

Contrast the Defense Department's future budget prospects with those of many other federal programs. President Obama's proposed budget includes sizeable cuts in many non-defense discretionary programs: the budget for toxic waste clean-up and safe drinking water programs would be slashed, for instance, along with initiatives to help low-income people keep the heat on during the winter and NASA's Mars exploration efforts. And those are nothing compared to the cuts proposed by Mitt Romney's running mate Paul Ryan: Ryan, as Daniel Altman has written, would slash the percentage of GDP that goes into domestic programs to the level prevailing in Equatorial Guinea.

Here's what it adds up to: if you want to get something funded in the United States today, you need to find a way to shoehorn it into the defense budget. Ever wonder why the military is doing more and more not-so-militaryish things, like operating health clinics in Africa and funding economic development projects in the Philippines? In part, it's because no one else has the money to do it. Funding for the State Department and the U.S. Agency for International Development has fallen drastically over the last two decades. Congress seems increasingly disinclined to fund civilian diplomacy and development initiatives -- but call something a military program, and presto, money falls from the heavens.

I exaggerate -- but not by much. As larger and larger swathes of the federal budget fall victim to Jack the Ripper-style cuts, it's the military that increasingly provides the vital services once provided by other parts of the federal government. Diplomacy and development? Check. Free or low-cost health care? The military provides it to active duty personnel, reservists, retirees, and their dependents -- but just try convincing Congress to fund similar programs outside the military. Military subsidies for higher education have become a route to college for hundreds of thousands of young people, even as federally subsidized grant and loan aid has shrunk in the civilian world. Subsidized childcare? Universal for the dependents of active duty military personnel, but practically extinct for most civilians.

Little wonder, then, that service members have become a must-have accessory for political candidates and issue advocates. Our cynical political culture devalues social welfare programs and snickers at communitarian impulses, and most of us trust neither our neighbors nor the public institutions that are meant to serve us. The distrust is not unmerited, but it's a vicious circle: the more we devalue public programs, the less we fund them and the less they can offer us, so the less we trust them, and so on. The military is all that's left: the last institution standing; the last part of the federal government that works.

No question, there's an element of self-serving jingoism in the efforts of politicians and interest groups to snuggle up with the military -- a desire to benefit from a little heroism-by-association, combined with a shameless appeal to the public's most bellicose and mindless "us versus them" instincts. But perhaps it's more than that. Perhaps we're simply desperate to be reassured that there is an "us" in the first place -- that the United States is something more than simply 300 million people who don't much like or trust one another (and who definitely don't trust their government).

Perhaps we try to associate every issue and platform with the military not because we're self-serving cynics, but because we secretly yearn for a domain that's free of cynicism. The military has come to symbolize those lost American virtues of public-spiritedness, generosity, sacrifice, self-discipline, and service to something larger than the self. It also represents that most elusive of American dreams: a government institution that actually works.



Warning: Idealism Can Kill You

(And a lot of other people.)

Do our best instincts cause our worst failures?

Consider Iraq. Granted, some not-so-great instincts brought us there in 2003 -- but it was our best instincts that kept us there. Remember "you break it, you own it?" We took the Pottery Barn Rule seriously enough to stick around for the next eight years, trying earnestly to glue the shattered pieces back together.

The glue never stuck. We couldn't bring ourselves to believe gloomy predictions that looting might take place, and we convinced ourselves the Iraqis would be better off if we could just get the Baathists out of government and disband Saddam Hussein's army. But instead of bringing peace and democracy, early U.S. decisions in Iraq led to chaos, revenge killings, a government that could no longer provide the most basic services, and millions of angry, armed, and unemployed young men.

It only got worse. The continued U.S. presence sparked an insurgency and brought al Qaeda to Iraq. While the Iraq War's civilian death toll remains disputed, it was certainly high: during the eight years of U.S. military engagement, an estimated 162,000 Iraqis were killed, mostly civilians. Violence began gradually to subside after 2006, for reasons most analysts suspect had little to do with the U.S. troop surge. In 2011, we finally slunk off, tails between our legs. Behind us, we left a squabbling, barely functional government, an economy still in shambles, and a level of civil violence that remains astronomically high.

My point here is not that the Iraq War was a bad idea in the first place (though it certainly was). It's more depressing than that. My point is that this cynical, foolish, arguably illegal war might still have come right in the end if we had tried a little less hard to fix everything. I remember Iraq in summer 2003, before the bombing of the U.N. building, before suicide bombs and IEDs became near daily occurrences. There was a brief window that spring and summer, a window in which the mood in Baghdad was cautiously celebratory. People spoke freely. NGOs and human rights groups popped up out of nowhere. Saddam was out; hope was in. We should have left then.

What would have happened if the U.S. government had been less determined to fix Iraq's broken parts? What would have happened if we'd brought our troops back home in summer 2003? What if we'd quelled our national do-gooder instincts, and left the Iraqi army and all but a handful of top Baathist officials in place, offering the rest their lives, liberty, and some generous economic assistance in exchange for genuine cooperation on weapons inspections?

There's no way to know for sure, but I have an uneasy feeling that a more cynical U.S. government approach from the get-go -- an approach that never even contemplated the restoration of democracy -- might ultimately have caused less bloodshed.

Or consider Afghanistan, where we recently lost the 2,000th U.S. service member. We've been at war in Afghanistan for over a decade, struggling to keep the Taliban at bay, build up a democratic government, eliminate corruption, and create an Afghan military capable of defending the population and disinclined to prey on it. All worthy aims -- and to achieve them, Presidents Bush and Obama let U.S. troop levels creep up over the course of eleven years, going from fewer than 10,000 in late 2002 to more than 30,000 in November 2008. By mid-2009, Obama had doubled that number. By mid-2010, he had tripled it. And although Obama's promised "civilian surge" never got very surge-like, we did substantially increase the number of U.S. civilian officials tasked to help with Afghan governance and development issues.

The result? Today the Afghan government remains corrupt, insecurity remains rampant, civilian deaths directly and indirectly attributable to the U.S. presence remain high, and a 2011 poll found that 76 percent of Afghans say they feel "some fear" or "a lot of fear" when encountering international coalition forces. If the Afghan population doesn't trust us much, we now trust them even less: green-on-blue attacks have spiked, killing more than 50 Americans so far this year. Meanwhile, the Taliban have decided they can't be bothered to negotiate with us, and as in Iraq, we're now limping ignominiously towards the exit.

What if the United States had done things differently? What if we had pummeled al Qaeda's strongholds, helped the Northern Alliance oust the Taliban, and then...left? If we had left early in 2002, we could have continued to strike al Qaeda targets of opportunity as needed, using special operations forces and aerial attacks; we could have used diplomacy and foreign aid to urge governance and human rights reforms. Perhaps there wouldn't have been many reforms. But would things really be any worse than they are today?

Here again, my point isn't that the war in Afghanistan was a mistake, or that our efforts in Afghanistan have fallen badly short, an argument that has been made often and persuasively. And I'm not arguing that we're now "less safe" than we used to be, or insufficiently "more safe" -- claims that have always struck me as hard to prove one way or the other. My point is that it has often been our best instincts, not our worst, that have led us to do harm in the world. In Afghanistan and Iraq, we spent billions of dollars and suffered thousands of U.S. casualties. Worst of all, we caused untold suffering for the very populations we so earnestly intended to help.


I'm not suggesting that the United States is all idealism, all the time: we're capable of plenty of cynicism, and occasional acts of plain old evil. But even our most cynical moments are accompanied by idealism. We want to help, and we want to set things right. We want everyone to share in peace, justice, and the benefits of the American way -- even if it hurts.

And hurt it does. The United States and those we try to "help" are often the victims of our own idealistic commitments to democracy, human rights, and the rule of law.

This phenomenon plays out at a micro level as well as at a macro level. U.S. construction and economic development projects take far too long and cost far too much, in part because we want everything to satisfy stringent U.S. and international quality standards. Our reconstruction projects are so elaborate that only those Afghans who already have wealth and power have the capacity to serve as sub-contractors; by and large, the result is that power is concentrated even more in the hands of the (often corrupt and violent) few.

Even U.S. detention facilities in Afghanistan are built to Western specifications, complete with climate control systems and electronic security. As a result, we render them virtually useless for the Afghan officials who will inherit them -- and who won't have the trained staff or the unlimited supply of electrical power to make them run. But the suggestion that the Afghans might sometimes be better off with less reeks, to us, of unacceptable double standards.

Or consider a larger and more tragic irony: by late 2009, the United States had embarked on a counterinsurgency-influenced approach to the conflict in Afghanistan: the Afghan population, we decided, was the center of gravity. Our success or failure would depend on our ability to protect the population and enable the Afghan government to provide services and thus build legitimacy. Laudable goals! But by making the Afghan population the center of gravity, we also inadvertently placed the Afghan population at the center of a big red bull's-eye. We incentivized the Taliban to combat our efforts by placing IEDs in civilian structures and targeting police, courts, governance, and economic development projects. They did so, with a vengeance.

I could go on -- and on, and on -- but it's too depressing.

It's not surprising that we often fail to achieve our idealistic goals. After all, building a culture that respects human rights, democracy, and the rule of law takes time. Our own imperfect form of democracy -- rife as it still is with injustice and corruption -- took us more than two centuries to build, though we stood on the shoulders of those who drafted the Magna Carta and the English Bill of Rights. So why should we imagine that durable change could come any faster in societies that start with far less -- less wealth, less education, less tradition of democratic government, human rights, or peaceful change?

Simple failure to achieve our loftiest goals could be excused. But if our efforts to help only cause more harm, it's inexcusable.


Scarred by Vietnam, my parents' generation came of age with a deep distrust of American power. They suspected that American interventionism never stemmed from pure motives, and never, ever, ended well. My generation came of age at a more hopeful moment: the Berlin Wall came down while I was in college, and the notion of non-ideological U.S. engagement with the world seemed suddenly possible again.

The Rwandan genocide taught my cohort that non-intervention can be as unconscionable as meddling, and Bosnia and Kosovo taught us that U.S. military power could be a force for good. My own early career revolved around human rights work, and brought me to Uganda during the early years of the Lord's Resistance Army, Kosovo in the wake of the NATO air campaign, and Sierra Leone during that country's brutal civil war. In each case, U.S. engagement seemed urgent and necessary.

But after all the waste and bloodshed in Iraq and Afghanistan, I've lost much of my faith in our government's ability to do good. The injustice and abuse that once motivated me still do -- but I don't have much faith anymore in our ability to restore peace or bring justice.

I'd love to have someone prove me wrong. But here's my fear: the more we try to fix things, the more we end up shattering them into jagged little pieces.