Smaller Is Smarter

Military drawdowns have driven innovation for millennia.

There is an emerging consensus, in Congress and around the country, that government spending must decline, but there is just as strong a sentiment that there are far more artful ways to achieve this than by across-the-board cuts. In the case of domestic entitlement programs like Social Security and Medicare, this growing awareness has sparked some bold thinking about reforms, particularly among Republicans in Congress. In the defense sector, however, there is far less evidence of a willingness to contemplate innovative ideas. But if there were, a world of intriguing possibilities would open up.

Unfortunately, the bipartisan reaction to sequestration as it bears upon military matters has been to try to figure out ways to wriggle free of its constraints, perhaps even to avoid any spending reductions over the next 10 years, much less drawdowns amounting to an additional $500 billion on top of currently planned cuts. If this sentiment prevails, a signal disservice will have been rendered to the military and the American people, because the failure to insist on defense spending reductions will continue to allow the military to forgo making tough and much needed choices about future directions. Strategic affairs are in great flux, due to factors ranging from radical technological change to the rise of a series of wars between nations and networks. A failure to transform the military now will only increase perils -- even if spending cuts are avoided.

The challenge before us is to embrace budgetary constraints as empowering rather than crippling. And there are many good examples of professional militaries that seized such opportunities, extending far back in history. In the 6th century of the Common Era, the Byzantine Emperor Justinian sought to restore territorial holdings in the West that had been lost as Rome declined and fell -- yet he had only the slenderest of financial resources with which to carry out this goal. However, he picked skillful generals, Belisarius and Narses, who made the most of what little they had as they pioneered the development of new types of military formations. The great strategist Liddell Hart saw in the heavy cavalry troops that were created, in part to make up for a critical lack of legionary infantry, a clear foreshadowing of modern armored warfare. And so, with always outnumbered forces, Belisarius and Narses reconquered and held Italy, Africa, and southern Spain for the Empire.

A more modern example of success-under-constraint is the post-World War I army of Germany's Weimar Republic. In this case, treaty restrictions and the parlous state of the economy kept the active-duty force quite small -- limited to 100,000 soldiers. Their commander, General Hans von Seeckt, used this in two important ways. First, he emphasized the profound importance of understanding the operational implications of key maturing technologies: tanks, planes, and radio. His focus on mobile maneuvers led to the rise of blitzkrieg. Second, force-size limits led him to rethink the active-reserve mix, and to nurture the notion of cycling through large numbers of young men on short active-duty periods, then moving them into vigorous reserve programs -- sometimes under the guise of labor organizations. Thus Germany eventually had a very large trained manpower pool upon which to draw, allowing the army to expand rapidly and effectively when war came.

To some extent, the U.S. military during the decade after Vietnam followed a similar pattern of development. Active-duty forces were reduced by 40 percent, from 3.5 to 2.1 million. Defense spending declined sharply as well, falling from $344 billion in 1972, at the end of the war, to just $295 billion by 1979 -- over a 14 percent drop before factoring in the effects of inflation. Yet in the face of these challenges, the smaller active army became more professionalized, and a new doctrine, AirLand Battle, began to take shape. Formally introduced in 1982, it focused on the importance of the swift movement of information and the striking power of precision-guided munitions. Like the German Reichswehr, the post-Vietnam U.S. military found its way ahead despite considerable constraints. Even the spending increases under Ronald Reagan were relatively short-lived: by the time the elder President Bush submitted his final budget for FY 1993, the actual spending level was only $15 billion more than at the end of the Vietnam War. In inflation-adjusted dollars, this was quite a reduction.

The larger point here is that constraints in general should be seen as opportunities for innovation. Budgetary matters aside, think of the congressional restrictions in the 1980s that capped the American presence in El Salvador at 55 military advisors. In the midst of a bitter civil war being waged in our hemisphere's most densely populated mainland country, these advisors hugely improved the quality (and behavior) of the Salvadoran military, and came up with a counterinsurgency strategy that turned the tide of battle and helped lead to a durable peace and the establishment of a vibrant democracy. More recently, similar political and other constraints have limited the American military to sending only small detachments of special operations forces to the Philippines and Colombia -- yet they have done profound good in both places with their highly innovative ideas.

In the wake of the 9/11 attacks on America, the need to respond swiftly in far-off Afghanistan led to Secretary of Defense Rumsfeld championing another bold approach: setting loose just 11 Special Forces A-teams -- about 200 sets of boots on the ground -- in the company of indigenous Afghan fighters of quite mixed quality. The result was an amazing victory, the toppling of the Taliban in a few short weeks once the Green Berets were deployed in battle. That the occupation of Afghanistan went awry later on, and that large surges of troops did little to end the war, should be seen simply as testament to the fact that too many resources may impede the kind of creativity called for in such settings. We were at our best, our most inventive, when our forces were the insurgents, operating on a shoestring.

So embrace defense budget cuts of the magnitude called for by sequestration, but reject the meat-ax notion of applying reductions equally, across the board. There are more skillful ways ahead that will emerge in the wake of reduced resources -- perhaps a whole new way of war to be revealed. For the Byzantines, such creativity took the form of creating a 6th century version of the modern armored division. For the Reichswehr, it took the form of deep thinking about the implications of technological change and the need for rapid "expandability" of the force. For the U.S. military, the lessons of recent experience suggest an ever greater awareness of the need to move from forces made of a few large and expensive things to a force composed of many small, nimble, networked parts.

The answers will reveal themselves. All they wait on is the "call for the question" to be stimulated by the requirement for additional defense budget cuts.



Caveat Preemptor

How Obama has adopted the Bush doctrine.

Truth may be the first casualty of war, but the language of strategic discourse has also suffered multiple serious wounds over the past decade -- none more grievous than that which has been inflicted on the time-honored concept of "preemption," the notion of striking first so as to thwart an adversary's own impending attack. The most egregious misuse of the term arose in President George W. Bush's 2002 national security strategy, which sought to expand its meaning to encompass the use of force against any who might one day pose a threat. Thus was the attempt made to justify the 2003 invasion of Iraq as "preemptive."

But the attack on Iraq was not preemptive. In the strategic lexicon, the kind of action taken against Iraq is called "preventive war," which is about attacking before a threat becomes imminent. Moral philosopher Michael Walzer has parsed these matters neatly, noting that preemption focuses closely on the need to take action in a current crisis; preventive war has to do with worries about the future consequences of inaction. Generally, ethicists are open to the need to be able to take preemptive action. But the very concept of waging preventive war gets their backs up. It looks a little too much like naked aggression. Due to this concern, Bush and his senior advisers sought to defuse principled opposition to the use of force, in the absence of imminent threat, by arbitrarily expanding the definition of preemption.

Humpty Dumpty got away, for a while, with the bald-faced assertion that a word "means just what I choose it to mean." But for national leaders and diplomats, this looseness is a recipe for disaster -- as the ensuing costly debacle in Iraq suggests. Even more troubling than the aftermath in Iraq -- American forces gone, al Qaeda back, the killing continuing -- is that President Obama has taken the same approach to preemption as his predecessor. He has ramped up the global drone war on terror with a many-fold increase in strikes on suspects. We are told that this is done with great care, and that the targets are selected strictly on the basis of the imminence of the threats they pose. But this is hardly believable, as scarcely a shred of evidence has been presented to the public in support of the notion that the victims of these attacks were on their way to hit American (or other) targets. Further, the frequent use of "signature strikes" -- hitting sites simply on the basis of intelligence profiles suggesting they are populated by troublemakers -- is highly problematic.

Another Obama administration application of preemption is emerging in cyberspace. Last fall, then-Secretary of Defense Panetta, in a major policy speech, explicitly spoke to the possibility of mounting preemptive attacks. For the most part, his qualifying "ifs" (if a cyber attack is perceived as imminent, and if it is likely to do great damage) suggest a degree of caution. But there has also been language in the administration discourse about striking first on the basis of "emergence of a concrete threat" that begins to move this policy more in the direction of using preventive force than just taking preemptive action. This is a serious concern, given how very hard it will be to detect an imminent attack. In cyberspace there are no troops massing on the border, no telltale signs of long-range missiles being readied for launch, no aircraft scrambling. Cyberattack comes with a simple click. Identifying the attacker ahead of time will require amazing forensic skills -- not in evidence yet even in the case of exhaustive post-incident investigations.

For all the current troubles with the slippage from preemption to prevention, it must be noted that the history of strategic thought about striking first to forestall an imminent attack has been deeply troubling in its own right. Nuclear preemption notions during the Cold War, for example, led to highly destabilizing ideas about the "launch on warning" of one's vulnerable missiles. The Soviet hierarchy's war plan for central Europe was just as high-risk, as it called for a preemptive series of nuclear strikes from the outset of any conflict, before NATO would be able to use its own atomic arsenal. Both sides eventually had the good sense to realize that nuclear preemption made no sense, and mutual deterrence came to hold sway -- as it still does today.

Even the classic case of preemption in a conventional conflict, Israel's opening operations in the Six-Day War of 1967, leaves much to question. One Israeli officer quoted at the time in the Associated Press noted simply, "Time is against us. Nasser said he seeks to destroy us. Why shouldn't we believe him?" This sort of reasoning is preventive in nature -- that is, it speaks to attacking before the odds of winning worsen. Cost factors also drove the action back then, as mobilization of Israel's citizen army ran about $20 million daily (big dollars in those days, for a small country). A lingering crisis was going to ruin the economy. Even so, the great Israeli statesman David Ben-Gurion was not convinced that war was necessary, and wrote in his diary on the eve of the conflict: "I'm very worried about the step we're about to take. The haste involved here is beyond my understanding."

Indeed, it is hard to identify cases of preemption -- save for those "spoiling attacks" featured at the tactical level in many military campaigns -- that do lie within our understanding. Francis Bacon no doubt had it right -- in theory -- four centuries ago in his essay, "Of Empire," when he asserted, "there is no question but a just fear of an imminent danger, though there be no blow given, is a lawful cause of war." But in practice, preemption has never made much sense as a strategic national policy. Further, the expensive misadventure in Iraq has made for real problems with the pursuit of preventive policies. Yet prevention may be the only rational way ahead, in terms of pursuing the twin goals of stemming proliferation -- in Iran and elsewhere -- and keeping weapons of mass destruction out of the hands of terrorists.

So it is time for senior leaders to fess up. The past decade has seen a lot of stumbling around, with wrongheaded preventive actions taken and a sustained, bipartisan effort at Newspeak that willfully mislabels prevention as preemption. Given the inherent problems with preemption, though, let's just be honest about the need to act forcefully and preventively against proliferators and terrorists -- even in the absence of imminent threats. How hard is it to admit this?