Rich nation, poor army?

Today I offer a brief comment on David Bosco's excellent FP piece on U.N. peacekeeping. Bosco points out that the United Nations draws its peacekeepers overwhelmingly from poor societies; in his words, "U.N. peacekeeping is an activity mostly paid for by the rich world and carried out by troops from poorer states."

My comment is twofold. First, much the same could be said of military activity conducted by the United States of America. Now that the country has an all-volunteer force, military service in the United States is increasingly reserved for the poorer segments of society. As Amy Lutz, a Syracuse University sociologist, concludes in a 2008 article: "as family income increases, the likelihood of having ever served in the military decreases … the economic elite are very unlikely to serve in the [U.S.] military." As with U.N. peacekeeping, in short, the "common defense" in the United States is an activity paid for by richer Americans and carried out (mostly) by poorer Americans.

Second, I suspect this tendency reflects the broad recognition that warfare is not an especially glorious or attractive activity: It may be necessary at times, but military service is not the best way to make a living if you have other alternatives. For the most part, Americans no longer share Teddy Roosevelt's belief that "a just war is in the long run far better for a man's soul than the most prosperous peace." It may also reflect the collective social awareness that the United States is actually very secure and that most citizens (and particularly those who are well off) do not need to serve in uniform in order to make a contribution to the national defense. Instead, they can just get a job and pay their taxes.

None of this should be seen as denigrating military service itself or questioning the choices of those Americans (including the relatively well-to-do) who opt for a military career. But as Karl Eikenberry and David Kennedy observed in a thoughtful New York Times op-ed this week, the gradual separation between the U.S. military and the rest of society has significant costs and may ultimately be quite unhealthy for the republic. (For a longer discussion, Eikenberry's recent article in the Washington Quarterly is well worth reading too.)

Stephen M. Walt

On intellectual diversity

It's Commencement Day here at Harvard, and we are sending the Class of 2013 out into the world with congratulations, good wishes, and high hopes. My graduate students here at the Kennedy School are a remarkable group, and I look forward to watching them make their way in the complex and often troubling world of foreign policymaking.

At such times I tend to think about how we might have educated them better, and I want to draw an analogy to an interesting op-ed by Jacob Hamblin in today's New York Times. Hamblin's subject is biodiversity, and he traces the origins of our present concern for it to some rather chilling Cold War strategic planning. Specifically, war planners investigating ways to destroy enemy ecosystems gained new appreciation for the dangers of environments that were ecologically one-dimensional (such as vast farmlands sown with a single crop). In particular, loss of diversity leaves whole areas vulnerable to a single pathogen or event that wipes out the dominant species.

I would argue that the same is true of "intellectual ecosystems" as well. When academic disciplines become overly concentrated on one set of questions, one set of theoretical answers, one set of methods, or one body of data, what might seem at first glance to be a powerful engine of scholarly progress can be a source of danger as well. Having everyone working in more or less the same way can generate lots of publications and citations and even help knowledge advance in this particular area, but "normal science" of this sort also means that alternative approaches, questions, methods, or theories get short shrift. The danger is that scholars wake up one day and discover that the reigning method du jour has fatal limitations, or it turns out that some neglected skills (e.g., foreign languages, cultures, etc.) suddenly become very valuable.

In the IR field, for example, contemporary graduate training increasingly involves mastering an enormous arsenal of methodological skills, most of them statistical in nature. Because there are only 24 hours in a day and only five to six years in most Ph.D. programs, most students won't have the time to learn foreign languages, read broadly in history, do more than cursory field research on rather narrow topics, or even acquire a sophisticated understanding of social theory. There are important benefits to this type of training -- though perhaps fewer than is often alleged -- but privileging this particular set of skills comes with a cost as well. If today's graduate students increasingly resemble each other -- varying only in their raw talents or determination -- then we are in effect creating an intellectual monoculture that might leave us badly prepared for new developments. To take an obvious example: After the 9/11 attacks, wouldn't it have been nice to have had a few more people in academia who really understood Islam, the Middle East, the nature of terrorist movements, or even Arabic? Similarly, having a few more people who understood how financial markets and regulations really worked (as opposed to how they worked in theory) might have come in handy both before and after the Great Meltdown.

Of course, the combination of tenure and the abolition of mandatory retirement in the United States compounds this dilemma. Scholars rise to the top of their fields based mostly on their early work, which is bound to reflect the research norms and standards that prevailed at the time. Most academics try to grow and develop over time, and a few exhibit dramatic shifts in their thinking, but for the most part senior scholars tend to favor work that resembles their own. So they hire and promote people who are more or less like them, further diminishing the degree of intellectual diversity within the field.

I don't have an obvious antidote to this tendency. But one of the nice things about teaching at a public policy school is the presence of numerous disciplines within the same faculty (even if economists tend to dominate, or at least they often try to). In addition to being more interesting, such schools may be better equipped to handle new developments in the real world than most academic departments are. And because public policy schools are explicitly supposed to prepare students for careers in the real world (as opposed to the ivory tower), they are more likely to welcome practitioners, public intellectuals, and scholars who don't fit into neat categories. The result is a much richer intellectual environment and one that ought to be more adaptable over time.

If this theory is right, then public policy schools (and other explicitly inter- and multidisciplinary enterprises) should have an especially bright future -- not because they are necessarily better at any one thing, but because they will be less vulnerable to fads, changes of fashion, or shifts in the agenda of relevant problems. Like a diverse investment portfolio or a diverse ecosystem, building a diverse intellectual environment is the smart long-term strategy.

So to the graduates of the Class of 2013, I say: "Congratulations! Your degree is valuable today and is likely to be even more valuable in the future." At least I hope so, for their sake as well as my own.
