National leaders get all sorts of advice in times of tension and conflict. But often the competing counsel can be broken down into two basic categories. On one side are the hawks: They tend to favor coercive action, are more willing to use military force, and are more likely to doubt the value of offering concessions. When they look at adversaries overseas, they often see unremittingly hostile regimes that only understand the language of force. On the other side are the doves, skeptical about the usefulness of force and more inclined to contemplate political solutions. Where hawks see little in their adversaries but hostility, doves often point to subtle openings for dialogue.
As the hawks and doves thrust and parry, one hopes that the decision makers will hear their arguments on the merits and weigh them judiciously before choosing a course of action. Don't count on it. Modern psychology suggests that policymakers come to the debate predisposed to believe their hawkish advisors more than the doves. There are numerous reasons for the burden of persuasion that doves carry, and some of them have nothing to do with politics or strategy. In fact, a bias in favor of hawkish beliefs and preferences is built into the fabric of the human mind.
Social and cognitive psychologists have identified a number of predictable errors (psychologists call them biases) in the ways that humans judge situations and evaluate risks. Biases have been documented both in the laboratory and in the real world, mostly in situations that have no connection to international politics. For example, people are prone to exaggerating their strengths: About 80 percent of us believe that our driving skills are better than average. In situations of potential conflict, the same optimistic bias makes politicians and generals receptive to advisors who offer highly favorable estimates of the outcomes of war. Such a predisposition, often shared by leaders on both sides of a conflict, is likely to produce a disaster. And this is not an isolated example.
In fact, when we constructed a list of the biases uncovered in 40 years of psychological research, we were startled by what we found: All the biases in our list favor hawks. These psychological impulses -- only a few of which we discuss here -- incline national leaders to exaggerate the evil intentions of adversaries, to misjudge how adversaries perceive them, to be overly sanguine when hostilities start, and overly reluctant to make necessary concessions in negotiations. In short, these biases have the effect of making wars more likely to begin and more difficult to end.
None of this means that hawks are always wrong. One need only recall the debates between British hawks and doves before World War II to remember that doves can easily find themselves on the wrong side of history. More generally, there are some strong arguments for deliberately instituting a hawkish bias. It is perfectly reasonable, for example, to demand far more than a 50-50 chance of being right before we accept the promises of a dangerous adversary. The biases that we have examined, however, operate over and beyond such rules of prudence and are not the product of thoughtful consideration. Our conclusion is not that hawkish advisors are necessarily wrong, only that they are likely to be more persuasive than they deserve to be.
Several well-known laboratory demonstrations have examined the way people assess their adversary's intelligence, willingness to negotiate, and hostility, as well as the way they view their own position. The results are sobering. Even when people are aware of the context and possible constraints on another party's behavior, they often fail to factor them in when assessing the other side's motives. Yet they still assume that outside observers grasp the constraints on their own behavior. With armies on high alert, it's an instinct that leaders can ill afford to ignore.
Imagine, for example, that you have been placed in a room and asked to watch a series of student speeches on the policies of Venezuelan leader Hugo Chávez. You've been told in advance that the students were assigned the task of either attacking or supporting Chávez and had no choice in the matter. Now, suppose that you are then asked to assess the political leanings of these students. Shrewd observers, of course, would factor in the context and adjust their assessments accordingly. A student who gave an enthusiastic pro-Chávez speech was merely doing what she was told, not revealing anything about her true attitudes. In fact, experiments suggest that most observers do no such thing: They would overwhelmingly rate the pro-Chávez speakers as more leftist. Even when alerted to context that should affect their judgment, people tend to ignore it. Instead, they attribute the behavior they see to the person's nature, character, or persistent motives. This bias is so robust and common that social psychologists have given it a lofty title: They call it the fundamental attribution error.
The effect of this failure in conflict situations can be pernicious. A policymaker or diplomat involved in a tense exchange with a foreign government is likely to observe a great deal of hostile behavior by that country's representatives. Some of that behavior may indeed be the result of deep hostility. But some of it is simply a response to the current situation as it is perceived by the other side. What is ironic is that individuals who attribute others' behavior to deep hostility are quite likely to explain away their own behavior as a result of being "pushed into a corner" by an adversary. The tendency of both sides of a dispute to view themselves as reacting to the other's provocative behavior is a familiar feature of marital quarrels, and it is found as well in international conflicts. During the run-up to World War I, the leaders of every one of the nations that would soon be at war perceived themselves as significantly less hostile than their adversaries.
If people are often poorly equipped to explain the behavior of their adversaries, they are also bad at understanding how they appear to others. This bias can manifest itself at critical stages in international crises, when signals are rarely as clear as diplomats and generals believe them to be. Consider the Korean War, just one example of how misperception, and a failure to appreciate how an adversary reads one's intentions, can lead to hawkish outcomes. In October 1950, as U.N. coalition forces were moving rapidly up the Korean Peninsula, policymakers in Washington were debating how far to advance and attempting to predict China's response. U.S. Secretary of State Dean Acheson was convinced that "no possible shred of evidence could have existed in the minds of the Chinese Communists about the non-threatening intentions of the forces of the United Nations." Because U.S. leaders knew that their intentions toward China were not hostile, they assumed that the Chinese knew this as well. Washington was, therefore, incapable of interpreting the Chinese intervention as a reaction to a threat. Instead, the Americans interpreted the Chinese reaction as an expression of fundamental hostility toward the United States. Some historians now believe that Chinese leaders may in fact have seen the advancing U.N. forces as a threat to their regime.