Strictly Confidential

Why professors and analysts are our best secret weapon.

Everyone says an open, transparent government is the hallmark of a healthy democracy. But most reasonable people would agree that there must be limits to how transparent government can be. After all, no one wants sensitive national security secrets to fall into the wrong hands. When information shared in the name of good governance can be exploited by terrorists, many would argue the public's right to know pales in comparison to the public’s need for protection.

But are openness and security really opposing values? When government officials curb access to information, they cut themselves off from the brainpower and analytical skills of a huge community of scientists, engineers, and security experts who are often far better at identifying threats, weaknesses, and solutions than any government agency.

The effort to cordon off experts from sensitive information has been dramatic, especially in the United States. An executive order signed by President George W. Bush in 2003 permitted -- some say encouraged -- the U.S. government to classify mountains of information. For the first time, basic infrastructure information was designated as a category of classifiable information. The increase in secrecy has been staggering: In 1996, the U.S. government classified 5.8 million documents. By 2005, that figure had more than doubled, to 14.2 million documents stamped "secret" or "confidential." Even more information has been kept from the public through the growing use of "sensitive but unclassified" markings, which operate without the legal constraints of the traditional classification system. In 2006 alone, the number of categories under which a document might be labeled "sensitive" grew by 20 percent. But this added secrecy hasn't made anyone safer. In truth, it is leaving everyone more vulnerable to emerging threats.

Consider the efforts of Lawrence Wein, a Stanford University business professor, and his then graduate student, Manas Baveja. In research published in 2005, Wein and Baveja analyzed the effectiveness of the U.S. Visitor and Immigrant Status Indicator Technology, a fingerprint identification program designed to prevent known terrorists from entering the United States. Using information on fingerprint readers from the Web site of the National Institute of Standards and Technology (NIST), the federal agency that tracks the United States' technology infrastructure, they found that terrorists could take advantage of the system by reducing the image quality of their fingerprints. (Simply rubbing one's fingers with sandpaper would do the trick.) They also found an easy solution: Require people to present more fingers if their prints were poor. The fix made the program more effective even though terrorists knew exactly how the system worked. Immigration officials quickly adopted a version of Wein and Baveja's idea. But NIST responded by removing the information Wein and Baveja used for their analysis from its Web site.

Or, take graduate student Sean Gorman's 2003 dissertation at George Mason University. Gorman used public information to map the fiber-optic network of the United States, identifying critical choke points in the country's telecommunications infrastructure. Under pressure from government officials, Gorman ultimately agreed to redact many of his most sensitive findings. Today, parts of his research remain classified, and much of the information Gorman used is no longer publicly available. But this reaction misses what was so valuable about his study. In identifying dangerous vulnerabilities, Gorman found efficient ways to remove them. Putting information under lock and key does not make targets safe from attack. It leaves security analysts unable to find solutions to other weaknesses in the future. It also leaves government and industry less motivated to find safeguards of their own.

Harvard University's Efraim Benmelech and the RAND Corp.'s Claude Berrebi took up a research project with unmistakable relevance to national security. Using publicly available data from the Israeli Security Agency, they found not only that older, better-educated suicide bombers are assigned to more important targets, but also that these bombers are deadlier in their attacks. Their analysis depended critically on access to data about failed attacks, information that most other governments restrict out of fear that it could prove useful to terrorists. But understanding how terrorist groups assign operatives to missions does more to combat terrorism than the Israeli government's openness did to abet it.

Governments must do a better job of assessing when the benefits of sharing information will exceed the costs. The question should not be, "Can this information help terrorists at all?" It should be, "Will sharing this information do more to protect society than it will to help those who wish us harm?" In weighing the trade-offs, common sense is the most natural guide. When governments don't fully understand a system's vulnerabilities, information should be shared so that analysts can find solutions before terrorists identify weaknesses. Similarly, information about known vulnerabilities should be shared when terrorists can easily identify the target.

The thinking capacity of a huge network of universities and research centers should be considered a national security strength, not a threat or nuisance that needs to be kept at arm's length. The academy's powers to analyze dangers and recommend safe, efficient solutions are stronger than those of the government, and infinitely stronger than those of any terrorist organization. When governments treat everyone as a potential terrorist, they insulate themselves from the brainpower that should be our first line of defense.


Africa's Revolutionary Deficit

In many parts of Africa, anyone can start a revolution. And that's the problem.

Somalia is once again on the front page -- and the news isn't pretty. Since 2003, the country's seaside capital of Mogadishu has served as an arena for a battle of gladiators, pitting U.S.-backed warlords against gun-toting Islamic revolutionaries. With no capable or legitimate state to counter it, the Union of Islamic Courts emerged victorious last June, only to be felled in December by an enfeebled transitional government, formed in exile and backed by the Ethiopian military. A recent spate of assassination-style killings and suicide bombings heralds the arrival of a new resistance movement intent on ejecting these foreign forces and the African Union troops now being dispatched to the country. Caught in the midst of this violent morass is Somalia's long-suffering population of 8.5 million, seeking order from whoever can provide it, simply hoping that the bully who comes out on top will care enough to reverse the country's economic collapse.

Somalia may be garnering headlines today, but the country’s strife parallels the bloodshed in far too many of Africa’s struggling nations. Violence has engulfed 27 of the 46 countries in sub-Saharan Africa since independence, and the revolutionary movements that emerged to wage these wars of "liberation" and "transformation" have rarely behaved better than the regimes they sought to uproot. In Sierra Leone, the Revolutionary United Front publicly challenged decades of corrupt leadership as it hacked its way through the countryside, killing and maiming thousands of civilians in its quest for control of the nation’s diamond mines. After the fall of Mobutu Sese Seko in the Democratic Republic of the Congo in 1997, a patchwork of competing militias and warlords ruled the vast eastern provinces, promising clean government and a return to democracy, while looting homes and raping women at will. In the past 10 years, the story has been no different in Angola, the Central African Republic, Chad, Congo, and Liberia: rebels trampling on civilian populations in their quest to capture the capital.

Why have Africa's civil wars so rarely produced revolutionary movements that fight for the political and economic changes that the population deserves? The answer is as simple as the violence is troubling. In much of Africa, the barriers to entry for rebel movements are simply too low. With states often incapable of projecting power outside of cities and insurgents easily able to finance their own private armies, just about anyone can hoist a flag, arm recruits, and launch a revolution. Building a rebel army should be difficult, in principle, because young people must risk their lives for highly uncertain returns. But in many parts of Africa, initiating a rebellion may be easier than starting a business.

Unlike early nation-states in Europe, where rulers depended on citizens for taxes and built strong states to protect them in return, Africa's state-building process has often gone awry. Seldom do rebel leaders turn to civilians for the resources needed to field private armies. War is becoming cheaper, and the means to wage it flow from illicit trafficking in natural resources, contributions from foreign capitals, or expatriate networks -- not from the voluntary contributions of those who most need political change. Legitimacy, too, depends not on popular support but simply on achieving control of the capital city, where a seat at the United Nations provides all the protections of sovereignty. With such a system in place, is it really any surprise that civilian populations have been largely ignored by Africa's revolutionaries?

The great irony is that in a part of the world where civil war is endemic, Africa faces a dispiriting shortage of true revolutionaries -- members of movements committed to replacing decades of misrule with effective, transparent governance. Only in places where armies have been mobilized with the most meager resources have we witnessed the birth of insurgencies that protect and advocate for the poor. But in countries rich in natural resources, where elites loot the treasury rather than provide public goods for ordinary people, civilians have been cursed with abusive insurgencies. These are environments in which an opportunistic form of rebellion is most attractive -- where the barriers to organizing an army are low, the pickings are good, and constructive revolutionary movements tend to be crowded out by criminals.

War must be made more expensive in Africa. That means redoubling efforts to choke off the sources of financial support that prop up rebel armies. Stemming the trade in illicit resources is an important first step, but insurgent movements also draw heavily on financing provided by neighboring governments. Just as governments were pressured by international nongovernmental organizations to clamp down on the trade in blood diamonds and other illegally traded resources, cross-border support for rebel groups must be unearthed, publicized, and penalized. Civil-society organizations have a role to play, but ultimately governments, acting through the U.N. Security Council, must make external alliances with rebels more costly. Diaspora financing, too, given its origins in rich countries, can be stopped at its source. And the proliferation of small arms and light weapons -- technologies that diminish the costs of raising an army -- requires urgent international attention. Rich countries continue to be among the most substantial producers and distributors of small arms; they should demonstrate a clear commitment to stronger export and border controls and more aggressive efforts to dismantle trafficking networks, perhaps in the form of an international arms trade treaty.

Part of the challenge is that sovereignty accrues to whoever mobilizes the guns and men required to take a capital. But sovereignty, with all of its benefits, should be conditional. Making it so, however, means abandoning decades of U.N. impartiality and recognizing that rebel movements, like governments, wear different stripes. A seat at the table should be a privilege, and it should be reserved for those who earn it.