
Cyber Fail

Why can't the government keep hackers out? Because the public is afraid of letting it.

The world's leading cyberpower is … North Korea. This is the considered opinion of Richard Clarke, former cyberczar and advisor to four presidents. How has he come to this conclusion? Very sensibly, by assessing countries in terms of their offensive and defensive capabilities, along with the degree to which they are dependent on the Net and the Web. North Korea has only modest attacking capabilities -- don't look for the next Stuxnet to come slinking out of Pyongyang -- but its cyberdefenses are formidable, and there is little in that sad land that requires connectivity to cyberspace in order to keep working.

How does the United States fare in Clarke's analysis? Despite fielding the world's best computer worms and viruses, America rates only fourth place -- Russia comes in second and China third. The United States gets dragged down by its pitifully poor defenses, coupled with very high cyberdependence. At the Aspen Security Forum this summer, the head of Cyber Command, Gen. Keith Alexander, went so far as to give a grade of "3" to U.S. defenses on a scale of 1 to 10. He observed that cybersnooping is now so rampant that the theft of intellectual property constitutes the "greatest transfer of wealth in history."
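To make the arithmetic behind such a ranking concrete, here is a minimal sketch in Python of a Clarke-style scorecard, in which offensive and defensive strength count toward a country's cyberpower while dependence on networks counts against it. The scores below are invented purely for illustration and are not Clarke's actual figures; only the resulting order mirrors the ranking described above.

```python
# Hypothetical scorecard in the spirit of Clarke's method: offense and defense
# add to cyberpower, while dependence on the Net subtracts from it.
# All numbers are invented for illustration; they are not Clarke's figures.
countries = {
    #                 offense, defense, dependence (each 0-10)
    "North Korea":    (2, 9, 1),
    "Russia":         (7, 6, 5),
    "China":          (6, 6, 6),
    "United States":  (9, 3, 9),
}

def cyberpower(offense, defense, dependence):
    # Low dependence is treated as a strength, so it is scored as (10 - dependence).
    return offense + defense + (10 - dependence)

for rank, name in enumerate(
        sorted(countries, key=lambda c: cyberpower(*countries[c]), reverse=True), 1):
    print(rank, name, cyberpower(*countries[name]))
```

With these made-up inputs, North Korea's modest offense is outweighed by its strong defenses and near-total disconnection, while America's formidable offense cannot compensate for poor defenses and deep dependence.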

Things don't look so good -- and they're not getting better.

The recent defeat of the Senate's Cybersecurity Act of 2012 is just the latest reverse in a nearly 20-year run of repeated failures to master the challenge of protecting the virtual domain. Back in President Bill Clinton's first term, the "clipper chip" concept was all about improving the security of private communications. Americans were to enjoy the routine ability to send strongly encoded messages to each other that criminals and snoops would not be able to hack, making cyberspace a lot safer.

But the government was still to hold a "key" that would let it tap into and monitor said messages, primarily for purposes of law enforcement. The initiative foundered over this too-intrusive capacity. All these years later, the Cybersecurity Act called for a similar (though less encompassing) monitoring capability -- along with the request that commercial firms voluntarily share more information -- and died because of the concerns it rekindled.
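For readers unfamiliar with the key-escrow idea at the heart of that fight, here is a minimal, purely illustrative sketch in Python (using the third-party cryptography library). It is not the actual Clipper design, which relied on a tamper-resistant chip, but it shows the principle: the sender wraps the message key once for the recipient and once for an escrow authority, so whoever holds the escrow key can read the traffic too.

```python
# Toy illustration of key escrow (not the real Clipper scheme): the message key
# is wrapped both for the recipient and for an escrow authority, so the holder
# of the escrow private key can also read the traffic.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

OAEP = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

recipient_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
escrow_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# The sender encrypts a message under a fresh AES session key...
session_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(session_key).encrypt(nonce, b"meet at noon", None)

# ...and wraps that session key twice: once for the recipient, once for escrow.
wrapped = {
    "recipient": recipient_key.public_key().encrypt(session_key, OAEP),
    "escrow": escrow_key.public_key().encrypt(session_key, OAEP),
}

# Both the recipient and the escrow-key holder can recover the plaintext.
for who, private_key in (("recipient", recipient_key), ("escrow", escrow_key)):
    key = private_key.decrypt(wrapped[who], OAEP)
    print(who, AESGCM(key).decrypt(nonce, ciphertext, None))
```

The last loop is the whole controversy in miniature: the escrow holder decrypts the message exactly as easily as the intended recipient.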

These events are just the bookends of a long policymaking trail of tears. In the years after the clipper-chip debacle, commission after commission rose up to study how to improve cybersecurity without unduly violating privacy. Yet, even as the government considered snooping and hacking central concerns, it opposed the very idea of improving individual security by encouraging the use of powerful encryption -- largely because the intelligence and law enforcement communities strongly resisted any initiative that might reduce their ability to conduct cybertaps.

The government's intransigence was only countered in the end by the actions of "code rebels," to use tech journalist Steven Levy's term, who broke the rules -- and, arguably, the law -- by making top-tier encryption available to the people. Thanks to them, average Americans now have access to the same strong encryption capabilities available to their leaders -- as well as to the range of criminals, terrorists, and other rogues who are so utterly reliant on keeping their communications secure.

Sadly, industry leaders have never sufficiently emphasized the value of strong crypto either. There are many reasons for this neglect -- the most likely being that encouraging ubiquitous use of strong crypto could weaken sales of the firewalls and antivirus products that form so much of the cybersecurity business model. Most importantly, though, cybersecurity today is poor because the market hasn't demanded it. Consumers are much more interested in features such as speed, variety of apps, weight, even color -- so this is what drives production. It's a classic case of market failure.

Thus, the complex, constantly growing virtual world -- upon which individuals, commercial enterprises, and militaries are increasingly dependent -- is plagued by rampant insecurity. So say top governmental officials today. So say those who know the results of the CIA's extensive (and still classified) cyberwar game, Silent Horizon, conducted several years ago. And so say all involved in defending against the serious, real-life intrusions into defense information systems known to the public under names like Moonlight Maze and Titan Rain -- the former apparently involving sophisticated Russian hackers, the latter seemingly emanating from China.

Unless there is a profound change in perspective, the market will continue to fail, with manufacturers focusing on speedy, attractive tech products instead of secure ones. Unless a fresh mindset emerges among the public, the fear of Big Brother will prevent legislative action, even though the data mining of individuals and their consumer habits conducted by marketers and social networking sites -- a lot of Little Brothers -- already dwarfs what the government knows. It is odd indeed that people freely allow organizations like Facebook a level of access into their private lives that they resist giving their elected leaders in Washington. And unless presidents and their advisors start taking cyberthreats more seriously and stop saying things like "There is no cyberwar" (as President Barack Obama's former cyberczar, Howard Schmidt, used to), the lack of leadership on this issue will leave America gravely vulnerable.

But ways ahead do exist. There is a regulatory role: to mandate better security from the chip-level out -- something that Sen. Joseph Lieberman's Cybersecurity Act would only have made voluntary. Encouraging the widespread use of encryption can assuage fears about the loss of privacy. And finally, we should treat cybersecurity as a foreign-policy issue, not just a domestic one. For if countries, and even some networks, can find a way to agree to norms that discourage cyberwar-making against civilian infrastructure -- much as the many countries that can make chemical and biological weapons have signed conventions against doing so -- then it is just possible that the brave new virtual world will be a little less conflict prone.



Cool War

Could the age of cyberwarfare lead us to a brighter future?

"It is well that war is so terrible," Confederate General Robert E. Lee once said, "lest we should grow too fond of it." For him, and generations of military leaders before and since, the carnage and other costs of war have driven a sense of reluctance to start a conflict, or even to join one already in progress.

Caution about going to war has formed a central aspect of the American public character. George Washington worried about being drawn into foreign wars through what Thomas Jefferson later called "entangling alliances." John Quincy Adams admonished Americans not to "go abroad in search of monsters to destroy." Their advice has generally been followed. Even when it came to helping thwart the adventurer-conquerors who started the twentieth century's world wars, the United States stayed out of both from the outset, entering only when dragged into them.

This pattern briefly changed during the Cold War, with the launching of military interventions in Korea and Vietnam. The former was fought to a bloody draw; the latter turned into a costly debacle. Both were quite "terrible," together costing nearly 100,000 American lives and trillions of dollars in treasure -- reaffirming Lee's reservations.

Operation Desert Storm -- a lopsided win against a weak opponent in Iraq -- seemed to break the pattern, ushering in President George H.W. Bush's "new world order." But the military experiments in regime change begun by his son -- an unexpectedly long and bloody slog through Iraq and Afghanistan -- reawakened traditional concerns about going to war, propelling Barack Obama to the presidency and energizing Ron Paul's support within the GOP.

Even Obama's "intervention-lite" in Libya proved unsatisfying, unleashing much suffering and uncertainty about the future of that sad land. And a furious debate rages about the practical and ethical value of drone bombing campaigns and "targeted killing" of our enemies -- due in part to the deaths of innocents caught up in these attacks, but also because of the possibility of fomenting rabidly anti-American sentiments, perhaps even revolution, in places like nuclear-armed Pakistan.

But now, somehow, war may no longer seem so terrible.

How has this come to pass? The culprit is the bits and bytes that are the principal weapons of cyberwar. It is now possible to intervene swiftly and secretly anywhere in the world, riding the rails of the global information infrastructure to strike at one's enemies. Such attacks can be mounted with little risk of discovery, as the veil of anonymity that cloaks the virtual domain is hard to pierce. And even when an attacker is "outed," the lack of convincing forensic evidence to finger the perpetrator makes heated denials hard to disprove.

Beyond secrecy, there is also great economy. The most sophisticated cyber weaponry can be crafted and deployed at a tiny fraction of the cost of other forms of intervention. No aircraft carriers needed, no "boots on the ground" to be shot at or blown up by IEDs. Instead, there is just a dimly lit war room where hacker-soldiers click for their country, and the hum of air conditioners keeping powerful computers from overheating. Cool room, cool war.

The early returns seem to suggest the great efficacy of this new mode of conflict. For example, the Stuxnet worm, a complex program of ones and zeros, infected a sizeable proportion of Iran's several thousand centrifuges, commanding them to run at higher and higher speeds until they broke. All this went on while Iranian technicians tried fruitlessly to stop the attack. The result: a serious disruption of Tehran's nuclear enrichment capabilities -- and possibly of a secret proliferation program.
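As a very rough illustration of the failure mode described above -- and nothing more -- here is a toy Python simulation in which a compromised controller quietly ramps a rotor's true speed while reporting a normal value to operators, the kind of sensor spoofing widely reported in public analyses of Stuxnet. The speeds, rates, and failure threshold are invented; real centrifuges are driven by industrial controllers, not Python scripts.

```python
# Toy simulation only: a compromised control loop over-speeds a rotor while
# displaying a normal reading. All numbers are invented for illustration.
NOMINAL_RPM = 63_000   # hypothetical safe operating speed
FAILURE_RPM = 80_000   # hypothetical speed at which the rotor breaks

def control_step(true_rpm, compromised):
    """Advance one cycle; return (actual speed, speed shown to operators)."""
    if compromised:
        true_rpm *= 1.02             # quietly push the real speed upward
        displayed_rpm = NOMINAL_RPM  # ...while spoofing a reassuring reading
    else:
        true_rpm = NOMINAL_RPM       # an honest controller holds the setpoint
        displayed_rpm = true_rpm
    return true_rpm, displayed_rpm

rpm = NOMINAL_RPM
for cycle in range(1, 41):
    rpm, shown = control_step(rpm, compromised=True)
    if rpm >= FAILURE_RPM:
        print(f"cycle {cycle}: rotor fails at {rpm:,.0f} rpm "
              f"while operators still see {shown:,.0f} rpm")
        break
```

The spoofed display line is the crux: by most public accounts, the operators' instruments kept insisting that nothing was wrong, which is why the troubleshooting proved so fruitless.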

The sabotage occurred without any missile strikes or commando raids. And, for now, without any open acknowledgment of responsibility, although reporters and others have pointed their fingers at the United States and Israel. It is loose lips in high places, not sophisticated "back hacking," that seem to have divulged the secret of Stuxnet.

Another example of the looming cool war is the malicious software known as Flame, which sought information via cyber snooping from target countries in the Middle East. The code that makes it up seems to prove the point that we no longer need physical agents in place when we can rely on artificially intelligent agents to dredge up the deepest secrets. There will be no new John le Carré to chronicle this era's spies. Not when the closest thing to George Smiley is a few lines of source code.

Beyond Stuxnet-like "cybotage" and software-driven spying, the coming cool war might also influence whether some traditional wars are even going to break out. The good news is that a preemptive cyber attack on the military command-and-control systems of two countries getting ready to fight a "real war" might give each side pause before going into the fight. In this instance, the hackers mounting such attacks should probably publicize their actions -- perhaps even under U.N. auspices -- lest the disputants think it was the enemy who had crippled their forces, deepening their mutual antagonism. There are no doubt some risks in having a third party mount a preemptive cyberattack of this sort -- but the risks are acceptable when weighed against the chance of averting a bloody war.

The other potential upside of cool war capabilities, in addition to tamping down military crises between nations, would lie in multilateral tracking of transnational criminal and terrorist networks. These villains thrive in the virtual wilderness of cyberspace, and it is about time that they were detected, tracked, and disrupted. Think of Interpol, or an international intelligence alliance, using something like Flame to get inside a drug cartel's communications network. Or al Qaeda's. The potential for illuminating these dark networks -- and bringing them to justice -- is great and should not be forgone.
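What "illuminating" such a network might look like in practice can be sketched with a few lines of Python using the open-source networkx library: betweenness centrality is one standard way to flag the brokers and couriers who hold a covert network together. The graph below is entirely fictional.

```python
# Fictional communications graph: rank members by betweenness centrality,
# a standard measure for spotting the brokers who connect a network's cells.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("courier", "cell_A1"), ("courier", "cell_A2"),    # first cell
    ("courier", "broker"),
    ("broker", "financier"), ("broker", "cell_B1"),    # second cell
    ("cell_B1", "cell_B2"),
    ("financier", "supplier"),
])

centrality = nx.betweenness_centrality(G)
for name, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{name:9s} {score:.2f}")
```

Real intelligence work obviously involves far messier data, but the same logic -- map the links, find the chokepoints, then disrupt them -- is what gives the multilateral approach its appeal.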

On balance, it seems that cyberwar capabilities have real potential to deal with some of the world's more pernicious problems, from crime and terrorism to nuclear proliferation. In stark contrast to pitched battles that would regularly claim thousands of young soldiers' lives during Robert E. Lee's time, the very nature of conflict may come to be reshaped along more humane lines of operations. War, in this sense, might be "made better" -- think disruption rather than destruction. More decisive, but at the same time less lethal.

Against these potential benefits, one must also weigh the key downside of an era of cyber conflict: the outbreak of a Hobbesian "war of all against all." This possibility was first considered back in 1979 by the great science fiction writer Frederik Pohl, whose dystopian The Cool War -- a descriptor that might end up fitting our world all too well -- envisioned a time when virtually every nation fielded small teams of hit men and women. Their repertoires ranged from launching computer viruses to crash stock markets to other nefarious forms of disruption.

In Pohl's novel, the world system is battered by waves of social distrust, economic malaise, and environmental degradation. Only the rebellion, at the end, of a few cool warriors -- some, but not all, of them hacker types -- offers a glimmer of hope for a way out and a way ahead.

The question that confronts us today is whether to yield to the attractions of cyberwar. We have come out of one of mankind's bloodiest centuries, and are already in an era in which wars are smaller -- if still quite nasty. Now we have the chance to make even these conflicts less lethal. And in reality, there may be no option. Once the first network or nation takes this path -- as some observers believe the United States is doing -- others will surely follow, starting a new arms race, this time not in conventional weaponry, but in clandestine and devastating programs like Stuxnet and the Flame virus.

It is a curious irony that the United States, a power traditionally reluctant to go to war but furious in its waging, is now seemingly shifting gears. It is becoming a nation able to go to war easily, yet to wage it far less ferociously. Is this an improvement? Perhaps. Delaying Iranian proliferation with bits and bytes seems far superior to the costs and risks that would be incurred, and the human suffering inflicted, by trying to achieve such effects with bombs and bullets.

But looking ahead, how will Americans respond when others begin to employ cyber means to achieve their ends, perhaps even by attacking us? After all, Stuxnet escaped from that Iranian facility into the wild, and is certainly being studied, reverse-engineered, and tweaked by many around the world. No country may be foolish enough to engage the incomparable U.S. military in open battle, but we seem like fairly easy pickings to the computer mice that may soon roar.

Despite all these concerns, though, a cool war world will be a better place to live in than its Cold War predecessor. Yes, conflict will continue in the years to come, but it will morph in ways that make our self-destruction as a civilization less likely -- even if it means living with occasional disruptions to vulnerable high-tech systems.

The bargain made when "cyber" and "war" came together need not turn out to be Faustian. This story can still have a happy ending: As war becomes "cooler," mankind's future may edge a bit closer to the utopian end that all of us, secretly or not so secretly, truly desire.
