Argument

The New Westphalian Web

The future of the Internet may lie in the past. And that's not a good thing.

Nearly 365 years ago, more than 100 warring diplomats and princes got together in the cities of Münster and Osnabrück, in what is now northwestern Germany. There they signed a set of treaties that became the basic framework for our modern world: the Peace of Westphalia. Thanks to these dignitaries, we have territorial sovereignty: nation-states, demarcated by borders.

In the intervening centuries, Westphalian sovereignty has been the basic ordering principle of our societies. Empires have risen and fallen, countries come and gone. The most successful states have established internal monopolies on information and resources and have exerted discretion on what trade, ideas, money, or people crossed their borders.

But 30 years ago, humanity gave birth to one of the most disruptive forces of our time. On Jan. 1, 1983, the implementation of TCP/IP -- a standard protocol to allow computers to exchange data over a network -- turned discrete clusters of research computers into a distributed global phenomenon. It was essentially the work of three men: two engineers to write the protocol, and one to carry out the plan. It was a birth so quiet no one even has a photo of the day; a recent post by one of TCP/IP's authors, Vint Cerf, was able to turn up only a commemorative pin.
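For readers who have never seen the protocol at work, here is a minimal sketch of what that standard makes possible: any two machines that speak TCP/IP can open a connection and exchange bytes, regardless of who built them or where they sit. The loopback address, port number, and messages below are placeholders invented for the example, not anything from the original protocol work.

```python
# Minimal sketch of the kind of exchange TCP/IP standardized: one process
# listens, another connects, and bytes flow between them over the network
# stack. The address, port, and messages are illustrative placeholders.
import socket
import threading

HOST, PORT = "127.0.0.1", 9090           # loopback here; any reachable host would do

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind((HOST, PORT))
server.listen(1)                          # ready before the client tries to connect

def serve_once():
    conn, _ = server.accept()             # wait for a single client
    with conn:
        data = conn.recv(1024)            # read whatever arrives
        conn.sendall(b"ack: " + data)     # and answer it
    server.close()

threading.Thread(target=serve_once).start()

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as client:
    client.connect((HOST, PORT))
    client.sendall(b"hello across the network")
    print(client.recv(1024).decode())     # -> ack: hello across the network
```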

It took a while for the Internet to make it from mainframes in universities to desktops in the home, but as it did, it birthed its own culture, full of shorthands and memes, communities and cesspools. This Internet was wild and woolly, unknown and unregulated. It was clearly a place, but a place without any familiar cultural signposts, a space beyond the boundaries of geography or identity. It deserved its own name: cyberspace.

Like all new frontiers, cyberspace's early settlers declared themselves independent -- most famously in 1996, in cyberlibertarian John Perry Barlow's "A Declaration of the Independence of Cyberspace." Barlow asserted a realm beyond borders or government, rejecting the systems we use to run the physical universe. "Governments of the Industrial World," he reproached, "You have no sovereignty where we gather.… Cyberspace does not lie within your borders."

With the flip of a switch, three engineers had undone the work of more than 100 princes and diplomats.

Barlow was right, in part. Independence was a structural fact of cyberspace, and free expression and communication were baked into the network. The standards and protocols on which the Internet runs are agnostic: They don't care whether you are in Bangkok, Buenos Aires, or Boise. If they run into an attempt to block traffic, they merely reroute along a seemingly infinite network of decentralized nodes, inspiring technologist John Gilmore's maxim: "The Net interprets censorship as damage and routes around it."
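Gilmore's maxim can be illustrated with a toy example. The five-node topology below is entirely invented; the point is only that in a meshed network, blocking one relay rarely severs the rest, because traffic simply finds another path.

```python
# Toy illustration of "routing around damage": block one node in a meshed
# network and traffic finds another way through, so long as any path remains.
# The five-node topology is invented for the example.
from collections import deque

links = {
    "A": {"B", "C"},
    "B": {"A", "D"},
    "C": {"A", "D", "E"},
    "D": {"B", "C", "E"},
    "E": {"C", "D"},
}

def find_path(src, dst, blocked=frozenset()):
    """Breadth-first search for any route from src to dst avoiding blocked nodes."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in links[path[-1]] - seen:
            if nxt not in blocked:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None                            # only happens if the graph is truly cut

print(find_path("A", "E"))                 # ['A', 'C', 'E']
print(find_path("A", "E", blocked={"C"}))  # ['A', 'B', 'D', 'E'] -- rerouted
```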

And unlike almost every other global resource in history, the Internet largely escaped government regulation at first -- probably because no one could figure out how to make money from it. From the outset, it was managed not by governments, but by an ad hoc coalition of volunteer standards bodies and civil society groups composed of engineers, academics, and passionate geeks -- awkwardly dubbed the multistakeholder system.

So lawmakers and politicians wrung their hands over the Internet's lawlessness, gnashed their teeth at the moral decay of porn and downloads, and despaired at their inability to legislate a place without a geography. In the popular consciousness, the Internet was simultaneously a place of possibility and danger. In 1993, Time magazine warned, "People who use … the Net may be in for a shock.… Anybody can start a discussion on any topic and say anything."

It was precisely this structural independence that transformed the Internet from a mere tool for information-sharing to the world's open forum.

The rise of self-publishing tools like Blogger transformed the "third space" of cyberspace into a modern speaker's corner, offering any motivated writer a platform for his or her political views. Initially, this online free expression was often marginalized or dismissed -- the term "blogosphere" was originally a joke. But bloggers kept plugging away. In liberal democracies their free expression was guaranteed, and in closed societies connectivity was often too limited to draw any real attention.

In the past decade, however, all this has changed. Roughly 2 billion people use the Internet, in nearly every country in the world (North Korea is perhaps the last holdout). Blogs are now mainstream, and social networks have pushed self-publishing even closer to ordinary users, enabling instantaneous political and personal expression. And the Internet -- this global resource, this wild space independent of states -- has made its mark on our neatly ordered world of nations.

Information has always been power, and governments have long sought to control it. So for countries where power is a tightly controlled narrative, parsed by state television and radio stations, the Internet has been catastrophic. Its global, decentralized networks of information-sharing have routed around censorship -- just as Gilmore promised they would. It gives people an outlet to publish what the media cannot, organize where organizing is forbidden, and revolt where protest is unknown.

And the Internet isn't only threatening dictatorships. It has created new forms of political participation and protest in democracies, where it has been used to demand the decentralization of power to the people, facilitate radical transparency and information-sharing, demand responsive government, unseat corrupt authorities, organize marginalized minorities, and challenge the hegemony of traditional political heavyweights.

Naturally, systems of power have finally taken notice.

In response, governments around the world have begun to assert control, seeking to carve up the global Internet, manage it within national borders, and impose Westphalian sovereignty on the wild World Wide Web.

It's not entirely a new trend. The Great Firewall of China is almost as old as the Internet itself. But it is spreading, and taking new shapes.

Some of these efforts are explicitly about political control, imposing strict limits on what users within individual countries' borders can access. Iran's proposed halal Internet seeks to impose Islamic virtue on the browsing masses. In Russia, the state agency Roskomnadzor enforces an Internet block list that has filtered the blogs of government critics. And in Pakistan, a recently revived proposal for a national firewall targets "blasphemy" as a proxy for ideas unpopular with the government.

But some of this is about commerce and partitioning off intellectual property from a world without jurisdiction. In 2012, the United States saw proposed legislation, SOPA and PIPA, that would have made censorship a technical specification of U.S. networks and that threatened the stability of the Domain Name System (DNS) -- the protocol that forms the very backbone of the global web. And in Europe, the international trade agreement ACTA would have imposed similar restrictions -- all to reduce piracy.
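Why tampering with DNS matters is easier to see with a small sketch. DNS is the directory that turns human-readable names into routable addresses, and nearly every connection begins with a lookup like the one below (the domain shown is just an example). DNS-level blocking of the kind SOPA contemplated operates at exactly this step, ordering resolvers to return nothing, or something false, for blacklisted names.

```python
# Minimal sketch of the role DNS plays: a connection starts by asking the
# resolver chain to turn a name into an address. Break or falsify that step
# and the name effectively disappears, even though the site stays online.
import socket

domain = "example.com"                     # placeholder domain
address = socket.gethostbyname(domain)     # standard-library DNS lookup
print(f"{domain} -> {address}")            # e.g. example.com -> 93.184.216.34
```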

Perhaps more worryingly, as countries seek to break up the Internet into neatly defined mirrors of themselves, they're trying to redefine international norms in order to justify their actions.

At the summit of the International Telecommunication Union in Dubai this past December, a bloc of countries -- RUCASS, made up of Russia, the United Arab Emirates, China, Algeria, Saudi Arabia, and Sudan -- floated a proposal that tried to define a new term: the "national Internet segment," or any telecommunications networks within the territory of a state. This language, later endorsed by Bahrain and Iraq, would have allowed countries full regulation of the Internet within their borders, from filtering content to imposing fees on foreign traffic. Ultimately, it was withdrawn.

But even without new international regulations, the technical backbone of our Internet is increasingly controlled at the national level. Two years ago, as the Arab world exploded in popular protest, governments responded by simply shutting off the Internet, removing entire countries from the international grid. Egypt's mobile services were shut down and its Internet almost entirely disconnected, while in Libya, the Internet was throttled to a point of uselessness.

Recently, the network research and analytics company Renesys tried to gauge how hard it would be to take the world offline, assessing each country's disconnection risk based on the number of its national service providers. It found that 61 countries are at severe risk of disconnection, with another 72 at significant risk. That makes 133 countries where network control is so centralized that the Internet could be turned off with not much more than a phone call.
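The arithmetic behind that count is straightforward, and the classification can be sketched in a few lines. The thresholds below are assumptions chosen for illustration; the figures here give only the tier counts, not the exact cutoffs Renesys used.

```python
# Illustrative sketch of Renesys-style disconnection-risk tiers: the fewer
# companies carrying a country's international traffic, the easier it is to
# switch the country off. Thresholds are assumptions, not Renesys's published cutoffs.
def disconnection_risk(frontier_providers: int) -> str:
    if frontier_providers <= 2:
        return "severe"          # a phone call or two takes the country offline
    if frontier_providers < 10:
        return "significant"
    return "resistant"

severe_count, significant_count = 61, 72   # figures cited above
print(severe_count + significant_count)    # 133 countries with centralized control
print(disconnection_risk(2))               # severe
```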

It seems our global Internet is not so global.

But as worrying as these threats are, at least they have all been civilian, rather than military, attempts to exert control over the web. This won't be the case for long: Governments around the world are sounding alarms about the existential threat posed by cyberwar. From hostile foreign regimes to lawless nonstate actors, from attacks on critical infrastructure to the theft of state secrets, from economic warfare to corporate espionage, not a day goes by when cybersecurity is not in the news.

In response, governments around the world are devoting significant financial, military, and personnel resources to developing frameworks for cybersecurity and cyberconflict. Cyberspace is no longer the independent space of the cyberlibertarians; it is now a military domain. And when a freewheeling place like the Internet militarizes, the Internet's laissez-faire culture of privacy, anonymity, and free expression inevitably comes into conflict with military priorities of security and protocol.

In the United States, the Pentagon has been tasked with developing rules of engagement for cyberconflict. Just last week, on Feb. 12, President Barack Obama issued a long-awaited executive order on cybersecurity and used his State of the Union address to call for new bipartisan legislation on the issue, emphasizing the need to protect U.S. critical infrastructure. The very next day, Rep. Mike Rogers (R-Mich.) and Rep. Dutch Ruppersberger (D-Md.) reintroduced CISPA, the Cyber Intelligence Sharing and Protection Act -- a bill reviled by the privacy and civil liberties community for its lack of credible privacy protections and its provisions for warrantless information-sharing.

Make no mistake: cyberhostilities are on the rise. Every day around the world, critical systems come under attack, whether from petty cybercriminals or coordinated state efforts. From Stuxnet, which set back Iran's nuclear program, to Shamoon, which wiped tens of thousands of computers at oil giant Saudi Aramco, to the recent compromises of the Washington Post, New York Times, Twitter, and Facebook, we're witnessing large-scale attempts to penetrate and interfere with both private and public systems.

Cybersecurity experts, however, disagree on how best to tackle the threats at hand. Some dismiss proposals such as public-private data exchanges, arguing that such measures erode civil liberties while failing to address the critical problems. Others argue that reducing cyberconflict is best achieved by embracing the values of an open Internet: creating transparent norms, such as clear red lines, common terminology, and mutual confidence-building measures. But the most influential voices remain those arguing for greater militarization: investing in strategic exploits and offensive capacity, doubling down on the idea of the Internet as a domain to be dominated by state actors.

Nearly 365 years ago, those hundred-plus princes and diplomats came together to end war -- and in the process, created borders. The Internet broke those borders down, advancing the cause of fundamental rights, free expression, and shared humanity in all its messy glory. Now, to stifle political dissent and in the name of defending national security, governments are putting those borders back up -- and in doing so, they're dragging the Internet into ancient history.


Argument

Rolling out the Red Carpet

Why is Hollywood kowtowing to China?

In the 1997 international political thriller Red Corner, Chinese officials in Beijing frame an American lawyer for murder. Richard Gere, a noted disciple of the Dalai Lama, China's public enemy No. 1, plays the lawyer fighting for justice in the benighted Chinese legal system, aided by a Chinese female lawyer willing to risk her life for American-style justice and freedoms. But by 2013, another American lawyer was finding love and humor in Shanghai -- the premise of the just-released romantic comedy Shanghai Calling, which the New York Times calls "a plug" for China. These days, "Why would you make a movie that demonizes China?" asks Daniel Hsia, who wrote and directed the film.

Why indeed? Over the past two decades, Hollywood's perception of China has evolved from that of a totalitarian state to that of a major growth opportunity. And as the American movie industry increasingly needs China, its films have begun to alter their content accordingly. Life of Pi, which has no connection to China besides the Taiwanese ethnicity of its director, Ang Lee, has received 11 nominations for Sunday's Oscars and taken in more than $90 million at the box office on the mainland. The uncontroversial film is the only one of the Oscar nominees for Best Picture to have been shown in movie theaters in China. In all likelihood, that's for good reason: In the American version, a character declares that "religion is darkness"; in the Chinese version, the line was changed.

The offspring of a co-production with China Film Group, the country's largest state film conglomerate, Shanghai Calling underscores Hollywood's shifting strategy toward China and the overt censorship and self-censorship it brings. A decade after China entered the World Trade Organization, Hollywood is still allowed to export only about 20 films a year to the Chinese market, where box-office sales climbed to more than $2 billion in 2012.

One way around the quota restriction, explains Hsia, is to get approval for a co-production. Under this arrangement, a Hollywood studio partners with a Chinese entity so that the final product is considered a domestic film, exempting it from the import quota. It also allows for risk-sharing, because the Chinese partner puts up part of the money. The potential for Chinese money and market access is highly attractive to a Hollywood that faces dwindling domestic ticket sales and saw profits decline at five of its six major studios in 2012.

Although China has made it much easier for Americans to invest, getting a co-production approved is still a difficult process. Ideologues in the Communist Party have long considered Western culture "spiritual pollution" and viewed Hollywood suspiciously as an instrument of American statecraft packaged into nebulous "soft power." Scripts for co-productions are submitted for approval to the State Administration for Radio, Film, and Television (SARFT), which oversees the film and entertainment industry. "Like in any business negotiation, the person who has the power to say no has the leverage," says Hsia.

Here's where censorship comes in: SARFT even meddled with the making of a rather innocuous and apolitical comedy like Shanghai Calling. But beyond what foreign filmmakers must do to get a co-production approved, the effort to avoid offending the Chinese has had an impact on film content in the U.S. market. Subtle but noticeable changes have also seeped into on-screen portrayals of China.

In Hollywood in the 1990s, China was an oppressive place. Red Corner opens with Gere gazing up at security cameras in Beijing's Tiananmen Square, ground zero of the infamous bloodshed of early June 1989, seared into many Americans' memories. Brad Pitt, too, had been blacklisted from China, ostensibly for starring in the 1997 feature Seven Years in Tibet, in which his character befriends the young Dalai Lama.

Hollywood has also tended to churn out politically active A-listers, some of whom have had uneasy relationships with the Chinese government. Actress Mia Farrow contributed to director Steven Spielberg's withdrawal in early 2008 from the Beijing Olympics advisory committee over China's involvement in Sudan; Christian Bale, while filming in China in 2011, tried to visit the Chinese dissident Chen Guangcheng, then under house arrest. As an industry whose craft is telling stories, however woeful and inadequate at times, Hollywood stands squarely within the proud tradition of American idealism that revolts against oppression and celebrates individual freedoms.

But things are changing. The apocalyptic 2012, released at the height of the financial crisis in 2009, depicts the Chinese as ingenious saviors who assemble massive arks to house the few people selected to carry on the human race. Oliver Platt, playing a White House staffer, even slips in the line "Leave it to the Chinese. I didn't think it was possible. Not in the time we had." Scenes of New York's Chinatown considered unflattering were cut from the Chinese release of Men in Black 3, and the highly anticipated Iron Man 3 is also expected to include positive references to China.

The kowtowing occasionally descends into farce, as with the November 2012 release of the remake of Red Dawn, a Cold War-era cult classic, in which a band of American teens defeats an invading army of North Koreans. Except the enemies weren't supposed to be North Koreans, but Chinese; the producers changed the invaders' nationality in post-production, digitally erasing Chinese flags and insignia. As implausible as a Chinese invasion of the American Midwest sounds, it is far more plausible than one by North Korea.

Beyond content adjustments, casting choices and shooting locations are being sinified. The Expendables sequel traded Jet Li for a Chinese vixen, Nan Yu (who is not Lucy Liu); Taiwanese pop sensation Jay Chou (who is not Jackie Chan) played alongside Seth Rogen in the Green Hornet reboot; and Chinese starlet Zhou Xun popped up in Cloud Atlas.

What was once Hong Kong's quintessential role as the establishing shot -- alerting theater audiences that they're now in China -- has been taken over by glitzy mainland metropolises. Tom Cruise's 2006 Mission: Impossible III was perhaps the first major blockbuster to set a lengthy scene in contemporary Shanghai, portrayed as developed and futuristic. Since then, Will Smith has taken the Karate Kid remake to Beijing, Transformers had sets designed to evoke Shanghai, and the newest James Bond film and the dystopian future adventure flick Looper also threw down in Shanghai.

The era in which China could still be a menacing villain and stir political passions from the Spielbergs and the Geres appears to be ending. Even Brangelina are reportedly studying Mandarin. And the political drama surrounding disgraced Chinese politician Bo Xilai, ripe for Hollywoodification, will never see the light of day. Too bad, because the Bo Ultimatum is the Chinese Godfather waiting to be made. As Hollywood gathers for its biggest awards night Sunday, the industry seems to be biting its tongue. After all, the future, as Jeff Daniels quips in Looper, is in China.
