
Think Again: Big Data

Why the rise of machines isn't all it's cracked up to be.

"Big data" is the jargon du jour, the tech world's one-size-fits-all (so long as it's triple XL) answer to solving the world's most intractable problems. The term is commonly used to describe the art and science of analyzing massive amounts of information to detect patterns, glean insights, and predict answers to complex questions. It might sound a bit dull, but from stopping terrorists to ending poverty to saving the planet, there's no problem too big for the evangelists of big data.

"The benefits to society will be myriad, as big data becomes part of the solution to pressing global problems like addressing climate change, eradicating disease, and fostering good governance and economic development," crow Viktor Mayer-Schönberger and Kenneth Cukier in modestly titled Big Data: A Revolution that Will Transform How We Live, Work, and Think.

So long as there are enough numbers to crunch -- whether it's data from your iPhone, grocery store purchases, online dating profile, or, say, the anonymized health records of an entire country -- the insights that computing power can extract from this raw data are, we are told, innumerable. Even Barack Obama's administration has jumped on the bandwagon with both feet, releasing on May 9 a "groundbreaking" trove of "previously inaccessible or unmanageable data" to entrepreneurs, researchers, and the public.

"One of the things we're doing to fuel more private-sector innovation and discovery is to make vast amounts of America's data open and easy to access for the first time in history. And talented entrepreneurs are doing some pretty amazing things with it," said President Obama.

But is big data really all it's cracked up to be? Can we trust that so many ones and zeros will illuminate the hidden world of human behavior? Foreign Policy invited Kate Crawford of the MIT Center for Civic Media to go behind the numbers. —Ed.


"With Enough Data, the Numbers Speak for Themselves."

Not a chance. The promoters of big data would like us to believe that behind the lines of code and vast databases lie objective and universal insights into patterns of human behavior, be it consumer spending, criminal or terrorist acts, healthy habits, or employee productivity. But many big-data evangelists avoid taking a hard look at the weaknesses. Numbers can't speak for themselves, and data sets -- no matter their scale -- are still objects of human design. The tools of big-data science, such as the Apache Hadoop software framework, do not immunize us from skews, gaps, and faulty assumptions. Those factors are particularly significant when big data tries to reflect the social world we live in, yet we can often be fooled into thinking that the results are somehow more objective than human opinions. Biases and blind spots exist in big data as much as they do in individual perceptions and experiences. Yet there is a problematic belief that bigger data is always better data and that correlation is as good as causation.

For example, social media is a popular source for big-data analysis, and there's certainly a lot of information to be mined there. Twitter data, we are told, informs us that people are happier when they are farther from home and saddest on Thursday nights. But there are many reasons to ask questions about what this data really reflects. For starters, we know from the Pew Research Center that only 16 percent of online adults in the United States use Twitter, and they are by no means a representative sample -- they skew younger and more urban than the general population. Further, we know many Twitter accounts are automated response programs called "bots," fake accounts, or "cyborgs" -- human-controlled accounts assisted by bots. Recent estimates suggest there could be as many as 20 million fake accounts. So even before we get into the methodological minefield of how you assess sentiment on Twitter, let's ask whether those emotions are expressed by people or just automated algorithms.

But even if you're convinced that the vast majority of tweeters are real flesh-and-blood people, there's the problem of confirmation bias. For example, to determine which players in the 2013 Australian Open were the "most positively referenced" on social media, IBM conducted a large-scale analysis of tweets about the players via its Social Sentiment Index. The results determined that Victoria Azarenka was top of the list. But many of those mentions of Azarenka on Twitter were critical of her controversial use of medical timeouts. So did Twitter love her or hate her? It's difficult to trust that IBM's algorithms got it right.
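To see how easily a sentiment score can mislead, consider a deliberately crude sketch. IBM's actual method is proprietary and far more sophisticated; the toy scorer below simply counts positive and negative keywords, which is enough to show why a critical, sarcastic tweet about those medical timeouts can register as praise.

```python
# A deliberately crude keyword-based sentiment scorer (hypothetical; not IBM's
# method). It counts "positive" and "negative" words -- roughly what the most
# naive sentiment analysis does -- and shows how criticism can read as praise.

POSITIVE = {"great", "love", "win", "brilliant", "amazing"}
NEGATIVE = {"hate", "cheat", "unfair", "awful", "shame"}

def naive_sentiment(tweet):
    words = tweet.lower().replace("!", " ").replace(".", " ").split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

# A sarcastic, critical tweet about those medical timeouts...
tweet = "Amazing how Azarenka takes a brilliant medical timeout every time she starts losing"
# ...comes out as strongly "positive", because the scorer sees "amazing" and
# "brilliant" and has no notion of sarcasm or context.
print(naive_sentiment(tweet))  # prints 2
```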

Once we get past the dirty-data problem, we can consider the ways in which algorithms themselves are biased. News aggregator sites that use your personal preferences and click history to funnel in the latest stories on topics of interest also come with their own baked-in assumptions -- for example, assuming that frequency equals importance or that the most popular news stories shared on your social network must also be interesting to you. As an algorithm filters through masses of data, it is applying rules about how the world will appear -- rules that average users will never get to see, but that powerfully shape their perceptions.
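The point is easier to see in miniature. The hypothetical ranking rule below is not any real aggregator's formula, but it captures the baked-in assumption: if the only signal is how often a story is shared in your network, popularity stands in for importance, and the unpopular story you actually need never surfaces.

```python
# A minimal, hypothetical news-ranking rule: score a story purely by how often
# it is shared in your network. The "rule about how the world will appear" is
# baked in from the start: popular equals important.

stories = [
    {"title": "Celebrity scandal",         "shares_in_network": 480},
    {"title": "Local water contamination", "shares_in_network": 12},
    {"title": "Sports final recap",        "shares_in_network": 350},
]

def rank(stories):
    # Popularity is the only signal; relevance to you is never measured.
    return sorted(stories, key=lambda s: s["shares_in_network"], reverse=True)

for story in rank(stories):
    print(story["shares_in_network"], story["title"])
# The contamination story a reader may actually need comes last -- invisibly so.
```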

Some computer scientists are moving to address these concerns. Ed Felten, a Princeton University professor and former chief technologist at the U.S. Federal Trade Commission, recently announced an initiative to test algorithms for bias, especially those that the U.S. government relies upon to assess the status of individuals, such as the infamous "no-fly" list that the FBI and Transportation Security Administration compile from the numerous big-data resources at the government's disposal and use as part of their airport security regimes.


"Big Data Will Make Our Cities Smarter and More Efficient."

Up to a point. Big data can provide valuable insights to help improve our cities, but it can only take us so far. Because not all data is created or even collected equally, there are "signal problems" in big-data sets -- dark zones or shadows where some citizens and communities are overlooked or underrepresented. So big-data approaches to city planning depend heavily on city officials understanding both the data and its limits.

For example, Boston's Street Bump app, which collects smartphone data from drivers going over potholes, is a clever way to gather information at low cost, and more apps like it are emerging. But if cities begin to rely on data that comes only from citizens with smartphones, they are working with a self-selecting sample -- one that will necessarily include less data from neighborhoods with fewer smartphone owners, which typically include older and less affluent populations. While Boston's Office of New Urban Mechanics has made concerted efforts to address these potential data gaps, less conscientious public officials may miss them and end up misallocating resources in ways that further entrench existing social inequities. One need only look to the 2012 Google Flu Trends miscalculations, which significantly overestimated annual flu rates, to realize the impact that relying on faulty big data could have on public services and public policy.
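Returning to the Street Bump example, a back-of-the-envelope simulation with entirely made-up numbers shows how the skew arises even when road conditions are identical across neighborhoods: the map of reports ends up tracing smartphone ownership as much as potholes.

```python
# A back-of-the-envelope simulation of the Street Bump sampling problem, using
# entirely made-up numbers: two neighborhoods with identical road conditions
# but different rates of smartphone ownership.
import random

random.seed(0)
neighborhoods = {
    "affluent":      {"potholes": 100, "smartphone_rate": 0.80},
    "less_affluent": {"potholes": 100, "smartphone_rate": 0.35},
}

reports = {}
for name, info in neighborhoods.items():
    # A pothole is reported only if a passing driver happens to carry the app,
    # approximated here by the neighborhood's rate of smartphone ownership.
    reports[name] = sum(
        random.random() < info["smartphone_rate"] for _ in range(info["potholes"])
    )

print(reports)
# Same underlying problem, very different "data": the map of reports mirrors
# phone ownership as much as it mirrors road conditions.
```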

The same is true for "open government" initiatives that post public-sector data online, such as Data.gov and the White House's Open Government Initiative. More data won't necessarily improve any function of government, including transparency or accountability, unless there are mechanisms that allow the public to engage with their institutions -- and unless the government has the capacity to interpret the data and respond with adequate resources. None of that is easy. In fact, there just aren't many skilled data scientists around yet. Universities are currently scrambling to define the field, write curricula, and meet demand.

Human rights groups are also looking to use big data to help understand conflicts and crises. But here too there are questions about the quality of both the data and the analysis. The MacArthur Foundation recently awarded an 18-month, $175,000 grant to Carnegie Mellon University's Center for Human Rights Science to investigate how big-data analytics are changing human rights fact-finding, such as the development of "credibility tests" to sort alleged human rights violations posted to sites like Crisis Mappers, Ushahidi, Facebook, and YouTube. The director of the center, Jay D. Aronson, notes that there are "serious questions emerging about the use of data and the responsibilities of academics and human rights organizations to its sources. In many cases, it is unclear whether the safety and security of the people reporting the incidents is enhanced or threatened by these new technologies."


"Big Data Doesn't Discriminate Between Social Groups."

Hardly. Another promise of big data's alleged objectivity is that minority groups will face less discrimination because raw data is somehow immune to social bias and analysis can be conducted at the level of the whole population rather than particular groups. Yet big data is often deployed to do exactly the opposite -- to sort individuals into groups -- precisely because it can make claims about how different groups behave. For example, a recent paper points to how scientists are allowing their assumptions about race to shape their big-data genomics research.

As Alistair Croll writes, big data's potential use for price discrimination raises serious civil rights concerns, echoing the practice historically known as "redlining." Under the rubric of "personalization," big data can be used to isolate specific social groups and treat them differently, something that laws often prohibit businesses or individuals from doing explicitly. Companies can choose to show online ads for a credit card offer only to people whose household income or credit history makes them attractive to banks, leaving everyone else unaware that the offer exists. Google even has a patent to dynamically price content: If your past buying history indicates you are likely to pay top dollar for shoes, your starting price the next time you shop for footwear online might be considerably higher. Now employers are trying to apply big data to human resources, assessing how to make employees more productive by analyzing their every click and tap. Employees may have no idea how much data is being gathered about them or how it is being used.
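What such history-based pricing might look like in practice can be sketched in a few lines. This is purely illustrative and not the mechanism described in Google's patent, but it captures the logic: the same storefront quotes a higher anchor price to shoppers whose history suggests they will pay it.

```python
# A hypothetical dynamic-pricing rule, purely illustrative (not the mechanism
# in Google's patent): the starting price is nudged upward for shoppers whose
# purchase history suggests they tolerate higher prices.

BASE_PRICE = 80.00  # the listed price for a pair of shoes

def starting_price(purchase_history):
    if not purchase_history:
        return BASE_PRICE
    avg_spend = sum(purchase_history) / len(purchase_history)
    # Shoppers who routinely pay a premium see a higher anchor price.
    markup = 0.15 if avg_spend > 150 else 0.0
    return round(BASE_PRICE * (1 + markup), 2)

print(starting_price([210.0, 175.0, 190.0]))  # 92.0  -- the premium shopper
print(starting_price([45.0, 60.0]))           # 80.0  -- everyone else
```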

Discrimination can also take on other demographic dimensions. For example, the New York Times reported that Target started compiling analytic profiles of its customers years ago; it now has so much data on purchasing trends that, under certain circumstances, it can predict with 87 percent confidence whether a woman is pregnant, simply based on her shopping history. While the Target statistician in the article emphasizes how this will help the company improve its marketing to expectant parents, one can also imagine such determinations being used to discriminate in ways that have serious ramifications for social equality and, of course, privacy.

And recently, a Cambridge University big-data study of the Facebook "likes" of some 58,000 users was used to predict very sensitive personal information, such as sexual orientation, ethnicity, religious and political views, personality traits, intelligence, happiness, use of addictive substances, parents' marital status, age, and gender. As journalist Tom Foremski observes of the study: "Easy access to such highly sensitive information could be used by employers, landlords, government agencies, educational institutes, and private organizations, in ways that discriminate and punish individuals. And there's no way [to] fight it."

Finally, consider the implications in the context of law enforcement. From Washington, D.C., to New Castle County, Delaware, police are turning to big-data "predictive policing" models in the hopes that they will shine investigative light on unsolved cases and even help prevent future crimes. However, focusing police activity on particular big data-detected "hot spots" risks reinforcing the stigmatization of certain social groups as likely criminals and institutionalizing differential policing as standard practice. As one police chief has written, although predictive policing algorithms explicitly avoid categories such as race or gender, using such systems without sensitivity to their differential impact can be "a recipe for deteriorating community relations between police and the community, a perceived lack of procedural justice, accusations of racial profiling, and a threat to police legitimacy."


"Big Data Is Anonymous, so It Doesn't Invade Our Privacy."

Flat-out wrong. While many big-data providers do their best to de-identify individuals in human-subject data sets, the risk of re-identification is very real. Cell-phone data, taken en masse, may seem fairly anonymous, but a recent study of a data set covering 1.5 million cell-phone users in Europe showed that just four points of reference -- where a phone was and when -- were enough to individually identify 95 percent of people. There is a uniqueness to the way people move through cities, the researchers observed, and given how much can be inferred from the large number of public data sets, this makes privacy a "growing concern." We already know, thanks to academics like Alessandro Acquisti, that an individual's Social Security number can be predicted simply by cross-analyzing publicly available data.
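The mechanics of re-identification are unnervingly simple. The sketch below uses a tiny, entirely hypothetical "anonymized" call log; matching just four publicly observable points -- where someone was and when, gleaned from, say, a geotagged post or a receipt -- is enough to single out one pseudonymous user, which is essentially what the European study found at scale.

```python
# A toy re-identification sketch with entirely hypothetical data. The
# "anonymized" log keeps only a pseudonym plus (hour, cell-tower) points --
# yet four observed points are enough to single out one user.

anonymized_log = {
    "user_001": {(8, "tower_A"), (9, "tower_B"), (13, "tower_C"),
                 (18, "tower_A"), (22, "tower_D")},
    "user_002": {(8, "tower_A"), (9, "tower_E"), (13, "tower_C"),
                 (19, "tower_F")},
    "user_003": {(7, "tower_G"), (12, "tower_B"), (13, "tower_C"),
                 (18, "tower_A")},
}

# Four points gleaned from public sources (a geotagged post, a store receipt...).
observed = {(8, "tower_A"), (9, "tower_B"), (18, "tower_A"), (22, "tower_D")}

matches = [uid for uid, points in anonymized_log.items() if observed <= points]
print(matches)  # ['user_001'] -- unique, so "anonymous" no longer means unidentifiable
```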

But big data's privacy problem goes far beyond standard re-identification risks. Medical data sold to analytics firms, for instance, risks being used to trace your identity. There is a lot of chatter about personalized medicine, the hope being that drugs and other therapies will be targeted so precisely that they heal an individual's body as if they were made from that person's very own DNA. It's a wonderful prospect for medical science, but it fundamentally relies on identification at the cellular and genetic level, with high risks if that information is misused or leaked. Meanwhile, despite the rapid growth of personal health-data collectors such as RunKeeper and Nike+, practical use of big data to improve health-care delivery is still more aspiration than reality.

Other kinds of intimate information are being collected by big-data energy initiatives such as the Smart Grid, which seeks to improve the efficiency of energy distribution to our homes and businesses by analyzing enormous data sets of consumer energy usage. The project has great promise but also carries great privacy risks. It can reveal not only how much energy we need and when we need it, but also, minute by minute, where we are in our homes and what we are doing -- when we are in the shower, when our dinner guests leave for the night, and when we turn off the lights to go to sleep.
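How much a meter reading can give away is easy to illustrate. The readings and thresholds below are hypothetical, and real smart-meter analytics are far more refined, but even this crude rule turns minute-level consumption into a diary of household activity.

```python
# Hypothetical minute-level meter readings (in kW) and crude thresholds, just
# to show the kind of inference such data permits. Real smart-meter analytics
# use far richer models, but the privacy point is the same.

readings = [  # (hour of day, average kW drawn)
    (6, 0.3), (7, 3.5), (8, 0.6), (12, 0.5), (19, 2.8), (23, 0.1),
]

for hour, kw in readings:
    if kw > 3.0:
        print(f"{hour:02d}:00  heavy draw -- likely electric shower or water heater")
    elif kw > 2.0:
        print(f"{hour:02d}:00  moderate draw -- likely cooking or laundry")
    elif kw < 0.2:
        print(f"{hour:02d}:00  near zero -- lights out, probably asleep or away")
```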

Of course, such highly personal big-data sets are prime targets for hackers and leakers. WikiLeaks has been at the center of some of the most significant big-data releases of recent times. And as we saw recently with the massive data leak from Britain's offshore financial industry, the 1 percenters of the world are just as vulnerable as everyone else to having their personal data made public.


"Big Data Is the Future of Science."

Partly true, but it has some growing up to do. Big data offers new roads for science, without a doubt. We need only look to the discovery of the Higgs boson, the result of the largest grid-computing project in history, with CERN using the Hadoop Distributed File System to manage all the data. But unless we recognize and address some of big data's inherent weaknesses in reflecting human lives, we may base major public policy and business decisions on incorrect assumptions.

To address this, data scientists are starting to collaborate with social scientists, who have a long history of critically engaging with data: assessing its sources, the methods of its collection, and the ethics of its use. Over time, this means finding new ways to combine big-data approaches with small-data studies. It goes well beyond advertising and marketing techniques like focus groups or A/B testing (in which two versions of a design are shown to different users to see which performs better). Rather, new hybrid methods can ask why people do things, not just tally up how often something occurs. That means drawing on sociological analysis and deep ethnographic insight as well as information retrieval and machine learning.
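For readers who have never run one, an A/B test reduces to a few lines of logic: randomly assign users to one of two variants, then compare an outcome rate. The sketch below uses invented numbers; note that it can say which variant performed better, but nothing about why -- which is precisely the gap the hybrid methods described above aim to fill.

```python
# A minimal A/B test sketch with invented numbers: randomly assign users to
# variant A or B, then compare click-through rates. It answers "which variant
# performed better here" -- never "why people clicked".
import random

random.seed(1)
TRUE_RATE = {"A": 0.10, "B": 0.12}  # assumed underlying click probabilities

shown = {"A": 0, "B": 0}
clicks = {"A": 0, "B": 0}
for _ in range(10_000):
    variant = random.choice(["A", "B"])      # random assignment
    shown[variant] += 1
    clicks[variant] += random.random() < TRUE_RATE[variant]

for v in ("A", "B"):
    print(v, round(clicks[v] / shown[v], 3))  # observed click-through rates
```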

Technology companies recognized early on that social scientists could give them greater insight into how and why people engage with their products, such as when Xerox's PARC hired pioneering anthropologist Lucy Suchman. The next stage will be a richer collaboration between computer scientists, statisticians, and social scientists of many stripes -- not just to test the findings of each other's work, but to ask fundamentally different kinds of questions, with greater rigor.

Given the immense amount of information collected about us every day -- including Facebook clicks, GPS data, health-care prescriptions, and Netflix queues -- we must decide sooner rather than later whom we can trust with that information, and for what purpose. We can't escape the fact that data is never neutral and that it's difficult to anonymize. But we can draw on expertise across different fields in order to better recognize biases, gaps, and assumptions, and to rise to the new challenges to privacy and fairness.



Think Again: European Decline

Sure, it may seem as if Europe is down and out. But things are far, far better than they look.


"Europe Is History."

No. These days, many speak of Europe as if it has already faded into irrelevance. In the words of American pundit Fareed Zakaria, "it may well turn out that the most consequential trend of the next decade will be the economic decline of Europe." According to Singaporean scholar Kishore Mahbubani, Europe "does not get how irrelevant it is becoming to the rest of the world." Not a day went by on the 2012 U.S. campaign trail, it seemed, without Republican challenger Mitt Romney warning that President Barack Obama was -- gasp -- turning the United States into a "European social welfare state."

With its anemic growth, ongoing eurocrisis, and the complexity of its decision-making, Europe is admittedly a fat target right now. And the stunning rise of countries like Brazil and China in recent years has led many to believe that the Old World is destined for the proverbial ash heap. But the declinists would do well to remember a few stubborn facts. Not only does the European Union remain the largest single economy in the world, but it also has the world's second-highest defense budget after the United States, with more than 66,000 troops deployed around the world and some 57,000 diplomats (India has roughly 600). The EU's GDP per capita in purchasing-power terms is still nearly four times that of China, three times Brazil's, and nearly nine times India's. If this is decline, it sure beats living in a rising power.

Power, of course, depends not just on these resources but on the ability to convert them to produce outcomes. Here too Europe delivers: Indeed, no other power apart from the United States has had such an impact on the world in the last 20 years. Since the end of the Cold War, the EU has peacefully expanded to include 15 new member states and has transformed much of its neighborhood by reducing ethnic conflicts, exporting the rule of law, and developing economies from the Baltic to the Balkans. Compare that with China, whose rise is creating fear and provoking resistance across Asia. At a global level, many of the rules and institutions that keep markets open and regulate world trade, limit carbon emissions, and prosecute human rights abusers were created by the European Union. Who was behind the World Trade Organization and the International Criminal Court? Not the United States or China. It's Europe that has led the way toward a future run by committees and statesmen, not soldiers and strongmen.

Yes, the EU now faces an existential crisis. Even as it struggles, however, it is still contributing more than other powers to solving both regional conflicts and global problems. When the Arab revolutions erupted in 2011, the supposedly bankrupt EU pledged more money to support democracy in Egypt and Tunisia than the United States did. When Libya's Muammar al-Qaddafi was about to carry out a massacre in Benghazi in March 2011, it was France and Britain that led from the front. This year, France acted to prevent a takeover of southern Mali by jihadists and drug smugglers. Europeans may not have done enough to stop the conflict in Syria, but they have done as much as anyone else in this tragic story.

In one sense, it is true that Europe is in inexorable decline. For four centuries, Europe was the dominant force in international relations. It was home to the Renaissance and the Enlightenment. It industrialized first and colonized much of the world. As a result, until the 20th century, all the world's great powers were European. It was inevitable -- and desirable -- that other players would gradually narrow the gap in wealth and power over time. Since World War II, that catch-up process has accelerated. But Europeans benefit from this: Through their economic interdependence with rising powers, including those in Asia, Europeans have continued to increase their GDP and improve their quality of life. In other words, like the United States -- and unlike, for example, Russia on the continent's eastern frontier -- Europe is in relative though not absolute decline.

The EU is an entirely unprecedented phenomenon in world affairs: a project of political, economic, and above all legal integration among 27 countries with a long history of fighting each other. What has emerged is neither an intergovernmental organization nor a superstate, but a new model that pools resources and sovereignty with a continent-sized market and common legislation and budgets to address transnational threats from organized crime to climate change. Most importantly, the EU has revolutionized the way its members think about security, replacing the old traditions of balance-of-power politics and noninterference in internal affairs with a new model under which security for all is guaranteed by working together. This experiment is now at a pivotal moment, and it faces serious, complex challenges -- some related to its unique character and some that other major powers, particularly Japan and the United States, also face. But the EU's problems are not quite the stuff of doomsday scenarios.


"The Eurozone Is an Economic Basket Case."

Only part of it. Many describe the eurozone, the 17 countries that share the euro as a common currency, as an economic disaster. As a whole, however, it has lower debt and a more competitive economy than many other parts of the world. For example, the International Monetary Fund projects that the eurozone's combined 2013 government deficit as a share of GDP will be 2.6 percent -- roughly a third of that of the United States. Gross government debt as a percentage of GDP is around the same as in the United States and much lower than that in Japan.

Nor is Europe as a whole uncompetitive. In fact, according to the latest edition of the World Economic Forum's Global Competitiveness Index, three eurozone countries (Finland, the Netherlands, and Germany) and another two EU member states (Britain and Sweden) are among the world's 10 most competitive economies. China ranks 29th. The eurozone accounts for 15.6 percent of the world's exports, well above the 8.3 percent for the United States and 4.6 percent for Japan. And unlike the United States, the eurozone's current account is roughly in balance with the rest of the world.

These figures show that, in spite of the tragically counterproductive policies imposed on Europe's debtor countries and despite whatever happens to the euro, the European economy is fundamentally sound. European companies are among the most successful exporters anywhere. Airbus competes with Boeing; Volkswagen, already the world's third-largest automaker, is forecast to overtake Toyota and General Motors in sales within the next five years; and European luxury brands (many from crisis-wracked Italy) are coveted all over the world. Europe has a highly skilled workforce, with universities second only to America's, well-developed systems of vocational training, empowered women in the workforce, and excellent infrastructure. Europe's economic model is not unsustainable simply because its GDP growth has slowed of late.

The real difference between the eurozone and the United States or Japan is that it has internal imbalances but is not a country, and that it has a common currency but no common treasury. Financial markets therefore look at the worst data for individual countries -- say, Greece or Italy -- rather than aggregate figures. Due to uncertainty about whether the eurozone's creditor countries will stand by its debtors, spreads -- that is, the difference in bond yields between countries with different credit ratings -- have increased since the crisis began. Creditor countries such as Germany have the resources to bail out the debtors, but by insisting on austerity measures, they are trapping debtor countries like Spain in a debt-deflation spiral. Nobody knows whether the eurozone will be able to overcome these challenges, but the pundits who confidently predicted a "Grexit" or a complete breakup of the single currency have been proved wrong thus far. Above all, the eurocrisis is a political problem rather than an economic one.


"Europeans Are from Venus."

Hardly. In 2002, American author Robert Kagan famously wrote, "Americans are from Mars and Europeans are from Venus." More recently, Robert Gates, then U.S. defense secretary, warned in 2010 of the "demilitarization" of Europe. But not only are European militaries among the world's strongest -- these assessments also overlook one of the great achievements of human civilization: A continent that gave us the most destructive conflicts in history has now basically agreed to give up war on its own turf. Besides, within Europe there are huge differences in attitudes toward the uses and abuses of hard power. Hawkish countries such as Poland and Britain are closer to the United States than they are to dovish Germany, and many continue to foresee a world where a strong military is an indispensable component of security. And unlike rising powers such as China that proclaim the principle of noninterference, Europeans are still prepared to use force to intervene abroad. Ask the people of the Malian city of Gao, which had been occupied for nearly a year by hard-line Islamists until French troops ejected them, whether they see Europeans as timid pacifists.

At the same time, Americans have changed much in the decade since Kagan said they are from Mars. As the United States draws down from the wars in Afghanistan and Iraq and focuses on "nation-building at home," it looks increasingly Venusian. In fact, attitudes toward military intervention are converging on both sides of the Atlantic. According to the most recent edition of Transatlantic Trends, a regular survey by the German Marshall Fund, only 49 percent of Americans think that the intervention in Libya was the right thing to do, compared with 48 percent of Europeans. Almost as many Americans (68 percent) as Europeans (75 percent) now want to withdraw troops from Afghanistan.

Many American critics of Europe point to the continent's low levels of military spending. But it only looks low next to the United States -- by far the world's biggest spender. In fact, Europeans collectively accounted for about 20 percent of the world's military spending in 2011, compared with 8 percent for China, 4 percent for Russia, and less than 3 percent for India, according to the Stockholm International Peace Research Institute. It is true that, against the background of the crisis, many EU member states are now making dramatic cuts in military spending, including, most worryingly, France. Britain and Germany, however, have so far made only modest cuts, and Poland and Sweden are actually increasing military spending. Moreover, the crisis is accelerating much-needed pooling and sharing of capabilities, such as air policing and satellite navigation. As for those Martians in Washington, the U.S. Congress is cutting military spending by $487 billion over the next 10 years and by $43 billion this year alone -- and the supposedly warlike American people seem content with butter's triumph over guns.


"Europe Has a Democratic Deficit."

No, but it has a legitimacy problem. Skeptics have claimed for years that Europe has a "democratic deficit" because the European Commission, which runs the EU, is unelected or because the European Parliament, which approves and amends legislation, has insufficient powers. But European Commission members are appointed by directly elected national governments, and European Parliament members are elected directly by voters. In general, EU-level decisions are made jointly by democratically elected national governments and the European Parliament. Compared with other states or even an ideal democracy, the EU has more checks and balances and requires bigger majorities to pass legislation. If Obama thinks it's tough assembling 60 votes to get a bill through the Senate, he should try putting together a two-thirds majority of Europe's governments and then getting it ratified by the European Parliament. The European Union is plenty democratic.

The eurozone does, however, have a more fundamental legitimacy problem due to the way it was constructed. Although decisions are made by democratically elected leaders, the EU is a fundamentally technocratic project based on the "Monnet method," named for French diplomat Jean Monnet, one of the founding fathers of an integrated Europe. Monnet rejected grand plans and instead sought to "build Europe" step by step through "concrete achievements." This incremental strategy -- first a coal and steel community, then a single market, and finally a single currency -- took ever more areas out of the political sphere. But the more successful this project became, the more it restricted the powers of national governments and the more it fueled a populist backlash.

To solve the current crisis, member states and EU institutions are now taking new areas of economic policymaking out of the political sphere. Led by Germany, eurozone countries have signed up to a "fiscal compact" that commits them to austerity indefinitely. There is a real danger that this approach will lead to democracy without real choices: Citizens will be able to change governments but not policies. In protest, voters are turning to radical parties such as Alexis Tsipras's Syriza in Greece and Beppe Grillo's Five Star Movement in Italy. These parties, however, could become part of the solution by forcing member states to revisit their strict austerity programs and go further in mutualizing debt across Europe -- which they must ultimately do. So yes, European politics have a legitimacy problem, but the solution is more likely to come from policy change than from, say, giving yet more power to the European Parliament. Never mind what the skeptics say -- it already has plenty.


"Europe Is About to Fall off a Demographic Cliff."

So is nearly everybody else. The EU does have a serious demographic problem. Unlike that of the United States -- which is projected to keep growing to 400 million by 2050 -- the EU's population is projected to rise from 504 million now to a peak of about 525 million in 2035 and then decline gradually to 517 million by 2060, according to Europe's official statistical office. The problem is particularly acute in Germany, today the EU's largest member state, which has one of the world's lowest birth rates. Under current projections, its population could fall from 82 million to 65 million by 2060.

Europe's population is also aging. This year, the EU's working-age population will start falling from 308 million and is projected to drop to 265 million in 2060. That's expected to increase the old-age dependency ratio (the number of over-65s as a proportion of the total working-age population) from 28 percent in 2010 to 58 percent in 2060. Such figures can lead to absurd predictions of civilizational extinction. As one Guardian pundit put it, "With each generation reproducing only half its number, this looks like the start of a continent-wide collapse in numbers. Some predict wipeout by 2100."

Demographic woes are not, however, something unique to Europe. In fact, nearly all the world's major powers are aging -- and some more dramatically than Europe. China is projected to go from a population with a median age of 35 to 43 by 2030, and Japan will go from 45 to 52. Germany will go from 44 to 49. But Britain will go from 40 to just 42 -- a rate of aging comparable to that of the United States, one of the powers with the best demographic prospects.

So sure, demography will be a major headache for Europe. But the continent's most imperiled countries can learn hopeful lessons from elsewhere in Europe. France and Sweden, for example, have reversed their falling birth rates by promoting maternity (and paternity) rights and child-care facilities. In the short term, the politics may be complicated, but immigration offers the possibility of mitigating both the aging and the shrinking of Europe's population -- so-called decline aside, there is no shortage of young people who want to come to Europe. In the medium term, member states could also increase the retirement age -- another heavy political lift, but one that many are now facing. In the long term, smart family-friendly policies such as child payments, tax credits, and state-supported day care could encourage Europeans to have more children. But arguably, Europe is already ahead of the rest of the world in developing solutions to the problem of an aging society. The graying Chinese should take note.


"Europe Is Irrelevant in Asia."

No. It is often said -- most often and loudly by Singapore's Mahbubani -- that though the EU may remain relevant in its neighborhood, it is irrelevant in Asia, the region that will matter most in the 21st century. Last November, then-Secretary of State Hillary Clinton proclaimed that the U.S. "pivot" to Asia was "not a pivot away from Europe" and said the United States wants Europe to "engage more in Asia along with us."

But Europe is already there. It is China's biggest trading partner, India's second-biggest, the second-biggest of the Association of Southeast Asian Nations (ASEAN), Japan's third-biggest, and Indonesia's fourth-biggest. It has negotiated free trade areas with Singapore and South Korea and has begun separate talks with ASEAN, India, Japan, Malaysia, Thailand, and Vietnam. These economic relationships are already forming the basis for close political relationships in Asia. Germany even holds a regular government-to-government consultation -- in effect a joint cabinet meeting -- with China. If the United States can claim to be a Pacific power, Europe is already a Pacific economy and is starting to flex its political muscles there too.

Europe played a key role in imposing sanctions against Burma -- and in lifting them after the military junta began to reform. Europe helped resolve conflicts in Aceh, Indonesia, and is mediating in Mindanao in the Philippines. While Europe may not have a 7th Fleet in Japan, some member states already play a role in security in Asia: The British have military facilities in Brunei, Nepal, and Diego Garcia, and the French have a naval base in Tahiti. And those kinds of ties are growing. For example, Japanese Prime Minister Shinzo Abe, who is trying to diversify Japan's security relationships, has said he wants to join the Five Power Defense Arrangements, a security treaty that includes Britain. European Union member states also supply advanced weaponry such as fighter jets and frigates to democratic countries like India and Indonesia. That's hardly irrelevance.


"Europe Will Fall Apart."

Too soon to say. The danger of European disintegration is real. The most benign scenario is the emergence of a three-tier Europe consisting of a eurozone core, "pre-ins" such as Poland that are committed to joining the euro, and "opt-outs" such as Britain that have no intention of joining the single currency. In a more malign scenario, some eurozone countries such as Cyprus or Greece will be forced to leave the single currency, and some EU member states such as Britain may leave the EU completely -- with huge implications for the EU's resources and its image in the world. It would be a tragedy if an attempt to save the eurozone led to a breakup of the European Union.

But Europeans are aware of this danger, and there is political will to prevent it. Germany does not want Greece to leave the single currency, not least due to a fear of contagion. A British withdrawal is possible but unlikely and in any case some way off: Prime Minister David Cameron would have to win an overall majority in the next election, and British citizens would have to vote to leave in a referendum. In short, it's premature to predict an EU breakup.

This is not to say it will never happen. The ending of the long story of Europe remains very much unwritten. It is not a simple choice between greater integration and disintegration. The key will be whether Europe can save the euro without splitting the European Union. Simply by its creation, the EU is already an unprecedented phenomenon in the history of international relations -- and a much more perfect union than the declinists will admit. If its member states can pool their resources, they will find their rightful place alongside Washington and Beijing in shaping the world in the 21st century. As columnist Charles Krauthammer famously said in relation to America, "Decline is a choice." It is for Europe too.
