Big Company, Big Government, Big Brother? Privacy after Covid-19

The COVID-19 pandemic will be a history-altering event. But where will it take us?

One theory of governance is that democracies only make major changes when confronted with a crisis. It is too early to conclude whether Covid-19 was such a crisis, but the immediate response to it was a rebalancing, perhaps temporary, of individual and collective rights. Long-standing political assumptions about the benefits of restricting the role of government were shattered, and the reverberations of this shift will shape policy in the future. As one commentator put it, “They say there are no atheists in a foxhole, and there aren’t a lot of libertarians in a global pandemic.”

Even before Covid-19, there was pressure to reconsider the “shrink government” approach that has shaped U.S. policy for the last 25 years. The revived discussion of industrial policy in response to technological competition with China is one example. But reconsidering small government does not mean renewed public faith in Washington. For privacy, leadership has devolved to state legislatures and to the European Union, which has set itself the goals of becoming the global privacy regulator and holding the United States to higher privacy standards. But neither the states nor the European Union is an adequate substitute for federal policy.

The Post-privacy World

Privacy has been defined as the right to be left alone. But, measured by the amount of data routinely collected and stored by commercial entities, we live in a post-privacy environment. “Natural” privacy—where an individual’s personal information was unknown and often unknowable—is an artifact of the Industrial Age, before ubiquitous sensors and digital connectivity. Increasingly we have “artificial privacy,” where masses of data are collected for commercial purposes, and privacy, to the extent it exists, is created by the rules and regulations that govern how this data is used.

There are still plenty of libertarians on the internet, and the libertarian impulse (reinforced by commercial interests) still guides privacy and internet policy in the United States. There is some justification for this, as anything other than a laissez-faire approach will require modernizing the laws, regulations, and institutions needed to govern a digital environment. (Laissez-faire has been American practice for two decades.) Moving toward greater regulation of the internet (including for privacy) will damage innovation—the European experience is evidence of that. But laissez-faire policies do not protect privacy and face increasing skepticism and growing demands from governments for greater control of data (called “data sovereignty”). These external pressures will force some reconsideration of privacy policy in the United States.

Let’s review what companies can currently collect on individuals through their mobile phone. This includes location (down to the square meter), rate of travel, and, in some cases, altitude. Those with health apps may also provide heart rate, diet, and hours of sleep. Sophisticated algorithms expand the knowledge available from this data. For example, knowing location, speed, and altitude identifies mode of transport (e.g., foot, motor vehicle, or plane). Some operating systems allow location data to be correlated with web searches, contacts, or reading lists. A few widely used mobile operating systems are part of larger commercial activities that scan emails for keywords and correlate them with location data and online searches. Using the enhanced privacy settings on some major operating systems does not stop this collection.
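To make the inference concrete, the kind of derivation described above can be sketched in a few lines. This is an illustrative toy, not any vendor's actual algorithm; the speed and altitude thresholds are assumptions chosen for readability.

```python
# Illustrative sketch of inferring mode of transport from sensor data,
# as described above. Thresholds are assumed values, not a real product's.

def infer_transport_mode(speed_kmh: float, altitude_m: float) -> str:
    """Classify a likely mode of transport from speed and altitude readings."""
    if altitude_m > 3000 and speed_kmh > 300:
        return "plane"          # high and fast: almost certainly airborne
    if speed_kmh > 20:
        return "motor vehicle"  # faster than sustained human running
    if speed_kmh > 0.5:
        return "foot"           # walking or running pace
    return "stationary"
```

The point is not the classifier itself but how cheaply such inferences compound: once mode of transport is derived, it can in turn be correlated with searches, contacts, and location history.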

Health Surveillance

Recognition that the data generated by these commercial surveillance technologies could be used for public health purposes led governments to adopt intrusive measures to create what can be called “smart quarantines”—targeted measures that track those infected or exposed. There are appreciable benefits from surveillance of location and contacts for managing the spread of disease. Israel uses location data already collected by its intelligence agencies for this. Singapore and Korea created “apps” for mobile phones that tracked compliance with quarantine rules. In the United States, the White House is developing a national virus surveillance system, and the recent stimulus package directs and funds the Centers for Disease Control to develop a “public health data surveillance and analytics infrastructure.” The two tech giants who control the majority of the world’s smartphone operating systems agreed to cooperate with the federal government to develop technologies that allow surveillance from mobile devices. Some of these technologies allow people to choose to “opt-in” rather than making surveillance mandatory. In each case, the data is anonymized, at least in the initial phases, but allows for the discovery of an individual’s identity so that they can be contacted if needed.
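The “anonymized, but allows for the discovery of an individual’s identity” pattern mentioned above is essentially pseudonymization: records carry only an opaque token, while a separately held lookup table lets the authority recover a specific identity when contact is needed. The sketch below is a minimal illustration of that pattern under assumed design choices, not a description of any actual national system.

```python
import hashlib
import secrets

# Minimal sketch of pseudonymization for health surveillance data:
# the dataset stores only tokens; re-identification requires a lookup
# table held separately by the registry operator. Names and structure
# here are illustrative assumptions.

class PseudonymRegistry:
    def __init__(self) -> None:
        self._salt = secrets.token_hex(16)  # secret known only to the registry
        self._lookup: dict[str, str] = {}   # token -> real identity

    def tokenize(self, identity: str) -> str:
        """Return a pseudonymous token for an identity, recording the mapping."""
        digest = hashlib.sha256((self._salt + identity).encode()).hexdigest()
        token = digest[:16]
        self._lookup[token] = identity
        return token

    def reidentify(self, token: str) -> str:
        """Recover the identity behind a token (registry holder only)."""
        return self._lookup[token]
```

The design choice this illustrates is the policy trade-off in the text: the shared dataset reveals no names on its face, but anonymity is conditional, since whoever controls the lookup table can reverse it.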

In the face of emergency, existing privacy regulations did not obstruct efforts to manage Covid-19. The European Union’s General Data Protection Regulation (GDPR) has not been an obstacle to the creation of government health surveillance programs or technologies in Europe, even though European officials describe these measures as “an infringement of fundamental rights and freedoms.” The Health Insurance Portability and Accountability Act (HIPAA), which governs the collection and use of health-related data in the United States, was interpreted by the Department of Health and Human Services to allow the sharing of “protected health information” (e.g., patient information) to protect health workers, manage outbreaks of infectious disease, and assist patients.

These measures are temporary relaxations of the rules on government use of health data and do not touch directly on privacy’s central problem—commercial use of personal data. Americans seem willing to trust their location and contact data to health providers rather than to law enforcement or security agencies. Survey data suggests individuals will accept this intrusive surveillance during a pandemic but are reluctant to see it become permanent. The issue is whether, once Covid-19 is under control, to continue health surveillance to warn of and control any future pandemic.

When there is a tangible threat, Americans and Europeans give privacy lower priority. But when Covid-19 pressures decline, there will be a renewed debate on whether citizen concerns over surveillance outweigh public health. Privacy groups have already lined up to oppose permanent change or the retention of health surveillance systems. “We cannot allow the COVID-19 pandemic to serve as an excuse to gut individuals’ right to privacy,” said a statement issued April 2 by 100 human rights groups. However, it seems clear that as long as the threat of Covid-19 remains, the public will support health surveillance.

Health surveillance becomes complicated when it involves commercial actors and the data they already collect. If the public objects to continued health surveillance post-Covid-19, the distrust that greets government surveillance may spread to commercial surveillance if companies are seen as agents of the government. On the other hand, if commercial surveillance data is not shared with health authorities, it creates a significant vulnerability. The surveillance that people accept now amid a pandemic may have less support afterward, and continuation will require new rules and an oversight structure. This could increase demand for opt-in provisions, but opt-in runs counter to the need for continued broad health surveillance.

Uncertainty and Redefinition

The post-privacy environment is marked by uncertainty, in part because people are unaware of the scale of collection and in part because there is disagreement over the form new privacy rules should take. Privacy faces a complicated landscape of digital interconnections that follows the logic of networks and business, not the logic of political borders. The most obvious fault line is between Europe and the United States. The United States has taken a sectoral and self-regulatory approach to privacy where constraints largely apply only to government use of personal data. The European Union has taken a comprehensive regulatory approach. There are disadvantages with each, as the digital environment continues to redefine itself in response to technological change.

What people say about privacy and what they do are very different. People may be unaware of how little privacy they have left online, but consumers make choices that suggest their attitudes toward privacy are changing. Europeans, for example, are hostile toward Google (and other U.S. tech giants) as Google collects immense amounts of personal data, but the great majority of Europeans still avail themselves of Google’s services. An EU-funded, privacy-respecting alternative search engine (Quaero) unveiled a decade ago has so little appeal that one has to google it to find it. This acceptance of intrusive commercial surveillance may be because consumers have little choice (to use a service, you must provide your data), but this raises anti-competitiveness concerns and (at least in Brussels) the question of whether online services should be regulated like public utilities.

Privacy is being redefined. The response to Covid-19 will accelerate this. In a digital environment, you surrender the possibility of being left alone the moment you connect to a digital network, and very few are willing to completely disconnect. Digital technologies create surveillance opportunities that make people uncomfortable but, as in the case of health surveillance, offer real benefit. The process of redefinition will be lengthy and complicated by the increased interest outside the United States in regulation and “data sovereignty.” The immediate issue is whether to continue health surveillance and under what rules such collection would be governed.

Legislating Privacy

Covid-19 will expand public awareness of how technology has commercialized personal data, but it will take time for this awareness to play out in public attitudes and policy. The general sentiment seems to be that the measures of enhanced digital surveillance created in response to the virus are expedient and temporary, justified by health and not security. While these have been presented as provisional, and to be discontinued once the crisis is over, this global pandemic opens the possibility of permanent measures for monitoring disease and health using the data generated by digital devices. If this is the case, both U.S. and European privacy rules will need to be amended to permit it.

While it is unlikely that Congress will pass a national privacy bill in the next year, the need to amend existing laws to accommodate health surveillance, the antitrust implications of a few giant companies dominating the information space, and the growing discontent over privacy will eventually force it to enact legislation. Industrial Age privacy has been eroded to the point of vanishing by the data created by and harvested from digital networks, but Europe is not a good guide on how to construct a new American privacy policy. The EU’s privacy rules helped stifle the European information technology industry and explain why there are no European internet giants. An overly expansive definition of privacy hurts innovation; a minimalist definition (like that in the United States) and the relentless commercialization of personal information leaves citizens vulnerable to exploitation and manipulation, as with Cambridge Analytica. Informed by the experience of Covid-19, Congress may wish to consider the following points:

  • “Personally Identifiable Information” (PII) is a commodity. The business model of the internet is based on trading this commodity for services. No alternative to this business model has emerged, but trade in PII is unstructured and lacks markets or pricing mechanisms. This creates inefficiencies in the exchange and use of personal data and makes individual control over data more difficult.
  • Absent judicial and regulatory pressure, privacy will continue to shrink as new technologies (particularly the combination of fast 5G networks, widespread sensors, and artificial intelligence and data analytics) create new sources of data and make greater use of it. Access to and use of data is essential for economic growth and for public health, but making PII a commodity creates discontent. GDPR is an attempt to change this, but the European experience shows that poorly designed privacy regulations create obstacles to innovation in information technologies.
  • The response to Covid-19 will accelerate corporate and individual concerns over cybersecurity, but there has been no new thinking in a decade on public policy to address the cybersecurity problem. Cybersecurity and privacy are related—both deal with data protection—but have a fundamental difference. Privacy is based on an expectation that rules will be followed. Cybersecurity is a response to rules violations. Covid-19, with its eruption of cyber fraud and cyber espionage, will increase demand for better data protection.
  • Antitrust affects privacy, as the data practices of regulated and unregulated service providers differ significantly. Tensions over antitrust will be exacerbated as changes in telecommunications technology pit large tech companies offering largely unregulated network services against highly regulated telecommunications service providers. As big tech companies begin to offer telecommunications services, they will gain an unintended advantage if this regulatory imbalance is preserved. Unbalanced regulatory treatment of increasingly similar services guarantees a battle. It will not be a repetition of the Uber experience, where a well-financed tech company offered an unregulated service in competition with the highly regulated taxi companies. Unlike the hapless cab drivers, the regulated competitors this time are also financial giants and experienced in the ways of Washington.
  • The imbalance between highly regulated government use of personal data and lightly regulated commercial use dates to the 1980s and no longer makes sense. The response to Covid-19 is an opportunity to move toward greater use of personal data by government agencies for health and economic purposes and more constraints on commercial use. Privacy advocates care about government surveillance but have often given commercial surveillance a pass. Covid-19 has not fundamentally changed their views, and we can expect a backlash from these groups against the measures created for health surveillance. At the same time, the benefits of health surveillance and data analytics for controlling the risk of pandemic will be widely recognized, and demand for some continuation of these measures will probably be unstoppable. A key issue for policymakers will be whether to keep health surveillance in place even after the virus is under control.

Covid-19 came at a time of already heightened attention to the ways that technology reshapes societies and economies. Although there are many contradictions in public attitudes on data collection and use, the broad outlines of a new concept of privacy are emerging that balances greater access with increased transparency and rules on use, since privacy depends on public trust that there are rules that are followed and enforced.

A new rules-based approach to privacy and data protection lies somewhere between the United States’ commercial minimalism and European regulatory overreach. The health surveillance created for Covid-19 can help reframe the issue for Congress to better balance regulation, oversight, government use, and commercial practices.
