Coronavirus Surveillance: Techno-Mediated Future or Dystopia


Author: Rubez Chong is a graduate researcher at the Center for Civic Media, MIT Media Lab, where she studies the socio-technical implications of surveillance technologies. Her current work examines the relationship between smart technologies, surveillance capitalism, and civic resistance.

The latest pandemic news has centered on the eruption of technologies used to usher in the reopening of businesses. Sandwich chain Subway has deployed thermal cameras at several of its outlets to measure employees’ temperatures. PwC has launched a contact tracing app to track employees’ movements at its Shanghai office. Belgian port workers are required to wear digital bracelets that enforce social distancing. Yet what we are witnessing is merely the beginning of the proliferation of surveillance technologies in our everyday lives. From electronic wristbands to digital immunity badges, thermal cameras, and contact tracing apps, these technologies will play increasingly mediating roles in the new pandemic-imposed normal.

It is important, then, to consider what is at stake for our new realities and to critically question the kinds of techno-mediated futures (or dystopias) we want to be living in. What does this public health crisis mean for civic health? Can these technologies both free and shackle us in new ways?

Old Technologies, New Social Rules

In a reflective interview between Shane Smith (CEO, Vice) and Edward Snowden (the famed NSA whistleblower), Smith remarks on the extent to which our smartphones have become our ankle bracelets in this pandemic age. This revelation is not new. Neither are the technologies. Governments and corporations have been surveilling citizen-consumers for years via telecom and location-tracking data. Thermal cameras and Bluetooth signals have also been around for decades. What is different about this new era of surveillance is how rapidly these technologies have been deployed, and how readily accepted, without pause for critical scrutiny. The U.S. Food and Drug Administration (FDA), for example, has said that “it would temporarily allow device makers to market thermal cameras, which have not been vetted by federal health regulators, for temperature checks in places like warehouses and factories.”

These unchecked technologies can have far-reaching consequences that will outlive the pandemic itself. China’s virus-tracking software was first introduced in February, and critics have argued that it has quickly become another tool of social control. Citizens are assigned a red/yellow/green color code that supposedly indicates their health status. In turn, the color code determines their access to public spaces. How an individual’s color code is determined is obscured by the software’s black box of algorithms. Chinese authorities now aim to implement a new health score system that tracks citizens’ daily activities, such as exercise, drinking, and smoking habits. While this is unsurprising for Big Brother China, similar surveillance practices have long existed in democratic nations as well. In the US, renewed public-private surveillance partnerships should raise alarm bells. Without civic oversight, such partnerships can turn this public health crisis into a surveillance-fest: from tracking citizens’ whereabouts, to mapping social relationships and interactions, to collecting under-the-skin health data. Once in place, this surveillance architecture will be hard to control in its uses and growth.

Several have contended that privacy loss is an inevitable price to pay to fight this virus. This assumption is not only narrow but also lacking in social imagination. Protecting public health cannot come at the expense of civic health. Governments need to ensure that their virus-tracing and monitoring techniques value and respect human life, dignity, and the right to privacy. Sam Biddle’s recent piece provides a good starting point for coronavirus surveillance best practices:

  1. Public health officials must drive data decisions instead of government/corporate actors
  2. Coronavirus-related surveillance must be clearly justified against the costs
  3. Data collected for Covid-19 purposes should expire
  4. Data collected for Covid-19 should be walled off, like the U.S. Census
  5. Beware of attempts at “Reputation Laundering” i.e. recasting surveillance technologies as pandemic solutions
  6. Remember the limitations of surveillance and tech

I only touch lightly on his points here because everyone should read his article in full. Instead, I want to pick up on his last point: understanding the limitations of surveillance and tech. In the age of coronavirus surveillance, the overused maxim that “technology is not a panacea” is more relevant than ever.

What Tech Can’t Fix

It is easy to get caught up in the euphoria of “smart” futures, especially in times of crisis. But relying solely on technology, without building the social infrastructures for the pandemic transition, can lead to new types of crises. Firstly, unchecked technologies can lead to new kinds of digital discrimination. Contact-tracing technologies can serve as proxies for silencing political dissent. Exposing potentially infected persons can lead to further discrimination against already disenfranchised groups. In South Korea, for example, an outbreak at a queer-friendly bar has revitalised discrimination towards LGBTQ groups. This digital discrimination can also lead to new forms of ableism. The Equal Employment Opportunity Commission has given companies the green light to withdraw a job offer if a newly hired employee contracts the virus. Already, our immunities are becoming our access cards to employment, services, and places.

Secondly, inaccuracies in these technologies can further exacerbate and perpetuate discrimination. Thermal cameras, for example, detect skin temperature (rather than core body temperature), which is influenced by environmental and physiological factors such as ambient heat and nervousness. They can be off by several degrees Celsius, misclassifying whether someone has a fever. Additionally, many covid-infected people show no symptoms at all, which renders thermal cameras useless against them. Contact tracing apps that operate on Bluetooth signals are also riddled with inaccuracies. While Bluetooth technology is less invasive than GPS location data, its signals are distorted by physical factors like walls, human bodies, and bodies of water. Neighbors separated by a wall might register as a contact event, while two people standing back-to-back, their bodies blocking the signal, might not. Misdiagnoses and false positives can lead to greater social distress and further strain already-overwhelmed hospitals. These apps are only a means to an end. Their efficacy depends on widespread adoption (which hinges on strong public trust in government institutions) and on the social responsibility to self-report and self-quarantine when at risk.
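The distance ambiguity behind these false positives can be illustrated with the standard log-distance path-loss model that Bluetooth proximity estimates rely on. This is a minimal sketch, not the method of any real contact-tracing app; the transmit-power and path-loss values are illustrative assumptions, not calibrated figures.

```python
def estimate_distance(rssi_dbm, tx_power_dbm=-59, path_loss_exponent=2.0):
    """Estimate distance (meters) from a Bluetooth RSSI reading using the
    log-distance path-loss model. tx_power_dbm is the assumed RSSI at 1 m;
    both parameters here are illustrative, not calibrated values."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# In open air, a reading of -75 dBm maps to roughly 6 m of separation,
open_air = estimate_distance(-75)

# while -65 dBm maps to roughly 2 m.
close_contact = estimate_distance(-65)

# But a wall or a human body can attenuate the signal by 10 dB or more,
# so a neighbor ~6 m away through a thin wall and a contact ~2 m away in
# open air can produce the very same reading. No RSSI threshold can tell
# the two apart, which is one source of false positives and missed contacts.
```

The point is not the specific numbers but the shape of the problem: a single signal-strength threshold conflates obstruction with distance, so both over- and under-counting of contacts are baked into the physics.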

I am not suggesting that we stop using these technologies, but that we put as much effort into building the social infrastructure for addressing a public health crisis as we do into building the technical one. We need to ensure that everyone has access to “widespread, accessible, free testing.” Infected and at-risk persons should also have the financial and emotional resources to safely self-quarantine. Future vaccines or treatments need to be made widely available. Our technical solutions cannot replace the human networks needed for a post-pandemic future. Not only are our current civil liberties at stake, but this surveillance infrastructure sets the stage for future surveillance practices.

Protecting Public and Civic Health

In the past two months, we have seen a heightened renewal of online vigilantism. With new social rules in place to limit the spread of the virus, many have taken on the role of social police to enforce these new norms. SG Covidiots, a Singaporean Facebook group dedicated to shaming those breaking social distancing rules, has more than 30,000 members actively posting and sharing the “social deviants” of Covid-19. Others in the US wrestle with calling the cops on their deviant neighbours. New York City’s covid reporting hotline has been overwhelmed with calls. In addition to peer monitoring, employers have explored new forms of workplace surveillance using wearable tracking beacons and screen-monitoring software. Technical pandemic solutions can lead to the further curtailing of civil liberties.

What should our response be then? Building on Biddle’s piece, I suggest 3 additional recommendations:

  1. Governments need to develop transparent plans for the continuous disposal of data. Given that the virus is projected to stick around for some time, governments cannot wait until the pandemic has passed to commit to the dismantling of the surveillance architectures.
  2. Governments need to clearly communicate how data is stored, collected, and used in the interim. People get lost in long policy documents so governments need to develop simple and effective guidelines – think IKEA manual. 
  3. Opting into these technologies should remain voluntary. This ensures a positive feedback loop: governments build trust with their citizens by ensuring user privacy and anonymity, and in turn, more citizens opt into contact tracing, driving adoption and increasing efficacy.

While advances in technology have unquestionably aided in global crises, we cannot, blinded by pandemic desperation, forgo our ability to ask the hard questions. Before rushing to implement these technologies, we need to examine their socio-technical implications for the years, and decades, to come. Who benefits from our surveillance-mediated futures? Who is left out and, perhaps, further disenfranchised? How will data be managed and privacy respected? If we’re not careful, the pandemic becomes just the political nod tech giants have needed to deepen surveillance technologies and cultures.

I am just as eager as everyone else to combat the virus, save lives, and return to some level of pre-pandemic normality. But our exhaustion cannot blind us to the long-term impacts of these surveillance technologies on our everyday lives, many of which will be programmed into the new normal.