We find the way Apple and Google handle privacy concerns in their COVID-19 contact tracing effort acceptable.
Apple and Google announced a joint effort (see Apple and Google partner on COVID-19 contact tracing technology) to develop software that automates “contact tracing.” Users who opt in will be alerted if and when they have come into contact with (their phone has been close to) someone who self-registers as having tested positive for the coronavirus COVID-19. This could be a significant technology in fighting the further spread of COVID-19.
What does the Apple-Google COVID-19 contact tracing app do?
Phase 1 is planned for release in May 2020: infrastructure on both iPhone and Android smartphones that will enable an opt-in app, administered by a state health agency, with which a user can voluntarily indicate that he or she has tested positive for COVID-19. In turn, the app will notify anyone who has opted in to alerts if their phone has been in close proximity to anyone self-registered as having COVID-19, without identifying who.
How does the Apple-Google COVID-19 contact tracing app work?
We understand that Apple and Google are implementing the contact tracing infrastructure as follows: Each participating user's phone is identified by an anonymous key. The determination of proximity between two parties who have opted in is based on a Bluetooth connection between their two mobile phones. Bluetooth is the same low-energy, short-range technology that connects your phone to your wireless earbuds or car radio. The phones store the anonymous keys of other phones that come near them. Unlike the system being implemented in Seoul, South Korea (see the New Yorker’s Seoul’s Radical Experiment in Digital Contact Tracing), for example, no personally identifying information or location information is collected into a central database regarding either party.
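The idea of an anonymous key that is generated on the device and periodically rotated can be sketched as follows. This is a hypothetical illustration in Python, not Apple's or Google's actual implementation: the function names (`new_daily_key`, `rolling_identifier`) and the hash-based derivation are our assumptions for exposition; the real protocol uses standard cryptographic key-derivation primitives.

```python
import hashlib
import os


def new_daily_key() -> bytes:
    """Generate a random key on the device. It never leaves the phone
    unless the user self-registers a positive diagnosis."""
    return os.urandom(16)


def rolling_identifier(daily_key: bytes, interval: int) -> bytes:
    """Derive a short-lived anonymous identifier to broadcast over
    Bluetooth for a given time interval. Rotating the broadcast value
    makes long-term tracking of a single phone harder. (Hypothetical
    derivation: a hash of the key and the interval number.)"""
    return hashlib.sha256(daily_key + interval.to_bytes(4, "big")).digest()[:16]
```

Because the identifier changes every interval while remaining derivable from the daily key, a phone that later learns the daily key of a self-registered positive user can recompute the identifiers it may have seen, without anyone's identity being revealed.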
Specifically, when a user self-registers that he or she has been diagnosed as having COVID-19, only that phone's anonymous key is stored in a central database. If another user opts-in for alerts, his or her app logs keys of phones that have come in close proximity (within x feet and for more than y minutes, where x and y will be configurable as more experience is gained). If any of the keys logged on the phone match that of any of the self-registered COVID-19 users’ keys in the database, the app will send the opted-in user an alert that they have been in close proximity of a COVID-19 user.
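The matching step described above can be sketched in a few lines of Python. This is a simplified illustration under our own assumptions: the data structures (`contact_log`, `positive_keys`) and the function `exposure_alerts` are hypothetical names, and we model only the duration threshold (the "y minutes" parameter), not distance estimation.

```python
from datetime import timedelta

# Hypothetical on-device log: anonymous key -> cumulative contact duration.
contact_log = {
    b"key-aaa": timedelta(minutes=12),
    b"key-bbb": timedelta(minutes=3),
}

# Anonymous keys downloaded from the central database of
# self-registered COVID-19-positive users.
positive_keys = {b"key-aaa", b"key-ccc"}

# The configurable "y minutes" threshold from the text.
MIN_DURATION = timedelta(minutes=10)


def exposure_alerts(log, positives, min_duration=MIN_DURATION):
    """Return True if any key logged on the phone matches a
    self-registered positive key and the contact lasted at least the
    threshold duration. The comparison runs entirely on the phone;
    no identities are exchanged."""
    return any(key in positives and duration >= min_duration
               for key, duration in log.items())
```

The essential privacy property is that the only data leaving the phone is the anonymous key of a user who self-registers; the matching against downloaded keys happens locally.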
Beyond this comparison of keys logged on the phone to keys that should trigger an alert, there is no database correlation of keys to personally-identifiable data. Also, as mentioned, no other third party has the proximity data. For example, the users' telephone carrier is not involved in the communication and has access neither to the COVID-19 registration nor to any alerts sent to opted-in users.
Phase 2 is planned subsequently as a native feature of the respective operating system (iOS and Android) rather than as an add-on app.
Is there a privacy risk?
It appears from Apple’s and Google’s statements that they have paid close attention to privacy issues, to minimize the risk of privacy abuse. Although some cryptographers have identified hypothetical scenarios whereby the technology could be abused to violate privacy rights, several aspects of the design reassure us that much has been done to Apple’s and Google’s credit with user privacy in mind, including the following:
Anonymous: There is no personally-identifiable information recorded anywhere, just an anonymous key stored on the phone belonging to someone who has voluntarily registered that he or she has COVID-19, or an anonymous key of someone who has been in close proximity to such a person.
De-centralized: The list of keys of those who came in close proximity to a COVID-19-infected person is stored on each user's own phone, not in a central database. Those who opt in to be notified receive the alert directly on their phone.
Opt-in: Using the app, whether to register that one has COVID-19 or to receive alerts, is strictly on an opt-in basis.
No location information: The system does not record the location of the users, but just that two anonymous parties have been in close proximity. It uses Bluetooth, not GPS or phone carrier transmission, which would have conveyed location information.
Should I opt in to the Apple-Google COVID-19 contact tracing app?
The question of whether to use this technology is an instance of a tradeoff we have discussed in a previous blog post, Balancing coronavirus privacy vs. public health. While the jury is still out on how airtight the protection of users' privacy will be, given the design factors above that reflect concern for privacy, and given the importance of fighting the spread until a vaccine emerges, we tilt towards recommending opt-in usage of the technology. This is particularly true for those more susceptible to the virus, such as the elderly or those with relevant chronic conditions. It should also be noted that success will depend, in part, on mass adoption of this technology, and it is far from certain that will happen. There have also been concerns about false positives inherent in using Bluetooth (e.g., a COVID-19-positive neighbor safely behind a wall could trigger an alert that you had contact beyond the threshold number of minutes, even though no actual exposure occurred).
Automation of “contact tracing” enabled by the Google and Apple technology has the potential to help fight the spread of COVID-19. Given the privacy-centered precautions taken in the design, we lean towards recommending opt-in adoption as a reasonable balance between a small privacy risk and public health.