Tech solutionism
In many countries, including the Netherlands, a contact tracing app is presented as the solution to get us out of lockdown, but in my experience, technology is only part of the solution.
Apps can only serve as a supporting measure in situations like this. You cannot ignore the organizational measures that will have to be put in place around the app.
For example, what can employers ask of their employees? Will access to public transportation become dependent on your score in one of these apps? If it is known that the apps have a large margin of error, how will any negative consequences for individuals be mitigated? Will there be access to COVID-19 tests to prevent people from having to stay home for two weeks just because their app received a notification?
How people and organizations respond to the use of these apps in practice is incredibly complex. I fear that we cannot understand the ramifications of these apps if we focus on just the app itself. The app’s design is always the end result of an entire process, not the starting point (which it is now for many governments).
Privacy by design is not enough
The current focus of the privacy community is very much on whether such apps meet the principles of privacy by design. We already see a best practice developing on how this can be achieved, using decentralized Bluetooth contact tracing rather than location data stored in central databases.
The ICO also just concluded that the contact tracing framework in development by Google and Apple indeed meets the principles of privacy by design. According to the ICO, the same applies to the “Decentralized Privacy-Preserving Proximity Tracing” system developed by a separate expert group, which is based on similar principles.
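To make the distinction concrete, the decentralized approach works roughly as follows. The Python sketch below is my own heavily simplified illustration of the DP-3T/Apple-Google idea, not their actual specification; the key sizes, derivation scheme and function names are illustrative assumptions. Phones broadcast rotating ephemeral IDs and record the IDs they hear, and matching against an infected user’s published keys happens entirely on the device.

```python
import hashlib
import os

def daily_key() -> bytes:
    # Each phone generates a fresh secret key per day; it never leaves
    # the device unless the user tests positive and consents to upload it.
    return os.urandom(32)

def ephemeral_ids(key: bytes, slots: int = 96) -> list:
    # Derive short-lived broadcast IDs from the daily key (e.g., one per
    # 15-minute slot). Bystanders cannot link them back to a person.
    return [hashlib.sha256(key + slot.to_bytes(2, "big")).digest()[:16]
            for slot in range(slots)]

# Each phone locally stores the ephemeral IDs it hears over Bluetooth.
heard_ids = set()

def check_exposure(published_keys) -> bool:
    # When infected users upload their daily keys, every phone re-derives
    # the corresponding ephemeral IDs and compares them against its own
    # observations. The match happens on the device, so no central server
    # ever learns who met whom.
    return any(eid in heard_ids
               for key in published_keys
               for eid in ephemeral_ids(key))
```

The design choice that matters here is that the social graph never leaves the phone; a central server only ever sees the keys of users who tested positive and chose to share them.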
It is encouraging to see that the privacy-by-design principle in the EU General Data Protection Regulation is indeed doing its job: It challenges developers to innovate and address privacy concerns in the technical design itself.
I find it fascinating that, just as it becomes apparent privacy issues are being addressed in the design of the apps (no centralized monitoring of location data, but decentralized processing of contact tracing), we suddenly see, for example, the French government requesting that Apple and Google facilitate centralized processing.
We really need to stick to our guns here. This is exactly what the privacy rules are for in the first place.
Behavioral sciences by design
What is missing in the ICO’s assessment is the behavioral perspective referred to above. By now, it is clear that these Bluetooth-based contact tracing apps will raise many false positives and fail to raise an alarm in situations where they should (false negatives).
When notifications are not trustworthy, people will simply ignore them. These apps will further influence the behavior of people, companies and institutions. If you lift the lockdown on the assumption that an app can control the infection, you create a false sense of security.
Error margin
The experience in Singapore is a good example: The country had to return to lockdown despite using one of these tracing apps. The app examines whether an individual has been within two meters of someone with COVID-19 for at least 30 minutes. If so, they receive a signal that they, too, are possibly infected.
How trustworthy is that really?
If I kiss someone with COVID-19 for one minute, I will probably get infected, but the app won’t raise an alarm because we were in contact for less than 30 minutes. Meanwhile, Bluetooth travels through glass and even some walls, and may even be triggered by reflections off building windows.
This means the app can give me a signal even though my neighbor behind a wall, or my grandmother behind a window, never had the opportunity to infect me. The app can also only be effective if a majority of the population installs and uses it properly. I doubt whether that is an achievable goal if the app does not generate trustworthy notifications. People will simply click notifications away if they get too many.
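To see why both error types are baked in, consider a toy model of the threshold rule described above. The thresholds come from the article; treating distance as a single estimated number derived from Bluetooth signal strength is my own simplifying assumption.

```python
THRESHOLD_METERS = 2.0
THRESHOLD_MINUTES = 30

def flags_exposure(estimated_distance_m: float, duration_min: float) -> bool:
    # Flag an exposure only if the *estimated* distance (inferred from
    # Bluetooth signal strength, which glass and some walls barely
    # attenuate) stayed within 2 meters for at least 30 minutes.
    return (estimated_distance_m <= THRESHOLD_METERS
            and duration_min >= THRESHOLD_MINUTES)

# False negative: a one-minute kiss is high-risk but under the time threshold.
assert not flags_exposure(estimated_distance_m=0.1, duration_min=1)

# False positive: 45 minutes beside a neighbor behind a wall looks like a
# close contact to Bluetooth, although no transmission was ever possible.
assert flags_exposure(estimated_distance_m=1.5, duration_min=45)
```

No tuning of the two thresholds fixes this: The rule only sees radio proximity and time, not whether a wall, a window or a mask sat between the two people.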
Voluntary?
Another concern is that, even if the app is voluntary, employers and governments will demand a “green code” before individuals can go back to work, enter a cafe or use public transport. Those are serious consequences on the basis of an extremely fallible system.
Right now, everyone has to practice social distancing, but if some of us are allowed to go to work or a bar and some of us are not, then people will start gaming the apps. This could include turning off Bluetooth or the device when meeting up with others, for example, or carrying another person’s device when boarding public transportation.
The monitoring that would be required to counter this — by checking whether people have their mobile phone on them or whether their Bluetooth is switched on — can get out of hand fast. This is not China; forcing people to be tracked will not work. We will have to encourage people to take responsibility themselves.
How can it be done?
Practice shows that apps will only be used if they have substantial added value for the users themselves. If using the app entails the risk of being penalized with a red code, nobody will use it.
It may well yield better results to turn the functionality around by having the app provide a notice only to the user: Where have you been in the past weeks, and with whom did you have extensive contact?
Notifications can then be sent to these people, which makes it more likely that they will act responsibly. If an individual receives an email or call from someone they know, they can immediately assess how intensive the contact was. People are far more inclined to act on such a personal note than on an anonymous, context-free notification saying, “You were in the vicinity of an infected person.”
Why would an individual go into quarantine then? These types of automatic warnings will not achieve the results we want. Technology is often very alienating, and we can easily hide behind it (no need to follow up when you get infected; all is taken care of by the app).
This solution will obviously not work when using public transportation or entering the supermarket. The solution here could be for the app to scan a QR code on entry and exit; if someone present later turns out to be infected, an automated notification can be sent, as sketched below.
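As a hypothetical illustration of how such QR-based check-ins could drive notifications (the data model and names below are my own assumptions, not an existing system):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Visit:
    user_id: str
    venue_id: str       # identifier from the QR code scanned at entry/exit
    entry: datetime
    exit: datetime

def overlapping_visitors(visits, infected: Visit) -> set:
    # Everyone who was in the same venue at the same time as the infected
    # visitor; these are the users to notify automatically.
    return {v.user_id for v in visits
            if v.venue_id == infected.venue_id
            and v.user_id != infected.user_id
            and v.entry < infected.exit
            and infected.entry < v.exit}
```

Note that this variant deliberately matches on shared time in a shared space rather than on radio proximity, which sidesteps the through-the-wall false positives described above.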
It would also help if the app had functionality where I could dismiss notifications I think are not relevant (waving to your mum behind a window). An individual would then have a clean set of contacts to notify if they contract the virus.
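A minimal sketch of what that dismissal functionality could look like (again purely illustrative):

```python
# Each logged contact carries a flag the user can set when a notification
# is clearly irrelevant (e.g., waving to your mum behind a window).
contacts = [
    {"contact_id": "A17", "dismissed": False},
    {"contact_id": "B42", "dismissed": True},   # grandma behind the window
]

def contacts_to_notify(contacts):
    # The "clean" set of contacts to alert if the user contracts the virus.
    return [c for c in contacts if not c["dismissed"]]
```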
The recent EDPB guidance on contact tracing apps
There has been a lot of criticism that the European Data Protection Board has been very slow to issue useful guidance on urgent topics relating to COVID-19. This has resulted in 25 different pieces of COVID-19 guidance from the supervisory authorities of the EU member states, some of which are in direct conflict.
When the GDPR was adopted, the European Commission touted that we would have one pan-European law replacing the inconsistent patchwork of 28 national laws that existed at the time. Especially in a crisis, I would have expected the EDPB to rise to the occasion. That being said, the guidance issued April 21 on the use of COVID-19 contact tracing tools is more specific and therefore more useful.
The starting point for the EDPB is that contact tracing apps should be voluntary and should not rely on tracking individual movements based on location data, but on proximity information regarding users (e.g., contact tracing using Bluetooth). Especially noteworthy is that the EDPB stresses that such apps cannot replace, but only support, manual contact tracing performed by qualified public health personnel, who can sort out whether close contacts are likely to result in virus transmission or not. This seems to preclude, for example, employers using these apps to control admission of their employees and suppliers.
The EDPB further notes that any identification of an infection risk is likely to have a high impact on individuals, such as having to remain in self-isolation until testing negative, and that this necessitates the ability for individuals to contest false positives.
Again, we will have to take organizational measures to ensure that the impact of the app is mitigated.
Long-term effects?
Assuming we address privacy-by-design and behavioral issues, concerns remain about whether the world will ultimately change for the better. For example, Apple and Google are about to build the contact tracing functionality into the operating system layer of their platforms (iOS and Android).
This fundamentally changes the amount of control users have: We can uninstall an app, but we cannot uninstall the entire OS and still use our device.
According to the specifications, users can switch “exposure notifications” on and off in their settings. But even if these are switched off, the contact tracing functionality remains in place in the background, where it can be leveraged at any time by any app across the globe. This puts quite the onus on tech companies to act responsibly, to say the least.
The pressure to leverage this functionality will not only come from authoritarian states. Experience shows that whenever there is a technical possibility, many urgent purposes for applying it suddenly appear.