As we continue to navigate Covid-19, the search for a safe and sustainable path out of lockdown becomes more urgent every day.
The medical and scientific communities have been highlighting the key steps on that path for some time. One of those steps, the development, roll-out and use of contact tracing technology, presents some real challenges.
The tracing of those who may have been in contact with infected persons is crucial. Without real-time information on who has been exposed to whom, and who might be at risk of spreading the virus, we are travelling the path blindfolded.
Contact tracing, at its most effective, will include the use of technologies that are now almost ubiquitous to supplement an otherwise laborious process. The most controversial and widely discussed of these is a contact tracing app.
In recent times, we have become accustomed to our smartphones knowing where we are, what we are doing and, often, with whom we are doing it.
An uneasy détente has been reached between our desire to maintain a degree of privacy over our lives and our willingness to offer up that privacy in return for information, facilities and services making our lives ever easier.
We engage in constant trade-offs, and data privacy laws sit between those two competing objectives. They seek to provide a bulwark against the exploitation of our data without inhibiting technological progress and commercial enterprise.
In the Covid-19 world, the trade-off is between an invasion of privacy in which data may be shared and analysed outside of the normal course of events, and the health (physical, mental and economic) of individuals and our nation as a whole.
Our current crisis, and the technology likely to be used as part of the response, bring a new dimension to this ongoing uneasy balance. Most people accept the concept of technology, such as contact tracing apps, as a tool to accelerate significantly the next steps in moving towards a society living with, and after, Covid-19.
However, many also have expressed very real concerns that such technology could introduce, legitimise and ultimately normalise a level of digital surveillance that most of us would, in normal circumstances, be unwilling to accept.
Developments to date
In Ireland, Nearform has been tasked with developing our version of a contact tracing app. Executives at the HSE are reportedly engaging with the European Commission, the Attorney General and the Office of the Data Protection Commissioner on how this app will be built, and how it will operate.
However, there is a lack of information on key matters relating to the Irish app, including what information will be collected and how; who will be responsible for the data collected; and, as basic as it might appear, the precise extent of tracing facilities that the app will offer.
Just as crucially, little information has been released on the development process itself. Privacy advocates in Ireland have called for the publication of the Data Protection Impact Assessment (DPIA) prior to the app being released, as well as the release of the relevant source code.
In Europe, bodies such as the European Commission and the European Data Protection Board have released guidance and toolboxes for use by member states. Coalitions of EU scientists and technologists are developing standards and protocols.
Apple and Google, in an unprecedented step, have come together to develop both a tracing app for incorporation into software, and a broader cross-platform system that would use wireless signals to inform people if they encounter someone who has, or is later diagnosed with, Covid-19.
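The decentralised approach described by Apple and Google can be illustrated in very simplified form. The sketch below is a toy model, not the actual protocol: function names and parameters are illustrative assumptions, and the real system involves considerably more cryptographic detail. The core idea is that each phone broadcasts short-lived random identifiers derived from a secret daily key, and matching against diagnosed users' keys happens locally on the device.

```python
import hashlib
import hmac
import secrets


def daily_key() -> bytes:
    # Each phone generates a fresh random key per day; it stays on the
    # device unless the user is diagnosed and consents to upload it.
    return secrets.token_bytes(16)


def rolling_ids(key: bytes, intervals: int = 144) -> list:
    # Derive short-lived broadcast identifiers from the daily key
    # (one per 10-minute interval, so 144 per day). Observers cannot
    # link the identifiers to each other, or to a person, without the key.
    return [
        hmac.new(key, i.to_bytes(4, "big"), hashlib.sha256).digest()[:16]
        for i in range(intervals)
    ]


def exposure_check(observed: set, diagnosed_keys: list) -> bool:
    # After a diagnosis, daily keys are published; every phone re-derives
    # the identifiers locally and checks for overlap with what it heard.
    # No central authority ever learns who met whom.
    for key in diagnosed_keys:
        if observed & set(rolling_ids(key)):
            return True
    return False
```

The privacy-relevant design choice here is that the matching step runs on each user's own phone against published keys, rather than a central server collecting everyone's contact history.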
Already, the use of similar apps has generated debate and protest in other countries. In France, Germany and Britain, privacy advocates as well as the general public have resisted elements of, or all of, any proposed apps.
In France, 45 per cent of respondents to a survey by the Jean Jaurès Foundation said they would not be willing to download a contact tracing app, with commentators saying that it represented “a risk of sliding towards a form of digital tyranny”.
Meanwhile in Asia, countries such as China, Hong Kong, South Korea and Singapore have all introduced versions of this software, varying hugely in terms of technologies employed and the range of surveillance involved.
On April 17, the European Commission issued a guidance note on apps “supporting the fight” against Covid-19 in relation to data protection. This guidance followed its recommendation on the development of a common “toolbox” for the EU on the use of technology in the fight against the virus.
The toolbox, released by the eHealth Network, a platform for member states’ competent authorities on digital health, focuses on two areas, one of which is the development of a pan-European approach for the use of apps, including essential requirements, interoperability, safeguards, governance and accessibility.
This focus on governance and safeguards is to be welcomed.
A new angle on the question of surveillance
Employing technology to accelerate contact tracing, even in the context of a prevailing global pandemic, has provoked a debate on the ethics of digital surveillance. This debate has been ongoing in the European Union since before the introduction of the GDPR in May 2018.
The GDPR takes a rights-based and principles-based approach to the protection of personal data, mandating lawful and fair processing, purpose limitation, data minimisation and storage limitation as fundamentals of any processing activities on personal data.
It also leads with umbrella principles such as transparency, proportionality and data protection by design and by default. The GDPR’s mere existence demonstrates that as a society we value privacy, and offers some comfort to those of us alive to the risks and opportunities associated with sharing our data.
The data protection framework has been created within the EU to build technologies, and an environment in which they may operate, which uphold fundamental principles and protect personal freedoms.
It may even be the case that successful creation and management of technologies such as contact tracing apps validates the GDPR and data privacy as a concept, by showing how data use and privacy may co-exist to the benefit of all.
Data protection principles
The protection of public health is already provided for, both in the GDPR and in the Irish national implementing legislation, the Data Protection Acts 1988-2018.
For example, Recital 52 of the GDPR specifically refers to “the prevention or control of communicable diseases and other serious threats to health” as a reason for which EU or member state law may derogate from the prohibition on processing “special category” personal data.
It also cites contact tracing as a valid ground for relying on derogations from principles relating to the transfer of personal data outside the EU.
Even if contact tracing technology does not specifically involve the processing of “special category” health data, it has been acknowledged at European and national level that the regulation envisaged and provided for the processing of data for public health purposes.
If government and key stakeholders apply fundamental data protection principles in building a digital contact tracing system - and in communicating with the public regarding that system and its ongoing implementation and management - then the integrity of both our data and our data governance framework can be ensured without compromising the effectiveness of that tracing system.
Data protection supervisory authorities, privacy campaigners and advocacy bodies and, ultimately, citizens themselves will be central in demanding and ensuring that those principles are respected and applied from the outset.
Strict adherence to a number of core principles of the GDPR will clearly be vital: the technology must not only operate within the legal confines of the regulation, but be seen to do so, something which is key to securing public support and adoption.
Anonymisation of any personal data collected by contact tracing apps has been identified by many as being a fundamental component of the data protection framework. If successfully achieved, this would take the data outside of the scope of the GDPR entirely.
However, the ability to truly and completely anonymise data collected in this manner will always be questionable. Accordingly, other processing principles can, and should, be given the consideration and importance they require.
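A toy example illustrates why “anonymisation” is so often questionable in practice. Replacing an identifier with a hash of itself (the function names below are hypothetical, chosen for illustration) produces data that merely looks anonymous; because the space of possible inputs, such as national phone numbers, is small and guessable, the original value can be recovered by exhaustive search. Under the GDPR this is pseudonymisation, and the data remains personal data.

```python
import hashlib


def pseudonymise(phone_number):
    # Hashing looks anonymous, but the same input always yields the
    # same output, so records remain linkable to the individual.
    return hashlib.sha256(phone_number.encode()).hexdigest()


def reidentify(target_hash, candidates):
    # With a small, guessable input space, an attacker who obtains the
    # "anonymised" value can recover the original by trying candidates.
    for number in candidates:
        if pseudonymise(number) == target_hash:
            return number
    return None
```

This is why truncation, rotation and aggregation of identifiers, rather than hashing alone, feature so heavily in serious anonymisation designs.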
The requirement of transparency in the processing of personal data, a principle which runs through the GDPR and data privacy law as a whole, should be at the heart of the technology from concept to roll-out and beyond.
Calls for the release of the source code and the publication of DPIAs in advance of the Irish app’s release have already been made. Such steps would lay the foundations for a subsequent data governance regime that retains transparency at its core.
Citizens should be kept fully informed at all times as to data held, the status of that data and the processing being employed on it.
Effective accountability will also be a deciding factor in the technology’s success. In order to engage with it, the Irish public will need to know to whom collected personal data is being provided, and who is responsible for that data’s security and for restricting its use to the app’s clearly identified purposes.
Those persons or bodies will, and should, be required to fulfil their obligations under law and demonstrate that they have done so.
Data protection by design and default
Without respecting the doctrines of data protection by design and by default from the outset, this project will fail. Unlike in many Asian and other countries, it is not within the gift of European governments and agencies to introduce wide-ranging and intrusive technologies in a coercive (or even merely obligatory) manner.
National governments and the EU will have one chance to implement this technology successfully, with a failure to do so setting back the entire Covid-19 response across the continent. Only by taking a privacy-first approach from the outset can this technology avoid faltering before it begins.
The meaning of “success”
What do we mean when we speak of technology succeeding or failing? In this context, success or failure means securing the public’s trust to a level necessary to ensure it is adopted in sufficient numbers to give the platform a chance of making a difference.
Without sufficient support, the tracing mechanisms facilitated by that technology are of little use. It is, therefore, public buy-in rather than technical superiority or mass data gathering that is needed.
Such buy-in will only be secured through comprehensive and obvious implementation of, and adherence to, these data protection fundamentals.
Recital 4 of the GDPR notes that “the processing of personal data should be designed to serve mankind” and that the protection of personal data is not an absolute right.
That statement resonates in this time of crisis, when personal data provides the key to the lockdown exit door. Nonetheless, it remains the case that a balancing of rights is not the same as an expropriation of rights.
The statement that data processing should serve mankind is made at the outset of the foremost piece of legislative protection given to personal data globally. The statement exists within that context and so, therefore, should the technology it must facilitate.
This is a question of leadership. Europe has been, and continues to be, a leader in promoting and protecting data privacy as a 21st century right.
You and I, as European citizens, are now being called upon to exercise this privacy leadership as a central part of the effective use of technology in our fight against Covid-19 across the Union.
Sarah Slevin is a corporate solicitor in Ronan Daly Jermyn in Dublin, and a member of the firm’s cyber and data protection team. She advises both private and public sector clients on best practice in data protection, and specifically GDPR compliance.