By David Erdos
Faculty of Law, University of Cambridge
Any views expressed herein are not necessarily the views of CIPL or Hunton Andrews Kurth LLP
At least in Europe, the basic data protection framework coalesced during the 1970s and very early 1980s. Even then, it was a framework under serious socio-technological challenge. But since that time our experience of the gathering and spread of personal data through computerized networks has altered radically. One such change concerns the large-scale online spread of third-party personal data, which is promoted, organised, aggregated and rendered searchable by public dissemination services. These intermediary publishers include generalist and specialised search engines, social networking sites and a wide range of other online platforms. Huge conundrums present themselves as to whether and, if so, what data protection duties these services are subject to as a result of this dissemination of personal information. Whilst Data Protection Authorities (DPAs) have grappled with such questions over many decades, it was the Court of Justice of the EU’s decision in Google Spain that brought them to wide public attention. Google Spain memorably found that, at least in relation to name-based searches, a generalist search engine would need to respond ex post to claims to deindex personal data whose processing would otherwise violate data protection norms. More abstractly, the Court specified that
Inasmuch as the activity of a search engine is … liable to affect significantly and additionally compared with that of the [original] publishers … the fundamental rights to privacy and to the protection of personal data, the operator of the search engine as the person determining the purposes and means of that activity must ensure, within the framework of its responsibilities, powers and capabilities, that the activity meets the requirements of [data protection].
As my own work has explored, it is concerning that these tests do not seem to be present in the actual legislative framework which the Court said it was applying. European data protection has generally adopted a “processing” model which holds an operator responsible in any case where it is determining purposes and means, irrespective of whether the activity is significantly or additionally impactful. Moreover, the idea of removing duties on the basis of a lack of capability (or even power) in the relevant systems would appear in tension with the general expectation (now explicitly stated in laws such as the GDPR) that controllers should proactively ensure data protection by design. Nevertheless, the tests themselves may be considered quite reasonable and, aside from where services are clearly operating on behalf of another named or traceable entity which can realistically be held legally accountable, should be broadly applied. Even so, not only is there a need to place all this on a statutory footing, but these abstractions also clearly require considerable further contextual specification. Thus, whilst Google and other general search engines have accepted that nominative searches significantly and additionally affect data subject rights, what about searches based on an image, a telephone number or a job title and workplace? Should the functionality of social networking sites be considered intrinsically significantly and additionally impactful or, if not, how should in-scope processing be demarcated there? Finally, what can be done to ensure that action is respectful of users’ enjoyment of freedom of expression, without fundamentally undermining these personal information safeguards? Related questions are being addressed in a wider range of legal contexts, as the European Commission’s draft Digital Services Act and the UK Government’s Online Safety Bill highlight. The development of data protection should, therefore, take full account of these related developments. Only by legislators themselves addressing these issues in earnest can we hope that the landscape in this area might be effectively charted, to the potential benefit of legitimate services, users and data subjects alike.