By Yann Padova
Partner, Baker McKenzie
Former Commissioner, Commission de Régulation de l’Energie
Any views expressed herein are not necessarily the views of CIPL or Hunton Andrews Kurth LLP
If you search for the word “innovation” in the GDPR, you might be disappointed: it does not appear even once. In today’s digital world, this is the elephant in the room. Everyone can see that innovation drives new services and differentiates businesses, with ultimate consequences for economic growth. Everyone knows that most digital innovation relies on intensive data processing. Yet the most important piece of legislation on personal data in Europe is silent about it.
This silence is compounded by another debate about the need to encourage, if not require, companies to share data. Such calls focus on specific sectors, such as the environment, transportation, insurance and health, or target specific actors, not least the GAFAM. The rationale behind them is either the alleged scarcity of available data, which is believed to impair innovation, or the concentration of data in the hands of a few dominant, gate-keeping companies, which is said to have the same effect.
To counter these phenomena, the EU Commission contemplated in 2020, among other measures, introducing the notion of “data in the general interest”, following the initiative taken by French lawmakers in 2016. However, the legal validity of this new notion remains to be demonstrated. Moreover, a legal ground for mandatory sharing is difficult to establish given the value and strength of the right to property. More recently, the EU Commission introduced a draft regulation, the Digital Markets Act, which targets so-called gatekeeper companies and imposes several obligations, in particular relating to the combination and use of data.
Nevertheless, our point here is that these claims underestimate the importance of the systemic and legal conditions that make it possible to use data to foster innovation. Data innovation has defining qualities that sit uneasily with some of the GDPR’s core principles. Data innovation, call it AI today or “Big Data” a few years ago, is about finding correlations in massive and messy data sets without purposes specified a priori, learning algorithms that change depending on their inputs and outputs, and the reuse of data and its combination with other sources. All of these features collide with fundamental GDPR principles such as data minimisation, purpose limitation, storage limitation and data accuracy, to name a few. The tension is obvious, and the focus on data sharing may be looking at the wrong side of the problem.
Indeed, is innovation just a question of access to massive volumes of data, or does it also, and above all, require a favourable and incentive-based legal framework? On the substance, the claim that data is scarce is debatable, and relying on this assumption favours “quantitative” criteria to the detriment of an analysis of the systemic and legal prerequisites for innovation and competition.
British regulators have understood the complexity and ambivalence of the links between innovation, competitiveness and data protection. They have organized themselves to make innovation an instrument for strengthening companies’ competitiveness through the so-called “sandbox” technique, [1] thereby demonstrating that an approach based on innovative uses of data is just as important as, if not more important than, the mere issue of access to data and data sharing. Such initiatives may be seen as an implicit acknowledgment that the interplay between the legal framework for data protection and innovation is not, by design, necessarily and unconditionally favourable to innovation.
This need to organize a “safe space” for innovation has crossed the Channel, since the EU draft regulation on AI also provides for sandboxes. Perhaps the GDPR’s drafters should reflect on whether the current regulation strikes the right balance between innovation and data protection and provides the relevant tools to address the unprecedented challenges and opportunities of AI.
[1] The Financial Conduct Authority (FCA) implemented the first “sandbox” in 2015. The ICO in turn committed to this approach in 2018.