Centre for Information Policy Leadership
  • Home
  • About
    • Meet the Team
  • Membership
  • Events
  • Projects
    • AI Project
    • Brazil AI Project
    • Organizational Accountability
    • Protecting Children's Data Privacy >
      • Policy Paper I: International Issues & Compliance Challenges
    • EU GDPR Implementation >
      • Global Readiness Benchmarks for GDPR
    • Enabling Data Driven Innovation and Big Data >
      • Privacy Risk Management
      • Transparency and User Controls
      • Updating Core Privacy Principles
    • Role of the DPO
    • Enabling Global Data Flows
    • Regional Focus and Outreach >
      • Effective LGPD
  • Resources
    • CIPL White Papers
    • Public Consultations
    • CIPL Articles
    • Hunton Andrews Kurth Privacy & Information Security Law Blog
    • Corporate Digital Responsibility and Accountability
    • Regulatory Engagement
    • Artificial Intelligence
    • Digital Economy and Society
    • Cross-Border Data Transfer Mechanisms
    • GDPR Implementation
    • US Privacy Framework
  • CIPL Blog
  • Media
  • Careers
  • Contact Us

Increasing Trust In Our Digital Societies And Economies: A Key Factor To Improve Personal Data Protection

9/30/2021

By Eduardo Bertoni
Representative of the Regional Office for South America of the Inter American Institute of Human Rights
Former Director of the Argentine Data Protection and Access to Information Authority


Any views expressed herein are not necessarily the views of CIPL or Hunton Andrews Kurth LLP.
The pandemic caused by COVID-19 triggered many discussions about the benefits of “digital economies” and the transformation of our societies into what we can call “digital societies.” In truth, these discussions were latent; the health emergency only accelerated processes that were already in the works. Teleworking, for example, was possible before the pandemic, but it has since become necessary and, as time has passed, increasingly incorporated into our lives. Similar processes took place in areas such as telemedicine, and in meetings on global issues that today can be held at low cost by convening people from different countries virtually.

All these activities involve in one way or another the use or processing of personal data. If such use or processing is carried out without rules that we trust will protect our privacy, we run the risk of losing the enormous opportunities that technology offers us today.

The question we must answer is whether the rules that exist today are sufficient to generate trust in users and whether, thanks to that trust, they can support the continuation and development of what we now accept as usual activities in these “digital societies.”

Regrettably, my answer is that they are not.

The rules that exist today to protect personal data do not generate trust because they are not accepted globally. This is a problem because personal data constantly moves across country borders, and preventing this flow makes the activities of digital societies impossible.

An important reason for the decrease in trust is that, because these rules have not been globalized, the state entities in charge of protecting personal data (in the few cases where they exist and are independent) cannot enforce their decisions. In other words, if a company is sanctioned in one country for violating data protection rules, it can evade the sanction for reasons of jurisdiction or applicable law. The consequence is that users in “digital societies” trust neither the state institutions nor the rules that are supposed to protect them.

It is true that there are developments that can make us cautiously optimistic: the European Union’s push for other jurisdictions to align with its own rules (the GDPR) and the Council of Europe’s attempt to globalize Convention 108 are examples of measures that aim to globalize these rules. But in Latin America, for instance, very few countries are considered to have adequate legislation or are party to Convention 108.

Consequently, one of the greatest challenges we face today is to work toward an international treaty with clear rules that can be followed in practice and, ideally, enforced globally. An agreement that brings States, companies, academics and users to the table for discussion is essential. Just as the pandemic accelerated latent uses of technology, it can perhaps also accelerate the discussion and adoption of such a global agreement.

Welcome developments in data protection, but are they enough?

9/20/2021

By Malcolm Crompton
Founder & Lead Privacy Advisor, Information Integrity Solutions
Former Australian Privacy Commissioner


Any views expressed herein are not necessarily the views of CIPL or Hunton Andrews Kurth LLP.
The rise of the digital age has profoundly changed the economics of personal data handling. The growing sophistication of data analytics coupled with data storage capabilities that would have been unthinkable a generation ago has created conditions that amplify the value of personal data to businesses. This has obvious implications for personal privacy because in the end, each data point is about a real live individual who deserves dignity, respect, and a rich personal life ‘out of view’.

The problem we face is that this astonishing new data-driven business model is being regulated, in large part, by legacy privacy laws. Many such laws were either developed pre-internet or are based on laws that were. This includes the EU General Data Protection Regulation (GDPR). The result is a serious disconnect: traditional privacy principles such as data minimisation and purpose limitation clash with new business imperatives that demand maximal data collection and unfettered data use and reuse.

And yet – with the usual lag between business innovation and regulatory reaction – we are beginning to see some changes in the regulatory landscape that demonstrate, if nothing else, that regulators and governments are actively grappling with this changed state of play.

First, there has been a conscious effort in more jurisdictions to expand the extra-territorial reach of privacy legislation, as Australia did 20 years ago. The trans-border operation of so many data-driven organisations has meant that data protection authorities have at times struggled to hold such organisations to account.

Second, there has been a trend toward broadening the legal definition of ‘personal data’. This enables privacy laws to regulate data (used in online tracking, profiling and targeted advertising) which might otherwise fall outside traditional definitions of personal data. The GDPR and the California Consumer Privacy Act (CCPA) both offer examples of this.

Third, new and reformed privacy laws have strengthened regulator powers and increased the size of the penalties available to them. Several recent high-profile privacy cases have involved record-breaking fines. New laws – most notably the GDPR – have sought to enable imposition of fines that reflect the size and revenue of the tech giants and therefore better incentivise compliance. In other cases, particularly those involving the US Federal Trade Commission (FTC), we see record fines being imposed under existing laws.

Fourth, a number of jurisdictions have expanded the rights available to individuals under privacy law. The EU introduced a range of rights into the GDPR including the right to erasure (also known as the ‘right to be forgotten’), the right to restrict processing, the right to object and rights associated with automated decision-making. The GDPR also gives individuals the right to withdraw consent at any time. Other jurisdictions have joined the EU in legislating the ‘right to be forgotten’, while California has given its consumers the right to demand that an organisation not sell their personal information.

Fundamentally, such changes (the wider remit of privacy law, stronger regulator powers and expanded individual rights) attempt to correct some of the power asymmetry between individuals on the one hand and tech giants and other data-driven organisations on the other. Nevertheless, the same intractable problems persist: the failure of the ‘Notice and Consent’ model wherever it is used, including in the GDPR; the limits of traditional privacy principles; the conundrum of data sovereignty; the inadequacy of consent, buckling under the weight of overuse; and others. The recent developments outlined here are just a start, and they are not enough.

In addition, and notwithstanding a few significant examples of enforcement, the funding of privacy and data protection authorities worldwide is woefully inadequate, almost without exception. Most organisations are ‘getting away with it’ most of the time, including in Australia, Europe, the USA and elsewhere.

I believe that we are at a tipping point, by which I mean that we are at the point of engaging with these issues in new ways. The lines are already blurring between conversations about privacy, data sovereignty, AI, anti-trust and even democracy, which creates fertile conditions for innovation in how we approach privacy.

My final observation is that there is nothing inevitable about the data-driven business model we are confronted with in 2021. This approach has been powered by two factors: obfuscation of how personal information is actually used (as so pungently described by the Australian Competition and Consumer Commission in the final report of its Digital Platforms Inquiry) and the innate human inability to weigh, even in one’s own interest, short-term gain against long-term loss. At last, the scale and impact of those long-term losses, such as insidious adverse discrimination and damage to democracy, are becoming clear in the public mind.

In the same way that the world eventually learnt that ozone-depleting and now carbon emissions could actually endanger the planet, I am confident that the combination of economic pressure and regulation will produce an alternative approach to personal information and privacy. It will not be fast and it will not be easy, but it will happen.


Copyright © 2025 by the Centre for Information Policy Leadership at Hunton Andrews Kurth LLP.
Disclaimer | Privacy Policy | Cookies Policy | CA Privacy Notice | Contact