Centre for Information Policy Leadership
  • Home
  • About
    • CIPL Principals
    • Quarterly Reports
  • Membership
  • Events
    • Past Events
  • Projects
    • AI Project
    • Brazil AI Project
    • Organizational Accountability
    • Protecting Children's Data Privacy >
      • Policy Paper I: International Issues & Compliance Challenges
    • EU GDPR Implementation >
      • Global Readiness Benchmarks for GDPR
    • Enabling Data Driven Innovation and Big Data >
      • Privacy Risk Management
      • Transparency and User Controls
      • Updating Core Privacy Principles
    • Role of the DPO
    • Enabling Global Data Flows
    • Regional Focus and Outreach >
      • Effective LGPD
  • Resources
    • CIPL White Papers
    • Public Consultations
    • CIPL Articles
    • Hunton Andrews Kurth Privacy & Information Security Law Blog
  • CIPL Blog
  • Media
  • Contact Us

Lessons from COVID-19 for a New US Privacy Framework

8/25/2020


 
COVID-19 has forced an increased reliance on technology and data, both in our daily lives and in responding to the pandemic. The pandemic has also demonstrated, more than ever before, the need for a comprehensive US federal privacy framework. CIPL has published a new paper entitled “Data Protection in the New Decade: Lessons from COVID-19 for a US Privacy Framework.” It highlights seven key lessons from the health crisis to consider when developing a new privacy framework for the US:
 
1. Data and the technologies that facilitate its collection and use are an essential part of our lives.
 
Since the start of the pandemic, technology fueled by data has kept our economy and society operating as key aspects of our lives (e.g. work, shopping, education, entertainment, medical care, social life, etc.) have moved online. Data has also been essential for medical research and for developing tools to fight the pandemic and ensure a safe reopening of our businesses. This situation has also highlighted the importance of sharing data both between organizations and between the public and private sectors, which, in turn, has put a spotlight on important data protection and privacy issues. We’ve learned that we must have a privacy framework that is flexible and nimble enough to effectively meet the increasing need to use and share data in new ways. Any US data privacy law, therefore, needs to be drafted in a way that both protects individual privacy and enables the effective use of data.
 
2.  A privacy law must not impede the responsible use of artificial intelligence (AI).
 
AI has played a key role in developing technologies to combat the spread of COVID-19 as well as in developing a vaccine and other treatments for the virus. These are just the latest examples of how AI has been used to revolutionize business operations and generally transform core aspects of how we live. As such, any privacy rules we create should not seek to impede the development of AI technology, but must provide reasonable guardrails that enable its further development and responsible use.
 
3. The right to privacy must be balanced with other fundamental rights.
 
Times of crisis have demonstrated that the right to privacy cannot be absolute and must be balanced with other fundamental rights such as healthcare and the freedom of movement. A well-tailored privacy law can and must provide the flexibility to respond to crises such as the pandemic while also protecting individual privacy. As explained further below, a privacy law that is grounded in organizational accountability and rigorously enforced can deliver the appropriate balance and flexibility.
 
4. Traditional interpretations of data protection principles have proven insufficient to keep up with modern data uses.
 
Modern uses of data are challenging long-standing privacy principles. Consent has proven particularly inadequate to protect individuals given how data is used today and how it’s being used to respond to the pandemic. While consent remains relevant in some contexts, consent requests can improperly suggest to individuals that they are choosing between compromising their privacy (by giving their consent) and maintaining their privacy (by not consenting). But privacy protections should not and need not depend on whether one has consented to a particular data use. Moreover, consent can be burdensome to individuals in our increasingly complex, data-driven economy. Even privacy experts could not invest the time and analysis it would take to make appropriate choices in the many contexts where consent is being requested. This overuse of consent has resulted in consent fatigue, which can render even legitimate and appropriate consent requests meaningless. There are also many uses of data for which consent is not possible or even desirable -- for example, developing a vaccine, enforcing quarantine, or contact tracing for people who have been exposed to the coronavirus, in addition to protecting national security, enforcing criminal laws, and conducting life-saving research. Thus, while there is a role for consent in certain circumstances, it should not be the principal protection mechanism of a modern-day privacy law.
 
5. Privacy laws should focus less on the collection of data and more on the use of data after collection.
 
Many existing privacy laws and proposals focus on the collection of data. However, the COVID-19 pandemic has demonstrated that there are always compelling reasons for collecting data, such as preventing the spread of the virus or conducting medical research. Thus, privacy laws should focus less on data collection and more on how collected data can be used. They should apply a risk- or harm-based approach to determine what uses should be prohibited or allowed based on the actual risk they pose to individuals, taking into account the available mitigations to reduce the risk.
 
6. Privacy laws should embrace an accountability-based model of data protection.
 
The accountability-based model of data protection is the most promising model for the digital economy and society. It incorporates privacy risk assessment as one of its most important core elements. Risk assessments enable organizations to devise targeted privacy protections that focus on risky and harmful data uses while enabling other data uses that are not risky or harmful. This approach is ideally suited to the privacy challenges posed by unforeseen events like a pandemic because it facilitates tailoring privacy protections on a case-by-case basis to the risks at hand rather than casting the protective net so widely that it impedes beneficial and harmless data uses. CIPL’s Accountability Framework provides organizations a comprehensive approach for building, implementing and demonstrating accountable and risk-based privacy management programs. While this approach can and should be used even in the absence of a privacy law, any new US privacy law should incorporate an accountability requirement that can be implemented through comprehensive privacy management programs or other measures that operationalize compliance. Other accountability measures include leadership and oversight, appointing a person responsible for data protection compliance, effective and actionable transparency, training of relevant employees, written policies and procedures (including on data security), and implementing contractual measures to ensure accountability in the context of cross-border data transfers. Such accountability measures are the future of ensuring both responsible and innovative data uses and robust and enforceable protections for individual privacy.
 
7. Comprehensive federal privacy legislation is the best approach to ensuring privacy protections in the US.
 
COVID-19 does not abide by state borders, and large amounts of data need to be shared across the country to respond to the emergency. If it wasn’t already clear, this situation has illustrated the importance of a privacy law that provides uniform protections for this data throughout the US. Personal data should not be subjected to a patchwork of different privacy regimes. Consumers deserve consistent protections and businesses deserve consistent rules that can enable economic activity and innovation across the country. Given the importance of personal data in the modern economy (as brought into even sharper focus by the pandemic), a single comprehensive approach to US privacy law should be considered a top priority, not least to facilitate economic recovery. It would rationalize and streamline data privacy requirements for US businesses and provide the basis for consumers to gain trust in the digital economy, embrace new technologies, and welcome rather than fear broad uses of data for social good and other beneficial purposes.
 
For more information on any of these topics, please see CIPL’s new paper.

Are Our Privacy Laws Asking Too Much of Consumers and Too Little of Businesses?

12/13/2019

 
In the last few weeks in the US, Democrats and Republicans from the Senate Commerce Committee have each released draft comprehensive federal privacy legislation bills, and there is a considerable amount of overlap between them. In the Committee’s recent hearing the two sides appeared closer than ever to a bipartisan compromise on a privacy bill. But despite this potential breakthrough, it’s important that lawmakers take the necessary time to ensure they get this groundbreaking legislation right before it becomes law.  

In our complex data-driven society, privacy laws will not be able to provide effective privacy protections if they continue to be rooted in notice and choice. That model no longer scales to our near-constant interactions with data, and it has proven to be a failure for a variety of reasons. Unfortunately, lawmakers appear to be doubling down on the outmoded individual control paradigm of privacy that many experts have deemed ineffective. California’s Consumer Privacy Act (CCPA) features notice and choice as its main protection, and most proposed privacy bills at the federal and state levels in recent months have done the same. But, with only one comprehensive state privacy law on the books, and an unsettled federal privacy landscape, there still is time to direct the US privacy approach towards one that will protect and empower individuals more effectively. An accountability-based model, which places the burden on organizations, not individuals, to prevent privacy harms, delivers far stronger privacy protections.

Of course, the logic behind notice and consent appears sound enough: companies provide individuals information about how their personal data will be used to empower them to make informed decisions, and individuals choose whether to consent to handing over their data based on that information. It may have served us well for a while, but at this stage it is time to abandon this approach. In fact, many privacy regulators and experts from civil society and academia have come to recognize that the notice and consent model of privacy protection is no longer workable. For example, FTC Commissioner Rebecca Slaughter has repeatedly outlined the limitations of notice and consent in her speeches and testimony. Similarly, Professor Woodrow Hartzog noted in his testimony before the Senate Commerce Committee in February that “notice and consent has failed.” Consent places an immense burden on individuals to protect themselves and understand what is happening with their data, and they simply cannot make informed decisions in each and every one of the countless daily online interactions involving their personal information. The sheer volume of personal data collected, inferred, used and shared in the digital economy makes this impossible.

However, as noted above, the early efforts from the federal government and the states to craft new privacy laws have not been promising in that respect. The CCPA, which passed last year and will go into effect in January, grounds its protections in a consumer’s right to opt out of the sale of personal information, and requires consumers to inform themselves and act upon that information to protect themselves. While it certainly provides Californians with some new privacy protections not provided by existing U.S. laws, it ultimately asks too much of individuals while ignoring available tools that are better suited for providing effective protections. Similarly, several federal proposals such as Rep. DelBene’s Information Transparency and Personal Data Control Act and Reps. Eshoo and Lofgren’s recently introduced Online Privacy Act of 2019 rely on notice and consent as their primary method to protect consumer privacy. Likewise, the two most recent bills by Senator Cantwell and Senator Wicker make notice and consent (both opt-in and opt-out) a prominent feature in the protections they provide.

Effective privacy protections cannot be based upon the premise that consumers know what they’re consenting to (or failing to consent to) when all research shows that they aren’t actually reading privacy policies. And simply improving notice and consent mechanisms (for example, through shorter, easier-to-understand pop-up notices) is not the answer either. Such improvements, though laudable, cannot address the consent fatigue caused by the onslaught of privacy notices and consent requests. In the context of cookie notices, which have become more detailed and prominent since the introduction of the GDPR, we have seen that consumers are likely to accept the terms just to get a pop-up off their screen, especially when they show up again and again. Consumers are tired of these notices and just want the content they’re trying to access. Indeed, when the choice is between accepting the terms or not gaining access to the service, is that choice even meaningful?

This is not to say that there is no role for notice and choice in future privacy laws. But it must be limited to where it is truly meaningful - perhaps in the context of sharing some types of highly sensitive data for a purpose unrelated to that for which it was collected, such as a pharmacy selling customer information to a lifestyle brand. But for the vast majority of information uses, privacy laws should include different and superior requirements that would actually result in empowering individuals and delivering more effective protections for their data and privacy. 

  • Enhanced User-Centric Transparency - Ensuring that individuals have visibility into what data is being collected on them and how it’s being used is essential for engendering trust in the digital economy and creating accountability. Appropriate disclosures and information should absolutely remain a priority for both lawmakers drafting privacy laws and companies using personal data. Organizations must be transparent not only about what information they collect and how they use and share it, but also about the accountability mechanisms they employ to protect consumers from harm and, importantly, what rights individuals have and how they can obtain redress when harm occurs. Privacy policies must also provide sufficient information to regulators about organizations’ data practices so that they can be evaluated and enforced against. Thus, transparency has an important role beyond just enabling consent.

  • Individual Rights – Appropriate access, correction, deletion and portability rights empower individuals and give them control over their personal data without undermining organizations’ ability to work with data. These rights have already been enshrined in the GDPR and, to some extent, the CCPA, and should be adapted to the US context in any new legislation. In addition, individual empowerment can be significantly safeguarded through improved complaint-handling requirements and redress rights for individuals who have experienced privacy harms. Combined with the other accountability-based obligations described in this article that would shift the primary burden of protecting privacy onto organizations, this approach would reduce the constant pressure on individuals to make ex ante guesses about what choices will protect them and replace it with effective and efficient remedies if something does go wrong.

  • A Risk- and Harm-Based Approach - Privacy laws should require organizations to focus on preventing privacy harms to individuals by identifying the potential risks of their data uses and removing or mitigating them through appropriate mechanisms and tools such as anonymization, de-identification, appropriate use limitations, effective redress mechanisms, and privacy by design. This approach puts individuals at the center of an organization’s information management practices and results in better protection for individuals, particularly in instances where consent is neither effective nor feasible. Significant modern privacy laws such as the EU GDPR and the Brazilian LGPD already incorporate obligatory risk assessments, including through requirements to conduct formal Data Protection Impact Assessments (DPIAs), also known more generally as Privacy Impact Assessments, in certain contexts.

  • Legitimate Interest Processing – An example of data processing based on risk assessment is the so called legitimate interest ground for processing, which authorizes data processing activities for which a risk assessment has demonstrated that the benefits to the organization or a third party are not outweighed by the interests and risks of harm to individuals. This ground for processing is one of the several co-equal grounds for processing both in the EU GDPR and the Brazilian LGPD (with consent being another one). Including a legitimate interest ground for processing in a US law would provide a formal mechanism for organizations to process data for beneficial purposes as long as they have demonstrably mitigated any risks to individuals. This mechanism requires organizations to consider, in advance, whether processing is likely to result in injury, unfairness or discrimination to individuals, and thus ensures organizations are considering impacts to individuals in their decision-making process. It would also enable responsible data uses where other grounds (like consent) are ineffective and unavailable, such as in the case of previously unanticipated uses of data like in the context of big data analytics and AI and machine learning. To ensure legal certainty and accountability, regulatory guidance could define the risks and harms that would have to be avoided, as well as establish appropriate methodologies for assessing and weighing the involved risks and benefits. 

  • Fair Processing – Fair processing is a separate data protection principle in many privacy laws around the world. The US FTC Act also includes a variant of this principle by prohibiting unfair business practices, including in the context of using personal data. While “fairness” has been difficult to define, spelling out parameters for fair processing presents another vehicle for requiring organizations to focus on the impact of their data uses and to prevent harm, including discrimination. Thus, any new privacy law should include appropriate fair processing requirements, potentially as further defined through regulatory guidance.

  • Accountability – All major modern privacy laws (GDPR, Brazil’s LGPD, India’s draft privacy law, etc.) require companies to have comprehensive privacy management and compliance programs. This is often referred to as “organizational accountability.” In fact, this should be a core component of any modern privacy law as it will provide the structure and processes required for compliance and delivering effective protections to individuals. Such accountability-based privacy programs would include all of the above and other measures to address all key elements of accountability: leadership and oversight; risk assessment; written policies and procedures; transparency; training and awareness; monitoring and verification; and response and enforcement (e.g. complaint handling). Indeed, through its privacy consent orders, the US FTC has also embraced organizational accountability mediated through comprehensive privacy programs. See, for example, the recent FTC consent orders against Facebook and Equifax, imposing strong accountability-based privacy compliance programs. Of course, the specifics of these programs can and must be tailored and scaled to the size and nature of the organization and the way in which it uses personal data.

Some have noted that an accountability and risk-based privacy framework places too much faith in companies to do the right thing. But with clear substantive rules set forth in the law or through regulations and guidelines (defining, for example, the harms that must be prevented), coupled with rigorous enforcement by the Federal Trade Commission, state attorneys general, and possibly even consumers in some instances, companies will have to implement strong privacy practices.

The U.S. might be the only first-world country without a comprehensive privacy law, but that means it can learn from and improve upon the laws other countries have put into place. Doubling down on what’s proven to be an ineffective notice and consent regime won’t result in a privacy law that gives consumers the protections they need, and it will result in unnecessary impediments to effective and beneficial uses of personal data. To deliver strong privacy protections and enable innovation, we need a framework that empowers consumers beyond consent through a range of accountability measures that place the burden of protecting individuals against actual harms on the organizations that process personal data.

Copyright © 2022 by the Centre for Information Policy Leadership at Hunton Andrews Kurth LLP.