Centre for Information Policy Leadership

The U.S. Urgently Needs a Comprehensive Privacy Law that Goes Beyond the Fair Information Practices

11/30/2021

By Woodrow Hartzog, Professor of Law & Computer Science, Northeastern University
and Neil Richards, Koch Distinguished Professor in Law, Washington University in St. Louis


Any views expressed herein are not necessarily the views of CIPL or Hunton Andrews Kurth LLP.
America’s privacy bill has come due. Since the dawn of the Internet, Congress has repeatedly failed to build a robust identity for American privacy law. But now both U.S. states like California and the European Union have forced Congress’s hand by passing legislation like the California Consumer Privacy Act (CCPA) and the General Data Protection Regulation (GDPR). These data protection frameworks, structured around a set of principles known as the Fair Information Practices (the “FIPs”), have industry and privacy advocates alike calling for a “U.S. GDPR.” States seem poised to blanket the country with FIP-based laws if Congress fails to act. The United States is thus in the midst of a “constitutional moment” for privacy, in which intense public deliberation and action may bring about constitutive and structural change. And the European data protection model of the GDPR is ascendant.
 
But as American privacy law enters its constitutional moment, there is a great risk that U.S. lawmakers will embrace a watered-down version of the European model. European-style data protection rules have undeniable virtues, but they won’t be enough. The FIPs assume data processing is always a worthy goal, but even fairly processed data can lead to oppression and abuse. Data protection is also myopic because it ignores how industry’s appetite for data is wrecking our environment, our democracy, our attention spans, and our emotional health. Even if E.U.-style data protection were sufficient, the United States is too different from Europe to implement and enforce such a framework effectively on its own European terms. Any U.S. GDPR would in practice be what we call a “GDPR-lite.”
 
Our argument is simple: In the United States, a data protection model cannot do it all for privacy, though if current trends continue, we will likely entrench it as though it can. We propose instead a more comprehensive approach to privacy that is better focused on power asymmetries, corporate structures, and a broader vision of human well-being. Settling for an American GDPR-lite would be a tragic ending to a real opportunity to tackle the critical problems of the information age.
 
If you look closely, the foundation for a pluralistic American theory of privacy based upon constraining corporate power and protecting vulnerable consumers has already been established. We must embrace it. Practically speaking, lawmakers, courts, and companies must embolden the doctrines and legal tools that advance this agenda. This means strengthening trust-based torts like the breach of confidence and theories of indirect liability, prohibiting more data practices outright, and being more skeptical of the role of consent in validating data practices. It also means that both governments and organizations must leverage the concept of privacy to further the overall well-being of their citizens and customers.
 
An effective approach to privacy also requires a shift from focusing mainly on procedural rules to include substantive restrictions as well. Procedural requirements like obligations to get people’s consent for data practices ultimately normalize the kinds of data collection and surveillance harms they are supposed to mitigate. They are a recipe for companies to exploit and manipulate people in service of ever more data. The substantive shift we call for will require lawmakers to revisit some basic assumptions about when data collection and processing are desirable, and to entertain bolder obligations, such as outright bans and moratoria on certain technologies and practices. It also requires legislatures to be imaginative and go beyond the standard suite of procedural safeguards like transparency and data subject rights like access to data. Lawmakers have been remarkably creative in crafting rules for other industries. They should leverage the power to tax, change business incentives, and pierce the corporate veil in going beyond standard data and consumer protection approaches to confront modern privacy risks.
 
If the United States is to take the modern privacy dilemma seriously, lawmakers must act urgently and be willing to expend political capital for effective rules. America’s privacy reckoning is here, but its identity has yet to be defined. Congress has an opportunity to show leadership by embracing a comprehensive approach that addresses modern data and privacy problems, not those of the 1970s. But if it fails to embrace a comprehensive framework that addresses corporate power, vulnerabilities in information relationships, and data’s externalities, America will be consigned to a weak and myopic approach as its constitutional moment passes. Settling for an American GDPR-lite would be a tragic ending to a real opportunity to tackle the critical problems of the information age.

Neurotech and Privacy of the Mind

11/18/2021

By Dario Gil
Senior Vice President & Director
IBM Research


Any views expressed herein are not necessarily the views of CIPL or Hunton Andrews Kurth LLP.
The next 10 years will bring about all manner of revolutionary data-driven technologies that pose both tremendous benefits and alarming privacy risks. Of these, neurotechnology, or neurotech, will likely be one of the most disruptive.

Neurotech is our, frankly, mind-blowing attempt to connect human brains to machines. Although brain-computer interfaces (BCIs) are the heart of neurotech, it is more broadly defined as technology able to collect, interpret, infer or modify information generated by any part of the nervous system. Why? To develop therapies for mental illnesses and neurological diseases. Beyond health care, it could soon be used in education, gaming, entertainment, transportation and so much more.

But there are pitfalls: no widely accepted regulations or guardrails yet govern neurotech’s development or deployment. We need principles and policies around neurotech, along with technology safeguards and national and international regulations.

Neurotech is far from just conceptual: such technology has already improved the quality of life and abilities of people with illnesses or impairments ranging from epilepsy to Parkinson’s disease to chronic pain. One day, we might implant neurotech devices into paralyzed humans, allowing them to easily control phones, computers and prosthetic limbs with their thoughts alone. In 2017, Rodrigo Hübner Mendes, a paraplegic, used neurotech to drive a racecar with his mind. Recently, an invasive neurotech device accurately decoded imagined handwriting movements in real time, at a speed that matched typical typing. Researchers have also shown how invasive neurotech allows users with missing or damaged limbs to feel touch, heat and cold through their prostheses.

Emerging applications of neurotech hold even more promise. Not only can neurotechnology sense or read neurodata, it can also modulate it, both invasively and noninvasively. This research is still in its early stages, but it’s advancing rapidly. One astounding example is the work of Rafael Yuste, a neurobiologist at Columbia University. His team recorded the neuron activity of a mouse that was performing an action, such as licking, for a reward. Later, the researchers reactivated these same neurons and got the mouse to perform the same action, even though the rodent did not intend to do it at that moment. It is easy to imagine how this technology could lead to new breakthrough treatments for people with physical disabilities, for example.

Neurotech is still extremely immature. As it becomes more commonplace, we must consider the risks it might present, the ethics around it, and what regulation would be appropriate. Such risks are indeed vast, in some cases challenging the very autonomy of our own actions and the privacy of our thoughts. What if someone were to face employment discrimination because the algorithms that power a neurotech application used for hiring misinterpreted their neurodata? What if someone’s most sensitive and private thoughts were shared without their knowledge or consent? Of particular concern is the fact that most of the neurodata generated by the nervous system is unconscious, meaning users could unknowingly or unintentionally share sensitive neurodata. The presumption of privacy within one’s own mind may simply no longer be a certainty.

While it is too early to know how to answer the questions neurotech poses about privacy and ethics, we need to ensure that researchers, corporations, policymakers, and consumers alike study and monitor this technology carefully. Developers of neurotech in particular must reaffirm their commitment to responsible innovation and help develop and enforce guardrails so that these technologies lead to beneficial long-term outcomes for the economy and society alike.
Copyright © 2022 by the Centre for Information Policy Leadership at Hunton Andrews Kurth LLP.