In our complex, data-driven society, privacy laws will not provide effective protections if they remain rooted in notice and choice. That model no longer scales to our near-constant interactions with data, and it has proven a failure for a variety of reasons. Unfortunately, lawmakers appear to be doubling down on the outmoded individual-control paradigm of privacy that many experts have deemed ineffective. The California Consumer Privacy Act (CCPA) features notice and choice as its main protection, and most privacy bills proposed at the federal and state levels in recent months have done the same. But with only one comprehensive state privacy law on the books and an unsettled federal privacy landscape, there is still time to steer the US privacy approach toward one that protects and empowers individuals more effectively. An accountability-based model, which places the burden on organizations, not individuals, to prevent privacy harms, delivers far stronger privacy protections.
Of course, the logic behind notice and consent appears sound enough: companies provide individuals information about how their personal data will be used so they can make informed decisions, and individuals choose whether to consent to handing over their data based on that information. The model may have served us well for a while, but it is now time to abandon it. Indeed, many privacy regulators and experts from civil society and academia have come to recognize that the notice and consent model of privacy protection is no longer workable. FTC Commissioner Rebecca Slaughter, for example, has repeatedly outlined the limitations of notice and consent in her speeches and testimony. Similarly, Professor Woodrow Hartzog noted in his testimony before the Senate Commerce Committee in February that “notice and consent has failed.” Consent places an immense burden on individuals to protect themselves and understand what is happening with their data, and they simply cannot make informed decisions in every one of the countless daily online interactions involving their personal information. The sheer volume of personal data collected, inferred, used and shared in the digital economy makes this impossible.
However, as noted above, early efforts by the federal government and the states to craft new privacy laws have not been promising in this respect. The CCPA, which passed last year and goes into effect in January, grounds its protections in a right to opt out of the sale of personal information, requiring consumers to inform themselves and act on that information to protect themselves. While it certainly provides Californians some new privacy protections not found in existing U.S. laws, it ultimately asks too much of individuals while ignoring available tools better suited to providing effective protection. Several federal proposals, such as Rep. DelBene’s Information Transparency and Personal Data Control Act and Reps. Eshoo and Lofgren’s recently introduced Online Privacy Act of 2019, likewise rely on notice and consent as their primary method of protecting consumer privacy. And the two most recent bills, by Senator Cantwell and Senator Wicker, make notice and consent (both opt-in and opt-out) a prominent feature of the protections they provide.
Effective privacy protections cannot be based on the premise that consumers know what they’re consenting to (or failing to consent to) when research consistently shows that they aren’t actually reading privacy policies. Nor is simply improving notice and consent mechanisms (for example, through shorter, easy-to-understand pop-up notices) the answer. Such improvements, though laudable, cannot address the consent fatigue caused by the onslaught of privacy notices and consent requests. Cookie notices, which have become more detailed and prominent since the introduction of the GDPR, show that consumers are likely to accept the terms just to get a pop-up off their screen, especially when it appears again and again. Consumers are tired of these notices and just want the content they’re trying to access. Indeed, when the choice is between accepting the terms and losing access to the service, is that choice even meaningful?
This is not to say that there is no role for notice and choice in future privacy laws. But it must be limited to contexts where it is truly meaningful, such as sharing certain types of highly sensitive data for a purpose unrelated to that for which they were collected (a pharmacy selling customer information to a lifestyle brand, for instance). For the vast majority of information uses, privacy laws should instead include different and superior requirements that would actually empower individuals and deliver more effective protections for their data and privacy.
- Enhanced User-Centric Transparency – Ensuring that individuals have visibility into what data is being collected about them and how it is being used is essential for engendering trust in the digital economy and creating accountability. Appropriate disclosures should remain a priority both for lawmakers drafting privacy laws and for companies using personal data. Organizations must be transparent not only about what information they collect and how they use and share it, but also about the accountability mechanisms they employ to protect consumers from harm and, importantly, about what rights individuals have and how they can obtain redress when harm occurs. Privacy policies must also provide regulators with sufficient information about organizations’ data practices to allow those practices to be evaluated and enforced against. Transparency thus has an important role beyond merely enabling consent.
- Individual Rights – Appropriate access, correction, deletion and portability rights empower individuals and give them control over their personal data without undermining organizations’ ability to work with data. These rights have already been enshrined in the GDPR and, to some extent, the CCPA, and should be adapted to the US context in any new legislation. Individual empowerment can be further safeguarded through improved complaint-handling requirements and redress rights for individuals who have experienced privacy harms. Combined with the other accountability-based obligations described in this article, which would shift the primary burden of protecting privacy onto organizations, this approach would reduce the constant pressure on individuals to make ex ante guesses about which choices will protect them and replace it with effective and efficient remedies if something does go wrong.
- A Risk- and Harm-Based Approach – Privacy laws should require organizations to focus on preventing privacy harms to individuals by identifying the potential risks of their data uses and removing or mitigating them through appropriate mechanisms and tools, such as anonymization, de-identification, appropriate use limitations, effective redress mechanisms, and privacy by design. This approach puts individuals at the center of an organization’s information management practices and yields better protection, particularly where consent is neither effective nor feasible. Major modern privacy laws such as the EU GDPR and Brazil’s LGPD already incorporate mandatory risk assessments, including requirements to conduct formal Data Protection Impact Assessments (DPIAs), known more generally as Privacy Impact Assessments, in certain contexts.
- Legitimate Interest Processing – An example of data processing based on risk assessment is the so-called legitimate interest ground for processing, which authorizes processing activities for which a risk assessment has demonstrated that the benefits to the organization or a third party are not outweighed by the interests of, and risks of harm to, individuals. This ground is one of several co-equal grounds for processing in both the EU GDPR and the Brazilian LGPD (consent being another). Including a legitimate interest ground in a US law would give organizations a formal mechanism to process data for beneficial purposes so long as they have demonstrably mitigated any risks to individuals. It requires organizations to consider, in advance, whether processing is likely to result in injury, unfairness or discrimination, ensuring that impacts on individuals factor into their decision-making. It would also enable responsible data uses where other grounds (like consent) are ineffective or unavailable, such as previously unanticipated uses of data in big data analytics, AI and machine learning. To ensure legal certainty and accountability, regulatory guidance could define the risks and harms that must be avoided and establish appropriate methodologies for assessing and weighing the risks and benefits involved.
- Fair Processing – Fair processing is a separate data protection principle in many privacy laws around the world. The US FTC Act also includes a variant of this principle by prohibiting unfair business practices, including in the context of using personal data. While “fairness” has been difficult to define, spelling out parameters for fair processing presents another vehicle for requiring organizations to focus on the impact of their data uses and to prevent harm, including discrimination. Thus, any new privacy law should include appropriate fair processing requirements, potentially as further defined through regulatory guidance.
- Accountability – All major modern privacy laws (the GDPR, Brazil’s LGPD, India’s draft privacy law, etc.) require companies to have comprehensive privacy management and compliance programs, often referred to as “organizational accountability.” This should be a core component of any modern privacy law, as it provides the structure and processes required for compliance and for delivering effective protections to individuals. Such accountability-based privacy programs would include all of the measures above, along with others addressing the key elements of accountability: leadership and oversight; risk assessment; written policies and procedures; transparency; training and awareness; monitoring and verification; and response and enforcement (e.g., complaint handling). Indeed, through its privacy consent orders, the US FTC has also embraced organizational accountability mediated through comprehensive privacy programs; see, for example, the recent consent orders against Facebook and Equifax, which impose strong accountability-based privacy compliance programs. Of course, the specifics of these programs can and must be tailored and scaled to the size and nature of the organization and the way in which it uses personal data.
Some have argued that an accountability- and risk-based privacy framework places too much faith in companies to do the right thing. But with clear substantive rules set forth in the law or through regulations and guidelines (defining, for example, the harms that must be prevented), coupled with rigorous enforcement by the Federal Trade Commission, state attorneys general, and possibly even consumers in some instances, companies will have to implement strong privacy practices.
The U.S. may be the only developed country without a comprehensive privacy law, but that means it can learn from and improve upon the laws other countries have put in place. Doubling down on what has proven to be an ineffective notice and consent regime will not produce a privacy law that gives consumers the protections they need, and it will create unnecessary impediments to effective and beneficial uses of personal data. To deliver strong privacy protections and enable innovation, we need a framework that empowers consumers beyond consent through a range of accountability measures, placing the burden of protecting individuals against actual harms on the organizations that process personal data.