Around the world, new privacy laws are coming into force and outdated laws continue to be updated: the EU GDPR, Brazil’s Lei Geral de Proteção de Dados Pessoais (LGPD), Thailand’s Personal Data Protection Act (PDPA), India’s and Indonesia’s proposed bills, California’s Consumer Privacy Act (CCPA), and the various efforts in the rest of the United States at the federal and state levels. This proliferation of privacy laws is bound to continue.
Yet, the impact of modern information technologies and the data and business ecosystems they enable transcends national boundaries. The economic value of global data flows associated with this reality has surpassed the value of traditional trade in goods. And consumers and citizens are also increasingly mobile and global. In this globally connected world, it will be essential to drive convergence and ensure consistency and interoperability between privacy laws and regulations. This is crucial both for delivering effective privacy protections and for maximizing the ability to engage in responsible and beneficial data uses.
Modern privacy laws and regulatory guidance must be developed with an eye to other global privacy regimes rather than solely within domestic echo chambers. While the goal of global convergence must be balanced with unique local requirements and priorities, the vast majority of core privacy protections and principles can and should be harmonized in the digital world of today.
2. Bottom-up Best Practices for Accountable AI
Lawmakers are just starting to dip their toes into AI regulation, but there seems to be widespread recognition that premature regulation and over-regulation could stifle AI growth and development. Traditional data protection principles don’t necessarily align neatly with AI technologies, so lawmakers and regulators must be ready to establish and interpret rules more flexibly based on the risks to individuals and benefits to society. At the same time, some AI-based technologies, such as facial recognition, are rightfully causing a sense of urgency among all stakeholders as well as calls for specific regulation.
While such steps are being debated, AI use is increasing at a rapid pace. It is, therefore, imperative that organizations that develop and use AI technologies start building internal programs, processes, tools and techniques to deliver accountable and trustworthy AI. Indeed, more than in any other context, organizational accountability and best practices will be the key ingredients in the “secret sauce” for an effective regulatory response to AI. Accordingly, CIPL has proposed a layered approach to AI regulation based on three pillars: (1) leveraging and expanding existing privacy tools and norms, such as privacy impact assessments and transparency; (2) setting expectations for organizations in all sectors to build demonstrable and verifiable accountability programs and best practices for AI; and (3) adopting regulatory practices that are risk-based, transparent and emphasize thought-leadership, as well as incorporate innovative oversight mechanisms such as sandboxes, seals and certifications, and constructive engagement between regulators and organizations.
3. Promoting Accountable Free-Flow of Data
The free flow of data is essential to a sustainable global digital economy. Safeguarding this free flow of data requires coherent and efficient cross-border data transfer mechanisms and avoidance of data localization requirements that artificially constrain data flows.
Both organizations and individuals expect that privacy protections and accountability will follow the data. Policymakers and lawmakers must build on those expectations. They must drive convergence between data transfer mechanisms, so that organizations can use the same types of mechanisms, such as contracts, certifications or binding corporate rules, no matter where they operate. They must also drive interoperability between different transfer mechanisms and create “adaptors” for the different national privacy “plugs”. GDPR Binding Corporate Rules (BCR) and APEC Cross Border Privacy Rules (CBPR) should be able to talk to one another, just as the future GDPR certifications and CBPR should be able to.
Accountability frameworks, such as certifications and codes of conduct, will have to become part and parcel of any effective comprehensive privacy law and framework around the world. This includes certifications based on ISO standards, EU binding corporate rules, APEC CBPR or similar formal accountability frameworks. Organizations should actively consider these certifications and start using them not only as transfer mechanisms but also to provide assurance of compliance with the ever-growing body of national laws and to demonstrate that they are responsible partners to businesses, consumers and regulators. Organizations that have comprehensive privacy programs will automatically be ready to go this extra step and successfully obtain such certifications.
4. Pursuing Constructive Engagement between Data Protection Authorities and Industry
Regulating cutting-edge, complex and evolving technologies presents unprecedented challenges for regulators, particularly when resources are limited. Deterrence and punishment alone have proven to have limited effectiveness in achieving desired results, much less in encouraging a race to the top in the market. If regulators want to be effective, they must apply modern and innovative regulatory approaches as well as prioritize open and constructive relationships with the organizations they regulate. This is true for all regulators, including those in jurisdictions with well-established data protection cultures and histories. In countries that have, or soon will have, new or first-time privacy laws, like Brazil and India, newly established data protection authorities now have the advantage of getting this right from the start. Providing guidance and thought-leadership, and encouraging and incentivizing good behavior and accountability, will yield results, as this approach encourages a race to the top and taps into the desire of organizations to be seen as responsible businesses and trusted users of data.
Regulated entities must also prioritize and be ready for such constructive engagement. They must share knowledge and help educate regulators about new technologies. There is considerable scope for building compliance solutions cooperatively and ensuring responsible innovation that also protects the rights and interests of individuals. Regulatory sandboxes are a perfect example. These allow businesses to test innovative products, services and business models in real life and with actual customers under the supervision of a regulatory body. They create a safe haven for regulated companies to experiment and innovate, while helping regulators better understand the technologies they are regulating.
5. Expanding the Beneficial Use of Data through Accountable Data Sharing Arrangements
Data sharing between public and private organizations and within and across industry sectors will likely become more important, even transformational, to the modern data economy and digital society. It fosters competition and innovation, particularly in the context of AI-based technologies and data-driven business models. It is essential for academic and scientific research, as well as for machine learning and algorithmic training. Data sharing also improves the effectiveness of governments and public policy, from health, education and tax to social policy, all of which increasingly rely on data-driven decisions.
There is a real need to develop a framework for trusted data sharing based on organizational accountability. An overly “user centric” approach that makes data sharing dependent on choices made by individuals may actually defeat the benefits and full potential of data sharing. Instead, the focus of the debate should be the wide range of accountability measures that organizations could employ in this context, from risk assessments, transparency, proportionality and articulation of values and benefits to governance and data sharing agreements.
6. Saying Goodbye to the Individual Control Paradigm of Privacy Protection
What is an approach to privacy regulation that is fit for the 21st Century’s Fourth Industrial Revolution? Will we start to design and apply our privacy laws in ways that truly work for individuals, or will we continue to design and apply them to make individuals work? Under the old model – making individuals work – consumers have to read endless privacy notices and constantly make choices about how their personal data may be used as they use different services and go about their daily life and work. Being able to make such choices is, of course, important in certain situations, but in many cases, there are better ways to protect individuals that don’t require consumers to become full-time privacy professionals.
Despite mounting evidence of its ineffectiveness in protecting privacy, and calls from all corners to shift away from this approach, the notice, choice and consent model of privacy protection is still alive and well in 2020. Several recent proposals for U.S. federal privacy legislation have relied heavily on notice and consent, as has nearly every state bill that’s been introduced so far this year. The EU’s proposed ePrivacy Regulation has the same problem. Even where privacy laws, such as the GDPR, do not privilege consent over other bases for processing personal data, deep-rooted habits of regulators and policymakers continue to treat notice and consent as a sine qua non of privacy and data protection, at the expense of better options grounded in “organizational accountability” that are available in plain sight (see below).
Our digital world and society need new and different approaches to regulating data privacy, while still empowering individuals. Now more than ever, it is essential that we unite in educating law- and policymakers in the US, Europe and beyond about the benefits of the accountability-based model of privacy regulation and about alternatives to the old-fashioned individual consent approach as grounds for processing personal data.
7. Welcoming Organizational Accountability
Organizational accountability requires organizations to implement comprehensive privacy programs governing all aspects of the collection and use of personal information. It also requires organizations to be able to demonstrate the existence and effectiveness of such programs upon request. It ensures robust protections for individuals and their data while enabling responsible data collection, use and sharing, placing more responsibility on organizations that are collecting and using data and less burden on individuals.
One of accountability’s core features is risk-based privacy protection, which can provide organizations broad latitude in using personal data in no- or low-risk contexts, enable more targeted and effective protections where actual risks are identified, and may include legal or regulatory prohibitions on certain high-risk activities that cannot be made safe. Risk-based privacy programs enable organizations to focus on truly risky processing and prioritize their privacy protections in areas where they really matter. Under this approach, the primary burden of protecting individuals would lie with organizations, which would now be required to formally identify privacy risks, mitigate them, and be able to demonstrate and justify their risk assessments and mitigations.
Polls confirm that individuals are concerned about how organizations use their personal data. Individuals are looking for value and responsible stewardship of their data. Many organizations are increasingly waking up to the existing trust deficit and are taking proactive steps to address it, even when it’s not yet explicitly required by law. They are building privacy management programs that include leadership and oversight, risk assessments, policies and procedures, transparency, training and awareness, monitoring and verification, and response and enforcement. Enlightened organizations that take this approach are realizing the business and competitive benefits that flow from having such comprehensive privacy programs. It enables them to unlock the potential of their major asset – data – and to drive business growth and competitiveness through data-driven innovation.
The next year and decade will be all about accountability and corporate digital responsibility. CEOs and senior business leaders, as well as corporate boards must be ready for this step change and set the tone for this transformation.
8. How Will Things Unfold in the US?
With the California Consumer Privacy Act (CCPA) coming into effect on January 1, 2020, California became the first U.S. state with a comprehensive data privacy law, and other states like Washington may soon follow. This will put huge pressure on federal lawmakers. A patchwork of different and possibly conflicting state privacy laws would be a significant compliance challenge for companies operating in the U.S. and may undermine the country’s digital leadership, which historically has relied on the effectiveness of scale in data processing, technological innovation, research and investment within a consistent regulatory environment across the entire U.S.
While the U.S. is pondering its privacy path forward, there are intermediate or preparatory steps that can be undertaken by organizations and regulators. For example, one possibility would be to develop a voluntary multi-state privacy interoperability code of conduct (modeled on, for example, APEC CBPR, the US/EU Privacy Shield and CCPA requirements) to facilitate cross-recognition of privacy compliance obligations between state privacy laws. Such a code could ultimately enable companies to more easily live with a state-by-state approach to privacy law, if that becomes the new status quo. It would essentially provide a baseline set of core privacy obligations to create interoperability between otherwise disparate frameworks. On the other hand, if a U.S. federal law were to materialize, this code could operate as one of the certifications or codes of conduct that such a law would presumably enable. In addition, organizations should continue to implement comprehensive privacy compliance and management programs modeled on the Federal Trade Commission’s (FTC) privacy consent orders. These orders are not only relevant to the companies that are their subjects; they describe the types of accountability measures and privacy programs that the FTC and data protection authorities around the world expect of all companies, regardless of whether they are explicitly required by law.