Centre for Information Policy Leadership

Covid-19 Meets Privacy: A Case Study for Accountability

4/14/2020


In the pressing global fight against Covid-19, technological and AI solutions involving massive tracking and data analytics have brought public concern over our fundamental right to privacy into sharp focus. Some have even asked whether privacy will be the victim of Covid-19, and some have argued that our fundamental right to life must trump our right to privacy.

However, most of us want and expect both. Most of us agree that data-driven analysis and decisions, as well as data sharing among industry and governments, are indispensable in fighting Covid-19 and future pandemics—whether to anticipate the virus’ spread and peak; to test new medications or forecast the need for hospitals, medical staff and equipment; to understand people’s social interactions and likelihood of contamination; to verify that quarantine and social distancing measures are observed; or to enable those who have recovered from the virus to resume their work, life and other freedoms for the benefit of us all. And we also agree that privacy is foundational to our democracies and must be protected now and in the post-Covid world. So, how can we have both—socially responsible collective action and privacy? The answers lie in organizational accountability.

Organizational accountability is an emerging concept in data protection and privacy regimes globally. It requires companies and the public sector to implement effective privacy and data management programs, measures, processes and tools, and to be able to demonstrate these to regulators, shareholders, business partners and the public. It complements compliance with existing privacy laws, which may vary from country to country, and operationalizes applicable legal requirements. Accountability also goes beyond compliance and creates openness around decision-making processes for data use and sharing, thereby generating public trust.

This concept has been increasingly embraced by industry and public sector bodies around the world. Organizations have been appointing Chief Privacy Officers; establishing internal governance and oversight procedures; carrying out privacy impact assessments that balance risks to individuals against the benefits of data uses; delivering user-centric transparency; training their staff on data privacy and ethics; imposing restrictions on service providers, business and government partners that use their data; implementing security measures and technologies; and responding to complaints and to individuals exercising their rights to know about data uses or to delete data. And yes, they have also had to deal in an accountable and cooperative manner with regulators when things went wrong and they suffered security breaches. These accountability-based privacy programs deliver robust controls and protections for individuals and their data, while also enabling responsible data use and sharing, which is so essential for the growth of our digital society and economy. All industry sectors, from high tech to healthcare and telecom, are looking to organizational accountability as the mechanism to bridge the dual imperatives of privacy and innovative data use.

The fight against Covid-19 is a perfect case study for how accountability can enable both of these goals. Organizational accountability empowers organizations to react quickly and robustly at a time of crisis without sacrificing privacy protections, or the ability to do what is necessary and right for our collective wellbeing. When accountability measures are properly implemented, communicated and enforced, there is no better way to address not only legal privacy obligations, but also societal skepticism, distrust, and fear of unbounded surveillance and abusive data practices. This is especially true where speed is of the essence, frameworks for responsible data sharing have not yet been developed, the need for clear and practical controls is mounting and regulators are scrambling to provide their own views on using data in the context of pandemics.

So, what are the basic accountability measures that organizations of all types, including companies, governments, research and academic institutions, can agree on and implement immediately to address data privacy concerns and enable responsible collection, use and sharing of personal data in the fight against Covid-19?

(1) Clearly defined and documented purposes of data use: Each proposed project must define clear objectives to set the boundaries of what can and should be done with the data and for what purposes. The proposed data purposes should be supported by evidence that data use actually addresses a particular need.

(2) Proportionality test: The amount, manner and duration of data processing must be relevant, necessary and proportionate to the desired objectives. Organizations must be able to answer and document the following considerations: (1) Can we achieve the same objective with less data, or by using aggregated or fully anonymized data that does not identify any individuals? (2) Is the data processing we are proposing a proportionate response to the goal we are trying to achieve? And if not, what do we need to change to make it proportionate?

(3) Privacy impact assessment: Organizations must assess the level of risk of data use or sharing and the potential impact on the rights and freedoms of individuals for each project. Risk may be higher if a project involves sharing of health or geolocation data; in that case, what specific mitigation measures should be put in place to address this heightened risk? The assessment must also weigh identified risks against the benefits of data use and, especially, against the reticence risk: there may be great costs in not using data-driven technology in crisis contexts, even to our other fundamental rights, including life, health and movement.

(4) Transparency to individuals: Individuals whose data is being collected, shared and used must be given user-friendly information about the project and the uses of their data. This can also include information about controls implemented to address any data privacy risks and anything else that would build their trust in and acceptance of the project, such as where to address questions and requests to exercise data protection rights.

(5) Robust security: Security is one of the cornerstones of data privacy. It must be maximized in the Covid-19 context to prevent unauthorized access to sensitive data, tampering with machine learning algorithms that may be used to forecast the need for hospitals, medical staff and equipment or to make diagnoses, and hacking of critical IT systems.

(6) Storage and use limitation: Data processing undertaken in the Covid-19 context to predict virus peak levels or to understand people’s interactions and likelihood of contamination must be conducted within clearly limited time frames. Once the purpose of processing is fulfilled, data should no longer be stored or used for a new objective unrelated to the original purpose.

(7) Roles, responsibilities and training: All staff, contractors and third parties working on the project must be clear on their roles and responsibilities in delivering accountability measures and ensuring privacy protection. Organizations must provide everybody with role-based training and set expectations for acceptable behaviors.

(8) Data sharing agreements and protocols: Organizations sharing information must define their respective rights and obligations and specific controls relating to data use in a legally binding instrument. The protocols must include oversight and review mechanisms to escalate any issues and ensure all parties act in accordance with the agreement.

(9) Trust, but verify: Organizations must conduct assessments and audits, and verify that they are implementing all the requirements, controls and accountability measures specified in the project and any third party agreements. 

(10) Internal oversight and external validation: The more complex and high-risk the data use or sharing project, the greater the need for internal top management and Chief Privacy Officer oversight and clear accountability of leadership within the organization. This may also involve some form of external validation, such as external ethics or data advisory councils, or data review boards where these are already in place.

(11) Regulatory engagement and validation: Organizations must be prepared to demonstrate accountability measures and seek feedback on the project from data privacy regulators. This can happen in the planning phases or post facto, on request or in the case of a complaint or other issue. Constructive engagement between organizations and regulators is especially crucial in the Covid-19 climate, where new and unforeseen uses of data are becoming essential for protecting the public. By discussing and navigating relevant challenges together, organizations and regulators can achieve necessary and quick outcomes that comply with applicable requirements and generate public trust. Such efforts will also set data privacy regulators up to establish a unified approach to regulating data in emergencies for the future.

(12) Privacy-by-design through technical measures: Organizations must consider how technical measures can help ensure privacy-by-design in new data projects. For example, differential privacy, anonymization and federated learning can be useful techniques when deploying AI and machine learning applications.
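
As a minimal illustration of one of the techniques mentioned above, the Python sketch below applies the Laplace mechanism for differential privacy to a single aggregate count; the function name, epsilon value and case count are purely hypothetical and are not drawn from any specific Covid-19 project.

```python
import numpy as np

def dp_count(true_count: int, epsilon: float) -> float:
    """Release a count under epsilon-differential privacy (Laplace mechanism).

    A counting query has sensitivity 1: adding or removing one person changes
    the result by at most 1, so Laplace noise with scale 1/epsilon suffices.
    """
    rng = np.random.default_rng()
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

# Illustrative use: publish how many residents of a district tested positive
# without the released figure revealing any single individual's status.
# A smaller epsilon means more noise and stronger privacy.
if __name__ == "__main__":
    print(round(dp_count(true_count=137, epsilon=0.5)))
```

Techniques like this allow aggregate statistics to be shared for planning and research while limiting what the published figures reveal about any one person.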

By implementing the above measures, both the public and private sectors will ensure digital responsibility, enabling data innovation and delivering effective privacy protection. Public authorities, in particular, have a leading role to play in applying accountable practices in the Covid-19 context. Data often travels between the public and private sectors, and it is critical that governments refrain from any practice that could jeopardize trust in the ability of technology and the digital economy to help solve this crisis. When requesting access to or the sharing of data from the private sector, governments must implement all the appropriate accountability measures and protections discussed above. In particular, their requests must be based on a statutory or other legally permissible requirement, and their use of data must be strictly limited to the purpose of a specific Covid-19 initiative.

Individuals, industry, academics, public authorities and society at large—we are all engaged in the same battle against the coronavirus. Our key advantage over previous pandemics is technology and data, which have armed us with new and effective resources against the virus’ destructive potential. We must use them to the fullest. In times of danger, individual privacy cannot trump our social responsibility towards others. Nor should the common good have to trump privacy. Accountability enables us to enjoy both and to have our privacy cake and eat it together.


Eight Privacy Priorities for 2020 and Beyond

3/9/2020

1. Global Convergence and Interoperability between Privacy Regimes

Around the world, new privacy laws are coming into force and outdated laws continue to be updated: the EU GDPR, Brazil’s Lei Geral de Proteção de Dados Pessoais (LGPD), Thailand’s Personal Data Protection Act (PDPA), India’s and Indonesia’s proposed bills, California’s Consumer Privacy Act (CCPA), and the various efforts in the rest of the United States at the federal and state levels. This proliferation of privacy laws is bound to continue. 

Yet, the impact of modern information technologies and the data and business ecosystems they enable transcends national boundaries. The economic value of global data flows associated with this reality has surpassed the value of traditional trade in goods. And consumers and citizens are also increasingly mobile and global. In this globally connected world, it will be essential to drive convergence and ensure consistency and interoperability between privacy laws and regulations. This is crucial both for delivering effective privacy protections as well as maximizing the ability to engage in responsible and beneficial data uses. 

Modern privacy laws and regulatory guidance must be developed with an eye to other global privacy regimes rather than solely within domestic echo chambers. While the goal of global convergence must be balanced with unique local requirements and priorities, the vast majority of core privacy protections and principles can and should be harmonized in the digital world of today.  

2. Bottom-up Best Practices for Accountable AI  

Lawmakers are just starting to dip their toes into AI regulation, but there seems to be widespread recognition that premature regulation and over-regulation could stifle AI growth and development. Traditional data protection principles don’t necessarily align neatly with AI technologies, so lawmakers and regulators must be ready to establish and interpret rules more flexibly based on the risks to individuals and benefits to society.  At the same time, some AI-based technologies, such as facial recognition, are rightfully causing a sense of urgency among all stakeholders as well as calls for specific regulation. 

While such steps are being debated, AI use is increasing at a rapid pace. It is, therefore, imperative that organizations that develop and use AI technologies start building internal programs, processes, tools and techniques to deliver accountable and trustworthy AI. Indeed, more than in any other context, organizational accountability and best practices will be the key ingredients in the “secret sauce” for an effective regulatory response to AI. Accordingly, CIPL has proposed a layered approach to AI regulation based on three pillars: (1) leveraging and expanding existing privacy tools and norms, such as privacy impact assessments and transparency; (2) setting expectations for organizations in all sectors to build demonstrable and verifiable accountability programs and best practices for AI; and (3) adopting regulatory practices that are risk-based, transparent and emphasize thought leadership, as well as incorporate innovative oversight mechanisms such as sandboxes, seals and certifications, and constructive engagement between regulators and organizations.

3. Promoting Accountable Free-Flow of Data

The free flow of data is essential to a sustainable global digital economy. Safeguarding this free flow of data requires coherent and efficient cross-border data transfer mechanisms and avoidance of data localization requirements that artificially constrain data flows. 

Both organizations and individuals expect that privacy protections and accountability will follow the data. Policymakers and lawmakers must build on those expectations. They must drive convergence between data transfer mechanisms, so that organizations can use the same types of mechanisms, such as contracts, certifications or binding corporate rules, no matter where they operate. They must also drive interoperability between different transfer mechanisms and create “adaptors” for the different national privacy “plugs”. GDPR Binding Corporate Rules (BCR) and APEC Cross Border Privacy Rules (CBPR) should be able to talk to one another, just as the future GDPR certifications and CBPR should be able to.

Accountability frameworks, such as certifications and codes of conduct, will have to become part and parcel of any effective comprehensive privacy law and framework around the world. This includes certifications based on ISO standards, EU binding corporate rules, APEC CBPR or similar formal accountability frameworks. Organizations should actively consider these certifications and start using them not only as transfer mechanisms but also to provide assurance of compliance with the ever-growing body of national laws and to demonstrate that they are responsible partners to businesses, consumers and regulators. Organizations that have comprehensive privacy programs will automatically be ready to go this extra step and successfully obtain such certifications.

4. Pursuing Constructive Engagement between Data Protection Authorities and Industry

Regulating cutting-edge, complex and evolving technologies presents unprecedented challenges for regulators, particularly when resources are limited. Deterrence and punishment alone have proven to have limited effectiveness in achieving desired results, much less in encouraging a race to the top in the market. If regulators want to be effective, they must apply modern and innovative regulatory approaches as well as prioritize open and constructive relationships with the organizations they regulate. This is true for all regulators, including those in jurisdictions with well-established data protection cultures and histories. In countries that have, or soon will have, new or first-time privacy laws, like Brazil and India, newly established data protection authorities now have the advantage of getting this right from the start. Providing guidance and thought leadership, and encouraging and incentivizing good behavior and accountability, will yield results, as it encourages a race to the top and taps into the desire of organizations to be seen as responsible businesses and trusted users of data.

Regulated entities must also prioritize and be ready for such constructive engagement. They must share knowledge and help educate regulators about new technologies. There is considerable scope for building compliance solutions cooperatively and ensuring responsible innovation that also protects the rights and interests of individuals. Regulatory sandboxes are a perfect example. These allow businesses to test innovative products, services and business models in real life and with actual customers under the supervision of a regulatory body. They create a safe haven for regulated companies to experiment and innovate, while helping regulators better understand the technologies they are regulating. 

5. Expanding the Beneficial Use of Data through Accountable Data Sharing Arrangements

Data sharing between public and private organizations and within and across industry sectors will likely become more important, even transformational, to the modern data economy and digital society. It fosters competition and innovation, particularly in the context of AI-based technologies and data-driven business models. It is essential for academic and scientific research, as well as for machine learning and algorithmic training. Data sharing also improves effectiveness of governments and public policy, from health, education and tax to social policy, all of which increasingly rely on data-driven decisions. 

There is a real need to develop a framework for trusted data sharing based on organizational accountability. An overly “user-centric” approach that makes data sharing dependent on choices made by individuals may actually defeat the benefits and full potential of data sharing. Instead, the focus of the debate should be the wide range of accountability measures that organizations could employ in this context, from risk assessments, transparency, proportionality and articulation of values and benefits to governance and data sharing agreements.

6. Saying Goodbye to the Individual Control Paradigm of Privacy Protection

What is an approach to privacy regulation that is fit for the 21st Century’s Fourth Industrial Revolution? Will we start to design and apply our privacy laws in ways that truly work for individuals, or will we continue to design and apply them to make individuals work? Under the old model – making individuals work – consumers have to read endless privacy notices and constantly make choices about how their personal data may be used as they use different services and go about their daily life and work. Being able to make such choices is, of course, important in certain situations, but in many cases, there are better ways to protect individuals that don’t require consumers to become full-time privacy professionals. 

Despite mounting evidence of its ineffectiveness in protecting privacy and calls from all corners to shift away from this approach, the notice, choice and consent model of privacy protection is still alive and well in 2020. Several recent proposals for U.S. federal privacy legislation have relied heavily on notice and consent, as has nearly every state bill introduced so far this year. The ePrivacy regulation in the EU has the same problem. Even where privacy laws, such as the GDPR, do not privilege consent over other bases for processing personal data, deep-rooted habits of regulators and policy makers continue to treat notice and consent as a sine qua non of privacy and data protection, at the expense of better options grounded in “organizational accountability” that are available in plain sight (see below).

Our digital world and society need new and different approaches to regulating data privacy, while still empowering individuals. Now more than ever, it is essential that we unite in educating law- and policymakers in the US, Europe and beyond about the benefits of the accountability-based model of privacy regulation and about alternative grounds for processing personal data beyond the old-fashioned individual consent approach.

7. Welcoming Organizational Accountability 

Organizational accountability requires organizations to implement comprehensive privacy programs governing all aspects of the collection and use of personal information. It also requires organizations to be able to demonstrate the existence and effectiveness of such programs upon request. It ensures robust protections for individuals and their data while enabling responsible data collection, use and sharing, placing more responsibility on organizations that are collecting and using data and less burden on individuals. 

One of accountability’s core features is risk-based privacy protection, which can provide organizations broad latitude in using personal data in no- or low-risk contexts, enable more targeted and effective protections where actual risks are identified, and may include legal or regulatory prohibitions on certain high-risk activities that cannot be made safe. Risk-based privacy programs enable organizations to focus on truly risky processing and prioritize their privacy protections in the areas where it really matters. Under this approach, the primary burden of protecting individuals would lie with organizations, which would now be required to formally identify privacy risks, mitigate them, and be able to demonstrate and justify their risk assessments and mitigations.

Polls confirm that individuals are concerned about how organizations use their personal data. Individuals are looking for value and responsible stewardship of their data. Many organizations are increasingly waking up to the existing trust deficit and are taking proactive steps to address it, even when it’s not yet explicitly required by law. They are building privacy management programs that include leadership and oversight, risk assessments, policies and procedures, transparency, training and awareness, monitoring and verification, and response and enforcement. Enlightened organizations taking this approach are realizing the business and competitive benefits that flow from having such comprehensive privacy programs. These programs enable them to unlock the potential of one of their major assets – data – and to drive business growth and competitiveness through data-driven innovation.

The next year and decade will be all about accountability and corporate digital responsibility. CEOs and senior business leaders, as well as corporate boards must be ready for this step change and set the tone for this transformation.  

8. How Will Things Unfold in the US?

With the California Consumer Privacy Act (CCPA) coming into effect on January 1, 2020, California became the first U.S. state with a comprehensive data privacy law, and other states like Washington may soon follow. This will put huge pressure on federal lawmakers. A patchwork of different and possibly conflicting state privacy laws would be a significant compliance challenge for companies operating in the U.S. and may undermine its digital leadership, which historically has relied on the effectiveness of scale in data processing, technological innovation, research and investment within a consistent regulatory environment across the entire U.S.

While the U.S. is pondering its privacy path forward, there are intermediate or preparatory steps that organizations and regulators can take. For example, one possibility would be to develop a voluntary multi-state privacy interoperability code of conduct (modeled on, for example, the APEC CBPR, the US/EU Privacy Shield and CCPA requirements) to facilitate cross-recognition of privacy compliance obligations between state privacy laws. Such a code could ultimately enable companies to more easily live with a state-by-state approach to privacy law, if that becomes the new status quo. It would essentially provide a baseline set of core privacy obligations to create interoperability between otherwise disparate frameworks. On the other hand, if a U.S. federal law were to materialize, this code could operate as one of the certifications or codes of conduct that law would presumably enable. In addition, organizations should continue to implement comprehensive privacy compliance and management programs modeled on the Federal Trade Commission’s (FTC) privacy consent orders. These orders are not only relevant to the companies that are their subjects; they describe the types of accountability measures and privacy programs the FTC and data protection authorities around the world expect of all companies, regardless of whether they are explicitly required by law.


Copyright © 2020 by the Centre for Information Policy Leadership at Hunton Andrews Kurth LLP.