Centre for Information Policy Leadership
  • Home
  • About
    • CIPL Principals
    • Quarterly Reports
  • Membership
  • Events
    • Past Events
  • Projects
    • AI Project
    • Brazil AI Project
    • Organizational Accountability
    • Protecting Children's Data Privacy >
      • Policy Paper I: International Issues & Compliance Challenges
    • EU GDPR Implementation >
      • Global Readiness Benchmarks for GDPR
    • Enabling Data Driven Innovation and Big Data >
      • Privacy Risk Management
      • Transparency and User Controls
      • Updating Core Privacy Principles
    • Role of the DPO
    • Enabling Global Data Flows
    • Regional Focus and Outreach >
      • Effective LGPD
  • Resources
    • CIPL White Papers
    • Public Consultations
    • CIPL Articles
    • Hunton Andrews Kurth Privacy & Information Security Law Blog
  • CIPL Blog
  • Media
  • Contact Us

Some Thoughts on Information Climate Change

10/19/2021

By Dr. Alexander Dix
Vice-Chair of the European Academy for Freedom of Information and Data Protection
Former Data Protection and Freedom of Information Commissioner in Brandenburg (1998-2005) and Berlin (2005-2016)

Any views expressed herein are not necessarily the views of CIPL nor Hunton Andrews Kurth LLP
Sir Bruce Slane, New Zealand’s first Privacy Commissioner, at the end of his term of office in 2003 highlighted the importance of privacy by drawing a parallel to the air we breathe: both are invisible, and we only notice them once they’re gone. So when Scott McNealy said in 1999, “You have zero privacy anyway, get over it,” he might as well have said: “You don’t have any air to breathe, so just stop breathing.” Twenty years later, Mark Zuckerberg declared, “The future is private” (after taking the opposite view ever since setting up Facebook).

So are we witnessing a positive climate change with regard to privacy and data protection? Doubtless, there are some positive developments. Since the GDPR entered into force in 2018, 17 more countries have adopted privacy laws and 163 more privacy-tech vendors have started their businesses. However, some signs point in the opposite direction. Our informational and technological environment is changing at what the European Commission once aptly described as “breakneck speed”. Some have described this development as causing a “data tsunami”.

Richard Thomas, then UK Information Commissioner, made a remarkable intervention during a panel on biometrics at the 2010 International Conference of Data Protection and Privacy Commissioners in Jerusalem. At that time, facial recognition technology was starting to become available, but Google had declared publicly that it would refrain from deploying facial recognition for ethical reasons. Thomas was skeptical as to how long this position would be upheld and urged the Commissioners to raise public awareness by pointing to the risks for women in particular, should facial recognition one day be rolled out for commercial use. Women, being the majority of data subjects, would form a strong opposition to a technology that facilitates stalking (to name but one negative consequence).

More recently, while Microsoft and IBM have stopped the sale of facial recognition software to police departments in the US due to privacy concerns and the discriminatory effects of the software, and Amazon has extended its moratorium on police use of facial recognition “until further notice”, two smaller companies are making the headlines. Clearview AI is offering the technology to local authorities and law enforcement agencies, and this offer is being taken up by a number of US cities and the US Postal Service. On the other hand, half a dozen US states and several other cities have banned the use of the technology. The company PimEyes, initially launched by a Polish start-up, is offering facial recognition not only to public bodies but also to the general public. PimEyes markets its service with a privacy spin: everyone should have the right, so the company says, to find out where their photos have been posted online and to get alerts when they are posted. PimEyes has become wildly popular among stalkers. Its general use would in effect be the end of anonymity and free movement in public places. As Stephanie Hare has rightly observed, technical gadgets such as smartphones can be switched off or left at home, but faces cannot. Clearview AI and PimEyes are now under investigation in Europe as to whether they comply with the provisions of the GDPR.

But the wider picture of our global informational ecosystem must include China, where the government has declared that it wants the country to be a world leader in artificial intelligence by 2030. Facial recognition is considered a helpful tool for identifying people in mass gatherings in case there is “a major accident”. The determination of what constitutes a major accident can be illustrated by recent events in Hong Kong. The official values and policy of the Chinese government are diametrically opposed to what Western democracies stand for. And still, even in the United States, some Capitol rioters in January 2021 were identified by private “sedition hunters” using PimEyes (which is currently legal in the U.S.).

The question we have to decide is: what kind of society do we want to live in? Technology is man-made, not a natural disaster like a tsunami (although some of these “natural” disasters are themselves man-made, a result of climate change). Technology can and should be regulated. However, regulation very often comes too late to influence design decisions, business models and infrastructures. Regulation is necessary, but not sufficient.

What is needed, in addition, is twofold. First, raising young people’s awareness (even at kindergarten level) of the fundamental importance of being let alone if they want to be. Second, a code of ethics for IT engineers, similar to the Hippocratic Oath for doctors, supervised and enforced by professional bodies. Not every technology that is legal is ethically and socially acceptable. After all, the defense of privacy is too important to be left to Data Protection and Privacy Commissioners alone.

Accounting for Women's Different Experiences with Privacy Online

10/14/2021

By Emily Sharpe
Director of Policy, The Web Foundation

Any views expressed herein are not necessarily the views of CIPL nor Hunton Andrews Kurth LLP
In 1989, Sir Tim Berners-Lee was working as a software engineer at CERN, the large particle physics laboratory in Switzerland. Throughout his tenure at CERN, he noticed that the scientists who had come from all over the world to use its accelerators were having difficulty sharing information.
 
“In those days, there was different information on different computers, but you had to log on to different computers to get at it. Also, sometimes you had to learn a different program on each computer. Often it was just easier to go and ask people when they were having coffee…” Tim has said.
 
Tim saw a way to solve this problem. Already, millions of computers were being connected together through the fast-developing internet, and he realized they could share information by using an emerging technology called hypertext.
 
In March 1989, Tim set out his vision for what would become the web in a document called “Information Management: A Proposal.” Fast forward to the end of 1990, and the first web page was served on the open internet.
 
The web quickly became a place for global innovation and collaboration on a scale never seen before. It created opportunity, gave marginalized groups a voice, and made our daily lives easier. The Covid-19 pandemic has shown us that the web is a lifeline, not a luxury. But access to these digital spaces is not equal for everyone. Thirty years on from the invention of the web, only half of the world was able to connect. And those who are online are too often driven offline by privacy violations, online harassment, censorship, fraud, and more.
 
In 2009, Sir Tim co-founded the World Wide Web Foundation with Rosemary Leith to address these challenges and to galvanize the global community to fight for the web we want: a web that is safe, empowering, and genuinely for everyone.
 
Today the Web Foundation continues to fight for digital equality and opportunity, with a special focus on half the world’s population: women. Our recent research on women’s rights online highlights the stark gaps between how women and men experience the web and digital services, with men 21% more likely to be online than women, rising to 52% in the world’s least developed countries. There are also gaps in quality of connectivity and digital skills, and threats that disproportionately impact women’s privacy and safety — all of which prevent women from fully benefiting from the opportunities that digital technology offers.
Our research shows that women on average have lower levels of trust in private companies, with 54% stating they would not allow companies to use any of their data, compared to 47% of men. Women are more concerned than men about the privacy of their personal data, such as private messages, personal data of family members, medical records, and home addresses.
 
And women are more concerned about the potential harms they face if their information is misused. Women are disproportionately affected by serious privacy violations in some areas, like doxing, the sharing of non-consensual images, cyberstalking, and surveillance via connected devices by abusive partners.
 
Our survey also found women are less likely to be creators of content when they do get online. Men were far more likely to engage in a range of online activities, including posting comments about political, social or economic issues, selling products or advertising a service, or publishing a blog post. There needs to be more research into the reasons behind this gap — including the role privacy concerns play in women’s lower levels of content creation.
                                   
There is no “universal” experience of the web. We must recognize that individuals’ gender, race, ethnicity, socio-economic class, sexual orientation, and other intersecting and overlapping identities shape how they perceive, interact with, and are served by data, products, and policies. The global tech community must adopt a more intersectional approach to developing policies and products that account for the full diversity of those who use digital tools, especially from a privacy and data protection perspective. As a starting point, companies, governments and researchers should collect gender-disaggregated data on women’s and men’s perceptions of privacy and use of data-driven services.

Copyright © 2022 by the Centre for Information Policy Leadership at Hunton Andrews Kurth LLP.