20.07.21


Welcome to another edition of the Bulletin!

In this 43rd edition, we highlight that the National Data Protection Authority (ANPD) published its first ordinance (Ordinance No. 1), which provides, among other points, for the procedures for preparing the regulatory agenda and the normative acts issued by the Authority, including rules applicable to consultations with society, the preparation of Regulatory Impact Analyses (AIR) and Regulatory Results Assessments (ARR). The ordinance will come into effect on August 1, 2021.

In the international context, we highlight that the Information Commissioner’s Office (ICO) has released a set of lesson plans and worksheets designed to teach students how to protect their privacy online and how they can control what online companies and platforms know about them. The resources can be downloaded free of charge from the ICO website and are part of the effort to raise awareness of the Age Appropriate Design Code, already covered in previous editions, which provides a set of standards that online services must follow if they are likely to be accessed by children.

Finally, in academia, we highlight the article published by Rafael Zanatta and Michel Souza in the Latin American Journal of Human Rights Studies. The text analyzes the characteristics of automated facial recognition technologies and the response to them by civil society organizations in Brazil, examining two arguments: the argument from endemic bias, which seeks to correct unfair and potentially racist consequences, and the argument from endemic oppression, which identifies a set of enablers of the systematic violation of fundamental rights. The authors present the concept of counter-movements to explain the possibilities of legal contestation of the spread of facial recognition, and show how the argument can shift from the logic of bias to that of oppression, opening the possibility of regulatory change to ban certain uses of this technology.

We wish you a great read!

Bruno Bioni, Mariana Rielli and Júlia Mendonça

Data Protection at Authorities

Brazil

ANPD participated in the 41st Plenary Meeting of Convention 108

The National Data Protection Authority (ANPD) participated in the 41st Plenary Meeting of the Consultative Committee of Convention 108 of the Council of Europe, which deals with the protection of individuals with regard to the automated processing of personal data. Convention 108, opened for signature on January 28, 1981, was the first legally binding international instrument adopted in the data protection field. The ANPD representatives, appointed by the President of the Authority through diplomatic channels, took part in the Plenary Meeting, which normally takes place twice a year with the participation of the Convention’s signatories, as well as other countries and international organizations as observers; Brazil has had observer status since October 2018. One of the main topics debated during the meeting was the modernization of Convention 108, driven by the constant privacy and data protection challenges arising from the use of new technologies and aimed at strengthening the Convention’s evaluation and follow-up mechanism and ensuring that its signatories comply with its provisions. The sessions took place between June 28 and 30, from 10:30 am to 1:00 pm Strasbourg time (CET), by videoconference.

ANPD announced a new date for the public hearing on the supervision regulation

The public hearing on the draft resolution that provides for inspection and the application of sanctions by the National Data Protection Authority, initially scheduled for July 8 and 9, was postponed and will take place on July 15, 2021, from 9 am to 1 pm and from 2 pm to 6 pm, and on July 16, 2021, from 9 am to 1 pm. The hearing, one of the mandatory steps of the regulatory process provided for in Law No. 13.709, of August 14, 2018, the General Data Protection Law (LGPD), aims to discuss the normative proposal with society; the proposal was open for public consultation until June 28, 2021 through the Participa + Brasil platform.

ANPD published an ordinance that defines the Authority’s regulatory procedures

On July 9, 2021, the National Data Protection Authority (ANPD) published Ordinance No. 1, which provides, among other points, for the procedures for preparing the regulatory agenda and the normative acts issued by the ANPD, including rules applicable to conducting consultations with society and to preparing Regulatory Impact Analyses (AIR) and Regulatory Results Assessments (ARR). According to Director Miriam Wimmer, rapporteur of the process, “the issuance of the ordinance is positive, since it establishes uniform and transparent rules to be observed by the ANPD units, providing speed and predictability to the internal processes of drafting and reviewing normative acts.” It should be noted that the procedures provided for in the ordinance were already being followed by the ANPD technical team, so no further adaptation will be necessary. The ordinance comes into force on August 1, 2021, as provided for in its Article 32.

United States

The FTC alleged that Recolor, an online coloring book app, illegally collected personal information from children

In a complaint filed by the Department of Justice on behalf of the FTC, the Commission alleged that Toronto-based Kuuhuub Inc., along with its Finnish subsidiaries Kuu Hubb Oy and Recolor Oy, violated the Children’s Online Privacy Protection Act (COPPA). The law requires websites and apps to provide parental notice and obtain parental consent before collecting personal information from children if the website or app – or even a portion of it – is targeted at children under 13 years of age. These companies operate Recolor, an app that provides images users can color digitally on their mobile devices. Although it is advertised as an “adult coloring book”, a portion of the app was aimed at children: images are organized in a library with categories such as Movies and Animals, and a popular category called Kids included images that tend to appeal to children, such as animated characters and animals. In its complaint, the FTC alleged that the app collected personal information from children under the age of 13 who used the app’s social media features and allowed third-party advertising networks to collect personal information from users in the form of “persistent identifiers”, also known as cookies, to target advertisements. According to the complaint, the companies did not instruct the ad networks to refrain from using children’s persistent identifiers for behavioral advertising. The FTC also claimed that the companies did not notify parents or obtain verifiable parental consent prior to collecting personal information from underage users of the Recolor app, in violation of the COPPA Rule.

Denmark

Danish Authority fined a company that performs COVID-19 tests

The Danish Data Protection Authority reported the company “Medicals Nordic” for having processed confidential and health-related information connected to COVID-19 tests without establishing appropriate security measures, which culminated in a fine of DKK 600,000. In January 2021, the Authority became aware that Medicals Nordic was using the WhatsApp application to transmit confidential and health-related information about citizens being tested at the company’s test centers. Based on this, the DPA opened an investigation of its own to ascertain whether the organization had implemented adequate technical and organizational security measures for the transmission of such information. The investigation, which revealed non-compliance with legal obligations, led to the application of the aforementioned fine.

Danish Authority expressed serious criticism of the Danish Business Agency’s recording of telephone conversations without consent

The Danish Authority published an opinion on a case in which the Danish Business Agency had recorded telephone conversations between a data subject and the agency without obtaining consent. During the proceedings, it emerged that, since June 1, 2018, the Business Agency had been recording all telephone calls received by its customer service center. The recordings were made to document the conversations in case a police report became necessary, in order to protect call center employees against threats, and for use in their ongoing training in connection with the Agency’s duty to provide guidance. The Danish DPA emphasized that such a measure should be adopted only as an exception, in cases where a situation giving rise to the need for a police report actually occurred. Furthermore, the Authority concluded that recording telephone conversations for training purposes could only take place with the consent of the data subjects.

France

Artificial intelligence: CNIL’s opinion on the future European regulation

The CNIL and its European counterparts welcomed the European Commission’s proposal to establish harmonized rules on artificial intelligence in order to preserve individual freedoms. Although significant changes to the text are likely, depending on the amendments made by the Parliament or the Council, the European data protection authorities considered it essential to take a position, which they did by publishing an opinion. In this sense, the CNIL published a text identifying four fundamental points to be taken into consideration in the discussion of artificial intelligence in Europe: (i) the need to draw specific limits for future AI uses; (ii) the challenge of articulation with the GDPR; (iii) the importance of harmonized governance; and (iv) the importance of support for innovation.

Italy

Italian Authority released new cookie guidelines to protect users

The Italian Authority has approved new guidelines on cookies, with the aim of strengthening users’ decision-making power over the use of their personal data when browsing online. The measure was adopted taking into account the results of the public consultation launched at the end of 2020. The update of the 2014 guidelines was necessary in light of the innovations introduced by European privacy regulations, but it is also justified by a number of other factors: the experience gained in recent years (based on numerous complaints, reports and requests for opinions) regarding the incorrect application of the information and consent procedures for the use of data; the increasing use of invasive tracking techniques; and the multiplication of users’ digital identities, which favors the cross-referencing of their data and the creation of increasingly detailed profiles. The Authority therefore concluded that the mechanism for obtaining online consent must, first of all, ensure that, by default, at the time of first access to a website, no cookies or other tools other than those strictly necessary for the operation of the site are placed on the user’s device.
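To illustrate the default rule described above, the sketch below shows one way a site could gate non-essential cookies behind an explicit opt-in, writing only strictly necessary cookies on first access. It is a minimal illustration under assumed names (the consent cookie and the function names are hypothetical), not a reproduction of the Garante’s guidelines.

```typescript
// Minimal browser-side sketch: only strictly necessary cookies are set by
// default; non-essential cookies are written only after the user has opted in.
// Names ("consent_analytics", setAnalyticsCookies) are illustrative.

function hasConsent(purpose: string): boolean {
  // Consent is recorded in a first-party cookie only after an explicit action.
  return document.cookie
    .split("; ")
    .some((c) => c === `consent_${purpose}=true`);
}

function setTechnicalCookies(): void {
  // Strictly necessary for the operation of the site: allowed by default.
  document.cookie = "session_lang=en; path=/; SameSite=Lax";
}

function setAnalyticsCookies(): void {
  // Non-essential: must not be placed on first access without consent.
  if (!hasConsent("analytics")) return;
  document.cookie = "analytics_id=abc123; path=/; SameSite=Lax";
}

// On first page load, only the technical cookies are written.
setTechnicalCookies();
setAnalyticsCookies(); // no-op until the user opts in via the consent banner
```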

Italian Authority published a video to explain to children how to defend themselves against cyberbullying

Many young people are victims of cyberbullying online, a phenomenon that is unfortunately growing rapidly and that often has serious consequences for victims, causing suffering and distress that go beyond psychological effects. To promote awareness of the rights and protection tools provided for in the legislation, the Italian Authority has produced an animated video that, in simple and clear language, aims to inform young victims of cyberbullying and their parents. The video was broadcast on the Authority’s social channels (YouTube, LinkedIn, Twitter, Instagram and Telegram) and is published on the thematic page dedicated to cyberbullying on the Garante website, where it is also possible to consult an information sheet and download the form for exercising the protection rights provided for in Law 71/2017.

Iceland

Icelandic DPA fined a company that runs ice cream parlors for processing employees’ personal data through a surveillance camera installed in an employee area

Iceland’s Data Protection Authority issued an administrative fine of ISK 5,000,000 (approximately 34,000 euros) to an Icelandic company that runs five ice cream parlors. One of the employees complained to the Icelandic DPA that the area used by employees to change into their work uniforms was under constant camera surveillance. The employee also pointed out that he had not received any notification or information about the surveillance and that there was no sign indicating the existence of cameras. The Icelandic DPA confirmed through an inspection that employees did not have access to a suitable changing area that was not under surveillance, concluding that the complainant’s personal data was not processed in a lawful, fair or transparent manner, nor was it adequate, relevant and limited to what was necessary in relation to the purposes for which it was processed. The Authority also found that the company had not informed its employees about the video surveillance system or about their rights, as required by Article 13 of the GDPR.

Lithuania

Lithuanian Authority fined a sports club for GDPR violations in the handling of customer and employee fingerprints

The Lithuanian Data Protection Authority (SDPI) carried out an investigation into the processing of biometric personal data at a sports club and imposed a fine of 20,000 euros on VS FITNESS UAB for identified breaches of the General Data Protection Regulation (GDPR). The fine was imposed for breaches of Articles 5(1)(a), 5(1)(c), 9(1), 13(1), 13(2), 30 and 35(1) of the GDPR, i.e., for processing biometric data without the voluntary consent of the data subjects involved and for failing to respect data subjects’ rights by not informing them about certain processing operations. Finally, the Authority also found that the club had not prepared the required data protection impact assessments for some of the processing activities carried out.

Peru

Peruvian National Authority advised university students on how to protect their personal information on social media

With more than 270 young people from universities in Arequipa, Ancash, Ayacucho, Amazonas, Cajamarca and Cusco, among others, the Peruvian Authority held lectures aimed at making them aware of the protection of their personal data on social networks and the measures to be adopted to avoid becoming victims of cybercrime. The informative lecture was a joint effort of the Peruvian Universities Network and the Peruvian Personal Data Protection Authority of the Ministry of Justice and Human Rights of Peru (MINJUSDH). The meeting was led by María Alejandra Gonzalez Luna, Director of the Personal Data Protection Directorate, who provided guidance to the young university students on how to protect their personal information; such measures can help prevent situations such as sexual harassment, human trafficking, violations of privacy and cyber fraud. In this context, she recommended making use of the privacy policies and settings offered by digital platforms and verifying with whom personal information is shared.

United Kingdom

ICO fined transgender charity for data protection breach and exposure of confidential personal information

The Information Commissioner’s Office (ICO) fined Mermaids, a transgender charity, £25,000 for failing to keep its users’ personal data safe. The ICO’s investigation began after it received a data breach report from the charity regarding an internal email group that had been in use from August 2016 until July 2017, when it was decommissioned. The charity only became aware of the breach in June 2019. The ICO found that the group had been created with insufficient security settings, which allowed approximately 780 pages of confidential emails to be viewable online for nearly three years. This made personal information such as the names and email addresses of 550 people searchable online. The personal data of 24 of these people was sensitive, as it revealed how the person was coping and feeling, and special category information such as mental and physical health and sexual orientation was exposed. The ICO investigation concluded that Mermaids should have restricted access to its email group and could have applied pseudonymization or encryption techniques to add an extra layer of protection to the personal data it held. Under UK data protection law, organizations responsible for personal data must ensure that they have appropriate technical and organizational measures in place to guarantee the security of personal data.
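The pseudonymization measure mentioned by the ICO can be illustrated with a small sketch: a direct identifier such as an email address is replaced by a keyed hash, with the key stored separately from the data, so that records remain linkable without exposing the identifier itself. This is a generic, hypothetical illustration (function and variable names are assumptions), not the specific measure Mermaids was expected to adopt.

```typescript
import { createHmac, randomBytes } from "crypto";

// Minimal pseudonymization sketch (Node.js): a keyed hash stands in for the
// email address, so shared records stay linkable without exposing the address.
// The key must be stored separately from the pseudonymized data; all names
// here are illustrative.

const secretKey = randomBytes(32); // in practice, held in a separate key store

function pseudonymize(email: string): string {
  return createHmac("sha256", secretKey)
    .update(email.trim().toLowerCase())
    .digest("hex");
}

// A record shared in a group mailbox would carry only the pseudonym.
const record = {
  user: pseudonymize("person@example.org"),
  note: "support request received",
};
console.log(record);
```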

ICO released new classroom and online lesson resources that will help young people take control of their online privacy

The Information Commissioner’s Office (ICO) is helping children and young people understand the power of their personal data as they learn, play and socialize online. ICO has released a set of lesson plans and worksheets designed to teach students how to protect their privacy online and how they can control what companies and online platforms know about them. The resources explain what is considered personal data, how to protect it and how to keep it private on social media. These resources can be downloaded free of charge from the ICO website and are part of the effort to raise awareness of the Age Appropriate Design Code, a set of standards that online services must adhere to if they are likely to be accessed by children.

Data Privacy at Universities

The Problem of Automated Facial Recognition Technologies in Brazil: Social Countermovements and the New Frontiers of Fundamental Rights

ZANATTA, Rafael; SOUZA, Michel.

The article, published in the Latin American Journal of Human Rights Studies, analyzes the characteristics of automated facial recognition technologies and the response to them by civil society organizations in Brazil. The authors examine two arguments in this debate: the endemic bias argument, which seeks to correct unfair and potentially racist consequences, and the endemic oppression argument, which identifies a set of enablers of the systematic violation of fundamental rights. The concept of counter-movements is presented to explain the possibilities of legal contestation of the spread of facial recognition and to show how the argument can shift from the logic of bias to that of oppression, opening the possibility of regulatory change to ban certain uses of this technology.

Philosophy and Digitization: Dangers and Possibilities in the New Digital Worlds

PEDERSEN, Esther; BRINCKER, Maria

Our world is undergoing a massive digital transformation; almost no area of our social, informational, political, economic, cultural and biological spheres remains unchanged. Against this background, the text examines a series of questions: what can philosophy contribute when we try to understand and think through these changes? How does digitization challenge previous ideas of who we are and where we are going? Where does this leave our ethical aspirations and ideals of democracy, equality, privacy, trust, freedom and social inclusion? Who decides and controls the powers of digitization, and for what purposes? Epistemologically, do most of us understand these new mediations – and therefore the fabric – of our new world? Finally, how is the new technological landscape shaping not only our living conditions, but also our collective imagination and our personal identities?

Ethics-Based Auditing of Automated Decision-Making Systems: Nature, Scope, and Limitations 

FLORIDI, Luciano; MOKANDER, Jakob; MORLEY, Jessica; TADDEO, Mariarosaria.

Important decisions that affect human lives, livelihoods and the natural environment are increasingly being automated. Delegating tasks to so-called automated decision-making systems (ADMS) can improve efficiency and enable new solutions. However, these benefits come with ethical challenges: for example, ADMS can produce discriminatory results, violate individual privacy and undermine human self-determination. New governance mechanisms are therefore needed to help organizations design and deploy ADMS in ethical ways, allowing society to reap the full economic and social benefits of automation. In this article, the authors consider the feasibility and effectiveness of ethics-based auditing (EBA) as a governance mechanism that allows organizations to validate the claims made about their ADMS. Building on previous work, the authors define EBA as a structured process by which an entity’s past or present behavior is assessed for consistency with relevant principles or norms. Three contributions to the existing literature are then offered: first, a theoretical explanation of how EBA can contribute to good governance by promoting procedural regularity and transparency; second, seven criteria for successfully designing and implementing EBA procedures; and finally, an identification and discussion of the conceptual, technical, social, economic, organizational and institutional constraints associated with EBA.

Data Protection in the Brazilian Legislature

Bill proposed to criminalize the unauthorized disclosure of personal data

Bill No. 2394/2021, introduced by Deputy Hildo Rocha (MDB/MA), proposes to amend Decree-Law No. 2848/40 (the Criminal Code) in order to make the unauthorized disclosure of personal data a criminal offense. The bill adds Article 154-B to the Criminal Code, criminalizing the conduct of disclosing, providing, selling, giving or allowing access to third parties’ personal data without authorization or for unlawful purposes, with a penalty of two to five years of imprisonment, in addition to a fine. The bill currently awaits consideration by the Plenary.

Data Protection in the Brazilian Judiciary

Court issued a judgment partially upholding claims based on the LGPD

Action No. 0020043-80.2021.5.04.0261 was filed on the allegation that the defendant company had failed to comply with the General Data Protection Law. The claimant pointed out that, in addition to the “possession of data”, the company shared the data “with several other controllers and operators, without the necessary precautions”, and had not indicated a person in charge (data protection officer) for personal data. It further argued that the data was “shared through the internet”, in disregard of arts. 10 and 11 of the Internet Civil Framework (Law No. 12.965/14), on the grounds that respect for intimacy, privacy and image was not being observed. The defense argued that there was no evidence of any incident involving the employees’ data, claiming that it only collected basic data for administrative purposes and for registration with official bodies, according to the model form attached to the records, and denied any improper sharing of, or third-party access to, the data. In the judgment, the Court partially upheld the claims to: (i) order the company to appoint and indicate the person in charge, and to implement and prove in the case records the practices related to data security and confidentiality, under penalty of a fine to be set; and (ii) order it to prove in the case records compliance with the obligations imposed within 90 (ninety) days, under penalty of a daily fine of R$1,000.00.
