Welcome to the 32nd edition of the Bulletin!
In this edition we highlight two documents published by the Brazilian National Data Protection Authority: its strategic planning for the years 2021, 2022 and 2023, and its biennial regulatory agenda for 2021 and 2022. The documents contain objectives and guiding principles of great relevance for the agency's activity, especially in the first years of implementation of the General Data Protection Law.
We also highlight the guidelines published by the Danish Data Protection Authority on when and how citizens or organizations should share personal information with the police, ensuring security and proportionality in data sharing for the purposes of public security or criminal investigation.
Finally, we highlight the report prepared by the Data Privacy Brasil Research Association on the legal basis of legitimate interest in the General Data Protection Law. The document aims to answer some questions that remain uncertain in the interpretation of this legal basis, and also provides a series of tools and paths for the proper application and use of legitimate interest, one of the most controversial – if not the most controversial – legal bases of the General Data Protection Law.
We wish everyone a great read!
Bruno Bioni, Iasmine Favaro and Mariana Rielli
Data Protection at Authorities
The Brazilian Data Protection Authority (ANPD) published, on February 1st, its Strategic Planning for 2021-2023. The document presents the advances that the ANPD intends to achieve, defining three strategic objectives: (i) to promote the strengthening of the culture of personal data protection; (ii) to establish an effective regulatory environment for the protection of personal data; and (iii) to improve the conditions for compliance with its legal powers. The document contains a brief explanation of how the Authority is structured, its governing body and the National Data Protection Council, and, among its strategic actions, it defines several "time horizons" – for example, a period of up to two years to detect LGPD violations and to develop data protection guidelines and recommendations.
The Brazilian Data Protection Authority (ANPD) published, on January 28, Ordinance No. 11 of 2021, which makes public the regulatory agenda approved by the Directing Council at its first deliberative meeting, on January 20, 2021. The publication of the Ordinance is part of the Authority's activities within the framework of international data protection and privacy week. The agenda, which is valid for the next two years, lists 10 priority themes for that period, establishing whether they will be regulated by ordinance, by resolution or through possible guidance in a best-practices guide. The annex also presents the expected deadline for the start of the regulation process for each theme, dividing the time span into three distinct phases.
In June, the Danish Authority opened a case against the social media platform TikTok to investigate the security of the service's handling of personal data. Since then, TikTok has declared that it is established in Europe with headquarters in Ireland, and the Irish Authority has confirmed that TikTok is, in fact, headquartered in that country. The Danish Authority will therefore – in accordance with GDPR rules – transfer the case to the Irish Data Protection Authority. "The establishment in Ireland means that the Irish supervisory authority is now what is called the lead supervisory authority for TikTok. Since we started the Danish investigation, we have cooperated closely with other European supervisory authorities, who have carried out similar investigations into TikTok. In other words, there has been a lot of pressure on TikTok from several countries," says Allan Frank, lawyer and IT security specialist at the Danish Authority.
On Sunday night, BT reported that Medicals Nordic employees, who perform rapid tests, were instructed to use a WhatsApp group to handle information about citizens who tested positive. Taking that as a starting point, the Danish Authority began to investigate a number of issues – including, for example, who the controller is, whether personal data has been disclosed and whether adequate security measures are in place. "Information about citizens' health is sensitive personal data and requires a higher level of protection. Basically, if you are tested for COVID-19, information about the results should be treated confidentially and shared only with the relevant parties. We will now look into whether that was the case here," said Allan Frank, IT security expert and lawyer at the Authority.
According to the Authority, it is important to emphasize that when the police, in connection with a specific investigation, contact a private actor or public authority in order to obtain relevant information – including personal data – the request can be safely answered. The document provides several examples of situations in which private organizations may receive requests for information, such as footage from surveillance cameras, and what they should do to ensure that this data is shared properly and that no information about other people involved is leaked. The document also points out what information the requested person or entity should ask of the police, such as information about the investigation, the judicial decisions that support the request, and so on.
The General Data Protection Regulation, in force since May 2018, introduces significantly higher fines and requires Member States to harmonize sanctions. According to the Authority, there is a need to develop general guidelines containing a clear and transparent basis for recommending fines, based on the assessment criteria established in the GDPR itself. The guide is a working document that will be expanded continuously as the Danish Authority, the prosecution service and the courts deal with further cases in the area, and as practice develops at both the national and EU levels.
Between June 2018 and January 2020, the CNIL received dozens of notifications of personal data breaches related to a website on which several million customers make regular purchases. The CNIL decided to carry out checks on the controller and its processor, the operator entrusted with the management of this site. During its investigations, the CNIL noted that the site in question had suffered several waves of credential stuffing attacks. In this type of attack, an attacker obtains lists of identifiers and passwords published in the clear on the Internet, usually as a result of a data breach. Assuming that users often reuse the same password and identifier (e-mail address) across different services, the attacker uses "bots" to attempt a large number of connections to websites. When authentication succeeds, it allows the attacker to view the information associated with the accounts in question. The CNIL noted that the attackers were thus able to obtain the following information: customers' surname, first name, e-mail address and date of birth, but also the number and balance of their loyalty card and information related to their orders. The restricted committee – the CNIL body responsible for imposing sanctions – considered that the two companies failed to comply with the obligation to preserve the security of customers' personal data, provided for in Article 32 of the GDPR. For the Authority, the companies were slow to adopt measures to effectively combat these repeated attacks. They decided to focus their response strategy on developing a tool to detect and block attacks launched by bots. However, the development of this tool took a year from the first attacks. Consequently, the restricted committee imposed two separate fines – 150,000 euros against the controller and 75,000 euros against the processor-operator – for their respective liabilities.
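Bot-driven credential stuffing of the kind described above is commonly mitigated by throttling failed logins per source. The sketch below illustrates that general idea with a sliding-window rate limiter; it is a minimal, hypothetical example – the window size, failure budget and function names are assumptions, not the detection tool the companies in the CNIL case actually built.

```python
from collections import defaultdict, deque

# Illustrative defence against credential stuffing: per-source rate
# limiting of failed login attempts over a sliding time window.
# WINDOW_SECONDS and MAX_FAILURES are hypothetical parameters.

WINDOW_SECONDS = 60   # sliding window over which failures are counted
MAX_FAILURES = 5      # failed attempts tolerated per source per window

_failures = defaultdict(deque)  # source IP -> timestamps of failed logins


def _prune(q, now):
    # Drop timestamps that have fallen out of the sliding window.
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()


def record_failure(ip, now):
    """Record a failed login attempt from this source at time `now`."""
    q = _failures[ip]
    q.append(now)
    _prune(q, now)


def is_blocked(ip, now):
    """True if this source exceeded the failure budget within the window."""
    q = _failures[ip]
    _prune(q, now)
    return len(q) >= MAX_FAILURES
```

A real deployment would combine such throttling with other signals (device fingerprints, CAPTCHA challenges, breached-password checks), since attackers distribute attempts across many IP addresses.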
The head of the Authority, Professor Ulrich Kelber, is urging the Federal Government to fully transpose Directive 2016/680 (the JI Directive) into national law. The Directive regulates the data protection rules that authorities must observe in the prevention, investigation, detection or prosecution of criminal offences and the execution of criminal penalties. Ulrich Kelber criticizes the delay caused by the national legislator: "EU member states pledged to enact all the laws necessary to bring their systems into line with the Directive by May 6, 2018. Germany has exceeded this deadline by a thousand days. I can only complain about data protection violations at the Federal Police and Customs Investigation. Without national laws, I have no effective enforcement powers. This undermines the democratic legitimacy of data protection oversight and of law enforcement officials at the same time." The German Authority reported that it had received a draft of a new Federal Police law; however, it has not yet reached the Bundestag (the German parliament).
Access to TikTok in Italy has been blocked for users who cannot conclusively prove their age, following the tragic death of a young girl who took part in a "blackout challenge". According to The Guardian, Italian prosecutors opened an investigation and Italy temporarily blocked access to TikTok for users "whose age could not be definitively proven". The report notes that the Italian Data Protection Authority blocked TikTok with immediate effect until February 15, pending compliance with the regulator's requirements.
The European Data Protection Board (EDPB) has adopted new guidelines on data breach notification. The guidelines assist organizations with the steps they should take in the event of a data breach. They contain a list of common types of breaches, such as ransomware attacks and lost or stolen equipment. For each category, the guidelines indicate what measures an organization should take in advance and what measures it should take after an incident. The document also indicates when an organization must notify the supervisory authority. In addition, the EDPB briefly addresses a number of other aspects of the notification obligation, such as the duty to report even when the investigation into the full impact of the breach has not yet been completed.
An investigation by the RTL news program revealed a large-scale trade in the personal data of millions of Dutch people, originating from the two main GGD coronavirus testing systems. The data includes addresses, telephone numbers, social security numbers and test results. The Dutch Authority immediately requested clarification from the GGD, and stated that the agency must inform citizens quickly and properly about the incident, including through its website and by opening an information line. The Authority points out that all organizations – not just the government – should make personal data security a priority. Its motto is: the more you do with data, the greater the risks and, therefore, the higher the level of data protection must be. Information about a person's health is sensitive, and the General Data Protection Regulation (GDPR) therefore stipulates that it must receive an additional level of protection.
The Authority's preliminary conclusion is that Grindr needs consent to share certain personal data and that the consent Grindr collected was not valid. In addition, the Authority believes that Grindr shared a special category of personal data that must be protected because it says something about a person's sexual orientation. Users were unable to exercise real control over the disclosure of their own personal data. Bjørn Erik Thon, director of the Authority, points out that business models that involve forcing users to consent to something, without explaining well what they are agreeing to, are not in accordance with the law. The fine imposed was NOK 100 million, equivalent to 10% of the company's revenue.
The Information Commissioner's Office (ICO) issued fines totaling £480,000 to four different companies for making illegal calls to numbers registered with the Telephone Preference Service (TPS). Chameleon Marketing (HI) Ltd of Leeds, Rancom Security Limited of Sutton Coldfield, Repair & Assure Limited of Redhill and Solar Style Solutions Limited of Stockton-on-Tees together made 2.4 million illegal calls, resulting in more than 250 complaints to the ICO and the TPS.
The ICO Sandbox has selected three innovative data-sharing services: one helping those who are vulnerable to the harms of online gambling, one supporting men and women in getting the care they need, and a platform to help fight cybercriminals. Now in its third year, the ICO Sandbox is a free service designed to help organizations explore new ways of using personal data while ensuring that appropriate protections and safeguards are in place. The Sandbox is currently focusing on projects that support the sharing of complex data in the public interest. Although many people gamble without harm, there is clear evidence that this is not the case for everyone. As technology develops in the online space, it has become apparent that new opportunities are emerging to significantly increase the protection of the public and of players against gambling-related harm. One such opportunity was identified by the Gambling Commission in the form of a Single Customer View (SCV) for online players. The SCV will allow existing data on player behavior to be aggregated in a safe and controlled manner, leading to better decisions, actions and assessments on player protection across all online gambling providers. The Commission recognizes that there is a risk around any new way of using data and is therefore working with the ICO to ensure that the SCV is delivered in a way that makes data security and the interests of the public and of players an absolute priority.
It is essential to guarantee the right to the protection of personal data in electoral processes, pointed out Norma Julieta del Río Venegas and Adrián Alcalá Méndez, Commissioners of the National Institute for Transparency, Access to Information and Protection of Personal Data (INAI), when participating in the Declaration for the Protection of Personal Data during the 2020-2021 electoral process in the state of Guanajuato. Del Río Venegas stated that compliance with electoral law and the protection of personal data must be in perfect harmony, and that authorization must be obtained from the data subject for the processing of their data. She emphasized that it is necessary to have Privacy Notices and to make them available to individuals. "A task for political parties: regardless of how the law establishes it, it would be very good if these Privacy Notices, mainly graphic ones, were produced, because, I insist, there are 92 million citizens whose personal data both the electoral bodies and the parties will hold," said the Commissioner. She stressed that citizens who consider themselves affected by the use of their personal data, or who are aware of alleged violations of the obligations provided for in the regulations governing the right to the protection of personal data, may file a complaint against the responsible party with INAI or with the guarantor bodies of the federal entities, as the case may be.
In a first phase, the Council – through its Studies Directorate – and GobLab are conducting a survey to map the existence and use of this type of technology across different departments of the public sector, with the aim of building a registry and, subsequently, in a second phase, proposing recommendations for implementing best practices that ensure compliance with the regulation. The President of the Chilean Authority, Gloria de la Fuente, stressed that "the automation of decisions makes it essential to address both the generation and collection of data and the way it is processed and managed, since this process may or may not trigger a series of significant risks which, if not handled in a timely and responsible manner, can be discriminatory and generate significant information asymmetries: the data that is selected and the way it is processed can replicate the biases, stereotypes and prejudices of the people who select it in the physical world, thus repeating those discriminations in its results." The Authority also points out that, with varying levels of complexity and autonomy, automated decision systems are oriented toward better public management. The use of facial recognition technologies, the prioritization of health waiting lists, artificial intelligence in health systems, the targeting of social interventions and the allocation of places in public schools are among the various public initiatives to automate processes and incorporate data science or artificial intelligence systems, based on algorithms or automated decisions, intended to support the administration.
Data Protection at Universities
With the advancement of the use of facial recognition technology for public security purposes in several Latin American countries, the discriminatory effects of these systems, and the damage they cause to other individual guarantees, have become evident. The uncertainties regarding the magnitude of the negative potential of biometric monitoring in public spaces, as well as the opacity resulting from the use of artificial intelligence, make it necessary to understand the current landscape of legal safeguards around this new surveillance instrument. The article investigates the regulatory situation of the use of facial recognition technologies for security purposes in Latin American countries that have, at a minimum, legislation on the protection of personal data. In addition to presenting cases of the use of facial recognition technology in Argentina, Brazil, Chile, Colombia, Costa Rica, Mexico, Nicaragua, Panama, Peru, the Dominican Republic and Uruguay, the study examines national norms that regulate this use or connect directly with the theme, as well as laws on the processing of personal data by public agencies, video surveillance and public security.
The report's primary objective is to explore the DNA and anatomy of the legal basis of legitimate interest, as provided for in Law No. 13.709 (the LGPD). Some of the questions that motivated the document were: when did the figure of legitimate interest appear on legislators' radar, and who were the actors that drove the debate culminating in the final version of the text? Which interests needed to be harmonized throughout this process, and how is this reflected in the interpretation of legitimate interest? Is a proportionality test – known in European law as a Legitimate Interest Assessment (LIA) – mandatory, as a kind of special record of this data processing operation? If so, must this test be made public? Are the conditions set out in Article 10 of the LGPD cumulative, and do they also govern the application of the legitimate interest of third parties, not only that of the controller? On this basis, the report offers paths, rooted in Brazilian legal culture and grounded in a hermeneutics that goes beyond a merely literal-grammatical reading of the LGPD's text, toward understanding the legal basis and some of its most sensitive and even controversial aspects.
Data Protection in the Brazilian Judiciary
The São Paulo Court of Justice dismissed the request by the appellant Abrafarma (Brazilian Association of Pharmacy and Drugstore Networks) against the appellee, the State of São Paulo. In interlocutory appeal No. 2004898-90.2021.8.26.0000, Judge Jarbas Gomes held that there is no illegality in the notice, mandated by the State of São Paulo, to be posted in pharmacies with the words "IT IS PROHIBITED TO REQUIRE THE CPF AT THE TIME OF PURCHASE AS A CONDITION FOR GRANTING CERTAIN PROMOTIONS". The judge stated that "it is opportune to remember that the consumer has the right to clear and accurate information about products and services, and the supplier the duty to provide it, under the terms of Article 6, item III, of the Consumer Protection Code. In the abstract, there is no way to conclude that the consumer benefits from being provided with less information on how his personal data will be handled. The probability of the right is absent and, even less, can one speak of a danger of damage difficult to repair to the appellant's associates that would not allow waiting for the opposing party to respond before delving into the merits. In fact, such an argument by the appellant, a contrario sensu, only corroborates the conclusion that the omission of clear information on how consumers' personal data (the CPF) will be treated – which is what is sought through the claim to refrain from posting notices in stores, under the terms of Article 2 of the State Law – serves only its (the appellant's) own interest and not that of consumers, and is therefore contradictory to its own thesis that the injunction would benefit them".