Welcome to another edition of the Bulletin!
In this 37th edition, the highlight goes to the disclosure of the nominees for the triple lists of the Brazilian National Council for the Protection of Personal Data and Privacy. The council is an advisory body within the ANPD structure, tasked with drawing up strategic guidelines and providing input for the country’s privacy and data protection policies. We also highlight that Bruno Bioni, director of the Data Privacy Research Association, was one of the nominees in the civil society organizations segment.
We also point out the publications by the Irish and Italian data protection authorities regarding recent episodes of social media data leakage. The Irish Data Protection Commission (DPC) initiated an investigation under Section 110 of the Data Protection Act 2018, requesting information from Facebook Ireland about the incident. Based on the responses provided, the Authority concluded that one or more provisions of the GDPR and/or the Data Protection Act 2018 may have been breached. In turn, the Italian Authority launched an investigation against LinkedIn, due to the sequence of breaches in its systems that culminated in the disclosure of users’ personal data, including IDs, full names, e-mail addresses, telephone numbers and other job information added to their profiles. On the same occasion, the Authority warned that the use of any data resulting from the breach constitutes unlawful processing that violates personal data protection legislation, in addition to carrying sanctioning consequences.
Two important articles touch on data protection in relation to academic research. The first, written by Janos Meszaros and Chih-hsing Ho, analyzes how the General Data Protection Regulation (GDPR) applies to the development of AI products and services, drawing attention to the differences between academic and commercial research and emphasizing the limits of the GDPR research exemption in order to find the right balance between privacy and innovation. The second, from the Brazilian perspective, prepared by the Center for Education and Research in Innovation (CEPI) of FGV, seeks to establish guidelines and recommendations for the safeguarding and safe use of personal data that may be processed in research activities carried out by research institutions.
We wish you a great read!
Bruno Bioni, Mariana Rielli and Júlia Mendonça
Data Protection at Authorities
The Brazilian Data Protection Authority (ANPD) released the nominees for the composition of the National Council for the Protection of Personal Data and Privacy (CNPD) triple lists. The council is an advisory body within the ANPD structure, tasked with proposing strategic guidelines and providing input for the preparation of the National Policy for the Protection of Personal Data and Privacy, as well as preparing annual reports and studies on these themes. The nominations represent the following segments: (i) civil society organizations engaged in data protection studies; (ii) scientific, technological and innovation institutions; (iii) confederations representing economic categories in the productive sector; (iv) entities representing the business sector related to personal data processing; (v) entities representing the labor sector. We highlight that Bruno Bioni, director of the Data Privacy Research Association, was one of the nominees in the civil society organizations segment.
As a result of Brexit, at the beginning of 2021 the United Kingdom (UK) came to be considered a “third country” under the European Union’s personal data protection rules. This created the need to identify a valid legal basis to make transfers of data to the United Kingdom possible. In this context, the European Commission issued draft adequacy decisions, which state that the United Kingdom can guarantee an adequate level of data protection. Given this, Danish data controllers and processors – once the adequacy decisions have been adopted – will be able to continue to transfer personal data to the UK, just as they did before Brexit.
The Data Protection Commission (DPC) launched an own-volition inquiry pursuant to section 110 of the Data Protection Act 2018 in relation to multiple international media reports, which highlighted that a collated dataset of Facebook user personal data had been made available on the internet. This dataset was reported to contain personal data relating to approximately 533 million Facebook users worldwide. The DPC engaged with Facebook Ireland in relation to this reported issue, raising queries in relation to GDPR compliance to which Facebook Ireland furnished a number of responses. The DPC, having considered the information provided by Facebook Ireland regarding this matter to date, is of the opinion that one or more provisions of the GDPR and/or the Data Protection Act 2018 may have been, and/or are being, infringed in relation to Facebook Users’ personal data.
The Italian Data Protection Authority’s opinion on the use of the “Sari Real Time” system by the country’s Ministry of Interior was not favorable. For the Authority, the system, in addition to lacking a legal basis that legitimizes the automated processing of biometric data for facial recognition, can also lead to indiscriminate surveillance, given the way it is designed. The system operates through a series of cameras installed in a given geographic area that analyze in real time the faces of the recorded subjects, comparing them with a predefined database (called a “watch list”), which may contain up to 10,000 faces. If, through a facial recognition algorithm, a match is found between a face on the watch list and a face recorded by one of the cameras, the system generates an alert that catches the attention of the police. The Authority, in line with the position of the Council of Europe, considers the use of facial recognition technologies for crime prevention and prosecution to be delicate.
The Italian Authority launched an investigation against LinkedIn, due to the sequence of breaches in its systems that culminated in the disclosure of users’ personal data, including IDs, full names, e-mail addresses and telephone numbers, in addition to professional titles and other work information inserted in their profiles. On the same occasion, the Authority warned that the use of any data resulting from the breach constitutes unlawful processing that violates personal data protection legislation, in addition to carrying sanctioning consequences. Likewise, the Authority alerted affected users to pay special attention to anomalies related to their phone numbers or accounts, as they may become victims of a series of illegal conducts, such as unwanted calls and messages, online fraud and even identity theft.
The Digital Green Certificate (Covidpass) aims to facilitate the exercise of the right to free movement within the European Union during the COVID-19 pandemic. The Czech Authority draws attention to the possible risks arising from such certificates or similar methods, which, if implemented, must be analyzed by the relevant institutions in the country. The European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) also issued a joint opinion on the topic, pointing out that the Digital Green Certificate must be in full compliance with European legislation on the protection of personal data. On the same occasion, both institutions emphasized that the use of the certificate must in no case lead, directly or indirectly, to discrimination against individuals, and must fully comply with the principles of necessity, effectiveness and proportionality. Finally, the institutions highlighted the need to mitigate risks to individuals’ rights, including use for purposes other than facilitating free movement between EU Member States, or for medical purposes.
The National Director of Argentina’s data protection agency, Eduardo Cimato, participated in the webinar “Dialogues with data protection authorities – #1: Security incidents in Latin America”, organized by the Laboratório de Políticas Públicas e Internet (LAPIN), which presented public policies that can be adopted by the data protection authorities of Argentina, Uruguay and Brazil to avoid possible security incidents. On that occasion, measures were formulated to resolve incidents after a breach has occurred, as well as alternatives for avoiding such events. In his participation, the Argentine director pointed out that “the proactivity of public and private organizations is very important to avoid possible incidents, analyzing the software and hardware before launching an application”. The meeting was also attended by Amanda Espiñeira, representative of LAPIN, Nairane Rabelo, director of the National Data Protection Authority of Brazil, and Gonzalo Sosa, data protection coordinator of the Personal Data Regulation and Control Unit of Uruguay.
The National Institute for Transparency, Access to Information and Protection of Personal Data (INAI) launched the “Verificación de Hechos” project, with the aim of contributing to the fight against disinformation about the COVID-19 pandemic. It is a new service that the Institute offers to society in the midst of the health crisis, to provide the population with access to accurate information. Through its institutional social networks, people can send INAI information related to COVID-19 that appears to be false, so that the staff of the Dirección General de Promoción y Vinculación con la Sociedad can check its veracity. The results of the verifications will be published through the institutional media, as well as on the verification site set up for this purpose, which will be constantly updated. Finally, the Institute points out that only misleading information related to the COVID-19 pandemic will be analyzed, on a permanent basis, according to the guidelines established by the Institute’s working group.
According to the National Institute for Transparency, Access to Information and Protection of Personal Data (INAI), the registration of biometric data in the Padrón Nacional de Usuarios de Telefonía Móvil (PNUTM) for the purpose of identifying the population requires the greatest possible care, since it can present risks to individuals’ personal data protection. The draft reform of the Federal Telecommunications and Broadcasting Law, approved by the Senate of the Republic, seeks to grant powers to the Federal Telecommunications Institute (IFT) for the installation, operation, regulation and maintenance of the PNUTM, in order to collaborate with the competent authorities in matters of public security and justice, in relation to infractions committed by mobile phone users. According to the approved draft, the register will contain the following data: 1) mobile phone line number; 2) date and time of activation of the line; 3) the user’s full name or corporate name; 4) nationality; 5) official identification number and Unique Population Registration Code (CURP) of the line owner; 6) biometric data of the user (natural person) or of their legal representative (legal person); 7) the user’s address; 8) data from the telecommunications department; 9) line contracting scheme; and 10) information update notices. In light of this, INAI points out that serious risks to individuals’ data protection may be generated.
Data Protection at Universities
This Article begins by describing three core building blocks of data protection regimes in the United States and Europe—namely, market forces, tort liability and regulatory enforcement—that these jurisdictions combine in different ways to ensure that companies act in accordance with consumers’ privacy preferences. It then identifies two key reasons—particularly deep information asymmetries between companies and consumers/regulators, and high levels of market power in many data markets—that enable companies to behave strategically to protect private interests and undermine legal compliance. The conclusion looks at the institutional design of antitrust and anti-fraud laws, two regulatory regimes that face similar challenges in their implementation, to argue that an effective online privacy regulatory system should be built around three key principles. First, the system must multiply monitoring and enforcement resources, and antitrust demonstrates how litigation can fund sophisticated civil-society intermediaries that safeguard consumers. Second, the system must bring violations to light, and anti-fraud policies demonstrate the importance of establishing effective whistleblower programs for data protection. Third, the system must increase governmental accountability, and antitrust provides examples of how to promote public transparency without sacrificing enforcement capacity.
The paper examines how the EU General Data Protection Regulation (GDPR) applies to the development of AI products and services, drawing attention to the differences between academic and commercial research. The GDPR aims to encourage innovation by providing several exemptions from its strict rules for scientific research. Still, the GDPR defines scientific research broadly, encompassing both academic and commercial research. However, corporations conducting commercial research might not have in place the same level of ethical and institutional safeguards as academic researchers. Furthermore, corporate secrecy and opaque algorithms in AI research might pose barriers to oversight. The aim of this paper is to stress the limits of the GDPR research exemption and to find the proper balance between privacy and innovation. The paper argues that commercial AI research should not benefit from the GDPR research exemption unless there is a public interest and safeguards similar to those of academic research, such as review by research ethics committees, are in place. Since the GDPR provides this broad exemption, it is crucial to clarify the limits and requirements of scientific research before the application of AI drastically transforms this field.
The guide on the Protection of Personal Data in Research, prepared by the Centro de Ensino e Pesquisa em Inovação (CEPI) of FGV, seeks to establish guidelines and recommendations for the protection and safe use of personal data that may be processed in research activities carried out by research organizations. The main legal reference used in its elaboration was the LGPD, the Brazilian Data Protection Law, although other national and international standards were also considered, with special attention to the European Union’s General Data Protection Regulation (GDPR). In addition, the project also aims to develop methodologies and analysis mechanisms for the elaboration of Data Protection Impact Reports, intended to contribute to building a data protection culture among educational institutions and research bodies in Brazil.
Data Protection in the Brazilian Legislative
Bill nº 871/2021, presented by Senator Veneziano Rêgo (MDB/PB), amends the General Data Protection Law (LGPD) to provide for the elaboration of a code of ethics with rules of good practice and governance for processing agents. Modifying article 50 of the law, the bill states that the code of ethics should contain “values, principles and guidelines for behavior” to guide “employees and administrators of processing agents”. The bill is currently before the Senate Plenary.
Data Protection in the Brazilian Judiciary
Lawsuit nº 2076546-33.2021.8.26.0000, DJSP, p. 1489
In Lawsuit nº 2076546-33.2021.8.26.0000, the plaintiffs challenged an allegedly illegal act of a Criminal Court, which ordered the breach of telematic confidentiality over geolocation data of an indeterminate set of Google users, with the intention of obtaining data to help determine the authorship of a crime against life. In summary, the main argument of the lawsuit is that the law prohibits breaching confidentiality in the form requested, without individualizing the persons concerned and without a plausible investigation, in view of the provisions of the Internet Civil Framework (Law nº 12.965/14) and the General Data Protection Law (Law nº 13.709/18). In addition, the plaintiffs point out that the measure would be disproportionate and would violate the Federal Constitution with regard to the provisions on intimacy and private life. The judge dismissed the preliminary injunction motion, considering that the objective pursued concerns the merits of the lawsuit itself, and that it is therefore up to the Judging Panel to decide on the request in its full extent.