Welcome to the 34th edition of the Bulletin! This week, the highlight is the French Data Protection Authority, which published a technical note on cybersecurity and […]
In Brazil, the presentation of two bills on data protection stands out. On the one hand, Bill No. 766/2021, presented by Federal Deputy Nereu Crispim (PSL-RS), proposes to regulate the practice of telemedicine, with clearer definitions of the activity and attention to the ANPD’s role in the protection of personal data in the area. On the other hand, Bill No. 578/2021, proposed by Federal Deputy Erika Kokay (PT-DF), seeks to amend the LGPD by suppressing item I-A of article 65, so that the penalties provided for by the law apply immediately.
We also highlight the European Data Protection Board’s text on the role of data protection for democracy in the societies of the digital age. In addition to addressing challenges such as dark patterns and emerging technologies, the text analyzes several topics through contributions from leading figures in data protection in Europe and worldwide.
We wish everyone a great read!
Bruno Bioni, Iasmine Favaro and Mariana Rielli
Data Protection at Authorities
Regarding the disclosure of personal data of parties in court proceedings, the Authority pointed out that the need for and effectiveness of disclosure change at different stages of the proceedings. “This means, for instance, that it is easier to find arguments for the temporary disclosure of parties’ personal data in the course of proceedings, with the aim of informing the public, than for the disclosure of the same personal data after the process is over,” said the Authority. The document answers questions that draw attention to practices that often differ from court to court in this area.
Based on a complaint previously submitted to the Authority, the agency criticized the fact that Statistics Denmark did not record that a citizen did not wish to take part in its voluntary surveys. Statistics Denmark stated that it maintains an internal list of citizens who do not wish to participate in its voluntary surveys, and that the citizen should have been added to the list but was not, due to an internal error. The Authority found that the agency still had the authority to process information about the citizen, but criticized its failure to include the citizen in the list of people who did not wish to participate in its surveys.
The text, based on a series of podcasts organized by the European Data Protection Supervisor and the European Data Protection Board, covers different points, such as mass surveillance and facial recognition, dark patterns and online manipulation, and emerging technologies, highlighting future challenges. Among those invited to speak on these topics are Gabriela Zanfir-Fortuna, Jared Brown, Harry Brignull and Ella Jakubowska, leading figures in the data protection debate in Europe and worldwide.
On 26 December 2020, the European Commission adopted a Council Decision proposal on the conclusion, on behalf of the EU, of the Trade and Cooperation Agreement between the European Union and the European Atomic Energy Community, on the one hand, and the United Kingdom of Great Britain and Northern Ireland, on the other, as well as the Agreement between the European Union and the United Kingdom of Great Britain and Northern Ireland on security procedures for the exchange and protection of classified information. The EDPS is aware of the specific conditions under which these agreements were negotiated and the specific relations, past and future, between the United Kingdom and the European Union.
With regard to the provisions on trade, the Authority regretted that the Trade and Cooperation Agreement did not faithfully follow the “horizontal EU provisions on cross-border data flows and protection of personal data and privacy in the Digital Trade Title of EU trade agreements” adopted by the European Commission in 2018. Indeed, the changes made to these horizontal provisions, combined with other provisions of the Agreement, call into question, in the field of digital trade, the preservation of the EU’s autonomy with regard to the fundamental rights to data protection and privacy.
The Authority has long maintained that, since the protection of personal data is a fundamental right in the Union, it cannot be the subject of negotiation in EU trade agreements. It is therefore up to the EU to decide how to implement fundamental rights protections in its domestic law. In its opinion, the Authority stated that the EU cannot and should not enter into any international trade commitments that are incompatible with its data protection legislation.
The note states that website security breaches are among the most common breaches observed during inspections and can lead to security incidents (2,825 notifications received in 2020, 24% more than in 2019). In this regard, the Authority’s objective is to audit the security level of the most-used French sites across different sectors. The Authority stated that particular attention will be paid to the ways personal data is collected, the use of the HTTPS protocol and agents’ compliance with the CNIL recommendation on passwords. In addition, CNIL stated that, in the current health context and given the growing challenges linked to the digitization of the health sector (management of access to computerized patient records in health establishments, platforms for scheduling medical appointments online, management of personal data security incidents in health establishments, etc.), the Authority wishes to continue the control measures it initiated in 2020.
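A core element of password recommendations such as CNIL’s is storing credentials only as salted hashes produced by a slow key-derivation function, never in plain text. As an illustration only (the parameters below are not CNIL-prescribed), a minimal sketch using Python’s standard-library `hashlib.scrypt` might look like:

```python
import hashlib
import hmac
import os

# Illustrative cost parameters (n = CPU/memory cost, r = block size,
# p = parallelism); real deployments should tune these to their hardware.
SCRYPT_PARAMS = dict(n=2**14, r=8, p=1)

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a salted hash; store the salt and digest, never the password."""
    salt = os.urandom(16)  # unique random salt per password
    digest = hashlib.scrypt(password.encode(), salt=salt, **SCRYPT_PARAMS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash with the stored salt and compare in constant time."""
    candidate = hashlib.scrypt(password.encode(), salt=salt, **SCRYPT_PARAMS)
    return hmac.compare_digest(candidate, digest)
```

Because each salt is random, two users with the same password end up with different digests, which blunts precomputed-table attacks.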
CNIL was informed by the media of the publication of a file containing medical data on around 500,000 people. Preliminary results seem to indicate that this was indeed an incident of particularly significant magnitude and severity, with speculation that the data may have come from medical analysis laboratories. CNIL recalls that data controllers have an obligation to guarantee the security of the data they process and to treat it in proportion to the risks – in particular for sensitive data, such as health data.
One hundred and fifty experts from science, business and civil society have developed a guide for the responsible use of digital data in the DiDaT project, funded by the Federal Ministry of Education and Research. The recommendations were recorded in a white paper that was delivered this week by the project managers to Professor Ulrich Kelber, head of the German Authority. Professor Wolf-Dieter Lukas, State Secretary at the Federal Ministry of Education and Research, commented on the delivery of the white paper as follows: “I consider the responsible use of digital data to be one of the main requirements for a life of freedom and prosperity in Germany and Europe. Businesses and research institutions in particular need reliable access to digital data. Digital data, often highly personalized, is an extremely sensitive asset that needs to be protected in a free and democratic legal order. To achieve this, we need more solutions that allow the use of digital data for innovative applications while preserving each individual’s data sovereignty and neutralizing negative social side effects. The DiDaT project makes an important contribution to the debate at a time when digitization is penetrating all areas of life.”
In the text, the chairman of the Authority, Aleid Wolfsen, states that people often suffer emotional damage as a result of the careless handling of personal data. The text raises the question: “do victims have no right to compensation for the damage caused?”. For Wolfsen, this is a relevant issue, because the right to the protection of personal data is one of the most important freedoms of our time. After all, in a society that is digitizing at an accelerated pace, this right also gives extra protection to many other fundamental rights and values, such as the right to privacy, freedom of belief, fair elections and equal treatment. And with all the data being collected, profiling and non-transparent algorithmic decision-making lurk, with all the associated risks. Violations of this specific fundamental right are, therefore, classic violations of rights. Thus, the chairman points out that, for the victim, personal compensation may be more useful than a fine imposed by the Authority. The text also proposes that compensation vary according to the sensitivity of the data and the offender’s ability to adopt safeguards.
The Authority pointed out that it is receiving more and more reports of hacking, phishing and other types of attacks. The fact that personal data is increasingly targeted by criminals is very worrying, says the note, since the theft of personal – and often sensitive – information can cause substantial and concrete harm to people, such as subsequent identity fraud or other scams. The Authority also said that the criminals are very sophisticated and focus mainly on organizations that handle large volumes of personal data. Increasingly, incidents arise because criminals manage to remain present on a network for a long period before attacking, using that time to explore and map the organization’s network and to gain further privileges on it, for example ‘system administrator rights’. One of the Authority’s suggestions is that, where possible, all authentication systems use two-step verification, which can make such attacks more difficult.
For Denham, the Internet was not designed for children, despite the benefits of its use. The director also states that the problem is that the protections and rules for children in the offline world have not been properly translated into the online world. In the UK, the answer, in part, was the Age Appropriate Design Code. This code has been a large part of the Authority’s work over the past two years and is built on consultation with individuals: mothers, fathers and guardians, teachers, developers, technology leaders and children themselves. Denham believes that, in the next decade, this type of code aimed at children will be adopted by a large number of jurisdictions.
Two companies that sent nuisance text messages during the Covid-19 pandemic were fined a total of £330,000 by the ICO. The Authority fined Leads Works Ltd of West Sussex £250,000 for sending more than 2.6 million text messages to customers without their valid consent, and fined a Manchester company £80,000 for the same reason, following complaints from the public. The Manchester company was found to have sent more than 95,000 text messages between June and July 2020 without the recipients’ permission.
The Federal Trade Commission (FTC) will host a virtual workshop on April 29, 2021 to examine digital “dark patterns”, a term used to describe a variety of potentially manipulative user interface designs employed on websites and mobile apps. The event will explore the ways in which user interfaces can have the effect, intentionally or unintentionally, of obscuring, subverting or undermining consumers’ autonomy, decision-making or choice. For instance, some sites place extra items in the consumer’s online shopping cart or require users to navigate a maze of confusing pages and questions to avoid being charged for unwanted products or services. In addition, ahead of the workshop the FTC is seeking research, recommendations for discussion topics and requests to participate as panelists. The FTC will also publish a specific request for comments on dark patterns. Comments should be submitted no later than June 29, 2021.
The Authority has communicated a series of criteria on how data on people vaccinated against coronavirus should be treated. For the Authority, when an entity bound by the Argentine Access to Public Information Law delivers or publishes information containing personal data, whether proactively or upon a request for access to public information, this constitutes a transfer of personal data made “under a legal obligation” (art. 1 of Law 27.275; and art. 5, subsection 2 (b) and art. 11, subsections 3 (a) and 3 (b) of Law 25.326 on the Protection of Personal Data). However, the Authority noted that, in the face of a collision of rights, it is necessary to analyze the public interest, which must be done on a case-by-case basis. In each case, both the privacy risk that may materialize with the publication of personal data and the public interest that certain information may carry can differ. The Authority also explained how the principles laid down by law should be followed and how data related to vaccination should be disseminated.
Knowledge gaps and weak institutionalization of procedures and mechanisms for the protection of personal data in the state sector are identified by public servants in their organizations, according to the IX Transparency Council Study on Public Employees’ Perception of the Right of Access to Information and the Protection of Personal Data. These results contrast with the existing consensus among State officials on the importance of enforcing the rights associated with the treatment, management and adequate safeguarding of citizens’ personal information. Low levels of training on personal data protection were also identified, with only 3 out of 10 employees claiming to have received such training. The survey also pointed to differences based on hierarchy, with those in managerial positions declaring greater knowledge of the data protection mechanisms or procedures within their institution. A similar trend was identified with regard to employees’ knowledge of the personal data protection law.
Data Protection at Universities
This article makes two central contributions. The first is the construction of a roadmap so that courts can understand privacy harms and address and redress them in a meaningful way. There are several different types of privacy harm, which courts have so far recognized in inconsistent ways. The second contribution is an approach to when privacy harm should be required as a condition of liability. In many cases, the harm itself is irrelevant to the purpose of the action. Currently, privacy litigation suffers from a misalignment between the law’s objectives and the remedies it provides. The article argues that the law should be guided by the essential question: when and how should privacy regulations be enforced? It also offers an approach that seeks to align enforcement goals with the appropriate remedies.
The practical scope of this article assumes that the reader is already familiar with EU privacy law and is looking for a good source of information on Brazilian privacy law. Although many companies are already well advanced in assessing their data processing activities against the GDPR, they may also need to comply with the LGPD, which could mean meeting several new requirements. In general, this article explains, from a legal and compliance perspective, the differences between the GDPR and the LGPD, the enforcement mechanisms, what companies need to know in order to become compliant with Brazilian law, as well as the effects of violating the LGPD, its important definitions and which entities are subject to it.
Data Protection in the Brazilian Legislative
Bill No. 766/2021, presented by Federal Deputy Nereu Crispim, of the PSL of Rio Grande do Sul, proposes to regulate the practice of telemedicine, determining what falls within the scope of the activity and which principles and safeguards must be adopted, and establishing that specific regulation on the protection of personal data falls to the National Data Protection Authority. Currently, the bill is awaiting a dispatch from the President of the Chamber of Deputies.
Presented by Federal Deputy Erika Kokay, of the Workers’ Party of the Federal District, Bill No. 578/2021 proposes the suppression of item I-A of article 65 of the General Data Protection Law, so that the penalties provided for by the Law apply immediately. The justification points out that compliance with the LGPD will be a differentiator for organizations, as it builds credibility through the responsible use of personal data and respect for the privacy of customers and partners. For this reason, the postponement of the law’s effective date had very negative repercussions in Brazilian society, attributed in part to the difficulty and delay in creating the Authority that will regulate the matter, especially in the face of the urgencies imposed by the coronavirus pandemic. For the legislator, echoing jurists, the successive postponements of the LGPD’s entry into force in Brazil not only render its sanctions innocuous, but also make society as a whole watch the delay in adapting the processes and procedures introduced by the law. Currently, the bill is before the Board of Directors (Mesa Diretora).
Data Protection in the Brazilian Judiciary
Lawsuit nº 2033246-21.2021.8.26.0000, DJSP, p. 1647
Recently, the São Paulo Court of Justice dealt with a breach of public administration data through unauthorized access to administrative documents and proceedings, together with influence peddling. In this case, it is alleged that a magistrate working in the municipality of Caieiras influenced the mayor to authorize a close friend of the judge, with no connection to the municipality, to carry out an audit of the city’s finances in order to leak sensitive information. The magistrate reportedly steered his friend to various investigations in order to build a clandestine dossier to support the filing of actions against opponents and unwanted companies contracted by the Caieiras City Hall, thus favoring associates and business partners. The information gathered for the illegal dossier allegedly included bank balances, financial transactions, password-protected data and interests of the municipality reserved under fiscal secrecy, in breach of information security responsibilities, constitutional principles of Public Administration and art. 6, I, of the LGPD. The TJSP’s decision, however, declined jurisdiction over the action, in observance of the Constitution of the State of São Paulo.