Dear reader,
In the 28th edition of the Bulletin, we highlight nine recent publications by European data protection authorities, two scientific articles selected by the team's curators (on data governance and on the legal basis of consent), the Bill that would exempt religious organizations from the LGPD, and the decision against the company Serasa.
In this edition, we emphasize the position of the European Data Protection Board regarding the proposed temporary derogation from certain provisions of the ePrivacy Directive 2002/58/EC for the purpose of combating violence against children and adolescents online. The body took a stand against the measure, arguing that the principles of proportionality and reasonableness would be harmed, since the risk to the privacy of users of digital platforms would not be reasonable in light of the purpose stated in the proposal's justification.
We also emphasize that the Bill proposed by Federal Deputy Alex Santana (PDT), which seeks to add religious entities to the list of exemptions in Article 4 of the LGPD, breaks with the rationale of the LGPD, which is precisely to be a general law reaching the use of data in the most varied activities of daily life. It is also not yet clear what the legislator meant by "religious data": whether such data are only those concerning people's religious orientation, or all data that an individual may provide to a religious entity for different purposes. This is not the first bill that seeks to exempt an economic sector from the LGPD, and it will not be the last.
As for the decision that suspended the sale of personal data by Serasa, it is yet another indication that LGPD enforcement may be fragmented across the several entities of the National Consumer Protection System. In the decision, the judge states that only consent can serve as the legal basis for the activity carried out by the company; it should be noted, however, that the LGPD provides a broader list of legal bases that can be adopted.
We wish you all an excellent read!
Bruno Bioni, Iasmine Favaro & Mariana Rielli
Data Protection at the Authorities
To remedy the effects of the quarantine, the Czech government adopted an initiative to support the electronic handling of invoices, notices, contracts, certificates and orders, with the aim of completely replacing the usual written correspondence between citizens, business people and companies. The Authority, however, warns against abuse of the tool to send unsolicited commercial messages to data subjects. The Czech Ministry of the Interior (roughly equivalent to the Ministry of Justice in Brazil) imposed a fine of 4.5 million crowns on 20 companies for the abusive practice. The Director of the Authority pointed out that "messages sent by electronic means must be used for standard correspondence. Any misuse of these means is therefore wrong, but abusing the exceptional waiver of this service, especially in times of emergency, is unacceptable."
Following a review of consultation and legislation programs carried out in 2018 and 2019, the Authority concluded that processing agents did not use the new personal data protection instruments introduced by the GDPR in an appropriate and correct manner. According to the Authority, agents mostly take only a basic approach to the more complex data protection agendas, especially the preparation of a Data Protection Impact Assessment (DPIA).
The review showed that, in many cases, the entities are not clear about the structure of a DPIA. Often they carried out only a verbal assessment, without specific information on the description of threats, the implications for privacy, or the technical and organizational measures proposed.
The material published by the Authority is a methodology, and the Authority recommends adherence to it. If an entity chooses a different methodology, however, its use is permitted as long as the resulting DPIA covers the mandatory requirements of the GDPR. The complete DPIA methodology created by the Czech Authority can be found here.
This issue remains open under the Brazilian General Data Protection Law (LGPD), as the recently established National Data Protection Authority has not yet issued guidelines for the production of a DPIA. Observing this gap, we at the Data Privacy Brasil Research Association submitted a contribution to the public consultation on artificial intelligence that highlights guidelines for the elaboration of a DPIA and its importance for guaranteeing the protection of personal data.
Acting on a complaint, the Danish Authority criticized Randers Municipality for sending a proposed letter of dismissal to the wrong employee, thereby failing to comply with the requirement for adequate security measures. The proposed termination contained information about the complainant's health conditions and union membership. The Authority also criticized the municipality for not reporting the security breach to the supervisory authority and for not notifying those affected by the breach in due time.
As a reason for not reporting the violation, Randers Municipality stated that the breach should, in its view, be considered an internal matter within the municipality. The Danish Authority's guidelines on handling personal data security breaches do mention an example in which an HR employee inadvertently sends payment receipts and employment contracts to the wrong employee of the company. The example suggests that, in such a case, the breach does not necessarily have to be reported to the Authority: the company may assess that the breach does not involve a risk to the data subject, since it is an internal violation and the company has great confidence in the employee in question. In the present case, however, the Authority emphasized that, given the confidential nature of the document and the fact that it contained information about the complainant's health and union membership, there was a particular risk of loss of reputation and confidentiality for the complainant, as the intended dismissal was disclosed to another employee in the workplace.
On 19 February 2020, the European Commission presented its Communication on "A European data strategy". The Communication proposes the creation of a health data repository, the European Health Data Space (EHDS), presented as an essential tool for the prevention, detection and treatment of diseases and for increasing the effectiveness, accessibility and sustainability of health systems. Although the EDPS strongly supports the objectives of promoting the exchange of health data and fostering medical research, it highlights the need to define data protection safeguards from the very beginning of the creation of the EHDS. In its preliminary opinion, the Authority thus highlights the essential elements that must be considered in the development of the EHDS from a data protection point of view.
The Authority reinforces the importance of establishing a well-thought-out legal basis for EHDS processing operations, in line with Article 6(1) of the GDPR, and recalls that such processing must also comply with Article 9 of the Regulation on the processing of special categories of data. Furthermore, the Authority stresses that, due to the sensitivity of the data to be processed in the EHDS, the limits of what constitutes lawful processing and compatible further processing must be clear to all interested parties involved. Transparency and public availability of information related to processing in the EHDS will therefore be fundamental to increasing public confidence in the system.
The Authority also urged the Commission to clarify the roles and responsibilities of the parties involved and to identify precisely the categories of data to be made available to the EHDS. It further proposed that Member States establish mechanisms to assess the validity and quality of data sources, and reinforced the importance of providing the EHDS with a comprehensive security infrastructure, including state-of-the-art organizational and technical security measures to protect the data entered into it. In this context, it points out that Data Protection Impact Assessments can be a very useful tool for determining the risks of processing operations and the mitigation measures to be adopted.
The EDPS is convinced that the success of the EHDS will depend on the establishment of a strong data governance mechanism that offers sufficient guarantees of lawful, responsible and ethical management based on EU values, including respect for fundamental rights. The governance mechanism should regulate, at a minimum, the entities that will be allowed to make data available to the EHDS, the users of the EHDS, the national contact points/authorizing authorities of the Member States, and the role of the DPAs in this context.
Finally, the Authority stated that it supports public policies aimed at achieving "digital sovereignty" and prefers that data be processed by entities sharing European values, including privacy and data protection. To this end, it calls on the Commission to ensure that stakeholders participating in the EHDS, and in particular data controllers, do not transfer personal data to a third country unless the data subjects concerned benefit from a level of protection essentially equivalent to that guaranteed within the European Union.
On September 10, 2020, the Commission published a Proposal for a Regulation on a temporary derogation from certain provisions of the ePrivacy Directive 2002/58/EC as regards the use of technologies by providers of interpersonal communications services for the processing of personal and other data for the purpose of combating child sexual abuse online.
In particular, the Authority notes that the measures provided for in the Proposal will interfere with the fundamental rights to respect for privacy and data protection of all users of very popular electronic communications services, such as instant messaging platforms and applications. According to the EDPS, the confidentiality of communications is at the heart of the fundamental rights to respect for private and family life. Even voluntary measures by private companies interfere with these rights when they involve monitoring and analyzing the content of communications and processing personal data.
The Authority also underlined that the issues at stake are not specific to the fight against child abuse but apply to any initiative seeking private-sector collaboration for law enforcement purposes. If approved, the Proposal will inevitably serve as a precedent for future legislation in this field; the EDPS therefore considers it essential that the Proposal not be adopted, even in the form of a temporary derogation.
In particular, in the interest of legal certainty, the EDPS considers it necessary to clarify whether the Proposal itself is intended to provide a legal basis for processing within the meaning of the GDPR. If not, the Authority recommends an explicit clarification in the Proposal of which GDPR legal basis would apply in this specific case. In this regard, it stresses that guidance from data protection authorities cannot replace compliance with the requirement of legality.
To comply with the proportionality requirement, the EDPS points out, the legislation must establish clear and precise rules on the scope and application of the measures in question and impose minimum safeguards, so that persons whose personal data are affected have sufficient guarantees that the data will be effectively protected against the risk of abuse.
Finally, the EDPS stated that the proposed five-year period does not seem proportionate, given the absence of (a) a prior demonstration of the proportionality of the envisaged measure and (b) sufficient safeguards in the text of the legislation, and considers that the validity of any transitional measure should not exceed two years.
The Authority defines teleworking as a form of work organization carried out at a distance from the employer's premises, as opposed to work performed "on site", using information and communication technologies. Teleworking is framed by several texts, notably, in the private sector, by a national interprofessional agreement (ANI) and by the Labor Code. The Authority also answered questions about data processing in this context, pointing out that processing of personal data likely to generate a high risk to the rights and freedoms of data subjects should be subject to an impact assessment. It also recalled that the employer cannot constantly monitor its employees and that, as with any processing of personal data, a system for monitoring working time or activities, whether carried out remotely or on site, must: have a clearly defined purpose and not be used for other purposes; be proportionate and adequate to that purpose; and be preceded by prior information to the persons concerned. Among other things, the Authority stated that the employer should, as a rule, not require employees to activate their cameras in videoconferences.
The Authority set out how seven UK political parties need to improve the way they handle people's personal data, after assessing how they manage data protection. It audited the parties' data protection compliance following the significant concerns about transparency and the use of people's data in political campaigns highlighted in its 2018 report, "Democracy Disrupted?".
According to the Authority, political parties can legitimately retain personal data belonging to millions of people to help them campaign effectively. But advances in the use of data analytics and social media by political parties mean that many voters don’t know how their data is being used.
Key recommendations for the parties include: (i) providing the public with clear information from the outset about how their data will be used; (ii) telling individuals when intrusive profiling is used, such as combining information about them from several different sources to find out more about their voting characteristics and interests; (iii) being transparent when using personal data to profile people and then target them with marketing through social media platforms; (iv) being able to demonstrate accountability, showing how the parties fulfill their obligations and protect people's rights; (v) conducting thorough checks on all contracted and prospective processors and third-party suppliers to obtain guarantees that they comply with the core transparency, security and accountability requirements of data protection law; and (vi) reviewing their legal bases for the different types of processing of personal data to ensure that the most appropriate basis is used.
The Garante per la protezione dei dati personali ordered Vodafone to pay a fine of more than 12 million euros for unlawfully processing the personal data of millions of users for telemarketing purposes. In addition to paying the fine, the company must implement several measures determined by the Garante to comply with national and EU data protection legislation. The investigations carried out by the Garante brought to light significant structural problems involving violations not only of the consent requirements but also of fundamental principles such as accountability and data protection by design.
More specifically, one of the most worrying findings of the investigations was the use of fake phone numbers, or numbers not registered with the ROC (the Consolidated National Register of Communication Operators), to make marketing calls. This practice, linked to Vodafone itself, is apparently related to an obscure network of unauthorized call centers conducting telemarketing activities in total disregard of personal data protection legislation. In addition, the Garante determined that Vodafone must implement systems to demonstrate that its processing for telemarketing purposes complies with the consent requirements.
Data Protection at Universities
"Data governance law – the law that regulates how data about people is collected, processed and used – is the subject of lively theorizing. Concerns about datafication (the transformation of information or knowledge about people into a commodity) and its detrimental personal and social effects have produced an abundance of reform proposals. Different theories defend different legal interests in information, resulting in various individualistic claims and remedies: some aim to reassert individual control over the terms of one's datafication, while others aim to maximize the financial gain of data subjects. These proposals share a common conceptual flaw, however: they miss the central importance of population-level relationships between individuals for how data collection produces both value and social harm. The data collection practices of the most powerful technology companies are aimed primarily at generating population-level insights about data subjects, for population-level application, rather than insights specific to the individual data subject in question.
Treating effects at the population level as central to the task of data governance opens up new ground. The proper objective of data governance is not to reassert individual control over the terms of one's own datafication, nor to maximize personal gain, but rather to develop the institutional responses necessary to represent the relevant population-level interests at stake in data production. This shifts the reform task from granting individuals rights of exit or payment to securing the recognition and legitimacy needed to shape the purposes and conditions of data production for those with interests at stake in such choices. Based on this reorientation, data governance law can develop legal reforms capable of responding to the harms of datafication without foreclosing socially beneficial forms of data production." (Our translation of the author's introduction)
The first part of the article describes the risks and the status quo of data governance, documenting the importance of data processing for the digital economy. It then assesses how the prevailing legal regimes governing data collection and use – contract law and privacy law – encode data as an individual medium. Part three evaluates two prominent proposals for legal reform that arose in response to concerns about datafication and concludes that, while the ownership-based and dignity-based proposals differ in their underlying theories of the injustice of datafication and therefore offer different solutions, both resolve into individualistic claims and remedies that do not represent, much less address, the relational nature of data collection and use.
Part four of the article proposes an alternative approach: data as a democratic medium (DDM). This alternative conceptual approach apprehends data's capacity to cause social harm as a fundamentally relevant feature of datafication; from this follows a commitment to collective, institutional forms of governance.
The article analyzes the consent of the data subject, studying the general rules for the expression of valid and effective consent, its characterization, and its relationship with the rights of the data subject. It then moves on to the protective framework developed for sensitive data and, finally, to the rules on the processing of personal data of children and adolescents, with emphasis on the provisions concerning consent for the processing of such information. The purpose of the article is to raise questions and evaluate possibilities for applying the LGPD, always in favor of the human person and their existential situations.
Data Protection in the Brazilian Legislative
Bill 5141/2020, drafted by Deputy Alex Santana, adds the term "religious" to Article 4 of the LGPD, which sets out the list of exemptions from the Law. In his justification, the author stated that, notwithstanding the achievements the LGPD brought in protecting privacy as a full exercise of citizenship, the constitutional guarantee provided for in Article 5, VI – which ensures the free exercise of religious services, including their liturgies and internal procedures, as an extension of the separation between Church and State – creates the need to extend the Law's exemptions to the procedures adopted by religious organizations.
Data Protection in the Brazilian Judiciary
On November 20, in a monocratic decision following a complaint by the Public Prosecutor's Office, Judge César Loyola of the Federal District Court of Justice ruled that Serasa could no longer provide information such as the name, CPF, address and age of more than 150 million Brazilians. The data was sold through online listing and customer prospecting services, with each "package" of information about a person costing ninety-eight cents. In the decision, the judge states that the legal basis adopted should necessarily be consent and that, therefore, the activity was illegal. The Public Prosecutor's Office cited other legal bases in its action, but these were not taken into account in the ruling. The case number is 0749765-29.2020.8.07.0000 and the application can be accessed here.