Welcome to the 31st edition of the Bulletin!
In this edition, we highlight the joint opinions of the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) on the new standard contractual clauses for the international transfer of personal data and for contracts between controllers and processors. The document was highly anticipated, since factors such as the need to adapt to the requirements of the GDPR and the CJEU decisions in the Schrems I and II cases called for clarification and harmonization by the authorities. The first webinar in the series “LGPD on the move: key implementation issues” addressed precisely the challenges related to the international transfer of personal data and was attended by national and international experts. Check it out!
In the section “Data Protection at Universities”, we feature the selected articles “Technologies for the aggregation of geolocation data in the fight against COVID-19 in Brazil” and “The data and the virus”, two new contributions to the academic legal debate on the technologies adopted to combat COVID-19 and their possible consequences for the right to the protection of citizens’ personal data.
We wish everyone a great read!
Bruno Bioni, Iasmine Favaro and Mariana Rielli
Data Protection at Authorities
In October 2020, the Court of Justice of the European Union issued two judgments (C-623/17 and C-511/18) confirming its case law that there can be no general and indiscriminate preemptive retention of communication data, especially traffic and location data. As exceptions to this rule, the judgments allow: (i) general and indiscriminate retention of traffic and location data in the event of a genuine and serious threat to national security; (ii) targeted retention of traffic and location data of certain categories of people, based on objective and non-discriminatory factors, in order to safeguard national security, combat serious crime and prevent serious threats to public security; and (iii) expedited retention of traffic and location data to safeguard national security and fight serious crime. Taking this into account, the Czech Authority indicated that it would consider it appropriate to establish retention periods individually, so that specific purposes are identified, and to establish special communication channels for different types of cases.
As part of the Authority’s work to strengthen data protection and the risk-based approach, a survey was carried out by means of questionnaires sent to fourteen controllers (seven public authorities and seven companies), intended to clarify their general level of maturity in information security. The Authority’s assessment is that all surveyed controllers have a strong focus on information security, including, but not limited to, data protection. In several cases, however, the Authority found that the controllers could place greater focus on establishing contingency plans.
The event will be held on January 25 and will aim to assess, in general terms, the impact of the measures taken in response to the COVID-19 pandemic and to identify ways in which data can be used to better handle a future public emergency.
The EDPB and the EDPS adopted joint opinions on two sets of standard contractual clauses (SCCs): one on the SCCs for contracts between controllers and processors and another on the SCCs for the transfer of personal data to third countries. Several changes were requested in order to make the text clearer and to ensure its practical utility in the day-to-day operations of controllers and processors. These include the interaction between the two documents, the so-called “docking clause”, which allows additional entities to accede to the SCCs, and other aspects related to processors’ obligations. In addition, the EDPB and the EDPS suggest that the Annexes to the SCCs clarify as much as possible the roles and responsibilities of each party in relation to each processing activity, since any ambiguity would make it harder for controllers and processors to fulfill their obligations under the principle of accountability. The new SCCs for the transfer of personal data to third countries, pursuant to art. 46(2) of the GDPR, will replace the existing SCCs, which were adopted on the basis of Directive 95/46 and need to be updated to the requirements of the GDPR, also taking into account the Schrems II judgment of the CJEU and better reflecting new and more complex processing operations, often involving multiple data importers and exporters. In particular, the new SCCs include more specific safeguards for cases in which the legislation of the destination country affects compliance with the clauses, notably in the case of binding requests from public authorities for the disclosure of personal data.
The EDPB published an opinion on Personal Information Management Systems (PIMS), new products and services that help individuals gain more control over their personal data. PIMS allow individuals to manage their personal data in secure, local or online storage systems, and to share it whenever and with whomever they want. Individuals can decide which services can use their data and with which third parties it can be shared. This enables a human-centered approach to personal data and new business models, helping to avoid unlawful tracking and profiling techniques that aim to circumvent key data protection principles. A basic feature of these tools is that individuals, service providers and applications must authenticate to access a personal data storage center, which allows people to track who has had access to their digital behavior. Other common elements of PIMS are secure data storage, secure data transfer (between systems and applications), and data interoperability and portability.
The document, prepared by CNIL and the Conseil supérieur de l’audiovisuel (CSA), is aimed at teachers, who will have access to comprehensive documentation and educational tools to raise public awareness of the challenges of digital citizenship, as well as at parents, other adults, young people and even children, who will find in it several tools to better understand online uses and supervise their online practices. The main topics covered are: (i) rights on the Internet; (ii) protection of online privacy; (iii) respect for creative works; and (iv) the rational and responsible use of screens. The complete kit can be accessed here.
On January 12, 2021, the CNIL sanctioned the Ministry of the Interior for unlawfully using drones equipped with cameras, in particular to monitor compliance with pandemic containment measures. Considering it likely that the use of these drones involved the processing of personal data, the president of the CNIL sent a letter to the Ministry of the Interior in April 2020 requesting details about these devices and their characteristics. To date, the Authority notes, no legal text authorizes the Ministry of the Interior to use drones equipped with cameras that capture images in which people are identifiable. Likewise, the Authority indicates that, although mandatory, no impact assessment on the use of drones was communicated to the CNIL, and the general public was not informed either. The Authority therefore imposed an administrative sanction on the Ministry. The full decision can be accessed here.
CNIL conducted two studies: (i) a survey in February 2020 of 1,000 parents and 500 children between 10 and 17 years of age, focused on the digital practices of minors and their parents’ perception of their behavior; and (ii) a public consultation, open on its website from April to June 2020, which collected around 700 contributions from education professionals, people working with youth, digital companies and legal professionals. The survey results show that 82% of children aged 10 to 14 say they usually go online without their parents, compared with 95% of children aged 15 to 17. On average, 70% of children of all ages report watching videos alone on the Internet, a figure that parents underestimate. Parents of teenagers aged 15 to 17 estimate that their first use of the web happened around age 13, while parents of children aged 8 to 9 report that their children go online alone from the age of 7 to play or watch videos. 46% of parents of young people aged 8 to 17 have implemented solutions to monitor their children’s activity on the Internet, but the most common measure is forbidding them to talk to strangers online. Finally, the survey indicates that parents underestimate the frequency with which young people aged 10 to 14 play alone online; likewise, they are often unaware of their presence on social networks (the case for more than half of the children in this age group). As for the public consultation, it identified two main trends: (i) minors’ desire to gain autonomy and (ii) the desire to strengthen their online protection. As the text emphasizes, these two elements are not contradictory but complementary.
Service providers (state and municipal institutions, private companies, organizations) often want this information to protect their employees and customers, but the Authority stated that there is no legal basis or need to require people to confirm that they have not been abroad or that they have not been in contact with anyone infected with COVID-19. The Authority stated that if an organization wishes and deems it necessary, it can achieve the same objective of limiting the spread of COVID-19 without collecting declarations from individuals, by informing them of their responsibilities under the decisions of the Ministry of Health, the competent authorities and regulatory decrees.
The information and description in the complaint revealed that students were identified during exams conducted in late May 2020 via videoconference. After the exam, the recordings became available not only to exam participants but also to other people with access to the system; in addition, through a direct link, any third party could access the exam recordings and the data of the examined students presented during identification. As the preliminary information indicated a potentially high risk to the rights and freedoms of the examined students, the Authority asked the controller to clarify the situation. In response, the University argued that it was not required to notify the Authority of the breach, since, in its view, the risk to the rights and freedoms of the people affected was low. The Authority found that there was a personal data breach and that the controller had failed to comply with its notification obligations, both to the Authority and to those affected by the breach. Such obligations arise when the breach entails a high risk to the rights or freedoms of those affected, and the Authority considered that the controller had, in this case, incorrectly assessed the risk involved. It therefore fined the University EUR 25,000.
The store manager in question filmed surveillance footage with a cell phone and later shared the video, which spread quickly on the Internet. The Authority stated that any processing of personal data requires a legal basis and, after investigating the case, concluded that the Coop Finnmark store had no legal basis for the store manager’s sharing of the surveillance footage. Notably, the footage showed children, and its disclosure posed a potentially great risk to their privacy. The Authority therefore fined the store NOK 400,000.
The EDPB adopted guidelines on examples of data breach notification. These guidelines complement the WP29 guidance on the topic with further practice-oriented recommendations. They are intended to help controllers decide how to handle data breaches and what factors to consider during risk assessment. The guidelines contain an inventory of the data breach notification cases that national supervisory authorities consider most common, such as ransomware attacks, data exfiltration attacks, and lost or stolen devices and paper documents. For each category, the guidelines present the most typical good and bad practices, in addition to providing advice on how risks should be identified and assessed. The guidelines are open for public consultation for a period of six weeks.
The MOU, signed virtually, establishes the commitment of the British and Philippine authorities to closer collaboration and cooperation on data protection in both jurisdictions. Under the MOU, the two authorities will pursue the regulatory cooperation necessary to support their data-driven economies and protect the fundamental rights of citizens in each jurisdiction. The document establishes an enhanced working framework for the two authorities to move forward on matters of mutual regulatory interest.
Kim Doyle, who worked for RAC, transferred personal data to an unauthorized claims management company. At a hearing in January 2020, the defendant pleaded guilty to conspiracy to secure unauthorized access to computer data and to selling unlawfully obtained personal data. The employee compiled lists of traffic accident data, including partial names, cell phone numbers and registration numbers, without her employer’s permission. As a result, she was sentenced to 8 months in prison.
In its complaint, the FTC alleges that Flo promised to keep users’ health data private and to use it only to provide the application’s services. In fact, according to the complaint, Flo disclosed the health data of millions of users of its Flo Period & Ovulation Tracker app to third parties that provided marketing and analytics services for the app, including Google’s analytics division, Facebook, AppsFlyer and Flurry. According to the complaint, Flo disclosed sensitive health information, such as a user’s pregnancy, to third parties in the form of “app events”, that is, app data transferred to third parties for a variety of reasons. In addition, the company did not limit how third parties could use this health data. The FTC also alleges that Flo violated the EU-US Privacy Shield and Swiss-US Privacy Shield frameworks, which, among other requirements, demand protection of personal data transferred to third parties. Under the proposed settlement, the company is prohibited from misrepresenting, in communications with users and authorities: the purposes for which it, or the entities to which it discloses data, collect, maintain, use or disclose the data; the extent to which consumers can control the use of that data; its compliance with any privacy, security or compliance program; and how it collects, maintains, uses, discloses, deletes or protects users’ personal information. In addition, it must notify affected users of the disclosure of their personal information and instruct any third party that received users’ health information to delete it. The settlement will be open to public comment for 30 days after publication in the Federal Register, after which the Commission will decide whether to make it final.
In its complaint, the FTC alleges that in February 2017 Everalbum launched a new feature in the Ever app, called “Friends”, which used facial recognition technology to group users’ photos by the faces of the people appearing in them and allowed users to tag people by name. Everalbum reportedly enabled facial recognition by default for all mobile app users when it launched the feature. The FTC’s complaint alleges that Everalbum’s application of facial recognition to the photos of Ever app users was not limited to providing the Friends feature: between September 2017 and August 2019, the company combined millions of facial images extracted from photos of Ever users with facial images obtained from publicly available datasets to create four datasets used in the development of its facial recognition technology. Under the proposed settlement, Everalbum, Inc. must obtain consumers’ express consent before applying facial recognition technology to their photos and videos. The proposed settlement also requires the company to delete the models and algorithms it developed using photos and videos uploaded by its users. “Using facial recognition, companies can turn photos of your loved ones into sensitive biometric data,” said Andrew Smith, director of consumer protection at the FTC. “Ensuring that companies keep their promises to customers about how they use and handle biometric data will continue to be a high priority for the FTC.”
The ranking shows that the first three places are occupied by telecommunications companies, fined for abusive marketing practices, including non-compliance with the “Do Not Call” National Registry. The sanctions imposed on the companies reach 78 million pesos. They are: 1. Telefónica Móviles Argentina SA (Movistar); 2. Telecom Argentina SA (Personal); and 3. AMX Argentina SA (Claro) and DIRECTV Argentina SA.
Given the significant number of electoral processes between 2021 and 2022, the Authority organized a discussion on the impact and risks that the circulation of false, manipulated or misleading information on social networks poses to the legitimacy of these processes and to democracy. The head of the Authority, Gloria de la Fuente, contextualized the phenomenon, referring to a crisis of confidence in institutions (including the media, which have verification protocols), to the increased use of social networks and the Internet by Chileans, and to changes in the ways information is generated and consumed. De la Fuente pointed out that disinformation can be approached from several angles, among them the tension between freedom of expression and the limitation of hate speech, and the massive use of big data, which raises a discussion about personal data protection and the gaps in a law that has not been modernized since 1999.
The Authority inspected 305 public and private entities, producing 277 final inspection reports. These documents resulted in around one hundred administrative sanctioning procedures (PAS) and a similar number of corrective measures which, when implemented by the inspected bodies, avoided the opening of a PAS and the application of fines. 142 complaints were processed, an increase of 178% over the previous year. Likewise, more than 100 resolutions on sanctioning procedures were issued at first and second instance. During 2020, 829 fines (3,564,700 soles) were imposed. More than a hundred entities corrected their infringing conduct as early as the inspection phase. Among the entities sanctioned for improper processing of personal data are banks, clinics, universities, supermarkets, social networks and credit risk agencies, among others. One financial entity, for example, was sanctioned for 40 violations, for breaching the duty of confidentiality by failing to implement the security measures necessary to safeguard its customers’ personal data, which led to security breaches. The Authority has also sanctioned several entities for failure to obtain valid consent or to comply with the duty to inform.
Data Protection at Universities
The work analyzes the risks to privacy and to the protection of personal data (in their individual and collective dimensions) generated by profiling based on aggregated geolocation data from mobile devices, investigating whether the General Data Protection Law (LGPD) provides normative parameters applicable to the identified risks. To this end, the article proposes the following research questions: (i) what risks to the fundamental rights to privacy and the protection of personal data, at the individual and collective levels, do profiling technologies based on aggregated geolocation data from mobile devices generate in the fight against the COVID-19 pandemic in Brazil? and (ii) does the LGPD provide normative parameters to deal with these risks, especially with respect to groups created by algorithmic systems? In this context, the article affirms the collective dimension of the rights to privacy and the protection of personal data. The risks identified for both rights are the re-identification of mobile device users through membership inference attacks and the deviation from the original function and purpose of the data processing. To deal with such risks, the article suggests a systematic interpretation of the LGPD’s normative parameters on automated profiling and on data protection impact assessments.
The article presents the main findings of the research project “The Data and the Virus”, which tracked and documented the use of information and communication technologies (ICTs) to combat the COVID-19 pandemic. Considering that these uses fueled the debate on the legitimacy of such technologies, the article also analyzes, with a focus on the Brazilian case, how the country’s institutional data protection framework was affected by this agenda. It describes the main data-driven technologies used at the global level and the main examples adopted in Brazil, and addresses the judicial tensions caused by these uses and by attempts to share personal data in the context of COVID-19 (Class Action No. 1019257-34.2020.8.26.0053 and ADI 6387). It then analyzes the impact of these debates and lawsuits on data protection in the country. Among the conclusions, the strengthening of the institutional framework stands out, contradicting initial concerns that the advance of the technologies in question would weaken it.
Data Protection in the Brazilian Judiciary
Decision on Extraordinary Appeal 649.379, filed by the company Universo Online S/A. In his vote, Rapporteur Justice Gilmar Mendes pointed out that any information stored by public and private companies concerning a consumer’s private life qualifies as data. The due date of charges, the subject of the case under discussion, would, for the Rapporteur, be linked to consumer privacy. Displaying the due date on the outside of collection letters is, in this context, a form of undue exposure of the individual, who ends up having personal information made available to third parties, including the fact that he is receiving a collection notice rather than ordinary correspondence. The Rapporteur also noted that “it is known that data protection results from the inviolability of intimacy and private life. Even without an express provision, ‘it is possible to extract from the Federal Constitution a true fundamental right to the protection of personal data’ (MENDES, Laura Schertel.