07.01.21
Period: 12/10/2020 - 12/10/2020

Welcome to another edition of the Data Protection and Privacy Bulletin! It is with great enthusiasm that we start the year 2021 with the 30th edition of this project.

We highlight the beginning of the activities of the Brazilian National Data Protection Authority (ANPD), with the publication of important guides for compliance with the General Data Protection Law (LGPD) in areas such as contracts, information security, security incidents and the production of Data Protection Impact Assessments (DPIA).

Also, we highlight the guidelines of the French Data Protection Authority (CNIL) for the processing of personal data in the context of vaccination against COVID-19. The Authority sought to bring transparency to the process, clarifying which data are collected, under what circumstances, which bodies have access to them and what the retention period is. It also reinforced the rights of the data subject, such as the rights of access and rectification and the right to object (where the data subject has not yet consented to vaccination). This is an important position, which complements the series of guidelines that the French Authority and other authorities have been producing on the use of personal data in the context of the COVID-19 pandemic.

In the same vein, given the massive data sharing required by the pandemic and the need to make that process more transparent, the British Data Protection Authority (ICO) has published a guide on data sharing. The guide, combined with a set of new resources, provides practical guidance for companies and organizations on how to share data responsibly.

Another interesting document produced in the period, by the ICO, sets out six considerations for the use of automated systems in employment decisions. The agency considered the theme urgent, having identified a growing tendency to automate work relations, including remote hiring, driven by contextual factors such as the pandemic and the Black Lives Matter movement.

We also highlight the article by Bruno Bioni and Daniel Dias on civil liability in the protection of personal data, a topic of great relevance to the practical application of the LGPD and one of the most heated debates expected for 2021, given the diverging views on civil liability in the context of personal data protection.

We wish you all a great read!

Bruno Bioni, Iasmine Favaro & Mariana Rielli

Data Protection at Authorities

Brazil

Bodies must appoint persons in charge of personal data processing in public administration institutions

Bodies and entities of the direct, autarchic and foundational Federal Public Administration must appoint a member to be responsible for the processing of personal data at the corresponding institution. The requirements and procedures for nominating each body's person in charge are set out in Normative Instruction SGD/ME nº 117. The "person in charge" is provided for in the General Law for the Protection of Personal Data (LGPD) and will act as a communication channel between the body, data subjects and the National Data Protection Authority (ANPD) regarding the practices necessary to guarantee citizens' privacy and the protection of their personal data. The bodies will have 30 days to appoint their persons in charge.

ANPD publishes operational guides for adaptation to the LGPD

Several documents were published on the Authority's official website, such as the guide "Privacy Governance Program", which presents the main points of the General Data Protection Law and provides inputs for the creation of an institutional privacy governance program. In addition to this general document, the Authority has also made available the "Guide to Preparing Terms of Use and Privacy Policies for Public Services", the "Security and Privacy Risk Assessment Guide" and the "Good Practice Guide for Specifying Information Security and Privacy Requirements in Information Technology Contracts". Finally, a template for producing Data Protection Impact Assessments was published, complemented by a case study and an example of a completed template.

Denmark

Danish authority decision invalidates consent solution used by company

Following a user's complaint about the website of DGU Commercial S/A, the Danish authority examined the site's consent collection mechanism and concluded that it did not meet GDPR requirements such as voluntariness, granularity and an unequivocal expression of the user's will. In this case, the website visitor was initially presented with information about the processing activities and could then only click the option "allow all cookies", with no alternative for choosing which cookies to accept. The Authority also concluded that the company had been treating continued use of the website as an expression of user consent.

Three decisions by the Danish authority illustrate data processing in a purely private context

The data protection rules established by the GDPR do not apply to the processing of personal data carried out by a natural person in the course of purely personal or household activities. In two cases, the Danish Data Protection Agency concluded that the publication of personal information on a website and on YouTube, respectively, took place in purely private contexts and therefore outside the scope of the data protection rules. The evaluation took into account the nature of the published information, the context in which it was published (specifically, whether commercial or not) and the purpose of the publication. According to the Authority's chief adviser, Karina Kok Sanderhoff, "decisions in these two cases do not mean that private individuals can now publish personal information about others on the internet. You must continue to think carefully before publishing this type of data, use common sense and consider whether it could bother the person concerned. If in doubt, ask them."

EU-UK exit agreement

On 24 December 2020, the European Commission and the United Kingdom reached an agreement to govern EU-UK relations from 1 January 2021. From that date, the United Kingdom is considered a third country for the purposes of the GDPR. The transfer of personal data to a third country requires, under Chapter V of the Regulation (articles 44 to 50), reliance on one of the listed transfer mechanisms, such as standard contractual clauses, binding corporate rules or adequacy decisions. The agreement between the United Kingdom and the EU provides, however, that transfers from the EU/EEA to the United Kingdom may remain unchanged for up to six months from 1 January 2021, which means that during this period none of these mechanisms is required. EU controllers can therefore, under the agreement, continue to transfer personal data to the UK until the end of June 2021.

Estonia

Estonian authority publishes note on rules for video surveillance

According to the Authority, the average Estonian may enter the coverage area of at least ten security cameras per day, and rules for the collection, use and processing of personal data must therefore exist. First, a person entering a video surveillance area must be notified of the security camera, which requires a notice sign. Second, a person within the field of video surveillance must be able to request information about the processing of their recording, which requires conditions that give transparency to the processing. In addition, the Authority states that video surveillance is only possible if its purpose is legitimate and limited to protecting people or property or fighting crime.

European Data Protection Supervisor

Blog: “What does COVID-19 reveal about our privacy engineering features?”

The article discusses how the public discussion of specific privacy tools for a new contact tracing application, still in the early stages of development, was a completely new phenomenon: historically, this type of discussion took place only after serious failures or security incidents affecting many people.

This happened in early 2020, when several groups of researchers discussed possible privacy safeguards for corona tracing apps, and their concerns and suggestions were widely echoed, even in the mainstream media. At the time, the main focus of public discussion was an architectural choice with different implications for privacy: centralized or decentralized storage of information about contacts between users of the application.

Subsequently, the vast majority of contact tracing applications deployed in EU countries opted for the decentralized model. Almost all of them decided to rely on a framework created by the main providers of mobile operating systems, Google and Apple. For Thomas Zerdick, head of technology and privacy at the EDPS, developer presentations illustrated that choosing an architecture that favors individuals' control over their personal data is an important step in protecting that data, but only one step among many. Among the risks that developers sought to address is traffic analysis: if only the mobile apps of users with a recorded positive test result upload data to the central database, watching which mobile devices send these updates would reveal which users tested positive. In addition, the practical implementation of the decentralized architecture requires many design decisions, each of which can affect the real protection of users. A minimal sketch of the decentralized model appears below.
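
To make the architectural contrast concrete, here is a minimal Python sketch of the decentralized model, in the spirit of DP-3T and the Google/Apple framework. The key size, rotation schedule and function names are illustrative assumptions, not the actual protocol.

```python
# Minimal sketch of decentralized contact tracing (assumptions: 32-byte
# daily keys, 96 broadcast slots per day; the real protocols differ).
import hashlib
import hmac
import os

def daily_key() -> bytes:
    # Each device draws a random key per day; it stays on the device
    # unless the user reports a positive test.
    return os.urandom(32)

def ephemeral_ids(day_key: bytes, slots: int = 96) -> list:
    # Rotating pseudonymous identifiers broadcast over Bluetooth,
    # derived from the daily key so outsiders cannot link them.
    return [hmac.new(day_key, str(i).encode(), hashlib.sha256).digest()[:16]
            for i in range(slots)]

# Device A broadcasts; device B records the identifiers it hears nearby.
a_key = daily_key()
heard_by_b = set(ephemeral_ids(a_key)[:3])  # B was near A for 3 slots

# A tests positive and uploads only its daily key ("diagnosis key").
published_diagnosis_keys = [a_key]

# B downloads the diagnosis keys and matches locally, so the central
# server never learns who met whom.
exposed = any(eid in heard_by_b
              for key in published_diagnosis_keys
              for eid in ephemeral_ids(key))
print("exposure detected:", exposed)  # True
```

The traffic-analysis risk mentioned above sits exactly at the upload step: an observer who can see which devices contact the diagnosis-key server learns who tested positive, which is why deployed apps consider countermeasures such as dummy traffic.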

On this subject, the Data Privacy Brasil Research Association produced the article "Everything you need to know about the tracing technologies used to combat COVID-19", which explains the different models adopted in tracing technologies.

France

CNIL publishes note on data collection in the context of vaccination against COVID-19

To enable the SARS-CoV-2 vaccination campaign to be carried out and monitored, the French government decided to create, by decree, a data processing solution on which the CNIL issued its opinion on 10 December. On this occasion, the Authority reaffirmed the need to respect the rights and freedoms of data subjects and recalled that it would exercise its supervisory powers as soon as the solution was implemented. The initiative, the Information System (SI) "Vaccin Covid", gathers information about vaccinated people in order to organize the vaccination campaign and monitor the supply of vaccines and consumables (syringes, etc.).

To achieve the objectives of SI "Vaccin Covid", the following information is collected: identity and contact details, social security number (NIR) and health data, such as the vaccination eligibility criteria determined by the Ministry of Health. Data are also collected on health professionals and on people under the responsibility of the data subject. The retention period is ten years, except for data necessary for the care of vaccinated people in case new risks are identified, which will be retained by the public authorities for thirty years. Some of this data is transmitted to the health professionals and teams responsible for screening or for the vaccination itself. Other public structures, such as the National Health Insurance Fund (CNAM) and the National Agency for the Safety of Medicines and Health Products (ANSM), have access to certain data for the fulfillment of their missions.

The pseudonymised data, i.e. stripped of name, social security number and contact details, are accessible to certain employees of the National Public Health Agency (ANSP) and the Regional Health Agencies (ARS) to monitor vaccination coverage and organize the vaccination campaign. These data can also be communicated to the Department of Research, Studies, Evaluation and Statistics (DREES) of the Ministry of Health for the production of statistics. Pseudonymised data are also transmitted to the Health Data Hub and the CNAM for the purposes of managing the health emergency and improving knowledge about the virus.
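
As an illustration of what such pseudonymisation could look like in practice, here is a minimal Python sketch: direct identifiers are dropped and replaced by a keyed hash, so records can still be linked for statistics without exposing identity. The field names, key handling and choice of HMAC are hypothetical assumptions for the example; the decree does not specify this implementation.

```python
# Illustrative pseudonymisation: drop direct identifiers, keep a stable
# pseudonym for record linkage. Field names and the keyed-hash scheme
# are assumptions for the example, not CNIL's or the decree's design.
import hashlib
import hmac

SECRET_KEY = b"kept-separately-by-the-controller"  # hypothetical key

def pseudonymise(record: dict) -> dict:
    direct_identifiers = ("name", "social_security_number", "contact_details")
    # A keyed hash (unlike a plain hash) cannot be reversed by brute force
    # without the key, which the controller stores separately.
    pseudonym = hmac.new(SECRET_KEY,
                         record["social_security_number"].encode(),
                         hashlib.sha256).hexdigest()
    cleaned = {k: v for k, v in record.items() if k not in direct_identifiers}
    cleaned["pseudonym"] = pseudonym
    return cleaned

record = {
    "name": "Jean Dupont",
    "social_security_number": "1850575123456",
    "contact_details": "jean@example.org",
    "vaccine_dose": 1,
    "region": "Île-de-France",
}
print(pseudonymise(record))
# {'vaccine_dose': 1, 'region': 'Île-de-France', 'pseudonym': '...'}
```

Because the key holder can still re-identify records, pseudonymised data remains personal data under the GDPR, which is consistent with the access restrictions described above.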

In its opinion, the CNIL asked the Ministry to inform all the people involved, in particular regarding their rights of access and rectification and the right to object to the processing of their data, which applies until they consent to vaccination. After that moment, the CNIL understood that it is no longer possible to object to the processing, since it fulfills an important objective of public interest, notably in the field of pharmacovigilance.

CNIL sanctions two doctors for leaking health data

After an investigation conducted in September 2019, the CNIL found that thousands of medical images hosted on servers belonging to two doctors were freely accessible on the internet. During the hearings, the doctors acknowledged that the data breaches were due to poor configuration choices in their internet access and their medical imaging software. The investigations also showed that the medical images kept on their servers were not encrypted.

Based on these elements, the CNIL body in charge of sanctions found that the two physicians had disregarded basic principles of information security and data protection, considering that they should have ensured that the configuration of their networks did not leave the data freely accessible on the internet. The committee also identified a violation of the obligation to report data breaches to the CNIL (article 33 of the GDPR). As a result, fines of €3,000 and €6,000 were imposed on the two doctors.

Germany

German authority praises German Constitutional Court decision on data mining

The background to the decision was a constitutional complaint against a provision of the Anti-Terrorism Archives Act. This rule regulates the use of data collected in the country's counterterrorism files and effectively expands access to personal data for this purpose. Professor Ulrich Kelber sees his legal opinion confirmed by the Federal Constitutional Court's decision, as the judges ruled that the expansion of data use under the Anti-Terrorism Archives Act was partially unconstitutional. According to Kelber, "the decision strengthens data protection. For the first time, the Federal Constitutional Court issued a statement on data mining applications in security authorities' databases. It confirmed my long-standing view: data analysis with the use of technology represents an interference with the right to informational self-determination, which requires, to be legitimate, a clear legal basis with limits of intervention."

Ireland

Irish authority publishes Children Front and Centre: Fundamentals for a Child-Oriented Approach to Data Processing

The Fundamentals were developed by the Data Protection Commission (DPC) to drive improvements in data processing standards for this age group. They introduce child-specific interpretive principles of data protection and concrete measures against the data processing risks arising from the use of and access to services in the online and offline worlds. In parallel, the Fundamentals will help organizations that process children's data by clarifying the principles and standards to which they must adhere. The document was informed by the results of a public consultation that the DPC carried out during the first half of 2019. One stream of this consultation was addressed directly to children and young people, encouraging them to give their opinions on the use of their personal data and on their rights in a social media context. The other stream was directed at all other stakeholders, including parents, educators and child rights organizations, as well as organizations that handle children's data.

Irish authority announces decision on Twitter inquiry

The Authority announced on December 15 that it had completed its investigation into Twitter International Company. The DPC investigation began in January 2019, after Twitter itself notified the Authority of a security incident. The Authority concluded that Twitter violated GDPR Articles 33(1) and 33(5), as it failed to notify the breach in time and to document it properly. An administrative fine of €450,000 was imposed as an effective, proportionate and dissuasive measure. The draft decision in the investigation was submitted for review by the other Concerned Supervisory Authorities, pursuant to Article 60 of the GDPR, in May 2020. It was the first case to go through the Article 65 ("dispute resolution") process since the introduction of the Regulation, and also the first draft decision in a "big tech" case on which all EU authorities were consulted.

Italy

Italian authority publishes guide on deep fakes

Deep fakes are photos, videos and audio created with artificial intelligence software which, starting from real content (images and audio), can modify or recreate, in an extremely realistic way, the characteristics and movements of a face or body, or faithfully imitate a given voice. The Italian Authority has produced an information leaflet to make users aware of the risks associated with malicious uses of this new technology, which are increasingly frequent, including through the spread of applications and software that allow sophisticated counterfeits to be produced with an ordinary smartphone.

Italian authority opens proceedings against TikTok

The investigation launched by the Authority in March 2020 revealed a series of processing operations carried out by the social network that do not comply with the new regulatory framework for the protection of personal data. The Authority considers, first of all, that the social network's registration methods do not adequately protect minors. The prohibition on registration by minors under 13, instituted by the social network, is easily circumvented, since a false date of birth can be used. Consequently, TikTok neither prevents children from signing up nor verifies compliance with European privacy regulations, which require authorization from parents or those with parental responsibility for children under 14.

The information provided to users, the Authority also highlights, is standardized and does not specifically take into account the situation of minors; it calls for a special section dedicated to children, written in simpler language and with alert mechanisms that flag the risks to which they are exposed. The data retention periods are indefinite in relation to the purposes for which the data are collected, as are the anonymization methods the social network claims to apply. The same lack of clarity exists regarding the transfer of data to third countries: the countries to which the company intends to transfer data are not specified, nor is it indicated whether those countries are adequate under European privacy legislation.

Finally, according to the Authority, the social network sets the user's profile to "public" by default, giving maximum visibility to the content published there. This preset contrasts with data protection legislation, which requires technical and organizational measures to protect personal data by design, especially for children and adolescents.

Netherlands

EDPB publishes its 2021-2023 strategy and guidance on Brexit and international data transfers

The EDPB Strategy 2021-2023 sets out the strategic objectives for the coming years, grouped into four pillars: (i) promoting harmonization and facilitating compliance by processing agents with the rules and principles; (ii) supporting effective enforcement and efficient cooperation between regulators; (iii) addressing new technologies from the perspective of fundamental rights; and (iv) promoting global data protection. Part of the strategy is being carried out through a pilot project with an Expert Support Group.

Through this group, European regulators can support each other by sharing capacity and experience, for example when conducting investigations. In addition, the EDPB describes in two documents the consequences of Brexit for international transfers of personal data involving the United Kingdom: a statement on the end of the Brexit transition period and an information note on data transfers to the UK under the GDPR after the transition period.

United Kingdom

ICO publishes new Practical Guide on Data Sharing

The guide, combined with a set of new resources, provides practical guidance for companies and organizations on how to share data responsibly. Data sharing is critical to digital innovation in the public and private sectors and can bring many economic and social benefits, including increased growth, technological innovation and the delivery of more efficient and targeted services. Information Commissioner Elizabeth Denham said the COVID-19 pandemic has brought greater focus to the need for fair, transparent and secure data sharing.

The document came about as follows: first, the British government included provisions in the Data Protection Act 2018 requiring the ICO to produce a practical guide on data sharing. An earlier data sharing code had been published in 2011, under the Data Protection Act 1998. The first draft of the new guide was then put out for public consultation in July 2019, preceded by a call for views in 2018, and was informed by opinions and evidence collected from a wide range of private, public and third sector organizations, as well as individuals acting in a private capacity.

The ICO submitted the Practical Data Sharing Guide to the Secretary of State on December 17, 2020. The Secretary of State will now need to submit the code to Parliament for approval. After the document is presented, it will remain in Parliament for 40 consecutive days. If there are no objections, it will take effect 21 days later.

ICO publishes note pointing out six things to consider when using algorithms for employment decisions

First, with many people losing their jobs due to the COVID-19 pandemic, the Authority expects more applicants to compete for limited vacancies, which may lead employers to turn to algorithms to ease the burden on HR departments. Second, the Black Lives Matter movement has prompted employers to look for ways to deal with racism and other forms of prejudice in their workplaces, and algorithms can be one of the tools used to address discriminatory hiring practices. Against this background, the six points highlighted are: (i) bias and discrimination are a problem in human decision making and are therefore also a problem in decision making by AI; (ii) all algorithms must comply with the data protection fairness principle, which the Authority itself considers complex: the processing of data by an AI system must not have unjustified adverse effects on individuals, including discrimination against people with a protected characteristic; (iii) the advance of big data and machine learning algorithms is making bias and discrimination harder to detect; (iv) both data protection rules and equality rules must be considered when developing AI systems; (v) the use of solely automated decisions for private-sector hiring is likely to be unlawful under the GDPR; and, finally, (vi) algorithms and automation can also be used to tackle prejudice and discrimination: they are part of the problem, but they can also be part of the solution. A simple example of one such bias check appears below.
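
As a concrete illustration of points (ii) and (iii), one basic bias check compares selection rates across groups (demographic parity, related to the "four-fifths rule" used in US hiring guidance). The Python sketch below is illustrative only; the ICO note does not prescribe this or any other specific metric.

```python
# Illustrative check of selection rates per protected group on the
# output of an automated screening tool (hypothetical data).
from collections import defaultdict

def selection_rates(decisions):
    # decisions: iterable of (group, was_selected) pairs
    totals = defaultdict(int)
    selected = defaultdict(int)
    for group, was_selected in decisions:
        totals[group] += 1
        selected[group] += int(was_selected)
    return {g: selected[g] / totals[g] for g in totals}

decisions = ([("group_a", True)] * 40 + [("group_a", False)] * 60 +
             [("group_b", True)] * 20 + [("group_b", False)] * 80)

rates = selection_rates(decisions)
ratio = min(rates.values()) / max(rates.values())
print(rates)                                   # {'group_a': 0.4, 'group_b': 0.2}
print(f"disparate impact ratio: {ratio:.2f}")  # 0.50, below the 0.8 rule of thumb
```

A low ratio does not by itself prove unlawful discrimination, but it is the kind of signal that the fairness principle in point (ii) would require an organization to investigate.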

Videoconferencing companies respond to the joint statement on global privacy expectations

Microsoft, Cisco, Zoom and Google responded to the open letter prepared by the ICO and five other data protection authorities. In their responses, the companies highlighted several privacy and security practices, measures and tools that they intend to implement in or integrate with their videoconferencing services. For the Authority, the information provided by these companies is encouraging. As next steps, the joint signatories will engage further with these companies, seeking additional clarification on some of the points raised and giving the companies the opportunity to demonstrate how they achieve, monitor and validate the measures set out in their responses.

Data Protection at Universities

Civil liability in the protection of personal data: building bridges between the General Data Protection Law and the Consumer Protection Code

BIONI, Bruno. DIAS, Daniel.

The article points out that Brazilian doctrine has focused its attention on essentially answering one question: whether the liability regime is strict ("objective") or fault-based ("subjective"). As relevant as that may be, it is not this question that should guide the debate. For the authors, it starts from the false premise of a duality of legal liability regimes. More important than this binary classification is to analyze closely, and in detail, the normative elements that would restrict or extend the role of fault for the purposes of accountability in the processing of personal data. They conclude that, even though the LGPD has created a regime of fault-based civil liability, the thresholds for triggering the duty to indemnify have been substantially lowered.

WhatsApp Marketing: A Study on WhatsApp Brand Communication and the Role of Trust in Self-Disclosure

ZAROUALI, Brahim. BROSIUS, Anna. HELBERGER, Natali. VREESE, Claes H.

The article points out that WhatsApp recently allowed commercial brands to initiate private chat conversations with users through its direct messaging platform. With the platform exceeding 1 billion users, the article highlights the importance of insights into users' trust in brands and in WhatsApp, as well as their willingness to disclose personal information to those brands. To this end, the study uses data from a nationally representative survey and concludes that the perceived security, perceived privacy and perceived sociability of WhatsApp as a platform are positively associated with trust in brands on the platform. In turn, trust in brands positively influences consumers' intentions to disclose information to brands on WhatsApp. Finally, these results are compared with Facebook Messenger, and the study finds significant differences between the two messaging platforms.

Data Protection in the Brazilian Legislative

Bill on teleworking presented

Bill 5581/2020, presented by Federal Deputy Rodrigo Agostinho, deals with teleworking and defines the concept of home office. It includes a section on privacy and personal data protection, providing that the employer must respect the teleworker's privacy and protect their personal data. The bill allows the employer to monitor the teleworker by telematic means and to control the provision of services, with access to images, sounds and other personal data, for purposes specific to the employment relationship, especially information security, work management and verification of compliance with contractual and legal obligations. In return, the employer must inform the teleworker about the location, scope, time and forms of monitoring, control and use of personal data; guide teleworkers on how to maintain their privacy and intimacy in the face of the monitoring carried out; and duly respect the right to privacy, intimacy and personal data protection of the teleworker and of the people who live in their residence.
