In this edition, the OPPD Bulletin highlights the main movements of Data Protection Authorities from different countries and selects scientific articles on the legal basis of consent and the challenge of Artificial Intelligence. It also compiles new bills presented and, in the Judiciary axis, the judgment of the first Public Civil Action based on the General Data Protection Law.
We highlight the presentation of Bill 4723/2020, by Federal Deputy Luiz Philippe de Orleans e Bragança, which amends the LGPD to require that personal data be stored in the national territory and prohibits the use of cloud computing for all operations mentioned in item X of the caput of art. 5 of the Law. This is an ineffective measure, one that was widely debated during the formulation of the Marco Civil da Internet. At the time, it was understood that requiring data storage in a given location not only harms the operations of small and large companies that depend on international data transfers to maintain their business, discouraging innovation in the country, but also confines the discussion of cross-border sharing to the isolation of individual States, to the detriment of creating international standards for the security and use of personal data that protect citizens’ right to privacy in the most diverse locations.
We also stress the importance of the guidelines published by the European Data Protection Board on targeting and the opinion of the Icelandic authority on the use of social networks in election campaigns. In Brazil, with elections approaching, it is extremely important to understand the limits and possibilities of such practices. In this sense, a study group created by the Data Privacy Brasil Research Association, InternetLab and the Instituto Liberdade Digital produced the document “Data Protection in Elections: Democracy and Privacy”, which explores, among other elements, the electoral campaign process on social networks and the proper use of personal data for this purpose.
Bruno Bioni, Iasmine Favaro & Mariana Rielli
Data Protection at Authorities
The authority imposed a sanction of 6 million crowns on companies involved in the sale of cars for recurring and unsolicited commercial messages. The authority analyzed the company’s entire marketing campaign, which reached almost 500,000 recipients, without, however, finding any mention of consent.
The new strategic plan proposed by the authority sets out an updated vision, mission and values, defines the direction of the authority’s work for the coming years, and proposes more concrete guidelines for specific problems. The strategy was built on contributions from stakeholders interested in the authority’s work: an analysis gathering different views was prepared, in which various authorities, industry organizations and other interest groups were consulted about their experience and their wishes for the body’s future focus.
Registration for the event is open until November 6th. The theme, “Digital rights and freedoms at work: reality and horizons”, will be divided into three axes: (i) the role of artificial intelligence in the labor market and its challenges, from identifying talent to discriminatory hiring bias; (ii) real-time automated data management to measure and optimize worker productivity (people analytics) and its limits; and (iii) feedback on the massive increase in telework during the health crisis, which changed corporate culture and raised new questions about employee privacy. The event will take place online on November 9th.
The authority published reminders for employers about their obligations when collecting employees’ personal data, such as: (i) reminding employees who are in contact with others of their obligation, in case of contamination or suspected contamination, to inform their employer or the competent health authorities, for the sole purpose of enabling working conditions to be adapted; (ii) facilitating such communication by establishing, if necessary, dedicated and secure channels; and (iii) promoting remote working and encouraging recourse to occupational medicine. As for the processing of employees’ health data, the CNIL points out that such processing is prohibited, with two exceptions: (i) when the employer needs to process the data in order to fulfill its obligations regarding labor, social security and social protection, as is the case with the processing of workers’ reports; and (ii) when a health professional needs to process the data for the purposes of preventive or occupational medicine, assessment of the worker’s capacity to work, medical diagnoses, etc. Finally, the authority pointed out that employers wishing to adopt measures to safeguard the health of their employees must rely on the competent occupational health services, which are at the center of managing the health crisis. They cannot, on their own, compile records of their employees’ body temperature or of particular pathologies.
Commissioner Prof. Ulrich Kelber emphasized, in celebration of the Right to Information Day, that “since March, the authority has been increasingly asked to provide information. People have many questions and a great need for official information about federal activities related to the pandemic. I would like the authorities to show more active transparency in the future. This means that they can better explain government action by publishing processes, numbers and other information on their own initiative. This would help, for example, to increase the acceptance of measures to combat pandemics.” In addition, the Commissioner also spoke about the contact tracing application developed by the government: “Transparency creates confidence. In a democracy, the success of measures against Covid-19 depends on people’s belief in government actions and compliance with recommendations. Especially in times of uncertainty and misleading reports, it is more important than ever that federal authorities open up and contribute objective, valid information to the debate. Our democracy is based on knowledge.”
The authority indicated principles that companies should follow when collecting customer data for contact tracing, pointing out that they should: (i) collect only the details necessary for contact tracing or compliance purposes, for example, name, contact number, and time and date of attendance. In the case of licensed facilities, records of the sale of meals to customers must be kept for compliance purposes; (ii) be transparent with customers about the reason for collecting this data, and be able to clearly explain the purpose of collecting personal data; (iii) store this information carefully. Although it is not necessary to use technology to store it, a company that decides to keep it electronically must make sure that the system used is secure and must delete the information at regular intervals once it is no longer needed; (iv) limit this data to the purpose for which it was collected. In particular, not use this data for direct marketing purposes or to contact customers for any other reason; and (v) make sure to delete contact details once they no longer need to be kept for contact tracing or compliance purposes.
The manifesto published by the Italian authority, in partnership with the IAPP, presents the thought of Giovanni Buttarelli, gathering reflections and notes from the author, who was European Data Protection Supervisor. The work has two distinct sections: the first, based on Buttarelli’s writings and reflections, ends with a kind of “decalogue” of privacy for the new decade; the second is based on contributions from renowned academics and international experts, including Marc Rotenberg, founder of the Electronic Privacy Information Center, and Shoshana Zuboff, scholar and author of a recent work on surveillance capitalism.
The authority fined Vilnius Municipality €15,000 for failing to keep the registration data of children and foster parents up to date. It was found that most of the data was out of date and that the Municipality had made no effort to keep this information updated and accurate.
The authority points out that companies wishing to transfer personal data processed in the course of their activities to a third country, such as the USA, Turkey, Australia, Ukraine, etc., must follow the rules for the transfer of personal data to third countries established in the GDPR. It also discusses one of the possible grounds for the transfer of personal data to third countries, the Binding Corporate Rules (BCRs).
The webinar, which will take place on September 30, will discuss issues that arose during the pandemic in relation to distance education and the organization of teaching, with notes on the security of students’ information. The training will cover not only the processing of data of students, their parents and teachers, but also practical tips on how to conduct videoconferences, take care of the equipment used for work, etc.
The EDPB adopted guidelines on targeting at its 37th plenary meeting. The guidelines are intended to provide stakeholders with practical guidance and examples of different situations, helping organizations identify the “scenario” closest to the targeting practice they will apply. Their main objective is to clarify the roles and responsibilities of social media providers and their audience. Accordingly, the guidelines address, among other things, potential threats to individual freedoms and the application of the fundamental principles and requirements for the processing of personal data, namely lawfulness, transparency and the carrying out of a data protection impact assessment. The public consultation will remain open until October 19.
The authority investigated eight political parties that targeted individuals for the purposes of political propaganda and found that in two cases only information about people’s age and location was used. For the other parties, groups were defined more precisely based on their fields of interest on social media. These areas of interest were either registered by the users themselves or determined by social networks based on their activity, such as what they liked, shared or showed interest in.
Thus, some political organizations sent personalized messages to specific groups of voters who, according to their profiles, were considered likely voters or undecided. The authority emphasizes that the processing by political organizations of sensitive personal information of members and voters, such as political opinions, must be based on the individuals’ unambiguous consent to the processing.
On this subject, a study group created by the Data Privacy Brasil Research Association, InternetLab and the Instituto Liberdade Digital produced the document “Data Protection in Elections: Democracy and Privacy”, which explores, among other elements, the electoral campaign process on social networks and its implications for the protection of personal data.
Joint controllership, in accordance with Article 26 of the GDPR, arises when two (or more) persons or entities are responsible for the same data processing at the same time. In this case, they work together consciously and jointly decide on the essential purposes and means of the processing. The contract model made available by the authority is based on two joint controllers, although joint controllership may also involve more contracting parties. In addition, the contract model assumes that both contractual partners are subject to the GDPR.
The Norwegian National Institute of Public Health will develop a new application, for contact tracing only, based on the framework developed by Apple and Google. According to the authority, different types of technology were analyzed and it was concluded that the one developed by those companies poses the least risk to citizens’ privacy.
The British government has made it mandatory for all companies in the hotel, leisure and tourism sector, as well as close-contact sectors such as barbers and beauticians, to collect customer information for the test and contact tracing program. Accordingly, the authority defined five guidelines to respect individuals’ privacy: (i) ask people only for the specific information established in the government’s guidelines; (ii) be clear, open and honest with people about what is being done with their personal information; (iii) keep people’s data safe. Organizations must not use open logbooks and must ensure that their customers’ personal information is kept confidential; (iv) not use the personal information collected for contact tracing for other purposes, such as direct marketing, profiling or data analysis; and (v) erase or discard the personal information collected after 21 days.
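For organizations that keep these records electronically, the 21-day rule in guideline (v) amounts to a simple retention policy. The sketch below is purely illustrative — the record schema (`collected_at` field) and the retention constant are assumptions for the example, not part of any official specification:

```python
from datetime import datetime, timedelta

# Retention window per the guideline summarized above (21 days).
RETENTION_DAYS = 21

def purge_expired(records, now=None):
    """Keep only contact-tracing records still inside the retention window.

    `records` is a hypothetical list of dicts, each with a
    `collected_at` datetime; records older than RETENTION_DAYS
    relative to `now` are dropped.
    """
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=RETENTION_DAYS)
    return [r for r in records if r["collected_at"] >= cutoff]
```

A scheduled job running such a purge at regular intervals would satisfy the "delete at regular intervals" advice, provided the deleted copies are not retained elsewhere (backups, exports).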
The authority pointed out that it maintains a good relationship with the app’s developers and that it is being widely consulted on privacy and personal data protection issues, highlighting some specific positive changes: (i) improved privacy information, better informing individuals about the implications the application may have for their privacy, the measures taken to mitigate those risks, and how individuals can exercise their information rights; (ii) clearer information about automated decision-making, including the opportunity for individuals to speak with an employee about the decision in question and the reasoning behind the algorithm; (iii) more transparency for individuals about how and when personal data is considered anonymous and with whom it is shared; and (iv) greater clarity about data flows and security considerations. The authority also stated that it will audit the entire test and tracing ecosystem.
Digital Growth Experts Limited was fined £60,000 for sending thousands of nuisance marketing texts during the pandemic. Seeking to capitalize on the pandemic, DGEL sent the texts, of which 16,190 were received between February 29 and April 30, 2020, promoting a hand-sanitizing product it claimed was “effective against the coronavirus”. The messages were all sent to people who had not consented to receive them. The head of ICO’s investigations pointed out that “DGEL played with people’s concerns at a time of great public uncertainty, acting with flagrant disregard for the law, and all to fill their own pockets. We will prioritize action against organizations that carry out similar activities.”
Data Protection at Universities
The article discusses the focus on the data subject’s consent as the core regulatory instrument for the protection of personal data. To this end, it briefly examines three aspects that demonstrate the insufficiencies of the consent paradigm: (i) the cognitive limitations of the data subject in assessing the costs and benefits involved regarding their rights; (ii) the binary “take it or leave it” logic, which reflects the absence of free will owing to the asymmetry of power between the data subject and the processing agent, as well as the subject’s dependence on many services of the information society; and (iii) modern techniques for processing and analyzing personal data, which make it possible to aggregate information in ways that can hardly be anticipated by the data subject at the time of collection. To overcome these shortcomings, contemporary trends toward the ‘materialization’ of data protection are presented as promising solutions, making protection more responsive both to the risks generated by processing and to the concrete obstacles to a free and autonomous decision. The text explores three paths in this direction: (i) strategies based on technology and the design of information systems (privacy by design) to assist data subjects in controlling their data; (ii) regulation based on accountability of processing agents, measuring risks prior to the processing of personal data; and (iii) the contextual control of consent.
The article addresses possible legal responses to the challenges posed by the digital transformation of society, using the example of German national law and EU law. The focus is on the development and use of big data, that is, large volumes of data of varying nature and quality that can be processed at high speed for a variety of purposes. The law has the task of facilitating the opportunities associated with big data, but also of removing or minimizing the risks to legally protected individual and collective interests. Of particular importance is the protection of rights of freedom, including the right to informational self-determination (data protection). The law itself also requires innovation: in addition to legal structures, protection through technology is important (such as protection by design and by default), as are sufficient guarantees of transparency and control, together with judicial protection. Care must also be taken to ensure that adequate governance structures are developed.
Data Protection in the Brazilian Legislative
Bill 4723/2020, presented by Dep. Fed. Luiz Philippe de Orleans e Bragança, from PSL-São Paulo, alters art. 3 of the LGPD to determine that the personal data of Brazilians remain stored in the national territory. In addition, the Bill amends art. 55-D of the LGPD, providing for the appointment of counselors and prohibiting “the appointment of a spouse, partner or relative in a straight line, collateral or by affinity, up to the third degree, of authorities of the Legislative and Executive branches and Ministers of the Judiciary”. The Bill is currently at the Board of Directors.
The PLV 39/2020, presented by Dep. Fed. Gastão Vieira, from PROS-MA, was the conversion of Provisional Measure 982/2020, which provides for the digital social savings account. Currently, the PLV is in Plenary.
Bill 4695, presented by Dep. Fed. Danilo Cabral, from PSB-Pernambuco, establishes guidelines such as: educational institutions should prefer technologies that do not require the provision and sharing of personal data; sensitive personal data should not be collected or made available, whenever possible; among other tools to protect students’ privacy. Currently, the Bill is at the Board of Directors.
Data Protection in the Brazilian Judiciary
Judge Wagner Pessoa Vieira, of the Federal District Court of Justice, decided to terminate Public Civil Action No. 0730600-90.2020.8.07.0001, filed by the Public Prosecutor’s Office of the Federal District on the grounds of illegal commercialization of personal data by a mining company. The judge held that procedural interest was absent since, upon consulting the company’s website, it was found to be undergoing maintenance, which demonstrated that the company was seeking to comply with Brazil’s recently enacted General Data Protection Law.