Welcome to another edition of the Bulletin!
In this 42nd edition, in the Brazilian context, we highlight the technical meetings promoted by the National Data Protection Authority (ANPD), held on June 21, 23 and 25, 2021, on the process of regulating the Data Protection Impact Assessment (DPIA). Maria Cecília Gomes, a professor at Data Privacy Brasil, was among those selected and joined one of the discussion blocks.
Also in the Brazilian context, we highlight Draft Bill No. 1674/2021, proposed by Senator Carlos Portinho (PL/RJ), which creates the Certificate of Immunization and Sanitary Security (CSS). The Bill is inspired by the Digital Green Certificate created by the European Union, which has featured regularly in recent editions of the Bulletin, and aims to allow people who have been vaccinated, or who have tested negative for Covid-19 or other infectious diseases, to return to places with large concentrations of people. The Bill was unanimously approved by the Federal Senate and forwarded to the Chamber of Deputies, where it currently awaits consideration by the plenary.
Finally, in the international context, we note that the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) adopted a joint opinion on the European Commission’s Proposal for a Regulation establishing harmonized rules on artificial intelligence (AI). Although the Authorities view the risk-based approach contained in the Proposal positively, they consider that the concept of “risk to the fundamental rights of data subjects” should be aligned with the European Union’s data protection regulatory framework. Furthermore, they recommend that societal risks for groups of individuals also be assessed and mitigated, and highlight the need to clarify that current EU data protection legislation (GDPR, EUDPR and LED) applies to any processing of personal data falling within the scope of the draft AI Regulation.
We wish you a pleasant read!
Bruno Bioni, Mariana Rielli and Júlia Mendonça
Data Protection at Authorities
The public hearing, which will take place on July 8, 2021, from 10:00 am to 12:00 pm and from 2:00 pm to 6:00 pm, aims to discuss with society the proposed regulation on inspection by the National Data Protection Authority, which was open for public consultation on the Participa + Brasil platform until June 28. The rule establishes the inspection mechanism the Authority intends to adopt, providing for monitoring, guidance, prevention and sanctioning actions, following the logic of responsive regulation. The session will be open to everyone and, during it, interested parties will be able to speak, offering comments and suggestions on the subject under discussion. Those wishing to contribute to the regulatory process during the hearing must register in advance, by 11:59 pm on July 6, 2021, using the form available on the Authority’s website. The discussion will be broadcast on ANPD’s YouTube channel, and registration is not required for those who only want to watch.
The National Data Protection Authority (ANPD) made public the schedule of the technical meetings held within the scope of the Data Protection Impact Assessment (DPIA) regulation process. The matter is provided for in item 7 of ANPD’s Biannual Regulatory Agenda, pursuant to Ordinance No. 11, of January 27, 2021. The technical meetings took place publicly on June 21, 23 and 25, 2021, at 10 am, and were broadcast on ANPD’s YouTube channel. Previously, on May 25, ANPD had opened registration so that interested parties could contribute to the regulation process and participate in the technical meetings as speakers. A total of 543 applications were received within the deadline, and 12 names were selected, taking into account practical experience in data protection, experience with data protection impact assessments, and academic background or output. We highlight that Maria Cecília Gomes, a professor at Data Privacy Brasil, took part in one of the discussion blocks.
After a complaint by the Consumer Defense Institute (IDEC) and the Collective Defense Institute (IDC), the National Consumer Secretariat (SENACON) imposed an administrative fine of R$ 9.6 million on Itaú Consignado Bank SA for infringements of the Consumer Defense Code. Senacon found that the financial institution did not diligently exercise its duty of vigilance and oversight over the activities carried out by its banking correspondents, in view of the abuses committed in the offering and contracting of payroll-deductible loans, as well as the improper use of personal information. In setting the amount of the fine, the principles of reasonableness and proportionality, the seriousness and extent of the injury caused to consumers throughout the country, the benefit obtained and the economic condition of the company were all considered. In a statement, the bank said that it maintains a process of continuous improvement for the offering and contracting of payroll-deductible loans and that it will appeal the decision. This is the third fine imposed by Senacon in a month for the same reason; the others were applied to Banco Cetelem (R$ 4 million) and Banco Pan (R$ 8.8 million).
Video calls have become a fundamental means of holding work meetings and contacting family members, given the distancing and isolation measures resulting from the COVID-19 pandemic. Accordingly, the Data Protection Authority of Argentina issued recommendations to be observed when participating in video calls: “(i) Read the privacy policies and terms of the applications or mobile platforms you will use. Companies must ask for your consent to process your personal data, which must be given in writing or by similar means, in a clear and simple manner; (ii) On some occasions, when the product is ‘free’, companies may use personal data for other purposes, which may pose a risk to the rights of data subjects; companies must therefore inform in advance what those purposes and their consequences are, who the recipients are and, if the data are kept in a database, the name and contact details of the person responsible, so that rights, such as the right to object, can be effectively exercised; (iii) If a video call is recorded, this must be communicated in advance, explicitly and specifically, together with the intended use of these images; (iv) We recommend using platforms that allow you to set a password for rooms or calls, as well as an identification (ID) held only by those invited to participate”.
The Spanish Data Protection Agency (AEPD) has published its new guide “Risk management and impact assessment in the processing of personal data”, a document that incorporates the experience accumulated in applying risk management in the field of data protection since the General Data Protection Regulation began to apply, and adds the interpretations of the AEPD, the European Data Protection Board and the EDPS. The document, addressed to controllers, processors and data protection officers, offers a unified view of risk management and data protection impact assessments, in addition to facilitating the integration of risk management into entities’ general management and governance processes. The GDPR establishes that organizations that process personal data must carry out risk management in order to establish the measures necessary to guarantee people’s rights and freedoms. Furthermore, in cases where the processing involves a high risk for data protection, the regulation requires these organizations to carry out a Data Protection Impact Assessment, including to mitigate those risks. The guide is divided into three sections: the first contains a description of the fundamentals of risk management for rights and freedoms; the second includes a basic methodological development for the application of risk management; and the last focuses on cases in which an assessment is necessary, with the guidelines needed to carry it out.
The Federal Trade Commission (FTC) has finalized a settlement that will require Flo Health Inc., the company behind a fertility app, to obtain users’ consent before sharing their personal health information with third parties, in addition to imposing a review of its privacy practices. In the complaint, first filed in January, the FTC alleged that, despite promising to keep user data private, the company shared sensitive health data of millions of users of its Flo Period & Ovulation Tracker app with marketing and analytics companies, including Facebook and Google. Under the settlement, Flo Health must notify affected users of the disclosure of their health information and instruct any third party that received such data to delete it.
European Data Protection Supervisor (EDPS)
EDPB-EDPS Joint Opinion 5/2021 on the proposal for a Regulation of the European Parliament and of the Council laying down harmonized rules on artificial intelligence (Artificial Intelligence Act)
The European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) adopted a joint opinion on the European Commission’s Proposal for a Regulation establishing harmonized rules on artificial intelligence (AI). Although the EDPB and the EDPS welcome the risk-based approach contained in the Proposal, they consider that the concept of “risk to the fundamental rights of data subjects” should be aligned with the European Union’s data protection regulatory framework. In addition, the Authorities recommend that societal risks for groups of individuals also be assessed and mitigated, and consider that compliance with legal obligations under EU law, including on the protection of personal data, should be a precondition for entry into the European market. Finally, the Authorities highlight the need to clarify that current EU data protection legislation (GDPR, EUDPR and LED) applies to any processing of personal data falling within the scope of the draft AI Regulation.
Personal data can now flow freely from the European Union to the United Kingdom, where they benefit from a level of protection essentially equivalent to that guaranteed by European Union law. The adequacy decisions issued by the European Commission, both in relation to the General Data Protection Regulation and in relation to the Law Enforcement Directive, also facilitate the correct implementation of the EU-UK Trade and Cooperation Agreement, which provides for the exchange of personal information, for example, for cooperation in judicial matters. Finally, it should be noted that both adequacy decisions include strong safeguards in the event of future divergence, such as a ‘sunset clause’, which limits their duration to four years.
The French Authority (CNIL) carried out several investigations between 2018 and 2021 into the company BRICO PRIVÉ, which runs a sales website (bricoprive.com) dedicated to DIY, gardening and household items. The company operates in France and three other European countries (Spain, Italy and Portugal). During the investigations, the CNIL found several shortcomings in the protection of customers’ personal data, verifying non-compliance with various obligations provided for in the French Postal and Electronic Communications Code (CPCE) and the GDPR. In view of this, the Authority imposed a fine of 500,000 euros and ordered the company to bring its processing into compliance with Article L.34-5 of the CPCE and Article 5(1)(e) of the GDPR within 3 months of notification of the decision, subject to a penalty of 500 euros per day of delay.
The first meeting of the technical panel on the protection of children’s rights in the context of social networks and digital products was held at the Ministry of Justice of Italy, with a working group composed of three entities: the Italian Data Protection Authority, the Italian Child and Adolescent Authority and the Italian Communications Guarantee Authority. On the occasion, Pasquale Stanzione, president of the Data Protection Authority, noted in his speech that “Protecting minors online and on social networks is a primary objective, which should unite institutions, families and the school environment. The Italian Authority is always committed to promoting awareness of the use of new technologies by minors, as well as establishing rules for their protection”. In turn, Carla Garlatti, a member of the Child and Adolescent Authority, highlighted that “The constitution of the working group is positive and goes in the direction requested by the Authority that I represent. This is because the synergy between the Ministry of Justice and the three Authorities, each with specific competences, can represent a precious opportunity to find regulatory solutions regarding the verification of the age of minors on social networks, regulating the exploitation of their image and effectively safeguarding their digital participation”.
The Italian Authority for the Protection of Personal Data has expressed a favorable opinion on the use of the “IO App” for issuing the Green Certificates previously approved by the Italian Government. PagoPA, the company responsible for developing and managing the IO App, after introducing measures to resolve the problems identified by the Authority regarding user privacy, further modified the application to also make available the “Covid-19 Green Certification”, in the version indicated by the Ministry of Health. According to the Authority, users will now be notified about this new feature on their first login and will have the option to disable it. To further protect the data of the app’s more than 11 million users, the Authority also asked the company to retain the data related to use of the Green Certificate that is transmitted to Mixpanel (a data analytics company) for a limited period, no more than ten days after collection, and then delete it immediately. Finally, it should be noted that the block on data already collected by the US company before the Authority’s intervention will remain in place until the end of the investigation.
The National Authority for the Protection of Personal Data of Peru (ANPD), within the framework of “Policy 35” of the national agreement, which promotes the use of information technologies, implemented a virtual platform for registering personal data banks in the country’s National Data Protection Registry (RNPDP). The registration of personal databases, in addition to being an obligation under the Peruvian Personal Data Protection Law, is an important step for the sector, as it will allow better organization and supervision of data use, with the adoption of the measures necessary for its protection. The new platform, designed by the General Directorate of Information Technologies of Minjusdh, will allow entities that carry out processing operations to correctly complete the registration request, since it indicates which fields are mandatory, streamlining the procedure and avoiding the need to travel to the physical headquarters of the Ministry of Justice and Human Rights. According to the Authority, improvements will be progressively implemented to virtualize all RNPDP procedures.
In a blogpost for the ICO, the UK Information Commissioner, Elizabeth Denham, highlighted that facial recognition technology brings benefits that can make aspects of our lives easier, more efficient and more secure, allowing us, for example, to unlock our cell phones, open a bank account online or go through passport control. However, as the Commissioner pondered, when the technology and its algorithms are used to scan people’s faces in real time and in more public contexts, the risks to people’s privacy are high. In this regard, she pointed out that she is concerned about the potential for live facial recognition (LFR) technology to be used inappropriately, excessively or even recklessly. That is because when sensitive personal data is collected on a large scale, without people’s knowledge, choice or control, the impacts can be significant. According to the Commissioner, LFR and its algorithms can “automatically identify who you are and infer sensitive details about you, and can be used to create an instant profile and serve personalized ads”, or even compare your image with those of “known shoplifters as you do your weekly shopping”.
She then asserted that “It is not my role to endorse or ban a technology but, while this technology is under development and not widely deployed, we have the opportunity to ensure that it does not expand without due thought to data protection”. In light of this, the ICO has published an opinion on the use of LFR in public places by private companies and public organizations, explaining how data protection and the privacy of individuals must be at the heart of any decision to deploy LFR.
The Information Commissioner’s Office (ICO) has fined Papa John’s (GB) Limited £10,000 for sending 168,022 nuisance marketing messages to its customers without the valid consent required by law. The ICO notes that it received about 15 complaints from Papa John’s customers about the unwanted marketing they were receiving via text message and email, highlighting the distress and annoyance the messages were causing. The ICO’s subsequent investigation found that between October 1, 2019 and April 30, 2020, Papa John’s sent more than 210,000 marketing messages, of which 168,022 were received. The investigation also found that the company was relying on the “soft opt-in” exemption, which allows organizations to send e-marketing messages to customers whose details were obtained in connection with similar services, provided a simple way to refuse or opt out of the messages is offered. The Authority concluded that Papa John’s could not rely on this type of authorization for customers who placed an order over the phone, as they were given no option to opt out of the contact, nor did they receive a privacy notice, which resulted in the fine.
The Authority of Uruguay has published a set of recommendations, in accordance with the country’s personal data protection regulations, for controllers to adopt when using messaging applications. The recommendations seek to clarify concepts associated with the information managed by these applications, the consequences of their use and the ways in which people can protect their privacy and data. The document also sets out recommendations for controllers who wish to communicate with third parties through this type of application, indicating that such communication must take place through channels that minimize potential negative impacts on data subjects. Finally, for those who control and design these applications, recommendations are likewise listed, indicating that privacy-by-design and privacy-by-default measures should be incorporated, with due assessment of the application’s role in relation to the data collected and processed, among other points.
Data Privacy at Universities
FENG, Fei; WANG, Xia; CHEN, Tianxiang
According to the text, researchers studying data collection, analysis and use in the era of big data and algorithms are paying increasing attention to the use of inferred data. Information inferred by an algorithm has distinct “personality and property interests”, which challenges existing theories of personal information and privacy. However, according to the text, a complete method of legal regulation for this information does not yet exist in China. The article focuses on how to recognize the nature of inferred information and how to carry out appropriate legal assessment and regulation to better protect the legitimate rights and interests of the parties involved in China. For the authors, based on China’s social needs and experience of judicial practice, the theory of privacy as “contextual integrity” developed by Professor Helen Nissenbaum can be used to assess whether inferred information gives rise to any kind of violation, and they predict that China is likely to adopt the US regulatory model.
STURDEE, Miriam; WIMASALIRI, Bhagya; THORNTON, Lauren; PATIL, Sameer
Concepts related to cybersecurity can be difficult to explain or summarize. The complexity associated with these concepts is aggravated by the impact of rapid technological changes and the contextual nature of the meaning attributed to the various themes. Given this, and considering that visual images are often used to articulate and explain concepts, the authors conducted a study in which participants were asked to visually outline their understanding of cybersecurity concepts. Based on the analysis of these sketches and subsequent discussions with the participants, the authors advocate the use of sketches and visual aids as a tool for cybersecurity research. According to the text, the collection of sketches and icons can serve as the seed for a visual vocabulary of interfaces and communication related to cybersecurity.
MARTINS, Guilherme Magalhães; BASAN, Arthur Pinheiro; JÚNIOR, José Luiz de Moura Faleiros.
The Covid-19 pandemic marked the year 2020 in history, prompting cross-cutting reflections in legal science and instigating debates about the effectiveness of rights in peculiar times, in which the question of their effectiveness goes beyond mere dogmatic discussion; indeed, the realization of rights comes to be thought of in terms of the fulfillment of fundamental duties. It is here that the problem addressed in the study lies: how can and should the protection of personal data, considered a fundamental right and so prominent in legislative discussions in the second decade of the 21st century, be implemented in times of social isolation and containment of viral spread, in order to preserve a web of other human rights, such as life, liberty and public safety?
Data Protection in the Brazilian Legislative
Bill No. 1674/2021, proposed by Senator Carlos Portinho (PL/RJ), proposes the creation of the Certificate of Immunization and Sanitary Security (CSS). The Bill is inspired by the Digital Green Certificate drawn up by the European Union and aims to allow people who have been vaccinated, or who have tested negative for Covid-19 or other infectious diseases, to return to places with large concentrations of people. The document will be implemented through a digital platform on which the holder can issue different types of certificates: (i) National Vaccination Certificate (CNV); (ii) International Vaccination and Testing Certificate (CVIT); (iii) Testing Certificate (CT); (iv) Infectious-Contagious Disease Recovery Certificate (CRDI). The Bill was unanimously approved by the Federal Senate and forwarded to the Chamber of Deputies, where it currently awaits consideration by the plenary.
Data Protection in the Brazilian Judiciary
This is an innominate appeal filed by the defendant in case No. 0753373-84.2020.8.07.0016. Based on Article 7 of the General Personal Data Protection Law – LGPD (Law No. 13,709/2018), the first-instance decision ordered the defendant to provide the data necessary to identify and locate those responsible for the bank account to which the author, by mistake, deposited the amount of R$ 900.00, in order to obtain authorization for the reversal of that amount. In its grounds for appeal, the defendant alleged: (i) lack of unlawful conduct implying compensation for damages; (ii) impossibility of penalizing the bank given the consumer’s exclusive fault; (iii) that it cannot be ordered to “do or refrain from doing something other than by virtue of law”; (iv) absence of liability for the damages mentioned in the initial complaint; (v) absence of tort, negligence or failure in the provision of services; (vi) that it does not bear the burden of reversing the amounts involved; (vii) absence of any irregularity implying an obligation to make reparation; and (viii) illegality and impossibility of fulfilling the obligation to act. Finally, it requested reversal of the decision so as to dismiss the claims contained in the complaint. However, the Third Appeal Panel of the Federal District’s Special Courts pointed out that, pursuant to Article 932, III, of the Code of Civil Procedure, it falls to the appellant to specifically challenge the grounds of the contested decision, which did not occur. According to the Panel, the appeal merely repeated arguments already raised in the defense, without challenging the reasons for the decision, in particular regarding the authorization provided for in Article 7 of the LGPD. In view of this, the appeal was not admitted.