Roschier Data Protection Update – Highlights of 2018 and what to expect in 2019
The UK is scheduled to leave the EU on 29 March 2019. However, considering recent developments, it may not reach a deal with the EU. To prepare for this “no-deal” scenario, organizations must consider how transfers of personal data to the UK can be justified. The General Data Protection Regulation (“GDPR”, 2016/679) currently applies in the UK and allows data transfers within the EU and the EEA. However, leaving the EU means that the UK will become a third country for data transfer purposes. For data transfers outside the EU and the EEA, a specific legal basis provided in the GDPR is needed.
One option would be for the Commission to issue an adequacy decision, stating that the level of personal data protection in the UK is adequate and data transfers there are generally allowed. However, the Commission has not seemed inclined to do so. The UK’s surveillance laws, in particular, may preclude such a decision.
If no solution is reached before the exit date, organizations transferring personal data must themselves ensure compliance with the GDPR when dealing with the UK. One option is to enter into so-called standard contractual clauses to enable such cross-border data transfers. Special attention should also be paid to privacy notices and records of processing of personal data, as they may need to be updated to reflect the new legal relationship between the EU and the UK.
The Court of Justice of the European Union (“CJEU”) is expected to issue a judgment this year in the Schrems v Facebook 2.0 case concerning international transfers of personal data.
In April 2018 the High Court of Ireland referred eleven questions to the CJEU for a preliminary ruling concerning the validity of the European Commission’s standard contractual clauses (“SCC”) for personal data transfers to processors in third countries (SCC decision, 2010/87/EU as amended). The SCC decision has become increasingly important since the CJEU invalidated the US Safe Harbor framework in 2015. However, the questions not only concern transfers of personal data from the EEA to the US but also raise the ultimate issue of whether the SCC decision itself violates rights under the Charter of Fundamental Rights of the European Union (Charter, 2000/C 364/01).
As in the case invalidating the Safe Harbor framework, the High Court referred the current case to the CJEU due to a complaint made by Maximilian Schrems to the Data Protection Commissioner of Ireland. Mr. Schrems had claimed that personal data transfers from Facebook Ireland Ltd to Facebook Inc. based on the SCC decision violate the Charter and the Data Protection Directive (the predecessor of the GDPR).
In the US, companies transferring personal data from the EEA in accordance with the SCC decision may be obliged to disclose the data to the US authorities for purposes of national security, law enforcement, and the conduct of foreign affairs. Therefore, many of the questions the High Court submitted to the CJEU concern the adequate level of protection and judicial remedies for data subjects in the US or in another third country. In addition, some of the questions posed to the CJEU put pressure on the US Privacy Shield framework.
SCCs are heavily relied upon in cross-border transfers of personal data across the Atlantic and around the globe. Depending on the outcome, the upcoming ruling may therefore have significant implications for international data transfers.
On 23 January 2019 the European Commission adopted its adequacy decision on Japan, allowing personal data to flow freely between the two economies. With its adoption, the decision created the world’s largest area of safe data flows.
The EU-Japan adequacy decision marks the first time that the EU and a third country have agreed on reciprocal recognition. Before the decision was adopted, Japan implemented additional safeguards to guarantee that data transferred from the EU would enjoy European-standard protection. These included a set of supplementary rules to bridge differences between the two data protection systems, assurances to the Commission regarding safeguards on access to data by Japanese public authorities for criminal law enforcement and national security purposes, and a complaint-handling mechanism for Europeans regarding access to their data by Japanese public authorities.
The adequacy decision will be systematically monitored to ensure that the grounds for reciprocal recognition remain in place. A first joint review will be carried out after two years to assess its functioning, with subsequent reviews taking place at least every four years.
On 21 January 2019 the data protection authority of France, CNIL, imposed a fine of EUR 50 million on Google under the GDPR. The fine reflected a lack of transparency, inadequate information and the absence of valid consent for ads personalization. This is the biggest GDPR fine yet to be issued by a European authority. For the most serious offences, the GDPR allows fines of up to EUR 20 million, or up to 4% of total worldwide annual turnover, whichever is higher.
According to CNIL, the amount and publicity of the fine are justified by the severity of the infringements, which concern essential principles of the GDPR: transparency, information and consent.
The Swedish Data Protection Authority (Datainspektionen) has sent Google a request for clarification as part of its investigation into whether Google’s collection of users’ location history complies with the GDPR. The investigation is based on an initial complaint by the Swedish Consumer Association (Sveriges Konsumenter) on behalf of a data subject who raised concerns that Google may store location data even when the user has specifically turned their mobile device’s location history off. Discussion has also arisen as to which country’s DPA is competent to handle cross-border investigations in cases where privacy violations affect individuals in more than one EU member state.
According to the initial complaint, Google uses deceptive designs, misleading information and repeated pressuring to manipulate users into allowing Google to constantly track their location. Moreover, by bundling consent from several different services, Google may not have validly obtained users’ consent for such collection and processing of location data. Google has so far reasoned that location history solely benefits users, as it enables “personalized maps, recommendations based on places you’ve visited, help finding your phone, real-time traffic updates about your commute, and more useful ads.”
The Swedish DPA invited Google to provide its reply and the requested information and documents by 15 February 2019 at the latest.
In 2018 the main focus was undoubtedly on the General Data Protection Regulation (“GDPR”, 2016/679), which became applicable on 25 May 2018 and resulted in new legislation being enacted or amended in the EU and in Member States.
However, regulation in some sectors is still in progress. The ePrivacy Regulation, for example, was supposed to enter into force together with the GDPR in May 2018, but it remains under negotiation in the Council of the European Union.
The free flow of non-personal data was also addressed in 2018, as Regulation 2018/1807, which sets rules for the use of non-personal data, entered into force on 18 December 2018. The regulation will be directly applicable in all Member States from 18 June 2019. Non-personal data means any data not relating to an identified or identifiable natural person, as opposed to personal data as defined in the GDPR.
The importance of data protection is also highlighted in the development of new intelligence technologies such as artificial intelligence. On 23 October 2018, the International Conference of Data Protection & Privacy Commissioners (“ICDPPC”) published a Declaration on Ethics and Data Protection in Artificial Intelligence, as well as a Resolution on E-learning Platforms.
The declaration emphasizes the importance of personal data protection, privacy and self-determination. Artificial intelligence systems should be designed to respect data protection and privacy rights, such as the right to information, the right of access, the right to object to processing and the right to erasure, by applying the principles of privacy by default and privacy by design. In addition, the declaration addresses other legal challenges AI systems may face. For example, the use of data in AI may lead to unlawful bias or discrimination. For this reason, AI and machine learning technologies should be designed, developed, and used with respect for fundamental human rights and in accordance with the fairness principle.
After the GDPR became applicable in May 2018, the Court of Justice of the European Union (“CJEU”) gave two preliminary rulings on data protection. In these two rulings, the court examined the concept of “joint controllership,” a concept which also existed under the Data Protection Directive but was only clearly codified in the GDPR.
The first decision, C-210/16 Wirtschaftsakademie Schleswig-Holstein, rendered 5 June 2018, concerned the role of fan page administrators on social networks. According to the court, the concept of “controller” must be interpreted as also encompassing the administrator of a fan page hosted on a social network. In the case, the fan page administrator Wirtschaftsakademie, a company offering educational services, did not inform the visitors of its page that Facebook was using cookies to collect their personal data or that the data was processed using the Facebook Insights application. Through the use of this application, administrators can obtain statistical information about visitors to the fan page. The CJEU found that because the creation of a fan page on Facebook involves defining the criteria for drawing up the statistics, the administrator of a Facebook fan page contributes to the processing of the personal data of visitors to its page. Therefore, the administrator is jointly responsible with Facebook for this processing.
In the second judgment, C-25/17 Tietosuojavaltuutettu, issued on 10 July 2018, the CJEU confirmed previous case law interpreting the notion of joint controllership. The concept of controllership was examined in the context of data collected and processed by the members of a religious community in the course of their door-to-door preaching. The data collected in their notes included names and addresses as well as information about their religious beliefs and family circumstances. The data was collected without the knowledge or consent of the persons concerned. The CJEU found that a religious community is a controller – jointly with members of its congregations – for processing of personal data collected in the course of their door-to-door preaching.
Both cases were assessed under the Data Protection Directive (95/46) but the court’s findings may be equally applicable to processing carried out under the GDPR. As such, the cases may serve to further clarify the concept of joint controllership under the GDPR.
Criminal offences that are not particularly serious may justify access to personal data retained by electronic communications services providers if the access does not constitute a serious infringement of privacy, according to the CJEU in case C-207/16, published on 2 October 2018. In that case, while investigating the robbery of a wallet and a mobile telephone, Spanish police requested access to telephone numbers activated during a twelve-day period after the robbery and to personal data relating to the identity of the owners or users of the SIM cards activated with the stolen telephone’s code, such as their surnames, forenames and addresses.
The CJEU found that the requested data would not allow precise conclusions to be drawn concerning the persons’ private lives. It only enables the SIM card or cards activated with the stolen mobile telephone to be linked with the identity of the owners of those SIM cards. The data as such does not reveal the date, time, duration or recipients of any communications, nor their location or frequency. Therefore, access to the requested data is not a serious interference with fundamental rights. Public authorities may access data such as surnames, forenames and addresses for the purpose of identifying the owners of SIM cards activated with a stolen mobile telephone, because the interference with their fundamental rights is not serious enough to restrict such access to the fight against serious crime alone.
The effects of the GDPR were also apparent in the Nordics.
Sweden enacted the Data Protection Act (2018:218) to supplement the GDPR in 2018.
In Finland, a new Data Protection Act (1050/2018) came into force on 1 January 2019 after the Parliament of Finland approved the government proposal for the Act (HE 9/2018 vp) in November. The Data Protection Act complements and specifies the provisions of the GDPR and serves as the general personal data protection law in Finland. The new Act includes several provisions based on national discretion, including provisions on health-related data, personal identity numbers, the processing of children’s personal data, and criminal sanctions.
The government of Finland has also proposed amending the Act on the Protection of Privacy in Working Life (97/2018) to protect the personal data of employees. The Act would provide rules on the processing of employees’ personal data, on the processing of information concerning drug use, and on camera surveillance in the workplace. The proposal is currently being negotiated in the Parliament of Finland.
In addition, the Constitution of Finland (731/1999) was amended in October to enable the enactment of civilian intelligence legislation. The Constitution was amended to allow the secrecy of confidential communications to be limited by an ordinary act. The amendment makes it possible to enact separate intelligence legislation before the end of the current government term (spring 2019). The Parliament of Finland is currently discussing the government proposal (202/2017) for the legislation.
The Finnish Supreme Administrative Court considered a data subject’s right to be forgotten in its judgment KHO 2018:112, rendered on 17 August 2018. The case concerned removing Google search results that led to websites containing sensitive personal information about a private person. The applicant had requested that the search results be removed because they violated his right to privacy; the results contained information on his state of health and on his conviction for murder with diminished responsibility. In 2015, the Finnish Data Protection Ombudsman had ordered Google to remove two URL search results from all Google Search results returned when searching the applicant’s name. The Supreme Administrative Court then considered whether the two search results were unnecessary in relation to the purpose of the processing of personal data and, on the other hand, whether Google could be ordered to erase them.
The Supreme Administrative Court held that freedom of expression cannot prevail over a data subject’s right to privacy. It thus adopted the same view as the CJEU had previously established in its ruling C-131/12: that the rights of a data subject override, as a rule, both the rights of the search engine operator and the interest of the general public in accessing information through a search on the data subject’s name. However, the court noted that a fair balance needs to be found between these rights and interests, which can depend on the nature of the information in question. Moreover, the interest of the general public may vary according to the data subject’s role in public life.
Even though a person who has committed a serious crime is, as a rule, in a public role, the Supreme Administrative Court considered that the information about the applicant’s health and mental state was sensitive personal data within the meaning of the Personal Data Act. Moreover, the information remained available through searches other than those using the data subject’s name. Therefore, the Supreme Administrative Court held that even though the data subject had been in a rather public role in this particular case, there were no grounds to make this information available to the public via name-based searches.