Data protection report
A productive year: Ireland’s Data Protection Commission

Due to the co-location of several US technology giants in Ireland, the Irish Data Protection Commission (DPC) is, in certain circumstances, the lead supervisory authority for ‘big tech’ in Europe. eolas provides an overview of its performance in 2020.

At its core, the DPC, led by Commissioner for Data Protection Helen Dixon, is the national supervisory authority in Ireland responsible for “upholding the fundamental right of EU persons to have their personal data protected”. The role of Ireland’s DPC is multifaceted, balancing complaint processing and resolution with systemic supervision and investigation. 2020 was a year of significant progress and substantial output for the DPC. Tasked with supervising the application of the General Data Protection Regulation (GDPR) (Regulation (EU)
2016/679), the DPC also supervises several additional legal frameworks including the Law Enforcement Directive (LED) and Data Protection Act 2018, as well as the e-Privacy Regulations. The DPC’s work ranged from the culmination of legal proceedings it initiated in the High Court in 2016 relating to EU-to-US data transfers, to the conclusion of 6,000 complaints and liaisons with the Government on legislative initiatives. Through GDPR, the EU sought to establish a One-Stop-Shop (OSS)
mechanism to streamline how multinationals which operate in more than one EU member state interact with data protection authorities. These companies are subject to oversight from a single supervisory authority in the member state where they have a ‘main establishment’. As such, Ireland’s DPC often operates as the lead supervisory authority for investigations. In 2020, it received 354 cross-border complaints through the OSS mechanism, lodged with other EU data protection authorities. 2020 was the second full year of the
application of the General Data Protection Regulation (GDPR), which comprehensively regulates every sector. The DPC emphasises that the GDPR is an ongoing project, with many areas that “remain for exploration to the benefit of organisations and data subjects alike, including data codes of conduct and certification”.

Core functions of the Data Protection Commission:

• Driving improved compliance with data protection legislation.
• Conducting inquiries and investigations into potential infringements of data protection legislation.
• Handling complaints from individuals relating to potential infringements of data protection rights.
• Promoting awareness of the risks, rules, safeguards and rights required in the processing of personal data.
• Cooperating with data protection authorities in other EU member states on cross-border processing.

However, the DPC is not immune from criticism. In the last year, some of the most prominent criticisms have related to the perceived leniency of the fine handed down to Twitter International Company (TIC), the pace of its decision-making process and the fact that international data transfers have not been immediately blocked. Decisions made by the Irish Data Protection Commission are the first that have progressed through the Article 65 dispute resolution mechanism. Previously, Data Protection Commissioner Helen Dixon has criticised the fact that very little had been written about the first Article 65 decisions and “what has been clarified in terms of documenting a breach and how it must be distinguished from an incident tracking”. Responding to the criticism that the fine issued to Twitter International Company was insufficient, Dixon asserts that the objective of the inquiry was to reasonably prove that there were infringements and apply a proportionate fine.

Large-scale inquiries

Through statutory inquiries, the DPC determines whether infringements of data protection legislation have occurred and, where they have, decides on the corrective powers to be exercised. In 2020, the DPC issued detailed decisions in respect of its inquiries. At the close of 2020, the DPC was pursuing 83 statutory inquiries, 27 of which were cross-border inquiries. Organisations under investigation range from Apple Distribution International and Facebook Inc. in cross-border inquiries to An Garda Síochána, Bank of Ireland and the Catholic Church in domestic inquiries.
Inquiry decisions, 2020

Organisation | Decision date
Kerry County Council | 25 March 2020
Waterford City and County Council | 21 October 2020
Tusla Child and Family Agency (3 breaches) | 07 April 2020
Tusla Child and Family Agency (1 breach) | 21 May 2020
Tusla Child and Family Agency (71 breaches) | 12 August 2020
Health Service Executive (HSE South) | 18 August 2020
Health Service Executive (Our Lady of Lourdes Hospital) | 29 September 2020
Ryanair | 11 November 2020
Twitter International Company | 09 December 2020
Groupon | 16 December 2020
University College Dublin | 17 December 2020

Source: Data Protection Commission Annual Report, 2020
Data protection breach notifications by category

Category | Total
Unauthorised disclosure | 5,837
Hacking | 146
Malware | 19
Phishing | 74
Ransomware/Denial of service | 32
Software development vulnerability | 5
Device lost or stolen (encrypted) | 19
Device lost or stolen (unencrypted) | 29
Paper lost or stolen | 275
E-waste | 1
Inappropriate disposal of paper | 21
System misconfiguration | 40
Unauthorised access | 146
Unintended online publication | 61
Other | 78
Total | 6,783

Source: Data Protection Commission Annual Report, 2020
A final decision issued through the Article 65 procedure in the Twitter International Company case represented the pinnacle of the large-scale inquiries concluded in 2020. The determination of this case represented the first substantial fine issued by the DPC. In this high-profile inquiry, which commenced in January 2019, the DPC investigated Twitter International Company’s compliance with its obligations under the GDPR. In December 2020, the DPC issued its decision and found that TIC had infringed Article 33 by failing to notify the DPC without delay about a personal data breach which arose from a bug in the Twitter mobile app for Android. The DPC’s draft decision was
submitted to other Concerned Supervisory Authorities (CSAs) via the Article 60 mechanism of the GDPR in May 2020. It was the first draft decision to traverse the Article 65 dispute resolution process, as well as being the first in a ‘big tech’ case on which all EU supervisory authorities were consulted as CSAs. The European Data Protection Board adopted the decision in November 2020 and the DPC issued its final decision the following month, imposing an administrative fine of €450,000. The DPC asserted that this fine was an “effective, proportionate and dissuasive measure”.
Complaints

Alongside large-scale inquiries, routine work undertaken by the DPC involves processing thousands of complaints made to the office by organisations and individuals. In 2020, 4,660 GDPR complaints and 59 Data Protection Act complaints were made against organisations by individuals, with 4,476 complaints (including those received before 2020) resolved. Over 60 per cent of complaints, some 2,186, received by the DPC in 2020 were resolved within the calendar year. These complaints ranged from securing access to personal data to the unauthorised and unnecessary disclosure of personal data to third parties. In order to trigger the DPC’s complaint processing function, a complaint must emanate from:

• an individual in relation to the processing of their own personal data;
• a legally authorised person or entity acting on behalf of an individual; or
• an advocacy group which meets the GDPR, LED and Data Protection Act 2018 requirements to act on behalf of one or more individuals.

However, an inadvertent trend has been the increase in complaints received that have “little or nothing to do with data protection”. Likewise, a phenomenon whereby organisations and individuals have attempted to “misuse the GDPR to obfuscate or pursue other agendas” continued in 2020.
Breaches

As a result of the mandatory requirement to notify the DPC of data protection breaches, the volume of notifications received by the DPC remains high. In 2020, the DPC received 6,783 data breach notifications under Article 33 of the GDPR, 2 per cent of which did not meet the criteria of a personal data breach. The 6,673 valid data protection breach notifications constitute a 10 per cent increase on 2019. By some margin, the most commonly notified category of data breach under the GDPR in 2020 was ‘unauthorised disclosure’ (86 per cent), with the majority of the total data breaches occurring in the private sector. Additionally, the DPC received 70 valid data breach notifications under the e-Privacy Regulations and 25 breach notifications relating to the LED.
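The percentages quoted above follow directly from the notification figures in the category table; a quick arithmetic check, using only the totals reproduced in this report:

```python
# Figures as reproduced from the DPC Annual Report tables above.
total_notified = 6_783   # all Article 33 notifications received in 2020
valid = 6_673            # notifications meeting the personal-data-breach criteria
unauthorised_disclosure = 5_837

# Share of notifications that did not qualify as personal data breaches.
invalid_share = (total_notified - valid) / total_notified
# Share of all notifications classed as 'unauthorised disclosure'.
disclosure_share = unauthorised_disclosure / total_notified

print(f"did not meet criteria: {invalid_share:.0%}")      # ~2 per cent, as stated
print(f"unauthorised disclosure: {disclosure_share:.0%}")  # 86 per cent, as stated
```

Both rounded shares match the figures cited in the report (110 of 6,783 notifications, roughly 2 per cent, were invalid; 5,837 of 6,783, 86 per cent, were unauthorised disclosures).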
Other projects

Other specific projects undertaken by the DPC in 2020 include Children Front and Centre: Fundamentals for a Child-Oriented Approach to Data Processing, a comprehensive guide relating to the protections required for processing children’s data under the GDPR. A ‘regulatory sweep’ of the most frequently visited websites in Ireland was also completed in order to establish the extent of compliance with e-privacy regulations in Ireland. The results of this were described as “disappointing” and the DPC has indicated that its cookies investigations and enforcement action will continue throughout 2021.

Covid-19

The DPC identifies the Covid-19 pandemic as a moment which exemplifies the value of the GDPR. In rolling out its public health response, government was required to consult with the DPC on any public health initiative with personal data processing implications. Such initiatives included the Return to Work Safely Protocol and the Covid-19 Passenger Locator Form. GDPR provided the parameters within which to ensure that these initiatives were proportionate and that the rights of individuals were protected.

The most prominent example of this was the consultative engagement between the DPC and the Government on the Covid Tracker App. Beginning in March 2020, the DPC emphasised the data protection challenges of developing a national contact tracing app. The supervisory authority later provided a Data Protection Impact Assessment for the Covid Tracker App, ensuring that all risk was adequately assessed prior to its launch. After the app’s launch, this engagement with the Department of Health continued in relation to cross-border interoperability.

GDPR data-breaches, 2020

Sector | Number of data-breach notifications
Private | 4,097
Public | 2,559
Voluntary | 16
Charity | 1
Total | 6,673

Source: Data Protection Commission Annual Report, 2020

Data transfers

From a litigation perspective, 2020 was a demanding year for the DPC. Legal proceedings initiated in the Irish High Court by the DPC in 2016 were concluded by a July 2020 judgement of the Court of Justice of the European Union (CJEU). On the use of Standard Contractual Clauses in underpinning personal data transfers from the EU to the US, the CJEU clarified that, regardless of the legal mechanism employed, the personal data of EU citizens must have equivalent protections in the US as are guaranteed in the EU. Following this judgement, the DPC initiated an investigation into Facebook transfers to the US.
PROFILE
Helen Dixon

Appointed as Data Protection Commissioner for Ireland in September 2014, Helen Dixon is responsible for upholding the rights of individuals regarding how data about them is used. This role requires regulation of a large number of US internet multinationals with European bases in Ireland. Previously, Dixon led regulatory enforcement of compliance with the filing provisions of the Companies Acts at the Irish Registrar of Companies. She has also held senior roles in the former Department of Jobs, Enterprise and Innovation, working on economic migration policy and on science, technology and innovation policy. She spent the first 10 years of her career in the IT industry. The Data Protection Commissioner holds postgraduate qualifications in European Economic and Public Affairs, Governance, Computer Science, Official Statistics for Policy Evaluation, and Judicial Skills and Decision Making.
External or internal cyberthreats to government and public sector data?

Advertorial

Last October, ESET researchers uncovered an Advanced Persistent Threat group that has been stealing sensitive documents from several governments in Eastern Europe and the Balkans since 2011. Named XDSpy by ESET, the APT group had gone largely undetected for nine years, during which it compromised many government agencies and private companies, using spear phishing emails to reach its targets. The emails displayed a slight variance, as some contained an attachment, while others contained a link to a malicious file.

Before that, in August, several Canadian government services were disabled following a series of cyberattacks. On 15 August, the Treasury Board Secretariat announced that approximately 11,000 online government services accounts, originating from the Government of Canada Key service and Canada Revenue Agency accounts, had been victims of hacking attempts. A year before that, over 752,000 birth certificate applications were exposed online by an unnamed company that enables people to obtain copies of birth and death records from state governments in the United States. Over a 12-month span between June 2018 and May 2019, a total of 2.3 billion files were discovered exposed online due to misconfigured or non-secured file storage and sharing technologies.

These are just some examples of the various threats to state organisations and infrastructure that we repeatedly read about. Some are external, like hackers, sometimes state-backed, others organised crime; some originate from within, either through malicious activity or simply through negligence or bad design. Some headlines from Ireland from a while ago included: “Personal data of 380,000 welfare recipients stolen” (The Irish Times, 2008), “75,000 customers’ bank details on stolen Bord Gáis laptop” (Siliconrepublic, 2009) and “Lost Phoenix Ireland data tape had details of 62,000 customers” (Databreaches, 2011). But that was in the pre-GDPR era, right, when things weren’t regulated yet? Well, just over a month ago, we could read about “Thousands have highly personal details exposed in Covid-19 vaccine data breach” (Extra, 2021), and a year earlier we could read that “Ireland had fourth most data breach notifications in the EU last year” (Extra, 2020).

So what is it with data security and why is it such a headache? Unlike in the pre-digital era, when stealing a database would entail a truckload of paper having to be hauled around, today huge databases with thousands of entries can be attached to an email or put on a flash drive. The key questions are therefore how access is regulated and how secure the storage is: who can access the data and what can they do with it; can it be accessed by unauthorised people and can they copy or manipulate it in any way?

In 2016, the UK’s communications regulator Ofcom was investigating the biggest data breach in its history. The incident was caused internally: a former employee had been surreptitiously gathering data over a six-year period. The breach only came to light after the individual offered the information to his new employer, a major TV broadcaster. Insider threats can be both intentional and unintentional. They are influenced by technical, behavioural and organisational issues, meaning that organisations need to consider solutions that address each of these key areas of weakness to ensure they have responses to most scenarios.

Three years ago, a survey for Accenture revealed that almost one in five (18 per cent) employees in the healthcare industry in the United States and Canada said that they would be willing to give an unauthorised outsider access to confidential medical data about patients for financial gain. They would expect no more than $500 to $1,000 for their login credentials or for deliberately installing tracking software or downloading the data to a portable drive. Nor is this way of compromising patient data a purely hypothetical phenomenon: roughly one in four (24 per cent) respondents said that they were actually aware of a co-worker who had made a profit by providing a third party with access to such information.

All this reveals that, whether initiated externally or coming from within the organisation, the “human factor” is the weakest link in data protection. In fact, a 2015 paper by Nuix reported that the overwhelming majority of respondents (93 per cent) consider “human behaviour” to be the number one threat to their security. So what can be done about it?

1. Know about the threats
The first step in preventing cyberthreats is knowing about them. Following general cybersecurity news and browsing the headlines of IT topics helps to build a feel for the sorts of threats currently trending, making them easier to recognise when encountered and increasing the chance of avoiding them.

2. Regulation and access control
Security clearance for access to data should be strictly regulated and enforced in every organisation, so that unauthorised people do not even gain access to sensitive data, while authorised ones are accountable and tracked for any transfer, removal or manipulation of data.

3. Employee education
Employees need to be regularly educated and briefed on any current external types of threats, particularly ones that initiate the breach through social engineering, which takes advantage of the employees themselves by asking them to open a file attachment or click on a compromised link in order to gain access.

4. Encryption
Well-encrypted data is practically useless to attackers, even when successfully exfiltrated from an organisation, as current encryption tools are all but impossible for unauthorised prying eyes to break. That is why the GDPR mentions encryption as one of the recommended, though not mandatory, tools for mitigating the potential damage caused by data breaches.

5. Secure backup and storage
Secure backup of sensitive data is key to long-term data breach prevention. Whether on-site or cloud storage is used, strict protocols on content, control and accessibility, as well as the accountability of the storage provider, need to be observed in order to ensure maximum data security and data loss prevention.

6. System patching and multi-factor authentication
Administrators should also endeavour to ensure their systems are regularly updated to prevent exploitation of known vulnerabilities, and employ multi-factor authentication as another valuable layer in the prevention of account hijacking.

Threats to the cybersecurity of government and public sector institutions are therefore many and varied; they can come from outside or inside and can cause various degrees of difficulty and disruption. While countering them may appear intimidating at first, most can be avoided with proper planning and the introduction of even some relatively simple measures. Security professionals have spent years coming up with optimal prevention measures, solutions and best practices for most threat scenarios, so the know-how and the hardware are there. With a bit of effort to put it all together, there is really no obstacle to properly benefiting from and enjoying safer technology.

T: +353 53 914 6600 E: info@eset.ie W: www.eset.ie
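The multi-factor authentication recommended above usually means a time-based one-time password (TOTP, RFC 6238), the algorithm behind most authenticator apps. The sketch below is purely illustrative of how such codes are derived from a shared secret and the clock; it is not ESET’s or any vendor’s implementation:

```python
import hmac
import struct
import time

def totp(secret: bytes, at=None, digits: int = 6, step: int = 30) -> str:
    """Time-based one-time password (RFC 6238, HMAC-SHA1 variant)."""
    # Count 30-second steps since the Unix epoch.
    counter = int((time.time() if at is None else at) // step)
    mac = hmac.new(secret, struct.pack(">Q", counter), "sha1").digest()
    offset = mac[-1] & 0x0F  # dynamic truncation, per RFC 4226
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 Appendix B test vector: ASCII secret "12345678901234567890" at t=59s.
print(totp(b"12345678901234567890", at=59, digits=8))  # 94287082
```

Because the code changes every 30 seconds and is derived from a secret the attacker does not hold, a stolen password alone is no longer enough to hijack the account.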
European Data Protection Board makes first Article 65 decision The European Data Protection Board (EDPB) reached its first binding decision on an Article 65 dispute in November 2020. The dispute concerns the Irish Data Protection Commission’s (DPC) decision to fine Twitter over a five-year data breach, although some European supervisory authorities are likely to still be unhappy with the fine imposed. The EDPB announced that a vote to back a draft decision submitted by the DPC had passed by a two-thirds majority. The settlement relates to Twitter’s disclosure of a bug in its tweet protection feature in early 2019. The feature, designed to protect tweets from public viewing, was found to have a bug meaning that Android users who applied the feature may have had their data exposed to the public internet as far back as 2014. Since the breach lasted from 2014-2019, it fell under the EU’s General Data Protection
Regulation (GDPR), introduced in 2018, with Article 65 of the regulation relating to cross-border matters. With Twitter’s European headquarters situated in Dublin’s George’s Quay, the DPC became the lead supervisory authority (LSA) in the case, but its cross-border nature meant that the EDPB, which brings together the data commissioners of Europe to coordinate pan-EU regulatory activity, was brought in to adjudicate on the draft decision the DPC had taken. The EDPB process allows these data
commissioners and supervisory authorities to raise “relevant and reasoned” objections to draft decisions. In the summer of 2020, deputy commissioner of the DPC, Graham Doyle, stated that, following consultations with the concerned supervisory authorities (CSAs), the DPC had submitted the matter to the EDPB under Article 65, making this case the first referral of its kind. Despite hoping to have a decision on the case “early” in 2020, action was delayed by disagreements between the
DPC and other supervisory authorities. Eventually, agreement was reached two years after the investigation into Twitter began and the EDPB announced in November 2020 that it had “adopted by two-thirds majority of its members its first dispute resolution decision on the basis of Art. 65 GDPR”.
Explaining the process through which EU-wide data regulation is now performed, the EDPB said: “In May 2020, the Irish SA [the DPC] shared its draft decision with the CSAs in accordance with Art. 60 (3) GDPR. The CSAs then had four weeks to submit their relevant and reasoned objections (RROs). Among others, the CSAs issued RROs on the infringements of the GDPR identified by the LSA, the role of Twitter International Company as the (sole) data controller, and the quantification of the proposed fine.

“As the LSA rejected the objections and/or considered they were not ‘relevant and reasoned’, it referred the matter to the EDPB in accordance with Art. 60 (4) GDPR, thereby initiating the dispute resolution procedure. Following the submission by the LSA, the completeness of the file was assessed, resulting in the formal launch of the Art. 65 procedure on 8 September 2020. In compliance with Article 65 (3) GDPR and in conjunction with Article 11.4 of the EDPB Rules of Procedure, the default adoption timeline of one month was extended by a further month because of the complexity of its subject matter.”

The binding decision was subsequently adopted on 9 November. In its decision, the EDPB stated that the DPC had ruled that Twitter had not met its obligations under Article 33 (1) GDPR and also found that Twitter had not acted in a timely manner with regard to the data breach. Twitter “became actually aware of the breach on 7 January 2019 but should have been aware of the breach at the latest by 3 January 2019, since on that date Twitter, Inc. as processor first assessed the incident as being a potential data breach and the Twitter, Inc. legal team instructed that the incident be opened”. Companies are required to notify commissioners of a data breach within 72 hours of its discovery under Article 33 (1) GDPR, but in this case the “ineffectiveness of the process” in the “particular circumstances” and “a failure by [Twitter] staff to follow its incident management process” meant that Article 33 (1) GDPR had been violated nonetheless.

In its binding decision, the EDPB ruled: that the DPC did not have to amend its draft decision on the basis of the complaints raised by the other supervisory authorities; that, despite concerns raised about further infringements committed by Twitter, the DPC was not required to amend its draft decision as the “factual elements” of the DPC decision were “not sufficient to allow the EDPB to establish the existence of infringements”; and that, amid protestation that the fine the DPC wanted to issue was not dissuasive enough, it was “required to reassess the elements it relies upon to calculate the amount of the fixed penalty to be imposed on TIC [Twitter] and to amend its draft decision by increasing the level of fine in order to ensure it fulfils its purpose as a corrective measure and meets the requirements of effectiveness, dissuasiveness and proportionality”.

Having been given a month to announce its decision, the DPC announced on 15 December that it had imposed an administrative fine of €450,000 on Twitter “as an effective, proportionate and dissuasive measure” after its investigation had found that Twitter had “infringed Article 33 (1) and 33 (5) of the GDPR in terms of a failure to notify the breach on time to the DPC and a failure to adequately document the breach”. The DPC had originally proposed a fine of between €135,000 and €275,000 in its draft decision, but the binding decision of the EDPB forced that figure upwards.

It is unlikely that this new figure will have appeased the European supervisory authorities that had raised issues with the perceived laxity of the fine. The German supervisory authority had advocated for a fine of between €7,348,035 and €22,044,195, stating: “As Twitter’s business model is based on processing data, and as Twitter generates turnover mainly through data processing, the DE SA considers that a dissuasive fine in this specific case would have to be so high that it would render the illegal processing unprofitable.”

Twitter responded to the judgement stating its delay in reporting the breach had been “an unanticipated consequence of staffing between Christmas Day 2018 and New Year’s Day”; it is thought that the DPC took this into account when deciding the fine amount. With this being the first fine issued by the DPC under GDPR rules, many major firms accused of data breaches having their European headquarters in Dublin, and over 6,600 valid breach notifications received in 2020, this is most likely the beginning of a long battle for the DPC.
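The 72-hour requirement in Article 33 (1) is simple clock arithmetic. A minimal sketch: the January 2019 dates are those reported in the case, while the times of day are illustrative assumptions, since the decision is quoted here only to the day:

```python
from datetime import datetime, timedelta

# Article 33 (1) GDPR: notify the supervisory authority "not later than 72 hours"
# after becoming aware of a personal data breach.
NOTIFY_WINDOW = timedelta(hours=72)

def notification_deadline(aware_at: datetime) -> datetime:
    """Latest moment a controller may notify the supervisory authority."""
    return aware_at + NOTIFY_WINDOW

# Twitter should have been aware by 3 January 2019 but became aware on 7 January.
should_have_known = datetime(2019, 1, 3)
actually_aware = datetime(2019, 1, 7)

print(notification_deadline(should_have_known).date())  # 2019-01-06
print(notification_deadline(actually_aware).date())     # 2019-01-10
```

The four-day gap between the two awareness dates is what shifted the deadline, and why the EDPB treated the failure of the incident management process, rather than the calendar, as the infringement.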
EU Commission GDPR review

Fragmentation across member states needs to be continuously monitored if the EU is to develop “a truly common data protection culture”, the first EU Commission evaluation of GDPR has found.

While largely positive in its outlook, the review report highlighted, in particular, that greater harmonisation was required in relation to the handling of cross-border cases and recommended a more effective use of all the tools provided in the GDPR for the data protection authorities to cooperate. In response to the review, the EU Commission has strengthened its ambition for a greater convergence of data protection standards, including eradicating differences in how governments and national data protection authorities apply data protection law, an expansion of the jurisdictions offering data protection equivalent to the EU’s and a revision of standard contractual clauses (SCCs) to help companies transfer personal data around the world more easily.

Separately, the Commission has outlined its intention to refine data protection law and guidance to better support digital innovation in areas such as AI use and blockchain technology.

The General Data Protection Regulation (GDPR) was a set of rules introduced to protect individuals with regard to the processing of personal data and on the free movement of such data. In order to create a level playing field for all companies operating in the EU market, the legislation equipped national data protection authorities with stronger enforcement powers and established a new governance system among those authorities.

Written into the GDPR was that the EU Commission would carry out a review and evaluation of the set of rules two years after application and then every four years thereafter. The Commission’s report in June 2020 represented the first assessment and found that GDPR has met most of its objectives, including “offering citizens a strong set of enforceable rights and by creating a new European system of governance and enforcement”.

Published pre-Covid-19, the report found that GDPR has proven to be flexible in supporting digital solutions, even in the unforeseen circumstances of the pandemic.
Amongst the key findings and further actions outlined in the review, the Commission says that while EU citizens have become more aware of their rights, with some 69 per cent of the EU population over the age of 16 having heard of GDPR, more can be done to help citizens exercise these rights, particularly in relation to the right to data portability. Data protection rules, the review says, have helped individuals to play a more active role in relation to what is happening with their data in the digital transition. It also points out that the enhanced corrective powers which have been given to data protection authorities are being used, but that these authorities are being supported differently within member states. The need for national data protection authorities to have the human, technical and financial resources to enforce the rules is largely recognised, with a combined 42 per cent increase in staff and a 49 per cent increase in budget for all national data protection authorities between 2016 and 2019. However, the report points to “stark” differences between member states.

On the performance of data protection authorities, the review says that while there is evidence that they are working together in the context of the European Data Protection Board, room for improvement exists. The one-stop-shop governance mechanism, ensuring that a company processing data cross-border “has only one data protection authority as interlocutor”, saw 79 final decisions issued in response to 141 draft decisions submitted between May 2018 and December 2019. “More can be done to develop a truly common data protection culture. In particular, the handling of cross-border cases calls for a more efficient and harmonised approach and an effective use of all tools provided in the GDPR for the data protection authorities to cooperate,” the review stated.

Two years on, it states that businesses are developing a compliance culture and are now increasingly using strong data protection as a competitive advantage.

Pointing to international engagement on free and safe data transfers over the past two years, including with Japan, with which the EU now shares the world’s largest area of free and safe data flows, the review says that the Commission will continue to work on adequacy with its partners around the world and is seeking to modernise other mechanisms of data transfer, not least the SCCs. However, the review also outlines an ambition to go further than existing relationships. In stating that “it is time to step up the international cooperation between data protection enforcers”, the review highlights that the Commission aims to open negotiations for the conclusion of mutual assistance and enforcement cooperation agreements with relevant third countries.

As well as publishing the review, the Commission also published a communication identifying 10 legal acts regulating the processing of personal data by competent authorities for the prevention, investigation, detection or prosecution of criminal offences which should be aligned with the Data Protection Law Enforcement Directive. “The alignment will bring legal certainty and will clarify issues such as the purposes of the personal data processing by the competent authorities and what types of data may be subject to such processing,” the Commission stated. The Commission’s next evaluation report, which will also review implementation of the actions listed within the inaugural report, is expected in 2024.
Action list for member states to support GDPR application: • complete the alignment of their sectoral laws to the GDPR; • consider limiting the use of specification clauses which might create fragmentation and jeopardise the free flow of data within the EU; • assess whether national law implementing the GDPR is in all circumstances within the margins provided for member state legislation; and • allocate resources to data protection authorities that are sufficient for them to perform their tasks.
Working from home: Cybersecurity risk
The short-term security solutions implemented to enable working from home en masse at the beginning of the pandemic have increased the levels of risk to businesses, many of which are now moving to implement future-proofed solutions.
While remote/home working was far from a new concept when the stay at home restrictions were introduced, the scale of mobilisation required to rapidly equip the majority of a nation's workforce to work from home was unprecedented. Something that might have taken an organisation years to pilot, write policy for and roll out was necessitated in days, if not hours. However, while the agility of many organisations in restructuring their business office model has been commended, concerns have also been raised around security vulnerabilities. As businesses and organisations expanded the breadth of their 'attack surface', the combination of networks and systems used for work, they also increased their level of cybersecurity risk. With many having implemented fast-paced and short-term remote working solutions in response to the pandemic, there is recognition that remote working in some form will remain beyond the pandemic and must then be factored into business security plans.
Although current research on security risks to businesses and organisations during the pandemic is far from extensive, a survey carried out on behalf of IT solutions provider DataSolutions offers a glimpse of the risks businesses may face now and into the future. In December 2020, one in 10 Irish office workers said that they had been targeted by cybercriminals since they began working from home when the pandemic began, while almost one third admitted to feeling vulnerable to cybersecurity risks while working from home. Given the pace of the switch to home working for many, it is unsurprising that as many as 57 per cent of respondents said that their company had not provided additional security training to prepare them for working from home. Potentially more surprising is that 56 per cent said that they were using their own personal device, which often lacks the same protections or capability for monitoring as a work device. Furthermore, one fifth admitted that they had shared or stored work documents on personal devices.
Cybersecurity risks to businesses come in various forms, the most obvious being lost or stolen devices containing sensitive material. At a basic level, this risk has increased dramatically given the increased mobility of the workforce. A further obvious risk relates to phishing communications. Vulnerability to phishing increased not only because a large cohort of workers were without the security afforded by office systems, such as firewalls or blacklisted IP addresses, but also because cybercriminals sought to exploit Covid-19-related fears. Google counted more than 18 million malware and phishing emails related to coronavirus on its service every day in April 2020 alone.
The 2020 Cisco Benchmark Report, which surveys almost 3,000 security leaders globally, highlights how the shift to remote working has changed security priorities. Mobile devices have overtaken user behaviour as the biggest challenge to protecting the mobile workforce, with the report pointing to 52 per cent of respondents stating that mobile devices are now "very or extremely challenging" to defend.
Remote working, to some extent, is set to be a permanent feature of the future of work. The Government's remote working strategy has outlined a 20 per cent target for the public sector, with expectations that the more agile private sector will also see high uptake of remote working within a hybrid model. Over a year on from the mass switch to working from home, businesses should be looking not only at mitigating the security risks of the present but also at delivering solutions for the future to keep them secure.
Children front and centre
The Data Protection Commission (DPC) has published its draft Children Front and Centre: Fundamentals for a Child-Orientated Approach to Data Processing. 'The Fundamentals' will introduce data protection principles and measures designed to protect children.
The Fundamentals, which the DPC says all organisations collecting and processing children's data should comply with, have been created "to drive improvements in standards of data processing". They will "introduce child-specific data protection interpretative principles and recommended measures that will enhance the level of protection afforded to children against the data processing risks posed to them by their use of/access to services in both an online and offline world". The Fundamentals will also assist those organisations that do process children's data by clarifying the principles, arising from GDPR obligations, to which they are expected to adhere.
The draft document, released by the DPC for a stakeholder consultation which closed on 31 March 2021, outlines 14 principles for organisations to follow. These are:
1. Floor of protection: the provision of a minimum level, or "floor", of protection by service providers.
2. Clear-cut consent: consent given by a child for the processing of their data must be "freely given, specific, informed and unambiguous, made by way of a clear statement or affirmative action".
3. Zero interference: service providers should ensure that their "pursuit of legitimate interests" does not interfere with the best interests of the child.
4. Know your audience: service providers should take steps to identify their users and ensure that their child-specific services have child-specific data protection measures in place.
5. Information in every instance: children are entitled to receive information about the processing of their data "irrespective of the legal basis relied on and even if consent was given by a parent on their behalf to the processing of their personal data".
6. Child-oriented transparency: information about how data is used must be provided "in a concise, transparent, intelligible and accessible way, using clear and plain language that is comprehensible and suited to the age of the child".
7. Let children have their say: service providers "shouldn't forget that children are data subjects in their own right and have rights in relation to their personal data at any age". The DPC states that a child can exercise these rights at any given time, provided "they have the capacity to do so and it is in their best interests".
8. Consent doesn't change childhood: consent obtained from children and/or their parents/guardians should not be used to justify treating them as adults.
9. Your platform, your responsibility: companies that derive revenue from providing or selling services through digital and online technologies are expected to "go the extra mile" to prove that their age verification methods are effective.
10. Don't shut out child users or downgrade their experience: services that are "directed at, intended for, or likely to be accessed by children" cannot bypass their obligations under the Fundamentals by "shutting them out or depriving them of a rich service experience".
11. Minimum user ages aren't an excuse: user age thresholds are not a reason for organisations to ignore controller obligations under GDPR where underage users are concerned.
12. Prohibition on profiling: service providers "should not profile children and/or carry out automated decision making in relation to children", or use their personal data for marketing/advertising purposes "due to their particular vulnerability and susceptibility to behavioural advertising", unless it can be clearly demonstrated that it is in the best interests of the child to do so.
13. Do a DPIA: providers should undertake a data protection impact assessment (DPIA) in order to minimise data protection risks to their service and to the children using it. The "principle of the best interests of the child" is expected to be a key aspect of any DPIA and to take precedence over the commercial interests of the provider.
14. Bake it in: service providers that consistently process children's data must have high levels of data protection "baked in" across the services they provide.
Writing in her foreword to the Fundamentals, the Commissioner for Data Protection, Helen Dixon, said: "About a quarter of Ireland's population are children, all of whose data is processed every day online and offline, in educational, health, recreational and sporting, social services and commercial contexts. It is with this in mind that the DPC has produced this guidance to set out the standards that all organisations should follow when collecting and processing children's data. The core message of the Fundamentals is that the best interests of the child must always be the primary consideration in all decisions relating to the processing of their personal data."
The move to lay down principles by which companies must abide with regard to children's data follows on from the UK Information Commissioner's Office's Age Appropriate Design Code, published in August 2020. The DPC has noted that its Fundamentals differ from their UK counterpart in that the UK document focuses on privacy-by-design features that must be engineered into services used by children, whereas the DPC's Fundamentals take a broader-based approach. The DPC has otherwise stated that its Fundamentals are "entirely consistent" with the UK's code.
The Fundamentals include a list of recommended actions for online service providers, although it is stressed that "there is no one-size-fits-all solution to data protection by design and default". These recommendations include:
• ensuring the strictest privacy settings are applied to services likely to be accessed by children;
• ensuring that child users have meaningful choice in a mixed-audience setting;
• minimising the amount of data collected from children in the first place;
• not systematically sharing a child's data with third parties without clear parental knowledge;
• turning off geolocation by default for children unless the service provided is dependent upon it;
• turning off profiling identifiers, techniques and settings;
• avoiding the use of nudge techniques that encourage children to provide unnecessary information; and
• providing layered, child-friendly information that is available to children throughout the user experience.
Schrems II
In July 2020, the Court of Justice of the European Union (CJEU) issued its long-awaited decision in the Data Protection Commission v Facebook Ireland case. The decision invalidated the European Commission's previous adequacy decision for the EU-US Privacy Shield Framework and will have a significant impact on personal data transfers.
The decision, colloquially known as Schrems II as it is the second legal challenge by the Austrian activist Max Schrems, ruled that "the Privacy Shield does not provide adequate protection", and the CJEU affirmed that it had found "for a second time now that there is a clash between EU privacy law and US surveillance law". In the first Schrems decision in 2015, the Court invalidated the Safe Harbour framework that had governed EU-US personal data flows; Schrems II has now struck down Safe Harbour's data protection-enhanced successor, Privacy Shield. The CJEU specifically invalidated Decision 2016/1250, the European Commission's 2016 decision that Privacy Shield was adequate to enable data transfers under US law. The decision also contradicts three years' worth of annual reports from the Commission affirming the stance of its 2016 decision. The Commission had in the past raised its own issues with the Privacy Shield; for instance, it had consistently argued that a permanent ombudsman should be appointed to fill the role of tribunal as specified within Article 47 of the EU Charter of Fundamental Rights.
With the Court now ruling that Privacy Shield is insufficient to govern data sharing between the EU and the US, over 5,300 participants will be severely affected. Two main reasons were cited by the CJEU in its decision, pertaining to the all-encompassing nature of US surveillance and the lack of action EU citizens can take against the US if they are adversely affected. Firstly, the Court found that US surveillance programmes are not limited to strictly necessary data, despite their assessment by the Commission, meaning that they do not meet the requirements of Article 52 of the EU Charter. Secondly, the Court ruled that EU data subjects lack actionable judicial redress with regard to US surveillance, thus not satisfying the demands of Article 47 of the Charter.
The CJEU also issued a further ruling that will significantly affect how companies establish compliance with EU data protection rules. The Court ruled Commission Decision 2010/87, focused on standard contractual clauses (SCCs), to be valid. This ruling means that personal data transferred subject to such contractual obligations between data controllers and processors is still sufficiently protected. SCCs are thus still considered a valid method to ensure data protection, but the CJEU's overall ruling does raise the question of the utility of SCCs as a means to govern data sharing, a question the Data Protection Commission (DPC) raised in its reaction to the decision. Divergent opinions also emerged among tech companies, with some saying that it was unclear if SCCs would meet data protection standards given the Schrems II ruling, while others rushed to reassure clients that data transfers were still possible.
The entire case began when Schrems alleged that Facebook had violated his privacy rights once it had transferred his data to the United States, where it could be analysed by US intelligence agencies. Given that Facebook's EU headquarters are in Dublin, it then fell to Ireland's DPC to prosecute. The DPC is now charged with acting on the guidance provided by the CJEU.
In a statement released after the CJEU ruling, the DPC said it "welcomes" the decision. "The Court has endorsed the DPC's position. It has also ruled that the SCCs transfer mechanism used to transfer data to countries worldwide is, in principle, valid, although it is clear that, in practice, the application of the SCCs transfer mechanism to transfers of personal data to the United States is now questionable," the DPC said. "This is an issue that will require further and careful examination, not least because assessments will need to be made on a case-by-case basis. As well as providing clarity on points of substance, today's judgement also contains important statements of position relating to matters of process, to include the allocation of responsibility between data controllers and national supervisory authorities when it comes to ensuring that the rights of EU citizens are protected in the context of EU/US data transfers. While noting the Court's reference to the fact that a supervisory authority would not suspend data transfers while an adequacy decision – such as Privacy Shield – was in force, the DPC acknowledges the central role that it, together with its fellow supervisory authorities across the EU, must play across the EU."
In September 2020, the DPC sent Facebook a preliminary order to halt the transfer of EU citizens' data to the US, with a fine of 4 per cent of annual revenue to be imposed if conditions are not met. Facebook's Vice President of Global Affairs and Communications Nick Clegg, responding on a Facebook blog, acknowledged that data protection laws are changing, but stated that more legal clarity was needed and advocated a revision of Privacy Shield. "These [reform] efforts will need to recognise that EU member states and the US are both democracies that share common values and the rule of law, are deeply culturally, socially and commercially interconnected, and have very similar data surveillance powers and practices," Clegg wrote.
The European Data Protection Board issued six recommendations for firms dealing with transfers: know your transfers; identify the transfer tools you are relying on; assess whether the Article 46 GDPR transfer tool you are relying on is effective in light of all circumstances of the transfer; adopt supplementary measures; take procedural steps if supplementary measures have been identified; and re-evaluate at appropriate intervals.
Further repercussions for trans-Atlantic intelligence sharing and surveillance could arise from a push to place European intelligence services beyond court jurisdiction. EU member states, led by France, are now seeking to insert a national security exemption in the pending ePrivacy Regulation, which would exclude third parties such as the US.
Data Protection Commission: Annual Report for 2020
Source: Data Protection Commission 2020 Annual Report
In February 2021, the Data Protection Commission published its annual report for 2020. Key figures which emerged include:
• prosecutions concluded against six entities in respect of offences under the E-Privacy Regulations;
• Enforcement Notices served on seven organisations in December 2020 for non-compliance in relation to the use of cookies and tracking technologies;
• 10,151 cases handled (a 9% increase on 2019);
• 6,628 valid data security breaches notified;
• 83 statutory inquiries on hand (56 domestic inquiries and 27 cross-border inquiries) as of 31 December 2020;
• 4,660 GDPR complaints received and 4,476 complaints concluded;
• a €450,000 fine issued to Twitter International Company in December 2020;
• 354 cross-border processing complaints received through the One-Stop-Shop;
• 147 new complaints investigated under S.I. No. 336 of 2011 in respect of various forms of electronic direct marketing; and
• 570 new Data Protection Officer notifications (with a total of 2,166 at year end).