14.2 Assessment of the proportionality
to the purposes of the processing. Subsidiarity means that the processing is only permissible if its purposes cannot reasonably be achieved with other, less invasive means. If such alternatives exist, they have to be used.
Proportionality demands a balancing act between the interests of the data subject and the data controller. Proportionate data processing means that the amount of data processed is not excessive in relation to the purpose of the processing. If the purpose can be achieved by processing fewer personal data, then the data controller needs to limit the processing to personal data that are necessary.
Therefore, data controllers may only process personal data that are necessary to achieve a legitimate purpose. The application of the principle of proportionality is thus closely related to the principles of data protection in Article 5 GDPR.
The key questions are: are the interests properly balanced, and does the processing go no further than what is necessary? To assess whether the processing is proportionate to the interests pursued by the data controller(s), the processing must first meet the principles of Article 5 of the GDPR. As legal conditions, they have to be complied with in order to make the data processing legitimate.
Data must be ‘processed lawfully, fairly and in a transparent manner in relation to the data subject’ (Article 5 (1) (a) GDPR). This means that data subjects must be informed about the processing of their data, that all the legal conditions for data processing are adhered to, and that the principle of proportionality is respected. As analysed in Sections 11.1 and 11.2 of this report, neither Google nor the government organisations currently have a legal ground for any of the processing through G Suite Enterprise. This means the personal data are not processed lawfully.
Google does not process the data in a transparent manner either. Google does publish extensive documentation for administrators about the 19 different audit log files they can access to monitor end user behaviour. However, at the time of completion of this DPIA Google did not publish documentation about other Diagnostic Data it collects through its own system-generated log files. The logs that can be accessed by admins do not contain any information about the website data Google collects, nor information about the use of Features, Additional Services, the Technical Support Services or the Other related services, or an exhaustive overview of all activities performed with a Google Account.
Google equally fails to provide any public explanation to its Enterprise customers in the EU about the other kinds of Diagnostic Data it collects through the use of the G Suite Enterprise services, such as the telemetry data. Administrators and end users cannot inspect the contents of these telemetry data either, nor does Google provide access thereto in response to a formal Data Subject Access request, as laid down in Article 15 of the GDPR.
The lack of transparency makes the data processing inherently unfair. The lack of transparency also makes it impossible to assess the proportionality of the processing.
The principles of data minimisation and privacy by design require that the processing of personal data be limited to what is necessary: the data must be ‘adequate, relevant and limited to what is necessary for the purposes for which they are processed’ (Article 5(1)(c) GDPR). This means that the controller may not collect and store data which are not directly related to a legitimate purpose.
The principle of privacy by design (Article 25 (2) GDPR) requires that ‘the data controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed.’ According to this principle, the default settings for the data collection should be set in such a way as to minimise data collection by using the most privacy friendly settings.
As described in Section 3 of this report, Google frequently processes personal data in the Core Services with privacy unfriendly default settings. This is the case, for example, for the accessibility of the Other related service Feedback.
Through Feedback, Google can process personal data in Customer Data for unauthorised purposes. In view of the possibly sensitive nature of such data, the lack of transparency, the possible risks for data subjects if Customer Data are processed for unlawful purposes, and the absence of opt-out controls, this processing is disproportionate.
Google equally fails to apply the principle of privacy by design with regard to data processing in the context of the Google Account, the Diagnostic Data, the Technical Support Services and the Additional Services.
Google frequently offers opt-out choices, instead of active opt-in choices. Google offers these opt-out choices in many locations (different menus on the devices, in the browser and on different webpages). This places an unnecessary burden on employees and makes the data processing disproportionate.
There is only one type of Google Account, which can be used in both an enterprise and a consumer environment. All end users with a Google Account must accept the same general (consumer) Terms of Service and the (consumer) Privacy Policy, regardless of whether they create the account as a consumer or as an employee. Google explains that this is the case because end users may use their Google Account to sign into and use Google’s consumer services, if their administrator does not restrict such use.
Google allows end users to sign-in with multiple Google Accounts. This design of the services does not sufficiently and systematically take the specific data protection risks for employees and the government organisations into account. Government organisations need to draw strict lines between processing of personal data in the consumer and enterprise environments, in order to prevent data breaches and unauthorised processing of personal data and Classified Information.
At the time of writing of this report, administrators could block access to the existing Additional Services for work accounts (but not to any new Additional Services). However, they could not completely prevent logged-in users from accessing Additional Services. When an end user accessed an Additional Service such as Google Search while logged in with their work Google Account, and the administrator had centrally disabled the use of the Additional Services, Google ensured that the user was logged out from the work account. Google then proceeded to process the data as if the user had no account at all.
This automatic (and privacy friendly) procedure does not apply to the use of all Additional Services. End users can for example use Google Photos with their enterprise Google Account. It is not clear why Google applies different rules to different Additional Services.
The absence of a technical separation between enterprise and consumer Google Accounts, combined with the privacy unfriendly default setting of access to all Additional Services, leads to spill-over of personal data in Customer Data to Google’s consumer environment. This is the case for (i) Ads Personalization, (ii) providing access to all Customer Data for the Chrome browser as ‘trusted’ app, (iii) the sending of telemetry data (Diagnostic Data) from Android devices, Chrome OS and the Chrome browser with data about app usage and use of biometric authentication, and (iv) installing three kinds of unique identifiers in Chrome OS and the Chrome browser and using these for installation tracking, tracking of promotional campaigns and field trials.
As long as these settings remain privacy unfriendly by default, and admins do not have controls to block or at least minimise the data processing with tools provided in G Suite Enterprise, the use of the Chrome OS, the Chrome browser and Android devices disproportionately infringes on the interests and rights of data subjects, in particular as regards confidential data or data of a sensitive nature or special categories of data. As joint controllers with Google, government organisations are accountable for the risks of any unlawful processing of personal data.
The principle of storage limitation requires that personal data should only be kept for as long as necessary for the purpose for which the data are processed. Data must 'not be kept in a form which permits identification of data subjects for longer than is necessary for the purposes for which the personal data are processed' (Article 5(1)(e), first sentence, GDPR). This principle therefore requires that personal data be deleted as soon as they are no longer necessary to achieve the purpose pursued by the controller. The text of this provision further clarifies that 'personal data may be kept longer in so far as the personal data are processed solely for archiving purposes in the public interest, for scientific or historical research purposes or for statistical purposes in accordance with Article 89(1), subject to the implementation of appropriate technical and organisational measures required by this Regulation in order to safeguard the rights and freedoms of the data subject' (Article 5(1)(e), second sentence, GDPR).
As explained in Section 10 of this report, Google will delete Customer Data actively deleted by the customer as soon as reasonably practicable, but can retain these data for half a year. This maximum period seems long, once a government organisation has decided to delete Customer Data.
With regard to the Diagnostic Data, the retention period of 6 months for most of the audit logs seems proportionate to the objectives pursued by admins: being able to look back in case of data security incidents, and regularly inspecting the logs for correct application of the access rules.
Google does not have a fixed retention period for other types of Diagnostic Data, such as the telemetry and website data. The general rule is to retain these data for 6 months as well, but Google explained that “other Diagnostic Data is retained for much longer periods (e.g. account deletion events).” 283 Cookie-based data are generally anonymised after 18 months, wrote Google. G Suite admins cannot customise these retention periods.
283 As quoted in Section 10.2. From responses provided by representatives of Google to SLM Rijk during the course of this DPIA.