3. ADM AND THE GDPR CASE-LAW IN SPECIFIC SCENARIOS: WORKPLACE — FACIAL RECOGNITION — CREDIT SCORING
The following sections explore three specific scenarios in which individuals tend to challenge ADM systems most often: the workplace (managing employees, contractors, and hiring processes); facial recognition (automated facial recognition, both in the public interest and for commercial purposes); and credit scoring. The cases summarized show that the GDPR protects individuals against unlawful practices in these scenarios even where Article 22 is not applicable. In addition, each section briefly introduces new legislative proposals introduced by the EU to tackle each of these scenarios specifically, thus creating potential overlap which deserves further exploration.
3.1 ADM in the workplace often interacts with labor rights

Courts often assess the lawfulness of profiling and ADM processes through lenses other than data protection law. This is particularly evident in judicial proceedings involving the use of algorithmic tools by organizations to manage their workforce and individual service providers or contractors. A significant body of case-law is emerging on the issue of ADM in the gig economy, which often combines GDPR enforcement with labor law considerations. Interestingly, it is precisely the use of ADM systems to manage gig workers that enforcers consider the relevant argument for qualifying this situation as an employment relationship, and therefore a “labor law” issue (see Cases 3 and 29).
Case 28 (related to Case 3): Fairness of automated ranking depends on the factors weighed by the algorithm

In December 2020, the Labor division of the Bologna Court in Italy found that Deliveroo’s reputational ranking algorithm “Frank,” which determined the order in which Deliveroo’s riders would be called for a given service, was discriminatory and unlawful, after three riders sued the company. The algorithm took into account riders’ absences without considering the reasons behind absenteeism (e.g., riders or their children could have been sick that day). Riders who were not available for, or who canceled, a given service would lose “points,” placing them in a less favorable position when services were assigned in the future, which could eventually result in a quasi-ban from the platform. The Court stressed that Deliveroo’s profiling system could and should have treated riders who did not participate in booked services for trivial reasons differently from riders who did so for legitimate reasons (such as strikes or sick leave). The Court did not reach any direct conclusions on whether “Frank” fell under the scope of Article 22 GDPR, as it approached the case from an Italian labor and anti-discrimination law perspective.123