
The European Data Protection Board
Having regard to Article 70(1)(e) of Regulation 2016/679/EU of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (hereinafter "GDPR"),
Having regard to the EEA Agreement and in particular to Annex XI and Protocol 37 thereof, as amended by the Decision of the EEA Joint Committee No 154/2018 of 6 July 2018,1
Having regard to Article 12 and Article 22 of its Rules of Procedure,
HAS ADOPTED THE FOLLOWING GUIDELINES
1 SCOPE
The aim of these Guidelines is to provide recommendations and guidance for the design of the interfaces of social media platforms. They are aimed at social media providers as controllers of social media, who have the responsibility for the design and operation of social media platforms. With reference to the social media providers, these Guidelines aim to recall the obligations arising from the GDPR, with special reference to the principles of lawfulness, fairness, transparency, purpose limitation and data minimisation in the design of user interfaces and content presentation of their web services and apps. The aforementioned principles have to be implemented in a substantial way and, from a technical perspective, they constitute requirements for the design of software and services, including user interfaces. An in-depth study is made of the GDPR's requirements when applied to user interfaces and content presentation, and it is clarified what should be considered a "dark pattern": a way of designing and presenting content which substantially violates those requirements while still appearing to comply formally. These Guidelines are also suitable for increasing the awareness of users regarding their rights and the risks possibly arising from sharing too much data or sharing their data in an uncontrolled way. These Guidelines aim to educate users to recognise dark patterns (as defined in the following) and how to face them in order to protect their privacy in a conscious way. As part of the analysis, the life cycle of a social media account was examined on the basis of five use cases: opening a social media account (use case 1), staying informed on social media (use case 2), staying protected on social media (use case 3), staying right on social media: data subject rights (use case 4), and so long and farewell: leaving a social media account (use case 5).
In these Guidelines, the term "user interface" corresponds to the means for people to interact with social media platforms. The document focuses on graphical user interfaces (e.g. used for computer and smartphone interfaces), but some of the observations made may also apply to voice-controlled interfaces (e.g. used for smart speakers) or gesture-based interfaces (e.g. used in virtual reality). The term "user experience" corresponds to the overall experience users have with social media platforms, which includes the perceived utility, ease of use and efficiency of interacting with it. User interface design and user experience design have been evolving continuously over the last decade. More recently, they have settled for ubiquitous, customised and so-called seamless user interactions and experiences: the perfect interface should be highly personalised, easy to use and multimodal.2 Even though those trends might increase the ease of use of digital services, they can be used in such a way that they primarily promote user behaviours that run against the spirit of the GDPR.3 This is especially relevant in the context of the attention economy, where user attention is considered a commodity. In those cases, the legally permissible limits of the GDPR may be exceeded, and the interface design and user experience design leading to such cases are described below as "dark patterns".

1 References to Member States made throughout this document should be understood as references to EEA Member States.
In the context of these Guidelines, "dark patterns" are considered interfaces and user experiences implemented on social media platforms that lead users into making unintended, unwilling and potentially harmful decisions regarding their personal data. Dark patterns aim to influence users' behaviours and can hinder their ability to effectively protect their personal data and make conscious choices4, for example by making them unable to give an informed and freely given consent.5 This can be exploited in several aspects of the design, such as the interface's colour choices and the placement of content. Conversely, providing incentives and user-friendly design can support the realisation of data protection regulations.
Dark patterns do not necessarily lead only to a violation of data protection regulations; they can, for example, also violate consumer protection regulations. The boundaries between infringements enforceable by data protection authorities and those enforceable by national consumer protection authorities can overlap. Data protection authorities are responsible for sanctioning the use of dark patterns if they actually violate data protection standards and thus the GDPR. Breaches of GDPR requirements need to be assessed on a case-by-case basis. Only dark patterns that might fall within this regulatory mandate are covered by these Guidelines. For this reason, in addition to examples of dark patterns, the Guidelines also present best practices that can be used to design user interfaces which facilitate the effective implementation of the GDPR.
The dark patterns6 addressed within these Guidelines can be divided into the following categories:
Overloading: users are confronted with an avalanche / large quantity of requests, information, options or possibilities in order to prompt them to share more data or unintentionally allow the processing of personal data against the expectations of the data subject.
Skipping: designing the interface or user experience in a way that users forget or do not think about all or some of the data protection aspects.
2 For more details see CNIL, IP Report No. 6: Shaping Choices in the Digital World, 2019, p. 9, https://www.cnil.fr/sites/default/files/atoms/files/cnil_ip_report_06_shaping_choices_in_the_digital_world.pdf.
3 CNIL, Shaping Choices in the Digital World, 2019, p. 10.
4 CNIL, Shaping Choices in the Digital World, 2019, p. 27.
5 See Norwegian Consumer Council, Deceived by design: How tech companies use dark patterns to discourage us from exercising our rights to privacy, p. 10, https://fil.forbrukerradet.no/wp-content/uploads/2018/06/2018-06-27-deceived-by-design-final.pdf; see also CNIL, Shaping Choices in the Digital World, pp. 30-31.
6 Categories of dark patterns and types of dark patterns within these categories will be displayed in bold and italics in the text of the Guidelines. A detailed overview is provided in the Annex.