

(including stills from videos). To enable the matching, the images that the provider is processing, as well as the images in the database, must have been converted into a comparable digital format, typically hash values. This kind of hashing technology has an estimated false positive rate of no more than 1 in 50 billion (i.e., a 0.000000002% false positive rate).34
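The cited rate can be checked with simple arithmetic: "1 in 50 billion" is 2×10⁻¹¹ as a fraction, which is 0.000000002 when expressed as a percentage. A minimal illustration (the counts are just the ratio stated in the report, not real data):

```python
# Illustrative arithmetic only: converts the cited hash-matching
# false positive rate of "1 in 50 billion" into a percentage.
false_positives = 1
comparisons = 50_000_000_000  # 50 billion

rate = false_positives / comparisons  # 2e-11 as a fraction
print(f"{rate:.9%}")  # → 0.000000002%
```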

40. For the detection of new CSAM, a different type of technology is typically used, including classifiers and artificial intelligence (AI).35 However, the error rates of these technologies are generally significantly higher. For instance, the Impact Assessment Report indicates that there are technologies for the detection of new CSAM whose precision rate can be set at 99.9% (i.e., a 0.1% false positive rate), but that at this precision rate they are able to identify only 80% of the total CSAM in the relevant data set.36
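The two figures measure different things: precision (the share of flagged items that really are CSAM) and detection rate, or recall (the share of all CSAM that gets flagged). A hypothetical set of counts, chosen only so that the resulting rates match those cited in the Impact Assessment Report, makes the distinction concrete:

```python
# Hypothetical confusion-matrix counts, chosen only to reproduce the
# rates cited in the Impact Assessment Report (99.9% precision,
# 80% detection rate); not real data.
true_positives = 800    # new CSAM items correctly flagged
false_negatives = 200   # new CSAM items the classifier misses
# At 99.9% precision, roughly 1 false positive per 999 true positives:
false_positives = round(true_positives * 0.001 / 0.999)  # ≈ 1

precision = true_positives / (true_positives + false_positives)
recall = true_positives / (true_positives + false_negatives)

print(f"precision ≈ {precision:.1%}, recall = {recall:.0%}")
# → precision ≈ 99.9%, recall = 80%
```

The point the report makes is the trade-off: tightening precision (fewer false alarms) lowers recall (more CSAM missed), and vice versa.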

41. As for the detection of the solicitation of children in text-based communications, the Impact Assessment Report explains that this is typically based on pattern detection. It notes that some of the existing technologies for grooming detection have an “accuracy rate” of 88%.37 According to the Commission, this means that ‘out of 100 conversations flagged as possible criminal solicitation of children, 12 can be excluded upon review [according to the Proposal, by the EU Centre] and will not be reported to law enforcement’.38 However, even though the Proposal, contrary to the Interim Regulation, would also apply to audio communications, the Impact Assessment Report does not elaborate on the technological solutions that could be used to detect grooming in such a setting.
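The Commission's reading applies the 88% rate to flagged conversations, so the arithmetic behind the "12 can be excluded" figure is simply the complement of the accuracy rate (an illustration of the stated interpretation, not of how the detection technology works):

```python
# Illustrative arithmetic for the Commission's reading of the cited
# 88% "accuracy rate": applied to flagged conversations, 12 of every
# 100 flags would be excluded on review.
flagged = 100
accuracy = 0.88

correctly_flagged = round(flagged * accuracy)      # 88
excluded_on_review = flagged - correctly_flagged   # 12
print(excluded_on_review)  # → 12
```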

4.5.1 Effectiveness of the detection

42. Necessity implies the need for a fact-based assessment of the effectiveness of the envisaged measures for achieving the objective pursued, and of whether they are less intrusive than other options for achieving the same goal.39 Another factor to be considered in the assessment of the proportionality of a proposed measure is the effectiveness of existing measures over and above the proposed one.40 If measures serving the same or a similar purpose already exist, their effectiveness should be assessed as part of the proportionality assessment. Without such an assessment of the effectiveness of existing measures, the proportionality test for a new measure cannot be considered to have been duly performed.

43. The detection of CSAM or grooming by providers of hosting services and providers of interpersonal communications services is capable of contributing to the overall objective of preventing and combating child sexual abuse and the online dissemination of child sexual abuse material. At the same

34 See European Commission, Commission Staff Working Document, Impact Assessment Report Accompanying the document Proposal for a Regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse, SWD(2022) 209 final (hereinafter ‘Impact Assessment Report’ or ‘SWD(2022) 209 final’), p. 281, fn. 511.
35 Impact Assessment Report, p. 281.
36 Ibid., p. 282.
37 Ibid., p. 283.
38 Proposal, COM(2022) 209 final, p. 14, fn. 32.
39 EDPS, Assessing the necessity of measures that limit the fundamental right to the protection of personal data: A Toolkit, 11 April 2017, p. 5; EDPS, EDPS Guidelines on assessing the proportionality of measures that limit the fundamental rights to privacy and to the protection of personal data, 19 December 2019, p. 8.
40 EDPS, EDPS Guidelines on assessing the proportionality of measures that limit the fundamental rights to privacy and to the protection of personal data, 19 December 2019, p. 11.
