tool is shared, the easier it will be for bad actors to exploit, meaning that dissemination should be controlled carefully.433

H. Content authenticity and provenance
Given the many difficulties with using AI or other automated means to detect harmful content, it makes sense to focus on the flip side: authentication. While authentication tools do not necessarily help with every harm listed by Congress, they can be widely used to help determine the true source of content and whether text, images, audio, or video are deepfakes (see above) or have been otherwise manipulated. Indeed, multiple federal government reports state that these tools are key for challenging foreign disinformation and deepfakes.434 Experts from the State Department and elsewhere have pointed to blockchain technology as a means of determining content authenticity.435 Authentication could also help counteract the Liar’s Dividend, a problem discussed above, in that it would be harder for public figures to claim falsely that audio or video content is fake if one could point to technological markers that it is real and unaltered.
A major collaborative effort to advance authentication tools is the Coalition for Content Provenance and Authenticity (C2PA), formed in early 2021 by merging two other coordinated efforts, the Content Authenticity Initiative (led by Adobe) and Project Origin (led by Microsoft and the BBC). The goal of this coalition is to create an “open technical standard providing publishers, creators, and consumers the ability to trace the origin of different types of media.”436 In January 2022, it released technical specifications and guidance documents.437
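To illustrate the basic idea behind such provenance standards, the following is a minimal sketch, assuming a Python environment with the third-party cryptography library installed. It hashes a media file, signs a claim about the file’s origin, and later checks whether either the content or the claim has been altered. The function names and manifest fields are hypothetical simplifications for illustration and do not reflect the actual C2PA data model or software.

    # Illustrative sketch only: a simplified "provenance manifest" in the spirit of
    # C2PA-style content credentials, not the actual C2PA specification or SDK.
    # The names make_manifest and verify_manifest are hypothetical.
    import hashlib
    import json
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
        Ed25519PublicKey,
    )

    def make_manifest(media_bytes: bytes, creator: str, key: Ed25519PrivateKey) -> dict:
        # Bind a claim about the content's origin to a hash of the content,
        # then sign the claim so later alterations can be detected.
        claim = {
            "creator": creator,
            "sha256": hashlib.sha256(media_bytes).hexdigest(),
        }
        payload = json.dumps(claim, sort_keys=True).encode()
        return {"claim": claim, "signature": key.sign(payload).hex()}

    def verify_manifest(media_bytes: bytes, manifest: dict, pub: Ed25519PublicKey) -> bool:
        # Check that (1) the content still matches the hash in the claim and
        # (2) the claim was signed by the holder of the expected key.
        claim = manifest["claim"]
        if hashlib.sha256(media_bytes).hexdigest() != claim["sha256"]:
            return False  # content was altered after the claim was made
        payload = json.dumps(claim, sort_keys=True).encode()
        try:
            pub.verify(bytes.fromhex(manifest["signature"]), payload)
            return True
        except InvalidSignature:
            return False  # claim itself was forged or tampered with

    if __name__ == "__main__":
        key = Ed25519PrivateKey.generate()
        photo = b"...raw image bytes..."
        manifest = make_manifest(photo, "Example News Desk", key)
        print(verify_manifest(photo, manifest, key.public_key()))            # True
        print(verify_manifest(photo + b"edit", manifest, key.public_key()))  # False

As the next paragraph notes, passing such a check would show only that the content is unaltered and attributable to the holder of the signing key; it would say nothing about whether the content itself is truthful.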
Of course, proving that content has not been altered and comes from its claimed origin does not prove the truth of the content itself. Further, just like detection technology, these tools are fallible, and it would be problematic if people were either too distrustful of content that had no authenticity markers or too trusting of content that did. For example, authentication does not help
433 This issue is discussed above in the part of Section I on deepfakes. See also Sam Gregory, et al., Governing Access to Synthetic Media Detection Technology, Tech Policy Press (Sep. 7, 2021), https://techpolicy.press/governing-access-to-synthetic-media-detection-technology/; EPRS, Tackling deepfakes in European policy, supra note 43 at 59.
434 See NSCAI, Final Report, supra note 3 at 48; DHS, Increasing Threat of Deepfake Identities, supra note 43 at 31; EPRS, Tackling deepfakes in European policy, supra note 43 at 20, 65. See also Jaiman, supra note 62; Engler, supra note 62.
435 See J.D. Maddox, et al., Toward a More Ethical Approach to Countering Disinformation Online, Public Diplomacy 23(12) (Jul. 1, 2020), https://static1.squarespace.com/static/5be3439285ede1f05a46dafe/t/5efd72972af517215e330cdd/1593668272484/ETHICS+IN+DIPLOMACY+Final.pdf. See also Kathryn Harrison and Amelia Leopold, How Blockchain Can Help Combat Disinformation, Harvard Bus. Rev. (Jul. 19, 2021), https://hbr.org/2021/07/how-blockchain-can-help-combat-disinformation; Haya R. Hasan and Khaled Salah, Combating Deepfake Videos Using Blockchain and Smart Contracts, IEEE Access 7:41596 (Feb. 25, 2019), https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=8668407. The News Provenance Project is also exploring the use of blockchain as a way to store contextual information about news photos. See https://www.newsprovenanceproject.com/a-solution.
436 See https://c2pa.org/.
437 See https://contentauthenticity.org/blog/milestones-in-digital-content-provenance-specification-open-source-projects.