Cyber Security
As part of its ongoing efforts to prevent and eradicate sexual exploitation and abuse of children online, Facebook, together with its partners across the Asia-Pacific, launched its public safety campaign against Child Sexual Abuse Material (CSAM) in 10 countries including the Philippines. The campaign aims to raise awareness of how the public can help prevent the revictimization of affected children by reporting CSAM content to Facebook and law enforcement instead of sharing this malicious content.
The campaign was launched in light of recent findings from research Facebook conducted on its CyberTips with the US-based National Center for Missing & Exploited Children (NCMEC) to better understand why people may share CSAM on Facebook and its family of apps. Over the past year, Facebook also consulted world-leading experts in child exploitation, including NCMEC and Professor Ethel Quayle, a clinical psychologist who specializes in sex offenders, to improve the company’s understanding of why people share child exploitation content. The study evaluated 150 accounts that were reported to NCMEC for uploading CSAM in July and August of 2020 and January 2021.
Results showed that more than 75% of these materials did not exhibit malicious intent to harm a child, but appeared to be shared for other reasons, such as outrage or poor humor. The data also showed that more than 90% of the CSAM consisted of reshares of previously reported content. While this indicates that the number of pieces of content does not equal the number of victims, the same content, potentially slightly altered, is being shared repeatedly.
“While this data indicates that the number of pieces of content does not equal the number of victims, one victim is one too many. Preventing and eradicating online child sexual exploitation and abuse requires a cross-industry approach, and Facebook is committed to doing its part to protect children on and off our apps. We are taking a research-informed approach to develop effective solutions that disrupt the sharing of child exploitation material,” said Malina Enlund, Facebook Safety Policy Manager, APAC.
As part of the campaign, Facebook released a public service announcement video that stresses the impact of CSAM on victimized children, especially when the material is re-shared online. Rather than resharing, the public is encouraged to report CSAM content to Facebook, through the in-app reporting channels, and to the authorities, to better help and protect the young victims.
July 2021
In the Philippines, Facebook has partnered with the Inter-Agency Council Against Trafficking (IACAT), the Inter-Agency Council Against Child Pornography (IACACP), Stairway Foundation, and Child Rights Network in rolling out the campaign locally.
gadgetsmagazine.com.ph