Another CSAM Report with Little to Say About Adult Sites
While some anti-porn crusaders would have you believe adult websites that publish user-generated content are the root of all evil when it comes to the distribution of child sexual abuse material (CSAM), reports published by experts in the field reflect a different reality.
“Protecting Children in the Age of End-to-End Encryption,” a report authored by Laura Draper, Senior Project Director at the Tech, Law & Security Program at American University’s Washington College of Law, is another example of a detailed, in-depth look at the CSAM problem that has precious little to say about adult websites.
Draper’s report addresses the ways in which end-to-end (E2E) encryption further complicates the already difficult challenge of combatting CSAM and the broader phenomenon of child sexual exploitation and abuse (CSEA). Taking it as a given that E2E encryption is here to stay, Draper says the question is how we go about combatting online CSEA in the environment the technology creates.
“Discussions of how to combat online child sexual exploitation and abuse often morph into debate over the wisdom of end-to-end encryption, which is a method of secure communication that prevents third parties from accessing content while it is transferred from one system or device to another,” Draper notes. “In the context of online CSEA, the debate often treats privacy and child safety as mutually exclusive concepts and pits them against each other. This report avoids this debate by acknowledging that end-to-end encryption is or will become the default across information and communication technologies. The question motivating this project is how can stakeholders — tech companies, law enforcement, civil society — be held accountable for their commitment to combat online CSEA given the increasing adoption and proliferation of end-to-end encryption?”
Draper drills into the issues surrounding CSAM in detail, including the difficulty of establishing a reliable, accurate measure of the problem’s scope “due to variations in definitions and data collection practices across jurisdictions” and the distinction between “known” and “new” CSAM, while acknowledging that the sharing and distribution of either variety harms victims.
While the report offers several technology-based recommendations, Draper emphasizes that with a problem as broad and multifaceted as CSAM, the solutions we employ must be comprehensive to be effective.
“Because the unifying thread in online CSEA is the internet, the temptation is to look exclusively at technological solutions,” Draper observes. “However, the underlying issue – child sexual exploitation and abuse – would exist even if the internet did not. While there are some promising technological interventions, they must be considered within the larger context of the issue and should be explored concurrently with in-person, nontechnical solutions for a holistic approach to address the problem.”
Draper’s recommendations are sensible – although some are likely to prove unpalatable to social conservatives. Among her recommendations are “comprehensive sex education” and perpetration-prevention programs that include “peer support for people who are at risk of offending, but who have committed never to harm a child.”
While the report mentions adult content and adult sites only sparingly, it does include one recommendation for adult platforms that accept uploads of user-generated content – one that some platforms have already adopted on their own.
“For websites and platforms that primarily host explicit adult content, a disconnect often exists between the incentives for people managing the site and the incentives for people creating content for the site,” Draper writes. “Content creators are often aligned with people seeking to combat child exploitation; by empowering creators to have more control over platform design features, less illicit content would be hosted on a given site. For instance, mandating preverified uploads – i.e., content cannot be uploaded without express consent and age verification of the individuals featured – protects both groups: content creators ensure their original work cannot be copied and uploaded without their permission, and CSAM would also be prevented from upload.”
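To make the “preverified uploads” idea concrete, here is a minimal sketch of the kind of gate Draper describes, where content is rejected unless every person featured has verified age and documented consent on file. All names here (PerformerRecord, upload_allowed, the record fields) are hypothetical illustrations, not anything from the report or any actual platform’s system; a real implementation would sit on top of vetted ID checks and signed releases.

```python
from dataclasses import dataclass

# Hypothetical record of a person featured in an upload. A real platform
# would back these flags with government-ID checks and signed releases.
@dataclass
class PerformerRecord:
    performer_id: str
    age_verified: bool      # identity and age confirmed and on file
    consent_on_file: bool   # express consent to this specific upload

def upload_allowed(featured: list[PerformerRecord]) -> bool:
    """Gate an upload: allow it only if every person featured in the
    content has both verified age and documented consent."""
    if not featured:
        return False  # no verified participants means no upload
    return all(p.age_verified and p.consent_on_file for p in featured)

# Example: one participant fully verified, one missing consent,
# so the upload is blocked before it can be published.
records = [
    PerformerRecord("a1", age_verified=True, consent_on_file=True),
    PerformerRecord("b2", age_verified=True, consent_on_file=False),
]
print(upload_allowed(records))  # False
```

The point of the design, as the report frames it, is that the same pre-publication check serves both groups at once: it blocks CSAM from ever being hosted and prevents a creator’s work from being uploaded by someone else without permission.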
What Draper’s report doesn’t recommend is telling as well. She doesn’t advocate banning porn, enacting sweeping federal legislation to impose stricter regulations on the adult industry, or having law enforcement prioritize investigations of adult sites.
For anyone interested in combatting the problem of CSAM and CSEA, Draper’s report offers numerous promising, constructive suggestions. On the other hand, for anti-porn activists simply looking for more ammunition in their ongoing campaign to demonize the adult industry under the guise of combatting CSAM, the report will prove a disappointment.