Does Banning Adult Content on Social Media Sites Prevent CSAM?
WASHINGTON — The National Center on Sexual Exploitation (NCOSE) sent a joint letter to Reddit demanding that the platform remove thousands of subreddits hosting hundreds of millions of NSFW images, GIFs, and videos that were legally produced by consenting adults. The letter was signed by 320 “experts” who accuse Reddit of failing to prevent image-based sexual abuse material, including child sexual abuse material (CSAM) and non-consensual intimate imagery. However, the letter offers no policy recommendations for updating the platform’s terms of service to prevent such material; instead, it calls for removing all adult content, even content that is legal and consensual. It is the latest flashpoint in a broader push to block adult content on social media sites.
Reddit is one of the most popular social media websites. It has long been a relatively speech-friendly platform for a wide range of viewpoints and subject matter, including adult posts, videos, and images. But a recent push to censor sex-friendly social networks has led to a significant crackdown. Partly in response to the highly controversial FOSTA-SESTA, signed into law by former President Donald Trump, Reddit has repeatedly de-platformed sex workers, according to their own accounts. The crackdown follows similar moves by other platforms, most notably Tumblr’s removal of adult content in 2018.
Reddit also isn’t the first platform to face accusations of widespread image-based sexual abuse. Much of the justification for campaigns to censor legal adult content is tied to efforts to counter CSAM circulation on some of the world’s most popular sites. The social networks owned by Meta Platforms Inc., Instagram and Facebook, have long had terms of service that prohibit adult content and pornography. Reporting figures from the CyberTipline program run by the National Center for Missing and Exploited Children (NCMEC), however, suggest that adult content prohibitions have little to do with preventing image-based sexual abuse. The Meta-owned platforms filed over 22 million reports in 2021, the vast majority of all CyberTipline reports made to NCMEC that year. As Vice News points out, Reddit transmitted only 10,059 reports to NCMEC.
NCMEC data for 2022 indicates that Reddit submitted 52,592 reports. Imgur, a popular service for hosting images shared on social media sites like Reddit, recently faced a similar controversy. Imgur updated its terms of service to ban the upload of new adult content to its website. It will also remove years of existing sexually explicit material, mainly virtual pornography and consensual adult content, from its site. The change caused a stir, including among subreddit communities with millions of members, such as r/Gonewild (4.4 million members), r/NSFW (4 million members), r/Sex (2.4 million members), and r/RealGirls (3.6 million members).
Members of these communities commonly use Imgur to host images and GIFs. Imgur’s purge will remove petabytes of content, a change that, according to some sources, would “break” entire web communities. But does banning adult-oriented and sexual content prevent CSAM? In the same CyberTipline datasets for 2021 and 2022, Imgur reported offending material to NCMEC even while nudity was still allowed on its platform.
To be clear, I am not claiming a direct connection between CSAM reporting numbers and whether a platform’s terms of service permit or ban adult content. The point is to understand every factor that can contribute to harm reduction in site policy.
Social media image by Visual Tag Mx from Pexels