PACT Act Section 230 ‘Fix’ Could Harm Adult Platforms
WASHINGTON — Congress is developing several proposals to overhaul the technology platform liability shield established under Section 230 of the Communications Decency Act of 1996. Sens. Brian Schatz, D-Hawaii, and John Thune, R-South Dakota, have reintroduced the Platform Accountability and Consumer Transparency (PACT) Act. The proposal aims to update Section 230 by making platforms' content moderation practices more transparent and by holding interactive computer services accountable for allegedly suppressing free speech online.
“Section 230 gave internet companies the flexibility to grow the internet economy along with the responsibility of setting and enforcing reasonable rules on content,” said Schatz in a statement. Thune, a pro-Trump Republican, has joined Schatz in a push to change U.S. technology policy in ways the senators say will benefit internet consumers. That might not be the case, though.
The PACT Act was previously introduced in the summer of 2020 and was widely viewed as an unconstitutional attempt to fix an issue that lawmakers like Thune seem to misunderstand. One of the PACT Act’s biggest problems pertains to how the bill would change the liability shield. Currently, Section 230’s liability standard carries exceptions for federal criminal law enforcement.
However, if the PACT Act were to become law, it would extend that exception to civil statutes. Consequently, regulatory agencies like the Federal Communications Commission would gain the power to file legal complaints against platforms that allegedly violate civil laws, including laws governing anti-discrimination and user accessibility.
In an emailed statement to Bloomberg Law, Thune said that the PACT Act would “preserve user-generated content and free speech on the internet while increasing consumer transparency and the accountability of big internet platforms.” Thune and Schatz also claim that the bill could be a much-needed reform to better the internet.
In the bill’s current form, large online platforms would be required to remove content within four days of a court order finding that the specific content is illegal. The updated version would also require private companies to issue biannual transparency reports covering removed, demonetized, and deprioritized content. The bill would further grant the Department of Justice, the Federal Trade Commission, and state attorneys general the legal authority to file civil lawsuits against companies alleged to be in violation of the law.
Simply put, the PACT Act proposes a set of mandates that go far beyond current Section 230 compliance.
On closer review, the bill reads like a bipartisan laundry list of policy changes that have already been addressed by the courts and in private arbitration. The PACT Act would require internet companies to implement a variety of regulatory measures. First, platforms would have to publish an acceptable use policy. Second, companies would have to staff a hotline, five days a week and eight hours a day, where live company representatives answer questions about the acceptable use policy and content moderation decisions.
The bill would also require companies to give users a means of reporting content they believe is illegal or in violation of the acceptable use policy, along with an email complaint system for the same purposes, and to maintain a formal appeals process for users who disagree with a company’s content moderation decisions.
The PACT Act would also strip protections from companies that have “actual knowledge” of illegal content or illegal activity posted by users and fail to remove it within four days. Online entities that host third-party-generated content would have to explain to government regulators why they made a given content moderation decision and why they continue to enforce it. A platform operator who fails to comply with the transparency requirements would be committing an unfair or deceptive act in violation of federal law.
Platforms, however, already do much of this. As for the PACT Act’s potential impact, the bill applies directly to “interactive computer services.” That means adult entertainment platforms could fall under the same regulatory purview if the PACT Act is signed into law.
Platforms that feature consensual NSFW content, like OnlyFans, maintain large user databases with international reach. OnlyFans, owned by the London-based Fenix International Limited, has well over 50 million users and tens of thousands of content creators. Because of that reach, the PACT Act’s requirements aimed at the so-called “big tech” segment would apply directly to such a platform, introducing a variety of new burdens that could affect performers, amateur studios, cam models, NSFW influencers, and other social media users.
Cam platforms like ManyVids, Chaturbate, LiveJasmin, MyFreeCams, and CAM4.com could also fall under this PACT Act framework — a fact that’s highly problematic, for reasons elucidated by legal scholar Eric Goldman.