The PROTECT Act: Overbroad and Vague
In a press release issued last week, Sen. Mike Lee (R-UT) announced that he has introduced the Preventing Rampant Online Technological Exploitation and Criminal Trafficking (“PROTECT”) Act of 2022, saying that adult websites “need to do more to prevent the exploitation that is occurring on their platforms and allow individuals to remove images shared without their consent.”
Ostensibly, the bill is designed to prevent the distribution of child sexual abuse material (“CSAM”) on adult websites – and those that accept the upload of user-generated content (“UGC”), in particular – but by its own terms, the PROTECT Act’s scope appears to be much broader than that.
One of several pieces of legislation that have sported the “PROTECT” acronym over the years, Lee’s new proposal begins with a long set of findings that reference reports by the New York Times, the National Center for Missing and Exploited Children (NCMEC) and other sources. These findings emphasize data points that appear to be tied to the criminal prosecution of individuals associated with GirlsDoPorn.com (though the bill names neither the company nor the individuals involved), as well as claims found in civil lawsuits targeting Pornhub and other sites that feature UGC. (The bill omits reference to some of NCMEC’s less convenient reporting – including the fact that a much higher percentage of CSAM online is found on mainstream platforms like Facebook than on adult websites of any kind.)
Several of the bill’s key definitions are quite expansive, including a definition of “covered platform” that encompasses “an interactive computer service that hosts or makes available to the general public pornographic images.” A subsequent definition provides that “pornographic image” means “any visual depiction of actual or feigned sexually explicit conduct; or any intimate visual depiction.”
The definition of “intimate visual depiction” is broader still, covering “any visual depiction… of an individual who is reasonably identifiable from the visual depiction itself or information displayed in connection with the visual depiction, including through… facial recognition; an identifying marking on the individual, including a birthmark or piercing; an identifying feature of the background of the visual depiction; voice matching; or written confirmation from an individual who is responsible, in whole or in part, for the creation or development of the visual depiction; and in which… the individual depicted is engaging in sexually explicit conduct; or the naked genitals, anus, pubic area, or post-pubescent female nipple of the individual depicted are visible.”
As for what the bill would require operators of “covered platforms” to do, it would impose exacting age- and identity-verification responsibilities of the sort that U.S. courts have rejected when scrutinizing previous legislation.
Under the bill, a “covered platform operator may not upload or allow a user to upload a pornographic image to the covered platform unless the operator has verified… the identity of the user… and that the user is not less than 18 years old.” (Internal citations omitted.)
To comply with this requirement, platform operators “shall verify the identity and age of a user by… requiring use of an adult access code or adult personal identification number… accepting a digital certificate that verifies age… or using any other reasonable measure of age verification that the Attorney General has determined to be feasible with available technology.”
The bill would also require platform operators to “obtain verified consent forms from individuals uploading content and those appearing in uploaded content” and “mandate that websites quickly remove images upon receiving notice they were uploaded without consent,” to put it in the language of Lee’s press release.
Attorney Larry Walters notes that “many of the obligations the bill seeks to impose are already followed by adult platforms in accordance with the Updated MasterCard Guidelines effective in October 2021.”
“Mainstream platforms that allow some adult content will be more severely impacted than adult platforms since they have generally not been required to adopt these same standards since they do not rely on credit card processing to sell subscriptions,” Walters told YNOT.
From a constitutional perspective, Walters said the bill “suffers from many of the same infirmities” that led to Section 2257 being found largely unconstitutional in the Free Speech Coalition litigation.
“Treating ‘pornography’ different[ly] from other forms of protected speech is a content-based distinction and will require that the government demonstrate a compelling interest which has been addressed through the least restrictive means,” Walters explained. “This test has resulted in invalidation of most content-based restrictions on speech and can only be met in unique circumstances. The law also appears to be overbroad and vague, given the imprecise definitions of the content and platforms subject to the restrictions and obligations.”
The bill’s vagueness creates other potential constitutional complications, as well.
“A separate question arises whether some or all of these new requirements apply to websites that do not allow user uploads, or even to hosts that are removed from the operation of a platform,” Walters said. “The imprecise language used in the Bill could be constitutionally problematic from a First and Fifth Amendment perspective.”
Lee’s bill has not been heard by any Senate committee or subcommittee yet and is bound to undergo changes in the process of being brought to the floor for a vote – if indeed it advances that far. YNOT will continue to track the bill and provide updates on any substantive developments in the weeks and months ahead.