Should Adult AI Platforms Be Concerned About MN’s “Deep Fake” Law?
SAINT PAUL, Minn. – Earlier this month, a new law went into effect in Minnesota that criminalizes nonconsensual dissemination of “deep fake” images “depicting intimate parts or sexual acts.” The new law also creates a civil right of action for the same offense, as well as a criminal offense for using “deep fake technology to influence an election.”
Under the law, “deep fake” is defined as “any video recording, motion-picture film, sound recording, electronic image, or photograph, or any technological representation of speech or conduct substantially derivative thereof… that is so realistic that a reasonable person would believe it depicts speech or conduct of an individual and the production of which was substantially dependent upon technical means, rather than the ability of another individual to physically or verbally impersonate such individual.” (Statutory section numbers are omitted throughout this post.)
The section of the statute which creates the civil right of action for nonconsensual dissemination of sexual deepfakes states such a cause of action exists when “a person disseminated a deep fake with knowledge that the depicted individual did not consent to its public dissemination; the deep fake realistically depicts any of the following: the intimate parts of another individual presented as the intimate parts of the depicted individual; artificially generated intimate parts presented as the intimate parts of the depicted individual; or the depicted individual engaging in a sexual act; and the depicted individual is identifiable from the deep fake itself, by the depicted individual or by another individual; or from the personal information displayed in connection with the deep fake.”
The law also specifies that the “fact that the depicted individual consented to the creation of the deep fake or to the voluntary private transmission of the deep fake is not a defense to liability for a person who has disseminated the deep fake with knowledge that the depicted individual did not consent to its public dissemination.”
Under the law, the court may award damages to a prevailing plaintiff that include “general and special damages, including all finance losses due to the dissemination of the deep fake and damages for mental anguish; an amount equal to any profit made from the dissemination of the deep fake by the person who intentionally disclosed the deep fake; a civil penalty awarded to the plaintiff of an amount up to $100,000 and court costs, fees, and reasonable attorney fees.”
Under the law, it is a crime to intentionally disseminate a deep fake under the same set of conditions described earlier in the statute as a basis for a civil right of action. Those convicted under the law “may be sentenced to imprisonment for not more than three years or to payment of a fine of $5,000, or both” when “the depicted individual suffers financial loss due to the dissemination of the deep fake;” when the guilty party “disseminates the deep fake with intent to profit from the dissemination;” if the person convicted “maintains an Internet website, online service, online application, or mobile application for the purpose of disseminating the deep fake;” if the culprit “posts the deep fake on a website” and/or “disseminates the deep fake with intent to harass the depicted individual.” The law also provides for increased fines and jail time if the person convicted “obtained the deep fake by committing a violation” of certain other sections of Minnesota law, or if they have “previously been convicted under this chapter.”
Another section of the law stipulates nothing in the new statute “shall be construed to impose liability upon the following entities solely as a result of content or information provided by another person: (1) an interactive computer service as defined in United States Code, title 47, section 230, paragraph (f), clause (2); (2) a provider of public mobile services or private radio services; or (3) a telecommunications network or broadband provider.”
Attorney Larry Walters of FirstAmendment.com noted Minnesota is not the only state to have established laws concerning deepfakes – nor is Minnesota’s statute the most broadly worded of such laws.
“Nonconsensual sexual deepfake laws are not generally limited to celebrities, and anyone depicted in a nonconsensual sexual deepfake can bring a claim, regardless of whether they are famous or not,” Walters observed. “Separately, New York has an additional cause of action for celebrities. Under that statute, a celebrity can bring a claim if a deepfake is used in an advertisement without their authorization, regardless of whether the deepfake is sexual or not.”
Asked whether sites and platforms which enable users to create deepfakes should be concerned about the scope of the Minnesota law, Walters said “the questions surrounding platform liability for deepfake content are unsettled, particularly if the platform allows users to generate AI images utilizing the platform’s own technology/input.”
“However, if the platform simply allowed users to share their own AI content, Section 230 immunity would presumably protect the platform – unless a FOSTA exception was invoked relating to prostitution or sex trafficking,” Walters added.
For more information on the status of deepfake-related laws and how they might impact your business, see this article by Walters and his colleague, Bobby Desmond.
Image by cottonbro studio from Pexels