Do Porn Stars Really Have ‘Control’ Over Their Images?
DENVER – It’s amazing how often a pundit hailing from outside the adult entertainment industry invokes something about the porn industry to make a point. Most of the time, they haven’t done enough homework to ensure their industry-related observations hold water.
The latest example I’ve seen is a well-intentioned piece published on CNN.com, written by Amy Adele Hasinoff, an assistant professor of communications at the University of Colorado, Denver, and the author of Sexting Panic: Rethinking Criminalization, Privacy, and Consent.
The title of the article, “The policy that the US porn industry has and Facebook needs,” broadly implies the adult industry has solved a problem Facebook hasn’t. In this case, the problem is the unauthorized, non-consensual publication of intimate images, an act typically referred to as “revenge porn.”
Hasinoff’s argument starts with a statement that is true, so far as it goes: “Surprisingly, performers in the legal U.S. pornography industry have more control over their nude images than Facebook users. Facebook’s general policy is to post photos first and then deal with illegal content or requests for the image to be taken down later. In contrast, porn performers need to provide written consent — by signing a ‘model release’ — before their images are ever published and distributed.”
The problem with this assertion is that it entirely fails to examine what happens to the images in question after a performer signs the model release.
Put bluntly, if performers and the rights-holders who possess the copyright to their images truly had “control” over the subsequent digital redistribution of those images, many of the world’s most popular adult sites wouldn’t exist.
To be fair, Hasinoff’s point appears to be that Facebook should behave the way legitimate, law-abiding adult producers do (or are supposed to, at any rate) by establishing a policy under which people depicted in images posted to the platform can deny others the ability to publish pictures of them.
“Imagine this: I could get a request for permission every time you try to post a photo of me — you’d have to tag everyone in your photo and facial recognition could help too,” Hasinoff writes. “Right now, I can get a notification if you tag me, but the photos have already been posted. Maybe you could post a photo with my face blurred out until you have my permission to show the full image. Maybe a setting could let me always (or never) trust you to post photos of me without asking each time.”
Aside from the obvious downsides of this notion from Facebook’s perspective (not least the way it would slow down and complicate the user experience), Hasinoff’s idea relies on one of two very uncertain protections: either Facebook users would need to be completely honest and mistake-free in tagging the images to identify the person whose permission is to be sought, or Facebook’s image-recognition technology would need to be flawless in identifying the person in the image so they could be notified.
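To make the mechanics concrete, here is a minimal sketch (in Python, purely for illustration) of the consent-gated posting flow Hasinoff describes. Every name in it, the PendingPost record, the TrustSetting options, and the submit_photo and notify_for_consent functions, is hypothetical; nothing here reflects anything Facebook has actually built, and the sketch deliberately exposes the two weak links just mentioned: the honesty of the user-supplied tags and the accuracy of the face-recognition output.

```python
# Hypothetical sketch of a consent-gated photo posting flow.
# All names (PendingPost, TrustSetting, submit_photo) are illustrative only;
# nothing here reflects an actual Facebook API.

from dataclasses import dataclass, field
from enum import Enum


class TrustSetting(Enum):
    ASK_EACH_TIME = "ask"      # default: request consent for every photo
    ALWAYS_ALLOW = "always"    # "always trust you to post photos of me"
    NEVER_ALLOW = "never"      # "never trust you to post photos of me"


@dataclass
class PendingPost:
    uploader: str
    photo_id: str
    tagged_people: list[str]
    approvals: dict[str, bool] = field(default_factory=dict)

    def visible_faces(self) -> dict[str, bool]:
        """Faces stay blurred until the tagged person grants permission."""
        return {person: self.approvals.get(person, False)
                for person in self.tagged_people}


def notify_for_consent(person: str, post: PendingPost) -> None:
    # Placeholder for a real notification; here we just record the request.
    print(f"Asking {person} for permission to appear in {post.photo_id}")


def submit_photo(uploader: str, photo_id: str,
                 user_tags: list[str],
                 recognized_faces: list[str],
                 trust: dict[str, TrustSetting]) -> PendingPost:
    """Combine manual tags with face-recognition guesses, then apply each
    person's trust setting before anything is shown unblurred.

    The two failure points flagged above: this only works if `user_tags`
    is honest and complete, or if `recognized_faces` is accurate.
    """
    tagged = sorted(set(user_tags) | set(recognized_faces))
    post = PendingPost(uploader, photo_id, tagged)
    for person in tagged:
        setting = trust.get(person, TrustSetting.ASK_EACH_TIME)
        if setting is TrustSetting.ALWAYS_ALLOW:
            post.approvals[person] = True
        elif setting is TrustSetting.NEVER_ALLOW:
            post.approvals[person] = False
        else:
            # Face remains blurred until the person responds.
            notify_for_consent(person, post)
    return post
```

Even under those generous assumptions, the flow only covers people the uploader bothers to tag or the recognizer manages to find, which leads directly to the gaps discussed next.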
The idea also doesn’t address the possibility of photos being posted of people who don’t have Facebook accounts (yes, such people do exist) or the possibility of posting images of people who are unavailable to provide permission, whether it’s because they’re offline or deceased.
Going back to Hasinoff’s comparison between the relative ability of Facebook users and porn performers to control the use of their images, performers’ images wind up in memes and product advertisements, and they’re used as avatars on dating sites (often attached to fake profiles), in addition to the previously referenced distribution across all manner of tube sites. I think if we were to ask 1,000 porn performers whether they’re OK with unauthorized uses of their images and likenesses, substantially more than 500 of them would say hell no.
The bottom line here is that, for all the same reasons content piracy and copyright infringement are intractable problems inherent to digital distribution, revenge porn is going to be a very tough nut to crack no matter how it is approached.
This is not to say an idea like Hasinoff’s can’t help at all, but just like the porn industry “policy” she cites (which is more a legal prophylactic than an industry policy), it wouldn’t come close to preventing the unauthorized image distribution that lies at the heart of the problem.
Image © publicdomainphotos