Google to Remove ‘Revenge Porn’ from Search Results
MOUNTAIN VIEW, Calif. – Google on Friday announced it will remove “revenge porn” from search results at the request of the victim. Revenge porn comprises nude or sexually explicit images posted online without permission and with intent to harass, humiliate or harm the reputation of the person depicted.
“Our philosophy has always been that Search [sic] should reflect the whole web,” Amit Singhal, senior vice president of Google Search wrote in a post on the company’s Public Policy Blog. “But revenge porn images are intensely personal and emotionally damaging, and serve only to degrade the victims—predominantly women. So going forward, we’ll honor requests from people to remove nude or sexually explicit images shared without their consent from Google Search results. This is a narrow and limited policy, similar to how we treat removal requests for other highly sensitive personal information, such as bank account numbers and signatures, that may surface in our search results.
“We know this won’t solve the problem of revenge porn—we aren’t able, of course, to remove these images from the websites themselves—but we hope that honoring people’s requests to remove such imagery from our search results can help.”
Singhal’s post noted a request form will be available “in the coming weeks.”
Victim advocates immediately applauded the move. Although 18 U.S. states had outlawed revenge porn by the middle of June, the laws ban only websites, not search engines, from displaying the material. Rep. Jackie Speier (D-Calif.) is preparing a federal bill to ban revenge porn, but it, too, will take aim at websites, not search. None of that goes far enough to prevent additional harm to revenge porn victims, activists said.
University of Maryland law professor Danielle Citron, author of Hate Crimes in Cyberspace, said Google’s move is an important step toward protecting personal privacy rights. Google effectively controls what 84 percent of the global population can find on the web. Even when images are removed from the primary source, secondary vectors like Google and its brethren perpetuate distress by maintaining copies of the images with links to where they originally were posted, Citron noted.
“What we have seen in the last six months is this public consciousness about the profound economic and social impact that posting nude images without someone’s consent and often in violation of their trust can have on people’s lives,” Citron told USA Today. “What victims will often tell you and what they tell me is that what they want most is not to have search results where their employers, clients and colleagues can Google them and see these nude photos. It’s not just humiliating; it wrecks their chances for employment. It makes them undatable and unemployable.”
Worldwide, Google already removes some search results based on court orders and other legal requests. The search engine also removes bank account numbers, signatures and other sensitive information upon request. Child sexual abuse images are removed as soon as they are discovered.
Under a European Union ruling known as the “right to be forgotten,” Google is obligated to remove, upon request, links to certain unfavorable information associated with an individual’s name, including records of criminal convictions.
That doesn’t mean the information no longer exists. Google has no control over what websites post or whether they remove objectionable material. Without a link in Google’s immense database, though, the material is virtually invisible. Those who applauded Google’s move hope invisibility will take all the fun out of what amounts to cyber-bullying.
“If it’s not in Google, does it actually exist? The answer is yes, it does exist but it’s a heck of a lot harder to find,” Danny Sullivan, founding editor of SearchEngineLand.com, told USA Today. “Even this won’t make it impossible but it does make it more difficult and, when it’s more difficult, it makes it less attractive for people to do this kind of behavior.”