AI Knows It When AI Sees It, Unless “It” Is A Sand Dune
LONDON – Ever since web content filters first appeared, consumers, developers and policy-makers alike have been confronted with the twin problems of over- and under-inclusiveness.
Whether we’re talking about browser-based filters that interpret on-page content and decide information about breast cancer is pornographic, or a more recent kid-friendly content filter that somehow fails to block Pornhub, it’s clear these systems historically haven’t been accurate or foolproof enough to fend off criticism from speech activists, aggrieved parents or ass-covering politicians.
While image recognition technology has improved greatly over the years, the Metropolitan Police have found it’s still not reliable enough to obviate the need for human eyes in the process of discovering and cataloging illegal images such as depictions of child abuse – particularly if the suspect whose hard drives and devices are being scanned happens to like images of sand dunes.
“Sometimes it comes up with a desert and it thinks it’s an indecent image or pornography,” said Mark Stokes, the Met’s head of digital and electronic forensics. “For some reason, lots of people have screen-savers of deserts and [the software] picks it up thinking it is skin color.”
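The failure mode Stokes describes is a familiar one: older explicit-image detectors leaned heavily on how much of a frame falls within a skin-tone color range, and sand sits squarely inside most of those ranges. The toy sketch below is not the Met’s software – the rule and the threshold are illustrative assumptions – but it shows how a naive skin-fraction heuristic ends up flagging a uniformly sandy screensaver.

```python
import numpy as np

def skin_pixel_mask(rgb: np.ndarray) -> np.ndarray:
    """Rule-of-thumb RGB skin test (illustrative only, not the Met's system):
    R > 95, G > 40, B > 20, R > G > B, with enough spread between channels."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    spread = rgb.max(axis=-1).astype(int) - rgb.min(axis=-1).astype(int)
    return (r > 95) & (g > 40) & (b > 20) & (r > g) & (g > b) & (spread > 15)

def looks_explicit(rgb: np.ndarray, threshold: float = 0.4) -> bool:
    """Flag an image if the skin-toned pixel fraction exceeds a (hypothetical) threshold."""
    return skin_pixel_mask(rgb).mean() > threshold

# A synthetic "sand dune" frame: a 640x480 image of uniform sandy beige, roughly RGB (210, 180, 140).
dune = np.zeros((480, 640, 3), dtype=np.uint8)
dune[...] = (210, 180, 140)

print(looks_explicit(dune))  # True -- the desert screensaver trips the skin-tone heuristic
```

Modern systems rely on learned features rather than hand-written color rules, but the underlying sensitivity to skin-like tones is a plausible reason deserts, beaches and close-up faces keep showing up as false positives.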
The Met has good reason to hope artificial intelligence will someday entirely supplant human reviewers of such images, of course: Nobody wants to be tasked with sifting through explicit images to identify and categorize horrific depictions of child abuse.
“We have to grade indecent images for different sentencing, and that has to be done by human beings right now, but machine learning takes that away from humans,” Stokes said. “You can imagine that doing that for year-on-year is very disturbing.”
Despite the ongoing problem of misidentification, Stokes said he believes such AI-aided image recognition systems will be able to reliably pick out images of abuse “within two-three years.”
In the meantime, the Met is trying to tackle another logistical headache of the effort: storing the images themselves.
Stokes said the police force is drawing up a plan to move its abuse image data to a cloud service offered by the likes of Google, Microsoft or Amazon, but such a move is greatly complicated by legal concerns.
While centralizing the image data in the cloud would facilitate digital analysis using the enormously powerful resources available to Microsoft or Google, law enforcement has court authorization to store illegal images; cloud service providers are granted no such exception from the law.
Plus, given the several well-publicized breaches of cloud-based systems in recent years, including some that resulted in explicit private images being released online, policymakers might reasonably be concerned about any plan to host child pornography and sexual abuse images in the cloud.
Still, Stokes believes such legal issues can be dealt with through careful contract-writing, including stipulations that would indemnify the cloud service providers against liability and provide them with legal cover for storing the images.
“We have been working on the terms and conditions with cloud providers, and we think we have it covered,” Stokes said.
Stokes also noted that when it comes to security concerns, past breaches notwithstanding, companies like Amazon, Microsoft or Google are likely better able to defend against unauthorized access than a law enforcement agency, in part because the private sector is better funded to invest in expert tech talent and advanced security technology.