
London police will use AI to look for child porn on seized devices

It would make the force more efficient and save officers from psychological trauma.


Around this time last year, Interpol revealed it was using an AI system to track down child porn on P2P networks in the global hunt for predators. Tech firms like Google and Microsoft have been using their own tools in the fight against child exploitation for years, too. Now, the UK's Metropolitan Police say they want AI recognition software of their own, capable of identifying images and video of abuse on confiscated devices like smartphones and computers.

Currently, the Met's image recognition software can detect guns, drugs and money while scanning hardware for evidence, but it struggles to make accurate calls on nudity. In the next "two to three" years, though, the force wants a more sophisticated AI tool. Not only should this help with the investigation load -- the digital forensics department processed 53,000 devices last year -- but it should also spare officers some of the psychological trauma that comes with looking at images of child abuse day in, day out.
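
The report doesn't detail how the Met's current tool works, but the general idea -- running a pretrained image classifier over files pulled from a device and flagging anything above a confidence threshold -- looks roughly like the sketch below. Everything here is an assumption for illustration: the ImageNet-based model, the keyword list, the threshold and the evidence path are stand-ins, not the Met's actual software.

# Illustrative sketch only: a generic pretrained classifier used to triage
# images extracted from a seized drive. Real forensic tools use purpose-built
# models and hash-matching databases, not ImageNet labels.
from pathlib import Path

import torch
from PIL import Image
from torchvision import models

# Hypothetical "of interest" keywords among the ImageNet-1k labels.
FLAG_KEYWORDS = ("rifle", "revolver", "pill bottle")
CONFIDENCE_THRESHOLD = 0.6  # arbitrary cut-off for this sketch

weights = models.ResNet50_Weights.DEFAULT
model = models.resnet50(weights=weights).eval()
preprocess = weights.transforms()
labels = weights.meta["categories"]

def triage(image_dir: str) -> list[tuple[str, str, float]]:
    """Return (path, label, confidence) for images whose top class matches a flagged keyword."""
    hits = []
    for path in Path(image_dir).glob("**/*.jpg"):
        img = Image.open(path).convert("RGB")
        with torch.no_grad():
            probs = model(preprocess(img).unsqueeze(0)).softmax(dim=1)[0]
        conf, idx = probs.max(dim=0)
        label = labels[int(idx)]
        if conf >= CONFIDENCE_THRESHOLD and any(k in label.lower() for k in FLAG_KEYWORDS):
            hits.append((str(path), label, float(conf)))
    return hits

if __name__ == "__main__":
    # Hypothetical mount point for images carved from a seized device.
    for path, label, conf in triage("/evidence/extracted_images"):
        print(f"{path}: {label} ({conf:.2f})")

The appeal of this kind of automation is clear from the numbers above: a classifier can pre-sort tens of thousands of devices' worth of images so that human investigators only review the flagged subset.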

The Met is also looking to move all the information it holds from its London-based data centre to a commercial cloud provider, such as Google or Amazon. Regarding security, the Met's head of digital and electronics forensics told The Telegraph that these companies are actually better positioned to safeguard this data, since they have the resources to invest in the latest server armour. As you might imagine, there are many legal issues that could get in the way of the Met moving sensitive images off-site. It's merely a plan at this point, though, so who knows what special arrangements the Met and cloud providers may come up with in the future.