The system, known as iCOP (Identifying and Catching Originators in P2P Networks), works similarly to Microsoft's PhotoDNA: images of child porn are tagged with a digital signature after being collected in the course of an investigation, and those signatures are then shared in a global database for law enforcement. If the same images or videos resurface during other investigations, they're automatically flagged. This spares law enforcement the stomach-turning drudgery of manually checking images against the database, saving time and manpower and accelerating investigations. What's more, it automatically identifies new material (anything that doesn't get flagged), which provides fresh leads on more recent crimes.
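The workflow described above can be sketched in a few lines. iCOP's actual signature algorithm isn't public, so this is only a minimal illustration of the triage logic, with an ordinary SHA-256 digest standing in for the real perceptual signature; the function and variable names are hypothetical.

```python
import hashlib


def signature(data: bytes) -> str:
    """Stand-in for the digital signature iCOP computes per file.

    A real system would use a perceptual signature that survives
    re-encoding; a cryptographic hash is used here purely for
    illustration.
    """
    return hashlib.sha256(data).hexdigest()


def triage(files: dict[str, bytes], known_db: set[str]):
    """Split seized files into auto-flagged known material and
    potentially new material that needs investigator review."""
    flagged, new = [], []
    for name, data in files.items():
        if signature(data) in known_db:
            flagged.append(name)   # matches the shared database
        else:
            new.append(name)       # fresh lead: unseen material
    return flagged, new
```

The point of the split is the second list: anything absent from the database is surfaced as potentially new material, which is what turns the tool from a deduplicator into a source of fresh leads.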
And given that, according to the UN, 16 percent of people who possess this sort of material have themselves abused children, reducing the amount of time between discovery and arrest can help save children from further exploitation. The iCOP system is designed for use on Gnutella and has been trained with tens of thousands of images ranging from adult porn and benign images of kids to the full-on sexual abuse of minors.
Interpol has already begun testing iCOP for its own use in the Lyon region of France. Once installed on the Interpol system and linked to other databases like Project Vic, iCOP returned a false-positive rate of less than 8 percent on images and just over 4 percent on videos.
"It significantly reduces the overhead for investigators," Awais Rashid, a professor at Lancaster University (which helped develop the system), told WIRED. "Instead of having to trawl through large numbers of images and videos to identify new child abuse material, investigators are provided with automated matches which are highly accurate. In practice, this means investigators having to look at a small number of images and videos rather than thousands." Given its initial success with Interpol, the iCOP team hopes to expand the system to Tor-obscured networks.