NYPD faces lawsuit for withholding info on facial recognition

After 100 information requests, the department released only one document on the program.

A think tank is suing the NYPD over its failure to reveal details about its secret facial recognition program. Georgetown University's Center on Privacy and Technology (CPT) alleges that the department has failed to comply with New York state's Freedom of Information Law (FOIL) by withholding information on the system, which the department started using to investigate crimes in 2011. When groups submitted FOIL requests for training manuals and documentation, the NYPD insisted it didn't have any, so CPT is taking the department to court.

The NYPD did share one document: a Chief of Detectives memo that instructs officers on the protocol for submitting a facial recognition search request. It confirms the program exists, but not how it's used, how it was built or what privacy protections are in place for the database of citizens' faces. That's unsettling considering that the unit built around the facial identification program conducted over 8,500 facial recognition investigations leading to 2,000 arrests from 2011 until last year, a former NYPD official told the NY Daily News. The department didn't deny that purchase invoices for the program exist, but claimed they couldn't be requested under FOIL, another claim CPT is contesting in its lawsuit.

New York City isn't the only place in the state investing in more face-screening tech. The NY DMV's facial recognition system has been responsible for over 100 arrests since it was refined in January 2016, and Governor Cuomo announced last October that he wanted to expand it to critical locations by covering bridges, tunnels and airports in cameras and sensors. Ideally, this will help spot terrorists, but even facial recognition databases used by our top agencies are far from foolproof: the FBI's system returns false positives 15 percent of the time, more frequently with women and minorities.