Billions of license plate scans are part of a private surveillance database
Repo drivers and others can pay a small fee to track cars.
The US government might have reconsidered its plans for license plate recognition, but companies haven't -- and they've raised serious privacy concerns in the process. Motherboard has posted an exposé detailing the Digital Recognition Network, a privately run database that collects legions of plate recognition scans (roughly 9 billion to date) from repo drivers with camera-equipped cars. The system automatically captures both the plates and locations of every car they drive by, making it possible to track the movement of car owners across the US over months or even years. Anyone with access could find out where you live, work and socialize.
It costs just $20 to look up a license plate in the database, and $70 to receive a "live alert" that flags a plate whenever it shows up in a new scan.
As you might have already suspected, this automatic data gathering raises many issues. For one, most of the vehicles in the database belong to completely innocent people who have no way of knowing whether they're even included in the data set. And while a DRN spokesperson said the company "takes data security seriously" and doesn't allow access without its approval, there have been instances where unauthorized people obtained that access. It's feasible that users (approved and otherwise) could exploit the data for stalking, or to gain the upper hand in court without revealing their sources.
Law enforcement can also use the system, and DRN's sibling brand Vigilant Solutions sells the technology to government agencies. That raises the possibility of rogue officers using plate tracking to intimidate protesters or witnesses to police abuses.
It may be difficult to challenge the practice. DRN has argued that it's taking photos of license plates in public spaces, where there allegedly isn't an expectation of privacy. As a private organization, DRN also isn't obligated to respond to requests for information or accept external oversight. Much like with facial recognition, you'll just have to hope that companies either mend their ways or limit the potential for abuse -- at least, for now.