Self-driving cars can be fooled by fake signals

You'd think that self-driving cars would be most vulnerable to remote hacks, but the biggest danger may come from someone nearby with a handful of cheap electronics. Security researcher Jonathan Petit has determined that you can fool LIDAR (the laser ranging system common on autonomous vehicles) by firing laser pulses that mimic the "echoes" of cars and other objects that aren't really there. All you need is a low-power laser, a basic computing device (an Arduino kit or Raspberry Pi is enough) and the right timing -- you don't even need good aim. Petit managed to spoof objects from as far as 330 feet (about 100 meters) away in his proof-of-concept attack, and he notes that it's possible to present multiple copies of these imaginary objects or make them move. In other words, it'd only take one prankster to make a self-driving car swerve or stop to avoid a non-existent threat.
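To get a feel for the timing involved, here's a minimal Python sketch -- hypothetical, not Petit's actual tooling -- that computes how long a spoofing device would wait before replying in order to make a LIDAR "see" an object at a chosen distance. A LIDAR unit measures distance by the round-trip time of its pulse, so a fake obstacle 50 meters out means answering roughly 334 nanoseconds after the real pulse arrives.

```python
# Illustrative sketch only: the names and structure here are
# hypothetical and are not taken from Petit's research tooling.

C = 299_792_458.0  # speed of light in a vacuum, m/s

def echo_delay_s(fake_distance_m: float) -> float:
    """Round-trip travel time the LIDAR would expect for an object
    at the given distance: delay = 2 * d / c."""
    return 2.0 * fake_distance_m / C

if __name__ == "__main__":
    for d in (10.0, 50.0, 100.0):  # fake object distances in meters
        print(f"object at {d:5.1f} m -> echo after {echo_delay_s(d) * 1e9:8.1f} ns")
```

Those nanosecond-scale delays are tight, but they're within reach of the pulse-timing electronics a hobbyist microcontroller can drive, which is why commodity hardware is enough for this kind of attack.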

There's no guarantee that this will be a major issue if and when self-driving cars become commonplace. Petit's technique only works so long as LIDAR units' pulses aren't encrypted or otherwise obscured. While that's true of many commercial systems at the moment, it's possible that production-ready vehicles will lock things down. Still, this is a not-so-friendly reminder that car makers have a lot of work ahead of them if they're going to secure their robotic rides.

[Image credit: AP Photo/Tony Avelar]