Researcher hacks self-driving car via sensors

By Mark Harris, for IEEE Spectrum

Sep 11, 2015

The multi-thousand-dollar laser ranging (lidar) systems that most self-driving cars rely on to sense obstacles can be hacked by a setup costing just $60, according to a security researcher.

“I can take echoes of a fake car and put them at any location I want,” says Jonathan Petit, Principal Scientist at Security Innovation, a software security company. “And I can do the same with a pedestrian or a wall.”

Petit set out to explore the vulnerabilities of autonomous vehicles, and quickly settled on sensors as the most susceptible technologies. “This is a key point, where the input starts,” he says. “If a self-driving car has poor inputs, it will make poor driving decisions.”

Using such a system, attackers could trick a self-driving car into thinking something is directly ahead of it, thus forcing it to slow down. Or they could overwhelm it with so many spurious signals that the car would not move at all for fear of hitting phantom obstacles.
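The article doesn't give Petit's implementation details, but the underlying physics of such an echo-spoofing attack is simple: a lidar measures distance by timing laser pulse echoes, so replaying a recorded pulse after a chosen delay makes an obstacle appear at any distance the attacker picks. A minimal sketch of that timing relationship (all names and numbers here are illustrative, not from Petit's setup):

```python
# Round-trip timing behind a lidar echo-spoofing attack.
# A lidar infers distance from how long a pulse takes to return,
# so an attacker controls the inferred distance via replay delay.

C = 299_792_458.0  # speed of light in m/s

def echo_delay(distance_m: float) -> float:
    """Round-trip time the lidar expects for an echo at distance_m."""
    return 2.0 * distance_m / C

def spoofed_distance(delay_s: float) -> float:
    """Distance the lidar infers from an echo arriving after delay_s."""
    return delay_s * C / 2.0

# To fake a car 20 m ahead, replay the pulse roughly 133 ns later:
delay = echo_delay(20.0)
print(f"{delay * 1e9:.1f} ns")  # ~133.4 ns
```

Because the required delays are in the tens to hundreds of nanoseconds, commodity pulse electronics suffice, which is consistent with the $60 hardware cost the article reports.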

Read more about how Petit developed the hack technology, and why sensor security matters.
