Researchers were able to trick a Tesla Inc vehicle into speeding by putting a strip of electrical tape over a speed limit sign, spotlighting the kinds of potential vulnerabilities facing automated driving systems.
Technicians at McAfee Inc placed the piece of tape horizontally across the middle of the “3” on a 35 mile-per-hour speed limit sign. The change caused the vehicle to read the limit as 85 miles per hour, and its cruise control system automatically accelerated, according to research released by McAfee on Wednesday.
McAfee says the issue isn’t a serious risk to motorists. No one was hurt and the researcher behind the wheel was able to safely slow the car.
But the findings, from 18 months of research that ended last year, illustrate a weakness of machine learning systems used in automated driving, according to Steve Povolny, head of advanced threat research at McAfee. Other research has shown how changes in the physical world can confuse such systems.
The tests involved a 2016 Model S and Model X that used camera systems supplied by Mobileye Inc, now a unit of Intel Corp. Mobileye systems are used by several automakers, though Tesla stopped using them in 2016.
Tests on Mobileye’s latest camera system didn’t reveal the same vulnerability, and Tesla’s latest vehicles apparently don’t depend on traffic sign recognition, according to McAfee. Tesla didn’t respond to e-mails seeking comment on the research.
“Manufacturers and vendors are aware of the problem and they’re learning from the problem,” Povolny said. “But it doesn’t change the fact that there are a lot of blind spots in this industry.”
To be sure, the real-world threats of such an occurrence today are limited. For one, self-driving cars are still in the development phase, and most are being tested with safety drivers behind the wheel. Vehicles now on sale with advanced driver-assist systems still require an attentive human driver.
And the McAfee researchers were able to trick the system only by reproducing a specific sequence in which the driver-assist function was already engaged when the car encountered the altered speed limit sign. Manufacturers are also integrating mapping data that reflects the correct speed limit into their systems.
“It’s quite improbable that we’ll ever see this in the wild or that attackers will try to leverage this until we have truly autonomous vehicles, and by that point we hope that these kinds of flaws are addressed earlier on,” Povolny said.
In a statement, Mobileye said human drivers can also be fooled by such a modification and that the system tested by the researchers was designed to assist a human driver and not to support autonomous driving.
“Autonomous vehicle technology will not rely on sensing alone, but will also be supported by various other technologies and data, such as crowd sourced mapping, to ensure the reliability of the information received from the camera sensor and offer more robust redundancies and safety,” the company said.
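As a rough illustration of the kind of redundancy Mobileye describes, a perception stack might sanity-check a camera-read speed limit against a map-derived one before passing it to cruise control. The sketch below is hypothetical; the function names and disagreement threshold are illustrative assumptions, not any vendor's actual implementation.

```python
# Hypothetical sketch of a redundancy check between a camera-read speed
# limit and a map-derived limit. Names and thresholds are illustrative
# assumptions, not any vendor's actual API.

MAX_PLAUSIBLE_DELTA_MPH = 15  # tolerated camera-vs-map disagreement

def fused_speed_limit(camera_mph: int, map_mph: int | None) -> int | None:
    """Return a speed limit only when the two sources roughly agree."""
    if map_mph is None:
        # No map coverage here: fall back to the camera reading alone.
        return camera_mph
    if abs(camera_mph - map_mph) <= MAX_PLAUSIBLE_DELTA_MPH:
        return camera_mph
    # A tampered sign (35 read as 85) fails this check: reject the
    # camera value instead of accelerating.
    return None  # caller should hold current speed and alert the driver

# The tape-attack scenario: camera sees 85, map says 35 -> rejected.
assert fused_speed_limit(85, 35) is None
assert fused_speed_limit(35, 35) == 35
```

Under this kind of cross-check, the tape attack would trigger a flagged disagreement rather than an automatic acceleration.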
The McAfee research follows similar academic work in what’s known as adversarial machine learning, a relatively new field that studies how computer-based learning systems can be manipulated. Researchers in 2017 found, for example, that placing four black-and-white stickers in specific locations on a stop sign could “trick” a computer vision system into seeing a 45 mile-per-hour speed limit sign.
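The sticker and tape attacks are physical analogues of digital adversarial examples: small, targeted input perturbations that flip a classifier’s output. Below is a minimal sketch of the textbook “fast gradient sign method” digital attack in PyTorch, assuming a generic pretrained image classifier; it illustrates the underlying principle, not McAfee’s physical technique.

```python
# Minimal fast gradient sign method (FGSM) sketch in PyTorch.
# This is the textbook *digital* adversarial attack; the model and
# epsilon are generic assumptions, not McAfee's setup.
import torch
import torch.nn.functional as F

def fgsm_attack(model, image, true_label, epsilon=0.03):
    """Nudge each pixel of `image` slightly so `model` misclassifies it.

    `image` is a batched tensor with values in [0, 1]; `true_label` is a
    tensor of class indices.
    """
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), true_label)
    loss.backward()
    # Step every pixel by +/- epsilon in the direction that increases
    # the loss; the result usually looks unchanged to a human.
    perturbed = image + epsilon * image.grad.sign()
    return perturbed.clamp(0, 1).detach()
```

The physical attacks work on the same principle, except the perturbation must survive printing, lighting, and viewing angle, which is why they use high-contrast stickers or tape rather than per-pixel noise.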
The issue isn’t specific to Tesla or Mobileye but is a broader weakness inherent in the advanced systems powering self-driving cars, said Missy Cummings, a Duke University robotics professor and autonomous vehicle expert. Researchers have shown that they can cause potentially serious malfunctions by changing the physical environment, without accessing the system itself.
“And that’s why it’s so dangerous, because you don’t have to access the system to hack it,” Cummings said.