Back in April, local and international news agencies covered a particularly controversial test from Consumer Reports. The magazine concluded that Tesla Autopilot was particularly problematic since it could be fooled into operating without a human driver. Granted, Consumer Reports used several hacks and a defeat device to trick the system, but that was the magazine’s point: if a driver wants to, Tesla Autopilot can be fooled into operating dangerously.
A new, more extensive test from Car and Driver has now proven that it is not only Tesla Autopilot that can be tricked into operating dangerously. Unlike Consumer Reports, which published its conclusions without benchmarking Tesla Autopilot with comparable systems like GM’s Super Cruise, Car and Driver actually gathered 17 vehicles from a variety of brands to test how well their driver-assist systems could detect an inattentive or absent driver.
As it turned out, every single system tested by the Car and Driver team fell prey to driver hacks. The motoring publication found that a good number of the vehicles it tested didn’t even stop when the driver unbuckled the seatbelt, and an even greater number of cars were fooled by a defeat device on the steering wheel. Some vehicles from automakers like BMW and Mercedes-Benz performed well against steering wheel defeat devices, though, as they rely on touch response rather than pressure from the driver.
In Consumer Reports’ scathing criticism of Autopilot, the magazine noted that other systems such as GM’s Super Cruise are safer since they rely on eye-tracking technology to detect whether the driver is paying attention to the road. To test Super Cruise, Car and Driver had to take the vehicle onto an actual highway since the system only works on pre-mapped roads. But even Super Cruise proved vulnerable: its eye-tracking system was fooled by a simple pair of gag glasses with eyes painted on them.
The same was true when Car and Driver evaluated whether driver-assist systems would operate without a human in the driver’s seat. Every single system from the 17 vehicles tested by the motoring publication fell prey to simple hacks like a hockey stick on the accelerator. These included Super Cruise, which continued operating when gag glasses were placed in the driver’s seat.
Car and Driver was careful to highlight that its tests should not be attempted by anyone at any time. Tricking driver-assist systems, after all, is arguably among the most foolish acts a driver can perform on the road today. But if there is anything that was proven by these tests, it is the fact that driver-assist vulnerabilities are not exclusive to Tesla Autopilot. With this in mind, automakers, from Tesla to GM, must continue to improve their systems to make them far better than they are today, hopefully to a point where hacks and defeat devices become ineffective.
Watch Car and Driver’s test in the video below.
Don’t hesitate to contact us with news tips. Just send a message to email@example.com to give us a heads up.