Tesla and other autopilot-driven cars tricked with 2D projections

Driving assistance systems and autopilots will need to be able to recognize 2D depthless objects going forward, similar to how facial recognition systems can distinguish a real face from a photo.
Written by Catalin Cimpanu, Contributor

A team of academics says you can trick modern smart cars driving in autopilot mode into taking unwanted actions -- such as braking or switching direction -- just by projecting 2D images on the road or on roadside surfaces, or by embedding certain "triggers" inside video billboards.

This new research, published last week, studied how advanced driving assistance systems (ADASs) and autopilot technologies react to 2D objects (also known as depthless objects, or phantoms) projected in the path of a moving car, within view of its sensors.

For their research, entitled "Phantom of the ADAS: Phantom Attacks on Driving Assistance Systems" [PDF], researchers studied only two technologies -- the Mobileye 630 PRO (used in Honda, Mazda, and Renault cars) and the Tesla Model X HW 2.5 autopilot system.

The researchers found several scenarios in which these systems would recognize 2D projections as traffic-related items, rather than what they were -- projections.

For example, the cars would recognize a 2D image of a human projected onto the road as a real person, and the autopilot would engage the brakes to slow down or stop the vehicle.

Image: Nassi et al.

The same trick would also work with the projection of a car.

Image: Nassi et al.

In another case, the research team projected fake lane markings onto the road, tricking the car into thinking it needed to turn and change lanes, a maneuver that could push the vehicle into an adjacent lane and put passengers' lives at risk.

Image: Nassi et al.

The trick also worked beyond the road surface. Projecting images of road signs onto roadside surfaces such as walls or trees worked just as well, and the tested systems couldn't tell the projections apart from real road signs.

Image: Nassi et al.

Furthermore, the 2D objects don't always need to be projected. The research team says that embedding a flat 2D object inside another medium, such as a video ad playing on a digital billboard, works just as well. In their experiment, they embedded a road sign inside a billboard ad, fooling the car into thinking it was in a higher speed zone.

Image: Nassi et al.

The researchers say the objects can be projected in a variety of ways, using cheap $300 projectors available on Amazon. These projectors can be handheld or mounted on drones.

In addition, the research team says the rogue 2D projections don't need to remain visible for long periods of time. A few hundred milliseconds is enough, they said.

Such short-burst projections can easily escape a human driver's notice, but they'd still be picked up by the sensors and video cameras used by ADAS and autopilot systems.
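
To see why, consider a rough back-of-the-envelope calculation: even a burst too brief for a driver to consciously register spans several complete camera frames, each of which the perception stack processes as a normal image. The Python sketch below is purely illustrative; the frame rates and burst durations are generic assumptions, not specifications of the tested Mobileye or Tesla hardware.

```python
# Back-of-the-envelope sketch: how many complete frames a camera records
# while a phantom projection is on screen. The frame rates and burst
# durations below are generic assumptions for illustration, not specs of
# the Mobileye or Tesla hardware tested in the paper.

def frames_captured(burst_ms: float, camera_fps: float) -> int:
    """Number of complete frames captured during a projection burst."""
    frame_interval_ms = 1000.0 / camera_fps
    return int(burst_ms // frame_interval_ms)

for fps in (30, 60):
    for burst_ms in (125, 300, 500):
        print(f"{burst_ms:>3} ms burst at {fps} fps -> "
              f"{frames_captured(burst_ms, fps)} full frames")
```

At 60 fps, even a 125 ms flash yields seven full frames for the object detector to latch onto.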

This opens the door to real-world scenarios where human drivers wouldn't even spot the projections, but the car would suddenly brake or steer towards oncoming traffic.

This is an important observation because most car makers advise drivers to use autopilot features only under direct supervision. Car vendors say the systems are meant to assist drivers, but hands should always be kept on the wheel and eyes on the road. Since projections can be flashed in front of a car for just a fraction of a second, drivers who don't abide by these rules won't be able to react in time.

The research team said it notified both Mobileye and Tesla of its findings, but the car makers have yet to make improvements to their systems. One reason is that it is very hard for software to tell the difference between a road sign projected onto a tree's leaves and a legitimate road sign strapped to a tree or wall, places where real road signs are sometimes mounted. The same goes for distinguishing projections of road markings from legitimate lane lines.

However, autopilot systems should be able to distinguish projections of humans and cars from the real thing. Here, the way forward is for autopilot and other ADAS systems to gain support for recognizing 2D depthless objects, similar to how some facial recognition systems can detect a photo of a human face and block the authentication attempt.
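
One plausible cue for such a defense is depth. A phantom lies flat on whatever surface carries it (road, wall, billboard), while a real pedestrian or vehicle occupies volume. The Python sketch below illustrates the idea only; it assumes the perception stack supplies a per-pixel depth map (from stereo cameras, radar, or lidar) and 2D detection boxes, and the 5 cm flatness threshold is a hypothetical value. Neither Mobileye nor Tesla has said whether their systems perform any such check.

```python
import numpy as np

# Illustrative phantom check: a projection lies on a single flat surface
# (road, wall, billboard), so depth samples inside its bounding box fit a
# plane almost perfectly; a real pedestrian or vehicle does not.
# Assumptions: a per-pixel depth map in meters and a 2D detection box are
# available; the 5 cm threshold is a hypothetical value for illustration.

FLATNESS_THRESHOLD_M = 0.05

def plane_fit_residual(depth_map: np.ndarray,
                       box: tuple[int, int, int, int]) -> float:
    """Std-dev of residuals after fitting a plane z = ax + by + c
    to the depth samples inside the detection box."""
    x1, y1, x2, y2 = box
    region = depth_map[y1:y2, x1:x2]
    ys, xs = np.mgrid[y1:y2, x1:x2]
    mask = np.isfinite(region)
    if mask.sum() < 3:
        return float("inf")  # not enough depth data to judge
    A = np.column_stack([xs[mask], ys[mask], np.ones(mask.sum())])
    coef, *_ = np.linalg.lstsq(A, region[mask], rcond=None)
    residuals = region[mask] - A @ coef
    return float(np.std(residuals))

def looks_like_phantom(depth_map: np.ndarray,
                       box: tuple[int, int, int, int]) -> bool:
    """Flag a detection whose depth profile is suspiciously flat."""
    return plane_fit_residual(depth_map, box) < FLATNESS_THRESHOLD_M
```

A production system would presumably fuse additional cues, such as scene context and surface texture, rather than relying on flatness alone.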
