
World News – US – Hacked billboards may cause Teslas to see "ghost objects," causing them to swerve or stop abruptly

Tesla's Autopilot system relies on vision rather than LIDAR, which means it can be fooled by messages on digital billboards and by projections created by hackers

Security researchers have demonstrated how Tesla's Autopilot driver-assistance system can be tricked into braking, swerving, or stopping abruptly simply by flashing fake traffic signs or projecting virtual objects in front of the car

Their hacks worked both on a Tesla running HW3, the latest version of the company's Autopilot driver-assistance hardware, and on the previous generation, HW2.5

The most disturbing finding is that a fake road sign needs to appear for less than half a second to trigger a response from Tesla's system

In an example cited by the researchers, a "Stop" sign hidden in a fast-food restaurant ad successfully brought a Tesla running on Autopilot to a halt, despite the sign flashing on screen for only a fraction of a second
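To see why such a brief flash is enough, consider a detector that evaluates each camera frame independently: a sign visible in even a dozen frames is treated the same as a permanent one. The Python sketch below is purely illustrative, not Tesla's actual pipeline; the frame rate, the half-second threshold, and the PersistenceFilter class are all assumptions. It contrasts per-frame logic with a simple temporal-persistence check that would ignore a split-second phantom.

    # Illustrative sketch: why a sign flashed for ~0.4 s fools a purely
    # per-frame detector but not one that requires temporal persistence.
    FPS = 30                          # assumed camera frame rate
    MIN_VISIBLE_FRAMES = FPS // 2     # require ~0.5 s of continuous visibility

    class PersistenceFilter:
        """Confirm a sign only after it is seen in N consecutive frames."""
        def __init__(self, min_frames: int):
            self.min_frames = min_frames
            self.consecutive = 0

        def update(self, detected_this_frame: bool) -> bool:
            self.consecutive = self.consecutive + 1 if detected_this_frame else 0
            return self.consecutive >= self.min_frames

    # Simulate a stop sign flashed into a billboard ad for ~0.4 s (12 frames).
    frames = [False] * 30 + [True] * 12 + [False] * 30

    naive_reacts = any(frames)        # per-frame logic reacts to the flash
    filt = PersistenceFilter(MIN_VISIBLE_FRAMES)
    filtered_reacts = any(filt.update(f) for f in frames)

    print("naive per-frame detector reacts:", naive_reacts)          # True
    print("persistence-filtered detector reacts:", filtered_reacts)  # False

A persistence window is only a mitigation, of course: an attacker who controls the billboard could simply display the phantom for longer, at the cost of making it easier for human drivers to notice.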

The system also recognized virtual projections of people and cars as real objects, responding by slowing the car to avoid hitting them, and was tricked by a drone that projected a fake speed-limit sign onto a wall

Researchers at Ben-Gurion University of the Negev said their findings "reflect a fundamental flaw in models that detect objects [but] were not trained to distinguish between real and fake objects"
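Put differently, a standard object detector answers only "what is this, and how confident am I?", never "is it physically real?". The sketch below illustrates the kind of second-stage realness gate such models lack; the depth-variance and context cues are made up for illustration and are not the researchers' or Tesla's actual countermeasure.

    # Illustrative sketch of the gap the researchers describe: the
    # detector reports class and confidence, but nothing checks whether
    # the detected object is physically real. All cues are assumptions.
    from dataclasses import dataclass

    @dataclass
    class Detection:
        label: str
        confidence: float        # detector's class confidence
        depth_variance: float    # assumed cue: flat projections vary ~0 in depth
        plausible_context: bool  # assumed cue: sign on a post vs. inside an ad

    def is_real(d: Detection) -> bool:
        """Hypothetical realness gate that a pure object detector lacks."""
        return d.depth_variance > 0.05 and d.plausible_context

    phantom = Detection("stop_sign", 0.97, depth_variance=0.0, plausible_context=False)
    genuine = Detection("stop_sign", 0.95, depth_variance=0.40, plausible_context=True)

    for d in (phantom, genuine):
        print(d.label, "real" if is_real(d) else "phantom")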

It is easy to imagine how a bad actor could exploit this flaw to cause an accident or a traffic jam, for example by hacking into a digital billboard

Such attacks are potentially both dangerous and easy to carry out, as they "can be applied remotely (using a drone equipped with a portable projector, or by hacking digital billboards that face the Internet and are located close to roads), thereby eliminating the need to physically approach the attack scene and changing the exposure-versus-application balance," the researchers wrote

The attacks are also so fleeting that they are difficult for the human eye to detect, and they leave very little evidence behind

Similar hacks also worked on the Mobileye 630 driver-assistance system, since both it and Tesla's system rely on visual recognition using cameras

The researchers confirmed that these attacks would not fool systems that rely on LIDAR, which measures distances and maps the surrounding area using lasers
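The reason is physical: a projector can only paint light onto an existing surface, so every laser return inside a projected "pedestrian" lies on the wall plane rather than meters closer to the car. A minimal sketch follows, with made-up distances and an assumed one-meter tolerance.

    # Illustrative sketch: LIDAR returns inside a projected phantom all
    # sit on the background surface, while a real object stands off it.
    # Distances and the tolerance are assumptions for illustration.
    import statistics

    wall_distance_m = 20.0

    projection_returns = [19.9, 20.0, 20.1, 20.0]   # pixels painted on the wall
    real_pedestrian    = [12.1, 12.0, 11.9, 12.0]   # person standing in the road

    def is_physical_object(returns, background_m, tolerance_m=1.0):
        """Treat a detection as real only if it stands off the background."""
        return background_m - statistics.mean(returns) > tolerance_m

    print(is_physical_object(projection_returns, wall_distance_m))  # False: phantom
    print(is_physical_object(real_pedestrian, wall_distance_m))     # True: real object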

Tesla CEO Elon Musk, however, has long criticized LIDAR, a more expensive system, proclaiming in 2019 that "Lidar is a fool's errand. Anyone relying on lidar is doomed"

Tesla, which insists Autopilot requires "active driver supervision and [does not make] the vehicle autonomous," has been briefed on the researchers' findings




SOURCE: https://www.w24news.com
