The accidents it causes go beyond what the statistics say


Things are looking bad for Tesla's Autopilot. The NHTSA (National Highway Traffic Safety Administration) of the United States published a new analysis this week of Tesla's driver-assistance software, reaching several conclusions that leave the system in a very poor position. So much so that there is already open talk of withdrawing the feature from the market.


More than 830,000 vehicles with Autopilot could be affected. The NHTSA points to an accumulation of crashes in which Tesla vehicles were operating with Autopilot enabled. The investigation focuses on 16 accidents, but in total the analysis will cover the more than 800,000 Teslas sold with Autopilot in the US: every Model S since 2014, Model X since 2015, Model 3 since 2018 and Model Y since 2020.

All of these vehicles could be affected by the study's conclusions: if regulators so determine, Tesla could be forced to deactivate Autopilot in them. Since the software is updated over the air, the cars would not need to be physically recalled.

Autopilot disengaged one second before impact. In the cases analyzed, the NHTSA explains that, on average, Autopilot "was deactivated less than a second before the first impact", even though video footage shows that the driver could have detected the hazard, on average, about eight seconds before the crash.

Most of the drivers had their hands on the wheel, as Autopilot requires, but the problem is that they took no action in time. As the regulator describes, in 25% of the accidents Tesla did not issue any "visual alert or alarm during the final cycle of Autopilot use". In other words, drivers relied on Autopilot until it was too late (just one second before the crash).

"Phantom" braking, complaints and official investigations: what is happening with Tesla's Autopilot

Elon Musk has argued that Autopilot was not active at the moment of impact. One of the sticking points the NHTSA is going to review is the actual role Autopilot plays in these accidents. In the past, Elon Musk has said that Tesla's data showed Autopilot was not active at the moment of the crash, which reduces the number of accidents directly attributed to the system.

However, the NHTSA investigation widens the potential scope of Autopilot's responsibility, noting that while the system was indeed not active at the moment of impact, it disengaged so late that it left the driver virtually no margin to react.

Responsibility lies with the driver. Despite the advances of Tesla's Full Self-Driving (FSD), still in beta, these systems are still classified as driver assistance. That means that responsibility in the event of an accident falls on the driver, not on the driving system itself: the "fault" is not Tesla's, since the driver must remain alert at all times.

The investigation continues. Several of the Autopilot-related accidents have led to lawsuits, in which the respective responsibility of the driver and the manufacturer must be determined. It is a matter analyzed case by case, and one where a lot of money is at stake, including for insurers.

For now, the NHTSA is pressing on with the investigation, auditing how Tesla Autopilot operates and examining why it disengages moments before impact.

In Xataka | Tesla's economic future runs through autonomous driving. For the moment, it is nothing but problems
