NHTSA concludes Tesla Autopilot investigation after linking the system to 14 deaths

The organization has opened a new inquiry into the efficacy of recent software fixes.

The National Highway Traffic Safety Administration (NHTSA) has concluded an investigation into Tesla’s Autopilot driver assistance system after reviewing hundreds of crashes, including 13 fatal incidents that led to 14 deaths. The organization has ruled that these accidents were due to driver misuse of the system.

However, the NHTSA also found that “Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities.” In other words, the software didn’t prioritize driver attentiveness. Drivers using Autopilot or the company’s Full Self-Driving technology “were not sufficiently engaged,” because Tesla “did not adequately ensure that drivers maintained their attention on the driving task.”

The organization investigated nearly 1,000 crashes from January of 2018 until August of 2023, accounting for 29 total deaths. The NHTSA found that there was “insufficient data to make an assessment” for around half (489) of these crashes. In some incidents, the other party was at fault or the Tesla drivers weren’t using the Autopilot system.

The most serious category comprised 211 crashes in which “the frontal plane of the Tesla struck a vehicle or obstacle in its path,” often with Autopilot or FSD engaged. These incidents led to 14 deaths and 49 serious injuries. In 78 of them, the agency found that drivers had at least five seconds to react but failed to brake or steer to avoid the hazard.

That’s where complaints against the software come into play. The NHTSA says that drivers would simply become too complacent, assuming that the system would handle any hazards. When it came time to react, it was too late. “Crashes with no or late evasive action attempted by the driver were found across all Tesla hardware versions and crash circumstances,” the organization wrote. The imbalance between driver expectation and the operating capabilities of Autopilot resulted in a “critical safety gap” that led to “foreseeable misuse and avoidable crashes.”

The NHTSA also took issue with the branding of Autopilot, calling it misleading and suggesting that it leads drivers to assume the software has total control. By contrast, rival companies tend to use branding with words like “driver assist.” Autopilot suggests, well, an autonomous pilot. California’s attorney general and the state’s Department of Motor Vehicles are also investigating Tesla for misleading branding and marketing.

Tesla, for its part, says that it warns customers that they need to pay attention while using Autopilot and FSD, according to The Verge. The company says the software features regular indicators that remind drivers to keep their hands on the wheel and eyes on the road. The NHTSA and other safety groups have countered that these warnings do not go far enough and were “insufficient to prevent misuse.” Despite those criticisms, CEO Elon Musk recently promised that the company will continue to go “balls to the wall for autonomy.”

The findings may represent only a small fraction of the actual number of crashes and accidents related to Autopilot and FSD. The NHTSA noted that “gaps in Tesla’s telematic data create uncertainty regarding the actual rate at which vehicles operating with Autopilot engaged are involved in crashes.” In other words, Tesla only receives data from certain types of crashes, with the NHTSA claiming the company collects data on around 18 percent of crashes reported to police.

With all of this in mind, the agency has opened another probe into Tesla. This one examines the over-the-air (OTA) software fix issued in December, after two million vehicles were recalled. The NHTSA will evaluate whether the Autopilot recall fix that Tesla implemented is effective enough.