US Opens Probe Into Tesla's 'Full Self-Driving' System After Fatality

Above: The dashboard screen and steering yoke of a Tesla Cybertruck at the Paris Motor Show in Paris, France, on Monday, Oct. 14, 2024. Image: Nathan Laine/Bloomberg via Getty Images

The Facts

  • The US National Highway Traffic Safety Administration (NHTSA) launched a preliminary evaluation on Thursday into whether Tesla's Full Self-Driving (FSD) system fails to "detect and respond appropriately to reduced roadway visibility conditions."

  • The probe covers more than an estimated 2.4M Tesla vehicles equipped with the optional FSD feature, including the 2016-2024 Model S and Model X, 2017-2024 Model 3, 2020-2023 Model Y, and 2023-2024 Cybertruck.


The Spin

Narrative A

Recent crashes involving Tesla's FSD system sent a warning signal to auto regulators about the system's serious safety flaws. Regulators are finally taking action against a technology that should never have been allowed on the road in the first place, especially as data shows the system is far more dangerous behind the wheel than human drivers.

Narrative B

It's hard to fault a driver-assistance system clearly labeled as "supervised" for accidents when those crashes are mostly, if not entirely, attributable to human error or poor road conditions. If anything, the technology's issues apparently stem from it performing so well that it can lull drivers into complacency.

