The recall comes after a safety regulator stated that modern driver-assistance systems are vulnerable to ‘foreseeable misuse.’
Tesla is recalling nearly 2 million vehicles equipped with its Autopilot advanced driver-assistance system in the United States to install new protections after a safety regulator claimed the technology was vulnerable to “foreseeable misuse.”
For more than two years, the National Highway Traffic Safety Administration (NHTSA) has been investigating the automaker, led by billionaire Elon Musk, to determine whether its vehicles adequately ensure that drivers pay attention when using the driver-assistance system.
According to Tesla’s recall filing, the software controls for Autopilot “may not be sufficient to prevent driver misuse” and may therefore raise the risk of a crash.
In August, Ann Carlson, the acting NHTSA administrator, told Reuters that it’s “really important that driver monitoring systems take into account that humans over-trust technology.”
Tesla’s Autopilot is designed to let cars automatically steer, accelerate, and brake within their lane, while Enhanced Autopilot can assist with changing lanes on highways; neither makes the vehicles autonomous.
Autosteer, a feature of Autopilot, is intended to keep vehicles in their driving lane while maintaining a set speed or following distance.
Tesla stated that it disagreed with the NHTSA’s findings, but that it would deploy an over-the-air software update that would “incorporate additional controls and alerts to those already existing on affected vehicles to further encourage the driver to adhere to their continuous driving responsibility whenever Autosteer is engaged.”
The company did not comment on whether the recall would extend outside the United States.
After identifying more than a dozen crashes in which Tesla vehicles struck stationary emergency vehicles, the NHTSA opened an investigation into Autopilot in August 2021 and upgraded the probe in June 2022.
According to the NHTSA, Tesla initiated the recall as a result of its inquiry after the agency discovered that “Tesla’s unique design of its Autopilot system can provide inadequate driver engagement and usage controls that can lead to foreseeable misuse of the system.”
Separately, the NHTSA has opened more than three dozen special crash investigations involving Tesla vehicles since 2016, documenting 23 crash deaths to date in cases where driver-assistance systems such as Autopilot were suspected of being engaged.
According to the NHTSA, the risk of a crash increases when the system is engaged but the driver does not maintain responsibility for vehicle operation, is unprepared to intervene, or fails to recognize when the system has been canceled.
The NHTSA’s investigation of Autopilot will remain open as the agency assesses the effectiveness of Tesla’s fixes. Since mid-October, Tesla and the NHTSA have met several times to discuss the agency’s preliminary conclusions on potential driver misuse and Tesla’s proposed software remedies.
According to the agency, the company will push the update to 2.03 million Model S, X, 3, and Y vehicles in the United States, dating back to the 2012 model year.
The update, which will vary depending on vehicle hardware, will increase the prominence of visual alerts on the user interface, simplify Autosteer engagement and disengagement, and add checks upon engaging Autosteer, as well as “ultimate suspension from Autosteer use if the driver repeatedly fails to demonstrate continuous and sustained driving responsibility while the feature is engaged,” Tesla said.
The company did not detail how the warnings and safeguards will be altered.
In premarket trading, shares of the world’s most valuable automaker were down 1%.
Tesla revealed in October that the US Justice Department had issued subpoenas for information about its Full Self-Driving (FSD) and Autopilot technologies. In October 2022, Reuters reported that Tesla was under criminal investigation over claims that its electric vehicles could drive themselves.
Tesla recalled 362,000 US vehicles in February to update its FSD Beta software after the NHTSA said the vehicles did not comply with traffic safety standards and could cause crashes.
In 2017, the NHTSA closed a previous investigation into Autopilot without taking any action. The National Transportation Safety Board (NTSB) has criticized Tesla for failing to include system safeguards for Autopilot, and the NHTSA for failing to ensure the feature’s safety.