Tesla recalls 2 million cars with ‘insufficient’ Autopilot safety controls



Federal safety regulators announced the largest recall in Tesla’s 20-year history on Wednesday, as the carmaker began distributing fixes to more than 2 million vehicles equipped with Autopilot systems found to have “insufficient” safeguards against driver misuse.

The voluntary recall amounts to a sweeping rebuke of the nation’s premier electric vehicle maker by the National Highway Traffic Safety Administration, the agency charged with regulating vehicles on America’s roads. In a statement, the agency said Tesla had done too little to ensure that drivers pay attention to the road while its Autopilot driver-assistance system is activated.

“There may be an increased risk of a crash” when the system is engaged, the agency wrote, “and the driver does not maintain responsibility for vehicle operation and is unprepared to intervene as necessary.”

The recall comes three days after The Washington Post published an investigation that identified at least eight fatal or serious crashes involving Tesla drivers using Autopilot on roads where the software was not intended to be used. In user manuals, legal documents and communications with federal regulators, Tesla has acknowledged that Autosteer, Autopilot’s key feature, is “intended for use on controlled-access highways” with “a center divider, clear lane markings, and no cross traffic.”

The recall report posted by NHTSA confirms that “Autosteer is designed and intended for use on controlled-access highways” except when Tesla vehicles are operating in a more advanced version of driver-assistance known as Full Self-Driving. To “encourage the driver to adhere to their continuous driving responsibility whenever Autosteer is engaged,” NHTSA said, Tesla would implement “additional checks” on drivers “using the feature outside controlled access highways,” among other remedies.

At a congressional hearing Wednesday on an unrelated matter, NHTSA’s acting administrator Ann Carlson said the agency had found that many crashes involving Autopilot have occurred when a driver failed to recognize and react to a sudden obstacle.

“One of the things we determined is that drivers were not always paying attention when that system was on,” Carlson said.

NHTSA said Tesla will send out a software update to fix problems affecting virtually every Tesla vehicle equipped with Autopilot, including its 2012-2023 Model S, 2016-2023 Model X, 2017-2023 Model 3 and 2020-2023 Model Y vehicles. Autopilot is now a standard feature on Teslas; only some early models are not equipped with the software.

“Automated technology holds great promise for improving safety but only when it is deployed responsibly,” NHTSA said in a statement. “Today’s action is an example of improving automated systems by prioritizing safety.”

Tesla did not immediately respond to requests for comment about Wednesday’s recall. However, in a statement this week responding to The Post’s report on Autopilot crashes, Tesla said it has a “moral obligation” to continue improving its safety systems, while arguing that it is “morally indefensible” to not make these features available to a wider set of consumers.

The company has long argued that vehicles with Autopilot engaged perform more safely than those guided by unassisted human drivers, citing a lower frequency of crashes when the software is enabled.

“Regulators around the globe have a duty to protect consumers, and the Tesla team looks forward to continuing our work with them towards our common goal of eliminating as many deaths and injuries as possible on our roadways,” the company said on X, the platform formerly known as Twitter.

Tesla’s policy chief Rohan Patel on Wednesday hailed the work of both Tesla and federal regulators in his own post on X. “The regulatory system is working about as well as it can given the lack of clear regulations in this field,” Patel said, adding that those who had “demonized” the company and NHTSA were “on the wrong side of history.”

Former NHTSA administrator Steven Cliff, who oversaw the launch of the Autopilot probe more than two years ago, said the recall was historic. “To get to that point of getting the company to voluntarily recall 2 million vehicles … is no joke,” Cliff said. “This is a monumental achievement.”

Cliff credited the voluntary recall to the agency’s collection of Autopilot crash data, an effort he spearheaded before leaving the agency in 2022. The agency’s vast store of crash data left Tesla little choice but to act, he said, lest it risk a mandatory recall that would be conducted on the regulators’ terms rather than Tesla’s.

In a statement, U.S. Sens. Richard Blumenthal (D-Conn.) and Edward J. Markey (D-Mass.) called the recall “egregiously overdue.” “We urge NHTSA to continue its investigations to spur necessary recalls,” they wrote, “and Tesla to stop misleading drivers and putting the public in great danger.”

NHTSA began investigating Tesla’s Autopilot software more than two years ago in a probe sparked by more than a dozen crashes in which Teslas in Autopilot ran into parked emergency vehicles. In 2021, the agency began requiring all automakers that offer driver-assistance software to report crashes involving the technology to the agency.

In all, NHTSA said it had reviewed 956 crashes allegedly involving Autopilot before zeroing in on 322 software-related crashes that involved “frontal impacts and impacts from potential inadvertent disengagement of the system.”

According to a timeline released by NHTSA, Tesla cooperated with the agency’s inquiries starting in August 2021. That led to a series of meetings beginning in early October 2023. In those meetings, Tesla “did not concur” with the agency’s safety analysis but proposed several “over-the-air” software updates to make sure drivers who engage Autopilot keep their eyes on the road.

The remote updates mean the vehicles do not have to be returned to service centers to receive the software fixes necessary to meet NHTSA requirements.

Late Tuesday, Tesla began rolling out those updates, mainly new “controls and alerts” to encourage drivers to maintain control of their vehicles, including “keeping their hands on the steering wheel and paying attention to the roadway,” the recall report said. The update also will include new precautions when Autosteer is engaged outside controlled-access highways, the recall report said, as well as a feature that can suspend a driver’s Autosteer privileges if the person repeatedly fails to stay engaged at the wheel.

“In certain circumstances when Autosteer is engaged, the prominence and scope of the feature’s controls may not be sufficient to prevent driver misuse,” the recall report said.

NHTSA said it would keep its investigation open “to support an evaluation of the effectiveness of the remedies deployed by Tesla.”

In the past, Tesla has remedied multiple software flaws with remote updates at NHTSA’s behest, including a 2021 fix issued to Full Self-Driving software after cars started sharply activating their brakes at highway speeds.

Tesla chief executive Elon Musk, who has decried NHTSA as the “fun police,” has taken issue with regulators’ use of the word “recall” for software updates. Using the word “recall” for an over-the-air software update “is anachronistic and just flat wrong!” Musk posted on X.

The remedies do not require Tesla to limit where drivers can activate Autopilot, a long-standing recommendation by the National Transportation Safety Board. NHTSA has said that it had looked into the prospect of verifying that vehicles using driver-assistance software operate only on roads where they can reliably function, known as their operational design domain. But the agency determined that doing so would be complex and resource-intensive, and might not solve the problem of drivers relying too heavily on the software to control their cars.

When Autopilot is activated, the driver is still considered the “operator” of the vehicle. That means the person is responsible for the vehicle’s movement, hands on the steering wheel, ready to brake. In a related safety recall report, NHTSA said the risk of collision can increase if the driver fails to “maintain continuous and sustained responsibility for the vehicle” or fails to recognize when Autopilot turns off.

Philip Koopman, a professor at Carnegie Mellon University who has conducted research on autonomous-vehicle safety for 25 years, said the recall failed to address a basic flaw of Tesla’s driver-assistance model.

“The good message NHTSA is sending is ‘We are going to be serious about requiring effective driver monitoring. And we intend to be serious about making sure these features are only engaged when they should be,’” he said. “The elephant in the room is that the software in Tesla is still beta software and they are still using retail customers with no special training and no special skills to test their software.”