Tesla Must Now Report 'Autopilot' Crashes to the Government or Face Fines

Authored by vice.com and submitted by theinternetstapler


The National Highway Traffic Safety Administration (NHTSA) took a major step towards improving road safety on Tuesday by issuing a new rule that will remove the veil of secrecy about whether semi-autonomous driving features actually make driving safer.

Under the new NHTSA rule, companies must report all crashes in which so-called "semi-autonomous" driving assist features like steering assist or automatic lane-keeping are involved, including Tesla's Autopilot and its dubiously named "Full Self-Driving Beta" options. The rule will also apply to companies operating self-driving pilot programs like Waymo, Zoox, and Cruise. This will not only provide the independent government agency with a greater understanding of how safe these systems really are, but also allow it to spot patterns of particularly unsafe ones.

The new rule says that any crash involving a semi-autonomous system and "a hospital-treated injury, a fatality, a vehicle tow-away, an air bag deployment, or a vulnerable road user such as a pedestrian or bicyclist" must be reported to NHTSA within one day of the company learning about the crash, with an updated report submitted 10 days later. Crashes involving any injury or property damage must be reported on a monthly basis and updated monthly as new information comes in. The Washington Post reported that failure to comply with the new rule will carry fines of $22,992 per day, up to a maximum penalty of "more than $100 million," plus referral to the Justice Department.
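
For a rough sense of scale, here is a minimal Python sketch (purely illustrative, not any official NHTSA calculation) of how that per-day penalty would accumulate. The daily figure comes from the article; the exact cap is an assumption, since the Post says only "more than $100 million."

```python
# Illustrative sketch of the civil-penalty schedule described above.
# Assumptions: a flat $22,992 fine per day of noncompliance (per the
# article) and a cap of exactly $100 million (the article says only
# "more than $100 million", so the precise cap is assumed here).

DAILY_FINE = 22_992          # USD per day of noncompliance
MAX_PENALTY = 100_000_000    # assumed cap, USD

def cumulative_fine(days_noncompliant: int) -> int:
    """Accumulated fine in USD, capped at the maximum penalty."""
    return min(days_noncompliant * DAILY_FINE, MAX_PENALTY)

for days in (1, 30, 365):
    print(f"{days:>4} days -> ${cumulative_fine(days):,}")

# At this rate, the assumed cap would be hit after roughly 4,350 days:
print(f"cap reached after ~{-(-MAX_PENALTY // DAILY_FINE):,} days")
```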

NHTSA said in its press release that it will use the collected data to "help the agency identify potential safety issues and impacts resulting from the operation of advanced technologies on public roads and increase transparency."

This seemingly minor change could have a big impact on road safety, because we simply don't know how good any of these systems are, how often they malfunction, or to what extent they're abused by drivers. Up until now, virtually all the data available on computerized driving features—ranging from the various auto-steering functionalities virtually every automaker offers nowadays to Tesla's semi-autonomous Autopilot to Waymo's fully autonomous taxi service—have come from the companies themselves, which have an obvious interest in making their product appear as safe as possible. That has left the general public with only the occasional (or, in Tesla's case, frequent) anecdote about misuse or malfunction. But soon, we should have actual information to determine whether these are mere anecdotes or something more significant.

coolnasir139 on June 30th, 2021 at 14:49 UTC »

They should call it driver assist so dummies don’t just say it’s on autopilot and get into wrecks thinking it will do everything for them

FlyingLap on June 30th, 2021 at 14:24 UTC »

I think more American leaders need to spend some time driving in Western Europe to really grasp how poorly trained we are.

If we did HALF of what a Finland or even a Germany does in car control training, our national accident rates would be substantially reduced.

And that doesn’t even address the stress of a simple commute in the States. I drove halfway across Europe and had fewer “incidents with idiots” than on my drive to the office.

There are no quick fixes, especially with transportation. And Elon is the master of quick and cheap.

Also, I’d like to opt out of being in the beta test, please. I don’t want to share the road with a car that is autonomous at the hands of a driver with zero training beyond how to pass a road sign test and “can you signal and drive the speed limit?”

Fleischgewehr2021 on June 30th, 2021 at 04:51 UTC »

As does every other company that offers, or is planning to offer, a self-driving feature in its automobiles. No need to single out Tesla here.