New NHTSA Investigation Will Look Into Five Years of Unusual Tesla Crashes


NHTSA has opened a new investigation into why Tesla vehicles so often crash into parked first responder vehicles.

For half a decade, Tesla automobiles operating with the company's driver-assist technology and an inattentive driver at the wheel have crashed into stationary objects. The company, which calls its latest driver-assist suite "Full Self Driving," seems incapable of designing an automatic emergency braking system effective enough to prevent such crashes. Now the National Highway Traffic Safety Administration (NHTSA) has decided to look into these unusual crashes.

The National Transportation Safety Board (NTSB) has investigated Tesla crashes and recommended changes to the safety systems in Tesla vehicles, but the NTSB is a toothless agency without regulatory powers. NHTSA, by contrast, does have regulatory powers and can issue a recall or stop-sale if it feels one is warranted.

The crashes NHTSA seems most focused on are those in which a Tesla struck the most visible objects on the roadway: a firetruck or police cruiser covered in reflective tape, surrounded by flares, with strobe lights flashing. Torque News has been reporting on unusual Tesla crashes since 2016. The most recent crash, this past May, was into a semi-tractor rather than a first responder vehicle. In between, Tesla crashes into stationary emergency responder vehicles have happened frequently, in some cases as often as two per month. You can view a selection of the coverage of past crashes at the bottom of this story.

In its coverage of this new federal investigation into Tesla vehicles, the Associated Press quoted Jason Levine, executive director of the nonprofit Center for Auto Safety, who said, “We are glad to see NHTSA finally acknowledge our long-standing call to investigate Tesla for putting technology on the road that will be foreseeably misused in a way that is leading to crashes, injuries, and deaths. If anything, this probe needs to go far beyond crashes involving first responder vehicles because the danger is to all drivers, passengers, and pedestrians when Autopilot is engaged.”

Torque News will update this story as it develops. Other outlets covering this story today include ABC News, USA Today, Reuters, CNN, and the New York Times.

May 2021: Five Years After First Tesla Tragedy Seemingly Preventable Crashes Still Occur

April 2021: Video of Tesla Model Y On Autopilot With No Driver Demonstrates Need For Government Intervention

March 2021: Another Tesla Hits Another Semi From the Side - This Time It Looks Like a Model Y

August 2020: Tesla Operated By Full Self Crashing System Hits Two Parked First Responder Vehicles - Again

July 2020 - Police: Tesla On Autopilot Hits Not One, But TWO Parked First Responder Vehicles

January 2020: Second Crash In One Month Of A Tesla Into A Parked Firetruck Results In Fatality

December 2019: Tesla Model 3 On Autopilot Hits Yet Another Police Vehicle - Why Won't They Stop?

August 2018: Third Tesla Crashes Into Back of Firetruck - That's Four Crashes Into Emergency Vehicles This Year

May 2018: Another Tesla On Autopilot Hits Another Emergency Vehicle - You Can't Make This Stuff Up

January 2018: Tesla Police Blotter News - Tesla Driver Hits Parked Firetruck - Blames Autopilot

Submitted by DeanMcManis (not verified) on August 18, 2021 - 10:05AM


Remember that people using Autopilot are supposed to be paying attention to the road. So those few accidents were ultimately the fault of those drivers. It is true that Tesla's representation of Autopilot and FSD (Full Self Driving) can mislead drivers into thinking they do not need to pay attention to the road and keep their hands on the wheel. But Tesla's overall safety record is excellent, both in achieving far fewer accidents and in fewer injuries and deaths from accidents than other automakers. The incorrect assumption is that automated driving safety systems can be 100% effective, which is why many people highlight the rare driver-assist accidents each year, usually pointing to Tesla. Autopilot is not 100% perfect, but overall it is safer than unassisted humans, and it is being steadily improved every day.

Yes, I got the inattentive-driver issue out of the way in the first sentence. I've attended seminars hosted by experts on driver-assist tech. Some say their research shows humans cannot be expected to take back control in sudden situations. They (we) are simply incapable. Some automakers, like Toyota, are working on the opposite technology: a system that intervenes and takes control away from the driver in situations where the human cannot react quickly enough, or is not reacting at all. In fact, all automatic emergency braking and lane keep assist systems fit that bill now. Tesla already has multiple driver attention monitoring systems in cars equipped with AP or FSD. Even Tesla knows the inattentive-driver excuse is not going to be enough if a first responder dies. That said, NHTSA hasn't impressed me with its regulation of driver assistance technology in the past. I expect this new inquiry will not be definitive, but who knows?

Submitted by DeanMcManis (not verified) on August 19, 2021 - 1:02PM


I worked specifically on night pedestrian monitoring and inattentive-driver systems for Toyota in the past, and I thought it was strange that Tesla did not install and use internal cabin cameras sooner, but they have now. It is not accurate to judge a series of individual, separate accidents over the course of several years when Tesla was using a system that is constantly being updated and improved. The auto-assist software and hardware today is fundamentally different than it was three, five, and ten years ago. Many of these emergency vehicle accidents came from the unusual circumstances of the extreme fire conditions in California, then and now, which put many emergency vehicles in harm's way on public roads. I suspect that the rate of Tesla crashes into these vehicles is no higher than average, but that news doesn't get clicks. This is not to say that Tesla deserves any special consideration in trying to develop a fully safe auto-driving system, and I agree that Elon needs to temper his marketing enthusiasm to better reflect the true state (and limitations) of Autopilot and FSD. But I am glad that Tesla is leading this challenge to develop self-driving vehicles, and hopefully it will weather these inquiries well so that it can move forward with its safety product development.

Tell us how the California wildfires may have played a role. That's an angle I had never considered. There is a link in the story (the first one) to the states and dates on which the accidents occurred. Only three of the crashes happened in California, and one of those was on the South Coast.