By John Goreham

NTSB Report Eerily Predicts Tesla Model X Autopilot Fatality 6 Months In Advance

Six months before the latest accident that took an occupant’s life, a report by the National Transportation Safety Board clearly listed the reasons why Autopilot is unsafe.
In September 2017, the National Transportation Safety Board (NTSB) released the findings of its investigation into a 2016 crash in which a Tesla Model S operating in Autopilot mode drove straight into the side of a tractor-trailer, killing the Tesla's occupant. The fact that Autopilot was operating the vehicle above the speed limit was not judged to be a major cause of the crash. However, some of the causes the NTSB did list match very closely the reasons Tesla itself has given for the most recent Autopilot fatality, in which a Model X drove itself straight into a lane divider, killing its occupant.

In the 2017 NTSB report, titled Driver Errors, Overreliance on Automation, Lack of Safeguards, Led to Fatal Tesla Crash, the agency lists in bullet-point form the causes of the first fatal Autopilot crash. Two of those causes are:
• The Tesla driver’s pattern of use of the Autopilot system indicated an over-reliance on the automation and a lack of understanding of the system limitations.
• The way in which the Tesla “Autopilot” system monitored and responded to the driver’s interaction with the steering wheel was not an effective method of ensuring driver engagement.

Just days after the most recent Autopilot fatality, Tesla released information on the crash. It sounds almost as if Tesla cribbed its notes from the NTSB's report of six months earlier. Tesla said the following:
“The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision.” Is what Tesla is saying here not the same thing as the NTSB’s finding that “the way in which the Tesla ‘Autopilot’ system monitored and responded to the driver’s interaction with the steering wheel was not an effective method of ensuring driver engagement”?

Tesla went on to say, “The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.” Is this not very similar to the problem the NTSB had previously noted when it said, “The Tesla driver’s pattern of use of the Autopilot system indicated an over-reliance on the automation”?

It seems pretty clear that some Tesla owners are not on board with the idea that Autopilot is merely a driving aid. They treat the system as if it were an automatic pilot of some kind. One can’t imagine where they got that idea.

Related Story: Tesla Police Blotter News - Tesla Driver Hits Parked Firetruck - Blames Autopilot
