Tesla Blames Latest Autopilot Model X Death On Driver and Road Condition


Tesla is again blaming the vehicle owner and other factors in the latest example of Autopilot directing one of its vehicles into an object, killing the occupant.

As in every Tesla tragedy or SNAFU, it's always the other guy's fault. In the latest example of a Tesla vehicle killing one of its customers while operating in Autopilot mode, Tesla says the driver of a Model X luxury minivan and the roadway are the real problems, not the fact that its vehicles drive straight into stationary objects without slowing.

In the most recent crash, a Model X driven by an Apple employee, father, and Tesla early adopter drove straight into a stationary barrier on the highway while operating on Autopilot. The crash shares many elements with the 2016 crash that killed former U.S. special forces operator Joshua Brown. In that crash, a speeding Tesla Model S luxury sedan drove straight into the side of a slow-moving semi trailer that was turning ahead of it at a legal intersection. Mr. Brown was killed instantly. The NTSB later determined that Tesla bore a significant part of the blame.

Tesla is blaming the driver in the most recent fatal Autopilot crash for not having his hands on the wheel often enough prior to the accident. Autopilot can operate a Tesla without ensuring that the driver is attentive, and it even lets the car travel at high speeds when the driver is not touching the steering wheel. Tesla's imaginary safety backup in that situation is that the driver will grab back control if something dangerous is about to happen. Let's face it: if a former U.S. special forces soldier couldn't do that, neither can you or I.

The roadway in California where the driver was killed had a barrier separating the lanes. A collapsible impact safety barrier had been installed at that location, but another car had recently crashed there and compressed it.

The publication Bloomberg-Quint reports that the NTSB is unhappy with Tesla's release of information about the crash to the public, a release clearly aimed at managing the narrative.

This crash comes the very same month that an Uber autonomous vehicle struck and killed a pedestrian in Arizona.

Please take a minute to participate in our poll: Are self-driving and autonomous vehicles safe?

Submitted by Steve (not verified) on April 2, 2018 - 12:00PM


How is Autopilot worth people dying? And who in their right mind ever thought people weren't going to die while using it? IMO it ought to be pulled entirely; I cannot imagine a scenario in which Autopilot will be as safe as an attentive human driver obeying traffic laws. If we want to make driving safer, perhaps an "auto monitor" feature that would detect drivers who are under the influence or distracted, cut off fuel, and trigger an indicator would be the way to go. However, who would actually request that?

Driver distraction is the cause of most accidents; in this case it was Autopilot driving with a distracted or inattentive driver at the wheel. To say that Autopilot "will not be as safe as a human driver" is to completely ignore the advancements in technology. Driverless cars are the future, and the technology is already here and improving every year. Once a system is in place that replaces driver input (maybe in the next 15 or so years), it will cut down on accidents significantly. The question is whether we are at the stage of letting technology take over our responsibilities as drivers.

Submitted by Dups (not verified) on April 3, 2018 - 1:36PM


AI is a lot of hype. Computers still crash, no matter how advanced they are. How about places like India and China, where people swarm the roads? How about snow and ice conditions, etc.? How are these cars supposed to be trusted? Nature and humans are the best AI; it has always been here, and it will never be replicated by stupid machines. If you want the best AI out there, educate the humans further.

Submitted by Webbie (not verified) on April 6, 2018 - 11:05AM


There's inaccurate info here. First of all, the Tesla Model X is an SUV, not a van. Second, Tesla's Autopilot is clearly labeled beta software and only a driver's aid. Anybody who drives one knows there are places where it doesn't work. The driver is responsible for driving the car at all times. Third, the rate of fatal Tesla crashes is lower than that of any human-driven car. Tesla's fatal crash rate on vehicles equipped with Autopilot is 3.7 times better than the national average.

If you think a jellybean-shaped vehicle that can't carry kayaks on its roof is a sport utility vehicle, you are entitled to your opinion. To us, a vehicle with three rows and side doors that open funny is a minivan. Was the Model X that crashed being used for sports or for utility? Two people have proven that there are places Autopilot doesn't work. They paid with their lives for thinking it worked on a highway in California and on a rural road in Florida. Did the story in any way mention a ratio of human-driver to Autopilot deaths? If it didn't, how can it be inaccurate? In back-to-back studies of driver death rates, IIHS found that the Lexus RX had a zero driver death rate over six model years. Tesla would have been included in the study, but its models' driven miles were not large enough to be considered statistically meaningful. You can cut and paste this link to learn more about the safest vehicles human drivers operate: https://www.torquenews.com/1083/iihs-releases-list-vehicles-zero-death-rates-over-3-period