
Tesla Fights Back Against New Article That Claims Drivers Run Autopilot Where It's Not Intended, Resulting In Crashes

Tesla is doing a bit of PR right now, something many have called for, pushing back against an article claiming that drivers run Autopilot where it's not intended, resulting in crashes. We will go over Tesla's response.

Article Against Tesla Autopilot

Tesla responded on X to a recent Washington Post article claiming that people use Autopilot in areas it isn't designed for, such as roads with cross traffic, causing accidents. Tesla said the article:

"Is particularly egregious in its misstatements and lack of relevant context."

Tesla also said that it has a moral obligation to continue improving its already best-in-class safety systems, of which Autopilot is one.

Tesla also responded sternly, saying:

"It is morally indefensible not to make these systems available to a wider set of consumers, given the incontrovertible data that shows it is saving lives and preventing injury."

Tesla went on to say that regulators around the globe have a duty to protect consumers, and that its team looks forward to working with them to eliminate as many deaths and injuries as possible on the roads.

Tesla uses vehicle data to record incidents, crashes, and anything unusual that happens to its cars, and it makes the following claims:

In the 4th quarter of 2022, Tesla recorded one crash for every 4.85 million miles driven in which drivers were using Autopilot.

For those not using Autopilot, one crash was reported for every 1.40 million miles driven.

Even further, Tesla says the most recent data from NHTSA and FHWA show that in the U.S., there was an auto crash for every 652,000 miles driven.
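
To put those figures side by side with some quick arithmetic of my own: 4.85 million divided by 1.40 million is about 3.5, so by Tesla's numbers, cars on Autopilot crashed roughly 3.5 times less often per mile than Teslas not using it. And 4.85 million divided by 652,000 is about 7.4, or roughly 7 times fewer crashes per mile than the overall U.S. average. Keep in mind these are Tesla's own figures, and the comparison doesn't account for where Autopilot is typically engaged, such as highways.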

Tesla goes on to make several statements, which I'll paraphrase:

  • With more automation comes safer roads
  • More recent data show Autopilot is 10X safer
  • The driver is in control at all times, regardless of what software is chosen to run (Autopilot, Cruise Control, FSD, etc...)
  • Tesla still has a number of safety features beyond a user being able to disengage at any time, such as torque-based and camera-based monitoring
  • The data strongly indicates customers are far safest being able to choose when to engage Autopilot
  • The Washington Post article leverages instances of driver misuse of the Autopilot driver assist feature

The facts of the lawsuit in the article, according to Tesla, word-for-word, are:

1. Contrary to the Post article, the Complaint doesn't reference complacency or Operational Design Domain.

2. Instead, the Complaint acknowledges the harms of driver inattention, misuse, and negligence.

3. Mr. Angulo and the parents of Ms. Benavides, who tragically died in the crash, first sued the Tesla driver—and settled with him—before ever pursuing a claim against Tesla.

4. The Benavides lawsuit alleges the Tesla driver “carelessly and/or recklessly” “drove through the intersection…ignoring the controlling stop sign and traffic signal.”

5. The Tesla driver didn’t blame Tesla, didn’t sue Tesla, didn’t try to get Tesla to pay on his behalf.  He took responsibility.

6. The Post had the driver's statements to police and reports that he said he was “driving on cruise.” They omit that he also admitted to police “I expect to be the driver and be responsible for this.”

7. The driver later testified in the litigation that he knew Autopilot didn’t make the car self-driving and that he was the driver, contrary to the Post and Angulo claims that he was misled, over-reliant, or complacent. He readily and repeatedly admitted:

a. “I was highly aware that was still my responsibility to operate the vehicle safely.”

b. He agreed it was his “responsibility as the driver of the vehicle, even with Autopilot activated, to drive safely and be in control of the vehicle at all times.”

c. “I would say specifically I was aware that the car was my responsibility.  I didn’t read all these statements and passages, but I’m aware the car was my responsibility.”

8. The Post also failed to disclose that Autopilot restricted the vehicle's speed to 45 mph (the speed limit) based on the road type, but the driver was pressing the accelerator to maintain 60 mph when he ran the stop sign and caused the crash. The car displayed an alert to the driver that, because he was overriding Autopilot with the accelerator, "Cruise control will not brake."


My Take on the Article and Tesla's Response

Tesla was very clear in its response and in addressing the details of what the article stated. I have seen many articles from news sources over the years about accidents involving Autopilot, and I haven't yet seen a case where an accident was actually Autopilot's fault.

It doesn't mean a case doesn't exist - I just haven't seen it.

I also agree that misusing Autopilot, such as overriding its speed limit and speed reductions to go faster, is generally not a good idea. However, there are times when Autopilot reduces my speed to one far below the actual speed limit. In those cases, I disengage and drive manually. Tesla needs to fix that issue.

I can use Tesla Autopilot on "side roads" and in areas where there are stop signs and stop lights. Autopilot will not stop for stop signs or stop lights. It will stop for vehicles in front, though I don't think it can handle cross traffic, nor is it intended to. I can see the argument that Autopilot shouldn't be allowed on these types of roads with cross traffic, stop signs, and stop lights. However, as always, the driver is responsible at all times for monitoring and taking over, and Tesla makes that very clear.

If Autopilot is used correctly, it hardly ever causes me an issue. As I said, the one issue I have had, and it happens on the same stretch of freeway near where I live, is that the speed gets reduced too much. I'm in a 70 mph zone and the car slows to 50 mph for some strange reason.

I also once had the car incorrectly think I had a device on the steering wheel, and Autopilot disconnected randomly with a system error. Beyond that, I haven't had any issues with it.

It's good to see Tesla doing more PR. If I learn any more about this response to the Washington Post article, I'll be sure to update.


What do you think - in this situation, who is right? Is it Tesla or the Washington Post, or a combination of the two?

Share this article with friends and family and on social media - or leave a comment below. You can view my most recent articles here for further reading. I am also on X/Twitter where I post more than just articles daily, as well as LinkedIn! Thank you so much for your support!

Hi! I'm Jeremy Noel Johnson, and I am a Tesla investor and supporter and own a 2022 Model 3 RWD EV and I don't have range anxiety :). I enjoy bringing you breaking Tesla news as well as anything about Tesla or other EV companies I can find, like Aptera. Other interests of mine are AI, Tesla Energy and the Tesla Bot! You can follow me on X.COM or LinkedIn to stay in touch and follow my Tesla and EV news coverage.

Image Credit: My Own Tesla Car Running Autopilot, Screenshot

Article references: Tesla response on X | Washington Post article