Autopilot Is Still Very Dangerous: It Tempts People and Costs Lives

Tesla's Autopilot is dangerous because it gives drivers a false sense of comfort and tempts them to surrender control of the car to technology that is not yet perfected, and that costs lives.

After last week's Model X crash, in which Tesla confirmed that Autopilot was engaged and blamed the driver (a disclosure that made the NTSB "unhappy" because Tesla released information about the crash publicly), one Tesla owner tried to recreate the fatal Model X crash and nearly killed himself. This shows that Autopilot's warnings are simply not enough. Tesla drivers engage Autopilot at their own risk.

By sharing this video, Torque News does not recommend trying, testing, or recreating anything like this; we are bringing it to our readers' attention to show how dangerous Tesla's Autopilot may be. See how Tesla AP2.5 takes a wrong turn into a gore point barricade.

"For everyone who doesn't understand the context: he is replicating the exact same scenario that killed the driver to prove that there is a problem with the way tesla doesn't detect collisions. This is how you find and report bugs with programs and algorithms. Hopefully Tesla takes action to save lives," commented one person under the video.

People Put Too Much Trust in Tesla's Autopilot

Even airline pilots don't fully trust their planes' autopilot systems, and those systems are more advanced and the pilots are well trained. Tesla's Autopilot is not as advanced, and its drivers are not trained.

"As a commercial airline pilot, I would like to point out that an autopilot on any aircraft is constantly monitored by pilots. At no point is it 100% trusted. Also frequently during more complex portions of flying such as transitioning to approach it needs to be disconnected and the pilot will take over manually because it does not perform as required," writes one pilot in our story published in January 2018, titled "Pilots and Tesla Owners Explain How Autopilot Works."

Ban The Autopilot for Now

Autopilot tempts people. Drivers think it will do the job, yet it does not. Because it does half the job well, people hand control over to Autopilot, and when it fails it costs lives, as last week's terrible crash shows; the video makes clear how tragic the consequences can be. As a result, people become lazy and put too much trust in Autopilot.

We have that tendency as human beings: our passions overcome us and we are easily tempted. This is why people need to be protected from risky situations as much as possible.

We see videos abounding on YouTube praising Autopilot's performance. In Autopilot's early days, Torque News also wrote positively about it while pointing out its limits. But we now believe that praising Autopilot may tempt people and give them a false sense of security. The technology is not there yet and needs years to develop. In the meantime, we don't want to see anyone die because of their passions, a false sense of security, and surrendering driving control to technology that is not yet perfected.


Klaus Schmitt (not verified)    April 3, 2018 - 4:10AM

By the same reasoning, you could ban all cars because they threaten lives.
Autopilot is just a tool.
A fool with a tool is still a fool.

The tool is not the problem.

Keith (not verified)    April 4, 2018 - 9:08AM

Oh please. This is overly alarmist. I just drove from Toronto to Florida and back using Autopilot extensively. It was amazingly relaxing and easy. The nag is persistent and you have to pay attention. It certainly felt a lot safer than the crazy lane changes, tailgating, and other dangerous maneuvers I saw all around me in 4 days of driving.

There was destined to be hysteria around every accident or death associated with early driving assist, self-driving, or autopilot systems. I hope this does not delay this technology because the sooner it comes the sooner we will save a lot more lives.

Alan (not verified)    April 6, 2018 - 11:44AM

In reply to by Armen Hareyan

It is alarmist. 1. The Model X is an SUV. 2. They say Tesla autopilot "tempts" people to ignore driving, but all Tesla owners have been repeatedly instructed that it is Beta software, that it is only a driving aid like cruise control, and that the driver is responsible for driving the car at all times. 3. They want autopilot banned, despite the fact that it is saving lives -- the fatal crash rate of Tesla vehicles is 3.7 times better than the national average -- people make more mistakes and kill themselves and others 30,000 times a year.