After last week's fatal Model X crash, Tesla confirmed that Autopilot was engaged but blamed the driver, a disclosure that reportedly left the NTSB "unhappy" because Tesla made information about the crash public. Now one Tesla owner has tried to recreate that fatal Model X crash and nearly killed himself. This shows that Autopilot's warnings are simply not enough: Tesla drivers engage Autopilot at their own risk.
By sharing this video, Torque News does not recommend trying, testing or recreating anything like this; we are bringing it to our readers' attention to show how dangerous Tesla's Autopilot may be. Watch how Tesla AP2.5 takes a wrong turn into a gore point barricade.
"For everyone who doesn't understand the context: he is replicating the exact same scenario that killed the driver to prove that there is a problem with the way tesla doesn't detect collisions. This is how you find and report bugs with programs and algorithms. Hopefully Tesla takes action to save lives.," commented one person under the video.
People put too much trust in Tesla's Autopilot.
Even airline pilots don't fully trust their planes' autopilot systems, and those systems are more advanced and the pilots are well trained. Tesla's Autopilot is not yet that advanced, and its drivers are not trained.
"As a commercial airline pilot, I would like to point out that an autopilot on any aircraft is constantly monitored by pilots. At no point is it 100% trusted. Also frequently during more complex portions of flying such as transitioning to approach it needs to be disconnected and the pilot will take over manually because it does not perform as required," writes one pilot in our story published in January 2018, titled "Pilots and Tesla Owners Explain How Autopilot Works."
Ban The Autopilot for Now
Autopilot tempts people. Drivers assume it will do the whole job, yet it does not. Because it does half the job well, people surrender control to Autopilot, and when it fails it costs lives, as last week's terrible crash shows; this video makes clear how tragic the consequences can be. As a result, people grow lazy and put too much trust in the Autopilot.
We have that tendency as human beings. Our passions overcome us and we are easily tempted. This is why people need to be protected from risky situations as much as possible.
Videos abound on YouTube praising Autopilot's performance. In Autopilot's early days, Torque News also wrote positively about it while pointing out its limits. But we now believe that praising Autopilot may tempt people and give them a false sense of security. We are not there yet, and this technology needs years to develop. Meanwhile, we don't want to see anyone die because of their passions, a false sense of security, and surrendering driving control to a technology that is not yet perfected.