Video Shows Why Tesla ‘Autopilot’ Accident in Texas Can’t Be Blamed on FSD Beta


A Tesla Model S crash in Houston, Texas, has been blamed by several media outlets on Tesla's FSD Beta software. However, a video by Brandonee916 shows why FSD Beta could not have been engaged during the crash.

Tesla’s driver-assist software, Autopilot, has once again been thrown into the limelight due to a tragic Tesla Model S accident that took the lives of two individuals.

The accident happened Saturday night when a 2019 Model S with two occupants crashed into a tree in the Carlton Woods Subdivision on Hammock Dunes Place in Houston, Texas.

According to authorities, the Model S crashed into a tree after failing to navigate a turn at high speed near 18 Hammock Dunes Place. The collision resulted in the deaths of both occupants.

Related news: New Videos Further Showcase Model 3’s Track Prowess

What makes the accident stand out is that when first responders arrived at the crash site, they found one occupant in the front passenger seat and the other in the back seat.

Crash experts who examined the wreck say they are certain no one was in the driver’s seat at the time of impact. This has led some in the media to speculate that the crashed Model S was operating on Autopilot.

On top of this, some on social media have even gone so far as to suggest the vehicle was equipped with Tesla’s latest limited-release FSD Beta software. ARK Invest believes FSD will propel Tesla to become the largest company in the world.

Tesla’s FSD Beta software is the next iteration of Tesla’s Autopilot software and is eventually expected to enable Level 5 self-driving. Currently, there are around 2,000 Tesla owners in the early access program. However, that number should increase significantly with a new update coming to the FSD software next month.

The speculation that Tesla’s latest FSD Beta software was engaged stems from the fact that the Model S was driving on a winding road without lane lines. Unlike FSD Beta, Tesla’s current Autopilot software can’t be engaged on roads without lane markings.

Related news: GM's New Battery Plant With LG Energy Will Make Large Pouch-Style Cells

However, a new video by Tesla early access program participant Brandon, from the YouTube channel Brandonee916, shows why this can’t be the case.

In the video, Brandon demonstrates what happens if a driver unbuckles his or her seat belt while Tesla's FSD Beta is engaged.

As you can see from the video, as soon as the seat belt is unbuckled the vehicle gives off loud chimes, a red warning flashes on the center screen, and the vehicle immediately pulls to the side and safely stops. The whole process happens within a few seconds.

This shows FSD Beta couldn’t have been engaged long enough for the driver to climb out of the driver’s seat and into the back seat.

So what do you think? Could Tesla’s FSD Beta or Autopilot have been engaged during the Houston accident? Also, who do you think is at fault if a Tesla on Autopilot crashes? Let me know your thoughts in the comments below.

For more information, check out: Apple ‘Very Near’ To Sign A Deal With Tesla Battery Supplier To Build An EV. Also, see: Panasonic Battery VP Says 4680 Cells Necessary To Build Tesla’s $25,000 Car.

Tinsae Aregay has been following Tesla and the evolution of the EV space on a daily basis for several years. He covers everything about Tesla, from the cars to Elon Musk, the energy business, and autonomy. Follow Tinsae on Twitter at @TinsaeAregay for daily Tesla news.

Submitted by Gary (not verified) on April 19, 2021 - 3:17PM

Why doesn't NHTSA require Teslas to have the fireproof black box that has been required on other cars since 2013?

Submitted by TC (not verified) on April 22, 2021 - 12:28PM

These comments, like the original fake news, are more FUD from anti-Tesla people trying to protect ICE oligarchs.

Submitted by Bruno Wals (not verified) on April 22, 2021 - 8:16PM

It is possible to trick the safeguards put in place by the Tesla Autopilot system by attaching a weight to the steering wheel, buckling the seat belt and sitting on top of it, and then leaving the seat while the car is in motion. This is not Tesla's fault. Someone who goes to all this trouble is clearly determined.

You can just as easily take a normal ICE car and engage the cruise control on a straight stretch of highway, tie a rope around the steering wheel to keep it from swerving, and then move to the back seat.

Would anyone blame the ICE car maker because the driver did something stupid? By that logic, if someone drives drunk, you could argue it's the ICE car maker's fault for not installing a breathalyzer that prevents the engine from starting if alcohol is detected.

No matter how many safeguards are in place, irresponsible people will do irresponsible things.