Tesla Vehicles Just Months From Full Self-Driving Can Still Run Red Lights


Tesla claims that its cars are just months from full self-driving through Autopilot, yet they still run red lights and cause crashes and deaths.

A recent fatal Tesla Model 3 crash in San Francisco, California highlights an apparent contradiction. Tesla says that its vehicles will soon be capable of "full self-driving." Yet they can still run red lights and still allow occupants to set the car to exceed speed limits.

Tesla Full Self-Driving
Tesla recently updated fans and owners on the status of the autonomous driving systems the company has already installed in its 2019 vehicles but has yet to activate. Tesla says the cars will be ready for full self-driving in less than six months. Yet today's cars still won't stop at a red light if the occupant/operator ignores one. If these vehicles are so close to making the leap to a mode Tesla names "full self-driving," it seems crazy that the company has yet to enable its cars to automatically stop at a red light, a driving event that happens as frequently as every fifty yards in many cities.

Tesla Fatality
Police and news reporters in San Francisco, California say that this past Sunday at 2 pm, in broad daylight, the occupant/operator of a rented Tesla (which appears to be a Model 3 in images) was speeding and ran a red light. That set off a crash involving another vehicle and pedestrians. One pedestrian was killed and another critically injured. We have no reports to validate whether the vehicle was operating autonomously or the driver was controlling it. If you think about it, why does that matter?

Automatic Braking
Almost every mainstream vehicle sold in America today has forward-collision prevention, either standard or as an option, and many of these systems also specifically detect pedestrians. These systems can stop the vehicle, or slow it, to prevent a crash or to minimize the damage from one.
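
To make that concrete, here is a minimal, hypothetical sketch of the kind of time-to-collision logic such systems are built around. The thresholds, names, and numbers are invented for illustration only; no automaker's production system is this simple.

```python
# Toy sketch of forward-collision logic based on time-to-collision (TTC).
# All thresholds are illustrative assumptions, not any manufacturer's values.

def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if neither party changes speed."""
    if closing_speed_mps <= 0:          # not closing on the obstacle
        return float("inf")
    return gap_m / closing_speed_mps

def collision_response(gap_m: float, own_speed_mps: float, obstacle_speed_mps: float) -> str:
    ttc = time_to_collision(gap_m, own_speed_mps - obstacle_speed_mps)
    if ttc > 2.5:
        return "no action"
    if ttc > 1.6:
        return "warn driver"
    if ttc > 0.8:
        return "partial braking"        # scrub speed to reduce impact energy
    return "full braking"

# Example: stationary pedestrian 20 m ahead, car at 12 m/s (about 27 mph)
print(collision_response(20.0, 12.0, 0.0))   # -> "warn driver" (TTC is roughly 1.7 s)
```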

With Tesla's full self-driving capability just single months away, why has Tesla not opted to enable its vehicles' stoplight recognition and braking system? Why has it not enabled the speed limit controls that it will surely have to use when the vehicles self-drive?

Why Highlight Tesla?
We're certain that Tesla fans have outstanding, very logical reasons why these luxury-priced vehicles and their occupant/operators are still involved in crashes that kill people. Those excuses (explanations) will also come along with outrage that the media dares to highlight a brand involved in a crash. Yet, try as we might, we can't think of any other manufacturer that is charging its buyers for a system it calls full self-driving and claiming it will be operating in a handful of months.


Top of page image courtesy of Kate Larsen via Twitter.


In addition to seeing him here, John Goreham can be followed on Twitter at @johngoreham.

Submitted by DeanMcManis (not verified) on July 22, 2019 - 11:19AM


This looks like another attempt to see Tesla's success and interpret it as failure. Their success is moving forward with much-needed self-driving technology advancements. I doubt that you really know what changes are included in the software revision that will support fully autonomous driving on that hardware. This accident appears totally unrelated to that technology. There is pretty much zero chance that this driver was using Autopilot in the city where this accident occurred. The driver's speeding and running a red light are totally separate from Tesla's safety technology. The flaw in many people's understanding of computer driving-assist technologies is expecting them to be 100% accurate; they are not now, and never will be. But that is not really the point. The value is in making things safer overall. In this case Autopilot was not involved, and the driver was most likely at fault. My guess is that you wanted the autopilot to intercede without being turned on? I think that before you criticize the advanced Autopilot system you need to actually see it working, as opposed to comparing it to the ideal of what you think it should do before it becomes operational.

I think you are missing my point entirely. I don't care one whit about Autopilot, whether it was used or not, or what software version is today's flavor. If Tesla is being truthful about its "full self-driving" product timeline, it has the capability right now to enable stoplight detection and auto-braking. And it has not. And it has the ability to employ speed-limit control, and it has not. I drove Tesla's Autopilot system as recently as March 2019 alongside a Tesla employee who demonstrated all of its capabilities with me in the left seat. Formerly known as the driver's seat. I saw very little more than what Honda Sensing and Nissan Pro Pilot Assist can do. To answer your question directly, yes, I want Tesla's advanced safety systems to always be on and to intervene to protect lives. Just like the systems in almost every brand do. There should be no "off" setting for stoplight autobraking, forward collision prevention, and pedestrian detection in passenger vehicles other than law enforcement vehicles.

"There should be no "off" setting for stoplight autobraking, forward collision prevention, and pedestrian detection in passenger vehicles other than law enforcement vehicles."

No way. There are numerous reports of Autopilot identifying stationary objects above or alongside the road as obstacles in the road and slamming on the brakes in traffic. That is not acceptable.

Traffic lights are also a really hard problem. Even the most basic four-way intersection between two-lane roads isn't exactly easy because of how many possible light configurations there are. Add in multiple lanes, odd angles, etc. and human drivers get confused all the time. I drive through a six-way intersection regularly that I guarantee would confuse an automated system from time to time, especially after storms where the lights might not be pointing exactly where they're supposed to. No way I want an automated system deciding to stop the car for me when I'm driving.

Give me all the driver assists in the world, they're great, but don't actually take control of the car from me unless I've explicitly given it up.

Thanks wolrah. See my comments on auto-braking below under Dean's comments. I agree with you that stop lights are a huge challenge. But, Tesla says it has it figured out and that full self-driving is just single months away. Since they have it sorted, I make the case that the safety side of that existing tech should be ready to go now. Thanks for offering your comments. I plan to research and expand on your topic in a coming story on auto-braking tech. It really got me thinking. Cheers,
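
As a purely hypothetical illustration of the fragility wolrah describes, consider a naive rule that picks whichever detected signal head sits closest in bearing to where the lane expects one. A head twisted a few degrees after a storm can flip the answer. All names and numbers below are invented for illustration; no real perception stack works this simply.

```python
# Toy example: choose which detected signal head governs the ego lane by
# picking the head whose bearing is closest to where the lane expects it.
# Real systems use maps, geometry, and far more context than this.

def governing_signal(expected_bearing_deg, detected_heads):
    """detected_heads: list of (label, bearing_deg, state) tuples."""
    return min(detected_heads, key=lambda h: abs(h[1] - expected_bearing_deg))

# Normal geometry: the through head sits straight ahead, the left-turn head offset.
heads = [("through", 0.0, "red"), ("left-turn", 12.0, "green")]
print(governing_signal(0.0, heads))    # -> ('through', 0.0, 'red'): stop

# After wind damage twists the through head past the left-turn head's bearing,
# the same rule picks the wrong head and "sees" a green light.
heads = [("through", 15.0, "red"), ("left-turn", 12.0, "green")]
print(governing_signal(0.0, heads))    # -> ('left-turn', 12.0, 'green'): wrong answer
```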

Submitted by DeanMcManis (not verified) on July 22, 2019 - 1:48PM


First off, the full self-driving capability is not on all Teslas, just on the ones that have the extra hardware and software package that supports those capabilities. Again, odds are that this (rental?) car did not have the full autopilot+ package installed, and again, it was not enabled. Your wish that it should be automatically enabled is interesting when you are offering so much public criticism even when their safety systems are not enabled. Good luck forcing customers to have them on all of the time. I have collision alert systems on my Cadillac, and this morning I had a false alert that sensed a parked car as a "stopped" car in my path and flashed red lights on the HUD display. It startled me, but I've known that false alerts happen from time to time. Still, it would have been a totally different matter if it had suddenly locked up the brakes! I am interested in reading the other articles you have written about deaths and accidents caused by Honda's or Nissan's safety systems. Oh wait...there are none. Not because those systems are flawless, but because it doesn't make flashy headlines the way that attacking Tesla does. If it weren't for negative Autopilot articles singling out Tesla, I'm sure they would have advanced far more quickly, but with internet land mines like yours being set out everywhere, they are forced to roll out their technology carefully.

You make the best pro-Tesla arguments, Dean. And they are all valid. Just to expand the conversation, shouldn't every company "roll out their technology carefully?" Shouldn't every Tesla owner and fan ask that Tesla be the best it can be with regard to their safety and the public's?

Separate thoughts: False-positive alerts are too common, and on all brands. It is improving, though. I know from testing the advancing generations, particularly Subaru's, whose auto-braking came before Tesla's and is installed on many more vehicles. Stepping back, let's remember that in just a few model years every mainstream vehicle in America will have this technology standard and none will have "latching" off switches. In other words, you may be able to disable it, but it will resume once the vehicle restarts. I suspect almost none will allow a vehicle to shut it off over about 10 MPH (sort of like how stability control off switches work). Right now, there is no vehicle in America I know of that has a switch fast enough to disable auto-braking in real-time emergencies.
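
Here is a hypothetical sketch of the non-latching behavior and speed cutoff described above. The 10 MPH threshold is the article's guess, and all class and function names are invented for illustration; no automaker publishes its switch logic this way.

```python
# Hypothetical model of a non-persistent ("non-latching") auto-braking off switch:
# the driver can disable it only at low speed, and it re-arms on every restart.

MAX_DISABLE_SPEED_MPH = 10   # assumed cutoff, mirroring the guess in the text

class AutoBrakeSetting:
    def __init__(self):
        self.enabled = True                      # armed whenever the car starts

    def request_disable(self, speed_mph: float) -> bool:
        """Return True if the disable request was honored."""
        if speed_mph > MAX_DISABLE_SPEED_MPH:
            return False                         # ignore the switch at speed
        self.enabled = False
        return True

    def on_ignition_cycle(self) -> None:
        self.enabled = True                      # the setting does not persist

setting = AutoBrakeSetting()
print(setting.request_disable(speed_mph=45))     # False: too fast to disable
print(setting.request_disable(speed_mph=5))      # True: off until restart
setting.on_ignition_cycle()
print(setting.enabled)                           # True again after restart
```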

Submitted by Martin Winlow (not verified) on July 23, 2019 - 9:39AM


From other reports on this incident, it is far from clear if Autopilot was even engaged at the time. If it wasn't then your story is nothing more than click-bait claptrap.

Mind you, Torquenews is hardly renowned for its unbiased and informed commentary!

Hi Martin. Thank you for taking the time to read and offer a comment. Where in the story did I say that Autopilot was involved in this fatal incident? In the third paragraph, I specifically point out that we have no reports it was involved. I have not made any edits since your comment. The point of this story is that if Tesla is single months from having vehicles with "full self-driving" it must already have stoplight recognition and auto-braking ready to go. The story posits the question: "If that system is available, why don't Tesla cars have this safety feature now?"

Submitted by DeanMcManis (not verified) on July 23, 2019 - 1:47PM


This accident was ALL about driver impatience, and NOT about a shortcoming in Tesla’s safety systems. Almost all of the traffic issues and accidents I see in my 80-100 mile daily commute relate to driver impatience and distraction. Now, the autopilot technology will definitely help with distracted drivers, but the kind of driver who regularly speeds and runs red lights is exactly the person who will NEVER buy a car that limits their speed and forces them to drive more safely. Safety technologies like seat belts and airbags are federally mandated, but they are passive systems that do not police driver safety. It is most likely that Tesla will phase in the self-driving features, starting with freeway driving and then expanding to include more complex driving environments. I wonder why you regularly question the value and effectiveness of Tesla’s automated driving safety systems, and then also complain that they are not already fully implemented before the new system has even been released. How does this make sense?

Submitted by John (not verified) on July 23, 2019 - 4:24PM


I don't think this author gets it. He continuously reinforces the idea that "if Tesla is single months from having vehicles with 'full self-driving' it must already have stoplight recognition and auto-braking ready to go." Tesla could be two days away from full self-driving, and that doesn't mean any software should be released into the wild and assumed to be fully functioning. The point is that if Tesla is not ready to release the tech, there's no reason to criticize it as if it were already released and production-ready. Full stop.

Thanks for your comments and for making sure we knew when you were finished with your "full stop." Tesla's explanation of what FSD can do includes, "Your Tesla will figure out the optimal route, navigate urban streets (even without lane markings), manage complex intersections with traffic lights, stop signs and roundabouts, and handle densely packed freeways with cars moving at high speed." You can read about it on Tesla's website, of course. There, you will also see that Tesla is now enabling cars with Autopilot to sample FSD for 14 days with no fee. The promotion includes the statement, "This includes access to our suite of Full Self-Driving Capability features..." By all appearances, if Tesla is offering owners this system now as a free trial, it must be ready now, no? Nowhere on the trial page does it mention that FSD can't handle traffic lights. You can cut and paste this link if you like: https://www.tesla.com/support/full-self-driving-capability-trial#enhanced-autopilot

Submitted by Kevin (not verified) on July 25, 2019 - 4:58PM


"...why has Tesla not opted to enable its vehicles' stoplight recognition and braking system? Why has it not enabled the speed limit controls that it will surely have to use when the vehicles self-drive?"

The stoplight braking system and the other features you mentioned are what will be enabled this year as part of the FSD package. It's important to realize that FSD is not Level 3 autonomous driving, and that the driver is still required to keep their hands on the wheel and be actively paying attention. This is the time when Tesla will be able to work out any issues with the system and bring the FSD system up to a Level 3 configuration, but that won't be for another year at least. It seems as though the author is confused about what FSD actually offers as part of the initial release.