
"Stop This Nonsense": Tesla's Autopilot and Its Problems With Stationary Object Detection

Tesla vehicles on Autopilot seem to be hitting a lot of big, obvious, unmoving objects. An expert opinion on what may be the cause and what should be done.

Tesla's Autopilot is an amazing new technology, still in its beta stage of development, being tested daily by Tesla owners on public roads. While there is little doubt that the technology could eventually prove to be a net positive for safety, the reality is that Autopilot has been engaged during many crashes that seem ridiculous. To summarize: Tesla vehicles using Autopilot have hit the side of tractor trailers, crashed into not one but two stationary firetrucks, driven directly into a highway divider, and, most recently, according to police, crashed into the back of a parked police vehicle.

Tesla fans want to wish the problem away. Comments under our stories reporting the multiple crashes blame the drivers in various ways. Some feel that Autopilot should only be used on the highway, but three of the crashes did occur on the highway. Others feel that these crashes are caused by inattentive drivers. Indeed, Tesla owners have been caught on video sitting in the passenger seat while Autopilot drives them around. But given that the crashes keep happening, is blaming the drivers enough? Doesn't Tesla owe it to the public to at least ensure that drivers must remain attentive for the system to operate?

The New England Motor Press and the Massachusetts Institute of Technology (MIT) recently convened a symposium on autonomous driving systems that brought together leading experts in the field. The presentations and panel discussions were very informative and gave the media a detailed look behind the curtain at the current state of the art. One expert from Nvidia gave a detailed overview of how autonomous driving systems are programmed to detect and then react to small objects. One example was a seagull. There are self-driving systems in operation on public roads today that can see a seagull and know how to react to it given real-time options. Yet a Tesla Model S costing up to $160K can't detect a firetruck ahead and avoid it or stop before hitting it.

We asked the panel of experts why Tesla vehicles are repeatedly crashing into huge, brightly colored objects (white, red, and blue, with flashing lights and miles of silver and red reflective tape). You can view the question yourself in the video below, starting at time stamp 1:24:30. The panel hesitates to reply, but MIT Research Scientist Bryan Reimer, Ph.D., tackles it when the others opt not to. Dr. Reimer begins by explaining that after the first time a Tesla Model S crashed into a huge object in its path (killing its owner/occupant), NHTSA did a thorough investigation and could have, he implies should have, issued a recall notice. Since then, there have been at least four more known crashes of vehicles on Autopilot into stationary objects, resulting in one additional death. Watch how Dr. Reimer ends his comments to hear what he thinks about Tesla vehicles crashing into stationary objects. That comment comes at time stamp 1:26:27. Dr. Reimer's comments include, "Stop this nonsense" and "We know there is an issue with the detection of static objects."

Dr. Reimer is currently part of a long-term research project working with Tesla vehicles to track and study Autopilot operation and safety. Prior to the panel discussion, he gave a detailed overview of the progress to date in a keynote address. Dr. Reimer points out that "The benefit of Autopilot is that there are a lot of miles being driven, we can understand that, that's helpful in the understanding of design and understanding of engineering." Please watch the video to get the full context.

The other panel members, shown in the video from screen left to right, are:
Peter Secor, Humatics' Vice President Product Marketing & Business Development
Sanford Russell, Nvidia's Senior Director of Autonomous Driving Ecosystem, North America
Kris Carter, Co-Chair of the Boston Mayor's Office of New Urban Mechanics
Luke Fletcher, Senior Manager TRI, Toyota Research Institute at MIT, Silicon Valley, Detroit

Comments

Paul Judd (not verified)    June 8, 2018 - 7:37AM

In the barrier crash, Autopilot was set to 10 mph above the speed limit. Why is this possible to do? It must have a bearing on the accident. Reckless driving?

Dean McManis    June 27, 2018 - 6:29PM

Years ago I helped work on a tracking system for Toyota/Lexus cars. It monitored and alerted when people were in the path of the car at night. Another part of the system, called the inattentive-driver monitor, used an internal camera to track eye movement.
The main goal was to alert when the driver's eyes were off the road for a period of time. This included reading and, importantly, sleeping. I really do not know why a similar system is not designed into all autonomous cars today, to alert drivers and enforce the idea that these systems are still in the research phase and imperfect.
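[Editor's note: To make that idea concrete, here is a minimal Python sketch of eyes-off-road alert logic. This is an illustration only, not the Toyota/Lexus system the commenter describes; the field names, threshold, and gaze classifier are assumptions.]

def eyes_on_road(gaze_sample):
    # Placeholder: a production system would classify gaze from the cabin camera.
    return gaze_sample["gaze_region"] == "road"

def monitor_driver(gaze_stream, eyes_off_limit_s=2.0):
    # Alert when the driver's eyes have been off the road longer than the limit.
    off_road_since = None
    for sample in gaze_stream:          # e.g. one sample per camera frame
        if eyes_on_road(sample):
            off_road_since = None
        elif off_road_since is None:
            off_road_since = sample["timestamp"]
        elif sample["timestamp"] - off_road_since > eyes_off_limit_s:
            print("ALERT: eyes off road for more than", eyes_off_limit_s, "seconds")
            off_road_since = sample["timestamp"]   # re-arm so the alert can repeat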
The trouble with public perception is that people seem to think of advanced technologies as either untrustworthy or perfect. The truth is that they are evolving technologies, and the flaw with the press is that they call out items like this, where the self-driving car missed seeing the big red fire truck, when the reality is that tracking systems detect objects differently than we do. But they are far better than we are in so many other areas, like real-time, 360-degree monitoring, 24/7.
The key to higher accuracy is multiple, overlapping technologies, because optical, laser/LIDAR, microwave, and other technologies each have their advantages and disadvantages.
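[Editor's note: As a rough illustration of why overlapping sensors help, here is a simplified Python sketch. It does not represent any particular production stack; the sensor list and the two-of-three voting rule are assumptions.]

def fuse_detections(camera_hit, radar_hit, lidar_hit):
    # Toy voting scheme: treat an obstacle as confirmed when at least two
    # independent sensing modalities agree, so one sensor's weakness
    # (e.g. radar filtering out stationary returns) does not decide alone.
    return sum([camera_hit, radar_hit, lidar_hit]) >= 2

# A stationary firetruck that radar discards as roadside clutter can still be
# confirmed if both the camera and the lidar report it.
print(fuse_detections(camera_hit=True, radar_hit=False, lidar_hit=True))   # True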
The thing that I think about each time I read another story about a Tesla Autopilot accident is the 100 similar accidents (which aren't being written about) that happened across the country on the same day due to driver error, drunkenness, texting, or impatience. Tesla is providing a working automated driving system today, and they are learning from its mistakes and improving the product that is poised to save countless lives in the future. Kudos to them.