Tesla Autopilot may have trouble with stationary objects.

"Stop This Nonsense": Tesla's Autopilot and Its Problems With Stationary Object Detection

Tesla vehicles on Autopilot seem to be hitting a lot of big, obvious, unmoving objects. An expert opinion on what may be the cause and what should be done.

Tesla's Autopilot is an impressive new technology, still in its beta stage of development, being tested daily by Tesla owners on public roads. While there is little doubt that the technology could eventually prove a net positive for safety, the reality is that Autopilot has been engaged during many crashes that seem inexplicable. To summarize: Tesla vehicles using Autopilot have hit the side of a tractor trailer, crashed into not one but two stationary firetrucks, driven directly into a highway divider, and most recently, police report that a vehicle with Autopilot engaged crashed into the back of a parked police vehicle.

Tesla fans want to wish the problem away. Comments under our stories reporting the multiple crashes blame the drivers in various ways. Some feel that Autopilot should only be used on the highway, yet three of the crashes did occur on the highway. Others feel these crashes are caused by inattentive drivers. Indeed, Tesla owners have been caught on video riding in the passenger seat while Autopilot drives them around. But given that the crashes keep happening, is blaming the drivers enough? Doesn't Tesla owe it to the public to at least ensure that drivers must remain attentive for the system to operate?

The New England Motor Press Association and the Massachusetts Institute of Technology (MIT) recently convened a symposium on autonomous driving systems that brought together leading experts in the field. The presentations and panel discussions were very informative and gave the media a detailed look behind the curtain at the current state of the art. One expert from Nvidia offered a detailed overview of how autonomous driving systems are programmed to detect and then react to small objects. One example was a seagull. There are presently self-driving systems in operation on public roads that can see a seagull and know how to react to it given real-time options. Yet a Tesla Model S costing up to $160K can't detect a firetruck ahead and avoid it or stop before hitting it.

We asked the panel of experts why Tesla vehicles are repeatedly crashing into huge, brightly colored objects (white, red, and blue, with flashing lights and miles of silver and red reflective tape). You can view the question yourself in the video below; it starts at time stamp 1:24:30. The panel hesitates to reply, but MIT Research Scientist Bryan Reimer, Ph.D., tackles it when the others opt not to. Dr. Reimer begins by explaining that after the first time a Tesla Model S crashed into a huge object in its path (killing its owner/occupant), NHTSA did a thorough investigation and could have issued a recall notice; he implies it should have. Since then, there have been at least four more known crashes of vehicles on Autopilot into stationary objects, resulting in one additional death. Watch how Dr. Reimer ends his comments to hear what he thinks about Tesla vehicles crashing into stationary objects. That comment comes at time stamp 1:26:27. Dr. Reimer's comments include, "Stop this nonsense" and "We know there is an issue with the detection of static objects."

Dr. Reimer is currently part of a long-term research project working with Tesla to track and study Autopilot operation and safety. Prior to the panel discussion, he gave a detailed keynote address reviewing the project's progress to date. Dr. Reimer points out that "The benefit of Autopilot is that there are a lot of miles being driven, we can understand that, that's helpful in the understanding of design and understanding of engineering." Please watch the video to get the full context.

The other panel members, shown in the video from screen left to right, are:
Peter Secor, Vice President of Product Marketing & Business Development, Humatics
Sanford Russell, Nvidia's Senior Director of Autonomous Driving Ecosystem, North America
Kris Carter, Co-Chair of the Boston Mayor's Office of New Urban Mechanics
Luke Fletcher, Senior Manager TRI, Toyota Research Institute at MIT, Silicon Valley, Detroit
