
NHTSA Opens Probe Of Teslas' "Phantom Braking"

Tesla has an issue, honestly several. Aside from running into emergency vehicles and reports of problems with its FSD Beta software, there's another problem, "phantom braking," in which Model 3 and Model Y vehicles suddenly slow or stop on Interstates.
Posted: February 18, 2022 - 7:34PM
Author: Marc Stern

Even as automakers like Ford, General Motors, and others explore putting vehicle control and autonomous-mode software (self-driving) into general use, Tesla keeps telling the world that its AutoPilot software is in good shape. The problem is that there are issues.

Looking At Some Software Problems

Though Tesla has said many times in the last several years that its AutoPilot software is just peachy, the reality and press releases do not jibe. Let's take a look at some of its recent software problems.

Not too long ago, a Tesla in AutoPilot mode (with the vehicle's software controlling things) got in too deep, ran out of road, and struck another vehicle in California, killing the two people aboard. And, for some reason, other Teslas seem drawn to first-response vehicles like fire trucks and police cars.

Several serious crashes in California and elsewhere have involved Teslas and first-response vehicles, as Tesla software seems to have a habit of finding and crashing into them.

It is also not the first time that a Tesla has been found to have been in Full Self-Driving Beta Mode under its AutoPilot software when it was involved in a crack-up.

For example, there was the time when drivers saw a Tesla screaming past them on a major interstate. According to reports, the operator was sound asleep as the vehicle sped past, and the trip reportedly ended in a fatal crash. Another incident, in which the operator was reportedly reading a newspaper, also ended in a fatal crash. Tesla has said it has corrected the problem with a software update. But why were the crashes allowed to happen in the first place? Given the seriousness of someone falling asleep or reading a newspaper behind the wheel, one would have thought the vehicle should not have been allowed to go into autonomous mode at all.

Teslas Have Issue With First-Response Vehicles

Then, there's the problem of Teslas having an affinity for first-response vehicles, both police cars and fire trucks. This problem, which caused serious injuries in a number of crashes, ultimately led to another NHTSA investigation of Tesla.

Interestingly, says AutoWeek, just a few weeks after NHTSA launched a probe into Tesla crashes into emergency vehicles, the automaker said its AutoPilot driver-assist system can now see emergency lights and slow down, but only at night.

There was no major announcement of the upgraded functionality. According to the enthusiast publication, a Twitter user, Analytic.eth, spotted the change in updated vehicle manuals.

A recently updated driver's manual now states that "If Model 3/Model Y detects lights from" emergency vehicles "when using AutoSteer at night on a high-speed road, the driving speed is automatically reduced, and the touch screen displays a message informing you of the slowdown."

"You will also hear a chime and see a reminder to keep your hands on the steering wheel. When the light detections pass by or cease to appear, AutoPilot resumes your cruising speed. Alternatively, you may tap the accelerator to resume your cruising speed."

Some Models May Not Detect Emergency Vehicles

According to AutoWeek, the automaker "does note that Model 3 and Model Y" vehicles might "still not detect vehicles' emergency lights in absolutely all situations, reiterating that" drivers "should keep their eyes on the road."

AutoWeek continued that the "functionality was added only after the NHTSA launched its investigation into Tesla crashes into police and fire vehicles." The enthusiast site notes that it "may have been planned for some time and that it's also confined to nighttime." AutoWeek notes that "not all Tesla crashes into emergency vehicles occurred at night, and Tesla does not elaborate just how dark it has to be outside for this functionality to kick in."

AutoWeek notes that this "could be a bit of an issue since the majority of highway crashes or even traffic stops do not happen at night – they happen during daytime, morning, and evening hours." Further, they said that "emergency light usage at night is far easier to spot by a human driver in the first place, from up to several miles away. So, this is an improvement, but not one that can be taken advantage of during the hours when most emergency vehicle responses or traffic stops occur on highways."

The National Highway Traffic Safety Administration (NHTSA) has recently looked at Tesla's AutoPilot software. And now the agency has another Tesla probe on its hands, as the ADAS software has been giving drivers fits, with their Model 3 and Model Y vehicles slowing suddenly from high speeds on major throughways.

NHTSA Opens New Probe Of Teslas

The agency opened a new probe of Teslas this week. It responded to a basketful of complaints from 354 owners who told NHTSA of the alarming problem of their vehicles suddenly and dangerously slowing down from highway speeds at random times. NHTSA has termed this a probe of phantom braking.

The agency's Office of Defects Investigation (ODI) has had more than 350 complaints of what has been termed "phantom braking" in 2021-2022 Model 3 and Model Y vehicles.

The problem isn't new, either, as reports of it go back nine months or more. Some 416,000 vehicles are covered by the new investigation, according to AutoWeek. Indeed, the problem, says AutoWeek, has been occurring for much longer than just nine months.

"Phantom braking is actually an issue we've heard about in connection to AutoPilot for years, certainly prior to the relatively small population of 2021 and 2022 Model Y and Model 3 vehicles targeted in this investigation." AutoWeek indeed acknowledges that "it's a serious issue" because as vehicles slow down sharply, it increases the risk of "colliding with trailing vehicles."

Were There Other Reports?

NHTSA "does not say whether it has received any reports of the same issue before 2021, or what event or volume of reports triggered the investigation this week" (though 354 complaints do qualify as a number to kick off another Tesla probe).

The safety agency said that it had received the reports over the last nine months, and it noted that consumers had called the braking problem "phantom braking." It further said that the automaker "describes the subject vehicles as equipped with a suite of advanced driver assistance system (ADAS) features referred to as AutoPilot, which Tesla states will allow the vehicle to brake and steer within its lanes."

AutoWeek noted that Tesla had switched to a vision-only approach to AutoPilot sensors "in some new models within the past year, which could possibly explain the relatively narrow pool of affected cars. In May 2021, the automaker would drop radar from these two models, with the system relying solely on cameras to maneuver the vehicles in AutoPilot mode. The move was greeted with some skepticism from industry observers at the time, with many noting that a number of sensor types are generally preferable to fewer."

The ODI's preliminary evaluation "is considered introductory and mostly involves the review of service bulletins and owner complaints. At this stage, the matter could be closed, or it could instead proceed to an engineering analysis (EA)." Depending on the "outcome of an EA, the matter could be closed or escalate to a recall. The EA process itself could take quite some time before proceeding to a recall if warranted."

AutoWeek said that "it's worth noting that Tesla has already issued a number of over-the-air updates to address various AutoPilot issues, including one concerning phantom braking, so a possible future recall would likely involve more over-the-air updates."

NHTSA Probing Teslas Hitting First Response Vehicles

AutoWeek noted in an article this week that NHTSA has been "investigating more than a dozen instances of Tesla crashes into the backs of first responder vehicles while AutoPilot was engaged." Other, longer-running agency probes have looked into reports of drivers using AutoPilot as if it were fully autonomous, covering instances that ended tragically (the sleeping driver and the reading driver).

It's funny how Tesla has referred to Full Self-Driving (FSD) mode. Even though NHTSA has recently taken a relatively "liberal approach to Tesla's various ADAS endeavors, including the distribution of Full Self-Driving Beta software to vehicle owners," the company notes that FSD does not make a vehicle self-driving (though one wonders what it does do).

AutoWeek concluded that the NHTSA earlier this year "prompted Tesla to disable certain features in vehicles operating FSD Beta software that could permit them to illegally roll through stop signs instead of coming to a full stop." And while you might wonder whether Tesla allows vehicles to operate in fully autonomous mode, notice that the automaker calls it FSD Beta software. In other words, it is software that is still in testing, which is a tacit admission that the behavior can happen: driving with your hands off the wheel.

Elon Musk, CEO of the electric vehicle maker, is a staunch defender of Tesla's "Full Self-Driving" tech. The automaker has had a number of issues recently, from software glitches to suspension separations. And, while it has had its recent problems, Musk has said FSD isn't like "some little feature." On the quarterly earnings call, he called the FSD tech "the most profound software upgrade in history."

Photo Courtesy Tesla Motors

Normally, I am one of the Ford reporters for Torque News; however, I have written about a number of models, including Teslas, here. I have been an automotive writer since 1971, when an otherwise normal news editor said, "You're our new car editor," and dumped about 27 pounds of auto stuff on my desk. I was in heaven, as I have been a gearhead from my early days. As a teen, I spent the usual number of misspent hours hanging out at Shell and Texaco gas stations (a big thing in my youth) and working on cars. From there on, it was a straight line to my first column for the paper, "You Auto Know," an enterprise that I handled faithfully for 32 years. Not many people know that I also handled computer documentation for a good part of my living while writing YAN. My best writing, though, was always about cars. My work has appeared in Popular Mechanics, Mechanix Illustrated, AutoWeek, SuperStock, Trailer Life, Old Cars Weekly, Special Interest Autos, etc. You can follow me on Twitter or Facebook.


DeanMcManis (not verified)    February 19, 2022 - 7:26PM

First off, it is important to understand that NONE of the Teslas crashed into emergency vehicles, or drove sleeping drivers to their deaths, on their own. Almost every single accident (and they were very rare) was solely the responsibility of the drivers, who are required to drive their Teslas with the driver assist merely helping. Drivers were not paying attention to the road and did not see or stop for the parked vehicles stuck in the middle of the road, as people do on occasion. The mistake is that Tesla has mis-titled its driver-assist software "Autopilot" and its beta program "Full Self Driving," fostering the false assumption that these EVs are capable of driving themselves without human assistance. And when a few people cheated or tricked the system into believing that the human was driving and paying attention, and the car crashed, the Tesla system was fully blamed for the accident. I do believe that the recent occurrences of "phantom braking" are a consequence of Tesla choosing to drop redundant sensors like radar from its driver-assist systems. I have worked on vehicle tracking systems before, which were based only on video cameras, and they were flawed because there were gaps in the visual information that the intelligent software could not resolve with 100% certainty. Other sensors have their own shortcomings, but they have strengths over video cameras in certain instances. Of course, Tesla and others have made great strides in software and hardware over the years, but ultimately, when you give the car complete control to drive and stop, even 99.99% accuracy means the software makes bad judgments from time to time. The reality of the situation is that, as humans, we make similar mistakes far more often. But we have grown accustomed to "human error," let alone people who are tired, distracted, or under the influence.
The head of the NHTSA has a personal grudge against Elon Musk and Tesla because they have a financial stake in a LIDAR company, and Musk chose not to use LIDAR in Autopilot. So the NHTSA will take every opportunity to file and pursue claims against Tesla, Autopilot, and FSD in a way that it does not attack other driver-assist products from legacy automakers. So Tesla is to blame for its marketing names and claims, "Autopilot" and "Full Self Driving," but it has made recent changes (internal cameras, seat and belt sensors) to reduce the possibility that drivers could fake out the system and allow full computer control without driver monitoring. And it is steadily improving the Autopilot software so that it is not fooled by the cameras into misjudging the data and causing phantom braking. Regulators may be able to force Tesla to rename its Autopilot software, or Tesla might be able to improve it quickly enough (to eliminate those false responses) before regulators take legal action.