Analysis: Autonomous Safety Systems Coming, But Still Years Away


Submitted by Marc Stern on September 29, 2021 - 5:02PM

To listen to some automakers, the autonomous vehicle -- the self-driving car -- is here today, as they believe that AI-controlled safety software is mature and ready. But is it? Our Torque News analysis doesn't agree.

You have to keep wondering what has gotten into the design and marketing teams of the auto world. It began about a decade ago, when someone decided it would be great if cars and trucks could handle the driving chores for people.

Do Computers Handle Things Better?

After all, the reasoning went, computers and programming would handle the driving chores far better than your basic operator. Under the control of computers and programming, vehicles would be in far better hands than those of drivers who do, after all, get tired, who may be cranky and out of sorts, or who could just be feeling unwell.

Whatever the case, automotive developers reasoned that these factors would be dialed out of the driving equation, leaving the people with the key fobs time to take care of business or office chores. Drivers could use the dashboard time to their great advantage by letting computers handle the mundane chores of driving: merging with traffic, starting and stopping, and avoiding other vehicles.

It's a desirable goal. Taking the weight of driving off the operator's shoulders could lead to safer highways and better transport, the reasoning goes. At least, that seems to be the pivotal theory behind autonomous vehicles and driving. Imagine a world where your vehicle responds to a route you have chosen, analyzes all of the possible troubles it may encounter, and then takes care of the mundane driving chores for you.

Wow, it's a great picture. And, to some extent, it seems to be working, in limited ways, with Ford's BlueCruise and GM's Super Cruise hands-off driving system.

Okay, you probably know what's coming now: the significant BUT. But is it a reality? In the humble opinion (okay, not so humble) of this writer, the world of vehicle autonomy has a way to go before you can get into a vehicle and let it drive for you.

Some Systems, Like Ford's, Show Real Promise

And I am not saying that BlueCruise, the Ford hands-off product, isn't a good idea; so far, it has received raves in the technical press, and its salient points make it workable on many levels. But here's the thing: can it handle the full monty? Can it take over for you as you drive your Ford F-150 Lightning electric truck?

I think it has a way to go before we can honestly say everything is ready for prime time.

I like BlueCruise because the driver can take control and handle the vehicle if there is a problem on the road; indeed, that is expected. But the trouble with most drivers is that they assume the system can handle everything, so they can sit back and let the vehicle have its way with the road ahead.

Honestly, the experience of Tesla and its Autopilot software system is quite informative about the general state of driving software. Indeed, the National Highway Traffic Safety Administration (NHTSA) is looking closely at the Autopilot system following many severe and fatal accidents.

Let's face it, Tesla's Autopilot software has flaws, despite the automaker's stance that it is ready. One problem that keeps cropping up is Autopilot's inability to distinguish police, fire, and other first-response vehicles from other vehicles. You would think that this would be a no-brainer, but it isn't. Last February, a Tesla Model X operating on Autopilot slammed into five Splendora, Texas, police officers and their vehicles.

Tesla Sued By Police Officers

The officers were involved in a traffic stop near Houston, stopped with their cruisers in the right-hand lane of an expressway. They were conducting a drug search with a canine when the Model X approached at speed and slammed into two police cruisers. The Tesla was doing about 70 mph at the time of the crash.

According to Business Insider, which cited court records, the Tesla pushed the two parked police cruisers into the police officers and the driver of the stopped vehicle. The officers have sued Tesla over the incident. According to the suit, Tesla engaged in false advertising by claiming that the Autopilot system can handle driving chores better than a human driver.

"You've probably seen that Elon Musk and Tesla have proudly touted [that] Teslas on AutoPilot are safer than your everyday driver, that Teslas on AutoPilot have fewer accidents than others," Tony Buzbee, an attorney, said in an interview with KPRC 2. "But, what we've learned is that this information is misleading.

The lawsuit claims that Tesla's Autopilot mode was "completely unable to detect the existence of at least four vehicles, six people, and a German Shepherd fully stopped in the lane of traffic because it does not recognize cars and pedestrians when lights are flashing." Indeed, the suit claims that the issue still has not been fixed, "despite multiple crashes with first responders," according to Business Insider. Tesla had no comment.

It isn't the first issue that the Autopilot system has had. NHTSA has been looking at the software closely as part of several investigations. According to the safety agency, more than 11 crashes have been identified in which Teslas using Autopilot or "Traffic-Aware Cruise Control" struck first-response vehicles at crash or other emergency scenes. The majority of the crashes occurred at night, says Business Insider, with flashing lights illuminated, plus road flares and traffic cones.

Tesla, Restaurant Sued Over Crash

The Splendora officers' suit seeks damages for "multiple injuries and permanent disabilities in a range of $1 million to as much as $20 million." In addition, the officers have sued a restaurant for reportedly overserving the Tesla's driver before the crash. "Police reports from February show the driver was taken into custody due to suspicions" of operating under the influence. There was no comment from the restaurant regarding the crash.

Meanwhile, NHTSA has been studying as many as 14 other crashes in which the Autopilot software may have played a role. In one 2020 crash, a Tesla slammed into a fire truck at speed, killing a passenger; the vehicle was reportedly using the Autopilot system at the time. The same day, another Tesla ran a red light and struck a Honda Civic in Gardena, Calif., killing two people in the Civic. According to information from the safety agency, accidents involving Teslas and their Autopilot software rose 138 percent from 2018 to 2019.

With this type of record, you have to wonder whether autonomous vehicle performance is little more than a pipe dream. Granted, Tesla drivers seem to be pushing the envelope of vehicle autonomy. Still, it doesn't instill confidence in the software as a whole, no matter what the automaker claims.

Perhaps it is an implementation issue, as Tesla seems to be the automaker with the largest number of open NHTSA investigations involving first responders and civilian fatalities.

BlueCruise involves a relatively robust AI (artificial intelligence) software suite from Ford that keeps the driver involved in the vehicle's operation. Other autonomous software packages, such as GM's Super Cruise, do the same. Yes, you can relax and let the software suite take over, but the driver has to stay involved in monitoring, and if the AI software determines there is a problem, the driver is brought back into the picture almost instantly.
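For readers who want to see that handoff pattern in concrete terms, here is a minimal sketch in Python. It is purely illustrative, built on hypothetical names and checks of my own; neither Ford nor GM publishes its actual implementation, and this is not it.

    # A minimal, hypothetical sketch of a hands-free handoff loop:
    # the system drives only while both the driver and the system
    # itself pass continuous checks; any failed check hands control
    # back to the driver. All names here are illustrative.
    from enum import Enum, auto

    class Mode(Enum):
        HANDS_FREE = auto()  # software steers, brakes, and accelerates
        DRIVER = auto()      # driver has been handed back control

    def next_mode(driver_attentive: bool, system_confident: bool) -> Mode:
        """One tick of the monitoring loop, run many times per second.

        driver_attentive: e.g., a gaze check from a driver-facing camera
        system_confident: e.g., lane lines visible, sensors unobstructed
        """
        if driver_attentive and system_confident:
            return Mode.HANDS_FREE
        # In practice, escalating chimes and visual warnings come first.
        return Mode.DRIVER

    # Example: a blocked camera (system_confident=False) forces a handoff.
    assert next_mode(driver_attentive=True, system_confident=False) is Mode.DRIVER

The point of the sketch is the structure: the driver check and the system's own self-check run continuously, and hands-free mode survives only while both pass.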

Notice, though, that BlueCruise and the software offered by other automakers are more limited right now. However, future versions will likely expand the level of autonomy.

What Causes The Problems?

Why aren't software suites like Autopilot and others ready for prime time? It's simple: the automotive environment.

Take Ford, for instance. Last year, there was a problem with the video camera used in the Mustang. According to a recall notice, the camera could twist out of its holder and either deliver the wrong video image or none at all. The fix was replacing the video system; until the camera was replaced, the system was useless. And that was just one recall.

Here's the problem with similar systems if you happen to live in the Snowbelt: unless the aperture of the video system remains snow- and slush-free, the camera system is useless.

Of course, the automaker will never advertise this, as it detracts from the system's usefulness. Instead, the system relies on other sensor suites to take care of other issues. If your vehicle has snow, ice, or slush caked on the sensor array for the duration of a road trip, the system is toast. And if a heater element is used to try to keep the array clear, you may have another ice problem, as melted snow can refreeze and build up ice around the sensor.

Then, there's the inclement weather itself. In the summer, there are problems with rain, fog, and heavy precipitation. Winter has its own set of problems with snow, slush, ice, and the like. And then there are the related sensors that enable the software-controlled cruise control and braking, lane-centering, and other systems; poor weather can cause the video, sonar, or lidar systems behind them to fail. These are the problems that advocates of vehicle autonomy choose to ignore.

Total Fix Still Years Away

Now, I am not saying these issues will remain unfixed forever, because they won't; they will be fixed. But a total solution is still fully years away. Of course, if you ask Tesla, things are already set and ready to go. Yet, for some reason, the Autopilot software won't recognize first-response vehicles, a significant problem, as are the fatalities that NHTSA is investigating.

Despite all the talk from Tesla, Ford, GM, and others, there are still too many factors that must be made fully workable before believing things are ready. Unfortunately, the solution to these issues is, I believe, still some years away.

The revolution in electric vehicles will be the driving force that pushes the industry to perfect autonomous performance. Since electrics are the wave of the future, and automakers emphasize electronic safety and control on them, it stands to reason that as electric vehicles mature, their safety systems will mature with them. That maturity is still, as noted, years away.

Marc Stern has been an automotive writer since 1971, when an otherwise normal news editor said, "You're our new car editor," and dumped about 27 pounds of auto stuff on my desk. I was in heaven, as I have been a gearhead since my early days. As a teen, I spent the usual number of misspent hours hanging out at Shell and Texaco gas stations (a big thing in my youth) and working on cars. From there on, it was a straight line to my first column for the paper, "You Auto Know," an enterprise that I handled faithfully for 32 years. Not many people know that I also handled computer documentation for a good part of my living while writing YAN. My best writing, though, was always about cars. My work has appeared in Popular Mechanics, Mechanix Illustrated, AutoWeek, SuperStock, Trailer Life, Old Cars Weekly, Special Interest Autos, and others. You can follow me on Twitter or Facebook.

Submitted by tangible (not verified) on September 29, 2021 - 11:28PM


Thank you for the well-reasoned reality check. It's about time it was done without the hysteria of a parrot media incapable of independent thinking.
However, the fanboys, retail investors, and advocates of FSD and its kin will bury you before acknowledging any weakness in the concept they've staked their self-image on.
FSD is a noble goal and will come eventually. But subjecting the public to beta testing risks many innocents. It has to be let into the wild at some point, but who makes that call?