
New sensors aim to fill gaps in self-driving cars

Self-driving cars still have trouble handling certain driving conditions. New types of sensors could fill those gaps.

By Eric C. Evarts

With nearly 80 percent of road accidents caused at least in part by driver error, self-driving cars are supposed to dramatically improve driver safety.

Yet accident rates for self-driving cars are showing no such improvement so far. Every few months seems to bring a spate of crashes in Teslas operating under the company’s Autopilot partial self-driving system. And even the best companies that operate self-driving cars under special permits in California still report human safety drivers having to intervene every 13,000 to 18,000 miles or more often, according to 2019 statistics.

So far, that’s still a lot more often than human drivers, who insurance industry studies show go an average of almost 225,000 miles between accidents. (In fairness, the statistics aren’t parallel, because no one measures near misses—more equivalent to “disengagements”—for human drivers. A small-scale 2006 study by the Virginia Tech Transportation Institute showed drivers nearly crashed once every 2,980 miles, putting the best self-driving services ahead of that group of drivers.)
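
For a rough sense of scale, here is a minimal sketch in Python that normalizes the figures above to incidents per 100,000 miles. It is purely illustrative: disengagements, accidents, and near-crashes are not equivalent events, so the comparison is apples to oranges at best.

```python
# Back-of-the-envelope comparison of the figures cited above, normalized
# to incidents per 100,000 miles. Purely illustrative: disengagements,
# accidents, and near-crashes are not equivalent events.
miles_per_incident = {
    "self-driving cars (disengagements, best case)": 18_000,
    "self-driving cars (disengagements, worst case)": 13_000,
    "human drivers (accidents, insurance data)": 225_000,
    "human drivers (near-crashes, 2006 VTTI study)": 2_980,
}

for label, miles in miles_per_incident.items():
    print(f"{label}: about {100_000 / miles:.1f} per 100,000 miles")
```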

Adasky thermal cameras

So why aren’t today’s self-driving systems better?

Engineers are frantically working to improve software, to “teach” self-driving systems to handle a wider variety of driving situations. But part of the problem also comes down to sensors. The array of sensors now on self-driving cars covers only a portion of driving needs (though some self-driving promoters say they can fill the gaps with software). For example, many self-driving systems shut down in bad weather—right when drivers say they need them most. And most struggle to identify pedestrians and other objects at night. Identifying traffic lights poses another challenge.

Torque News recently spoke with two companies that are developing new types of sensors to fill in these gaps.

Today’s self-driving cars use four main types of sensors: cameras, radar, lidar, and ultrasonic.

  • Ultrasonic sensors power the parking sensor systems in many cars on the road today, which beep more and more rapidly the closer you get to a parked car. The actual sensors are the little dots in bumpers or sometimes fenders. They detect the distance to stationary objects (or objects that are mostly stationary relative to your own car), and they don’t cost very much.
  • Cameras are the most prolific of self-driving sensors. Automakers use camera systems to identify lane lines so lane guidance systems can keep cars in their lanes. Radar and ultrasonic sensors can’t identify the objects they detect; they only know something is there, not what it is. Programmers use camera images, processed with sophisticated modeling software, to fill in those knowledge gaps: once the software identifies an object in a camera image, it can tell the car what to do about it. For example, if the object looks like the back of a car ahead of you, it’s likely to keep going in your direction and get out of your way. If it looks like a bicyclist, it’s likely to be moving slowly and could be taking up some of your lane to the right. Cameras let programmers predict how objects may behave relative to your car at least several seconds ahead of time (see the sketch after this list).
  • Some automakers use dual cameras looking ahead of the car to replicate human stereo vision, which allows cameras to perform some of the functions of radar in determining distance and relative speed.

    Cameras, however, can have their views obscured by snow, rain, and dirt, rendering them nearly useless in bad weather. And with current technology, they can’t see as well in the dark. Drivers of cars that use these systems find that they simply shut down—warning the driver that they’re not working—just when they’re needed most: at night and in bad weather. (A 2016 AAA study of NHTSA crash data found that, while adaptive safety systems run by cameras and radar reduced accidents by 55 percent in the daytime, their effect at night was negligible.)

  • Radar is also straightforward. Used in naval navigation for decades, radar emits radio waves to measure the distance and relative speed of stationary or moving objects farther away. While it can “see” through bad weather, it can’t tell what it’s looking at. Many high-end cars, and even fancy versions of ordinary cars, use radar to inform active cruise control systems that match pace with cars ahead on the highway. Usually mounted in the grille or under the front bumper, radar sensors are also used in automatic emergency braking systems. Most can only track the car directly in front; some of the latest also “look” under the car ahead to track the car ahead of that. Radar is better at picking out solid objects, such as other cars and buildings, than “softer” objects, such as pedestrians.
  • Lidar works like radar, but uses laser-light frequencies rather than radio waves. Lidar systems are usually mounted on spinning drums to give them a 360-degree view around the car. The sensors cost upwards of $5,000 apiece, and many cars use four or more of them.
  • Lidar became controversial last May, after Tesla CEO Elon Musk dedicated much of a day-long self-driving conference to calling it wasteful and unnecessary. Tesla doesn’t use lidar, and Musk claims the company can eliminate the need for it with better software to map data from cameras. Engineers at other self-driving technology companies, such as Alphabet’s (Google’s) Waymo, call it indispensable.
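
To make the camera discussion above a little more concrete, here is a minimal, purely illustrative Python sketch of the idea of mapping an identified object to a coarse expectation of how it will move. The labels, fields, and rules are invented for illustration and do not reflect any automaker’s actual software.

```python
from dataclasses import dataclass

# Hypothetical example only: the labels and rules below are invented for
# illustration and do not reflect any automaker's actual software. The
# point is the shape of the problem the article describes: once a camera
# image has been classified, the software turns "what the object is" into
# a rough expectation of how it will move over the next few seconds.

@dataclass
class Detection:
    label: str         # e.g. "car_rear", "cyclist", "pedestrian"
    distance_m: float  # distance ahead, e.g. from radar, lidar, or stereo cameras

def expected_behavior(det: Detection) -> str:
    """Map an identified object to a coarse motion expectation."""
    if det.label == "car_rear":
        return "likely to keep moving in our direction and out of our way"
    if det.label == "cyclist":
        return "slow-moving, may occupy the right edge of our lane"
    if det.label == "pedestrian":
        return "may change direction or step into the roadway"
    return "unknown object: slow down and keep a wide margin"

if __name__ == "__main__":
    for det in [Detection("car_rear", 40.0), Detection("cyclist", 15.0)]:
        print(f"{det.label} at {det.distance_m:.0f} m: {expected_behavior(det)}")
```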

Software is a critical consideration. Computers can only act on what they know. They’re not as good as people at inferring what might happen next. To turn computers into truly safe drivers, programmers have to input every possible driving scenario and the outline of every conceivable object that could cross a vehicle’s path.

When we asked one former Google self-driving engineer several years ago what was the most unpredictable object the company’s fleet of cars had encountered, his response came easily and quickly: an old woman in a wheelchair, waving a cane, chasing a duck across the road. How is an engineer supposed to predict that scenario and tell the car to watch out for it?

The leading companies are using heavy doses of machine learning and artificial intelligence to accomplish a lot of this work, but cars still can’t learn what to do in a given situation until they encounter it. Even then, Tesla and Google have armies of programmers combing through rare driving occurrences to tell the computers what to do, and the process could take decades. How often would a self-learning self-driving car encounter such an old woman chasing a duck?

That’s where companies with new types of sensors come in.

Heat seeking
An Israeli company called Adasky has developed a thermal camera that sees objects as heat sources. Infrared cameras are hardly new technology, having been used in fighter jets for years, but few companies have so far deployed them in self-driving cars.

We had a chance to ride along with Adasky Sales Director Raz Peleg in a car he drove himself, but which was equipped with Adasky’s thermal imaging camera and a laptop that showed us the view the camera saw. Like a standard visible-light camera, the thermal camera could pick out lane lines on the road as we drove, along with all types of traffic. It also had no trouble picking out a family waiting at a crosswalk, including a couple of shorter kids hidden from the naked eye behind some bushes. Unlike a standard camera, it had no trouble spotting the family, oncoming traffic, stoplights, and other road markings, even as the setting sun was blinding drivers through the windshield.

Peleg notes that the cameras cost about $500 apiece, roughly a tenth the cost of lidar or half the cost of forward radar. He says the company is not looking to replace other sensors in self-driving cars; rather, the thermal camera can fill in gaps that other sensors miss and make it easier for programmers to identify obstacles that are difficult to pick out with other sensors. For example, it may have been able to recognize both the woman and the duck as living things and separate them from the wheelchair and the cane, rather than requiring programmers to build the whole scene into the software.
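
As an illustration of why body heat is such a useful signal, here is a toy Python sketch that flags the parts of a thermal frame falling in a rough body-temperature band. This is not Adasky’s actual processing; the temperature band and the fake frame are assumptions made up for the example.

```python
import numpy as np

# Illustrative only: a toy way a thermal frame (a 2-D grid of estimated
# surface temperatures in degrees C) could be used to flag likely living
# things, as opposed to cooler objects like a wheelchair or a cane. This
# is not Adasky's actual pipeline; the threshold band is a made-up number.

BODY_TEMP_RANGE_C = (25.0, 42.0)  # rough band for exposed skin or fur

def warm_object_mask(frame_c: np.ndarray) -> np.ndarray:
    """Return a boolean mask of pixels in the body-temperature band."""
    low, high = BODY_TEMP_RANGE_C
    return (frame_c >= low) & (frame_c <= high)

def fraction_warm(frame_c: np.ndarray) -> float:
    """Fraction of the frame occupied by body-temperature pixels."""
    return float(warm_object_mask(frame_c).mean())

if __name__ == "__main__":
    # Fake 4x4 frame: a cold road scene with one warm patch (a pedestrian).
    frame = np.full((4, 4), 10.0)
    frame[1:3, 1:3] = 33.0
    print(f"warm pixels: {fraction_warm(frame):.0%}")
```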

In a famous 2018 accident in Phoenix, a woman crossing a street in the dark was struck and killed by a self-driving Uber test car; Peleg is confident his sensor would have identified her. The company also notes that there are 15,500 vehicle collisions with deer every year in the U.S., which its sensors could help mitigate.

Adasky also has plans to install thermal imaging sensors in traffic lights, says CEO Yakov Shaharabani. The sensors would communicate with adaptive driver safety systems—the precursors to self-driving systems—to bring a car to a stop if, for example, they detect a jaywalker crossing against the red.

Adasky says it expects to have its system installed on a car for sale in Israel soon, and is in talks with more than one company that sells cars in the U.S.

Radar down
Another company working to improve cars’ self-driving capabilities in bad weather is WaveSense.

The company takes forward-facing radar and points it down to look into the ground. While ordinary cameras and even thermal cameras have trouble seeing through a curtain of falling snow, much less a foot-thick layer of it on the ground, radar can see right through it.

In a company demonstration video, the ground-penetrating radar is able to find and follow the road, and even stay in its lane, on snow-covered roads. The catch is that the roads all have to be mapped by radar first for it to work.

The system works by tracking rocks, gravel beds, and other natural or man-made features of the ground under the road. These features are turned into a digital signature, which the radar then matches against as it drives. The technology can also help in fog, heavy rain, or with poor lane markings, the company says.
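
Here is a minimal Python sketch of the general map-matching idea that description suggests: pre-record a one-dimensional subsurface signature along a stretch of road, then slide the live radar return against it and take the best match as the car’s position. It is an assumption-laden toy, not WaveSense’s actual algorithm, and the signals are randomly generated stand-ins for real radar data.

```python
import numpy as np

# Toy sketch of the map-matching idea described above, not WaveSense's
# actual algorithm. The "subsurface signature" here is random noise standing
# in for real ground-penetrating radar returns: the road is pre-mapped once,
# and at drive time the live scan is slid along the map to find the spot
# where it lines up best.

rng = np.random.default_rng(0)
road_map = rng.normal(size=500)        # pre-mapped signature along the road
true_position = 210                    # where the car "really" is on the map
live_scan = (road_map[true_position:true_position + 60]
             + 0.2 * rng.normal(size=60))  # noisy live radar return

def locate(live: np.ndarray, mapped: np.ndarray) -> int:
    """Return the map index where the live scan correlates best."""
    n = live.size
    scores = [float(np.dot(live, mapped[i:i + n]))
              for i in range(mapped.size - n + 1)]
    return int(np.argmax(scores))

print("estimated position on the map:", locate(live_scan, road_map))  # ~210
```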

Eric Evarts has been bringing topical insight to readers on energy, the environment, technology, transportation, business, and consumer affairs for 25 years. He has spent most of that time in bustling newsrooms at The Christian Science Monitor and Consumer Reports, but his articles have appeared widely at outlets such as the journal Nature Outlook, Cars.com, US News & World Report, AAA, TheWirecutter.com, and Fortune magazine. He can tell readers how to get the best deal and avoid buying a lemon, whether it’s a used car or a bad mortgage. Along the way, he has driven more than 1,500 new cars of all types, but the most interesting ones are those that promise to reduce national dependence on oil and those that improve the environment, at least compared to the old jalopy they might replace. Follow Evarts on Twitter, Facebook, and LinkedIn.