Tesla surprised everybody last year with the announcement that it was dropping radar from its cars, leaving driver-assistance capabilities to cameras and ultrasonic sensors. That move, risky in the eyes of many, has now received the green light to be applied to Tesla models sold in Europe as well, namely the Model Y and Model 3.
This ambitious step is part of Tesla's plan to build a fully autonomous driving system based solely on vision, provided by the vehicle's cameras and supported by ultrasonic sensors, a system the company calls Tesla Vision.
Tesla's approach to achieving SAE Level 5 is to train a neural network on the behavior of hundreds of thousands of Tesla drivers, using chiefly visible-light cameras along with information from components that serve other purposes in the car (the coarse-grained two-dimensional maps used for navigation, the ultrasonic sensors used for parking, etc.). Tesla has deliberately chosen not to use lidar, which Elon Musk has called "stupid, expensive and unnecessary."
This makes Tesla's approach markedly different from that of other companies like Waymo and Cruise, which train their neural networks on the behavior of highly trained drivers and additionally rely on highly detailed (centimeter-scale) three-dimensional maps and lidar in their autonomous vehicles.
According to Elon Musk, full autonomy is "really a software limitation: The hardware exists to create full autonomy, so it's really about developing advanced, narrow AI for the car to operate on." The Autopilot development focus is on "increasingly sophisticated neural nets that can operate in reasonably sized computers in the car". According to Musk, "the car will learn over time", including from other cars.
Tesla's software had been trained on 3 billion miles driven by Tesla vehicles on public roads as of April 2020. Competitors, by contrast, have trained their software on tens of millions of miles on public roads plus tens of billions of miles in computer simulations, according to data available by January 2020. In terms of computing hardware, Tesla designed a self-driving computer chip that has been installed in its cars since March 2019 and has also developed a neural-network training supercomputer; other vehicle-automation companies such as Waymo regularly use custom chipsets and neural networks as well.
This is an important step toward designing a truly autonomous car system, because it drops what has until now been considered a critical component: the radar. According to Tesla, radar also has limitations of its own, such as its range, which reaches only 525 feet (160 meters), while the camera processing system typically reaches 820 feet (250 meters).
Instead, the company will opt for a system of 12 ultrasonic sensors and a 360º camera vision system, which it says will offer far better performance. The change will begin rolling out to the Model 3 and Model Y in Europe throughout the second quarter of 2022, after several months of operation without issues in the United States and Canada. This is a technology that entrusts everything to cameras and to the evolution of the artificial intelligence incorporated in new versions of the FSD software; according to Elon Musk, the camera bitstreams deliver more bits per second than radar systems do, which greatly improves performance.
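Musk's bits-per-second point can be illustrated with a rough back-of-the-envelope calculation. The sketch below compares the raw data rates of a single camera and a single radar unit; every figure in it (camera resolution, frame rate, bit depth, radar detections per scan) is a hypothetical assumption chosen for illustration, not a Tesla specification:

```python
# Back-of-the-envelope comparison of raw sensor data rates.
# All parameter values below are illustrative assumptions,
# NOT Tesla specifications.

def camera_bitrate(width, height, bits_per_pixel, fps):
    """Raw bits per second produced by one camera."""
    return width * height * bits_per_pixel * fps

def radar_bitrate(points_per_scan, bits_per_point, scans_per_second):
    """Raw bits per second produced by one radar unit."""
    return points_per_scan * bits_per_point * scans_per_second

# Hypothetical 1280x960 camera, 12 bits/pixel, 36 frames/second
cam = camera_bitrate(1280, 960, 12, 36)

# Hypothetical automotive radar: 1,000 detections/scan,
# 64 bits per detection, 20 scans/second
rad = radar_bitrate(1000, 64, 20)

print(f"camera : {cam / 1e6:,.0f} Mbit/s")   # ~531 Mbit/s
print(f"radar  : {rad / 1e6:,.2f} Mbit/s")   # 1.28 Mbit/s
print(f"ratio  : {cam / rad:,.0f}x")         # ~415x
```

Even under these rough assumptions, a single camera produces hundreds of times more raw bits per second than a radar, which gives a neural network correspondingly more information to learn from.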
All images courtesy of Tesla Inc.
Nico Caballero is the VP of Finance of Cogency Power, specializing in solar energy. He also holds a Diploma in Electric Cars from Delft University of Technology in the Netherlands, and enjoys doing research about Tesla and EV batteries. He can be reached at @NicoTorqueNews on Twitter. Nico covers the latest Tesla and electric vehicle happenings at Torque News.