
Hacking Tesla is Condoned by Traffic Safety Investigators

Can hacking a car company ever lead to good? Here’s one example that should make Tesla drivers sit up and take notice.

Tesla Autopilot and FSD

Earlier, we reported that Tesla Autopilot and FSD have had more than their share of assisted-driving successes and questionable quirks as Tesla attempts to achieve the holy grail of FSD: Level 5 autonomy. The foundation of much of this Tesla controversy, however, is safety, and whether Tesla is being truly responsible or irresponsible in how it manufactures its vehicles and in the message it sells to its customers and potential customers.

More recently, we’ve observed that Tesla is making a larger effort to ensure a safe product by addressing the biggest flaw of Autopilot and FSD: the inattentive driver who is supposed to be behind the wheel. For the latest news about how safe Tesla vehicles are, you are encouraged to check out this recent Tesla news analysis on safety.

With safety in mind, the news has lately focused on the NHTSA’s October deadline requiring Tesla to provide information for an investigation into some serious Autopilot allegations. However, it turns out that some automotive safety agencies are taking a more indirect route to getting the answers they want from Tesla, reportedly by hacking into Tesla’s Autopilot software.

The Tesla Hack

Tesla is working toward improving Autopilot and achieving true FSD capability through Project Dojo: Tesla’s AI training supercomputer, which reportedly comprises 720 nodes of eight Nvidia A100 Tensor Core GPUs (5,760 GPUs in total) for up to 1.8 exaflops of performance.
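As a quick sanity check on those reported figures (the node count, GPUs per node, and exaflop number come from the report above; the per-GPU value is simply derived from them), the arithmetic works out as follows:

```python
# Back-of-the-envelope check of the reported supercomputer figures.
nodes = 720                # reported node count
gpus_per_node = 8          # Nvidia A100 Tensor Core GPUs per node
total_exaflops = 1.8       # reported aggregate performance

total_gpus = nodes * gpus_per_node                          # 5,760 GPUs
per_gpu_tflops = total_exaflops * 1_000_000 / total_gpus    # 1 exaflop = 1,000,000 teraflops

print(f"Total GPUs: {total_gpus}")
print(f"Implied per-GPU throughput: ~{per_gpu_tflops:.0f} TFLOPS")  # ~312 TFLOPS
```

The implied ~312 TFLOPS per GPU lines up with Nvidia’s advertised dense FP16 Tensor Core peak for the A100, so the reported numbers are at least internally consistent.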

Using its neural networks, Tesla continues to develop simulations that promise to bring Autopilot and FSD closer to reality, drawing on petabytes of collected information to enable Autopilot and FSD to “learn” how to respond to varying road conditions and driving scenarios.
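To give a rough feel for what “learning” from driving data means, the sketch below trains a toy model that maps two invented scenario features to a brake/no-brake decision. It is a minimal illustration of supervised learning in general; the features, data, and model are made up for this example and do not reflect Tesla’s actual systems or data formats.

```python
import numpy as np

# Toy supervised-learning sketch: invented data standing in for logged driving scenarios.
rng = np.random.default_rng(0)

closing_speed = rng.uniform(0.0, 40.0, size=1000)   # m/s toward an obstacle (made up)
gap = rng.uniform(1.0, 100.0, size=1000)            # metres to the obstacle (made up)
y = (2.0 * closing_speed > gap).astype(float)       # label: 1 = should brake

# Scale features to [0, 1] so plain gradient descent behaves well.
X = np.column_stack([closing_speed / 40.0, gap / 100.0])

# Logistic regression trained by gradient descent.
w, b = np.zeros(2), 0.0
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted brake probability
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * np.mean(p - y)

# Query the learned rule: 20 m/s closing speed with a 30 m gap.
query = np.array([20.0 / 40.0, 30.0 / 100.0])
print("brake probability:", 1.0 / (1.0 + np.exp(-(query @ w + b))))
```

Tesla’s real networks are vastly larger and consume camera and sensor streams rather than two hand-picked numbers, but the basic loop of fitting model parameters to labeled examples is the same idea.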

However, simulations are not the only source: real-life data from Tesla drivers is also being collected behind the scenes, including data gathered before and during an accident. It is that data which automotive safety experts want from Tesla, and it is not easily available; so much so that hack efforts amounting to reverse engineering Tesla’s Autopilot have recently been reported.

According to a recently posted Automotive News report, a Dutch forensic lab has just revealed that it has hacked Tesla by successfully decoding Autopilot driving data previously accessible only to Tesla.

The significance is twofold: the decoding opens up a wealth of information that could be used to investigate serious accidents, and it reveals that Tesla could provide far more safety-relevant data than investigators had previously been aware of.

The Dutch agency responsible for the hack is the Netherlands Forensic Institute (NFI), which is part of the European Association for Accident Research and Analysis, whose aims and objectives include:

• Improvement of the basic principles and the methodology of accident analysis.
• Carrying out projects in the area of accident research.
• Improvement of the exchange of expert knowledge across Europe.

According to Automotive News:

The Dutch lab said rather than seek the data from Tesla, it had "reverse engineered" data logs -- a process where software is deconstructed to extract information -- present in Tesla vehicles "…in order to objectively investigate them."
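To illustrate what reverse engineering a data log can involve (the record layout below is entirely hypothetical; neither the NFI nor Tesla has published the actual format), once an investigator has worked out how a fixed-size log record is laid out in bytes, decoding it is straightforward:

```python
import struct

# Hypothetical record layout an investigator might deduce from a vehicle log:
# little-endian timestamp (uint32, ms), vehicle speed (float32, m/s),
# steering angle (float32, deg), and an Autopilot-engaged flag (uint8).
RECORD = struct.Struct("<IffB")

def decode_log(raw: bytes):
    """Yield decoded records from a raw log blob of fixed-size records."""
    for offset in range(0, len(raw) - RECORD.size + 1, RECORD.size):
        ts_ms, speed, steer, engaged = RECORD.unpack_from(raw, offset)
        yield {"t_ms": ts_ms, "speed_mps": speed,
               "steer_deg": steer, "autopilot": bool(engaged)}

# Example with two fabricated records standing in for recovered log bytes.
blob = RECORD.pack(1000, 27.5, -1.2, 1) + RECORD.pack(1050, 26.9, -0.8, 1)
for rec in decode_log(blob):
    print(rec)
```

The hard part is deducing the layout in the first place, typically by correlating raw bytes against known driving events; the decoding step shown here is the easy payoff.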

Apparently, the Dutch agency found that while Tesla had complied with earlier data requests from Dutch authorities, the company was selective and, they believed, withheld a lot of data that could have proven useful to their safety analyses.

"Tesla however only supplies a specific subset of signals, only the ones requested, for a specific timeframe, whereas the log files contain all the recorded signals," the NFI's report said.

"These data contain a wealth of information for forensic investigators and traffic accident analysts and can help with a criminal investigation after a fatal traffic accident or an accident with injury," Francis Hoogendijk, a digital investigator at the NFI, said in a statement reported by Automotive News.

The hacked data, obtained from Tesla Model S, Y, X and Model 3 vehicles, was recently shared at a conference of the European Association for Accident Research so that other accident analysts can use it for EV safety studies and analyses.

Is This a Good Thing?

Hacking is almost never a condoned action. However, it’s a fact of life when you consider that, under the umbrella of “national security,” virtually every country hacks to ensure the security and survival of its citizens. Which raises several questions:

What about when hacking extends to the vehicles we drive? Should we legitimize actions such as those taken by the Dutch NFI when it comes to Tesla? Is Tesla at fault for not sharing all of its data with safety experts outside of Tesla? Where do we draw the line, and how will this affect Tesla with the NHTSA’s current investigation?

One posited good that could come of this is that it might make Tesla owners much more attentive to their actions and inactions behind the wheel while under Autopilot and FSD. Failing to pay attention could carry legal repercussions when the exact recorded data can be subpoenaed during an investigation and trial that decides who is at fault: the auto manufacturer or the person behind the wheel.

The other side of the coin, however, is that perhaps we should be reminded of the lines poet Robert Burns penned:

The best laid schemes o' mice and men
Gang aft a-gley [often go astray],
And lea'e us nought but grief and pain,
For promised joy.

Let Us Know What You Think: please post your thoughts on the Dutch hack in the comments section below, and whether any unforeseen consequences come to mind that would make this hacking of Tesla a bad thing rather than a good thing.

And finally…

In case you missed it, here is a relatively recent article questioning whether Tesla Owners Technically Do Not Own Their Cars.

Timothy Boyer is an automotive reporter based in Cincinnati. Experienced with early car restorations, he regularly restores older vehicles with engine modifications for improved performance. Follow Tim on Twitter at @TimBoyerWrites for daily news and topics related to ICE and EV cars and trucks.

Comments

Ian Wickham (not verified)    October 23, 2021 - 10:20PM

So why not require all vehicle manufacturers to install a ‘black box’ to record both driver actions and vehicle status - and not just Tesla, who are doing more to achieve vehicle safety than any others? Just saying! Sounds like Tesla not being invited to the Biden summit!!!

Timothy Boyer    October 25, 2021 - 9:01AM

In reply to by Ian Wickham (not verified)

Sounds like a great idea to me. Think of the time and resources it would save during traffic accident investigations, and the court costs, if we had objective data to clarify who did what when whatever happened?! I'm with you on that one. However, the other side of the argument concerning this black box idea is that it will lead to a loss of freedoms and that car owners will get traffic violation bills online, or immediately deducted from their bank accounts, for even the smallest of infractions. Big Brother and all that. Thanks for the input---much appreciated.