
Tesla's FSD and "The Trolley Problem": How Will Autonomous Cars Handle Complex Situations Where a Crash Can't Be Avoided?

As autonomous transport expands, there may be situations where these cars, such as a Tesla using FSD, have to make a split-second decision in a scenario where harm can't be avoided.

The Trolley Problem For Tesla FSD And Autonomous Vehicles

Someone brought up an interesting point on X today while I was scrolling through news for article ideas. One post talked about "the trolley problem," a classic moral dilemma involving a runaway trolley on tracks. In the scenario, the trolley will hit five people unless someone diverts it to a side track, where it will hit one person instead. Someone must decide what to do, and there are only two options.

In the case of a Tesla with FSD, or any autonomous vehicle for that matter, what happens if the car is driving in busy traffic and runs into a situation where it must choose between hitting a person, an animal, or a structure ahead of it, or swerving and potentially harming the car's occupants in a crash?

This theorized scenario came up when Tesla's FSD 12.3.3 was seen driving slightly up onto a curb on the right in order to avoid a truck drifting into its lane. This was an unusual situation, and understanding why the car drove on the curb is important: it avoided an accident and kept the driver and the car safe, at the cost of driving on the curb. In this case, it was the right call.

Ashok Elluswamy, who leads Tesla's Autopilot team, responded to the video of the Tesla driving on the curb with this statement:

The final version of this is the trolley problem!

Some would answer this by saying the software must be designed so that this situation never occurs. However, situations like it likely occur every day in human driving all around the world.

According to the CDC, almost 3,700 people die each day in crashes worldwide involving cars, buses, motorcycles, bicycles, trucks, and pedestrians. More than half of those killed are pedestrians, motorcyclists, or bicyclists, the road users most at risk of harm in any auto accident.

I believe people would answer this dilemma in three distinct ways:

  1. Don't be in a situation where you have to make a difficult choice on the road like that - drive defensively and exceptionally cautiously.
  2. Swerve and avoid causing harm or damage to people, at the expense of potentially crashing the car and harming its inhabitants.
  3. Don't swerve to keep the car's occupants safe, potentially causing harm or damage to the other car, people, or structure.

Let's examine each of these answers to the autonomous "trolley problem" and see if there is some kind of solution that makes sense.
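To make the trade-off concrete, here is a minimal sketch of how such a policy might be encoded in software. This is purely my own illustration; the policy names and the choose_action function are hypothetical and have nothing to do with Tesla's actual FSD stack.

```python
from enum import Enum, auto

class CrashPolicy(Enum):
    """Hypothetical policies for an unavoidable-crash scenario."""
    AVOID_ENTIRELY = auto()   # answer 1: never end up in this situation at all
    SWERVE = auto()           # answer 2: protect others, risk the occupants
    BRAKE_IN_LANE = auto()    # answer 3: protect the occupants, brake hard

def choose_action(policy: CrashPolicy, swerve_path_clear: bool) -> str:
    """Toy decision function; a real system weighs far more signals."""
    if policy is CrashPolicy.SWERVE and swerve_path_clear:
        return "steer toward open space, accept risk to the vehicle"
    # Under any other policy (or if the swerve path is blocked), maximum
    # braking in-lane is the fallback, since braking always reduces impact energy.
    return "maximum braking, hold the lane"
```

Whatever that fallback branch returns is a value judgment baked in by engineers ahead of time, which is exactly what makes the trolley problem uncomfortable for autonomous vehicles.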


Don't be in a situation where you have to make a difficult choice on the road like that - drive defensively and exceptionally cautiously

The first answer is for humans to drive more safely and without distraction, and for Tesla's FSD or any autonomous software to show an exceptional ability to avoid accidents while driving. This is the "march of nines" Tesla is undertaking right now: pushing reliability toward something like 99.9999999%, so that a crash involving autonomous software becomes unheard of, just as a crash involving a commercial airliner is today.
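To put those nines in perspective, here is some back-of-the-envelope arithmetic. The per-mile failure rate and the mileage figure are rough assumptions for illustration, not Tesla data:

```python
# Back-of-the-envelope "march of nines". All figures are assumptions
# for illustration, not Tesla data.
reliability = 0.999999999        # nine nines: chance a mile passes without incident
failure_rate = 1 - reliability   # ~1e-9 incidents per mile (assumed)

us_miles_per_year = 3.2e12       # rough US annual vehicle miles traveled
expected_incidents = failure_rate * us_miles_per_year
print(f"~{expected_incidents:,.0f} incidents per year at nine nines of reliability")
# Prints ~3,200 incidents per year across the entire US fleet, versus
# roughly six million police-reported crashes today.
```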

However, as we know, many people drive distracted, using their phones or glancing to the side. I think Tesla's FSD, or any properly trained AI-driven autonomous vehicle, will greatly surpass what any human can do, even a careful one. Regardless, this answer will have to hold up: the safety benefits of Tesla's FSD and autonomous vehicles must be overwhelmingly obvious before they are allowed to drive without someone in the driver's seat.

However, there is still the 0.0000001% scenario where something goes wrong. What happens then? There are two options:

Swerve and avoid causing harm or damage to people, at the expense of potentially crashing the car and harming its inhabitants

In this option, the car swerves to avoid harming the other car and its occupants, at the expense of itself and whoever is inside it.

This option seeks to preserve the lives and property of others by always swerving when there is no other choice.

The pro is that other people's lives are likely to be saved in this very rare case.

The con is that the inhabitants of the vehicle (along with the vehicle being driven) will be put at risk.

If it were guaranteed that swerving would never harm the car's occupants, I think this option makes the most sense. A damaged or totaled car can be repaired or replaced; a human life cannot.

The other option is this:

Don't swerve to keep the car's occupants safe, potentially causing harm or damage to the other car, people, or structure

This option seeks to preserve the lives of those inside the autonomous car: if an accident can't be avoided, the car should not recklessly swerve to avoid hitting something in front of it.

This option makes sense when you consider that swerving introduces additional unpredictability, because the car cannot be certain what lies in the direction it swerves toward.

In this case, given the choice, the car would not try to steer around the accident; it would brake as hard as it could and accept the impact.
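Braking hard in-lane at least has predictable physics. As a rough illustration, assuming dry pavement with a tire-road friction coefficient around 0.7 (an assumed value; real braking involves many more factors), the stopping distance follows from basic kinematics:

```python
# Rough stopping-distance estimate from basic kinematics: d = v^2 / (2 * mu * g).
# The friction coefficient is an assumed value; real-world braking also
# depends on tires, road surface, load, and reaction time.
MU = 0.7    # tire-road friction on dry pavement (assumed)
G = 9.81    # gravitational acceleration, m/s^2

def stopping_distance_m(speed_kmh: float) -> float:
    v = speed_kmh / 3.6              # convert km/h to m/s
    return v ** 2 / (2 * MU * G)

for speed in (50, 80, 110):
    print(f"{speed} km/h -> ~{stopping_distance_m(speed):.0f} m to stop")
# 50 km/h -> ~14 m, 80 km/h -> ~36 m, 110 km/h -> ~68 m
```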

In this case, there is still a crash. If the obstacle is another car, there are now two vehicles with people inside who can be harmed, plus the damage to both cars.

Both of these scenarios represent a difficult choice, and my hope is that Tesla's FSD and other autonomous vehicles will be built by teams that think about this intently and design systems where a choice like this never has to be made.

Such a system will hopefully operate in a very controlled and regulated way, like commercial aviation does today, where an accident should be just about impossible.

I'm hopeful that this won't be a situation that comes up in the future. The way to prevent that is to start thinking about it now.


What do you think about the trolley problem for Tesla FSD and autonomous vehicles? How would you answer this difficult moral question?

Share this article with friends and family and on social media - or leave a comment below. You can view my most recent articles here for further reading. I am also on X/Twitter where I post more than just articles daily, as well as LinkedIn! Thank you so much for your support!

Hi! I'm Jeremy Noel Johnson, and I am a Tesla investor and supporter and own a 2022 Model 3 RWD EV and I don't have range anxiety :). I enjoy bringing you breaking Tesla news as well as anything about Tesla or other EV companies I can find, like Aptera. Other interests of mine are AI, Tesla Energy and the Tesla Bot! You can follow me on X.COM or LinkedIn to stay in touch and follow my Tesla and EV news coverage.

Image Credit: Tesla, Screenshot

Article Reference: CDC Accident Data | Ashok From Tesla