Uber Fatality vs Tesla Model X Fatality: The Difference and Safeguards for Self-driving Cars
Yesterday a story broke in which a US senator said more safeguards are needed for self-driving cars.
"U.S. Sen. Richard Blumenthal, who took a firsthand look at self-driving vehicle technology on Tuesday, said it was frightening to see 'no hands on the wheel' as his car approached a parked car and called for more safeguards to be added to federal legislation following two recent fatal crashes. The bill awaiting action in the Senate should ensure people can manually override highly automated vehicles, the Democrat said," reads the story published by Yahoo Finance.
Note that there is a bill in the Senate that awaits action. This bill aims to ensure that people can manually override self-driving cars.
Tesla owners say that Tesla's Autopilot is not the same thing as self-driving. They say:
- The Uber fatality was a self-driving fatality.
- The Model X fatality was likely operator error.
I agree, but unfortunately many Tesla owners get comfortable with Autopilot and confuse it with the fully automated driving they see in science fiction movies. We are not there yet. Even airline pilots don't fully trust their autopilot systems, as pilots and Tesla drivers explain here in discussing how Autopilot works. In my article titled Autopilot Is Still Very Dangerous, It Tempts People, Costing Lives, I explain that we can't simply trust people to use good judgment and keep control of their cars while driving vehicles with autopilot technologies activated. Our nature is such that, in comfort, we forget to stay alert, put too much trust in the technology, and give in to temptation. Accidents happen for this reason, and they cost lives. Autopilot and self-driving technologies should be strictly regulated.
A few days ago Torque News even started a poll asking our readers whether self-driving autonomous vehicles are safe. As you can see from the comments, most people who commented there don't think they are. You are welcome to comment as well.
In any case, you need to be an alert driver. But since our nature is such that we get tempted and fall into a false sense of comfort, self-driving vehicles need to be strictly regulated.
Cars With No Steering Wheels
Earlier this year GM said that a car with no steering wheel or pedals will be ready for the streets in 2019. I don't know how many people are ready for this. I am not.
As for the future, I think manual driving should remain available in all self-driving cars so the driver can take full control of his or her car at will. This is not only a safety issue: people sometimes choose to drive for their own pleasure, and they should be able to switch to manual mode if they want to.
Tesla and Autopilot Regulations
But I think that regulation shouldn't be a big problem for Tesla.
First, Tesla gives its drivers full autonomy to retake control of their cars whenever they wish (unlike the no-steering-wheel gimmick GM was advertising).
Second, Tesla can roll out compliance with any new government rules or laws via over-the-air (OTA) software updates.
The Insurance Company Factor
"Imagining a future where all cars have autonomous capability that is guaranteed to be 25x safer than humans, the insurance companies will take the freedom of manual driving away though, simply by making those that want non-autonomous capability pay an exuberant premium. That way people will 'willingly' give up their wish to be able to manually drive. Driving will likely become a thing for enthusiasts and motorsport," wrote Jee Vee in a discussion at TeslaMotorsClub.
But it only takes one insurance company charging much less to undercut that scenario.
Do you think self-driving cars are safe?