Tesla Should Remove The Black Box as Humans Aren't Ready for Autopilot
Yesterday there was another report of a Tesla on Autopilot hitting yet another emergency vehicle, this time a parked police car. Something has to be done about Tesla's Autopilot. Something has to change. There is a real problem here. You can't make this stuff up.
"Humans are not ready for the Autopilot and the Autopilot is not ready for humans. I know there will be those on here that are selfish enough to disagree because they like having their 'toy," but the reality is, its still a long way to go before it will be ready for mainstream production," writes Ben Jenkins in Tesla Model S Owners Club on Facebook.
Some say it's not a toy; it's a feature that comes with the same responsibility as any other car: pay attention and stay in control of your 4,000-lb privilege of a machine. We shouldn't have to give up our feature because others are irresponsible, they say. Sadly, some even go so far as to say, "If they were to disable it, I would absolutely sue." What would you say to these people?
My response is that it's a big temptation. Humans are fallible beings. They make mistakes. They put too much trust in Autopilot. They fall asleep. They rely on it more than airline pilots rely on their autopilots. As a result, people die in accidents. Where is the love and compassion? Where is the social responsibility and a little bit of self-sacrifice?
Autopilot is still very dangerous because it tempts people. If you do use it, don't ignore the key display messages Tesla shows you while Autopilot is engaged. Overall, the key to getting the most out of Tesla Autopilot is to learn as much as possible about its components and its limitations.
If Tesla doesn't want to disable Autopilot, it could at least impose a shorter time limit. With most other manufacturers' systems, you can go maybe 10 seconds before the car tells you to put your hands on the steering wheel, and if you don't, it disables the steering assist. That forces people to keep their hands on the wheel.
Name It Lane Keep
Alternatively, Tesla could drop the name and call it Lane Keep. Or it could deactivate the feature anywhere above 30 mph when the car is not on a freeway.
Even if you are selfish enough to say, "I paid for it, why should Tesla take it away from me?" know this: even if you are a responsible driver who pays close attention, those who don't pay attention are raising the cost of insurance for all of us. This is yet another case of paying for other people's negligence. In fact, many Tesla owners will tell you that Tesla's insurance premiums keep getting higher and higher.
And isn't Autopilot designed only for exit-to-exit freeway use anyway? Keep your eyes on the road and be ready to take over.
I also don't understand why, when approaching a stopped vehicle, the driver wouldn't simply apply the brakes. So this, again, brings up the issue of owner responsibility. But if so many owners fail to act responsibly, Tesla should act. If Tesla won't or can't, perhaps the government can impose a ban or strict limitations on Autopilot to save lives until humans are more ready for it.
Tesla Autopilot is just a better version of what Ford, Mercedes, Volvo, Lexus, and others all offer: adaptive cruise control with lane keeping. Cars with similar technology crash daily. Why don't we see articles on those crashes? There is a simple reason. Tesla's technology shouldn't be named Autopilot. People aren't ready for that name. Because of the name and the system's relatively strong performance, they think they can engage it and go make a sandwich.
Our cars are nowhere near autonomous, and they won't be for some time.