
My Tesla Model 3 Hit Traffic Cones While in Full Self-Driving Mode, Causing Significant Damage, and Tesla Won’t Look at My Car for a Month

A real-world FSD drive turns tense for a Tesla Model 3 driver, raising hard questions about trust and accountability.
Posted:
Author: Chris Johnston

In October 2020, Tesla Full Self-Driving (FSD) launched to the public in beta form. Although it has steadily improved, it doesn’t seem ready for prime time, given the steady stream of near-catastrophes posted across social media.

As described in many posts, even a minor FSD hiccup can cause major damage to a car and real disruption to its owner’s life. In one of many Reddit posts, a Tesla driver describes an FSD lane change that ended in a collision and left the car undrivable:

“My Tesla Model 3 was in FSD when it tried to switch lanes and it hit express lane traffic cones. There wasn’t enough time to avoid a collision. There was significant damage to the front end, quarter panels, and door, plus the tire went flat and the rim was bent. I initially tried to avoid a claim by getting the tire swapped but the rim was so bent that it wouldn’t hold air in the tire. Tesla won’t look at my car for 1 month so it’s un-drivable unless I buy a new wheel separately.”

 

sqamo responded by noting how lucky the driver was:

“Lucky that it wasn't a concrete divider.”

 

TechnicianExtreme200 recalled a similar incident with a tragic outcome:

“That's what killed Walter Huang. He didn't have his hands on the wheel for six seconds before the crash, but even if he did have his hands on the wheel would he have had time to react?”

 

This alarming activity seems to have caught the attention of the U.S. National Highway Traffic Safety Administration. After receiving over 50 reports of traffic safety violations and some crashes, the NHTSA announced in October that it is conducting a new investigation of Tesla vehicles that are equipped with FSD. The agency says the Tesla FSD software has induced behavior that violates traffic safety laws. It cites reports of Teslas driving through red lights and changing lanes into opposing traffic.

NHTSA investigators are reviewing 58 reports of alleged violations while FSD was active. The cases include 14 crashes and 23 injuries, with six specific incidents where cars entered intersections against a red light and collided with other vehicles. Four of those crashes resulted in one or more injuries. This action is a preliminary evaluation, which is the first formal step in NHTSA’s process. If the agency determines there is an unreasonable safety risk, a recall could follow.

Drivers have reported a variety of erratic FSD driving patterns and inconsistent traffic signal recognition, including reports of FSD running red lights and stopping at green lights. During the first days of Tesla’s Robotaxi pilot demonstrations in Austin, TX, multiple ride videos showed a robotaxi entering a left-turn-only lane, hesitating, then swerving across double-yellow lines into an oncoming lane before correcting. Other clips showed the service exceeding posted limits by a few mph. These were captured by invited riders and covered by major outlets. Tesla owners have also documented frequent, abrupt, or unnecessary lane changes, including drifting toward the left line on narrow roads, raising oncoming-traffic risk. Multiple posts describe having to cancel a maneuver (by taking over) to keep the car in the correct lane.

This probe comes on top of earlier investigations. In October 2024, NHTSA began an inquiry into about 2.4 million Teslas with FSD after four reported collisions in low visibility, including a 2023 fatal crash. The agency has been evaluating Tesla’s advanced driver-assistance systems for a year.

Tesla Full Self-Driving (Supervised)



Tesla Full Self-Driving (FSD) is an advanced driver-assistance system (ADAS) that can operate almost anywhere, including city streets and highways, but requires driver supervision at all times. In early 2024, Tesla added "Supervised" to the FSD name, replacing "Full Self-Driving Beta" with "Full Self-Driving (Supervised)" to emphasize that a human driver must always remain attentive and responsible for safe operation. FSD features include automatically changing lanes, making turns, following navigation routes, and stopping for traffic lights and stop signs. As explained below, FSD is a Level 2 system and does not make a Tesla fully autonomous.

Tesla is careful to position FSD as a tool that can drive almost anywhere with active supervision. The company stresses that it does not make the car self-driving. Critics argue that branding and marketing can blur the line between assistance and automation, which risks mismatched expectations on the road.

The Tesla Model 3 was unveiled in March 2016, with first customer deliveries in July 2017. It competes with compact premium sedans such as the BMW 3 Series, Audi A4, and Mercedes-Benz C-Class, plus electric peers like the Hyundai Ioniq 6 and Polestar 2. Public response was huge, with hundreds of thousands of reservations, followed by a difficult production ramp that eventually reached high volume. Competitive range, quick performance, over-the-air updates, and a minimalist cabin built its appeal, and Supercharger access was another advantage. Every Model 3 shipped with hardware ready for FSD from launch, with supervised FSD on city streets reaching owners in late 2020 and expanding through 2021 and 2022.

The Six Levels of Autonomous Driving – What the Future Holds

The different levels of autonomous driving, defined by SAE International and often used by authorities like the National Highway Traffic Safety Administration (NHTSA), range from Level 0 (No Automation) to Level 5 (Full Automation). Each level specifies the extent to which a vehicle can handle driving tasks: lower levels require constant driver supervision and intervention, while higher levels allow the system to take on more or all driving responsibilities. Here is a definition of each of the six levels:

Level 0: No Driving Automation. A human driver performs all driving tasks. The vehicle may provide warnings or momentary assistance, but it does not control the vehicle's movement. 

Level 1: Driver Assistance. The system provides continuous assistance with either steering or acceleration/braking. The driver remains fully responsible for driving and must remain ready to intervene. 

Level 2: Partial Driving Automation. The system provides continuous assistance with both steering and acceleration/braking simultaneously. The driver is still responsible for driving and must monitor their surroundings, remaining ready to take over control. 

Level 3: Conditional Driving Automation. The system drives the vehicle under specific, limited conditions, but the driver must be available to take over when requested by the system. This is the first level at which the system performs all driving tasks within its defined operational design domain, with the driver remaining as a fallback.

Level 4: High Driving Automation. At this level, the system is capable of handling all driving tasks and monitoring the driving environment under certain conditions and within specific areas. A human driver is not needed to operate the vehicle when the system is engaged within its defined limits. 

Level 5: Full Driving Automation. At the highest level, the system performs all driving tasks under all conditions and on all roadways. No human driver is needed, making the vehicle fully capable of operating without any driver intervention.

Today, most vehicles offer Level 1 and Level 2 systems, like adaptive cruise control and lane-keeping, where the human driver remains responsible and must supervise the system. Level 3 and Level 4 are in development and testing, with limited commercial availability, and require specific operating conditions or geographic areas for the vehicle to drive itself without human intervention. Level 5, where a car drives itself under all conditions, does not currently exist.
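For readers who like to see the taxonomy spelled out concretely, here is a minimal, illustrative Python sketch of the six levels and the supervision question described above. It is not drawn from SAE or Tesla software; the names and the helper function are hypothetical, and the comments are simplified paraphrases of the definitions listed here.

from enum import IntEnum

class SAELevel(IntEnum):
    # Simplified paraphrase of the SAE J3016 driving-automation levels
    NO_AUTOMATION = 0           # human performs all driving tasks
    DRIVER_ASSISTANCE = 1       # continuous help with steering OR acceleration/braking
    PARTIAL_AUTOMATION = 2      # continuous help with steering AND acceleration/braking
    CONDITIONAL_AUTOMATION = 3  # system drives in limited conditions; driver is the fallback
    HIGH_AUTOMATION = 4         # no driver needed within a defined operating domain
    FULL_AUTOMATION = 5         # no driver needed under any conditions

def driver_must_supervise(level: SAELevel) -> bool:
    # At Levels 0-2 the human driver must monitor the road at all times;
    # from Level 3 up, the system performs the driving task within its limits.
    return level <= SAELevel.PARTIAL_AUTOMATION

# Tesla FSD (Supervised) is a Level 2 system, so supervision is always required.
print(driver_must_supervise(SAELevel.PARTIAL_AUTOMATION))  # True
print(driver_must_supervise(SAELevel.HIGH_AUTOMATION))     # False

The only point of the sketch is that the responsibility boundary sits between Level 2 and Level 3, which is why FSD (Supervised) still depends entirely on the driver.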

What Do You Think?

Have you ever had Tesla FSD make a move that forced you to take over? What happened, and how did you handle it?

Where do you draw the line between an acceptable FSD hiccup and an unacceptable safety risk?

Chris Johnston is the author of SAE’s comprehensive book on electric vehicles, "The Arrival of The Electric Car." His coverage on Torque News focuses on electric vehicles. Chris has decades of product management experience in telematics, mobile computing, and wireless communications. Chris has a B.S. in electrical engineering from Purdue University and an MBA. He lives in Seattle. When not working, Chris enjoys restoring classic wooden boats, open water swimming, cycling and flying (as a private pilot). You can connect with Chris on LinkedIn and follow his work on X at ChrisJohnstonEV.


Comments

Zack (not verified)    October 20, 2025 - 6:19PM

I use FSD constantly (902 out of 907 miles in the past month) and maintain a perfect safety score of 100. Monthly premium for full coverage is about $100/month. It is amazing, and it won’t be long before people come to realize just how much safer it is than a human. You do have to keep your eyes on the road, but that’s it. I had zero forced interventions over the past 3k miles.




Tommy (not verified)    October 21, 2025 - 10:49AM

I like FSD and use it, but the name is misleading because it needs constant driver supervision despite being advertised as "full self-driving".