
Fatal Tesla Accident Sparks Concerns Over Vision-Based Autonomous Driving Technologies | Carscoops
If a human cannot see the road ahead, it's unclear how Tesla expects FSD Supervised to perform any better.
A fatal crash in Arizona in 2023 has been linked to Tesla's Full Self-Driving software system.
This incident raises concerns regarding Tesla's vision-only approach to autonomous driving.
The timing coincides with Tesla's push for Robotaxis and amplifies worries about their readiness for autonomy.
When technology interacts with reality, the consequences often extend beyond theory. In late 2023, a tragic event occurred in Arizona. Out of 40,901 traffic fatalities recorded that year, it stood out as the sole incident involving a pedestrian and a Tesla reportedly operating in Full Self-Driving (Supervised) mode. Now, as Tesla initiates its Robotaxi rollout in Austin, questions about safety—both now and in the future—are being raised.
The accident took place in November when Johna Story, a 71-year-old grandmother, was pulled over on the interstate to assist others involved in a prior accident. Footage from the Tesla reveals that the roadway ahead was obscured by bright sunlight at the horizon.
However, video of the crash obtained by Bloomberg shows warning signs that something was wrong. Even though the road was difficult to see, a car in the right lane was slowing down, other vehicles were parked on the right shoulder, and a bystander was signaling for traffic to reduce speed.
In a matter of moments, Tesla driver Karl Stock swerved left, then back toward the road, colliding head-on with a parked Toyota 4Runner and Story. She died at the scene. In a statement to the police, Stock said, “Sorry everything happened so fast. There were cars stopped in front of me, and by the time I saw them, I had no place to go to avoid them.”
Notably, Bloomberg reports that FSD was activated during the incident. “He [Stock] had engaged what the carmaker refers to as Full Self-Driving, or FSD,” the report states. However, this claim is not supported by the police report, as neither the attending officers nor Stock mentioned FSD, Autopilot, or any form of cruise control or autonomous system. It's possible that Bloomberg accessed non-public NHTSA crash data that contains additional information.
Vision vs. Lidar & Radar
Ultimately, incidents like this highlight a prominent concern regarding Tesla's FSD. Vision-based systems are similar to how humans perceive the road, meaning that when humans have difficulty seeing (such as with bright sunlight or in foggy conditions), these systems may also struggle.
It remains unclear exactly when FSD was engaged. But even if the system had disengaged in time for Stock to react, it's hard to see how he could have spotted the impending danger through the glare. In fact, this collision and others like it (though none involving additional fatalities) prompted the NHTSA to open an ongoing investigation into FSD.
“A Tesla vehicle experienced a crash after entering an area with reduced roadway visibility conditions while FSD-Beta or FSD-Supervised (collectively, FSD) was engaged. Reduced visibility was caused by conditions such as sun glare, fog, or airborne dust,” the investigation states.
Conversely, systems that depend on radar or lidar can "see" through fog, light glare, and smoke, detecting obstacles that vision-based systems might struggle with. In this scenario, a lidar-equipped system may have alerted Stock to the stopped vehicles. Nonetheless, these systems are not flawless.
Cruise, despite relying on radar and lidar and after significant investment, ceased operations following a series of crashes. Given all that, it remains puzzling why Tesla and its CEO, Elon Musk, are so committed to vision-only systems. Only time will tell whether that stance changes.
The Robot Elephant In The Room
We may soon find out whether Tesla will stick with its vision-only system. The automaker is already testing robotaxis and driverless vehicles, with plans to expand the program in Austin, Texas. Musk has promised that the service will grow throughout the year and that Level 5 autonomy is on the horizon.
Naturally, Tesla has made significant improvements to FSD over the years, resulting in a far more capable system than in 2023. But it still faces major problems. Just weeks ago, a Tesla reportedly running FSD crashed on an open road with no obstacles or visual cues to explain its actions. Although the specifics have yet to be confirmed, the car appeared to drive off the road and strike a tree at approximately 55 mph. Incidents like these raise questions about whether Tesla's Robotaxis are safe enough to win mainstream acceptance. For now, all we can do is watch and wait.
Tesla’s Perspective
Tesla is known for its lack of a public relations department—except when


