Tesla’s Full Self-Driving (FSD) is once again under scrutiny. In November 2022, a traffic accident in San Francisco was caused by an unexpected maneuver from a Tesla Model 3. Although initially only the testimonies of the drivers involved were available, a recording of the event was recently obtained, and it makes clear that the electric carmaker’s vehicle was primarily responsible.
According to the driver of the Tesla Model 3, Full Self-Driving was active at the time of the accident. The problem occurred when the car came to a complete stop despite having no obstacle ahead. Since it was traveling on a high-speed road, the vehicles behind failed to stop in time, causing a pile-up that, according to the authorities, involved eight vehicles. Fortunately, there were no fatalities, although several people were injured.
The California authorities note that the Tesla involved was running FSD Beta 11, which operates as Level 2 autonomous driving. What does this mean? Automation is only partial, so the driver had the opportunity to intervene manually by pressing the accelerator, but did not.
Naturally, the driver’s statement that he was using Full Self-Driving has generated a wave of criticism against the company led by Elon Musk. Some local media outlets covered the accident and took the opportunity to run alarmist headlines against Tesla’s FSD. Are they really justified?
Tesla Comes Under Fire Once Again
Although the driver’s statement has not yet been fully verified, it cannot be ignored that Tesla’s Full Self-Driving has been surrounded by controversy lately. It is no longer just that it has remained in beta for so many years, but also that anomalous behaviors have been reported that put occupants at risk.
The issue of braking for no reason is not new. Outlets covering electric vehicles have reported that this phenomenon, commonly known as phantom braking, has affected some owners of Tesla cars with Autopilot (even without testing FSD). Basically, Autopilot mistakenly detects an obstacle ahead and, anticipating a collision, triggers automatic braking to avoid the supposed crash.
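To make that failure mode concrete, here is a purely illustrative sketch (with hypothetical names and thresholds, not Tesla’s actual logic) of how a single perception false positive can cascade into a sudden, unnecessary stop:

```python
# Purely illustrative: how a perception false positive could trigger braking.
# Function name, thresholds, and values are hypothetical, not Tesla's code.

def should_emergency_brake(obstacle_confidence: float,
                           distance_m: float,
                           speed_mps: float) -> bool:
    """Brake hard if a detected obstacle is too close to stop for safely."""
    CONFIDENCE_THRESHOLD = 0.8                      # hypothetical cutoff
    time_to_collision = distance_m / max(speed_mps, 0.1)
    return obstacle_confidence >= CONFIDENCE_THRESHOLD and time_to_collision < 2.0

# A sensor artifact (a shadow, an overpass, glare) scored just above the
# cutoff looks like a real obstacle, and the car stops with nothing ahead.
print(should_emergency_brake(obstacle_confidence=0.85,
                             distance_m=20.0,
                             speed_mps=30.0))       # True -> "phantom" brake
```

The point of the sketch is simply that the braking decision is only as good as the perception feeding it: if the detector reports a nonexistent obstacle with enough confidence, the rest of the system does exactly what it was designed to do.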
At the end of 2021, reports of phantom braking in Tesla cars increased considerably, especially in the United States. Although some customers approached the automaker for solutions, the company limited itself to saying that these were software glitches and could therefore be corrected through an update.
However, the video of the San Francisco accident shows that the problem has not been solved, at least not entirely. It is also evident that when phantom braking occurs on a road with heavy traffic, there is a huge risk of causing an accident.
California Bans the “Full Self-Driving” Name

In December 2022, the state of California passed a law that prohibits Tesla, or any other automaker offering assisted-driving features, from marketing them as “full self-driving,” because the name can be misleading. The reason is very simple: today no software, not even Tesla’s, is reliable enough to warrant that label. Any company that violates this regulation will be considered to be engaging in “misleading advertising.”
Undoubtedly, the recording of the accident, coupled with previous reports of phantom braking and the new law passed in California, increases the pressure on Tesla. The company led by Elon Musk must step on the accelerator to improve FSD’s capabilities, starting with the basics.