Why Tesla self-driving is bad. Part 3. Consequences

Part 2 is here.

Somebody who has read up to this point might tell me that things are not as bad as I claim: plenty of people use full self-driving and they are just fine. Well, that is not correct, because they are not fine.

Previous iterations of full self-driving (FSD) were quite bad, and even Tesla fans admitted it. Even so, some fans put their lives at stake, and there were many fatal cases that went public.

In practice, it only became more or less useful around the beginning of 2024. Even so, not many people have FSD, because it is quite expensive. It was mostly Tesla fans who bought it, and they don't want to blame Tesla, so most likely they will not be vocal about its problems. After all, Tesla clearly states that you must pay attention to your surroundings and be able to take control at any moment. So in case of an accident, fans will blame themselves.

But some cases went public. The Wall Street Journal obtained video from a Tesla that crashed at night into an overturned truck on a highway. Until the last moment, the car did not recognize the truck as an object blocking the road and continued at full speed. At the last second the driver appears to have tried to avoid the collision, but it was far too late. The driver died.

This accident happened in Los Angeles, and there was lighting on the road, but the truck had overturned exactly between two light poles, in a darker spot. It was light-colored, and it had some lights on top that pointed toward the oncoming Tesla. These things probably confused the AI.

A similar case happened on another highway, this time with an overturned black pickup, and this time The Wall Street Journal had internal data showing how the Tesla tried to recognize the object. One camera saw it, but the other didn't, and as a result the Tesla completely ignored it for quite some time. Even once both cameras saw it, the Tesla could not settle on what it was and kept changing the object's classification many times per second. In the end, it crashed into the pickup at full speed.

In this case the road was dark, and the dark pickup was hard to recognize. The Tesla failed to classify the object and for some reason decided to ignore it completely; perhaps the AI took it for dirt on the road.
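To make that failure mode concrete, here is a toy sketch in Python. To be clear, this is entirely my own illustration of the general idea, not Tesla's actual code; every function name, label, and threshold is invented. It shows how a pipeline that only trusts an object when both cameras agree, and only brakes once the classification has been stable for several frames, can end up ignoring a real obstacle whose label keeps flickering:

```python
# Toy illustration only: NOT Tesla's code. All names, labels, and
# thresholds are invented to show the general failure mode.

def fused_label(cam_a, cam_b):
    """Trust a detection only when both cameras agree on the label."""
    if cam_a is None or cam_b is None:
        return None  # one camera misses the object -> it is ignored
    if cam_a != cam_b:
        return None  # cameras disagree -> it is ignored
    return cam_a

def should_brake(history, min_stable_frames=5):
    """Brake only after the same label is seen several frames in a row."""
    recent = history[-min_stable_frames:]
    return (len(recent) == min_stable_frames
            and recent[0] is not None
            and len(set(recent)) == 1)

# Per-frame labels, mimicking the WSJ description: at first only one
# camera sees the pickup, then both see it but keep flipping its class.
cam_a_frames = [None, None, "truck", "truck", "debris", "car", "truck", "debris"]
cam_b_frames = ["truck", "truck", "truck", "debris", "debris", "car", "debris", "debris"]

history = []
for a, b in zip(cam_a_frames, cam_b_frames):
    history.append(fused_label(a, b))
    print(f"fused={history[-1]!s:>6}  brake={should_brake(history)}")
# The label never stays stable long enough, so the brake never fires.
```

The point is not the specific logic, which I made up; the point is that a system that responds to ambiguity by doing nothing will drive at full speed into exactly this kind of object.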

The Wall Street Journal has also documented at least six cases in which Teslas crashed into vehicles with emergency lights on: fire trucks, police cars, and so on. I definitely heard about the fire truck case.

I saw another video in which a driver activates Autopilot or FSD. About two seconds later, the road splits: one lane bears slightly left, the other slightly right, and straight ahead there is a tree. The Tesla chose the tree and then disengaged. Luckily the driver had his hands on the steering wheel and managed to steer left at the last moment before the collision.

Do you know what the result of the investigation would have been if that driver had crashed? "Human error." Because Autopilot or FSD was not active at the moment of the crash. Nobody will tell you that the car itself created the situation.

I have had many cases where my own Tesla started braking for no reason at all: on a completely empty road, without another car for miles around, in the best possible weather.

And while FSD barely works, Tesla decided it is fine to produce a humanoid robot. FSD has to control a car in two dimensions and can barely manage that. A robot must control its arms and legs in three dimensions, which is much harder. And the robot has only 2 cameras, while any Tesla has at least 7.
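A rough back-of-the-envelope sketch shows why the jump in control dimensions matters so much. Again, this is my own illustration: the joint count below is a common ballpark for humanoid robots in general, not an official Tesla figure. If you discretize each control dimension into just ten levels, the number of possible control combinations per step explodes:

```python
# Back-of-the-envelope only: the joint count is a rough ballpark for
# humanoid robots in general, not an official Tesla specification.

bins = 10        # discretize each control dimension into 10 levels

car_dof = 2      # steering + acceleration/braking
robot_dof = 28   # typical humanoids have on the order of 28 joints

print(f"car:   {bins ** car_dof:,} control combinations per step")
print(f"robot: {bins ** robot_dof:,} control combinations per step")
# car:   100
# robot: 10,000,000,000,000,000,000,000,000,000
```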

In conclusion, I hope I have convinced you that FSD is far from ideal and far less safe than Tesla's CEO tells us. It almost works, and that is exactly what fools people into trusting it. It is not ready, and it is quite dangerous to put it in people's hands in this state.

And to be clear, I'm not against the technology, and I know that autonomous vehicles are the future. I just don't want people to use a half-baked solution that puts other people's lives at risk.

In its current state, FSD is like an intern who already has some experience and can handle simple, routine surgeries. People see that and start assuming the intern can perform any type of surgery. But an intern needs a decade or more of experience to become a professional surgeon.

Please be smart, people, and don't be fooled by that intern. I write these posts only because I don't want anyone to be killed by somebody who has a lot of faith in the technology and isn't thinking clearly.

I hope this helps someone.