Part 2 is here.

Someone who has read this far might tell me that things do not look as bad as I claim: a lot of people use Full Self-Driving, and they are just fine. Well, that is not correct, because they are not fine.

Previous iterations of Full Self-Driving (FSD) were quite bad, and even Tesla fans admitted it. Yet even then, some fans put their lives at stake, and many fatal cases went public.

In practical terms, it became more or less useful only around the beginning of 2024. However, not many people have FSD because it is quite expensive. It was mostly Tesla fans who bought it, but they don't want to blame Tesla and

Part 1 is here.

Tesla plans to use drivers to monitor Full Self-Driving, effectively treating them as QA engineers and beta testers while it develops the technology, and eventually to switch to completely autonomous driving.

But this does not work, precisely because it almost works. Let me explain what I mean by that. I will start with this video, in which the author explains that in 2013 Google started testing its own autonomous vehicles and gave several of them to Google employees. The employees were told that they needed to pay attention and keep their hands on the steering wheel at all times. Lastly, they were told that hidden cameras would be watching them the whole time.

I have written several articles about Tesla's self-driving. But after reading a lot of articles and watching many videos, I have found new aspects of this problem, and I think it is much worse than I or anybody else thought.

Anyone who writes software knows that there will be bugs: no matter how many people test it or how good the QA is, bugs will remain, and a lot of them. It is simply impossible to find and fix all the bugs in any big program.
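A toy sketch makes the point concrete (the `average` function and its tests below are invented for illustration, not taken from any real codebase): every test passes, yet a bug survives in the one case nobody thought to check. Testing can show the presence of bugs, never their absence.

```python
import unittest

def average(values):
    # Hidden bug: an empty list raises ZeroDivisionError,
    # but no test below ever exercises that case.
    return sum(values) / len(values)

class TestAverage(unittest.TestCase):
    def test_typical_input(self):
        self.assertEqual(average([2, 4, 6]), 4)

    def test_single_value(self):
        self.assertEqual(average([5]), 5)

    def test_negative_values(self):
        self.assertEqual(average([-1, 1]), 0)

if __name__ == "__main__":
    unittest.main()  # all three tests pass; the empty-list bug ships
```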

While it is hard to find all the bugs in code written by people, we have developed many different techniques that help us build and maintain big applications.
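For instance (continuing the hypothetical `average` example above), one such technique is to make preconditions explicit, so that a latent bug turns into an immediate, clearly labeled failure instead of a confusing crash deep inside the program:

```python
def average(values):
    # Fail fast: validate the precondition up front so misuse
    # produces a clear error at the call site.
    if not values:
        raise ValueError("average() requires at least one value")
    return sum(values) / len(values)
```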