When will we have self-driving cars?

I’m pretty sure many of you have heard that self-driving is becoming a reality soon. Elon Musk famously first promised it in 2013, and with every new promise the “release date” gets closer. And it is not baseless talk: recent videos from Tesla actually look quite promising. So will we have full autonomy soon? I’m sure a lot of people would pay good money for it. But unfortunately, the answer is a hard no: not in the next 10–20 years, unless some major breakthrough happens. Let me explain why.

Cost

The main reason Elon pushed so heavily for autonomous driving is to make Teslas cheaper. If you compare a Tesla with any other car, even a very basic one, you will find that the number of physical controls in any Tesla is extremely minimal.

Take, for example, my very basic Honda Civic: it has way more physical buttons, knobs, indicators, etc. And if you check a BMW or an Audi, you will probably find close to a hundred different controls. They all add cost to the car: each button has to be mounted, wired, and tested, and added to all training, repair, and diagnostic procedures. And lastly, they add extra weight, which matters a lot for electric cars, because less weight means more range.

And a certain percentage of them will fail during the warranty period, and replacing them also costs money. Tesla then needs to keep them as spare parts for the next 15–20 years. Moving everything to the screen lets them reduce costs, speed up assembly, and so on. Tesla will still need to keep the screen as a spare part, but at least it will be just a screen, not a screen plus a hundred different types of buttons.

Remember that Tesla was on the edge of collapse and needed every trick to survive, and self-driving let them say: don’t worry, soon cars will drive themselves, and you don’t need all that dead weight. They didn’t have time for hundreds of iterations of the Model 3 to reach their goals, so they aggressively cut weight everywhere they could. And the promise of self-driving by itself was enough to carry Tesla through those days.

Sure, as a result they were able to turn it to their advantage, and it brought them a lot of other unexpected benefits. I myself like it, although I would prefer a few more physical controls. For example, I prefer controlling the wiper speed the same way as in any Honda or Toyota. But Tesla was able to mitigate that particular problem in a recent update, and it is now possible to control the wiper speed without touching the screen.

Complexity

Making a car drive itself is a very, very hard problem. When people hear about AI and neural networks, they probably imagine some kind of intelligence, at least at the level of a cat or a dog. Unfortunately, that is not the case. It is mostly pattern matching and then pattern applying.

Yes, these patterns are smart, but you need to train the system, and train it a lot, in all kinds of circumstances. And when the AI hits a situation it wasn’t trained for, it is hard to predict what will happen. Real intelligence can apply previous knowledge and work out a correct answer. We do not get trained in every possible situation before we get a driving license, and yet most of us drive quite safely.

Just as an example: one day I was driving in one of two lanes on the highway, and for about 100 meters it looked like they had forgotten to paint the stripe separating the lanes. My Tesla decided it was in one really wide lane and sharply attempted to center itself in it. I had to intervene, apply a lot of force, and move the car back. I’m not sure whether there were cars next to me, but it was stressful.

Next example: I was driving at night on a road with two lanes in each direction, and then all the road markings disappeared; I couldn’t see any of them. But I saw cars parked on the other side, and I knew there were two lanes there, so I projected that onto my side of the road and drove accordingly. What would a Tesla do in this case? I have no idea.

As another example, imagine an obstacle on the road. It could be a plastic bag, a rock, or a box. Each of us has a lifetime of experience with the objects around us and can distinguish things that are safe to run over from things that should be avoided at all costs. The same applies to potholes: how will a Tesla decide what speed is safe over rough road?

And just to show you the current state of things: right now Autopilot cannot reliably recognize speed limit signs. Very often it confuses a recommended speed with the actual speed limit, or takes the limit posted for slow vehicles and trucks as the general one. And these are very basic things compared with actual self-driving.

Lastly, imagine a hardware failure, for example the brakes stop working. Will the AI try emergency means of stopping the car, like the parking brake? And what if that doesn’t work either? We humans know the physical properties of the materials around us, and we can pick something to crash into that minimizes the risk to others.

I believe everybody knows the 80/20 rule: it is easy to spend 20% of the time and get 80% of the work done, and then you spend four times as long finishing the last 20%. And I am afraid that in this particular case it could be a 90/10 rule, or even 95/5.

Also, I would like to note that Tesla recently started building cars with version 4 of their self-driving hardware, which adds three more cameras and considerably increases the resolution of all of them. That tells us the previous versions were not enough for self-driving. These are quite costly changes, and if it were possible to do it with version 3 hardware, they would not be adding more cameras.

Regulations

Let's imagine that Tesla pulls off a miracle and makes self-driving quite safe and reliable. They still cannot release it on their own: they need approval from the appropriate authorities, and that will require years of testing.

It cannot be a single clerk who makes this decision; most likely it will be decided by people who are elected. Now imagine they approve self-driving, and the next day a Tesla hits a bus stop and kills 20 people. I would imagine their political careers would end right there.

Let me restate the problem: if politicians approve self-driving cars, they most likely will not gain any additional votes, but if any incident happens, they will lose a lot of votes, because their opponents will actively use it against them.

And remember that allowing self-driving cars will make a lot of jobs obsolete: taxi/Uber/Lyft drivers, bus drivers, truck drivers. For some reason, people prefer not to vote for politicians who made them unemployed.

One last thing: who will be responsible when a self-driving car kills somebody? The software engineer who wrote the code? The QA person who tested the system? Or the person who approved the car to drive itself? It is possible it will be treated like a lightning strike, where nobody is responsible. But then somebody could re-program a car to kill someone intentionally.

This will open a Pandora's box of problems. Laws will need to change, there will be a lot of debate, and I think different states and countries will approve it at different times, which will create a lot of confusion and friction.

Conclusion

As you can see, for all these reasons cars will not become autonomous any time soon, and we can forget about robotaxis and our cars earning money while we work or sleep. I predict around 5–10 years to iron out the self-driving hardware and software, and at least another 5–20 years to get through all the regulation. It could happen sooner only if there is a serious need for autonomous cars, for example if businesses really need them; that could drive acceptance faster. Otherwise, it could take even 50 years or more.

Update is here.

I hope it helps someone.