Have you ever been in a taxi? Or a bus? Or any situation where someone other than you is driving the car?
You have put your life into their hands when you do this.
No matter what algorithm the self-driving car is running, it is going to be 100 times safer than you driving the car. Yes, YOU are a worse driver than a self-driving car, NO MATTER what "ethics algorithm" it is running, because any of the algorithms that could actually be deployed will be safer than you or the taxi driver.
What you are basically saying is: "I don't trust any braking system in my car that isn't 100% perfect, therefore I am going to drive my car without ANY braking system! I'll physically stop the car myself if I ever need to brake!"
No matter what algorithm the self-driving car is running, it is going to be 100 times safer than you driving the car.
And yet, I have driven hundreds of thousands of miles over many years under all kinds of road conditions without ever having an accident. Statistically my record is better than many, but it's hardly a unique achievement among experienced drivers who are careful.
Tesla's Autopilot has probably done more than 100x the miles I have by now, but it has also had several accidents, some of them fatal. As I understand it, it has also only been used on highways, which are statistically much safer and much less challenging to drive than winding rural roads or inner-city residential areas.
As another example, Google's self-driving cars have apparently done less than 10x the driving miles I have, but under somewhat more challenging conditions than highways. They too have had accidents, and they too reportedly still can't cope with anything close to truly realistic driving conditions on their own.
So no, it appears that the state of the art in self-driving technology today is not 100x safer than me driving a car. If anything, it's looking considerably more dangerous. Of course there is potential there -- no matter how skillful and careful I am, I still only have two eyes and human reaction times -- but automated driving is still a long way from outperforming good human judgement as things stand today.
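To make that comparison concrete, the useful statistic is accidents per mile driven, not total mileage. Below is a minimal sketch of that calculation in Python; every number in it is made up purely for illustration and none of the figures come from real data:

    # Hypothetical per-mile comparison. All numbers are made up for illustration.

    def accidents_per_million_miles(accidents, miles):
        """Normalise an accident count by distance driven."""
        return accidents / (miles / 1_000_000)

    # A careful human driver: hundreds of thousands of miles, zero accidents.
    human_rate = accidents_per_million_miles(accidents=0, miles=300_000)

    # An automated system: far more miles, but a handful of accidents.
    automated_rate = accidents_per_million_miles(accidents=5, miles=30_000_000)

    print(f"human:     {human_rate:.3f} accidents per million miles")
    print(f"automated: {automated_rate:.3f} accidents per million miles")

    # More miles alone does not prove the automated system is safer; the rate
    # per mile, and the road conditions behind those miles, is what matters.

Even a per-mile rate like this glosses over the mix of road conditions behind each total, which is exactly the highways-versus-rural-roads point above.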
But are self-driving cars better drivers than some identifiable group of people other than you? E.g. legally licensed but elderly drivers with poor eyesight.
Are the algorithms a useful helper for some drivers?
I really, really want all our rental vehicles to remind drivers when they are on the wrong side of the road (we get multiple deaths every year from Americans and Europeans driving on the wrong side in our small country).
I expect that with time these self-driving algorithms will become safer than an increasing proportion of human drivers. I'm just very wary of prematurely assuming they are better than most or all human drivers.
We as a society have a habit of putting too much faith in technology. Most of us aren't knowledgeable and unbiased enough to assess its risks objectively, particularly in areas where something is very unlikely to happen but would be very bad if it did.
If you get into a taxi or a bus, there's still a human driver who has the same survival instinct that you have. If he has to run someone over so that he can survive, he likely will, saving you too in the process.
This isn't about systems being perfect; it's about whether the system should err on the side of protecting the car's occupants or on the side of minimizing casualties overall.
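For what it's worth, that distinction can be written down as two different objective functions over the same set of candidate maneuvers. The sketch below is a toy Python illustration with entirely hypothetical harm estimates, not how any real system works, but it shows where the two policies diverge:

    # Toy illustration of the two policies being debated: protect the occupants
    # first, or minimise casualties overall. The scenario and harm numbers are
    # entirely hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Maneuver:
        name: str
        occupant_harm: float   # expected casualties inside the car
        bystander_harm: float  # expected casualties outside the car

    def occupant_first(options):
        """Err on the side of protecting the car's occupants."""
        return min(options, key=lambda m: (m.occupant_harm, m.bystander_harm))

    def minimise_total(options):
        """Err on the side of minimising casualties overall."""
        return min(options, key=lambda m: m.occupant_harm + m.bystander_harm)

    options = [
        Maneuver("swerve into barrier", occupant_harm=0.3, bystander_harm=0.0),
        Maneuver("brake and stay in lane", occupant_harm=0.1, bystander_harm=0.6),
    ]

    print(occupant_first(options).name)   # -> brake and stay in lane
    print(minimise_total(options).name)   # -> swerve into barrier

With these made-up numbers, the occupant-first policy keeps the car in its lane, while the minimise-total policy trades some occupant safety for the bystanders'. That trade-off is the thing being argued about.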