The problem is, a human driver wouldn't have had any trouble avoiding that accident -- the truck was visible for some distance.
It's not just about the overall fatality rate per mile. Self-driving cars also need to not have any accidents that humans would have easily avoided. Until they demonstrate that, they're not going to be widely accepted.
Teslas don't come equipped with LIDAR, a sensor that Cruise, Uber, Udacity, and Google have all been adding as a very important one.
LIDAR gives the car a 3D point cloud of its surroundings with roughly centimeter accuracy. Better sensors could almost certainly have prevented this accident.
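To make that concrete, here's a minimal sketch (in Python; the `obstacle_ahead` helper and all the thresholds are made up for illustration, not from any real system) of why a point cloud catches something like a trailer across the road: detection is pure geometry, so a white trailer against a bright sky still returns points.

```python
import numpy as np

def obstacle_ahead(points, lane_half_width=1.5, max_range=60.0,
                   min_height=0.3, max_height=3.0, min_points=20):
    """Check whether enough LIDAR returns sit in the corridor ahead.

    `points` is assumed to be an N x 3 array of (x, y, z) in meters,
    x forward, y left, z up, origin at the sensor. Thresholds are
    illustrative only.
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    in_corridor = (
        (x > 0) & (x < max_range) &            # in front of us, within range
        (np.abs(y) < lane_half_width) &        # roughly in our lane
        (z > min_height) & (z < max_height)    # above the road, below overpasses
    )
    return np.count_nonzero(in_corridor) >= min_points

# A trailer broadside ~30 m ahead produces a dense wall of returns:
trailer = np.random.uniform([29.5, -1.2, 0.8], [30.0, 1.2, 2.5], size=(500, 3))
print(obstacle_ahead(trailer))  # True -- the points are there regardless of color
```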
However, to add a point in Tesla's favor: I have a car with adaptive cruise control, and not having to constantly check the distance to the car ahead has probably saved me from a crash once or twice.
Good autopilots are definitely going to reduce deaths on the road. And not just deaths, but also the traffic all of us have to endure because someone didn't keep the right distance from the car in front and bumped into it.
I could be wrong, but I just don't think people are going to accept that. It's related to the illusion of control that driving provides: everyone thinks they're a safer-than-average driver. So even if self-driving cars are safe enough for everyone else, they're not safe enough for me.
There will, of course, be exceptions, as we've already seen (poor guy).
And my point wasn't "boohoo look at Tesla". It was that, contrary to what OP claimed, Google made the explicit decision to shoot for fully autonomous after having tried what Tesla did (and with LIDAR-equipped cars!).