Hacker News

Who do you think should take the other portion of responsibility? Certainly not the person sleeping in the back seat with no control?


Worth noting the manufacturer doesn't have complete control over the car either, once it's "in the wild". Maintenance is an important and frequently overlooked part of driving safety. A sophisticated AI driver (or a human, for that matter) can't do much with a set of bald tires or bad brakes.


Well, if they're the ones who are nominally in control of the self driving car, because they activated it and are occupying the car, I would say that they have a portion of the responsibility.


Then we're back to the Tesla model - you don't have a self-driving car, you have a car that claims to be one, but a human needs to be 100% aware and in control at all times.


Honestly, I think that's fair. If you are the one who pushes the button and sends the device out into the world, even if you didn't program it, I think you deserve some part of the responsibility for it malfunctioning.


I would think for a self-driving car to reach an acceptable level, it should be like a train. If I get on a subway or a bus and it hits someone, I am not held responsible, even if it stopped specifically to pick me up first.


My biggest problem with this approach is humans are terrible at paying attention to things that don't demand 100% focus.

A car that drives itself 99% of the time, but fails catastrophically the other 1%, is doomed to failure. The human operator won't be engaged enough to take over for that 1%. At least not without airliner levels of squawks and beeps and stick shakers, and even that might not be enough - airliners don't have to worry about children chasing balls into busy streets, etc. And pilots are highly trained - drivers are not.



