> I guess "we" have decided that driving, and self-driving, is worth a number of deaths.
Yes, although it has been that way all along, and not just with self-driving cars. Driving is implicitly seen as either an inevitability or worth the risks, although few people want to explicitly say deaths are an acceptable tradeoff for, well, anything. Implicitly accepting a (usually small) number of deaths, or a statistical risk of death, for almost any activity is a part of life.
Eliminating accidental deaths entirely would generally make us pay through the nose in some other way, economic or otherwise, so almost nobody actually wants what it would entail. It's socially and emotionally hard to say that, so it's easier to just call it inevitable. (Frankly, that's just a different way of saying the same thing, and understandable.)
Self-driving cars just turn what was inevitable human error into something with a target on it. The accident is no longer chalked up to human fallibility, because now there's a specific technology that caused it.
Self-driving cars are supposed to reduce total risk in the end, though. If that happens, the more interesting question will be whether people will be allowed to drive at all at some point in the future.