I guess "we" have decided that driving, and self-driving, is worth a number of deaths.
And to be honest, I agree. It's a tough balance to consider, and the death of any individual is going to be weighed much more heavily by their nearest and dearest than by a dispassionate observer, but I'm inclined to say that the advancement of humanity is a big enough reward that some risk is justified.
> I guess "we" have decided that driving, and self-driving, is worth a number of deaths.
Yes, although it has been that way all along, and not just with self-driving cars. Driving is implicitly seen as either an inevitability or worth the risks, although few people want to explicitly say deaths are an acceptable tradeoff for, well, anything. Implicitly accepting a (usually small) number of deaths, or a statistical risk of death, for almost any activity is a part of life.
Eliminating accidental deaths entirely would generally make us pay through the nose in some other way, economic or otherwise, so almost nobody actually wants what it would entail. It's socially and emotionally hard to say that, so it's easier to just call it inevitable. (Frankly, that's just a different way of saying the same thing, and understandable.)
Self-driving cars just turn it from an inevitable human error into something with a target on it: the error no longer looks inevitable, because there's an identifiable technology that caused the accident.

Self-driving cars are supposed to reduce total risk in the end, though, so if that happens, the more interesting question will be whether people will be allowed to drive at all at some point in the future.
Honestly, it doesn't seem like society has decided any such thing w/r/t self-driving! The handful of deaths from self-driving cars have attracted enormous negative media attention, and I suspect that if 100 people had been killed by them, there would have been immense public pressure to shut the programs down. The public is not running a cost-benefit analysis on this.
I'll go further. Tesla has decided (correctly?) that there is enough money in it for them to push that line. The chemistry set manufacturers decided that there isn't enough money.
Bingo. If the set manufacturers had been convinced they could somehow make hundreds of billions, I would bet they would still be selling them to this day.
If "we" haven't decided it's worth it, it's likely "we" are wrong. Trillions of dollars and an untold number of lives have been lost to inefficient human-driven transport. See also: every preventable driving accident ever, and every dollar ever spent on transport, for further details.
Are you implying that automating it might somehow NOT be the most important innovation since the invention of penicillin, or are you implying that it can't be automated? Sorry, it may be that I'm dense, but I'm genuinely not clear on what your argument here is.
Self-driving overall is definitely worth it. What Tesla is currently doing, however, is selling beta software as FSD and testing it on public roads. That's very different.
> Are you implying that automating it might somehow NOT be the most important innovation since the invention of penicillin
Do people really think self-driving cars are anywhere near the same class of benefit as penicillin, let alone the most important invention since then? Hell, if self-driving cars came out tomorrow, they wouldn't even be the most important innovation in the last two years (that would be the COVID vaccines).
> I guess "we" have decided that driving, and self-driving, is worth a number of deaths.
I don't think anyone has decided anything of the sort. If I shoplift from a store and I'm not stopped, that doesn't mean that "we" have decided that my theft is okay. It just means that I haven't been caught, or that the system isn't sufficient to hold me accountable, or that there isn't enough evidence to present a damning case against me just yet.
> but I'm inclined to say that advancement of humanity is a big enough reward that some risk is justified.
There's absolutely no evidence that Tesla's rushed and untested beta product is helping the "advancement of humanity" versus better designed and controlled research by organizations like Waymo.
The only evidence of FSD's "advancement" I see is the advancement of Tesla's profits. FSD serves as a vehicle for interest-free loans to Tesla directly from their customers.
> > I guess "we" have decided that driving, and self-driving, is worth a number of deaths.
> If I shoplift from a store and I'm not stopped, that doesn't mean that "we" have decided that my theft is okay.
Those aren't equivalent, though. Stores _have_ decided that _some_ theft is okay in the grand scheme of running a store.
Conversely, I think most of us agree e.g. that the incident where a woman was killed by an Uber self-driving Volvo[1] was _not_ okay, and that multiple negligent factors were at play that contributed to her death.
This doesn't mean that society thinks self-driving cars are _all_ not okay.
Similarly, the royal/societal "we" _absolutely_ have decided that collective driving is worth some quantity of deaths. We seek to minimize them, but there are extremely few people seriously advocating for the abolition of cars today, and they've been around for over a hundred years at this point.