07-02-2016, 05:13 PM   #1509
rumatt
Mugwump
 
Join Date: Oct 2003
Carmudgeonly Ride: E46 330i, Chevy Colorado, Tesla Model 3
Location: NY
Posts: 17,475
Quote:
Originally Posted by JST
In other words, whatever the proximate cause of the original impact, the car's post-crash behavior is a separate problem.
I'm with you, mostly. But you do have to admit there's a certain irony in the fact that it was the self-driving capability that contributed to killing the driver, and then, once he was dead and there was a legitimate opportunity for autopilot to make society safer, it turned itself off.

If you're going to introduce technology that sometimes makes people less safe, and you argue that's OK because it increases safety overall, there is indeed some obligation to "do good" in the broader sense. Helping bring a car to a stop when the driver is dead is a huge opportunity to demonstrate the value of autopilot and to help justify its existence and the risks that come with it.

In this case it would have been difficult to ask for praise for how nicely the car brought itself to a stop after killing the driver. But imagine a scenario where someone driving has a stroke, the car crashes into a truck, and then goes barreling toward a playground filled with children. Autopilot kicks in and saves the day. Suddenly the small number of accidents that may or may not have been caused by autopilot might start to seem like an OK trade-off.

Cliff notes: if Tesla is allowed to lower the bar in some cases, I'm allowed to raise it in others.

Last edited by rumatt; 07-02-2016 at 05:42 PM.