veen  ·  3484 days ago  ·  post: Will your self-driving car be programmed to kill you if it means saving more strangers?

I've been meaning to ask you about that thread. Ethical discussions like the one in the linked article tend to pop up regularly while researching automated vehicles. Could you check whether I'm missing something in my understanding of your line of reasoning?

Using the trolley problem as a way to think about the safety issues surrounding self-driving vehicles is inherently useless, because it reduces the complex, real world to a simplified scenario (A or B, kill you or kill me). Engineers will simply try to reduce risk as much as possible and to anticipate situations that the car can't get itself out of.
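To put that engineering framing in concrete terms, here's a minimal sketch (all names and numbers are hypothetical, not taken from any real planning stack): the planner never weighs "kill you vs. kill me"; it just scores every feasible manoeuvre by expected risk and picks the least risky one.

```python
# Hypothetical sketch: risk minimization over candidate manoeuvres,
# not moral arithmetic over victims.
from dataclasses import dataclass

@dataclass
class Trajectory:
    name: str
    collision_prob: float  # estimated probability of any collision (assumed)
    severity: float        # estimated harm if that collision happens, 0..1 (assumed)

def expected_risk(traj: Trajectory) -> float:
    """Expected harm = probability of a collision times its estimated severity."""
    return traj.collision_prob * traj.severity

def plan(candidates):
    # Pick whichever remaining option carries the lowest expected risk.
    return min(candidates, key=expected_risk)

options = [
    Trajectory("brake hard in lane", collision_prob=0.30, severity=0.20),
    Trajectory("swerve to shoulder", collision_prob=0.15, severity=0.60),
    Trajectory("maintain speed",     collision_prob=0.90, severity=0.90),
]
print(plan(options).name)  # -> "brake hard in lane"
```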

I think a much more interesting conversation is how much risk we accept from an automated car. For example, the British press is going bananas over a rollercoaster accident last week. Never mind the fact that rollercoasters are incredibly well-engineered and much, much safer than almost any other mode of transport (and that the accident was very likely human error); a lot of people are now much more hesitant to step onto a rollercoaster. Even if a fully automated car is 100 times safer than driving yourself, it will still have accidents, and it will still take control away from people. While a super-safe network of automated vehicles is a great goal, I fear the transition towards it won't be silky smooth.
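To make the "100 times safer" point concrete, here's a back-of-envelope calculation with assumed round numbers, used purely for illustration (roughly one fatality per 100 million vehicle-miles for human drivers, and a few trillion vehicle-miles driven per year):

```python
# Illustrative only: both figures below are assumed round numbers, not sourced statistics.
human_fatality_rate = 1.2 / 100_000_000   # fatalities per vehicle-mile (assumed)
miles_per_year = 3_000_000_000_000        # vehicle-miles driven per year (assumed)

human_fatalities = human_fatality_rate * miles_per_year
automated_fatalities = human_fatalities / 100  # the "100 times safer" scenario

print(f"human-driven: ~{human_fatalities:,.0f} fatalities/year")
print(f"automated:    ~{automated_fatalities:,.0f} fatalities/year")
# Even at one hundredth of the risk, that still leaves hundreds of deaths a year
# for the public to react to, which is where the rollercoaster reflex kicks in.
```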