syzo · 3411 days ago · post: Will your self-driving car be programmed to kill you if it means saving more strangers?

Yup. I'm not sure why this question is so popular when automatic cars will always make sure they're driving safely. I do wonder what they'll do when faced with a bunch of people on a sidewalk or walking along the side of the road, though. Will the car calculate how fast a human could theoretically jump out in front of it, and drive assuming that could happen at any time? How will that work in large cities, where there may be a lot of people on the sidewalk at any given time?

I guess it just has to drive slowly enough that it can stop within a centimeter in under X amount of time.
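Back-of-the-envelope, that "slow enough to stop" rule is just stopping-distance physics: reaction distance plus braking distance has to fit inside whatever clearance the car has to the nearest pedestrian. A rough sketch of the idea - the reaction lag and braking deceleration below are made-up illustrative numbers, not anything from a real system:

```python
import math

def max_safe_speed(clearance_m, reaction_s=0.1, decel_mps2=7.0):
    """Highest speed (m/s) from which the car can still stop within
    clearance_m metres, given reaction_s seconds of sensing/actuation
    lag and a constant braking deceleration of decel_mps2.
    Solves clearance = v*t + v**2/(2*a) for v."""
    a, t = decel_mps2, reaction_s
    return -a * t + math.sqrt((a * t) ** 2 + 2 * a * clearance_m)

# A pedestrian standing 2 m from the car's path:
v = max_safe_speed(2.0)
print(f"{v:.1f} m/s (~{v * 3.6:.0f} km/h)")
```

With those numbers the car would creep past a pedestrian two metres away at roughly 17 km/h; presumably a real system tunes every one of those constants, plus the clearance it's willing to accept.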

kleinbl00 · 3411 days ago

In general, speed limits are chosen so that vehicles meeting safety standards can stop or otherwise maneuver away from hazards. This is why you can be cited for reckless driving even while obeying the posted limit during inclement weather - the posted number assumes decent conditions. "People jumping from the sidewalk" is the sort of thing this already covers, unless you're dealing with crazy or suicidal people.
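To put rough numbers on the weather point: braking deceleration scales with tyre-road friction, so the same legal speed stops in a very different distance on a wet road. A quick sketch - the 0.7 (dry) and 0.4 (wet) friction coefficients are textbook ballparks, not measurements, and the 1.5 s reaction time is a common rule-of-thumb figure for human drivers:

```python
G = 9.81  # gravitational acceleration, m/s^2

def stopping_distance(speed_mps, mu, reaction_s=1.5):
    """Reaction distance plus braking distance, approximating
    braking deceleration as mu * G."""
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * mu * G)

v = 50 / 3.6  # 50 km/h, converted to m/s
for surface, mu in [("dry", 0.7), ("wet", 0.4)]:
    print(f"{surface}: {stopping_distance(v, mu):.0f} m")
```

At 50 km/h that works out to roughly 35 m dry versus 45 m wet, which is exactly why the posted limit stops being a safe speed in the rain.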

Driverless cars will necessarily obey the rules of the road, and won't be licensed if they can't safely do that. Anything outside the regime of legality becomes the fault of whoever broke the law, and an autonomous car won't break the law. Bet on it.

user-inactivated · 3410 days ago
This comment has been deleted.
kleinbl00 · 3410 days ago

Morality has no place in engineering. Morality has a place in the application of engineering. There is nothing inherently evil about chemical weapons - if a stockpile of sarin gas is what it takes to keep a maniacal despot from committing genocide against minorities, then the stockpile of sarin gas is being used in a moral way. Using that sarin gas, on the other hand, is almost always going to be an immoral act.

Caselaw is never about morality. It's always about culpability. And that is why "fuck everything about this entire line of questioning" - it replaces culpability with morality and goes "holy fuck! there's no moral framework here!"

Fuckin' A there's no moral framework here. There's no morality to bulldozers, there's no morality to slip'n'slides, there's no morality to taxi services, there's no morality to take-out food. There's culpability, and when you insist that we now need to come up with a whole new way to understand a tool just because you can't wrap your head around it, I'm not only entitled to call you on it, I'm entitled to do so in a snarky tone of voice.

syzo · 3411 days ago

Yeah, I guess I was specifically thinking of the suicidal-person scenario. I'm sure there will be nauseating amounts of research dedicated to safety, and I'm sure a computer-controlled car will be safer than a human-controlled one any day.

Given all that, though, I guess the question in the OP comes down to "be as safe as possible to people external to the car, but save the people inside the car at all costs". That's probably what I'd go with.
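Written down as a rule, that's a hard constraint plus an optimization: throw out any maneuver likely to kill the occupants, then pick whichever remaining option does the least harm to everyone else. A toy sketch - every candidate, score, and threshold here is invented for illustration, not taken from any real planner:

```python
from typing import NamedTuple

class Maneuver(NamedTuple):
    name: str
    external_harm: float  # expected harm to people outside the car
    occupant_harm: float  # expected harm to people inside the car

candidates = [
    Maneuver("brake hard in lane", external_harm=0.0, occupant_harm=0.1),
    Maneuver("swerve onto sidewalk", external_harm=0.8, occupant_harm=0.0),
    Maneuver("swerve into barrier", external_harm=0.0, occupant_harm=0.6),
]

# "Save the occupants at all costs" as a hard constraint: drop anything
# likely to be fatal inside the car (the 0.5 cutoff is arbitrary), then
# minimize expected harm outside the car among what's left.
survivable = [m for m in candidates if m.occupant_harm < 0.5]
best = min(survivable, key=lambda m: m.external_harm)
print(best.name)  # -> brake hard in lane
```

A real planner would presumably score thousands of candidate trajectories this way many times a second, with far richer cost terms than two numbers.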

OftenBen · 3411 days ago

I'm fairly certain that, within the limits of their sensory apparatus, self-driving cars are already safer than human drivers.