I forget where it was from, but years ago I found an online survey from a university on autonomous cars and their decision making. It was all about deciding whether to swerve or not in a collision. All kinds of difficult encounters, like: do you hit the barrier and kill the passenger, or swerve and kill the old lady? Do you hit the thin person, or swerve and hit the heavier person?
I’ve never seen a survey drill down into biases quite so deeply.
From what I’ve seen of real-world examples — not hypotheticals like “what if the car had 5 cats in it and the person on the crosswalk had a stroller full of 6 cats, do you swerve into a barricade?” — Tesla cars just release autonomous control back to the person behind the wheel a few seconds before impact, so the driver is fully liable.
Easy. Prioritize who is saved based on social credit score.
I did this as part of our ethics discussion.
My eventual answer was that you always kill the non-driver, since no one would ever buy a car that would kill them to save someone else.