Results from the Moral Machine experiment show that standards can vary with culture, economics, and geography. It is interesting to see how the findings will likely be used in the real world:
The study has interesting implications for countries currently testing self-driving cars, since these preferences could play a role in shaping the design and regulation of such vehicles. Carmakers may find, for example, that Chinese consumers would more readily enter a car that protected themselves over pedestrians.
It seems that a person inside a self-driving car should reasonably expect to be protected. At the very least, the self-driving car should not put them at any greater risk of bodily harm than if they were driving the car themselves. Culture and geography should not affect that baseline expectation of the consumer. In a real-life situation, a driver will instinctively try their best not to hit a pedestrian; they understand the trade-off clearly: the person hit by a car will likely die, while the driver's evasive maneuver might leave them injured and the car totaled, but they will likely survive. If that were not universally true, an extremely large number of jaywalkers would be dying on streets around the world every day.
An average driver will do their best not to kill another human being, because people have a moral compass, and accidentally killing someone is a very heavy burden for the conscience to carry. To deny this fundamental truth about our nature and claim that optimizing for the driver's safety over the pedestrian's is somehow acceptable in some cultures seems like inept social science.