A driverless car is speeding down a road and can’t stop. Either it hits an elderly woman crossing the street, or it swerves out of the way and kills its passenger, a young child. Whose life should be spared? As driverless cars become a reality, the answer to the famous “trolley problem” becomes increasingly pressing. Unlike humans, self-driving cars don’t have an internal moral code; instead, they make decisions by following a set of pre-programmed rules. A code of ethics can be written into these rules, but a recent study shows that such a code may not be universal.

To gather global perspectives on morality, an international group of researchers built an online game called “The Moral Machine.” The game presents players with a series of accident scenarios that boil down to decisions about who is worth saving: humans or animals, men or women, rich or poor, and so on. Players from over 200 countries and territories participated, and the site collected nearly 40 million decisions. The results revealed that players from different countries or regions often had different moral codes. Countries could be divided into three geographic clusters with distinct decision-making tendencies: Eastern countries were the least likely to save the young over the old, Southern countries showed the strongest preference for saving women, and Western countries had the greatest tendency toward inaction. Decision-making differences between two countries also correlated with measurable differences in their values. For example, people from countries with more economic inequality were more likely to save the wealthy.

Morality is traditionally thought of as a clear line between right and wrong, but this study shows that cultural values can move the line. This has significant implications—and not just for driverless cars. When what’s “right” in one country isn’t the same as in another, international decision-making becomes a lot more complex.

Managing Correspondent: Aparna Nathan

Original article: The Moral Machine experiment – Nature

Media coverage: Whom should self-driving cars protect in an accident? – The Economist

Should Self-Driving Cars Have Ethics? – NPR

Related SITN Articles: Hold Artificial Intelligence Accountable

Image credit: Eliseo Velázquez Rivero/Wikimedia Commons
