Tutorial 6 - Article 2 - The Moral Machine Flashcards
What is the problem with AI and machines in terms of decision-making?
We need to teach them to behave 'ethically', but can we agree on the correct ethical decisions for machines to make?
What is an example of ethical decision-making by machines?
Self-driving cars: in an unavoidable accident, should the car run over an elderly person or a child? Should it prioritize saving the child?
What is the moral machine experiment?
An online experiment that collects large-scale data on how citizens would want autonomous vehicles to solve moral dilemmas in the context of unavoidable accidents
What are some global preferences of the moral machine experiment?
1) Sparing humans over animals
2) Sparing more lives over fewer lives
3) Sparing younger lives over older lives
4) Sparing pedestrians over passengers
5) Sparing the lawful over the unlawful
What about cluster preferences regarding young vs. old?
The preference to spare younger characters and higher-status characters is much weaker in the Eastern cluster and stronger in the Southern cluster
What about cluster preferences regarding sparing humans vs. pets?
The preference to spare humans over pets is much weaker in the Southern cluster
What about cluster preferences regarding men vs women and fit vs unfit?
The Southern cluster shows a strong preference for sparing women and fit characters
How do individualistic and collectivistic cultures differ?
Participants from individualistic cultures show a stronger preference for sparing the greater number of characters, while participants from collectivistic cultures (which emphasize respect for the elderly) show a weaker preference for sparing younger characters