How do you build an AI that acts not for its own benefit, but for the good of society?
If an autonomous car must choose between running over a child and crashing into a tree, what should it do?
Scientists are using stories and other literature to teach artificial agents human values such as morality and cultural mores.