Morals and ethics are sets of values that govern our human-to-human interactions on an everyday basis. Robotics and artificial intelligence are increasingly blurring the divide between human lives and technology. Hence, certain human-centric traits may have a role to play in our future interactions with machines. Michael Nagenborg, a researcher at the University of Twente, has published a study that explores the role forgiveness can play in human-robot interactions.

The study poses ethical questions such as: can a robot be held fully responsible for its actions (even a crime), and if so, can we forgive a robot? Nagenborg provides a scenario: “a self-driving car kills someone you love.” Acknowledging that one may be able to forgive a human in such a situation, Nagenborg asks the challenging question of whether we can forgive the machine. The study goes beyond questions of punishment and discusses philosophical standpoints on forgiveness, responsibility, and virtue in dealing with robots.

The study indicates that ‘forgiving’ can be a useful lens through which to examine human-machine relations. Nagenborg’s underlying assumption is that robots will exist in a realm where they are more than just machines and yet different from humans. Ultimately, the study reveals the complexity of ‘forgiveness’ as a concept in itself. The researcher acknowledges that his speculative study opens a window to deliberate over a state of coexistence with such robots. What will a civilization with autonomous robots look like? How will we treat them? Would you be able to forgive one for killing someone you love?


Managing Correspondent: Rhea Grover

Press Article: Could we forgive a machine? Study explores forgiveness in the context of robotics and AI on Techxplore

Original Science Article: Can We Forgive a Robot? On SpringerLink

Relevant Science Article: Vulnerable robots positively shape human conversational dynamics in a human-robot team on PNAS.org

Image Credits: Arseny Togulev via Unsplash
