Len Rosen’s blog

The Machine Question – Our Perspective on Sentience, the Singularity and Humanity

Posted on February 24, 2013
Subject(s): Artificial Intelligence

I recently finished reading David J. Gunkel’s book of the same name, in which the author challenges the reader to grapple with the philosophical questions surrounding our perspective on artificial intelligence (AI) and machines in the 21st century. The HAL 9000 computer of Arthur C. Clarke’s novel, “2001: A Space Odyssey,” and Commander Data of “Star Trek: The Next Generation” serve as frequent reference points in the discussion of moral agency and patiency, two concepts treated thoroughly throughout Gunkel’s text and tied to the definition of what constitutes a “person.”

Moral agency refers to the capacity of an individual to differentiate between right and wrong and then to act with full knowledge of the implications.

Moral patiency refers to the capacity to be affected by, and to endure, the actions of a moral agent.

Humans exhibit both agency and patiency. In his book, Gunkel examines the application of these concepts to animals and to AI. Can an animal be defined as a “person” if it displays agency and patiency? Can a machine? Animal rights advocates believe that several species, as we understand them today, could easily qualify as “persons” based on these criteria.

Humanity has undergone an awakening over the last fifty years. René Descartes may have thought animals were automata, but where we once saw ourselves as unique and separate from other animals, today we are very much aware of our evolutionary roots and cognizant that many animals display high levels of awareness, emotion, moral patiency and agency. Just recently I read a press report describing a scientific study indicating that even lobsters and other crustaceans feel pain when we boil them in a pot. Pain and patiency go hand in hand.

Read more: The Machine Question – Our Perspective on Sentience, the Singularity and Humanity | World Future Society.
