A few months ago, I was obsessed with the question: “what’s the lower bound of consciousness?” as an exercise for exploring the steps of improvement AI would need to take to reach sentience. Before you roll your eyes and call me a pretentious overthinker, you have to admit it’s a fascinating thought exercise... do bacteria have consciousness? How about insects, a colony of insects (as a hive mind), fish, or dogs? Can these seemingly differing levels of consciousness be programmed into an AI? It makes your brain hurt because to arrive at an answer, you have to prescribe a finite meaning to the abstract, but let’s save the philosophy of mind stuff for another day.
When I read Klara and the Sun recently, I smiled fondly as I realized that Ishiguro is asking a different flavor of the same question. In his dystopian world, parents can buy their children a robot (an Artificial Friend) for companionship. As these tabula rasa robots follow their child through the world, they begin to form their own increasingly complex understanding of human emotion and of how different human relationships interact with each other… a way of gaining differing levels of consciousness. Most vividly, Ishiguro likens the heart to a house with many rooms and rooms within rooms, but a house that is ultimately finite and able to be replicated by a machine observing the world. It forces us to face the question of whether the actions and decisions we make in life are not beautifully random after all, but predetermined by the structure and chemical makeup of our hearts… are we really that special?
Ishiguro provides a hint of preserved anthropocentrism when he makes you realize that although the individual human heart can be replicated, what can’t be replicated is what a person means to the n number of people in their life. Machines that are trained to optimize simply can’t predict the complexity of human relationships: the imperfect situations someone will accept because they’re in love, a mother’s love and sacrifice for her child, or a daughter’s forgiveness.
Society can be terribly dismissive when it comes to the question of whether machines will ever gain sentience, but maybe by exploring the question more deeply, we’ll ultimately learn more about ourselves.