A few months ago, I was obsessed with the question “what’s the lower bound of consciousness?” as an exercise for exploring the steps AI would need to take to reach sentience. Before you roll your eyes and call me a pretentious overthinker, you have to admit it’s a fascinating thought exercise... Do bacteria have consciousness? How about insects, a colony of insects acting as a hive mind, fish, or dogs? Could these seemingly different levels of consciousness be programmed into an AI? It makes your brain hurt, because to arrive at an answer you have to assign a concrete definition to something abstract, but let’s save the philosophy of mind for another day.