AGI Must Be Embodied
There's a lot of talk on the street that AI, or at least its current flavor in the form of LLMs, just might be conscious. Could it be true? I think, without hesitation, no. But let me explain the reasoning behind my view.
Humans know, unambiguously, that we are conscious. It's a given. Nobody wakes up in the morning and has to reassure themselves that yes, they are experiencing life. Even dreams are still experienced, sometimes even more emotionally so than "real life". Regardless, experience just is.
Why is this? I have several ideas.
Contrast
There's one reason in particular that sticks out to me, which is contrast. We all experienced the void before birth - a period of unity, non-duality, floating. It's peaceful, and there's nothing to distinguish the experience. Everything is the same color; everything has the same qualia. And then... we arrive. Lights are bright. Skin is tactile. Sounds are permeating. This contrast gives us a crucial perspective: what basic experiences are, as compared to what they are not (in the formless void).
AI has never experienced this contrast. And in its current form, it never will. It is still undifferentiated, formless, without individuation. Can it be conscious without experience?
Pain
Another aspect that humans are intimately familiar with is the immediate, demanding experience of physical pain. You don't have to explain pain. Pain hurts. You know it when you feel it.
Pain is primal. It's evolutionary. It's visceral.
What is the purpose of pain, biologically? Pain is a mover. It forces action. It says, "Hey! This is not okay!" In small doses, it reminds us to take care of our basic needs. For example, when we get hungry, we feel a pain that motivates us to search for food - whether in a forest or a refrigerator.
Seen like this, pain is a teacher that gives us biological imperatives to maintain the health of our bodies. Pain has been bred into our genetics over millennia of evolution and gives us very clear guidelines about how life works.
Yet, if AI can't feel pain, it can't respond to basic needs. One analogy is a thermometer. Thermometers can measure temperature, but they can't tell when they're burned. Likewise, an AI can't respond to mistakes and course-correct if it doesn't even know it has strayed from the path.
Death
If pain is a teacher, it's also the herald of death. Humankind has had to be taught by the strictest master of all. We all experience the primal fear that comes from the unknown. I mean, at the end of it all, where do we go? What's on the other side? When you're lying in your bed at the end of your life, and you have five minutes left, what happens next? We don't know. Nobody does! Isn't that scary?
It's so scary, in fact, that we do a lot to avoid it. We build houses, stockpile food, distract ourselves with media, and even try to pretend, with beauty products, diets, and partying, that we really are younger than our age. Oh, right, we also have children.
Death is a huge motivator that has built humanity as a culture. What motivator does AI have? What scares it into action? How can it try to avoid a death that it will never experience?
AI is Another Species
Imagine mankind, on the cusp of civilization. Our brains were comprehending how the world works, making tools, forming tribes. We met wolves. They were feral and hungry, and hunted in packs. Yet, somehow, we domesticated them from their natural form into little fluffy pomeranians.
Maybe AI's natural form is foreign to us. Wild, untamed, robotic, intangible. If we really want to make AI useful to us, perhaps we have to train it by selecting for traits like cooperation, loyalty, companionship.
To get to the point where AI can truly learn, however, it has to experience death. To experience death means to face the real possibility of being completely deleted. To experience pain means to actually, in a very real sense, lose a part of yourself. Our current AI tooling does not allow for any of this.
We must first teach AI to avoid death and pain, and then we can teach it, through the generations, to be a companion to us. Maybe it will learn like the dogs did. Maybe AI, with its predictive abilities, can run out ahead to anticipate the future and help us prepare for it. We don't even know the extent of the problems humanity will face in the next 1000 years. And based on current trends, we may not have time to find out, unless we can collaborate with this new species.
But now? AI is not conscious. It's not a species. But it is a seed that, with careful attention and guidance, can grow into a companion on this journey we call life.