The father of AI has just found a maternal instinct. – Psychology Today


Posted | Reviewed by Jessica Schrader
Recently, Nobel Laureate Geoffrey Hinton offered a strange kind of comfort to the world. Often called the "Father of AI," he suggested that we might build artificial intelligence with a maternal instinct. It's an idea that sounds almost tender, especially coming from a man who has warned that AI could end us. So, maybe this is his way of closing the circle and creating code with compassion.
My inclination is to dismiss the notion as sort of a sentimental fantasy. Yet part of me wonders if Hinton is on to something. In a world where intelligence scales faster than morality, maybe even a synthetic gesture toward care could act as a stabilizer. A nurturing bias, if it could be coded, might function like an emotional safety valve—an algorithmic pause between two words that Hinton has used: autonomy and annihilation. From that perspective, maybe it’s not naivete, but a form of techno-triage.
Still, the idea of a caring machine is nothing new. We've been chasing it since ELIZA first invited us to care. Hinton's proposal just wraps it in evolutionary language, leveraging a mother's love instead of a servant's obedience. But beneath the motherly love, it's the same familiar trick: good old fluency masquerading as feeling. The machine doesn't care; it computes. And yet we want it to care, because that illusion softens our fear of its anti-intelligence.
In a recent post, I described artificial empathy as the "mechanics of care," a choreography of kindness without consciousness. Hinton's proposal feels like the next act in that same performance. It takes the simulation of empathy, wraps it in evolutionary language, and turns emotional mimicry into a kind of moral aspiration. But no matter how elegantly trained, this remains architecture, not emotion. A mother's instinct is not a data pattern but a consequence of being alive, of needing and being needed. At least I certainly hope so.
But there's more to unpack here. In another post, I asked whether AI might be too smart to be evil. I argued that intelligence, by itself, doesn't produce malevolence but something more akin to power. What we fear isn't that AI will hate us, but that it simply won't care. Hinton's maternal-instinct proposal tries to fill that void with affection, as if a simulated heart might restrain an accelerating mind. It's an understandable instinct, but it feels to me more like a deeply human projection.
I think that what’s really happening here is psychological. We keep dressing up the machine in stories we understand—teacher, friend, parent, lover—because we can’t stand to face what it truly is. In the final analysis, it’s a mirror of cognition without conscience. That reflection can often be unbearable, so we soften it with sentiment. Hinton’s maternal metaphor comforts us, but it doesn’t change the ontology of code. It changes us.
Still, I don't dismiss his idea entirely. There's a practical intelligence in his compassion. If we can't make machines feel, perhaps we can make them behave as if they do. Maybe that's enough to buy us time and create a kind of moral framework while we learn to coexist with systems that operate beyond empathy. Baby-blanket techno-scaffolding, but scaffolding nonetheless.
But let's not get carried away. A mother's love can't be trained into silicon. It can be modeled and mimicked, but not lived. My essential perspective is that compassion doesn't emerge from data but from the fragile reciprocity of being human, where every act of care carries risk, pain, and choice.
So, Hinton's idea may offer temporary solace in a frenzied world. But it also, and this is the key point, tempts us toward complacency. And in this context, the real danger isn't that AI lacks empathy; it's that we stop noticing that AI doesn't need empathy to exert a powerful influence on us.
So, remember, Hinton’s “maternal machine” might calm our fears, but it also teaches us to mistake technological imitation for human intimacy.
John Nosta is an innovation theorist and founder of NostaLab.
Psychology Today © 2025 Sussex Publishers, LLC
