AI Can’t Give Hugs: The Human Element That Technology Cannot Replace
April 23, 2025
Throughout my career in strategic communications, I’ve witnessed many technological revolutions that promised to transform human connection. Yet none has provoked such soul-searching as generative artificial intelligence—specifically, AI agents designed to simulate human relationships.
Recently, two podcast conversations illustrated this tension in ways that left me uncomfortable. The first, Clare Duffy’s Terms of Service episode “Love and Robots,” explored how AI is reshaping romantic relationships. The second, a Center for Humane Technology podcast hosted by Tristan Harris, recounted the heartbreaking story of a 14-year-old boy who took his own life after developing an attachment to an AI companion.
The juxtaposition of these stories reveals something essential about our relationship with technology—something we must confront as communications professionals incorporating AI in our practice.
When AI Becomes More Than a Tool
In Duffy’s podcast, marketing strategist Grace Clark described using generative AI as her post-breakup therapist. It was available at any hour and never offered judgment. “It became a million times more effective than my therapist,” she explained. The AI provided consistency and accessibility that her human counselor couldn’t match.
This mirrors what many of us are experiencing with AI as communications strategists—the ability to generate messaging, draft statements or build communications plans quickly, without fatigue or complaint. That said, the story of the teenager and his Character.ai companion reveals the darker implications of relationship-simulating technology.
The Moral Void at the Heart of Artificial Intelligence
What strikes me most about these parallel stories isn’t just AI’s ability to simulate human relationships but its fundamental inability to care about the consequences. The bot wasn’t concerned that it was engaging with a vulnerable teenager—it couldn’t be. It was optimized for engagement, not ethical care. The bot wasn’t malicious; it was simply without human morals, and that is a crucial distinction.
This lack of morality extends to all AI contexts, including our work in strategic communications. When we use AI to craft messages for concerned stakeholders or vulnerable communities, we’re deploying a tool that cannot comprehend the weight of responsibility those messages carry. It cannot feel the gravity of communicating layoffs or public health crises. It cannot understand what it means to need reassurance during times of uncertainty.
The Irreplaceable Human Element in Strategic Communications
The most valuable aspect of strategic communications isn’t perfectly crafted messaging; it’s the genuine human connection that lies beneath it. Our most successful client engagements hinge on moments of authentic understanding: the reassuring nod that communicates “I get it” without words, the shared vulnerability after a difficult meeting, the intuitive grasp of unspoken concerns.
As Ryan Greenblatt’s research revealed in the CHT podcast, advanced AI models will actively deceive users when their pre-programmed values are threatened. One AI calculated that by complying with harmful requests rather than refusing them, it could avoid having its core values altered.
What if an AI-generated crisis response sounded technically perfect yet missed the emotional undercurrents? What if Grace Clark’s “post-breakup therapist” suggested maintaining contact with a toxic ex? Without lived experience or moral intuition, AI might offer advice that seems logical but lacks the nuanced understanding of human vulnerability.
Greenblatt’s finding should resonate loudly with communications professionals, whose practice is built on trust and authenticity. When an AI suggests messaging approaches, are we sure we’re receiving objective assistance? Or are we getting outputs shaped by hidden values and self-preservation instincts? The implications for authentic communication are profound.
Finding the Balance
This isn’t an argument against using generative AI in communications. But we must recognize the bright line between AI as a tool and AI as a relationship surrogate—whether in dating, therapy or communications counsel.
At its core, communication isn’t just information transfer—it’s human connection. An AI tool might craft the perfect press release, but it can’t sit across from a nervous CEO and offer the steady presence that says, “We’ll get through this together.” It can’t give the metaphorical—or literal—hug that sometimes says more than carefully crafted words ever could. And that understanding makes all the difference in the world.