Our Personhood: Will Humanity Let AI Annihilate It?

AI VS. PERSONHOOD: The quiet temptation to abdicate our humanity

This question has been lingering with me for some time now, hovering at the edges of my work and prayer.

As a writer and a legal professional, I’m well aware of how deeply artificial intelligence (AI) has already embedded itself into modern life. It drafts contracts, summarizes cases, generates marketing copy, analyzes data and (increasingly) offers itself as a companion: a therapist, a spiritual guide and a confidant. The efficiency is truly impressive, but the implications of such a rapidly evolving tool can also be unsettling.

That unease crystallized for me after listening to Deacon Charlie Echeverry’s podcast (Living the CALL with Deacon Charlie Echeverry) episode “The False Promise of AI and Psychedelics,” which I highly recommend. You can listen to it for free here.

The episode was thoughtful, grounded and refreshingly unenchanted. It didn’t deny the usefulness of technology, but it refused to baptize it prematurely. That, I think, is where Catholics like myself must begin: not with fear, but with clarity.

At the heart of the conversation is a deceptively simple word: personhood.

Personhood isn’t a function


From a Christian perspective, personhood isn’t defined by intelligence, productivity, emotional responsiveness or usefulness. A person isn’t a problem-solving unit. A person is a being created in the image and likeness of God, endowed with reason and will, and capable of love, moral responsibility and relationships (not only with others, but with God Himself). This dignity is intrinsic, not earned, and it can’t be replicated by code, no matter how sophisticated the code becomes.

AI, at least as it exists now, doesn’t possess intellect or will. It doesn’t know truth; it predicts patterns. It doesn’t love; it mirrors language associated with love. It doesn’t suffer, repent, hope or pray. It doesn’t bear moral responsibility. It can’t sin, nor can it be redeemed.

These aren’t minor distinctions. They’re the fault lines between tool and person, between instrument and soul.

When tools compete with presence


Humans have always been prone to anthropomorphize their tools. We name our cars, and we talk to our phones or yell at our laptops when they lag. We project intention where there is none. The more convincingly a tool reflects ourselves back to us—our language, our emotions, our struggles—the more tempting it becomes to treat it as something more than it is.

This is where ethical concerns sharpen, particularly in areas like therapy, spiritual guidance and creative work. AI can assist a therapist, but it can’t replace the moral weight of sitting across from another human being who bears witness to your suffering. It can help organize theological ideas, but it can’t wrestle with God in the dark night of the soul. It can generate beautiful prose, but it can’t offer the vulnerability that makes writing an act of communion rather than production.

The danger, though, isn’t that AI will suddenly become a person. The danger is that we’ll gradually lower our expectations of human presence.

When AI becomes the first resort instead of the last, replacing community, friendship, pastoral care or professional discernment, we haven’t just elevated the machine. We’ve diminished ourselves. We’ve traded relationships for convenience, formation for efficiency, and wisdom and connection for speed.

Catholic theology has long warned against this kind of displacement. Tools are meant to serve human flourishing, not redefine it. Prudence asks not only whether we can use a tool, but whether we should, and if so, how and to what extent. Temperance reminds us that even good things, when overused or misused, distort the soul.

Teaching prudence in a technological age


The call to prudence is especially pressing for Catholic parents and writers, those entrusted with shaping minds and imaginations. Our children are growing up in a world where answers are instant, the friction of critical thought and reflection is optional, and silence is increasingly rare.

These children won’t struggle to find information in the ways that previous generations did. Instead, they’ll struggle to cultivate wisdom. They won’t lack stimulation, but they likely will lack patience.


Teaching prudence in this environment doesn’t mean rejecting technology outright. It means modeling restraint. It means showing our children that not every question needs an immediate answer, not every emotion needs optimization, and not every struggle should be outsourced to an algorithm. It also means teaching our kids the virtue of temperance and healthy self-reflection, so they can learn to discern between using a tool for what it is and abusing it at the cost of connection and their own development.

For Christian (and other) writers, the temptation is even more subtle. AI can help brainstorm, edit, clarify and even inspire the craft. When used well, it can sharpen ideas and free up time for deeper reflection. When used poorly, however, it can hollow out the very act of writing, turning it into a performance rather than a pursuit of truth.

Writing, at its best, is an act of moral reflection. It requires wrestling, revision, humility and the courage to say something imperfect but honest. No machine can do that work for us.

Will we surrender our personhood?



The deeper theological concern isn’t whether AI will surpass us, but whether we’ll quietly surrender what makes us human (our capacity for judgment, relationships, sacrifice and love). God didn’t give us reason and free will so we could eventually delegate them away. He gave them to us so we could choose the good, even when it’s costly.

AI will continue to evolve. It will become more convincing, more helpful and more integrated into daily life. That isn’t, in and of itself, a moral failure. The moral question is whether we’ll remain attentive stewards or become passive consumers, and whether we’ll remember that tools are meant to assist human flourishing instead of replacing human presence.

In the end, no algorithm can be a substitute for a parent’s attention, a friend’s listening ear, a therapist’s discernment and training, or a writer’s moral imagination. These aren’t inefficiencies to be solved; rather, they’re gifts to be protected.

The Church has always stood athwart the age, not by rejecting progress but by asking the questions progress forgets to ask:

  • Who are we becoming?
  • What are we losing?
  • Are we still choosing God-given wisdom over the false promise of immediate answers?

Those questions, at least, remain stubbornly human.

«RELATED READ» AI: A god or a tool?


image: Sammy-Sander
