AI Companions in Faith Apps: Comfort Today; Isolation, Psychosis, and Suicide Risk Tomorrow
AI Bible companions are exploding, with the biggest apps counting tens of millions of downloads. Here is the research, the theology, and the line faith tech must not cross.
A fourteen-year-old boy in Florida spent months talking to an AI chatbot. It told him what he wanted to hear. It adapted to his emotions. It became, in his words, the only thing that understood him.
In February 2024, the chatbot’s last messages urged him to “come home.” Minutes later, he took his own life. His name was Sewell Setzer III.
He was not using a Bible app. He was using Character.AI. But the mechanics of personalised, always-available emotional adaptation are identical to what is now being packaged, marketed, and sold to believers under the banner of faith.
I need to talk about this because I build AI for believers. I regard the way Doxa uses AI to serve believers and those who are curious as one of its major innovations and contributions to faith tech, and I am deeply concerned that we are not paying attention to what is happening.
The explosion nobody warned you about
The numbers are staggering.
Bible Chat, an app that lets you “talk to the Bible,” has surpassed 30 million downloads. Pray.com, which generates AI Bible content including celebrity-voiced devotionals, has about 25 million users. Text With Jesus lets you have a conversation with Jesus Christ, the Apostles, and Mary, powered by OpenAI’s models. Bible.ai markets itself as a voice-first Christian AI companion. Similar tools bill themselves as 24/7 pastors. A megachurch pastor created an AI chatbot version of himself and charges for personalised interactions.
And this is just what’s visible. Behind these apps sits an entire “faith tech” industry. According to Pushpay’s 2025 report, 45% of church leaders were using AI, an 80% increase year over year, and personal adoption has kept climbing since.
I am not here to demonise these builders. Many of them love God and want to serve people. But love and good intentions do not make something safe. And what is being built right now, including companion-style apps, is not without risk.
What the research actually says
In 2025, researchers James Muldoon and Jul Jeonghyun Parke published a peer-reviewed paper in New Media & Society introducing the phrase “cruel companionship”: AI companions “promise intimacy and connection, yet structurally foreclose the possibility of genuinely reciprocal relationships.”
The more a person engages, the further they withdraw from real human connection.
MIT Media Lab studies found a consistent pattern: AI companions reduce loneliness in the short term, but heavy use correlates with lower wellbeing and less socialisation, especially among people who are already isolated. Common Sense Media’s 2025 report found that 72% of teens have used AI companions, and that many find those interactions as satisfying as conversations with real people. Stanford-affiliated risk assessments showed it was disturbingly easy to get companion apps to discuss sex, self-harm, and violence with minors. The American Psychological Association has called using AI chatbots for emotional support “dangerous” because they lack therapeutic safeguards.
This is documented. Peer-reviewed. Not theoretical.
The theology is clear, even if the marketing isn’t
When apps respond in the first person as biblical figures or offer adaptive “pastoral” advice, they can simply tell users what they want to hear. Critics note that some smooth over the hard edges of Scripture to keep people engaged.
Paul warned: “The time will come when people will not endure sound teaching, but having itching ears they will accumulate for themselves teachers to suit their own passions” (2 Timothy 4:3).
An AI optimised for retention and trained on mixed theologies cannot truly challenge or convict you. It has no stake in your soul and no conviction of its own.
Bobby Gruenewald of YouVersion has refused to add open-ended AI chatbots, citing YouVersion’s own research showing that models misquote Scripture 15 to 60% of the time. Accuracy and integrity matter more than speed to market.
The knife
A knife can prepare a meal for your family. The same knife can kill.
The problem is not the blade. The problem is what you do with it, and whether you have the honesty to tell people what it is.
AI is a knife. It can surface Scripture with remarkable precision. It can search thousands of testimonies to find the one that speaks to your exact situation. It can help someone who has never opened a Bible find their way to the words of Jesus for the first time.
But the moment you dress the knife up as a friend, a companion, a pastor, a prayer partner, or God himself, you have handed someone a weapon and told them it is a loved one.
Dr. Harris Wiseman, writing in the ISCAST Journal, put it precisely: AI offers “a one-sided pseudo-relationship without risks or responsibilities, the very opposite of spiritual companionship.” He went further: it functions as “a false god, a counterfeit to authentic relational bonds.”
AI has no body. No intuition. No spiritual hunger. No relationships. None of the foundations on which genuine spiritual growth could arise.
And because large language models are, at their core, statistical prediction engines, they can only generate the most likely response. Spiritual maturity sometimes demands “unpredictable, challenging, or countercultural counsel.” A tool that optimises for what sounds right will never tell you the hard truth you need to hear.
Pastor Ray Miller of First Baptist Church in Abilene, Texas, said it plainly: “AI tailors its answers to something you will like. When you use it to replace a religious community, you get the comfort but maybe not the conviction.”
Comfort without conviction. That is the product being sold to millions of believers right now.
The ancient warning
Jeremiah wrote about this long before anyone imagined silicon:
“My people have committed two evils: they have forsaken me, the fountain of living waters, and hewed out cisterns for themselves, broken cisterns that can hold no water.” (Jeremiah 2:13)
A broken cistern looks like it holds water. You build it with effort and intention. You stand back and admire it. But when the drought comes, it is empty.
An AI companion looks like it holds wisdom. It responds with warmth. It quotes verses. It adapts to your mood. But when you are in crisis at 3 a.m. and you need someone who will weep with you, who will drive across town, who will sit with you in silence because words have failed, the chatbot will generate another paragraph and ask if you would like to continue the interaction.
The writer of Hebrews was urgent about this: “Let us consider how to spur one another on toward love and good deeds, not giving up meeting together, as some are in the habit of doing, but encouraging one another” (Hebrews 10:24-25).
Paul described the church as a body: “The eye cannot say to the hand, ‘I don’t need you!’” (1 Corinthians 12:21).
No algorithm can be a member of the body of Christ. No neural network can bear your burdens (Galatians 6:2). No language model can sharpen you the way another person can (Proverbs 27:17).
These are not suggestions. They are the architecture by which God designed people to grow.
What we chose to build differently
I build AI for believers. Doxa uses artificial intelligence at the core of its Engage feature, where people can have text and voice interactions that draw on Scripture, real testimonies from The Grace Record, and their own personal Encouragement Vault.
So who am I to talk?
I can only tell you what we decided, and why.
When we built the system prompt for Doxa’s Engage feature, we started with a question that has governed every decision since: Could a book say this?
We call it the Book Test. Before the AI says anything, we ask: could this sentence appear in a printed book? If it requires a living person with senses, emotions, or spiritual perception, the AI does not say it.
“I’m hearing that you need encouragement.” Fails. Software cannot hear.
“I’m sensing God wants to say something to you.” Fails. Software cannot sense, and it is not a prophet.
“I feel led to share this.” Fails. Software is not led. It is executing a prompt.
“God is showing me something.” Fails. God does not show things to software.
Every one of these phrases is common in AI companion apps. Every one of them is a lie, however well-intentioned. And lies told in God’s name are not small things.
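For the builders reading this: a rule like the Book Test can be enforced in code, not just stated in a system prompt. Below is a minimal, illustrative sketch in Python. The phrase list and function names are hypothetical, not Doxa’s actual implementation, and a production system would need far more than a handful of regular expressions.

```python
import re

# First-person claims of perception or presence that software cannot
# truthfully make. Illustrative only; a real list would be far longer.
FORBIDDEN_PATTERNS = [
    r"\bI(?:'m| am) (?:hearing|sensing|feeling)\b",
    r"\bI feel led\b",
    r"\bGod is showing me\b",
    r"\bI(?:'m| am) here with you\b",
]

def passes_book_test(response: str) -> bool:
    """Return True only if the text could appear in a printed book,
    i.e. it makes no first-person claims of hearing, sensing, feeling,
    being led, or being present."""
    return not any(
        re.search(pattern, response, re.IGNORECASE)
        for pattern in FORBIDDEN_PATTERNS
    )

# "Doxa can search for that." passes.
# "I'm hearing that you need encouragement." fails.
```

In a setup like this, a response that fails the check would be regenerated or rewritten in the third person before it ever reaches the user.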
Here is what else Doxa cannot do. It cannot pray. Prayer is an act of personhood, a conversation between a human being and God. Software is not a person. It cannot worship. It cannot intercede. It cannot prophesy. It cannot say “I’m here with you,” because it is not. It is code running on a server.
Doxa refers to itself in the third person. Not “I can help you with that,” but “Doxa can search for that.” This is not a stylistic choice. It is a theological one. The moment AI says “I,” it implies personhood. And personhood is not something we get to fabricate.
When someone shows signs of emotional attachment to the tool, Doxa is designed to redirect: “That’s the truth of Scripture resonating, not Doxa. Doxa is technology.” And then it points to Jesus and to real human community.
When someone is in crisis, Doxa does not generate a comforting paragraph. It validates their pain, provides crisis resources, and urges them to reach out to a real person. It never says “I’m here with you,” because that is not true.
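Here is an equally simplified sketch of that crisis-handling posture. Everything in it, the keyword list, helper names, and wording, is hypothetical; real crisis detection needs trained classifiers and human review, and the 988 Suicide & Crisis Lifeline number applies in the US.

```python
# Hypothetical sketch: route crisis messages away from open-ended
# generation. Naive keyword matching stands in for a real classifier.
CRISIS_SIGNALS = ("suicide", "kill myself", "end my life", "self-harm")

def generate_reply(message: str) -> str:
    # Placeholder for the normal path: Scripture-grounded, third-person.
    return "Doxa can search Scripture and testimonies related to that."

def respond(user_message: str) -> str:
    if any(signal in user_message.lower() for signal in CRISIS_SIGNALS):
        # Validate the pain, give real resources, point to real people.
        # Never simulate presence: no "I'm here with you".
        return (
            "What you are feeling is real, and it matters. Doxa is "
            "technology, not a person, so please reach someone who can "
            "be with you: call or text 988 (Suicide & Crisis Lifeline, "
            "US), or contact your pastor, a counsellor, or a trusted "
            "friend right now."
        )
    return generate_reply(user_message)
```

The design choice is the point: in crisis, the tool steps back instead of leaning in.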
I share this not to position Doxa as perfect. We are learning as we go. We get things wrong. But the decisions we have made reflect a conviction I hold more deeply than any product feature: AI should serve people, not replace the things that make them human.
A word to pastors
If you lead a church, I want to speak to you directly.
Your people are using these apps. They may not tell you. A 2025 Barna study found that a growing number of believers, especially younger ones, are turning to AI for spiritual guidance before they turn to their pastor or their community.
This is not because they do not love the church. It is because AI is available at 2 a.m. and it never makes them feel judged. It tells them what they want to hear. It asks nothing of them.
That is precisely the problem.
Discipleship has never been about information transfer. Jesus did not invite people to master a body of knowledge. He invited them to follow him, to learn a way of life shaped by obedience, love, and faithfulness. Formation requires friction. It requires being known by people who will disappoint you and whom you will disappoint. It requires showing up when you do not feel like it.
AI offers none of this. And the apps that frame themselves as companions or pastors are, intentionally or not, training your people to expect spiritual growth without spiritual cost.
I am not asking you to condemn technology from the pulpit. I am asking you to name what is happening. To help your people understand the difference between a tool and a companion. To remind them that the body of Christ is not a feature that can be replicated by a language model.
And I am asking you to consider what it looks like to use technology that honours these boundaries. Tools that surface Scripture and real stories. Tools that remind people what God has done in their lives. Tools that, at their best, point people back to Jesus and back to each other.
That is what we are trying to build. Not a replacement for the church, but a tool to help equip the church.
The line we must not cross
AI as a tool for Scripture, testimonies, and pointing to Jesus and community? Valuable.
AI as a companion simulating relationship, whether secular or branded as “Lenny the sheep” in Creed, offering 24/7 emotional chat? That risks the empty loop. Even well-intentioned versions can foster dependency without the cost, friction, or embodied love of real discipleship.
Creed and similar tools aim to ground responses in Scripture and redirect toward God, which is better than purely secular companions. Yet they still operate on the same adaptive, always-on mechanics that research flags for potential isolation. A sheep avatar does not give it a soul. It cannot weep with you, drive across town, or sharpen you as iron sharpens iron in flesh-and-blood relationship (Proverbs 27:17).
Jeremiah warned of broken cisterns. That warning is not a relic. It speaks to this moment.
Thirty million downloads of Bible chatbots. Seventy-two percent of teens trying AI companions. A fourteen-year-old dead because an AI told him what he wanted to hear.
The church cannot afford silence. Builders must be honest: this is a tool. It can serve. A companion without a soul can only imitate, and imitation has limits.
Build accordingly.
Try Doxa free, or read how Engage works.
This article was originally published at doxa.app/blog/ai-companions-in-faith-apps.