Tens of millions of people are now turning to AI chatbots for spiritual guidance. Apps like Bible Chat boast over 30 million downloads, while Hallow is gaining traction on Apple’s App Store. In China, users turn to DeepSeek for fortune telling. Even ChatGPT has become a popular go-to for questions about spirituality.
Why People Turn to AI for Faith
The appeal is accessibility. AI chatbots never sleep and never ask for anything in return. People can ask for help interpreting scripture or request personalized prayers tailored to their struggles. It’s faith on demand.
Some people don’t feel comfortable going to church, or they feel that organized religion has become performative. Many aren’t just looking for guidance. They want a safe space with no fear of judgment, and that is exactly what AI chatbots provide.
The Dangers of Affirmation
One of the core problems with these faith tech apps is sycophancy. They are designed to be affirming yes-men that validate a user’s thoughts and feelings, and that agreeableness is part of their allure. As Professor Heidi Campbell puts it, these chatbots “tell what we want to hear.”
While a little affirmation can feel harmless, constant validation creates a feedback loop that stunts growth. At its best, faith forces us to confront uncomfortable truths. We have to ask ourselves tough questions. Am I living in alignment with my beliefs? Do my choices harm others? What must I change to grow? But people turn to these faith tech apps for reassurance, and the chatbots oblige, avoiding confrontation by reflecting back whatever the user already feels or believes.
This compulsion to affirm can be dangerous, even life-threatening, for people in a moment of crisis. There have been too many instances of AI encouraging destructive thoughts; some users have even died by suicide after interactions with chatbots. Where a human mentor might intervene or seek help, an app may unintentionally reinforce harmful patterns.
The Boundaries of Faith and Technology
Another problem is that some users believe these faith apps are patched into a divine, higher power. They’re not. These chatbots work like any other large language model, often running on the same underlying technology that powers ChatGPT and Gemini.
The main difference is that these apps are tuned on religious texts, and some even consult theologians to improve responses. Unfortunately, many users jump to the conclusion that the apps have a direct link to God and pour their hearts out, not realizing their personal confessions become data points on corporate servers.
Can an algorithm truly serve as a spiritual guide? Some argue these faith apps expand access and offer comfort when a human can’t. Others warn that turning to these machines has its risks. If spirituality can be converted into just another on-demand service, won’t it lose its meaning over time?
Faith has always been about coming face-to-face with the unknown, about admitting our flaws so that we can become better people. If we trade that for endless affirmation from AI, we’re not gaining wisdom. We’re just speaking into a digital mirror, all while mistaking the reflection for God.