As AI develops, more tools are being built and made widely available online across industries. One such industry is online spirituality, which has seen a surge of tools such as AI-generated tarot readings and astrological charts, all offering instant spiritual and personal guidance at the click of a button. Across platforms, from TikTok to chatbots and forums, AI is infiltrating spaces rooted in trust, intuition and human connection.
Surveys suggest that over 40% of adults have now used AI to seek some form of personal advice, while research shows that around a third engage with these tools on matters of mental health and wellbeing. These figures place spiritual AI tools within a broader shift towards automated guidance.
Supporters of these tools argue that they democratise access: they are free, anonymous and instantly available. For users who cannot afford a private reading, or who are hesitant to book one with another person, they offer a low-cost alternative with privacy and without judgement.
Yet as automation reshapes how people seek meaning in their lives, some practitioners warn that AI risks dehumanising an industry built on innate human emotional awareness and interpersonal exchange.
Tali, a queer 27-year-old Kosovo-Albanian practitioner based in Berlin, has worked professionally in the industry for several years. They work full-time with clients through consultations and digital content, and describe their practice as deeply interpersonal, shaped by years of exploration.
“I started studying astrology in 2016 during a period of conflict and questioning,” Tali says. “I was always drawn to esoteric topics, even though I had a lot of resistance to spirituality when I was younger.”
Tali argues that while AI can synthesise large volumes of astrological information, it lacks the emotional awareness required to truly support clients. “AI can give you a quick rundown and make astrology more accessible, but there is something about assessing someone’s emotional state, their consciousness, that can’t be automated.”
Concerns about the accuracy of these tools also persist. AI models are known to generate convincing but inaccurate information, a phenomenon researchers have dubbed “hallucination”. For users who are emotionally vulnerable, and therefore more susceptible to such misinformation, these fabrications can feed what some have described as a form of spiritual psychosis.
“My worry is about dehumanising the work,” Tali says. “Unless someone already has knowledge, they may not realise when AI gets things wrong.”
Still, Tali acknowledges the technology’s potential and appeal, particularly for people reluctant to seek therapy or human-led support: “For some, it’s a safer first step.”
Research into AI companionship suggests that while users can experience genuine emotional comfort, at least in the short term, machine empathy lacks the depth and accountability of human relationships.
Going forward, Tali envisions AI as an assistant rather than a replacement: a tool to support education, content creation and the large-scale distribution of information, not one-on-one spiritual support.
