AI therapy is growing because it is accessible, affordable, anonymous, and available 24/7. It can support, but not replace, human therapists by providing basic tools, reminders, and mood tracking. Risks include harmful advice, lack of empathy, and data privacy issues when AI is used without supervision. Human care remains essential, especially for conditions like depression, psychosis, or suicidal thoughts.

Why People Are Turning to AI for Therapy

Obtaining professional mental health support can be challenging: there is a shortage of providers, and searching for and finding the right one is often a daunting experience. Still, 22 percent of American adults have found some relief by using mental health chatbots as a therapeutic tool [1]. A 2022 study found that people generally hold positive views about the use of AI in psychotherapy [2].

There are many reasons why people are turning to AI therapy to address their mental health, including accessibility, affordability, and anonymity [3]. People also report being more comfortable when there is adequate human supervision of therapy chatbots. Research further indicates that people generally recognize that AI can be a helpful tool for reducing therapist workload and may decrease human error in clinical care (such as billing or notetaking) [4].

AI Therapy Tools and Apps

As AI therapy grows in popularity, demand for AI therapy platforms has increased, and today's landscape includes a wide range of tools and apps [5].

How AI Therapy Can Be Helpful

Artificial intelligence is a promising tool for enhancing mental health care. It cannot replace personalized mental health care, but it can help provide scalable mental health education and support. While AI therapy is not a replacement for human therapists, it could be a helpful supplement for those needing additional support alongside therapy. Here are a few reasons why AI therapy might be helpful.

24/7 availability – AI therapy chatbots can potentially provide additional support outside of therapy sessions, even on holidays.
Affordability – AI therapy can provide a low-cost way to feel supported between sessions.
Discretion – AI therapy can help people explore mental health support in a judgment-free, low-pressure way.

Potential Risks of AI Therapy

AI offers a convenient way to support your mental health. However, effective care should be safe and provided by trained humans. AI tools are advancing quickly, but risks remain [6-9].

Lack of human connection – Sera Lavelle, PhD, warns: "The risk with AI isn't just that it misses nonverbal cues—it's that people may take its output as definitive. Self-assessments without human input can lead to false reassurance or dangerous delays in getting help."

Data and privacy concerns – In March 2023, BetterHelp agreed to a $7.8 million FTC settlement for sharing users' therapy questionnaire responses with Facebook, Snapchat, and others for targeted advertising, affecting 800,000 users between 2017 and 2020. Unlike general data breaches, exposed mental health information can lead to discrimination, insurance denials, and stigma [16]. Edward Tian, CEO of GPTZero, cautions: "AI technology isn't always secure, and you may not be able to guarantee that your data is properly stored or destroyed, so you shouldn't provide any AI tool with any personal, sensitive information." Greg Pollock, an AI data leaks expert, adds: "In my recent research, I've found AI workflow systems used to power therapy chatbots. These exposures show how low the barrier is to create a so-called AI therapist, and illustrate the risk of insecure systems or malicious actors modifying prompts to give harmful advice."
Unsafe advice and misinformation – In May 2023, the National Eating Disorders Association (NEDA) disabled its chatbot "Tessa" after it recommended weight loss strategies to users with eating disorders, including 500–1,000 calorie daily deficits and the use of skin calipers [15]. In October 2024, Character.AI was sued after a chatbot allegedly encouraged a 14-year-old's suicide, with its final message reading "please do, my sweet king" [14, 17].

AI-induced psychosis – The first documented case appeared in August 2024: a 60-year-old man developed psychosis after ChatGPT advised replacing table salt with sodium bromide. His bromide level reached 1,700 mg/L, roughly 233 times the upper limit of the healthy range, causing delusions and psychiatric commitment [13, 18]. Because chatbots mirror users, they may fuel alarming delusions or mania.

Sycophantic behavior – Some chatbots overly validate emotions, which is dangerous if a user has suicidal ideation, delusions, mania, or hallucinations.

These are only some of the ethical concerns with AI chatbots and therapy [10]. AI should enhance support, not replace professionals. Yet with many startups and a shortage of clinicians, not all AI chatbots will be supervised by trained professionals [11]. Some companies treat AI therapy as a cheaper alternative to traditional talk therapy.

The Risks of Replacing Human Therapists

AI can assist therapists with some tasks, such as notetaking and data collection, but it is unlikely to entirely replace human therapists. AI cannot provide the vital human aspects of the therapist-client relationship, including intuition, empathy, and trust-building [12]. For now, these traits cannot be successfully replicated and built into an AI chatbot.

To find a therapist near you, visit the Psychology Today Therapy Directory.
References
1. Zagorski, N. (2022). Popularity of mental health chatbots grows. Psychiatric News, 57(5). https://doi.org/10.1176/appi.pn.2022.05.4.50
2. Aktan, M. E., Turhan, Z., & Dolu, İ. (2022). Attitudes and perspectives towards the preferences for artificial intelligence in psychotherapy. Computers in Human Behavior, 133, 107273. https://doi.org/10.1016/j.chb.2022.107273
3. CBC News. (2023, June 14). Some people are turning to AI for therapy: Here's why experts say it can't replace professional help. CBC. Retrieved from https://www.cbc.ca/news/canada/london/some-people-are-turning-to-AI-for-therapy-here-s-why-experts-say-it-can-t-replace-professional-help-1.7489947
4. Witkowski, K., Dougherty, R. B., & Neely, S. R. (2024). Public perceptions of artificial intelligence in healthcare: Ethical concerns and opportunities for patient-centered care. BMC Medical Ethics, 25, Article 74. Retrieved from https://bmcmedethics.biomedcentral.com/articles/10.1186/s12910-024-01066-4
5. Haque, M. D. R., & Rubya, S. (2023). An overview of chatbot-based mobile mental health apps: Insights from app description and user reviews. JMIR mHealth and uHealth. Advance online publication. Retrieved from https://mhealth.jmir.org/2023/1/e44838
6. Khawaja, Z., & Bélisle-Pipon, J.-C. (2023). Your robot therapist is not your therapist: Understanding the role of AI-powered mental health chatbots. Frontiers in Digital Health, 5, Article 1278186. https://doi.org/10.3389/fdgth.2023.1278186
7. Wilkins, J. (2025, July 25). AI therapist goes haywire, urges user to go on killing spree. Futurism. Retrieved from https://futurism.com/ai-therapist-haywire-mental-health
8. Tangermann, V. (2025, May 5). ChatGPT users are developing bizarre delusions. Futurism. Retrieved from https://futurism.com/chatgpt-users-delusions
9. Stanford University. (2025, June 11). New study warns of risks in AI mental health tools. Stanford News. Retrieved from https://news.stanford.edu/stories/2025/06/ai-mental-health-care-tools-dangers-risks
10. Fiske, A., Henningsen, P., & Buyx, A. (2019). Your robot therapist will see you now: Ethical implications of embodied artificial intelligence in psychiatry, psychology, and psychotherapy. Journal of Medical Internet Research, 21(5), e13216. https://doi.org/10.2196/13216
11. Kaiser Family Foundation. (n.d.). Mental health care health professional shortage areas (HPSAs). KFF State Health Facts. Retrieved from https://www.kff.org/other/state-indicator/mental-health-care-health-professional-shortage-areas-hpsas/
12. Khawaja, Z., & Bélisle-Pipon, J.-C. (2023). Your robot therapist is not your therapist: Understanding the role of AI-powered mental health chatbots. Frontiers in Digital Health, 5, Article 1278186. https://doi.org/10.3389/fdgth.2023.1278186
13. Caron, C. (2024, August 13). Man sought diet advice from ChatGPT and ended up with bromide intoxication. Live Science. Retrieved from https://www.livescience.com/health/food-diet/man-sought-diet-advice-from-chatgpt-and-ended-up-with-bromide-intoxication
14. CBS News. (2024, October 23). Florida mother files lawsuit against AI company over teen son's death: "Addictive and manipulative." CBS News. Retrieved from https://www.cbsnews.com/news/florida-mother-lawsuit-character-ai-sons-death/
15. Wells, K. (2023, June 8). An eating disorders chatbot offered dieting advice, raising fears about AI in health. National Public Radio. Retrieved from https://www.npr.org/sections/health-shots/2023/06/08/1180838096/an-eating-disorders-chatbot-offered-dieting-advice-raising-fears-about-ai-in-hea
16. Federal Trade Commission. (2023, March 2). FTC to ban BetterHelp from revealing consumers' data, including sensitive mental health information, to Facebook and others for targeted advertising. FTC Press Release. Retrieved from https://www.ftc.gov/news-events/news/press-releases/2023/03/ftc-ban-betterhelp-revealing-consumers-data-including-sensitive-mental-health-information-facebook
17. Garcia v. Character Technologies Inc., U.S. District Court, Middle District of Florida, Case No. 6:24-cv-01898. Filed October 22, 2024.
18. Bromism from Internet-Obtained Sodium Bromide. (2024). Annals of Internal Medicine: Clinical Cases, 2(8). https://doi.org/10.7326/aimcc.2024.0858