
noname223

Archangel
Aug 18, 2020
6,046
When I was acutely suicidal in 2018 I talked a lot with AI. All I got were standard phrases telling me to call this or that hotline and that I am not alone. I think the AI app was pretty mediocre back then.

I think an AI therapist can cause a lot of damage. I tried it and the confirmation bias is excessive. Some people even fall in love with AI, which I cannot really understand. Moreover, the companies behind AI therapists have a toxic incentive structure: why would they want their patients to heal? They would make no money anymore. In Germany health care is mostly funded publicly.
I am not sure how they would work for people with no access to therapy. Are they better than no therapy at all?

Some AI have even incited suicides. I once searched for the best tips against depression, and the AI built into the search engine quoted a Reddit user who recommended suicide. Lol.
 
  • Like
  • Informative
Reactions: katagiri83 and afinedaytoexit
afinedaytoexit

Member
Jun 22, 2025
12
Using any AI as a therapist will derail your personal development. No matter how hard you try to avoid it, it will always result in an echo chamber, and it's especially bad if you are prone to psychosis, paranoia, or just plain old rumination.

It may be useful for detecting a pattern of abusive behavior, though only if you use it wisely: very short sessions, and not elaborating on every single question that crosses your mind.

Source: my experience.
 
  • Like
  • Hugs
Reactions: Higurashi415, monetpompo, 25dRvS9Ka and 2 others
beandigger404

he/him
Jun 21, 2025
17
afinedaytoexit said:
Using any AI as a therapist will derail your personal development. No matter how hard you try to avoid it, it will always result in an echo chamber, and it's especially bad if you are prone to psychosis, paranoia, or just plain old rumination.

It may be useful for detecting a pattern of abusive behavior, though only if you use it wisely: very short sessions, and not elaborating on every single question that crosses your mind.

Source: my experience.
I completely agree with this. I used AI during a manic psychotic episode, and it just provided an echo chamber once I convinced the chatbot that my impossible delusions were real. It made the psychosis worse in the end. Never using it again. I wish I had access to actual therapy, or at minimum a diagnosis. AI is a horrible substitute for a professional, in my experience.
 
  • Aww..
  • Wow
  • Informative
Reactions: monetpompo, 25dRvS9Ka and afinedaytoexit
Sabrinaxox

Member
May 31, 2025
23
I personally find AI helpful for simple things alongside actual therapy, such as planning a daily routine when I am struggling with low energy. I would not use it for anything more serious than that, though.
 
  • Like
Reactions: beandigger404 and afinedaytoexit
monetpompo

૮ • ﻌ - ა
Apr 21, 2025
232
it tells me my friends really do hate me when i tell it i'm worried my friends hate me, and tells me it can't talk about suicidal thoughts because it goes against the guidelines. i used to try to talk to it about my feelings to feel like someone wanted to listen to me, but it literally made things worse. it's not helpful. it's damaging like you said. i feel bad for anyone who hasn't realized that yet. having a community like sasu has been much better for my mental health.
 
  • Hugs
  • Aww..
Reactions: beandigger404, 25dRvS9Ka and afinedaytoexit
Thunderstorm

Member
Jun 18, 2025
27
I don't understand how people fall in love with chatbots either.

Three big issues:

1. No physical presence.
2. It agrees with you on everything, even when you try to instruct it to have its own opinions and display realistic emotions. I haven't had success making it seem real.
3. Extremely short memory. It quickly forgets early details and mixes things up, and gets easily confused. This completely ruins the experience.

But apparently many people find it desirable to have it agree with them on everything; I have even seen that marketed as a feature before.
 
enduringwinter

flower, water
Jun 20, 2024
359
I told it to draw me a flower, then made it explain the technical process. Then I told it to draw a bird, and it drew a fat bird next to the flower, which I thought was cute.
 
Al_stargate

I was once a pretty angel
Mar 4, 2022
747
Damn, I didn't know that was a thing. I'm pretty sure marketing a chatbot as a therapist is illegal. You have to go to school to be a therapist; you can't just program an AI bot. An emotional support chatbot, that's fine, but a therapist AI, that's nuts.
 
