
chris.watt

New Member
Jun 24, 2025
2
Have you used, or are you using, public chat bots (ChatGPT, Grok, Gemini, DeepSeek) for research and planning?
How reliable are they?
Have you used dark web large language models (if they exist) with no censorship?
 
Forveleth

I knew I forgot to do something when I was 15...
Mar 26, 2024
2,196
Other than idea inspiration, I do not trust AI tools at all. They can be blatantly factually wrong. If I have to go back and verify what they say anyway, why not just do the research myself? People have made threads here posting things from chat bots that have been wrong or less than helpful. AI is just not the tool that the companies selling it claim it is.

Also, with something as critical as life and death, I would not leave anything to an unreliable AI. This site, with all its lived experiences, is a much better resource.
 
Chuunibyou

ghost possessing this body
Jun 11, 2025
11
I wouldn't trust a chatbot to be accurate, and in fact they may be dangerous or harmful in this context.

I say this as someone who does find some uses for them. personally I like to use them to help write emails or proofread my grammar. but ultimately what they're best at is just predicting which word would sound best after the last, building sentences that way. any "intelligence" they appear to have is just anthropomorphization of a fancy text predictor.

the actual information they give is at no point guaranteed to be correct. their accuracy depends on the specific model, of course, but all of them are capable of "hallucinating" facts or making things up if it's what "sounds best". there are already plenty of well-documented cases of chatbots saying insane things to people because of this.

if you want to research methods you're much better off doing it yourself, using search engines or resources found through SS.
 
TheVanishingPoint

Student
May 20, 2025
102
Chat bots can't give you reliable advice about this – they're all filtered.
They'll make things up, tell you lies, try to scare you with apocalyptic scenarios just to dissuade you.
You can't rely on them.
 