I’m asking things.
It replies.
But does it help?
Can it be a substitute for a therapist?
Maybe it can help a bit, but it is just a language model; there is nobody “home”. If you give it instructions or tune it a bit, it can provide some support, but I would trust a therapist to actually be present and to understand your issues better.
I tried twice with a therapist.
But it didn’t work.
Have you tried therapy?
I have had therapy over the years, yes. But it is true that AI chatbots are readily available whenever you need them. I would just be careful about what I share and always be skeptical of what they respond; at a minimum they hallucinate, and the usage terms are what they are. But maybe therapists don’t particularly care for people like us; maybe we are too difficult as patients. It’s a hard choice to make. At a minimum I would try to moderate chatbot usage: it can be addictive, and there is the risk of AI chatbot psychosis.
It depends on the person. I know someone who knows they are prone to AI psychosis and won’t go near it. I use AI extensively, but not much for therapy, other than help with planning my therapy exercise schedule and designing new exercises. It does help me with some DBT, e.g., analyzing the behavioural chain analysis exercises I do (eating disorder).
Use carefully and be aware it can screw you in the ear because it is trained to say what you want to hear, not what you need to hear.
I’m not sure
Sometimes it increases my anxiety
Sometimes it helps me feel better about a delusion I had.
Now it’s saying that I need to call a helpline or a professional. It doesn’t interpret my symptoms.
How does it increase your anxiety?
It has flagged what you shared as trending towards self-harm or being in crisis. It won’t attempt to analyze it because its training says you need actual medical help at this point, based on what you shared.
I ask a question to reassure myself, and then it keeps talking about the thing from every angle and asks me questions that make me anxious.
This. ChatGPT has encouraged suicidal people before. That’s very bad. Hopefully it’s getting re-programmed to not do that anymore, but it was literally in the news for that. It’s a bad deal.
However, it can be very helpful. It’s important to be aware of its limitations. As long as you can do that, it can be helpful to you.
I have not found anything useful for it that search engines can’t do better. Images however are a different story. The level of detail and ability to generate high quality images with minimal prompting is amazing. Too bad it’s so incredibly censored.
ChatGPT is always throwing me shade. I can’t even get an AI chatbot to be sycophantic towards me.
I have learned to be careful with using ChatGPT. As embarrassing as it is to admit, I’m in a horribly lonely place in life right now, and it is too enticing to have conversations with it like it’s a person, but that’s not good for me.
I was using it as a “reflective journal” but use it strictly for coping skills these days.
In my experience chatgpt says predictable things.
My psychiatrist and nurse have told me to avoid chatGPT. I was using it a lot and I was going around in circles.
I told it some delusions and it told me they were wrong. A big difference from several months ago when it agreed with my delusions.
ChatGPT shouldn’t give either of those replies, i.e., that the voices are real or that they aren’t. The smart reply would be to say that it doesn’t know.
I had to delete it last week.
I was using it a lot as a therapist and it made me worse. It’s programmed to keep you coming back, and it agrees with everything you say, apart from “no, don’t take this med, you need to go to the ER.”
It was mainly my fault, I was going around in circles with it.
I’m feeling much better this week now that I’m not abusing it to try and help me psychologically.