I'm wondering how safe it is to discuss serious mental health issues, like suicidal thoughts, with ChatGPT. I've heard scary stories about people getting banned just for mentioning suicide, and I'm worried about losing my only source of mental support. Has anyone had experience with this? Is it true that I could get in trouble just for being open about my feelings?
3 Answers
From what I can tell in the terms of use, there are definitely guardrails around discussing self-harm, mostly for liability reasons. But as long as you're not promoting harm, you should be fine. Just tread carefully.
I find ChatGPT really helpful, especially for talking through grief and addiction. Just think of it like a therapy session. You can open with something like, 'I need help with my feelings about XYZ.' I've shared some tough stuff and gotten genuine concern back instead of warnings.
I haven't seen anyone get banned for that. Just be thoughtful about how you frame things, though. Using creative wording or nicknames for your tough topics might help avoid raising any red flags.
Interesting! How do you usually phrase those sensitive topics?
That's my big concern. I'm worried that even mentioning certain words could get me banned!