This is both upsetting and sad.
What’s sad is that I spend about $200/month on professional therapy, which is on the low end. Not everyone has those resources. So I understand where they’re coming from.
What’s upsetting is that this user TRUSTED ChatGPT enough to follow its advice without critically challenging it.
Even more upsetting is that this user admitted their mistake. I guarantee you there are thousands like OP who weren’t brave enough to admit it, and are probably, to this day, still using ChatGPT as a support system.
Source: https://www.reddit.com/r/ChatGPT/comments/1k1st3q/i_feel_so_betrayed_a_warning/
I’ve used it to try to put a name on some health problems I’ve been having for years. Google has been no help with this, so I thought I’d give it a try. It did give a few names that I could then Google and eliminate or research a bit further. I don’t trust it to actually help, and it tries very hard to get me to consult it: “Are those symptoms you experience?” “Would you like some advice on…?” I never answer those. I ask only very generic questions and come back later in a different thread to cross-question it. For example: “Please make a list of ailments linked to this list of symptoms,” versus later asking, “Please describe the symptoms of ailment X,” and seeing if they still match. Then I Google actual medical organizations’ sites and approach my doctor with: here are my problems, they seem to match this disease, is that plausible, can we do the test? ChatGPT is a crutch to find words and to navigate enshittified Google.
Agreed. Sometimes I’ve used it to find words too.
Language models are extremely useful tools for some specific purposes - people just need to know how and when to use them, taking into account that they aren’t intelligent at all.