The fact that this is considered a viable option because we live in a country with a government that refuses to actually provide for its people, is painfully depressing. AI as your therapist… seriously what the fuck is this timeline? I work in tech and the people constantly blowing AI hot air are not folks you want in charge of the tools for your therapy and wellbeing.
This assumes all human therapists are ethical and never make mistakes, and that all of their offices, notes, and data systems are secure too. All security is porous.
No, it doesn’t.
You distrust AI therapists.
You distrust bad therapists.
You do trust good therapists. See? Works just fine.
That assumes you can tell, and that the best people and processes are flawless, which is not true by a wide margin.
are flawless
I cannot help you. You are having a conversation in your head that no one else here is a part of. You gotta come back down to Earth, man.
I mean… Yeah. That’s why they went to a therapist…
By AI head specialist.
No one can afford $2,000 bread. Well, maybe in the Great Depression.
Depending on who you are, an AI chat might just be a less tedious journal, which can obviously be better than not journaling. I still find it sorta weird too, but the ridicule is unfounded imo.
From a privacy perspective it’s likely terrible/terrifying, but given that the majority of people are already transparent for the most part, they are at least getting some real-world value for their increased transparency.
I imagine at least some of that ridicule stems from this being kind of the exact wrong answer to the big, societal “why is everyone so lonely now?” question.
It’s a bit like watching a pack-a-day smoker buy lozenges for their throat or something, as if you’re not supposed to think about the cancer.
Not sure…
1. An AI therapist can already easily handle general good mental health advice, such as reducing cognitive load, perspective shifts, alternative methodologies, and education on standard mental needs, processes, and whatever low-level stuff we can benefit from.
2. Hooman therapists are a coin-toss. Most are complete crap and build their business on archaic and/or wrong theories and personal ideology/feelings.
3. Whatever flaws AI has now are going away really, really fast.
Hooman therapists cost a lot of money, and a shitload of people won’t get any help at all without AI.
So, I think it is fine. The potential damage is far less than no help at all. Just use a little common sense and don’t take anything as gospel, just as when we see hooman therapists.
I think this is true, and until we have easily accessible and free mental health services, it is the next best option and far more likely to do good than harm.
Removed by mod
I think passing judgement on people who are trying to get mental health assistance within their means makes you far more of a fucking loser.
Removed by mod
You simply don’t get it: there are shitloads of people who are struggling with their mental health and for various reasons cannot find and/or afford a therapist. So yes, better to have an AI therapist than no therapist.
Removed by mod
Self-hosted AI is 1000% more confidential than a human therapist, but stuff like ChatGPT? Yeah, stay away from those.
It might be 1000% more confidential, but is it effective? Anecdotal evidence doesn’t count. For all we know, AI therapy could be actively harmful for certain conditions. I’m not sure there are any published studies on this.
There was actually one published a few days ago that concluded that it can be effective:
Participants with depression experienced a 51% reduction in symptoms, the best result in the study. Those with anxiety experienced a 31% reduction, and those at risk for eating disorders saw a 19% reduction in concerns about body image and weight.
However the person who did the study shares your concerns:
I asked Heinz if he thinks the results validate the burgeoning industry of AI therapy sites. “Quite the opposite,” he says, cautioning that most don’t appear to train their models on evidence-based practices like cognitive behavioral therapy, and they likely don’t employ a team of trained researchers to monitor interactions. “I have a lot of concerns about the industry and how fast we’re moving without really kind of evaluating this,” he adds.
They also did another article about the difficulties and pitfalls of making these things.
even that is questionable professionally
Thank you very much-o, Doctor Roboto
Please go on. I’m not sure I understand you fully.
Shrink-ROM from Terry Gilliam’s Zero Theorem.
deleted by creator