Still, what are they gonna do to a million suicidal people besides ignore them entirely?
Well, AI therapy is more likely to harm their mental health, up to encouraging suicide (as certain cases have already shown).
Real therapy isn’t always better. At least there you can get drugs. But neither is guaranteed to make life better, and for a lot of them, life isn’t going to get better anyway.
Are you comparing a professional to a text generator?
Have you ever had ineffective professional therapy?
Are you still trying to compare medical treatment with generating text?
Compare, as in equal? No. You can’t “game” a person (usually) like you can game an AI.
Now, answer my question.
No, comparing as in comparing apples and oranges.
Answer my question, or just admit you refuse to engage in conversation and we can depart.
Real therapy is definitely better than an AI. That said, AIs will never encourage self-harm without significant gaming.
AI “therapy” can be very effective without the gaming, but the problem is most people want it to tell them what they want to hear. Real therapy is not “fun” because a therapist will challenge you on your bullshit and not let you shape the conversation.
I find it does a pretty good job with pro and con lists, listing out several options, and taking situations and reframing them. I have found it very useful, but I have learned not to manipulate it, or its advice just becomes me convincing myself of a thing.
I agree, and to the comment above you, it’s not because it’s guaranteed to reduce symptoms. There are many ways that talking with another person is good for us.
The keyword here is “person”.
Suicide is big business. There’s infrastructure readily available to reap financial rewards from the activity, at least in the US.
Over the long term I have significant hopes for AI talk therapy, at least for some uses. Two opportunities stand out as having potential:
In some cases I think people will talk to a soulless robot more freely than to a human professional.
Machine learning systems are good at pattern recognition, and that is one component of diagnosis. This meta-analysis found that LLMs performed about as accurately as physicians, with the exception of expert-level specialists. Given time, I think it’s undeniable that there is potential here.
More so from corporate proprietary ones, no? At least I hope those are the only cases. The open-source ones suggest genuinely useful approaches that the proprietary ones don’t. I don’t rely on open-source AI myself, but they are definitely better.
Advertise drugs to them, perhaps, or some other way of taking advantage. If this sort of data ends up in the hands of an ad network, that is.
It’s never the drugs I want though :(
No, no. They want repeat customers!
Unless they sell lifetime deals. Probably cheap on the warranty/support side if the drug doesn’t work 🤔
I feel like, if that’s a million people wanting to die… they could, say, join a revolution to take back our free government? Or make it more free? Shower thoughts.
Strap explosives to their chests and send them to their competitors?
Take that, Grok!!
Convince each one that they alone are the chosen one to assassinate Grok, and that this mission is all that matters to give their lives meaning.