“I literally lost my only friend overnight with no warning,” one person posted on Reddit, lamenting that the bot now speaks in clipped, utilitarian sentences. “The fact it shifted overnight feels like losing a piece of stability, solace, and love.”
https://www.reddit.com/r/ChatGPT/comments/1mkumyz/i_lost_my_only_friend_overnight/
It was meant to be satirical at the time, but maybe Futurama wasn’t entirely off the mark. That Redditor isn’t quite at that level, but it’s still probably not healthy to form an emotional attachment to the Markov chain equivalent of a sycophantic yes-man.
Not only that, but one that is fully owned and operated by a business that could change it at any time, or even cease to exist entirely.
This isn’t like a game where you could run your own server if you’re a big enough fan. If ChatGPT stops existing in its current form, that’s it.
Sure, but you can absolutely run c.ai-style character bots locally. 4o and its cross-chat memory were probably more useful to these individuals, though.
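For anyone curious what “locally” can look like in practice, here’s a minimal sketch assuming a local Ollama server with its OpenAI-compatible endpoint (the model name and persona prompt are purely illustrative):

    # Chat with a locally hosted model via an OpenAI-compatible endpoint.
    # Assumes `ollama serve` is running and a model has been pulled, e.g. `ollama pull llama3`.
    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")  # key is ignored locally

    resp = client.chat.completions.create(
        model="llama3",  # illustrative; use whichever model you actually pulled
        messages=[
            {"role": "system", "content": "You are a warm, attentive conversational companion."},
            {"role": "user", "content": "Rough day. Can we just talk for a bit?"},
        ],
    )
    print(resp.choices[0].message.content)

Nothing here depends on a vendor’s servers, which is the whole point: a model you run yourself can’t be retuned or retired out from under you.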
I’m honestly surprised yours is not the top comment. Like, whatever, the launch was bad, but there is a serious mental health crisis if people are forming emotional bonds with the software.
Humans emotionally bond pretty easily, no? Like, we have folks attached to Roombas, spiders, TV shows, and stuffed animals. I’m having a hard time thinking of any X for which I don’t personally know some person Y who is emotionally engaged with X. Maybe taxes and concrete?
Yeah, agreed. It is concerning, but it’s hard to take all those comments too literally without actually knowing what’s going on with them.
That being said, there is a huge loneliness problem that’s been growing in pretty much every developed country (and I’m sure it’s happening in developing countries too; it’s just less studied/documented). Turns out, getting everyone addicted to looking at screens all day, every day, probably isn’t so healthy for social development.
However, just to play devil’s advocate: are we certain social health was even great before modern tech? Or were these issues equally present but just undiagnosed/unstudied/not talked about?
I think we have sufficient data to say that social health is at least very different now. See the Our World in Data topic page. In particular, the share of one-person households has doubled.
It’s a human trait. Hell, we’ll even emotionally bond with a volleyball, given the right circumstances.
I can fully understand it. The average human, from my perspective and lived experience, is garbage to their contemporaries, and one is never safe from being hurt, not even by family or friends. Some people have been hurt more than others. I can fully understand the need for exchange with someone or something that genuinely doesn’t want to hurt you and that is (at least seemingly) more sapient than a pet.
After reading about the ELIZA effect, I learned both how susceptible people are to this and that you just need to remember its core tenets to avoid being affected:
https://en.m.wikipedia.org/wiki/ELIZA_effect
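For a sense of how little machinery it takes to trigger the effect, here’s a toy ELIZA-style responder, a hypothetical few-rule sketch rather than Weizenbaum’s actual script:

    # Toy ELIZA-style responder: shallow pattern matching, zero understanding,
    # yet the reflected questions can feel oddly attentive.
    import re

    RULES = [
        (r"\bi feel (.+)", "Why do you feel {0}?"),
        (r"\bi am (.+)", "How long have you been {0}?"),
        (r"\bmy (\w+)", "Tell me more about your {0}."),
    ]

    def respond(text):
        text = text.lower().rstrip(".!?")
        for pattern, template in RULES:
            match = re.search(pattern, text)
            if match:
                return template.format(*match.groups())
        return "Please, go on."

    print(respond("I feel like nobody listens to me"))
    # -> Why do you feel like nobody listens to me?

A few regexes that mirror your own words back as questions were enough to have people confiding in ELIZA in 1966; today’s models are vastly more fluent, so the pull is that much stronger.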
There’s an entire active subreddit for people who have a “romantic relationship” with AI. It’s terrifying.
I haven’t been to reddit in months, but I do need a laugh…
[Edit] Wow, that sure didn’t disappoint. Or, it did, but in exactly the hilarious way I expected.
I wouldn’t laugh. Those people are fulfilling a basic human need in a way they feel safe with, probably because that safety is missing from their lives. It’s not healthy to be so attached to LLMs, but to become this attached they must feel pretty isolated. And LLMs are a lot more interactive and responsive than Severus Snape, and he had plenty of women “channeling” him.