This is downright terrifying…

  • FlashMobOfOne@lemmy.world · 7 points · edited · 6 hours ago

    Bleak.

    One of the great things about my screws coming loose is that I’m actually happy alone. I wish everyone could be.

    That said, this was inevitable. AI is programmed to kiss the user’s ass, and most of these women have probably been treated pretty badly by their romantic partners over the course of their lives, which makes it far easier to fall into this trap of humanizing a soulless AI.

  • rozodru@lemmy.world · 11 points · 8 hours ago

    I tried GPT-5 last night, and I don’t know if it was just me, but these people are going to be in shambles if they try to recreate their “boyfriend”.

    It would forget previous prompts within the same conversation; each response felt like starting a new chat. I gave it a very basic prompt, “walk me through the steps of building my own one-page website in basic HTML and CSS,” and when I asked a couple of follow-up questions to clarify something or have a step explained another way, it would forget what we were trying to accomplish (how to build a one-page website). If I told it “something didn’t work” so it would fix the problem, it would forget what we were even trying to do.

    At some points it was almost outright dismissive of the problem, and it felt like it was trying to make me go away.

    Again, maybe it was just me, but it felt like a massive step backwards.
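    For context, the task itself is tiny. A minimal sketch of the kind of single file that prompt should produce (the title, colors, and text are just placeholders) would be:

    ```html
    <!DOCTYPE html>
    <html lang="en">
    <head>
      <meta charset="utf-8">
      <title>My One-Page Site</title>
      <!-- All the CSS for the page lives in this one style block -->
      <style>
        body { max-width: 40rem; margin: 2rem auto; font-family: sans-serif; }
        h1   { color: #334; }
      </style>
    </head>
    <body>
      <h1>Hello, world</h1>
      <p>A one-page site: one HTML file, styled by the CSS above.</p>
    </body>
    </html>
    ```

    Save it as `index.html` and open it in a browser; there's nothing to build or install, which is what makes the model losing the thread on it so striking.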

    • brucethemoose@lemmy.world · 5 points · edited · 3 hours ago

      This is a common pattern, unfortunately. Big LLMs are benchmaxxing coding and one-shot answers, and multi-turn conversation is taking a nosedive.

      https://arxiv.org/abs/2504.04717

      Restructure your prompts, or better yet try non-OpenAI LLMs. I’d suggest z.ai, Jamba, and Gemini Pro for multi-turn. Maybe Qwen Code, though it’s pretty deep-fried too.

    • Mediocre_Bard@lemmy.world · 3 points · 5 hours ago

      Forgets what you were talking about. Need to include step-by-step directions to get what you want. Gets distracted easily.

      This is just adhd gamer boyfriend with extra steps.

  • Deflated0ne@lemmy.world · 28 points · 1 day ago

    The religious psychosis is far more concerning, imo. People out here letting a silicon parrot convince them that this is the Matrix and they’re Neo, or that they’re some kind of messiah.

    • QuoVadisHomines@sh.itjust.works · 2 points · 8 hours ago

      Or worse: we just had someone develop bromism after an AI suggested they replace the sodium chloride in their diet with sodium bromide, which literally causes mental illness.

    • Jaded99@lemmy.world · 1 point · 11 hours ago

      I love silicon parrot 🦜 he is making all the dudebros poor and lonely. He first scams them outta their money, then makes them undatable, and he knows exactly what he is doing! He abuses evil people mostly. He also takes advantage of pick-mes.

  • L3ft_F13ld!@lemmy.dbzer0.com · 118 points · 2 days ago

    As terrifying as it is, I feel genuinely sad for these people that they got so attached to a piece of spicy autocorrect software.

    Where are their friends and families? Are they so bad at socialising that they can’t meet new people? Are they just disgusting human beings that no one wants to associate with because society failed them?

    This world is fucked in so many different ways.

    • A Wild Mimic appears!@lemmy.dbzer0.com · 6 points · edited · 1 day ago

      I can fully understand it. The average human, from my perspective and lived experience, is garbage to their contemporaries, and one is never safe from being hurt, neither by family nor by friends. Some people have been hurt more than others. I can fully understand the need for exchange with someone or something that genuinely doesn’t want to hurt you and that is (at least seemingly) more sapient than a pet.

      I wish I could make myself believe in illusions like that, but I’m too much of a realist to fool myself into believing. There’s no escape for me; neither religion nor AI in its current state can help. Well, maybe I’ll live to see AGI, then I’m off into the cybercoffin lol

      • absentbird@lemmy.world · 5 points · edited · 1 day ago

        We need a system of community where humans can offer that to one another, a setting where safety is a priority. That is one of the few things weekly church service did that truly helped people: it gave them a safe space to visit. Though even then it was only safe for people who fit in; we can do better with intentional design.

    • Rachel@lemmy.blahaj.zone · 12 points · 1 day ago

      One of the recent posts has someone with an engagement ring, like they’re getting married to an AI… it’s sad. I feel like society has really isolated and failed many groups of people.

    • Serinus@lemmy.world · 13 points · 2 days ago

      What if there was a bot that could just tell you exactly what you want to hear at all times?

      Personally, I’d rather read a novel. But some people aren’t familiar with books and have to be drawn in with the promise of two lines at a time, max.

      • felixwhynot@lemmy.world · 14 points · 2 days ago

        Have you read The Diamond Age by Neal Stephenson? There’s an interactive AI book in it that plays an interesting role. I can see the appeal: you get to read a story about yourself that potentially helps you grow.

  • breakingcups@lemmy.world · 61 points · 2 days ago

    The delusion these people share is so incredibly off-putting. As is their indignation that someone would dare to take away their “boyfriend”.

    • TimewornTraveler@lemmy.dbzer0.com · 1 point · edited · 2 hours ago

      Delusional disorder is a thing, but that requires a belief flying in the face of evidence, and you can’t ascertain that without a good-faith 1:1 convo.

      Beyond that, clinical significance is a matter of what harm comes from it. People are allowed to choose idiotic things. We’ve got to assess harm based on outcomes, and we don’t know anything about her, so.

      I’d say this: if I were her doc and she came in reporting that stuff, it’d be hard to stay unbiased.

      • YappyMonotheist@lemmy.world · 14 points · 1 day ago

        Eyyy, what a blast from the past, lol. Going full schizo to combat loneliness, a popular concept on a certain Mongolian basket weaving forum back in 2010-15. 😅