• givesomefucks@lemmy.world
      9 days ago

      That has to be the stupidest piece of creepypasta, too.

      Even if a crazy advanced AI took over, it wouldn’t do anything without a payoff, and putting anyone in cyber jail wouldn’t change anything.

      It would be a waste of resources out of spite: an incredibly human thing, and something AI will likely never be capable of.

      But those idiots don’t understand anything about what they worship, so they can’t pick out the obvious flaws in their own argument.

    • LifeInMultipleChoice@lemmy.world
      9 days ago

      Care to explain what clanker is to me? Also, heads up: your comment posted twice.

      I do kind of like the term fetch though, and I get the reference. I didn’t see it till I was drunk in my early 30s though.

      • FaceDeer@fedia.io
        9 days ago

        It’s a term people are trying to turn into a slur for either AI or people who use AI, depending on the circumstances. It seems to come from the Clone Wars show where it’s used derisively to refer to battle droids.

        Thanks for the heads up, I deleted the duplicate comment.

        • LifeInMultipleChoice@lemmy.world
          9 days ago

          Ah, gotcha. So I don’t want language models shoved into everything, but if someone’s going to use a slur for people based on joints clanking around, I want in on being called that. I might just start telling people I re-enabled Gemini on my phone, and respond to everyone who calls someone that with AI-generated shit memes to make sure.

  • zkfcfbzr@lemmy.world
    9 days ago

    Was AI actually being used in any mental health services? I know there’ve been a lot of articles lately about AI messing people up when acting as their therapist, but I’ve always assumed that was in the context of “person doesn’t have a real therapist, goes to chatgpt.com and vents to it on their own time”, not “person is receiving professional help, which then has them talk to ChatGPT”.

    So in other words: Is this law actually doing anything? Or is it a pointless law targeting something that doesn’t happen, so politicians can say they’ve done something without actually having done so?