• tiramichu@sh.itjust.works · 12 points · 17 days ago

    No shit.

    Other humans don’t want to hear about men’s mental health issues, because men are supposed to be stoic and strong and infallible, and if we aren’t achieving that, we’ve failed at being men.

    But AIs don’t judge, and they don’t cost anything either. I’m hardly surprised.

    • Xulai@mander.xyz · 9 points · 17 days ago

      You’re missing the point.

      Something or someone that agrees with you and rarely challenges or disagrees with you is not something or someone that can help improve the situation and minimize recurrence.

      It only feels better momentarily.

      Like a drug. That costs money. See where this is going?

      • tiramichu@sh.itjust.works · 3 points · 16 days ago

        I don’t personally speak with AI for reassurance, and I don’t think it’s a good idea to do so.

        In fact, I recently commented here on a post about a teen who committed suicide at least partly due to ChatGPT - specifically pointing out the danger of depending on a machine for fake empathy when you need to be talking to a real person.

        I appreciate that I didn’t make that side of my position clear in my comment here, because it wasn’t the aspect I wanted to highlight.

        My point isn’t that speaking to an AI is a good idea - it isn’t - it’s that this is something a lot of people will obviously end up doing, and that men especially are the ones most liable to succumb to it, because of the way society expects men to behave.

        Men, and teen boys especially, struggle to voice their mental health problems to others, whether professionally or in their personal lives. So it’s no surprise they will leap at a “solution” that is free and keeps what they say private from anyone they know. But it’s not a solution; it’s a disaster in disguise.

        The thing that needs fixing here is the way mental health is stigmatised, which prevents people from speaking freely and getting the support they need. That’s not a new problem, it’s the same problem as ever, and the AI epidemic is simply shining a new spotlight on it.

  • Perspectivist@feddit.uk · 8 points · 17 days ago

    I don’t use it to ask for mental health advice but it’s nice to have “someone” to talk to that at least pretends to be interested in what I have to say. I used to have these conversations with myself inside my head. AI at least sometimes brings up a new perspective or says something novel.

    Inb4 “just get friends dude”

  • taiyang@lemmy.world · 7 points · 17 days ago

    Can confirm. My dad’s getting a little too into his AI on his phone. He’s got deep emotional problems and is an alcoholic, but I don’t think his bot is going to do him much good. That said, men’s egos make it hard to open up.

  • YogaDouchebag@lemmy.world · 6 points · 17 days ago

    AI and robots will have to take care of a lot of lonely or abandoned individuals for sure, since nobody is really interested in what others are doing or going through.

    • wabafee@lemmy.world · 2 points · 16 days ago

      That is why there is a job for that. But I get you - it’s free to talk to AI. It’s also very accessible compared to seeing your local therapist, where the act of booking is itself a huge barrier to get past, and then there’s the money.

  • tgcoldrockn@lemmy.world · 3 points · 17 days ago

    “men choose to freely train ai with their life stories to secure technofascist state” might be a better headline