• DandomRude@lemmy.world · 71 points · 18 days ago

    ChatGPT is allowed, but chocolate eggs with toys inside are supposed to be too dangerous? 🤔

    • grue@lemmy.world · 43 points · 18 days ago

      I am literally just making this up on the spot and have no evidence for it, but I’m starting to wonder if maybe the real reason for the kinder egg ban was anticompetitive lobbying by Hershey or something like that, and the toy thing was just the excuse.

      • Delphia@lemmy.world · 10 points · 18 days ago

        It wasn't the "toy thing" — there's a law about putting inedible choking hazards in food: the Food, Drug, and Cosmetic Act of 1938. It bans putting any toy or inedible object in candy unless it is a functional part of the food. So the capsule in the Kinder egg isn't OK, but the wooden stick in a popsicle, the stick of a lollipop, or the ring of a Ring Pop is.

    • Øπ3ŕ@lemmy.dbzer0.com · 8 points · 18 days ago

      In all fairness, the guy was a fuckin’ idiot before Jippity tested a latent murder-sim theory it’s been puzzling over.

  • dyathinkhesaurus@lemmy.world · 32 points · 18 days ago

    Bromide sedatives vanished from the US market by 1989, after the Food and Drug Administration banned them

    So Robert Fucken Kennedy Jr will be all over this one shortly. Expect more psychosis I guess…

  • ExLisper · 16 points · 18 days ago

    Can we just skip to the point where LLMs are just a bunch of preapproved answers and legal warnings? Maybe we could then hire experts to answer some of the questions? Organize it like some sort of a message board. Let people search previous questions. Wait…

  • NutWrench@lemmy.ml · 14 points · 18 days ago

    An early AI was once asked, “Bob has a headache. What should Bob do?” And the AI replied, “Bob should cut off his own head.”

    The point being: AIs will give you logical solutions to your problems, but they won't always give you practical ones.

  • Dagwood_Sanwich@lemmy.world · 8 points · 17 days ago

    Just wait until AI "partners" are a thing.
    Person: Honey, can you cook for me?
    AI: Sure thing. [proceeds to add highly toxic ingredients into the food]

  • Uriel238 [all pronouns]@lemmy.blahaj.zone · 4 points · 17 days ago

    I already suffer from psychosis

    To be fair, I've already done something similar. In the aughts, figuring Miskatonic U's restricted library vault couldn't harm me since I'm already crazy, and spurred by Bush Administration efforts to consolidate power and persecute minorities, I went and studied the Holocaust and the process within the German Reich that brought it forth.

    And that figures into my current psychotic break, as of November 2024.