• arc99@lemmy.world · 6 days ago

    Hardly surprising. LLMs aren’t *thinking*, they’re just shitting out the next token for any given input of tokens.
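
    The loop being described can be sketched in a few lines. This is a toy, not a real model: the `NEXT` table below is a made-up stand-in for the network (a real LLM conditions on the whole context, not just the last token), but the decoding loop has the same shape: score, sample, append, repeat.

    ```python
    import random

    # Toy stand-in for a language model: maps the last token to a
    # probability distribution over possible next tokens.
    NEXT = {
        "the": {"cat": 0.6, "dog": 0.4},
        "cat": {"sat": 1.0},
        "dog": {"ran": 1.0},
        "sat": {"down": 1.0},
        "ran": {"away": 1.0},
    }

    def generate(prompt, max_new_tokens=4, seed=0):
        rng = random.Random(seed)
        tokens = prompt.split()
        for _ in range(max_new_tokens):
            dist = NEXT.get(tokens[-1])
            if dist is None:  # no known continuation: stop
                break
            words = list(dist)
            weights = [dist[w] for w in words]
            # Sample the next token from the distribution and append it;
            # the extended sequence becomes the input for the next step.
            tokens.append(rng.choices(words, weights=weights)[0])
        return " ".join(tokens)

    print(generate("the"))
    ```

    Everything an autoregressive model emits comes out of that one loop; whether the probabilities encode anything like "thinking" is exactly what the argument is about.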