• Hemingways_Shotgun@lemmy.ca · 2 days ago

    The fact that any AI company thought to train their LLMs on the answers of Reddit users speaks to a fundamental misunderstanding of their own product (IMO).

    LLMs aren’t programmed to give you the correct answer. They’re programmed to give you the most pervasive/popular answer on the assumption that most of the time that will also happen to be the right one.
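    A minimal sketch of what “most pervasive/popular answer” means in practice (the candidate answers and numbers below are invented for illustration, not from any real model): the model scores possible continuations and, under greedy decoding, simply emits the highest-probability one.

    ```python
    # Toy sketch, not any real model's API: score candidate continuations
    # and pick the single most probable one (greedy decoding).
    import math

    def softmax(scores):
        exps = [math.exp(s) for s in scores]
        total = sum(exps)
        return [e / total for e in exps]

    # Hypothetical candidate answers and the raw scores a model might assign
    candidates = ["peanut butter", "ice cube", "scissors", "mayonnaise"]
    logits = [2.1, 1.4, 0.3, -1.0]

    probs = softmax(logits)
    best = max(zip(candidates, probs), key=lambda pair: pair[1])
    print(best)  # the most popular-looking answer wins, right or not
    ```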

    So when you’re getting your knowledge base from random jackasses on Reddit, where a good-faith question like “What’s the best way to get gum out of my child’s hair?” gets two good-faith answers and then a few dozen smart-ass answers that get lots of replies and upvotes because they’re funny, guess which one your LLM is going to use.

    People (and apparently even the creators themselves) think that an LLM is actually cognizant enough to be able to weed this out logically. But it can’t. It’s not an intelligence…it’s a knowledge aggregator. And as with any aggregator, the same rule applies:

    garbage in, garbage out
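
    A toy illustration of that failure mode, with invented answers and vote counts: an aggregator that weights answers by popularity surfaces the joke.

    ```python
    # Hypothetical Reddit-style answers to the gum-in-hair question.
    answers = [
        {"text": "Work peanut butter into the gum, then comb it out.", "upvotes": 12},
        {"text": "Hold an ice cube on it until the gum hardens, then pick it off.", "upvotes": 9},
        {"text": "Just shave the kid bald lol", "upvotes": 480},
    ]

    # Popularity-weighted aggregation: the "best" answer is whatever got the most votes.
    top = max(answers, key=lambda a: a["upvotes"])
    print(top["text"])  # garbage in, garbage out
    ```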

    • bridgeenjoyer@sh.itjust.works · 2 days ago

      That’s why I’ve stopped calling it AI. It’s a dumbass buzzword, just like cloud (or blockchain), that tech bros like to use but can’t explain.

      It’s LLMs and image generators/OCR (which has been around for decades), using complex Markov chains and a fuck-ton of graphics cards. NOT AI. NOT AI.
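
      For what the Markov-chain comparison is worth (LLMs aren’t literally Markov chains, but this is the simplest version of “predict the next word from what came before”), here’s a toy word-level chain built from a made-up sentence:

      ```python
      # Toy word-level Markov chain: each next word depends only on the current word.
      import random
      from collections import defaultdict

      corpus = "the gum stuck in the hair and the gum would not come out".split()

      # Transition table: word -> words observed to follow it
      transitions = defaultdict(list)
      for current, following in zip(corpus, corpus[1:]):
          transitions[current].append(following)

      def generate(start, length=8, seed=0):
          random.seed(seed)
          word, output = start, [start]
          for _ in range(length):
              choices = transitions.get(word)
              if not choices:
                  break
              word = random.choice(choices)
              output.append(word)
          return " ".join(output)

      print(generate("the"))
      ```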

      • sugar_in_your_tea@sh.itjust.works · 2 days ago

        It is AI, as are a bunch of other things: optimization algorithms, statistical decision trees (probably what’s behind adaptive AI in games), etc. AI is a field of computer science that includes a ton of things many people wouldn’t consider AI.

        Basically, if the solution doesn’t come from direct commands but instead comes from some form of learning process, it’s probably AI.
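
        A toy contrast for that distinction (all names and numbers here are invented): the first function follows a direct command, the second derives its rule from labeled examples, which is the “learning process” that puts it under the AI umbrella.

        ```python
        # Direct command: the programmer hard-codes the rule.
        def is_spam_hardcoded(score):
            return score > 5.0

        # Learning process: the rule's threshold comes from labeled data instead.
        def fit_threshold(examples):
            spam = [s for s, label in examples if label]
            ham = [s for s, label in examples if not label]
            # Put the boundary midway between the two groups' average scores.
            return (sum(spam) / len(spam) + sum(ham) / len(ham)) / 2

        data = [(1.0, False), (2.5, False), (7.0, True), (9.5, True)]
        threshold = fit_threshold(data)

        def is_spam_learned(score):
            return score > threshold

        print(is_spam_hardcoded(6.0), is_spam_learned(6.0))
        ```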

        It’s not “general AI”, but it is in the field of AI.

        • enbipanic@lemmy.blahaj.zone · 2 days ago

          I would argue we need to go back to Machine Learning.

          The field is machine learning, generative machine learning, etc.

          This rebrand to AI is doing nothing but confusing people and building investor hype.

          • sugar_in_your_tea@sh.itjust.works · 2 days ago

            Back? Machine learning has been a subfield of artificial intelligence since the whole thing started in the 1950s or so. The end goal is to create general AI, and each subfield of AI, LLMs included, is considered a piece of that puzzle.

              • sugar_in_your_tea@sh.itjust.works · 1 day ago

                It’s more specific, sure, but there’s nothing dishonest about using the same terminology that’s been in use for some 70 years.

                The disconnect is that average people have a different understanding of the term than the one used in computer science, probably because of sci-fi films and whatnot. When I hear “AI,” I think of the CS term, because that’s my background, but when my family hears “AI,” they think of androids like in Bicentennial Man.

                I don’t know how to square that circle. Neither group here is wrong, but classifying something like ChatGPT as “AI,” while correct, is misinterpreted by the public, who assume it’s doing more than it is.