Thank you Microslop

  • meathorse@lemmy.world · 15 days ago

    They aren’t even waiting for market saturation, they’re just speed-running straight into enshittification now

  • melsaskca@lemmy.ca · 16 days ago

    It’s shit like this that makes me want to pursue a career digging ditches with just a shovel, a bag of sandwiches, a thermos of coffee, and the warm sunshine. Fuck the modern world. It jumped the rails a while ago.

  • shrek_is_love@lemmy.ml · 16 days ago

    Good example of why I don’t rely on technology I don’t control. I want my workflow to be future-proof and have a predictable cost.

    • Munkisquisher@lemmy.nz · 16 days ago

      Yes, whenever anyone proposes building our tools on top of these services, I ask: “What happens to this when they start charging what it really costs to run these models?”

    • joelthelion@lemmy.world · 16 days ago

      In general I agree with you, but LLMs are the one exception where it’s neither practical nor cost-effective to run them locally. If you want to use them, the better option by far is to pay someone for the service.

      • gezero@sopuli.xyz · 16 days ago

        That’s because we’re in the phase where they let you try the good stuff cheap to get you hooked.

      • chicken@lemmy.dbzer0.com · 16 days ago

        The second-best option is an inference provider for open-weight models, so if they raise the price or stop offering the model, you can at least get it from someone else or eventually upgrade to self-hosting.

  • spartanatreyu@programming.dev · 16 days ago

    It’s for the best.

    Learning how code works is better than getting an LLM to produce convincing-looking code that nobody actually understands.

    LLMs just teach students to paint themselves into a corner without ever realising why bad things keep happening to them.

    • PokerChips@programming.dev · 16 days ago

      Right. It’s actually shocking that copilot is allowed at all. The next generation is going to be so fucking stupid.

    • KyuubiNoKitsune@lemmy.blahaj.zone · 16 days ago

      I agree, but I learn best when I can ask questions in my own way to understand a concept, and as much as people may hate it, it’s good at answering questions like that. People are often nasty when others ask questions they assume are stupid.

      Not that I’m a student.

  • squaresinger@lemmy.world · 16 days ago

    So it begins. We’re past the unlimited-money stage, and now the cuts begin to make AI somewhat profitable.

    Expect more of that in the coming months.