Thank you Microslop

  • shrek_is_love@lemmy.ml · 2 months ago

    Good example of why I don’t rely on technology I don’t control. I want my workflow to be future-proof and have a predictable cost.

    • Munkisquisher@lemmy.nz · 2 months ago

      Yes, when anyone proposes building our tools on top of these services I ask “what will happen to this when they start charging what it really costs to run these models?”

    • joelthelion@lemmy.world · 2 months ago

      In general I agree with you, but LLMs are the one exception where it’s neither practical nor cost-effective to run them locally. If you want to use them, paying someone for the service is by far the better option.

      • chicken@lemmy.dbzer0.com · 2 months ago

        The second-best option is an inference provider for open-weight models; that way, if they raise the price or stop offering the model, you can get it from someone else or eventually upgrade to self-hosting.
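
        As a rough sketch of what that portability looks like, assuming an OpenAI-compatible endpoint (many open-weight providers, and local servers like vLLM or llama.cpp, expose one). The base URL and model name below are placeholders, not any specific provider’s details:

        ```python
        # Provider-agnostic sketch: the only provider-specific parts are the
        # base URL, API key, and model name, so switching providers (or
        # pointing at your own hardware later) is a config change, not a rewrite.
        import os

        from openai import OpenAI  # pip install openai

        client = OpenAI(
            # Hypothetical endpoint; swap in any OpenAI-compatible provider,
            # or e.g. "http://localhost:8000/v1" for a local vLLM server.
            base_url=os.environ.get("LLM_BASE_URL", "https://api.example-provider.com/v1"),
            api_key=os.environ.get("LLM_API_KEY", "not-needed-for-local"),
        )

        response = client.chat.completions.create(
            model=os.environ.get("LLM_MODEL", "llama-3.1-70b-instruct"),  # placeholder name
            messages=[{"role": "user", "content": "Hello"}],
        )
        print(response.choices[0].message.content)
        ```

        Keeping those three values in config means the vendor can’t hold your workflow hostage.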

      • gezero@sopuli.xyz · 2 months ago

        That’s because we’re in the phase where they let you try the good stuff cheap to get you hooked.