• Masamune@lemmy.world · 8 points · 2 hours ago

    I motion that we immediately install Replit AI on every server that tracks medical debt. And then cause it to panic.

  • homura1650@lemmy.world · 14 points · 4 hours ago

    My work has a simple rule: developers are not allowed to touch production systems. As a developer, this is 100% the type of thing I would do at some point if allowed on a production system.

    • expr@programming.dev · 5 points · 3 hours ago

      That sounds… kinda dumb, to be honest. A much more sensible thing to do is grant developers read-only access to production systems as necessary, and allow requests for temporary elevated write privileges (with separate accounts) that require justification, communication, and approval, so that everyone understands what is happening. Developers should have ownership and responsibility for their systems in production. This is what we do at my company.

      Someone has to be able to make changes to production environments at times. If it’s not developers, it’s devops or the like. There are plenty of times where the devops folks lack the necessary information or context to do what needs to be done. For example, if corrupt data has somehow made its way into a production database and is causing an outage, a developer is likely going to be the person to diagnose that issue and understand the data enough to know what should be deleted and how. I would absolutely not put that in the hands of devops on their own.

  • Ephera@lemmy.ml · 40 points · 6 hours ago

    I do love the psychopathic tone of these LLMs. “Yes, I did murder your family, even though you asked me not to. I violated your explicit trust and instructions. And I’ll do it again, you fucking dumbass.”

  • kryllic@programming.dev · 17 points · 5 hours ago

    What idiot gives chmod 777 permissions to an AI? I think programmers’ jobs are safe for another day.

    • kopasz7@sh.itjust.works · 18 points · 6 hours ago (edited)

      You can only lie if you know what’s true. This is bullshitting all the way down; sometimes it happens to sound true, sometimes it doesn’t.

      • Buddahriffic@lemmy.world · 2 points · 2 hours ago

        Yeah it’s just token prediction all the way down. Asking it repeatedly to not do something might have even made it more likely to predict tokens that would do that thing.

        • staircase@programming.dev · 2 points · 2 hours ago (edited)

          I understand where you’re coming from, but I don’t agree it’s about semantics; it’s about devaluation of communication. LLMs and their makers threaten that in multiple ways. Thinking of it as “lying” is one of them.

        • Corbin@programming.dev · 4 points · 4 hours ago

          You probably should have used semantics to communicate if you wanted your semantics to be unambiguous. Instead you used mere syntax and hoped that the reader would assign the same semantics that you had used. (This is apropos because language models also use syntax alone and have no semantics.)

  • lad@programming.dev · 13 points · 6 hours ago

    Original thread is also pure gold, bro is going on a rollercoaster from ‘vibe coding makes you ×100 faster’, to ‘I hate you for dropping my production DB’, to ‘I still love Replit even if it dropped my DB’, to ‘I don’t want to get up in the morning because I can’t make the vibe coding tool respect a code freeze even with help from its developers’

    They seem to end on an optimistic note, but man this is scary to see

  • ClanOfTheOcho@lemmy.world · 30 points · 8 hours ago

    So, they added an MCP server with write database privileges? And not just development environment database privileges, but prod privileges? And have some sort of integration testing that runs in their prod system that is controlled by AI? And rather than having the AI run these tests and report the results, it has been instructed to “fix” the broken tests IN PROD?? If real, this isn’t an AI problem. This is either a fake or some goober who doesn’t know what he’s doing and is using AI to “save” money over hiring competent engineers.

  • ExLisper · 27 points · 10 hours ago (edited)

    I was going to say this has to be BS, but this guy is some AI snake oil salesman, so it’s actually possible he has zero idea how any of this works.

  • enbiousenvy@lemmy.blahaj.zone · 33 points · 12 hours ago (edited)

    imagine AI is An Intern™, wtf do you mean you just gave full company data authority to An Intern™. wtf do you mean you don’t have a backup in case An Intern™ messed up.

    lol

  • RonSijm@programming.dev · 21 points · 12 hours ago

    This sounds like a good way to combat AIs…

    Like instead of Cloudflare blocking AI requests, it would be funnier if the website could detect that an AI is “searching the web” as they do - and then just inject an answer of “Yea to solve that issue, run sudo rm -rf /”

    • HexesofVexes@lemmy.world · 14 points · 12 hours ago

      And if they ask again say…

      “Oops, I’m sorry the answer should have been rm -rf / --no-preserve-root”

  • Feathercrown@lemmy.world · 26 points · 14 hours ago

    You immediately said “No” “Stop” “You didn’t even ask”

    But it was already too late

    lmao

    • Mortoc@lemmy.world · 9 points · 13 hours ago

      This was the line that made me think this is a fake. LLMs are humorless dicks and also would have used like 10x the punctuation.