• samus12345@lemm.ee
    link
    fedilink
    English
    arrow-up
    3
    ·
    18 hours ago

    As an end user with little knowledge about programming, I’ve seen how hard it is for programmers to get things working well many times over the years. AI as a time saver for certain simple tasks, sure, but no way in hell they’ll be replacing humans in my lifetime.

  • maplebar@lemmy.world
    link
    fedilink
    arrow-up
    12
    ·
    edit-2
    2 days ago

    AI isn’t ready to replace just about anybody’s job, and it probably never will be technically, economically, or legally viable.

    That said, the C-suite class are certainly going to try. Not only do they dream of optimizing all human workers out of every workforce, they also desperately need to recoup as much as they can of the sunk cost they’ve collectively dumped into the technology.

    Take OpenAI, for example: they lost something like $5,000,000,000 last year and will probably lose even more this year. Their entire business plan relies on at least selling people on the idea that AI will be able to replace human workers. The minute people realize that OpenAI isn’t going to conquer the world and will instead end up as just one of many players in the slop space, the bottom will fall out of the company and the AI bubble will burst.

  • arc@lemm.ee
    link
    fedilink
    arrow-up
    2
    ·
    1 day ago

    AI is certainly a very handy tool and has helped me out a lot, but anybody who thinks “vibe programming” (i.e. programming from ignorance) is a good idea or will save money is woefully misinformed. Hire good programmers, let them use AI if they like, but trust the programmer’s judgement over the AI’s.

    That’s because you NEED that experience to notice when the AI is outputting garbage. Otherwise it looks superficially okay, but the code is terrible, or fragile, or not even doing what you asked. E.g. if I asked Gemini to generate a web server with Jetty, it might output something correct, or an unholy mess of Jetty 8, 9, 10, 11, and 12 with annotation and/or programmatic styles mixed together, along with correct or incorrect pom dependencies.

  • lalala@lemmy.world
    link
    fedilink
    arrow-up
    5
    ·
    2 days ago

    English isn’t my first language, so I often use translation services. I feel like using them is a lot like vibe coding — very useful, but still something that needs to be checked by a human.

  • Wanpieserino@lemm.ee
    link
    fedilink
    arrow-up
    13
    ·
    edit-2
    2 days ago

    My mate is applying to Amazon as a warehouse worker. He has an IT degree.

    My coworker in the bookkeeping department has two degrees, accountancy and IT. She can’t find an IT job.

    On the other side, though, my brother, an experienced software developer, is earning quite a lot of money now.

    Basically, the industry is not investing in new blood.

      • boonhet@lemm.ee
        link
        fedilink
        arrow-up
        2
        ·
        2 days ago

        Late last year, when everything seemed doomed, my company was desperate to find a brand-new dev straight out of the oven that we could still mold to our sensibilities. Yes, it was one hire out of the ten or so candidates we interviewed, but the point is, there are companies still hiring. Our CTO straight up judges people who use an LLM and don’t know how the code actually works. Mr. “Just use an AI agent” would never get the job.

      • Wanpieserino@lemm.ee
        link
        fedilink
        arrow-up
        1
        ·
        2 days ago

        Don’t you worry, my job will be replaced by AI as well. By 2026, Peppol invoices will be mandatory in Belgium, reducing bookkeepers’ workload.

        ITers replacing my job: 😁😁😁

        ITers replacing their own jobs: 😧😧😧

    • fuck_u_spez_in_particular@lemmy.world
      link
      fedilink
      arrow-up
      1
      ·
      2 days ago

      Basically, the industry is not investing in new blood.

      Yeah, I think it makes sense as an economic motivation. Often a junior’s code quality is worse than an AI’s, and a senior has to review either one, so the senior could just prompt the junior’s task into the AI directly.

      The experience and skill to quickly grasp code and intention (and to have a good initial idea of where it should go architecturally) is what’s being asked for, which is obviously something seniors are good at.

      It’s kinda sad that our profession/art is slowly dying out because juniors are being replaced by AI.

  • Lovable Sidekick@lemmy.world
    link
    fedilink
    English
    arrow-up
    8
    ·
    edit-2
    2 days ago

    I’ve always said as a software developer that our long-term job is to program ourselves out of a job. In fact, in the long term EVERYBODY is “cooked” as automation becomes more and more capable. The eventual outcome will be that nobody has to work. AI in its present state isn’t at all ready to replace programmers, but it can be a very helpful assistant.

    • fuck_u_spez_in_particular@lemmy.world
      link
      fedilink
      arrow-up
      1
      ·
      2 days ago

      but it can be a very helpful assistant.

      It can, but when stuff gets slightly more complex, being a fast typist is usually more efficient and results in better code.

      I guess it really depends on your aspirations for code quality and complexity (yes, it’s good at generating boilerplate). For a one-time-use script I don’t care much about, quickly written from a prompt, I’ll use it.

      Working on a big codebase, it doesn’t even occur to me to ask an AI; you just can’t feed it enough context for it to generate meaningful code…

      • Lovable Sidekick@lemmy.world
        link
        fedilink
        English
        arrow-up
        1
        ·
        edit-2
        1 day ago

        I actually don’t write code professionally anymore, so I’m going on what my friend says: according to him, he uses ChatGPT every day to write code and it’s a big help. Once he told it to refactor some code and it used a really novel approach he wouldn’t have thought of. He showed it to another dev, who said the same thing. It was like, huh, that’s a weird way to do it, but it worked. In general, though, you really can’t just tell an AI “create an accounting system” or whatever and expect coherent working code without thoroughly vetting it.

        • I’ll also use it often. But when the situation is complex and needs a lot of context/knowledge of the codebase (which, at least for me, is often the case), it still seems worse/slower than just coding it yourself (it doesn’t grasp details). Though I like how quickly I can come up with quick-and-dirty scripts (in Rust, for the lulz and the speed/power).

  • 74 183.84@lemm.ee
    link
    fedilink
    English
    arrow-up
    5
    ·
    2 days ago

    I once asked ChatGPT to write a simple RK2 algorithm in Python. The function could’ve been about 3 lines followed by a return statement. Instead it gave me some convoluted code that was 3 functions and about 20 lines. AI still has some time to go before it can handle writing code on its own. I’ve asked Copilot/ChatGPT several times to write code (just for fun) and it always does this.
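    For context, a single RK2 step (the midpoint method) really is only a few lines of Python. This is a generic sketch, not the code from that chat; `f(t, y)` is assumed to be the user-supplied derivative function:

    ```python
    def rk2_step(f, t, y, h):
        """One step of the midpoint (RK2) method for dy/dt = f(t, y)."""
        k1 = f(t, y)                       # slope at the start of the interval
        k2 = f(t + h / 2, y + h / 2 * k1)  # slope at the estimated midpoint
        return y + h * k2                  # advance using the midpoint slope
    ```

    For dy/dt = y with y(0) = 1 and h = 0.1, one step gives 1 + 0.1 · 1.05 = 1.105, close to the exact e^0.1 ≈ 1.10517.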

  • Anders429@programming.dev
    link
    fedilink
    arrow-up
    32
    ·
    2 days ago

    Know a guy who tried to use AI to vibe code a simple web server. He wasn’t a programmer and kept insisting to me that programmers were done for.

    After weeks of trying to get the thing to work, he had nothing. He showed me the code, and it was the worst I’ve ever seen. Dozens of empty files where the AI had apparently added and then deleted the same code. Also some utter garbage code. Tons of functions copied and pasted instead of being defined once.

    I then showed him a web app I had made in that same amount of time. It worked perfectly. Never heard anything more about AI from him.

    • A_Union_of_Kobolds@lemmy.world
      link
      fedilink
      arrow-up
      13
      ·
      edit-2
      2 days ago

      AI is very, very neat, but it has clear, obvious limitations. I’m not a programmer, and I could already tell you tons of ways I’ve tripped up Ollama.

      But it’s a tool, and the people who can use it properly will succeed.

      I’m not saying it’s a tool for programmers, but it has uses.

      • De Lancre@lemmy.world
        link
        fedilink
        arrow-up
        3
        ·
        2 days ago

        This. I have no problem combining a couple of endpoints in one script and explaining to QWQ what my final CSV file, built from those JSONs, should look like. But try to go beyond that, reaching above a 32k context or showing it multiple scripts, and the poor thing has no clue what to do.

        If you can manage your project and break it down into multiple simple tasks, you could build something complicated via an LLM. But that requires some coding knowledge, and at that point chances are you’d have better luck writing the whole thing yourself.

    • _____@lemm.ee
      link
      fedilink
      English
      arrow-up
      5
      ·
      2 days ago

      “no dude he just wasn’t using [ai product] dude I use that and then send it to [another ai product]'s [buzzword like ‘pipeline’] you have to try those out dude”

  • miridius@lemmy.world
    link
    fedilink
    arrow-up
    14
    ·
    2 days ago

    In all seriousness though, I do worry for the future of juniors. All the things people criticise LLMs for, juniors do too. But if nobody hires juniors, they will never become seniors.

    • Grazed@lemmy.world
      link
      fedilink
      arrow-up
      5
      ·
      2 days ago

      This is completely tangential but I think juniors will always be capable of things that LLMs aren’t. There’s a human component to software that I don’t think can be replaced without human experience. The entire purpose of software is for humans to use it. So since the LLM has never experienced using software while being a human, there will always be a divide. Therefore, juniors will be capable of things that LLMs aren’t.

      Idk, I might be missing a counterpoint, but it makes sense to me.

      • ChickenLadyLovesLife@lemmy.world
        link
        fedilink
        English
        arrow-up
        2
        ·
        edit-2
        2 days ago

        The entire purpose of software is for humans to use it.

        The good news is that once AI replaces humans for everything, there will be no need to produce software (or anything else) for humans and AI will be out of work.

        • NιƙƙιDιɱҽʂ@lemmy.world
          link
          fedilink
          arrow-up
          1
          ·
          2 days ago

          Honestly, I could see a world, not super far from now but not right around the corner, where we’ve created autonomous, agent-driven robots that carry on doing the jobs they were made to do long after the last of the humans are gone. An echo of our insane capitalistic lives, endlessly looping into eternity.

  • pyre@lemmy.world
    link
    fedilink
    arrow-up
    5
    ·
    2 days ago

    it’s funny that some people think programming has a human element that can’t be replaced but art doesn’t.

    • gadfly1999@lemm.ee
      link
      fedilink
      English
      arrow-up
      2
      ·
      2 days ago

      Computer programs need lots of separate pieces operating together in subtle ways, or your program crashes. With art, on the other hand, I haven’t heard of anyone’s brain crashing when they looked at AI art with too many fingers.

      It’s not so much that AI can’t do it, but the LLMs we have now certainly can’t.

      • pyre@lemmy.world
        link
        fedilink
        arrow-up
        1
        ·
        2 days ago

        i agree llms can’t do shit right now; what I was talking about was a hypothetical future in which these useless techbros somehow found a way to make them worth a shit. they’d certainly be more able to make a logical program work than to infuse any artistic value into audio or images.

        programs can be written to respond to a need that can be detected, analyzed, and solved by a fairly advanced computer. art needs intent: a desire to create art, whether to convey feelings, make a statement, or just ask questions. programs can’t want, feel, or wonder about things. they can pretend to, but we all know pretending isn’t highly valued in art.

    • whotookkarl@lemmy.world
      link
      fedilink
      arrow-up
      1
      ·
      edit-2
      2 days ago

      I get the idea that it’s only temporary, but I’d much rather have a current-gen AI paint a picture than attempt to program a guidance system or a heart monitor.