• codeinabox@programming.devOP · 88 points · 2 months ago

    This quote on the abstraction tower really stood out for me:

    I saw someone on LinkedIn recently — early twenties, a few years into their career — lamenting that with AI they “didn’t really know what was going on anymore.” And I thought: mate, you were already so far up the abstraction chain you didn’t even realise you were teetering on top of a wobbly Jenga tower.

    They’re writing TypeScript that compiles to JavaScript that runs in a V8 engine written in C++ that’s making system calls to an OS kernel that’s scheduling threads across cores they’ve never thought about, hitting RAM through a memory controller with caching layers they couldn’t diagram, all while npm pulls in 400 packages they’ve never read a line of.

    But sure. AI is the moment they lost track of what’s happening.

    The abstraction ship sailed decades ago. We just didn’t notice because each layer arrived gradually enough that we could pretend we still understood the whole stack. AI is just the layer that made the pretence impossible to maintain.
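
    To make the quoted tower concrete, here is a minimal sketch (all names hypothetical): the type annotations below exist only at the TypeScript layer and are erased before V8 ever sees the code.

    ```typescript
    // A few "high-level" lines that already ride the whole stack:
    // tsc erases the types, V8 JIT-compiles the emitted JavaScript,
    // the kernel schedules the thread, the memory controller serves
    // the array reads.
    type User = { name: string; score: number };

    function topScorer(users: User[]): string {
      // reduce() itself is implemented inside V8, in C++.
      return users.reduce((best, u) => (u.score > best.score ? u : best)).name;
    }

    console.log(topScorer([
      { name: "ada", score: 9 },
      { name: "grace", score: 7 },
    ]));
    ```

    Nothing below the first line of that snippet is visible from the snippet itself, which is the author's point.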

    • Feyd@programming.dev · 47 points · 2 months ago

      LLMs don’t add an abstraction layer. You can’t competently produce software without understanding what they’re outputting.

      • chicken@lemmy.dbzer0.com · 24 points · 2 months ago

        The author’s point is that people already don’t understand what the programs they write do, because of all the layered abstraction. That’s still true whether or not you want to object to the semantics of calling the use of LLMs an abstraction layer.

        • Feyd@programming.dev · 30 points · 2 months ago

          Not knowing what CPU instructions your code compiles to and not understanding the code you are compiling are completely different things. This is yet another article talking up the (not real) capability of LLM coding assistants, though in a more roundabout way. In fact, this garbage blogspam should go on the AI coding community that was made specifically because the subscribers of the programming community didn’t want it here, yet we keep getting these trying to skirt the line.

          • codeinabox@programming.devOP · 11 points · 2 months ago

            In fact, this garbage blogspam should go on the AI coding community that was made specifically because the subscribers of the programming community didn’t want it here.

            This article may mention AI coding but I made a very considered decision to post it in here because the primary focus is the author’s relationship to programming, and hence worth sharing with the wider programming community.

            Considering how many people have voted this up, I would take that as a sign I posted it in the appropriate community. If you don’t feel this post is appropriate in this community, I’m happy to discuss that.

            • Feyd@programming.dev · 5 points · 2 months ago

              You made a very considered decision that you could argue it’s not technically AI booster bullshit, you mean.

              • codeinabox@programming.devOP · 5 points · 2 months ago

                What I’m saying is the post is broadly about programming, and how that has changed over the decades, so I posted it in the community I thought was most appropriate.

                If you’re arguing that articles posted in this community can’t discuss AI and its impact on programming, then that’s something you’ll need to take up with the moderators.

                • Feyd@programming.dev · 3 points · 2 months ago

                  If I thought it was against the rules I’d report it instead of complaining. I complain because posting “I’m sad because everything is different now, and also I’m all-in on the hype, actually” blogs two days in a row, after agreeing not to post AI hype, sure seems like you desperately want to post AI hype.

          • chicken@lemmy.dbzer0.com · 5 points · 2 months ago

            Talking about low-level compilers seems like moving the goalposts, since they are far better defined and vetted than the mass of software libraries and copy-pasted StackOverflow functions a large segment of programming has been done with.

    • idunnololz@lemmy.world · 10 points · 2 months ago

      I’ve had this problem with abstractions for the longest time. Of course, whenever I say anything negative about abstractions I just get dog-piled, so I don’t usually like to discuss the topic.

      I think abstractions as a tool are fine. My problem is that most developers I meet only talk about the upsides of abstractions and never seriously weigh the downsides.

      More often than not, people treat abstraction as a magical tool you can’t overuse. In reality, overuse of abstractions can increase complexity and reduce readability. It can also greatly reduce the number of assumptions you can make about code, which has many additional downsides.

      Of course, I’m not saying we shouldn’t use abstractions. Not having any abstractions can be just as bad as having too many; you end up with similar issues, such as increased complexity and reduced readability.

      The hard part is finding the balance: the sweet spot where complexity is minimized and readability is maximized while using the fewest abstractions possible.

      Too often, developers err on the side of caution, add more abstractions than necessary, and call it good enough. We really need to question whether every abstraction is absolutely necessary. Is it worth adding an extra layer just because a problem might arise in the future, versus keeping fewer abstractions and waiting for the problem to actually appear before adding more? I don’t think we do the latter enough. Often you can get away with slightly fewer abstractions than you think you need, because you will never touch the code again.
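
      A contrived sketch of that trade-off (every name here is hypothetical): the abstracted version defends against a second data store that may never arrive, at the cost of indirection on every read.

      ```typescript
      // Speculative abstraction: an interface and a class for a problem
      // that currently has exactly one implementation.
      interface UserRepository {
        findName(id: number): string | undefined;
      }

      class InMemoryUserRepository implements UserRepository {
        constructor(private readonly users: Map<number, string>) {}
        findName(id: number): string | undefined {
          return this.users.get(id);
        }
      }

      const repo: UserRepository = new InMemoryUserRepository(
        new Map([[1, "ada"]]),
      );

      // The direct version: same behaviour, one fewer layer to read through.
      const users = new Map<number, string>([[1, "ada"]]);

      console.log(repo.findName(1)); // both print "ada"
      console.log(users.get(1));
      ```

      Waiting until the second implementation actually shows up before introducing the interface is usually the cheaper bet.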

      • codeinabox@programming.devOP · 7 points · 2 months ago

        There is much debate about whether the em-dash is a reliable signal of AI-generated content.

        It would be more effective to compare this post with the author’s posts before gen AI, and see if there has been a change in writing style.

    • MalReynolds@slrpnk.net · 3 points · 2 months ago

      Thanks for the quote, it caused me to actually read the source, which I enjoyed. It’s a good idea to put these thoughts into the post body.

      As to abstraction layers I am reminded of “I have seen further because I have stood on the shoulders of giants”, and yet you must climb to those shoulders, ontogeny recapitulates phylogeny, and you will see further the more you understand the layers below.

      I suspect it follows the pattern of a tree, you can only reach further up if your roots are deep enough to support it. A seed can grow wherever planted, hence new coders will do fine, and eventually come to this point (maybe toppling a few times, putting down more roots in response). A mature fruit tree has many years of bearing fruit.

      As to AI making the process more soulless: mayhap. But making programmers more exploitable and fungible? Almost certainly, and that’s the point.

  • Feyd@programming.dev · 15 points · 2 months ago

    I say that knowing how often those words have been wrong throughout history.

    Yup

    Previous technology shifts were “learn the new thing, apply existing skills.” AI isn’t that. It’s not a new platform or a new language or a new paradigm. It’s a shift in what it means to be good at this.

    A swing and a miss

  • fubarx@lemmy.world · 13 points · edited · 2 months ago

    Wasn’t “low-code” a BIG thing a few years ago… the thing that would destroy programming and make every PM a developer? Whatever happened to that? 🤔

    Edit: read the HN comments. If I ever go back to consulting, I’m 10x-ing my rate to clean up this slop. I’m not anti-AI coding and use it for my own projects, but if you just give it a prompt and walk away, you will be very sad later.

    There’s a BIG difference between a prototype and something others have to use. As the low-code folks found out the hard way.

    • Cryxtalix@programming.dev · 1 point · 2 months ago

      Aren’t professional tools using node-based compositing, which ends up being just as complicated as code in big projects? They’ll do anything to hide the code, because code is scary, even if it’s the same.

  • NostraDavid@programming.dev · 6 points · 2 months ago

    Creative constraints bred creativity.

    That might explain why there’s so much crap coming out of the gaming industry. All the old constraints are gone, so everything now very much looks the same.

    Just give yourself artificial constraints.

    • Skullgrid@lemmy.world · 7 points · 2 months ago

      That might explain why there’s so much crap coming out of the gaming industry.

      Not really; it’s more of an economic situation than a tech one. Indie games are doing just fine, more or less (the choose-1-of-3 fad is a pain).

    • entwine@programming.dev · 3 points · 2 months ago

      The modern game industry was being run by pedophile billionaires, two of the worst adjectives you can apply to a human being. I’d say that’s more of a factor than not having enough “creative constraints”

  • entwine@programming.dev · 5 points · 2 months ago

    I had a realization recently. All the pro-AI people pushing vibe coding or “coding assistants” are completely missing the point.

    These tools aren’t helping you write code, you are helping the tool write code, because it can’t do it on its own yet. The more they improve, the less you’re needed.

    Idk if they’ll ever reach the point where you can actually give it a prompt and it’ll provide a fully functional implementation on its own, with no human intervention required. If it does, I can’t imagine that tech would be as available as it is now. Your peasant ass isn’t going to be vibing the next big thing, that’s for sure.

  • bridgeenjoyer@sh.itjust.works · 5 points · 2 months ago

    I wish I could have started then! I’m not really interested in modern coding; it doesn’t seem as interesting… I kind of want to read the whole C64 manual and try programming on it, but I guess there isn’t much point to that nowadays.

  • jimmy90@lemmy.world · 4 points · 2 months ago

    high-level code-generating tools have come and mostly gone

    we will see if this one is any good: whether it works and whether we can maintain the code it makes

    simple

    • Skullgrid@lemmy.world · 4 points · 2 months ago

      high level code generating tools have come and mostly gone

      he’s talking about languages that don’t touch bare metal, not WYSIWYG editors

      EDIT: WYSIWYG stuff continues to live, fucking Salesforce

  • I Cast Fist@programming.dev · 4 points · 2 months ago

    They’re writing TypeScript that compiles to JavaScript that runs in a V8 engine written in C++ that’s making system calls to an OS kernel that’s scheduling threads across cores they’ve never thought about, hitting RAM through a memory controller with caching layers they couldn’t diagram, all while npm pulls in 400 packages they’ve never read a line of.

    Yeah, I noticed something was off around the time every new app was essentially “the fucking website in a self-contained Chrome browser”, aka Electron. Sure, it was sold as “write once, run everywhere”, but a significant number of those programs and games were still either Windows- or Android-exclusive, because ha ha fuck you (anyone who’s dealt with RPG Maker MV/MZ knows it).

    Some layers of abstraction are incredibly welcome: not having to worry about cycles and RAM addressing, having immediate access to graphics without having to reinvent pixel-pushing functions. But (imo) everything on top of a browser is just an endless series of “why?”

  • talkingpumpkin@lemmy.world · 4 points · 2 months ago

    This isn’t a rant about AI.

    I feared this would be about AI, but… this might actually be interesting! I’m glad I started reading.

    This time is different […] Previous technology shifts were “learn the new thing, apply existing skills.” AI isn’t that.

    Well f*ck you and give me back the time I wasted on that article.

    Guys, can we add a rule that all posts that deal with using LLM bots to code must be marked? I am sick of this topic.

    • codeinabox@programming.devOP · 2 points · 2 months ago

      Guys, can we add a rule that all posts that deal with using LLM bots to code must be marked? I am sick of this topic.

      How would you like them to be marked? AFAIK Lemmy doesn’t support post tags.

        • talkingpumpkin@lemmy.world · 1 point · 2 months ago

          Actual technical articles about LLM/diffusion would be interesting to read (I think?)… maybe something like [vibecoding]?

          Actually, let’s make that generic and use [futurology], so that it may apply regardless of whether the incumbent revolution/menace is LLMs, low code tools, or stack overflow.