As a Java engineer who has worked in web development for several years, I've heard over and over that X is good because of SOLID principles or that Y is bad because it breaks them, and I've had to memorize the "good" way to do everything before interviews. The deeper I dig into the real reason I'm doing something a particular way, the harder I find it to justify.

One example is creating an interface for every goddamn class I make because of "loose coupling", when in reality none of these classes are ever going to have an alternative implementation.

Also, the more I get into languages like Rust, the more these doubts grow, leading me to believe that most of it is dogma that has drifted far beyond its initial motivations and goals and is now just a mindless OOP circlejerk.

There are definitely occasions when these principles do make sense, especially in an OOP environment, and they can also make some design patterns really satisfying and easy.

What are your opinions on this?

  • Feyd@programming.dev · 46 points · edited · 6 days ago

    If it makes the code easier to maintain it’s good. If it doesn’t make the code easier to maintain it is bad.

    Making interfaces for everything, or making getters and setters for everything, just in case you change something in the future, makes the code harder to maintain.

    This might make sense for a library, but it doesn’t make sense for application code that you can refactor at will. Even if you do have to change something and it means a refactor that touches a lot, it’ll still be a lot less work than bloating the entire codebase with needless indirections every day.

    • Valmond@lemmy.world · 8 points · 6 days ago

      I remember the recommendation to use a typedef (or #define 😱) for integers, like INT32.

      Useful if you, like, recompile it on a weird CPU or something, I guess. What a stupid idea. At least where I worked it was dumb; if someone knows any benefits, I'd gladly hear them!

      • SilverShark@programming.dev · 7 points · 6 days ago

        We had it because we needed to compile for Windows and Linux on both 32- and 64-bit processors. So we defined our own Int32, Int64, uint32, uint64 and so on. There were a bunch of these definitions in the core header file, with #ifndef and such.
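        A minimal sketch of what such a core header might have looked like (the type names and the MSVC/GCC split here are illustrative assumptions, not the actual codebase):

            /* core_types.h -- hypothetical reconstruction */
            #ifndef CORE_TYPES_H
            #define CORE_TYPES_H

            #ifdef _MSC_VER                        /* Windows / Visual C++  */
            typedef __int32            Int32;
            typedef __int64            Int64;
            typedef unsigned __int32   uint32;
            typedef unsigned __int64   uint64;
            #else                                  /* GCC, 32/64-bit Linux  */
            typedef int                Int32;
            typedef long long          Int64;
            typedef unsigned int       uint32;
            typedef unsigned long long uint64;
            #endif

            #endif /* CORE_TYPES_H */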

        • Valmond@lemmy.world · 4 points · 6 days ago

          But you can use a 64-bit int on a 32-bit Linux, and vice versa. I never understood the benefit of tagging the types. You'd have to go pretty far back in time to find a compiler where int isn't a 32-bit signed integer. And long long and size_t already existed; why make new ones?

          Readability, maybe?

          • Consti@lemmy.world · 2 points · 6 days ago

            Very often you need to choose a type based on the data it has to hold. If you know you'll need to store numbers of a certain size, use an integer type that can actually hold them; don't make the size dependent on a platform definition. Always using int can lead to really insidious bugs where a function works on one platform and not on another due to overflow.
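            A minimal sketch of such a bug (hypothetical function names; the 16-bit case is the classic one):

                #include <stdint.h>

                /* Fine where int is 32 bits; undefined behaviour where int is
                 * 16 bits, since e.g. 2000 * 20 = 40000 > INT16_MAX.          */
                int total_bad(int price, int count) {
                    return price * count;
                }

                /* The same arithmetic with a fixed-width type behaves the same
                 * on every platform: 40000 always fits in 32 bits.            */
                int32_t total_ok(int32_t price, int32_t count) {
                    return price * count;
                }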

            • Valmond@lemmy.world · 2 points · 6 days ago

              Show me one.

              I mean, I have worked on 16-bit platforms, but nobody would use that code straight out of the box on some other, incompatible platform; it doesn't even make sense.

              • Consti@lemmy.world · 2 points · 6 days ago

                Basically anything low level. When you need a byte, you don't use an int, you use a uint8_t (reminder that char is not defined to be signed or unsigned: "Plain char may be signed or unsigned; this depends on the compiler, the machine in use, and its operating system"). Any time you interact with another system, like hardware or networking, it is incredibly important to know how many bits the other side uses, to avoid mismatches.
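                A toy sketch of the char pitfall:

                    #include <stdint.h>
                    #include <stdio.h>

                    int main(void) {
                        char c = (char)0xFF;  /* -1 where char is signed, 255 where unsigned */
                        if (c == 0xFF)        /* c is promoted to int before comparing       */
                            puts("plain char is unsigned on this platform");
                        else
                            puts("plain char is signed on this platform");

                        uint8_t b = 0xFF;     /* always 255, on every platform */
                        printf("%u\n", (unsigned)b);
                        return 0;
                    }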

                For purely the size of an int, the most famous example is the maiden flight of the Ariane 5 rocket, where a conversion from a 64-bit float to a 16-bit signed integer overflowed and the launcher was destroyed. OWASP (the Open Worldwide Application Security Project) lists integer overflows as a security concern, though not ranked very highly, since they mostly cause problems when combined with buffer accesses (user input fed through arithmetic that can overflow into unexpected ranges).
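                The overflow-plus-buffer-access pattern looks roughly like this (an illustrative sketch, not code from any real CVE):

                    #include <stdint.h>
                    #include <stdlib.h>

                    /* If `count` comes from user input, the multiplication can wrap,
                     * malloc gets a tiny size, and the loop then writes far past the
                     * end of the allocation -- a heap buffer overflow.
                     * The fix is a check before the arithmetic, e.g.
                     *     if (count > UINT32_MAX / 4u) return NULL;                   */
                    uint32_t *copy_records(const uint32_t *src, uint32_t count) {
                        uint32_t size = count * 4u;    /* wraps when count > 0x3FFFFFFF */
                        uint32_t *dst = malloc(size);  /* undersized after the wrap     */
                        for (uint32_t i = 0; dst && i < count; i++)
                            dst[i] = src[i];           /* out-of-bounds writes          */
                        return dst;
                    }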

                • Valmond@lemmy.world · 1 point · 5 days ago

                  And a byte wasn't even obliged to have 8 bits.

                  Nice example, but I'd say it's kind of niche 😁 It reminds me of the underflow in a video game that turned the most peaceful NPC into a warmongering lunatic. But defines wouldn't have helped with that either.

          • SilverShark@programming.dev · 1 point · 6 days ago

            It was a while ago indeed, and readability does play a big role. Also, the short names are just easier to type out. Auto-complete helps, of course, but it's still easier.

    • termaxima@slrpnk.net · 1 point · 5 days ago

      Getters and setters are superfluous in most cases, because you do not actually want to hide complexity from your users.

      To use the usual trivial example: if you change your circle's circumference from a property to a function, I need to know! You just replaced a memory access with some arithmetic; depending on my behaviour as a user, this could be either great or really bad for my performance.
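      In C terms, the difference looks like this (a toy sketch; the struct and names are hypothetical, not from any real API):

          #define PI 3.14159265358979323846

          struct Circle {
              double radius;
              double circumference;   /* variant A: precomputed field -- one memory read */
          };

          /* variant B: recomputed on every call -- arithmetic hidden behind a getter */
          double get_circumference(const struct Circle *c) {
              return 2.0 * PI * c->radius;
          }

      A caller iterating over millions of circles very much wants to know which of the two it is getting.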

    • ExLisper · 1 point · 5 days ago

      Exactly this. And to know what code is easy to maintain, you have to watch a couple of projects evolve over time. Your perspective on this changes as you gain experience.