- Windows Latest found Discord and other Chromium- and Electron-based applications using unusually high amounts of RAM
- Discord's RAM usage spikes from 1 GB to 4 GB, both in and out of voice chat
Electron is a f…ing cancer for the desktop
cough Windows START MENU cough cough
This isn’t news lol
I remember how the combination of Internet mass distribution of file data and the blossoming gray market for file-share applications really super-charged the technology of file compression.
I wonder if we’ll see skyrocketing RAM prices put economic pressure on the system bloat rampant through modern OSes.
Isn't the bloat basically being coded by the same AI that's eating up the RAM to begin with?
I mean, ymmv. The historical flood of cheap memory changed developer practices. We used to code around keeping the bulk of our data on the hard drive and only use RAM for active calculations. We even used to lean on "virtual memory" on the disk, caching calculations and scrubbing them over and over again, to simulate more memory than we had on the stick. SSDs changed that math considerably: we got a bunch of very high-efficiency disk space at a significant markup, but the same underlying technology went into our RAM. So there was a point at which one might have nearly as much RAM as disk (a friend had 1 GB of RAM in a device with only a 2 GB hard drive). The incentives were totally flipped.
I would argue that the low-cost, high-efficiency RAM induced the system bloat, as applications could run very quickly even on a fraction of available system memory. Meanwhile, applications that were RAM hogs appeared to run very quickly compared to applications that needed to constantly read off the disk.
Internet applications added to the incentive to bloat RAM, as you could cram an entire application onto a website and just let it live in memory until the user closed the browser. Cloud storage played the same trick. Developers were increasingly inclined to ignore the disk entirely. Why bother? Everything was hosted on a remote server, lots of the data was pre-processed on the business side, and then you were just serving the results to an HTML/JavaScript GUI in the browser.
Now it seems like tech companies are trying to turn the entire computer interface into a dumb terminal for the remote data center. Our migration to phones and pads and away from laptops and desktops illustrates as much. I wouldn't be surprised if someone finally makes consumer-facing dumb terminals a thing again - something we haven't really experienced since the dawn of personal computers in the 1980s.
But TL;DR: I'd be more inclined to blame "bloat" on web browsers and low-cost memory post-'00s than on AI-written code.
Now it seems like tech companies are trying to get the entire computer interface to be a dumb terminal to the remote data center. Our migration to phones and pads and away from laptops and desktops illustrates as much. I wouldn’t be surprised if someone finally makes consumer facing dumb-terminals a thing again - something we haven’t really experienced since the dawn of personal computers in the 1980s.
It is definitely coming, and fast. This was always Microsoft's plan for an internet-only Windows/Office platform. OneDrive and 365 are basically that implementation, now that we have widespread high-speed internet.
And with the number of SaaS apps, the only thing you need on a local machine is some configuration files and maybe a downloads folder.
Look at the new Nintendo Switch cartridges as an example. They don’t contain the game, just a license key. The install is all done over the internet.
Thank Google for those cool products.
what's google got to do with it? this is an article about a product developed at GitHub (now a microsoft subsidiary) causing problems with Windows, and the thumbnail is showing products from the following companies:
- discord
- microsoft
- microsoft
- microsoft
- microsoft
like. look. i hate google. they partner with israel to conduct genocide (don’t use waze, btw, or better yet, don’t use any google products). but this seems like not looking at the whole of how evil all of big tech is just to focus on how evil one company in big tech is
The article mentions Chrome/Chromium: 9 times
The article mentions Google: 0 times

Google made Chrome. Chrome had that multi-process architecture at its core, which allowed it to consume as much memory as needed even on a 32-bit OS. Chromium was always inside it and open source. Then they created CEF, which let webdevs build "real" apps, and that opened the floodgates. Electron was first built on CEF, but they wanted to include Node and couldn't, because it required too much experience in actual coding, so they switched to Chromium. That didn't change much in the structure; it basically just invited more webdevs to build more "real" apps (at its 1.0 release, Electron advertised hundreds of apps built with it on its website).
Google could have done something about how the web engine works in frameworks (which don't need that much actual web functionality), but didn't. They invited webdevs to do anything they wanted. Webdevs didn't care about security because mighty Google would just publish a new Chromium update eventually. They never realized that a local "real" app GUI that just connects to their website doesn't need all that extra security, because there isn't much room for danger in such a scenario; they simply kept updating the underlying engine because why not. The Chromium DLL is now at 300 MB or something? All of that code is much needed by everyone, is it not?
So, for me the sequence was always seen as this:
Google (caring about webdevs, not OS) ->
Webdevs (not caring about native code and wanting to sell their startup websites by building apps) ->
Reckless web development becoming a norm for desktop apps ->
Corporations not seeing problems with the above (e.g. Microsoft embedding more stuff with WebView2 aka Chromium)
So yes, Google has everything to do with it because it provided all the bad instruments to all the wrong people.
Personally, I don’t care much about hating Microsoft anymore because its products are dead to me and I can only see my future PCs using Linux.
If there’s any silver lining to this, perhaps we can get a renewed interest in efficient open-source software designed to work well on older hardware, and less e-waste.
Morgan Freeman: ”They couldn’t”
I wish we could, but it’s tough to maintain optimism in the face of these sociopathic corporations’ seemingly ever-growing power
If there's any silver lining to this: fuck JavaScript, fuck JavaScript wrappers, and fuck all the people who picked JavaScript as the programming language for anything cross-platform.
It's unbelievable that I would need 6 GB of RAM to say a simple "hello" to my friends. It used to take 300 KB with IRC.
that has very little to do with JavaScript though 🤷‍♂️
Even Electron apps aren’t necessarily ram hoarders: Stretchly, which is a break reminder and thus needs to always run in the background, takes something like 20 or 40 MB of memory.
“It sounds like you want low-end devices to be turned into thin clients for cloud-based operating systems. Do I have that right?”
I’d love to see games do this because they are clearly not being optimized. Can’t wait to see that not happen.
Good thing I'm happy with retro games and the occasional indie.
Why would you do that when you can pull 50 JavaScript libraries and wrap it in Electron?
That'll be 800€, and all the change you own.
there are a shit ton of alternatives. Too bad there are more average developers
Way ahead of you, Luddites
edit: https://www.reddit.com/r/pcmasterrace/comments/ukwjwa/anyone_remember_this_scam/
first link died for some reason, probably not enough RAM
And here I am, resurrecting Dell laptops from 2010 with 1.5 GB of DDR RAM and Debian
Yeah, the RAM shortage is definitely Electron's fault. Won't someone please think of the poor AI companies who have to give an arm and a leg to get a single stick of RAM!
I wouldn’t mind so much if they were giving their own arms and legs, but they seem to be giving ours.
If you have a better way of generating videos of absurdly obese Olympic divers doing the bomb from a crane, I’d love to hear it.
Tbf, isn't AI mainly used by shitty companies to code Electron apps?
I guess the prices give us a new kind of issue-ticket template: "new RAM is too expensive for me, please consider optimizing"
Less abstract, more concrete than “take less of a share please”
The proliferation of Electron programs is what happens when you have a decade of annoying idiots saying "unused memory is wasted memory," hand in hand with lazy developers or unscrupulous managers who externalize their development costs onto everybody else by writing inefficient programs that waste more and more of our compute and RAM. That forces the rest of us to buy ever better hardware just to keep up.
annoying idiots saying “unused memory is wasted memory,”
The original intent of this saying was different, but yeah, it's been co-opted into something else.
So what was the original meaning? As I see it, the phrase is wrong no matter how you look at it, because all RAM is used at all times anyway. For example, if you have 32 GB of free RAM, the kernel will use all of it as a page cache to speed up the file system. The more free RAM you have, the more files can be cached, avoiding disk access when you read them.
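You can watch this happen yourself, e.g. with a small Node/TypeScript sketch on Linux (purely illustrative; the field names come straight from /proc/meminfo):

```typescript
// meminfo.ts — print how much "free" RAM the kernel is actually
// using as page cache (Linux only; run with ts-node or compile first).
import { readFileSync } from 'node:fs';

const meminfo = readFileSync('/proc/meminfo', 'utf8').split('\n');

// MemFree is truly idle memory; Cached is file data the kernel keeps
// in RAM so that reads can skip the disk entirely.
for (const field of ['MemTotal', 'MemFree', 'Cached']) {
  const line = meminfo.find((l) => l.startsWith(field + ':'));
  if (line) console.log(line.trim());
}
```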
Seems like you understand the original meaning already.
Limitation breeds innovation
Like a Rust-based alternative to VSCode
No thanks. Any software that has AI integration as one of its main selling points is shitware imo.
Entirely optional, it’s marketing, hate the game not the player
VSCode's history is so messed up. Microsoft buys GitHub and stops development of the GitHub team's IDE, then uses the framework developed for that IDE to make VSCode.
Fucking 1600s colonizer behavior.
I wouldn’t mind them all using HTML for UI if they’d learn to share the same one, and only load it when they need to show me something.
No, Razer, your “mouse driver” does not need to load Chrome at all times, when I’ll only ever look at it once.
No, Razer, your “mouse driver” does not need to load Chrome at all times, when I’ll only ever look at it once.
It’s funny; on Linux such devices work perfectly but many users complain that they “aren’t supported” because there’s no UI (that sits uselessly in your notification area and eats memory).
I miss the times when 4 gigs of RAM were more than enough for browsing and playing a game at the same time
cries in 8 GB RAM
It still is if you’re willing to jump through enough hoops
It's because people want cross-platform apps and web is the easiest way to do it. Yes, you have Flutter, KML or Qt, but those are often hard to work with (looking at you, Flutter), or it's difficult to find devs who can work with them. You choose web (JS/wasm) and you have plenty of devs familiar with the tools, and you can support all the platforms easily. I'm using Tauri for my personal projects because it's fun and easy. I could use Qt, but I don't want to work with C++ or Python, at least not in my spare time. If anyone can recommend a nice framework supporting Linux and Android and using a modern language, I might switch. I haven't found one.
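For what it's worth, the frontend half of a Tauri app is just TypeScript calling into Rust. A minimal sketch, assuming a `greet` command is registered on the Rust side and the Tauri v1 import path (v2 moved `invoke` to '@tauri-apps/api/core'):

```typescript
// Frontend half of a Tauri app: plain TypeScript invoking a Rust command.
// Assumes the backend registered:
//   #[tauri::command] fn greet(name: &str) -> String
import { invoke } from '@tauri-apps/api/tauri';

async function greet(name: string): Promise<void> {
  // invoke() serializes the args, crosses the IPC boundary, and
  // resolves with whatever the Rust command returns.
  const reply = await invoke<string>('greet', { name });
  console.log(reply);
}

greet('world');
```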
It’s because there is no such thing as optimisation anymore. Websites are bloated to the gills with terrible animations and tracking scripts.
It’s because people want cross-platform apps and web is the easiest way to do it.
Just use the website then? There's already a suitable browser installed on every system. But no, we must have apps. It makes it easier to stop people from having opinions about data collection and such. And the full browser stack gets reproduced in memory each time; it gets really ridiculous when these apps sit idly in the notification area. Not to speak of the security implications, because Electron apps and the like usually don't get timely updates.
Just use the website then?
It is a good solution for some apps, but if you need to store data locally, use push notifications, run something in the background, or access any native APIs, you have to go with a native app.
All major browsers can do this - with the exception of running something in the background, I guess. But that is exactly the sort of usage scenario where an Electron app is the worst choice: coding a separate utility with no GUI would be the sane thing to do there, not putting whole browser stacks into memory.
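To illustrate the local-storage and notification parts, a browser-only sketch (no framework; the names are just placeholders):

```typescript
// Two of the "native app" needs, covered by standard web APIs.
// Paste into any modern browser's console; no Electron required.

// Local persistence across sessions via localStorage.
localStorage.setItem('draft', JSON.stringify({ text: 'hello' }));
const draft = JSON.parse(localStorage.getItem('draft') ?? '{}');
console.log('restored draft:', draft);

// Desktop notifications, gated behind a user-granted permission.
async function notify(message: string): Promise<void> {
  if ((await Notification.requestPermission()) === 'granted') {
    new Notification(message);
  }
}
notify('It works without Electron');
```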
Pretty sure PWAs can run in the background.
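They can, in the event-driven sense: a PWA's service worker gets woken up for events like push even when no tab is open. A minimal sketch of a hypothetical sw.ts:

```typescript
/// <reference lib="webworker" />
// sw.ts — minimal service worker; the 'push' handler runs even when
// no tab is open, which is the closest the web gets to a background task.
declare const self: ServiceWorkerGlobalScope;

self.addEventListener('push', (event: PushEvent) => {
  const data = event.data?.json() ?? { title: 'Ping' };
  // waitUntil keeps the worker alive until the notification is shown.
  event.waitUntil(self.registration.showNotification(data.title));
});
```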
You're actually right; by now browsers have APIs to do most of the things apps do. Technically you could convert most apps to websites. I guess as a user I just don't want all my apps to open a tab in my browser. I want to move apps between virtual desktops and monitors independently, and I don't want my app's window to be cluttered by all the menus from my browser. On mobile I also prefer switching between apps to switching between tabs. For me the best compromise is:
- for system tools that don't have to be cross-platform, and for critical apps, write native apps
- for small/medium cross-platform apps, use webviews like WebView2 or Tauri
- for big apps like Teams or Discord, just use a website
I guess as a user I just don't want all my apps to open a tab in my browser. I want to move apps between virtual desktops and monitors independently, and I don't want my app's window to be cluttered by all the menus from my browser.
All this is already possible with most browsers.
Do you know any websites that integrate into the Linux desktop and Android like native apps? I mean, I can run it from cmd/icon, and it opens as a new window without any decorations? I've never seen it, but if it works fine it's an interesting option.
Those are called Progressive Web Apps (PWAs). You can use Firefox to add the website to your desktop like this: https://developer.mozilla.org/en-US/docs/Web/Progressive_web_apps/Guides/Installing
Once you do, opening the app should show just the website, without the tabs and everything else Firefox does.
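On the site's side, the plumbing that makes a page installable is tiny; roughly this (the /manifest.json and /sw.js names here are conventional, not mandated):

```typescript
// main.ts — the minimal plumbing a site needs before browsers treat it
// as an installable PWA: a web app manifest linked from the HTML, plus
// a registered service worker.
if ('serviceWorker' in navigator) {
  navigator.serviceWorker
    .register('/sw.js')
    .then(() => console.log('service worker registered'))
    .catch((err) => console.error('registration failed:', err));
}
```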
I have been really enjoying working with Avalonia. It's a .NET library that works across Windows, Linux, and Mac and lets you use C# for desktop app development on those platforms. It's what MAUI should have been.