• Windows Latest found that Discord and other Chromium- and Electron-based applications exhibit unusually high RAM usage
  • Discord's RAM usage spikes from 1 GB to 4 GB, both in and out of voice chat
  • GnuLinuxDude@lemmy.ml · 57↑ 2↓ · 4 days ago

    The proliferation of Electron programs is what happens when you have a decade of annoying idiots saying “unused memory is wasted memory,” hand in hand with lazy developers or unscrupulous managers who externalize their development costs onto everybody else by writing inefficient programs that waste more and more of our compute and RAM, forcing the rest of us to buy ever better hardware just to keep up.

    • CeeBee_Eh@lemmy.world · 19↑ · 4 days ago

      annoying idiots saying “unused memory is wasted memory,”

      The original intent of that saying was different, but yeah, it’s been co-opted into something else.

      • pftbest@sh.itjust.works · 1↑ · 3 days ago

        So what was the original intent, then? As I see it, the phrase is wrong no matter how you look at it, because all RAM is in use all the time. For example, if you have 32 GB of free RAM, the kernel will use all of it as page cache to speed up the file system; the more free RAM you have, the more files can be cached, avoiding disk access when you read them.
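
        You can watch that happening on any Linux box via /proc/meminfo: the "Cached" counter is the page cache soaking up memory nothing else has claimed. A minimal sketch (Linux-specific; the counters printed are just the ones relevant here):

        #include <stdio.h>
        #include <string.h>

        /* Print the /proc/meminfo counters that show how "free" RAM is
         * actually being used as page cache by the kernel. */
        int main(void) {
            FILE *f = fopen("/proc/meminfo", "r");
            if (!f) { perror("fopen /proc/meminfo"); return 1; }
            char line[256];
            while (fgets(line, sizeof line, f)) {
                if (strncmp(line, "MemTotal:", 9) == 0 ||
                    strncmp(line, "MemFree:", 8) == 0 ||
                    strncmp(line, "MemAvailable:", 13) == 0 ||
                    strncmp(line, "Cached:", 7) == 0)
                    fputs(line, stdout);   /* lines are already "Name:  value kB" */
            }
            fclose(f);
            return 0;
        }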

  • _cryptagion [he/him]@anarchist.nexus · 122↑ · 4 days ago

    I really wish Electron wasn’t as popular as it is. It’s such a fucking memory hog. I mean, sure, I’ve got RAM to spare, but I shouldn’t need that much for a single app.

    • modular950@lemmy.zip · 9↑ · 4 days ago

      maybe a toggle to choose between “take some extra RAM, I’m feeling generous” and “fuck you, I’m computing shit over here” could be used to let the app know your current mood / needs …

      • TheBlackLounge@lemmy.zip · 11↑ · edited · 4 days ago

        Memory-hogging browsers usually do release memory when the system is under pressure. You can take it further with extensions that unload unused tabs.

        The problem is Electron apps, each of which loads its own copy of the whole browser core.
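
        As an aside, "pressure" here is something the kernel actually exposes: on Linux 4.20+ built with PSI, /proc/pressure/memory reports how much time tasks have recently stalled waiting for memory. A trivial way to peek at it (assuming PSI is enabled on your kernel):

        #include <stdio.h>

        /* Dump the kernel's memory-pressure (PSI) figures; avg10/avg60/avg300
         * are the share of recent time spent stalled waiting for memory. */
        int main(void) {
            FILE *f = fopen("/proc/pressure/memory", "r");
            if (!f) { perror("fopen /proc/pressure/memory"); return 1; }
            char line[256];
            while (fgets(line, sizeof line, f))
                fputs(line, stdout);
            fclose(f);
            return 0;
        }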

    • cmnybo@discuss.tchncs.de · 49↑ · 4 days ago

      Yes, each Electron program runs its own separate browser instance. Many of the programs that use it could just be PWAs instead.
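
      A rough way to see what that per-app browser instance costs is to sum resident memory across all of an app's processes. A sketch, assuming Linux and /proc; "Discord" as the default name is just an example, and note that VmRSS double-counts pages shared between processes, so treat the total as an upper bound:

      #include <ctype.h>
      #include <dirent.h>
      #include <stdio.h>
      #include <string.h>

      /* Sum VmRSS across every process whose name contains the given string.
       * Electron apps spawn several helper processes, so looking at a single
       * PID understates their real memory footprint. */
      int main(int argc, char **argv) {
          const char *target = argc > 1 ? argv[1] : "Discord";   /* example name */
          DIR *proc = opendir("/proc");
          if (!proc) { perror("opendir /proc"); return 1; }

          long total_kb = 0;
          struct dirent *de;
          while ((de = readdir(proc)) != NULL) {
              if (!isdigit((unsigned char)de->d_name[0]))
                  continue;                         /* only numeric entries are PIDs */
              char path[64], line[256], name[128] = "";
              long rss_kb = 0;
              snprintf(path, sizeof path, "/proc/%s/status", de->d_name);
              FILE *f = fopen(path, "r");
              if (!f) continue;                     /* process may have exited */
              while (fgets(line, sizeof line, f)) {
                  sscanf(line, "Name: %127s", name);
                  sscanf(line, "VmRSS: %ld", &rss_kb);
              }
              fclose(f);
              if (strstr(name, target))
                  total_kb += rss_kb;
          }
          closedir(proc);
          printf("total VmRSS for \"%s\": %.1f MB\n", target, total_kb / 1024.0);
          return 0;
      }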

      • plz1@lemmy.world · 6↑ · 4 days ago

        I tried the PWA route with Discord. It wouldn’t stay logged in and was generally janky. That said, I go the PWA route with any app that would otherwise be Electron, at least to try to avoid the RAM bloat.

      • lastweakness@lemmy.world · 43↑ · 4 days ago

        This is what bothers me so much… Browsers should be improving their PWA implementations (looking at you, Firefox), and more Electron apps should be PWAs. Another decent middle ground is Tauri. SilverBullet and Yaak are both so much lighter and better than anything else on my system.

      • SaraTonin@lemmy.world · 5↑ · 3 days ago

        Or, even better, let’s start developing for separate platforms again and optimise software for the platform that’s going to be running it, rather than just developing everything for Chrome.

      • tal@lemmy.today · 8↑ · edited · 4 days ago

        I wonder how much exact duplication there is across all of those processes?

        https://www.kernel.org/doc/html/latest/admin-guide/mm/ksm.html

        Kernel Samepage Merging

        KSM is a memory-saving de-duplication feature, enabled by CONFIG_KSM=y, added to the Linux kernel in 2.6.32. See mm/ksm.c for its implementation, and http://lwn.net/Articles/306704/ and https://lwn.net/Articles/330589/

        KSM was originally developed for use with KVM (where it was known as Kernel Shared Memory), to fit more virtual machines into physical memory, by sharing the data common between them. But it can be useful to any application which generates many instances of the same data.

        The KSM daemon ksmd periodically scans those areas of user memory which have been registered with it, looking for pages of identical content which can be replaced by a single write-protected page (which is automatically copied if a process later wants to update its content). The amount of pages that KSM daemon scans in a single pass and the time between the passes are configured using sysfs interface

        KSM only operates on those areas of address space which an application has advised to be likely candidates for merging, by using the madvise(2) system call:

        int madvise(addr, length, MADV_MERGEABLE)
        

        One imagines that one could maybe make a library interposer to induce use of that.
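
        A speculative sketch of that interposer idea, since nothing like it ships with Electron or the kernel: preload a shared object that wraps mmap() and flags large anonymous mappings as MADV_MERGEABLE, so ksmd is allowed to merge identical pages across processes. The file name, the 1 MiB threshold, and the whole approach are my own choices; it also needs CONFIG_KSM and "echo 1 > /sys/kernel/mm/ksm/run" to do anything.

        /* merge_preload.c - hypothetical LD_PRELOAD shim that asks KSM to
         * de-duplicate large anonymous mappings made by the target program. */
        #define _GNU_SOURCE
        #include <dlfcn.h>
        #include <sys/mman.h>

        typedef void *(*mmap_fn)(void *, size_t, int, int, int, off_t);

        void *mmap(void *addr, size_t length, int prot, int flags, int fd, off_t offset) {
            static mmap_fn real_mmap;
            if (!real_mmap)
                real_mmap = (mmap_fn)dlsym(RTLD_NEXT, "mmap");   /* forward to libc's mmap */

            void *p = real_mmap(addr, length, prot, flags, fd, offset);

            /* Only advise large anonymous mappings; file-backed pages are
             * already shared between processes via the page cache. */
            if (p != MAP_FAILED && (flags & MAP_ANONYMOUS) && length >= (1 << 20))
                madvise(p, length, MADV_MERGEABLE);

            return p;
        }

        Build it with something like "gcc -shared -fPIC -o merge_preload.so merge_preload.c -ldl", launch the app with LD_PRELOAD pointing at it, and watch /sys/kernel/mm/ksm/pages_sharing to see whether anything actually merges. Whether Electron's allocations go through mmap paths this catches is exactly the open question.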

      • baconsunday@lemmy.zip · 6↑ 1↓ · 3 days ago

        Correct! The difference is the OS.

        Windows is a RAM hog, using 4 GB or more just to exist. Linux uses 1-2 GB, sometimes less.

        Microsoft FORCES Electron web components on you.

        Linux gives you a choice.

        So yes, Linux has Electron apps as well, but Linux itself is a lot lighter and nowhere near the hog that Windows is.

    • BlueMagma@sh.itjust.works · 4↑ · 2 days ago

      I’m using Linux on all my PCs. The RAM problems exist here too: Firefox takes the most, and the Slack app takes a big chunk as well. Linux is not exempt from badly written code; it’s everywhere, and nobody seems to care about optimizing their code’s memory usage anymore.

      • baconsunday@lemmy.zip · 1↑ · 2 days ago

        I’d be interested in seeing how you have things set up such that Firefox or Linux itself is using any substantial amount of RAM. That wouldn’t have anything to do with ‘badly written code’.

    • Psythik@lemmy.world · 2↑ · edited · 2 days ago

      I will once Nvidia gets off their asses and properly implements support for the Nvidia App in Linux. I’ve tried the alternative control panels for Nvidia GPUs. They suck.

      • baconsunday@lemmy.zip · 2↑ · 2 days ago

        Don’t remind me, I’m high on copium right now. I can’t even play any of my Steam games, but I have my Miyoo Mini+, so I’m surviving haha

    • The Quuuuuill@slrpnk.net · 4↑ · 3 days ago

      what’s google got to do with it? this is an article about a product developed at GitHub (now a microsoft subsidiary) causing problems with Windows, and the thumbnail shows products from the following companies:

      • facebook
      • discord
      • microsoft
      • microsoft
      • microsoft
      • microsoft

      like. look. i hate google. they partner with israel to conduct genocide (don’t use waze, btw, or better yet, don’t use any google products). but this seems like not looking at the whole of how evil all of big tech is just to focus on how evil one company in big tech is

      • Turret3857@infosec.pub · 4↑ · 3 days ago

        CoMaps is a good alternative to Waze. If you think it isn’t, make an OSM account and help make it a good alternative :p

      • rdri@lemmy.world · 3↑ · edited · 3 days ago

        The article mentions Chrome/Chromium: 9 times
        The article mentions Google: 0 times

        Google made Chrome. Chrome had that multi-process architecture at its core, which allowed it to consume as much memory as needed even on a 32-bit OS. Chromium was always inside it, and open source. Then they created CEF, which let webdevs build “real” apps, and that opened the floodgates. Electron was first built on it, but they wanted to include Node and couldn’t, because that required too much experience in actual coding, so they switched to Chromium. It didn’t change much structurally; it basically just invited more webdevs to build more “real” apps (at its 1.0 release, Electron advertised hundreds of apps built with it on its website).

        Google could have done something about how the web engine works in frameworks (which don’t need that much actual web functionality), but didn’t. They invited webdevs to do anything they want. Webdevs didn’t care about security, because mighty Google would just publish a new Chromium update eventually. They never realized they don’t need more security in their local “real” app GUIs that connect to their websites, because there isn’t much room for security danger in such scenarios. They just kept updating the underlying engine because why not. The Chromium DLL is now at 300 MB or something? All of that code is much needed by everyone, is it not?

        So, to me, the sequence has always looked like this:

        Google (caring about webdevs, not OS) ->

        Webdevs (not caring about native code and wanting to sell their startup websites by building apps) ->

        Reckless web development becoming a norm for desktop apps ->

        Corporations not seeing problems with the above (e.g. Microsoft embedding more stuff with WebView2 aka Chromium)

        So yes, Google has everything to do with it because it provided all the bad instruments to all the wrong people.

        Personally, I don’t care much about hating Microsoft anymore because its products are dead to me and I can only see my future PCs using Linux.

  • Lightsong@lemmy.world · 6↑ · 4 days ago

    I have a couple of old 8 GB sticks from my old PC with a 960 GPU. Is there any way for me to stick them into my new PC and have only a certain app use them and nothing else?

    • Logical@lemmy.world · 3↑ · 4 days ago

      As to whether it’s possible to have certain apps use specific physical RAM sticks, I’m not sure, but that seems unlikely and would probably require some very low-level modifications to your operating system. But even before you get to that point, you’d have to physically connect them to your new motherboard, which only works if it has free RAM slots and those slots take the same generation of RAM your old PC used.

    • towerful@programming.dev · 5↑ · edited · 4 days ago

      Only on multi-CPU mobos (and that would be pinning a thread to a CPU/core with NUMA enabled, so that a task accesses its local RAM instead of all system RAM). Even then, I think all the RAM would run at the lowest common frequency.
      I’ve never mixed CPUs and RAM speeds; I’ve only ever worked on systems with matching CPUs and RAM modules.

      I think the hardware cost and software complexity of achieving this outweigh the cost of “more RAM” or “faster storage (for faster swap)”.
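
      For what it’s worth, on an actual NUMA system the binding itself is simple; the exotic part is the hardware. A loose sketch using set_mempolicy from libnuma (needs the numaif.h header and -lnuma; node 0 is an arbitrary example), which forces a process’s future allocations onto one node’s local RAM:

      #include <numaif.h>
      #include <stdio.h>
      #include <stdlib.h>
      #include <string.h>

      int main(void) {
          /* MPOL_BIND: all future page allocations must come from nodes in the mask. */
          unsigned long nodemask = 1UL << 0;                  /* node 0 only */
          if (set_mempolicy(MPOL_BIND, &nodemask, sizeof(nodemask) * 8) != 0) {
              perror("set_mempolicy");
              return 1;
          }

          size_t len = 64UL << 20;                            /* 64 MiB */
          char *buf = malloc(len);
          if (!buf) return 1;
          memset(buf, 0, len);                                /* fault pages in on node 0 */
          printf("allocated %zu MiB bound to NUMA node 0\n", len >> 20);
          free(buf);
          return 0;
      }

      The numactl tool can do the same from the command line (numactl --membind=0 <app>), but as you say, none of this makes mismatched leftover DIMMs behave like a dedicated per-app pool.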

  • UnderpantsWeevil@lemmy.world · 31↑ · 4 days ago

    I remember how the combination of Internet mass distribution of file data and the blossoming gray market for file-share applications really super-charged the technology of file compression.

    I wonder if we’ll see skyrocketing RAM prices put economic pressure on the system bloat that’s rampant in modern OSes.

      • UnderpantsWeevil@lemmy.world · 15↑ · 4 days ago

        I mean, YMMV. The historical flood of cheap memory has changed developer practices. We used to code around keeping the bulk of our data on the hard drive and only use RAM for active calculations. We even used to lean on “virtual memory” on the disk, caching calculations and scrubbing them over and over again, in order to simulate more memory than we had on the stick. SSDs changed that math considerably: we got a bunch of very high-efficiency disk space at a significant markup, but we used the same technology in our RAM. So there was a point at which one might have nearly as much RAM as ROM (a friend had 1 GB of RAM on a device that only had a 2 GB hard drive). The incentives were totally flipped.

        I would argue that the low-cost, high-efficiency RAM induced the system bloat, as applications could run very quickly even on a fraction of available system memory. Meanwhile, applications that were RAM hogs appeared to run very quickly compared to applications that needed to constantly read off the disk.

        Internet applications added to the incentive to bloat RAM, as you could cram an entire application onto a website and just let it live in memory until the user closed the browser. Cloud storage played the same trick. Developers were increasingly inclined to ignore the disk entirely. Why bother? Everything was hosted on a remote server, lots of the data was pre-processed on the business side, and then you were just serving the results to an HTML/Javascript GUI on the browser.

        Now it seems like tech companies are trying to get the entire computer interface to be a dumb terminal to the remote data center. Our migration to phones and pads and away from laptops and desktops illustrates as much. I wouldn’t be surprised if someone finally makes consumer facing dumb-terminals a thing again - something we haven’t really experienced since the dawn of personal computers in the 1980s.

        But TL;DR: I’d be more inclined to blame “bloat” on web browsers and cheap post-’00s memory than on AI-written code.

        • nosuchanon@lemmy.world · 2↑ · 3 days ago

          Now it seems like tech companies are trying to get the entire computer interface to be a dumb terminal to the remote data center. Our migration to phones and pads and away from laptops and desktops illustrates as much. I wouldn’t be surprised if someone finally makes consumer facing dumb-terminals a thing again - something we haven’t really experienced since the dawn of personal computers in the 1980s.

          It is definitely coming, and fast. This was always Microsoft’s plan: an internet-only Windows/Office platform. OneDrive and 365 are basically that implementation, now that we have widespread high-speed internet.

          And with the number of SaaS apps out there, the only thing you need on a local machine is some configuration files and maybe a downloads folder.

          Look at the new Nintendo Switch cartridges as an example: they don’t contain the game, just a license key, and the install is all done over the internet.

  • Brkdncr@lemmy.world · 11↑ 3↓ · 4 days ago

    This is a trade-off. Many of these apps work on macOS and Linux because they are browser-based. If they go back to native apps, you lose that portability.

    • GreenKnight23@lemmy.world · 9↑ 2↓ · 4 days ago

      electron was a steaming pile of shit 8 years ago. still is. what’s changed?

      our acceptance of shitty corporate software.

    • Feyd@programming.dev · 25↑ 1↓ · 4 days ago

      There are plenty of cross-platform frameworks and libraries that don’t involve web tech.

      • dukemirage@lemmy.world · 3↑ · 4 days ago

        There is also the advantage of an army of web devs who can build somewhat functional software for the desktop from day one.

      • boonhet@sopuli.xyz · 3↑ · 4 days ago

        There are, but few of them also work on the web as an alternative to the desktop. Writing one shitty web app and offering Electron-wrapped versions of it gets you a web app, a Windows app, a Linux app, and a macOS app. And you already have web devs on the team, because everyone does.

        • fuzzzerd@programming.dev · 3↑ · 4 days ago

          I hate that you’re right. Giving up Electron would likely mean less Linux and Mac compatibility. It’s a shame, but it’s likely true.

          • boonhet@sopuli.xyz · 1↑ · 4 days ago

            For commercial software, definitely. It’d be web and MAYBE Windows unless there’s a Qt nerd spearheading the project or something.

            FOSS is actually better off here IMO, since it’s done by people as passion projects, so there’s no need to pinch pennies by eliminating target platforms. HOWEVER there’d also be more need for the devs to have different platforms to test on.

  • xthexder@l.sw0.com · 30↑ 2↓ · 4 days ago

    Windows Latest discovered Discord and other Chromium and Electron-based applications with high RAM usage

    Lol, this is news? Where have they been the last 15 years?

    In other news, the sky is blue.

    • Pringles@sopuli.xyz · 13↑ · 4 days ago

      Just another AI agent bro, that will fix th

      Out of Memory or System Resources. Close some windows or programs and try again.

      • phlegmy@sh.itjust.works · 3↑ 1↓ · 4 days ago

        No thanks. Any software that has AI integration as one of its main selling points is shitware imo.

      • Jankatarch@lemmy.world · 5↑ · edited · 3 days ago

        VSCode’s history is so messed up. Microsoft buys GitHub and stops development of the GitHub team’s IDE, then uses the framework developed for that IDE to make VSCode.

        Fucking 1600s colonizer behavior.

  • tal@lemmy.today · 10↑ 3↓ · 4 days ago

    I mostly use terminal-based software on Linux.

    I think that the only programs I use much that embed a web browser are:

    • Firefox

    • Steam

    • Some games that are Web-based and which I only run one of at once (Neo Scavenger, some RPGMaker-based games, probably some others).

  • Jo Miran@lemmy.ml · 45↑ · 4 days ago

    I normally reply with a Tux penguin, but is it really a Windows problem if it’s the apps that aren’t optimized for shit?

      • frozen@lemmy.frozeninferno.xyz · 6↑ · 4 days ago

        Yep. My 64 gigs of RAM died in my old setup a few weeks ago, and instead of paying out the ass for replacement DDR4 RAM, I decided to pay out the ass for DDR5 RAM and upgrade while I was at it. Only did 32 gigs, because I really wasn’t using most of my 64 gigs (I thought). A few days ago, I ended up having to set up a swap file because a Rust project I was working on kept crashing VSCode while it was running the analyzer. What are we doing here.

    • mushroommunk@lemmy.today · 26↑ 1↓ · 4 days ago

      Windows itself is inherently unoptimized these days, with its AI-powered search in the Start menu and settings pages that are actually Electron web apps. Linux uses 3+ GB less RAM just sitting at the desktop since I switched. So really it’s a both problem: both Windows and the programs need a rethink.