• OctopusNemeses@lemmy.world · 1 day ago

I’m pretty sure the “unused RAM is wasted RAM” line has caused its share of damage, courtesy of shit developers who took it to mean “use memory with reckless abandon.”

    • ThePantser@sh.itjust.works · 1 day ago

      Would be nice if I could force programs to use more RAM, though. I actually have 100 GB of DDR4 in my desktop; I bought it over a year ago when DDR4 was unloved and cheap. I’ve tried to force programs not to offload as much. Firefox, for example: I hate that I have the RAM, but it still unloads webpages in the background and never uses more than 6 GB.

      • iglou@programming.dev · 1 day ago

        Programs that care about memory optimization will typically adapt to your setup, up to a point. More RAM isn’t going to make a program run any better if it has no use for it.

      • floquant@lemmy.dbzer0.com · 1 day ago

        Set swappiness to 5 or something similar, or disable swap altogether, unless you’re regularly getting close to max usage.
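On Linux that’s a sysctl knob; a minimal sketch (the `99-swappiness.conf` filename is just a conventional choice):

```shell
# Show the current swappiness (the default is usually 60)
cat /proc/sys/vm/swappiness

# Lower it for the running system only
sudo sysctl vm.swappiness=5

# Persist the setting across reboots
echo 'vm.swappiness=5' | sudo tee /etc/sysctl.d/99-swappiness.conf
```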

            • Vlyn@lemmy.zip · 1 day ago

              Maybe it has changed again, but I gave it a try in the past, back when 16 GB was a lot, and again when 32 GB was a lot. I always thought: “Not filling up the RAM anyway, might as well disable it!”

              Yeah, no, Windows is not a fan. You get random “running out of memory” errors even though, with 16 GB, I still had 3–4 GB of RAM free.

              Some apps require the page file, and so do crash dumps. So I just set it to a fixed value (32 GB min and max) on my 64 GB machine.
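One way to pin the page file from an elevated Command Prompt; `wmic` is deprecated on recent Windows builds, but the classic incantation still illustrates the idea (sizes are in MB, and the pagefile path is assumed to be the default):

```shell
:: Stop Windows from managing the page file size automatically
wmic computersystem where name="%COMPUTERNAME%" set AutomaticManagedPagefile=False

:: Fix the page file at 32 GB (initial = maximum, in MB)
wmic pagefileset where name="C:\\pagefile.sys" set InitialSize=32768,MaximumSize=32768
```

Setting initial and maximum to the same value avoids runtime resizing and fragmentation, which is presumably why the fixed value was chosen.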

    • iglou@programming.dev · 1 day ago

      In most cases, you either optimize the memory, or you optimize the speed of execution.

      Having more memory means we can optimize the speed of execution.

      Now, the side effect is that we can also afford to be slower to gain other benefits: ease of development (enter JavaScript everywhere, or Python) at the cost of speed, maintainability at the cost of speed, etc.

      So, even though you don’t always see performance gains as the years go by, that doesn’t mean shit devs; it means the priority is somewhere else. We have more complex software today than 20 years ago because we can afford not to focus on RAM and speed optimization, and can instead focus on maintainable, unoptimized code that does complex stuff.

      Optimization is not everything.

      • mnemonicmonkeys@sh.itjust.works · 1 day ago

        unoptimized code that does complex stuff.

        You can still have complex code that is optimized for performance. You can spend more resources to do more complex computations and still be optimized so long as you’re not wasting processing power on pointless stuff.

        For example, in some of my code I have to fit a physics model to within 0.001°. I don’t use that step size on every loop, because that’d be stupid and wasteful. I start iterating with a 1° step until it overshoots the target, back off, reduce the step to a tenth of its size, and repeat that logic until I get the result to the desired accuracy.
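A rough sketch of that coarse-to-fine search (hypothetical names; assumes the model function is monotonically increasing near the target):

```python
def refine(f, target, start=0.0, step=1.0, tol=0.001):
    """Advance x in coarse steps until the next step would overshoot
    `target`, then shrink the step tenfold and repeat until the step
    drops below `tol`. Assumes f is monotonically increasing."""
    x = start
    while step >= tol:
        # March forward while the next step still undershoots the target
        while f(x + step) < target:
            x += step
        # The next step would overshoot: don't take it, refine it 10x
        step /= 10.0
    return x

# Example: find where x**2 reaches 2, i.e. approximate sqrt(2)
root = refine(lambda x: x * x, target=2.0)
```

With a 1.0 starting step and a 0.001 tolerance, this evaluates far fewer points than marching from the start in 0.001 increments, which is the whole point of the back-off.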

        • iglou@programming.dev · 10 hours ago

          Of course! But sometimes, most often even, the optimization is not worth the development effort to get it. We’re specifically talking about memory optimization here, and memory is so cheap (or at least it was… ha) that it’s not worth optimizing like we used to 25 years ago. Instead you use higher-level languages with garbage collection or equivalents, which are easier to maintain and faster to implement new stuff in. You use algorithms that consume a fuck-ton of memory in exchange for speed. And as long as it is fast enough, you shouldn’t over-optimize.

          Proper optimization these days is more of a hobby.

          Now obviously some fields require a lot more optimization - embedded systems, for instance. Or simulations, which get a lot of value from being optimized as much as possible.

    • Vlyn@lemmy.zip · 1 day ago

      With 32 and 64 GB systems I’ve never run out of RAM, so the RAM isn’t the issue at all.

      Optimization just sucks.

        • Vlyn@lemmy.zip · 10 hours ago

          Decent sized for what?

          Creative writing and roleplay? Plenty, but I try to fit the model into my 16 GB of VRAM, as otherwise it’s too slow for my liking.

          Coding/complex tasks? No, that would need 128 GB and upwards, and it would still be awfully slow, unless you use a Mac with unified memory.

          For image and video generation you’d want to fit it into GPU VRAM again, system RAM would be way too slow.