• Felis_Rex@lemmy.zip · ↑5 · 4 days ago

      When I saw people’s already-processed orders getting cancelled for no reason other than price gouging, I was infuriated.

      The level of greed in this world sickens me.

  • rogsson@piefed.social · ↑44 · 4 days ago

    When the planned data centers never get built because the AI slop bubble pops, we’ll be able to build houses out of RAM sticks for the poor.

    • veni_vedi_veni@lemmy.world · ↑11 · edited · 4 days ago

      The problem with data center hardware is that it’s often bespoke and nowadays can’t be reused in a consumer context. Think of those headless GPUs; they’re probably making these RAM modules with a different interface.

      They will just be e-waste instead of having the possibility of becoming surplus.

      • Tja@programming.dev · ↑5 · 4 days ago

        The modules, yes, but RAM is bought at the chip level. If the modules are never built, the chips can be reused in normal DIMMs.

        Worst case, we get a new HBM DIMM format :D

      • Paranoidfactoid@lemmy.world · ↑5 · 4 days ago

        Those headless GPUs are great for simulation work in Blender and other creative tools. I’d love an opportunity to buy a good used one on the cheap.

    • samus12345@sh.itjust.works · ↑45 · 5 days ago

      Just about all electronics older than a year or so have gone up. Even a Switch, which came out 9 years ago, costs more to buy now than it did then!

        • fartographer@lemmy.world · ↑2 · 4 days ago

          Certainly the most powerful GPU I’ve bought! I got mine 2 years ago for about $120, and it’s been great for my docker containers!

    • Asmodeus_Krang@infosec.pub · ↑68 · 5 days ago

      It’s truly mental. I don’t think I could afford to build my PC at the same spec today with RAM and SSD prices being what they are.

      • tempest@lemmy.ca · ↑44 · 5 days ago

        I have 128 GB of DDR5 in my machine. I paid $1400 for my 7900 XTX, which I thought was crazy, and now half my RAM is worth that.

        Never thought I’d see the day when the graphics card wasn’t the most expensive component.

          • tempest@lemmy.ca · ↑8 · 4 days ago

            I should not have even gotten the 128.

            I can use it, but barely, at 4600, because Ryzen chips struggle with four 32 GB DIMMs.

            I honestly didn’t even bother to check at the time of purchase, and it’s still a roll of the dice whenever I restart.

  • LoafedBurrito@lemmy.world · ↑28 ↓1 · 4 days ago

    Ruining the PC market for consumers on purpose so people will think it’s cheaper to rent computers than to own.

    In the future, you will lease your computer and not own it, just as you are told to do by the billionaires who steal your pay.

    • BanMe@lemmy.world · ↑9 · 4 days ago

      Yep, cloud providers definitely came up with the AI boom in a roundabout conspiracy to end PCs. Total direct chain there.

    • SourGumGum@lemmy.world · ↑3 · 3 days ago

      In the future you will connect to a corporate-owned terminal and use an online hosted OS, where your files are kept in their cloud ecosystem.

  • Jhex@lemmy.world · ↑90 ↓3 · 5 days ago

    This article sucks… I think they felt the need to excuse AI lest they upset their corporate masters:

    “While it’s easy to point the finger at AI’s unquenchable memory thirst for the current crisis, it’s not the only reason.”

    Followed by:

    “DRAM production hasn’t kept up with demand. Older memory types are being phased out, newer ones are steered toward higher-margin customers, and consumer RAM is left exposed whenever supply tightens.”

    Production has not kept up with demand… demand being supercharged by AI purchases.

    …newer ones are steered towards higher-margin customers… again, AI.

    …consumer RAM is left exposed whenever supply tightens… because of AI.

    • AeonFelis@lemmy.world · ↑13 · edited · 4 days ago

      You see, it’s easy to blame the AI data centers buying all the RAM - but that’s only half the story! The other half is the manufacturers selling to those data centers.

    • njordomir@lemmy.world · ↑1 · 3 days ago

      Me too. I added more than I could use because today’s gaming rig is tomorrow’s server. Now I’m debating whether I should sell a few sticks, but who knows when, if ever, I’ll be able to replace them.

    • mlg@lemmy.world · ↑9 · 4 days ago

      I did my desktop but skipped my server.

      Even decade-plus-old used surplus server DDR4 didn’t escape the apocalypse.

    • hdsrob@lemmy.world · ↑5 · 4 days ago

      Same … I hadn’t upgraded since 2012 and had some extra cash, so I rebuilt in August. Feeling pretty lucky to have done it then, and really glad I went ahead and put 64 GB of RAM in it.

        • hdsrob@lemmy.world · ↑2 · 4 days ago

          Yeah, the 2012 build was a 3770K with 16 GB of RAM, multiple SSDs, a GTX 680, etc. So it was a pretty fast machine back in the day.

          I upgraded the video card and SSDs several times; I just didn’t have the budget to replace it all at once for a long time.

      • hdsrob@lemmy.world · ↑2 · 4 days ago

        Minus the case and video card, I have an entire 3rd-gen i7 machine sitting in a box that would actually make a pretty good machine for a lot of different uses.

  • 1984@lemmy.today · ↑27 ↓1 · 4 days ago

    I’m on Linux and it requires just as much memory as it did in 2018. No problem here.

    • pHr34kY@lemmy.world · ↑18 · 4 days ago

      I upgraded mine from 16 GB to 32 GB two years ago because RAM was cheap. I didn’t really need it and have probably never hit 16 GB usage anyway.

      Meanwhile, my work Windows laptop uses 16 GB at idle after first login.

      Windows has always been wasteful computing, and everyone just pays for it.

      • Honytawk@feddit.nl · ↑1 · 4 days ago

        Seems like you (or your company) installed loads of bloat.

        You do know you can disable programs from starting during boot, right?

        • pHr34kY@lemmy.world · ↑1 · 3 days ago

          I fired up procexp and found Nessus scanning every file on my drive. I found SentinelOne had written 10 GB of logs since startup. I found some bullshit Dell service slamming the CPU. It’s all shit that my company put there.

          I found that it adds an extra 7 minutes to a 12-minute build when I compile my project, compared to doing it on WSL. The Windows bloat is insane.

      • NotMyOldRedditName@lemmy.world · ↑4 ↓1 · 4 days ago

        I wish I had a 32 GB RAM laptop.

        I can have 3 development IDEs open at once, and with all the browser tabs open and a few other programs here and there, it’s stretching the limits on my Mac.

        • pHr34kY@lemmy.world · ↑6 ↓1 · edited · 4 days ago

          I have 32 GB on my Windows laptop and it can’t do three at once.

          Running the backend (Java) and the frontend (React Native) in IntelliJ uses 29 GB of RAM, so I have to run Android on real hardware over ADB+USB. Running an Android emulator pushes it over the edge.

          Also: laptops are shit. On Windows, the Tau (turbo time window) is set so badly that the cores are throttled straight after boot because the cooling is rubbish. It almost never hits full speed. It can’t survive more than 40 minutes on a full battery. It might as well be a NUC.

              • morriscox@lemmy.world · ↑2 · 3 days ago

                You might want to use Process Explorer or the like to see if something is pegging the CPU/GPU. What is the model?

                • pHr34kY@lemmy.world · ↑1 · 3 days ago

                  I’m fairly sure it’s all the antivirus, which is itself a kernel rootkit. The whole laptop is getting replaced in a month, so there’s not much point in fixing it.

          • Honytawk@feddit.nl · ↑2 · edited · 4 days ago

            What does Windows have to do with cooling? That is a hardware problem.

            You’d have the same issues if you installed Arch on it.

            Stop blaming Windows for all your problems.

          • NotMyOldRedditName@lemmy.world · ↑4 ↓1 · edited · 4 days ago

            Yeah, Macs are definitely more efficient with their RAM.

            I’ll have Android Studio open for my main work, IntelliJ IDEA for all the backend work, and Xcode when I need to tweak some iPhone things. (Edit: usually it’s just 2 of the 3, but sometimes it’s all 3.)

            I also mainly use real devices for testing, and opening emulators when all 3 are open can be a problem; it’s so annoying opening and closing things.

        • pHr34kY@lemmy.world · ↑13 · 4 days ago

          Linux doesn’t waste RAM. All unused RAM becomes a disk read cache, but remains available on demand.
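
          To make that concrete, here’s a minimal Python sketch (an illustration, assuming a Linux box, since /proc/meminfo is Linux-specific) that prints the gap between “free” and “available” memory; the difference is mostly page cache the kernel gives back on demand:

          ```python
          # Minimal sketch: read /proc/meminfo (values are in kB) to show how much
          # "used" RAM is really just reclaimable disk cache.
          def meminfo():
              fields = {}
              with open("/proc/meminfo") as f:
                  for line in f:
                      key, value = line.split(":", 1)
                      fields[key] = int(value.split()[0])  # drop the trailing "kB"
              return fields

          m = meminfo()
          gib = 1024 * 1024  # kB per GiB
          print(f"total:     {m['MemTotal'] / gib:6.1f} GiB")
          print(f"free:      {m['MemFree'] / gib:6.1f} GiB  (truly idle)")
          print(f"cache:     {(m['Buffers'] + m['Cached']) / gib:6.1f} GiB  (reclaimable cache)")
          print(f"available: {m['MemAvailable'] / gib:6.1f} GiB  (what programs can actually claim)")
          ```

          On a long-running box, “free” is often tiny while “available” stays large: that’s the cache doing its job, not RAM being wasted.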

          • KairuByte@lemmy.dbzer0.com · ↑4 ↓3 · 4 days ago

            Cool, so the thing you stated wasn’t what happened, and you’re correcting me for not fact checking your comment.

      • Bilb!@lemmy.ml · ↑3 ↓1 · 4 days ago

        Requiring less RAM is good, but conceptually, it’s Linux that is “wasting” the RAM by never using it. It’s there, and it’s reusable, fill it up! Now, does Windows make good use of it? No idea. Wouldn’t bet on it, but I could be surprised.

      • Waraugh@lemmy.dbzer0.com · ↑5 · edited · 4 days ago

        Storing data in RAM isn’t wasteful, though. I have a lot of criticisms of Windows, but memory management isn’t one of them. I’d rather have as much predicted content staged in RAM as possible, as long as it’s readily dumped when I go to do something else, which is my experience.

        I don’t earn interest on unused RAM. (For reference, I have EndeavourOS, RHEL, Fedora, and Windows machines under my desk connected to a dual-monitor KVM right now, so it isn’t like I don’t regularly use and prefer Linux; I mostly access the Windows machine via RustDesk for work stuff I don’t feel like dicking with on Linux, like the purchase order and timecard systems.) I just don’t get this critique.

  • palordrolap@fedia.io · ↑17 · 5 days ago

    The DDR4 sticks I got 18 months ago now cost 300-400% of what they did, so it’s not just DDR5.

    … and I just realised the title doesn’t actually mean “DDR5 prices”, but that was an easy misinterpretation on my part, so I guess I’ll post this anyway.

      • 9point6@lemmy.world · ↑15 · 5 days ago

        I meant more that they’re not being made now, because Micron recently killed the Crucial brand to focus supply on data center customers.

        • Prove_your_argument@piefed.social · ↑5 ↓1 · 5 days ago

          I’m well aware, but everybody knows the HBM demand will dry up eventually, and that the consumer market will then be worth profiting from again.

          They just want to manipulate the consumer market to maximize margins. If they can keep memory prices at 200-300% for a while, they can raise margins to stratospheric heights not seen before in the consumer market. All manufacturers jump on stuff like this when they can get away with it.

          Memory module makers still order chips from Micron directly for their own branded sticks. Those margins will increase for all parties. AI data center demand is like Christmas for the entire industry. No pricing is transparent, and every vendor is profiteering.