Samsung makes some of the best SSDs!!
The leak comes after another report detailed that Samsung has raised DDR5 memory prices by up to 60%.
MF… And why are they winding down SSD production this time? Last time was 2 years ago, because SSD prices were low and they wanted to raise them (which happened).
Because AI is better for €€€
May they all go bankrupt when the bubble bursts.
we all know as soon as big bad chip daddy comes back with a big discount everyone not in this thread (and even some that are) will spread their cheeks and beg for more.
humans are dumb greedy little assholes that have zero willpower. that’s why it’s so easy to manipulate us.
What if we get a lack-of-new-computers-crisis before the AI-bubble bursts
Me with my 5 lenovo thinkcentres: 😎
Don’t worry, you can use AI on anything that can access the internet! No need to ever have personal (let alone private) thoughts - I’m sorry, data - again.
MS has been trying to get you to give up your personal computer for years. Do everything in the cloud, please! Even gaming with Stadia! And now they’re getting their wish. All it took was ruining the entire global economy.
Doing everything in the cloud is crazy. I’m so glad I jumped over to Linux a couple years ago!
Still need hardware to run it on ☹️
I have 4x 6TB HDDs in my NAS. Around 5 years ago I decided to simply replace any dead drives with 6TB ones instead of my previous strategy of slowly upgrading their size. I figured I could swap to 8TB 2.5" SATA SSDs that had just started to exist and would surely only get cheaper in the future…
M.2 to SATA converters will probably come to your rescue, though likely not as cheap as you were hoping.
In my head I thought one could make relatively cheap high capacity in 2.5" SATA form factor by having more NAND chips of lower capacity. You give up speed and PCB space but that’s fine since bandwidth and IOPS are limited by SATA anyway and there’s plenty of space compared to M.2.
Turns out it doesn’t shake out that way: controller ICs that support SATA aren’t coming out any more, and NAND dies are stacked internally to fill the controller’s channels rather than taking up PCB space.
There are some enterprise options, but they’re mad expensive.
I’ve cracked open a few faulty SATA SSDs. Quite a few of the recent models are just 2242 or 2230 M.2 SSDs with a converter. Even bigger 2TB ones.
awesome! Thank you shitty ai.
Aside: WTF are they using SSDs for?
LLM inference in the cloud is basically only done in VRAM. Occasionally stale K/V cache is spilled to RAM, but new attention architectures should minimize that. Large-scale training, contrary to popular belief, is a pretty rare event that most data centers and businesses aren’t equipped for.
…So what do they do with so much flash storage!? Is it literally just FOMO server buying?
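To put that VRAM point in perspective, here’s a rough K/V-cache sizing sketch. The model shape below (layers, KV heads, head dim) is a hypothetical 70B-class configuration chosen for illustration, not any specific model:

```python
def kv_cache_bytes(layers, kv_heads, head_dim, seq_len, batch, dtype_bytes=2):
    # K and V each store one head_dim vector per layer, per KV head, per token
    return 2 * layers * kv_heads * head_dim * seq_len * batch * dtype_bytes

# hypothetical 70B-class model with grouped-query attention:
# 80 layers, 8 KV heads, head_dim 128, 32k context, fp16
size = kv_cache_bytes(layers=80, kv_heads=8, head_dim=128, seq_len=32768, batch=1)
print(size / 2**30, "GiB")  # → 10.0 GiB: big for VRAM, trivial for flash
```

Even a long context comes out to tens of GiB at most, which is why K/V cache lives in VRAM (or at worst RAM) and never justifies buying petabytes of flash.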
Storage. There aren’t enough hard drives, so datacentres are also buying up SSDs, since it’s needed to store training data.
since it’s needed to store training data.
Again, I don’t buy this. The training data isn’t actually that big, nor is training done on such a huge scale so frequently.
As we approach the theoretical error-rate limit for LLMs, as argued in the 2020 scaling-laws paper from OpenAI and corrected by the 2022 paper from DeepMind, the required training and power costs rise toward infinity.
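The 2022 DeepMind correction (the “Chinchilla” paper) models loss with an irreducible floor that no amount of parameters or data can pass. A minimal sketch of that parametric form, with coefficients quoted from the paper’s reported fits (treat them as illustrative, not authoritative):

```python
# Chinchilla parametric loss: L(N, D) = E + A/N^alpha + B/D^beta
# E is the irreducible floor; coefficients are the paper's reported fits.
E, A, B, alpha, beta = 1.69, 406.4, 410.7, 0.34, 0.28

def loss(params_n, tokens_d):
    return E + A / params_n**alpha + B / tokens_d**beta

# roughly Chinchilla-scale (70B params, 1.4T tokens):
print(round(loss(70e9, 1.4e12), 2))  # → 1.94
# absurdly larger budgets still can't go below the floor E:
print(loss(1e15, 1e18) > E)          # → True
```

The two 1/x^k terms shrink slowly, so each fixed reduction in loss toward E demands a multiplicative jump in parameters and tokens, which is where the runaway compute and power costs come from.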
In addition to that, the companies might have many different nearly identical datasets to try to achieve different outcomes.
Things like books and Wikipedia pages aren’t that bad; Wikipedia itself compressed is only about 25 GB, and maybe a few hundred petabytes could store most of those items. But images and videos are also valid training data, and those are much larger, and then there’s readable code. On top of that, all user inputs have to be stored so the chatbot can reference them again later, if it offers that service.
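A back-of-the-envelope version of that storage estimate; every corpus figure below is a loose assumption for scale only, not a measured number:

```python
TB = 10**12
# all sizes are rough assumptions, for scale only
text_corpus  = 50 * TB        # deduplicated web text, books, and code
image_corpus = 2_000 * TB     # ~2 billion images at ~1 MB each
video_corpus = 200_000 * TB   # video dwarfs everything else

total_pb = (text_corpus + image_corpus + video_corpus) / 10**15
print(total_pb, "PB")  # → 202.05 PB: hundreds of petabytes, dominated by video
```

Under these assumptions, text is a rounding error; it’s image and especially video corpora (plus retained user conversations) that push data centers into the hundreds-of-petabytes range.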
AFAIK this has already been a problem: you can find Samsung M.2 SSDs for cheaper than Samsung SATA SSDs at the same capacity, because their cloud customers have all flown past classic SATA/SAS to NVMe U.2 and U.3, which are much more similar to M.2 since they’re also NVMe.
I was planning on adding a big SSD array to my server, which has a bunch of external 2.5" SAS slots, but it ended up being cheaper and faster to buy a 4-slot M.2 PCIe card and 4 M.2 drives instead.
Putting it in an x16 PCIe slot gives me 4 lanes per drive with bifurcation, which gets me the advertised maximum possible speed on PCIe 4.
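The bandwidth math behind that setup checks out. PCIe 4.0 runs 16 GT/s per lane with 128b/130b encoding (those two figures are the spec values; the rest is arithmetic):

```python
# PCIe 4.0: 16 GT/s per lane, 128b/130b line encoding
gt_per_s = 16e9
lane_bytes = gt_per_s * (128 / 130) / 8     # ≈ 1.97 GB/s usable per lane

per_drive = 4 * lane_bytes                  # x4 to each drive via bifurcation
slot_total = 16 * lane_bytes                # the full x16 slot

print(round(per_drive / 1e9, 2), "GB/s")    # → 7.88 GB/s per drive
```

That ≈7.9 GB/s per x4 link is why Gen4 NVMe drives advertise around 7,000 MB/s sequential: protocol overhead on top of the raw link eats the remainder.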
Whether the RAM surge will affect chip production capacity is the real issue. It seems all 3 OEMs could effectively reduce capacity for all other components after sinking billions of dollars into HBM. It wouldn’t just be SSDs; anything that relies on the same supply chain could be heavily affected.
Exactly this. Micron ended their consumer RAM. Samsung here is just stopping production of something that is arguably outdated and has a perfectly fine, already more available, and usually cheaper or equivalent modern replacement.
Just bought 2tb for $89.
I just spent $200 on a Samsung 990 Pro 2TB :/
I’m finally swapping my main PC to Linux full time and ended up buying an entire new boot drive rather than dealing with shuffling files around to make space.
I bought exactly that one for my linux box (because I swapped mains to linux and wanted a big boot :-) ), but that was a while back, it’s 280€ now, so $328…
I did similar when preparing my wife and me for Windows 10 EOL. I went back to Linux on the new drive, my wife to Windows 11. Honestly both have a similar amount of issues (mostly wake-from-sleep challenges on Linux, although my PC wasn’t great about waking from sleep on Windows to begin with). Most importantly, my wife can still play Fortnite, and I can have fun trying new stuff out and marveling at how every single game I try just works on Linux, whereas 5 years ago it was more of a 50/50 chance whether a game would work.
This one is for the Raspberry Pi 5.
Nice, where’d you get that deal?
https://www.shopmyexchange.com/
For vets. No taxes and shipping is free or a standard $5 charge.
Nice, that’s awesome.
If prices get bad enough, I’ll start buying from there and selling out of my trunk.
Lol may as well stock up now. It’s coming
More importantly, what brand, type and/or specs. It’s easy to get cheap disks with crap performance. I have a few around that we quickly dubbed “Super Slow Disks”.
HDTCA40XK3CA
The massive upvote count on this post is evidence that this community does not understand technology at all and just wants to be angry and yell at clouds.
Care to explain that understanding you have and the community lacks?
I take issue with this forced distinction they are making
Micron, like Samsung and SK Hynix, already supplies memory chips directly to third-party brands such as G.Skill and ADATA. Even without Crucial-branded kits, Micron DRAM continues to reach consumers through other manufacturers, meaning overall supply remains largely unchanged.
Nobody ever officially suggested the Crucial supply was likely to shift to the other manufacturers for consumers. On the contrary people expect this to be a step towards a general redistribution of manufacturing capacity towards HBM for parallel compute products.
By comparison, Samsung exiting SATA SSDs removes an entire class of finished consumer products from one of the world’s largest NAND suppliers. Tom argues that this is why the Samsung move is “worse” for consumers: it directly affects how many drives are available, not just who sells them.
If you wanted you could make the same argument as for Micron. Who says the Samsung NAND couldn’t be bought by other OEMs to make consumer SSDs. It’s just as possible as the Micron supply shifting to other OEMs who make consumer RAM sticks.
To me neither are likely. The manufacturing capacity both companies are pulling from the consumer market in both cases is going to go to the higher profit margin parallel compute server market. Neither is worse than the other, they are both equally bad news for us consumers.
On the contrary people expect this to be a step towards a general redistribution of manufacturing capacity towards HBM for parallel compute products.
That is where much of the overall wafer supply is going. But that would be happening regardless of whether the Crucial brand is around or not. Even if Crucial were still a thing going forward, those same wafers would still be going towards HBM.
I think he hit the nail on the head when he said that Crucial being cancelled is just a symptom of our shit market, not one of the causes. It makes zero difference.
Who says the Samsung NAND couldn’t be bought by other OEMs to make consumer SSDs
His point is that Samsung (the manufacturer) is scrapping production, not that Samsung (the consumer brand) is stopping selling products that otherwise are still being produced and sold under different brand names.
Stopping production of something sold under many brands is obviously a lot worse than one brand stopping sales of something that other brands will still sell (albeit in lower quantities than in previous years, due to HBM production being ramped up at the cost of DDR5).
It’s sort of not needed. M.2 for the OS, an HDD for extra stuff like Steam/Epic games.
No, that’s not correct; a lot of consumer hardware has SATA ports (like old laptops).
Replacing an old HDD with a SATA SSD to run the OS is the way to go in that case. So not so useless…
Why would ending sata ssd production create price pressure for m2 ssds? If anything, they should be able to produce more of those.
M.2 is just a connector; you can even run SATA over M.2. But you’re right, freeing up 2.5" production for M.2 should reduce price pressure.
So maybe that computer I just bought will be my last for a while then.
Syrup of Squill.
This bubble is going to become the entire market, isn’t it. Until it becomes too big to fail because 80% of the workforce is tied up in it. Then it is allowed to pop, costing the western world everything, all going into the pockets of the super rich, and we get to start over.
it becomes too big to fail because 80% of the workforce is tied up in it
In 2008, the banking sector and auto industry needed bailouts for the investor/financial class. Certainly there was no need to lay off core banking employees if government support was the last resort to keep the doors open AND gain a controlling stake over future banking profitability in a hopefully sustainable (low-risk, in addition to low climate/global destruction) fashion. The auto bailout did have harsher terms than the banking bailout, and recessions definitely harm the sector, but the bailouts were focused on the executives/shareholders who have access to political friendships that result in gifts, rather than on truly needed lifelines or a wider redistribution of benefits from sustainable business.
The point is that the workforce is a “talking point” with no actual relevance to bailouts/too big to fail. That the sector has that entire stock-market wealth concentrated in it, and that we all have to give them the rest of our money (and militarism-backed surveillance of our freedom) or “China will win” at the only sector we pretend to have a competitive chance in, is why our establishment needs another “too big to fail” moment. We’ve started QE ahead of the crash this time.
The workforce is relatively small in the AI sector. Big construction, but relatively low operations employment. It displaces other hiring too.
Then it is allowed to pop, costing the western world everything, all going into the pockets of the super rich, and we get to start over.
After the bailouts at the expense of the poor, of course.
That’s the entire point. It’s a scam.
Compared to crypto and NFTs, there is at least something in this mix, not that I could identify it.
I’ve become increasingly comfortable with LLM usage, to the point that myself from last year would hate me. Compared to projects where I used to go deep into Google, Reddit, and Wikipedia, ChatGPT gives me pretty good answers much more quickly, and far more tailored to my needs.
I’m getting into home labs, and currently everything I have runs on ass-old laptops and phones, but I do daydream of the day when I can run an ethically and sustainably trained LLM myself that compares to the current GPT-5, because as much as I hate to say it, it’s really useful to my life to have a sometimes incorrect but overall knowledgeable voice that’s perpetually ready to support me.
The irony is that I’ll never build a server that can run a local LLM due to the price hikes caused by the technology in the first place.
It’s the difference between a pyramid scheme and an MLM: one of them has a product in the mix.
I’ve become increasingly comfortable with LLM usage, to the point that myself from last year would hate me. Compared to projects where I used to go deep into Google, Reddit, and Wikipedia, ChatGPT gives me pretty good answers much more quickly, and far more tailored to my needs.
Please hate yourself, reflect on that and walk back from contributing to destroying the environment by furthering widespread adoption of this shitty technology. The only reason you seem to get “useful answers” is because of search engine and website enshittification. What you are getting is still tons worse than a good web research 10 years ago.
Basically you were taught to enjoy rancid butter because all restaurants around you had started tasting like shit first, then someone opened a rancid butter shop.
I do agree entirely. If I could use the internet of 2015 I would, but I can’t do so in a practical way that isn’t much more tedious than asking an LLM.
My options are the least rancid butter of the rancid butter restaurants or I churn my own. I’d love to churn my own and daydream of it, but I am busy, and can barely manage to die on every other hill I’ve chosen.
The problem is that the widespread use of (and thereby provision of your data to) LLMs contributes to the rise of totalitarian regimes, wage slavery, and the destruction of our planet’s ecosystem. Not a single problem in any of our lives is important enough to justify this. And convenience, because we are too lazy to think for ourselves or to do some longer (more effort) web research, is definitely not a good excuse to be complicit in murder, torture, and ecoterrorism.
I agree except for the fact that it’s unavoidable
It’s horrific, but it’s inescapable. The problem is not going away, and while you’re refusing to use LLMs to accelerate your progress, the opposition isn’t.
Don’t get me wrong, anyone who blindly believes sycophantic LLM garbage is a fool.
It’s taken 4 years to overcome my LLM moral OCD, and it’s only because I need to start working. In a world where every company forces AI down your throat, there are many who simply have no choice if they want to compete.
Also I’m kinda glad I can spend more of my useful energy working towards my goals rather than battling the exact minutiae without any sort of guide
The thing is: LLMs do not accelerate the progress of proper software development. Your processes have to be truly broken to be able to experience a net gain from using LLMs. It enables shitty coders to output pull requests that look like they were written by someone competent, and thereby effectively waste the time of skilled developers who review such pull requests out of respect for the contributor, only to find out it is utter garbage.
Web search isn’t magically going back to how it was, and it’s not just search engines; it’s every mf trying to take advantage of SEO and push their content to the top. Search is going to get worse every year. AI did speed it up by making a bunch of AI images pop up whenever you search for an image.
I heard a theory (that I don’t believe, but still) that Deepseek is only competitive to lock the USA into a false AI race.
that would be the funniest thing.
It’s the space race all over again!
the shoe event horizon.