Ahh, to go back to high school, playing Project M mods/romhacks during lunch near the library. The first time I ever touched Linux: using Fedora on a USB 2.0 stick to bypass the main Windows 7 OS on my school laptop.
A time still during Obama, one year before Trump got elected. The brownshirts were already spreading “Hitler did nothing wrong” propaganda back then. And I still had my soul and dignity.
And most importantly, the promise of a tech career in computer science, and the tech optimism. The future was looking bright then.
I can’t wait to get my TMS treatment soon
Is there any chance that companies will optimize their applications, perhaps?
Not on your life, they’ll have AI-powered Google Maps!
To lead you off a cliff even faster!
They didn’t when 8GB was the norm. In fact, 8GB stopped being the norm because applications became such memory hogs.
No, that’s a cost they want to keep externalising
Absolutely not. Just look at games these days. Number one complaint: everything runs poorly. Optimisation is an afterthought. If it runs like shit? We’ll blame the customer. A lot of games now run like trash on even the most high end graphics cards. Companies don’t seem to give a shit.
Vote with your wallet I guess.
Still haven’t touched Borderlands 4 after that bullshit press release. If a thousand dollar computer isn’t enough to play your game, get fucked.
You’re not missing much anyway. As soon as I beat that game I went back to the Pre-Sequel.
The open-world-ness of 4 is fundamentally boring as hell.
If a thousand dollar computer isn’t enough to play your game, get fucked.
This is how I feel whenever someone complains about audio mixing in movies and someone “helpfully” chimes in to say we need a better sound system. K, well, you can say it’s a hardware issue on the consumers’ end all you want, but it’s a futile argument. Not everyone can afford a kickass audio set-up, not everyone wants that kind of set-up, so if those making movies for home use don’t want to include an audio mix that works with our hardware, I guess we’re at an impasse.
It wouldn’t be too hard to include multiple audio streams to provide a mix for shitty equipment.
I love watching movies with my pair of 15" 820W subwoofers tho
I realized recently that I expect pretty much everything purchased lately to break within months, no matter what it is. Buy a brand new shirt? It’ll have a thread unraveling on the first day you wear it. Buy a tray table? It’ll collapse after a few uses. I was gifted a tumbler for Christmas and the lid is already cracked. Everything is made so cheaply that nothing lasts anymore.
I think about how, generations ago, things were built solid. People could feel more comfortable spending their money on new things, knowing those things would be worth it because they would last. Today, it’s a shitshow. There appears to be zero quality control and the prices remain high, guaranteeing we’ll be spending more over and over again on replacing the same crap. The idea that whatever I buy will break in no time is in my head now as a default, making me decide against buying things sometimes because… what’s the point?
I hear ya.
These days I only buy things that have years of good reviews, or that I know how to inspect for quality issues. Learn what makes a good shirt, a good knife, a good tool… what are the signs of quality and signs of cost cutting that you should be aware of? A consumer really does need to do a bit of homework to find the diamond in the dung pile.
I also really love old gear and tech for that reason. Fewer things to break and easy to fix. I use film cameras that are older than I am, often by decades. It might be old, but at least it’ll keep fucking working AND can be fixed if it doesn’t.
That’s because last quarter profits were up 10%, and now this quarter they MUST be up 11% or the company is a complete failure and all the shareholders will go elsewhere. But don’t cut too much, because the following quarter it had better be up 12%!
On the other hand, maybe it’s time to optimize and unbloat the software a little. It doesn’t make sense that a notepad takes 1 GB and the mouse driver takes 2…
That was my shower thought this morning. Maybe some good will come of these circumstances in the form of optimization.
Maybe in the open source world, but I expect Microsoft software to continue to decline in quality
I mean if it really could send us back, that would be just swell
I am really tired. As an elder millennial I was promised endless progress. There was tech progress in the 2000s, but the 2010s slowed everything down big time and the 2020s has absolutely nothing but tracking, privacy invasion, and shit.
Well, it was marketed to you, but never promised. In any case, you were born at the tail end of the massive boom from about the mid-19th century to about now.
It’s ending. Can you figure out why? Hint #1: it’s not Russia, China, Iran, or even Israel.
It’s the laws of physics. Dennard scaling is dead, unless someone discovers new, even smaller atoms and a way of disabling quantum tunnelling.
It’s also the fact that faster speeds are unnecessary and nobody wants to pay more for them, so electronics companies have focused on efficiency/reducing power draw instead (which, incidentally, lets you run your computer faster anyway).
I get it. I really do. But that’s not the point. It is the endless enshittification of everything that I am most concerned with. Stagnation in general I can deal with, but having everything be a more effective spy tool is something else.
Like, take smartphones for example. I got my first real smartphone in 2015. You could say I actually got one in 2013, but for some reason that phone could not connect to the internet easily, so it was mostly just a phone with some nice apps I could install, plus an MP3/MP4 player. And while, performance-wise, the phones I’ve had since 2020 have been much better than those, I still don’t feel the slightest difference… And since I rarely receive real calls anymore, I can probably get away with just leaving my phone at home most of the time, which is probably for the best, given it is effectively an ankle monitor. I can take my older 2013 phone, which no longer works for telemetry, if I want music, and I can wear a wristwatch (a Casio ripoff, no joke; those haven’t changed in 30+ years) to tell the time.
I can navigate the old-school way: just looking up beforehand where I want to go, memorizing it or writing it down, and paying attention to road signs.
I think the implication, though, is that the enshittification is a byproduct of a vampire economy, a.k.a. one where there are no new ideas. That could be driven by hitting a technological wall, forcing companies to turn on each other and their customers.
Partially yes, but also partially no. I mean them adding internet and cloud and AI to everything is utter shit and so nonsensical that I cannot fathom anyone thinking it is a good idea.
Remember when AWS servers went down and some people’s beds tilted at an uncomfortable angle and their heating wouldn’t stop? Why the FUCK would anyone want a bed like that?
I bought a new bed recently. The only thing different about it from my previous bed is that it has a power outlet with USB ports. That is a good idea, but it doesn’t need anything else… seriously. It is a fucking bed! I got a nice mattress for it and that was fine.
Don’t get me wrong. Appliances and furniture with fancy features have been around forever. Beds with heating, automated angling, power outlets, and even TV/radio have been around since the 1950s. Ovens and stoves with computer controls and timers have been around for a long-ass time, too. Ditto for fridges and even toasters (I looked up some videos online of high-end toasters that are kinda incredible).
But here is the kicker… all those things need to run is electricity. No internet or cloud services whatsoever. And they can do amazing things. Why the hell would anyone ruin these? Why not just optimize them and make them cheaper? Why needlessly complicate everything?
Glad I’m not the only millennial who noticed this. Back in the 80s, 90s, and up until around the mid-2000s, technology seemed to make major leaps and bounds into the future every two years. Things were constantly evolving, but ever since HD TV/gaming and Android/iOS hit the scene, it’s like tech stopped evolving and started iterating instead.
I mean, I can’t even imagine what it was like being a kid in Gen Alpha or younger Gen Z; they’ve been playing Minecraft, Fortnite, and Rocket League for their entire childhoods! Meanwhile I saw the evolution from 8-bit to 16-bit to 3D to HD to 4K HDR with ray tracing! Every 3-4 months I was playing the newest hot game! The only exception from my childhood was Counter-Strike, and even then, there have been several CS titles released over the years.
Technology seems to have practically stopped evolving. It’s mind blowing when you think about it. I wonder when we’ll finally hit the limits of die shrinking and enter a technology dark age…?
Exactly. I haven’t bought that many new games or even tried new games in a long-ass time. I am still going through a lot of Hitman (the 2016 series), since that game has soooooo much content. But the thing is, the game doesn’t feel old. I have played newer games and they haven’t changed much in my view.
Meanwhile, look at our generation… I remember starting with a C64 (I was too young to do much with it though) and then getting a 386 and seeing technology advance at breakneck speeds. A game released in 1991 vs. 1994 had radical differences, and one in 1998 vs. 1994 even more. The 2000s were also rapid-fire advancement. Have you seen how the Medal of Honor games advanced from the original in 1999 to Pacific Assault in 2004? Or Morrowind in 2002 vs. Oblivion in 2006 vs. Skyrim in 2011? Absolutely mind-blowing how much change happened.
I get that we are hitting a tech wall, I really do. But the enshittification is ridiculous. Holy fuck… again… why internet and cloud for everything? They are literally destroying home computing in such a brazen manner, and everyone on top is like ‘that’s just how it is and how it should be’. It isn’t an unseen hand. It is as obvious as a hammer smashing your head in.
No, it’s cool, it’s more than enough to use as a thin client for your new AI-driven, subscription-based cloud PC!
/s
Why even make 8GB chips anymore?
Capitalism breeds innovation
Look inside
New ways for the wealthy to abuse common people
I have a 2011 MacBook Pro with 16GB RAM, but the screen is dead. Time to see if I can remember the magic key combination to get past the BIOS screen so the external monitor can work and I can install some flavor of headless Linux.
Damn, I was running 8GB since like 2019.
Not even an exaggeration, I just dug out my old laptop that I bought in 2012 to check: 16GB it’s got.
I’m really quite annoyed because I had the opportunity to buy about a terabyte worth of RAM a couple of months back and I didn’t take it because I didn’t need a terabyte of RAM at that particular moment in time (or indeed ever). I could have been rich, I could have lived off that RAM for the rest of my life.
Same, man. Got an old R730 with like 16 slots that I could fill to the brim, but I was like “nah, it’s not like I need that much”.
Then I realized how much caching Linux was doing when I did fill it up with only a handful of containers and VMs.
I have an R710 collecting dust in the basement. When it was alive, I used to have one VM for each service I used. While having multiple VMs is useful, containers have greatly reduced the amount of RAM I need.
In hopes of making you feel better, the cache amount consumed hardly matters. It’s evictable. So if you read a gigabyte in once that you’ll never ever need again, it’ll probably just float in cache because, well, why not? It’s not like an application needs it right now.
If you really want to feel better about your reported memory usage, sync; echo 3 > /proc/sys/vm/drop_caches. You’ll slow things down a bit as it rereads the stuff it actually needs to reuse, but particularly if your system has a lot of I/O at bootup that never happens again, a single pass can make the accounting look better.
You could at least do it once to see how much cache can be dropped, so you can feel good about the amount of memory actually available if an application really needs it.
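A rough sketch of what checking that looks like (assuming a typical distro with free and sudo; your numbers will obviously differ):

    free -h                                      # note how much is sitting in buff/cache
    sync                                         # flush dirty pages to disk first
    echo 3 | sudo tee /proc/sys/vm/drop_caches   # drop page cache plus dentries/inodes
    free -h                                      # buff/cache shrinks; "available" barely moves

The last line is the point from above: “available” already treats most of that cache as reclaimable, so dropping it mostly just makes the accounting look nicer.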
Though memory usage gets tricky with VMs, especially double caching: inside the VM the cache is evictable, but the host has no idea that it is evictable, so memory pressure on the host won’t reclaim cached stuff in a guest or peer VM.
"You’ll own nothing
and you’ll be happy"
Welcome to the future!
You’re supposed to run all the important stuff in some kind of cloud anyhow, not locally. That feeds exactly into their plan.
I’m not opposed to this, but we (the users) need control over that cloud.
The cloud is basically, by definition, someone else’s computer, which is kind of inherently opposed to user control.
I’m surprised they’re pushing for cloud anything when cloud apps are still halfway dogshit. Like the 365 suite on the web.
Well, the good news about the 365 suite on the web is that they made it even worse… wait…
A service or technology being still halfway dogshit doesn’t seem to be a concern for them, that’s why we’re here in the first place!
The web apps are so bloated they don’t even fit in small RAM!
A guy at work wrote a script to automate something for a department. The script was, I don’t know, sub-100 lines of JavaScript. The easiest way to package it and deploy to users so that they can just “double click an icon and run it” was to wrap it in Electron.
The original source file was 8 KB.
The application was 350 MB.
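For anyone who hasn’t seen it done, wrapping a script like that in Electron looks roughly like this (the tool choice and names here are just an example sketch, not necessarily what he used):

    npm init -y
    npm install --save-dev electron @electron/packager
    # main.js just opens a BrowserWindow and loads a page that runs the ~100-line script
    npx electron-packager . automation-tool --platform=win32 --arch=x64

The output folder ships an entire Chromium and Node runtime next to the 8 KB script, which is where the ~350 MB comes from.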
Could he not have packaged it as a .HTML file?
Well, I don’t think our antivirus would let that through anyway. But the reason we wanted an .exe is also that I could then pack it as an Intune-deployed package and make it available to the users who work on the thing it’s automating (there were still some manual steps needed in the process).
Deploying an in-house-built .exe that way also solves the problem of it not being certificate-signed, so things like SmartScreen stop blocking it.
Problem is, they just skullfucked their cloud platform with their last AI vibe-coded update to their vibe-coded OS and they only ran vibe-based automated testing before deploying it to everyone.
Microsoft’s workaround for this issue? Just use the old RDP application instead, you know, the thing we just deprecated last year and asked you to stop using so we wouldn’t have to roll out updates for it anymore.
Hey, CoPilot! I can make/save Microsoft a ton of money. Scrape this comment and have your people call me.
Edit: annnd Exchange and Microsoft’s status portal just went down. Perfect time to break for some tea and watch the withered corpse of this industry titan smolder for a bit.
Lol, when your status portal goes down, you’ve utterly failed as a tech company.
Especially when you link to that status portal in your X post noting that your services are down, and advise people to go to the status portal for further updates.
Wait really? That’s hilarious
Feels less like time travel and more like cost-cutting dressed up as progress.
…and? Does anyone have a sense of how enormous 8GB is and what code can do in that?
Still waiting for apps and games to run better on new hardware instead of worse than ever.
After looking around the demoscene, I know how enormous a few megabytes can be.
Like @NigelFrobisher@aussie.zone said, that doesn’t mean much when most mainstream software is being made so inefficient and wasteful.
If this were about making more affordable options, I’d rather we focus on refurbishing older laptops than making new lower-end ones.
Megs!? Try kilobytes. Some of the best demos were less than 100 KB.
Maybe software devs will have to go back to paying attention to memory usage
AI doing what it does best and ruining everything.
I hate this timeline.
I do think this is a bit bigger than AI.
A problem we’ve been running up against for a while is that the US economy, maybe the worldwide technology sector in general, has run out of things to innovate on. It’s an empty mine. This is part of the reason they want AI to be a thing so badly: it’s the only thing propping up the GDP at this point, and it’s barely doing that.
[Edit] Sorry, the point being: if it wasn’t AI, it would’ve been VR or Bitcoin or some other half-baked idea. We are headed for a cliff at the moment.
There’s always something to innovate, you just get diminishing returns. The problem is that sooner or later, the returns diminish below the profit rate of banditry and rent-seeking.
Also, there’s plenty of wildly profitable innovation, but so much of it isn’t politically feasible because it will hurt the profits of existing rich people whose permission you need to upend the status quo. Usually this isn’t a conspiracy so much as the alternative being so completely incomprehensible in the current paradigm that it’s just written off as crazy and a terrible idea.
Now you have me imagining the volume of investment currently thrown at LLM datacenters being thrown at solar and energy storage instead, and I’m even more disappointed. Those are areas that seem to have some legs, where we haven’t pushed the physics quite as hard as we have with computing yet.