Just make 580 legacy. Problem solved. :) (It’s already LTS though)
By the way, I have GTX 1080 and this seems to be end of the way. What is the similar AMD I can get, preferably second hand?
Maybe the RX 7600 XT.
It’s in the second to latest generation (7000 not 9000), should be slightly faster than a GTX 1080, and doubles the VRAM capacity to 16 GB so you wouldn’t be in danger of running into limitations with that too soon.
Sounds nice, but let me rephrase: what’s a similar AMD card I could get by selling the GTX 1080 without adding too much on top? I’m not looking for an upgrade anytime soon.
I’ll switch to Debian if I must but I don’t want to do that either unless I have to.
No idea, since I have no way of knowing what the second-hand market around you looks like. I just looked for a similar (in performance) card, not of the newest gen, so there would hopefully be some used models around.
I see, thanks for checking though. It seems I can sell the GTX 1080 for around $100 here, but the RX 7600 XT is $320 second hand. The second-hand market doesn’t seem bright here. Even the RX 6600 is around $150. That might work for me, but I guess I’ll need to check a lot.
Hard to say what the used market is like, but the cheapest cards that would be broadly similar in performance would probably be the Arc A580, RX 5700 or RX 6600. This page has some rankings that are reasonable for comparison: https://www.tomshardware.com/reviews/gpu-hierarchy,4388-2.html
But…surely there’s a way to just stick with the latest supported driver, right? Or is Arch truly an “upgrade or die” distro?
Thanks for the comparison. The RX 6600 seems the most suitable option since the second-hand prices for the 5700 and 6600 are quite close. Selling the GTX 1080 can probably cover most of it.
Well, currently I’ll have to switch to the AUR driver and stick with it for a while until I find a good candidate. I would expect them to at least make it a legacy driver so we don’t have to jump through hoops, though. It’s not usually like this, but this is a big change (thanks, Nvidia) and the Arch team made a big change too, so it seems a normal update would break my system.
Sounds like it’s time to switch out the 1080ti for a 9070xt. Been almost 10 years, probably due for an upgrade.
I will miss having that CUDA compatibility on hand for matlab tinkering. I wonder if any translation layers are working yet?
https://github.com/vosen/ZLUDA I’ve heard is doing pretty well
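For reference, the usual Linux invocation (a sketch; the directory and app name below are placeholders, so check ZLUDA’s README for the exact setup on your version) is to put ZLUDA’s drop-in CUDA library ahead of the real one in the loader search path:

```shell
# Put ZLUDA's replacement libcuda first in the dynamic loader search path
# so an unmodified CUDA application picks it up instead of Nvidia's library.
# "/opt/zluda" and "./my_cuda_app" are placeholders for your actual paths.
LD_LIBRARY_PATH="/opt/zluda:$LD_LIBRARY_PATH" ./my_cuda_app
```

No recompilation of the CUDA app is needed; the translation happens at the driver-API level.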
Looks cool, thanks for the link. I’ll give it a go.
He tried to warn y’all…

who’s this and what talk/event was it, do you remember?
That’s Linus Torvalds, the guy who made the Linux kernel. I think this was some interview he did, but I’m not sure.
I’ve had so many problems with Nvidia GPUs on Linux over the years that I now refuse to buy anything Nvidia. AMD cards work flawlessly and get very long-term support.
Yeah, I stopped using Nvidia like 20 years ago. I think my last Nvidia card may have been a GeForce MX, then I switched to a Matrox card for a time before landing on ATI/AMD.
Back then AMD was only just starting their open source driver efforts so the “good” driver was still proprietary, but I stuck with them to support their efforts with my wallet. I’m glad I did because it’s been well over a decade since I had any GPU issues, and I no longer stress about whether the hardware I buy is going to work or not (so long as the Kernel is up to date).
I just replaced my old 1060 with a Radeon RX 6600 myself.
Sadly GPU passthrough only worked on Nvidia cards when I was setting up my server, so I had to get one of them :(
deleted by creator
Same. Refuse to use NVIDIA going forward for anything.
I’m with you, I know we’ve had a lot of recent Linux converts, but I don’t get why so many who’ve used Linux for years still buy Nvidia.
Like yeah, there’s going to be some cool stuff, but it’s going to be clunky and temporary.
When people switch to Linux they don’t do a lot of research beforehand. I, for one, didn’t know that Nvidia doesn’t work well with it until I had been using it for years.
It’s a good way for people to learn about companies that are fully hostile to the Linux ecosystem.
To be fair, Nvidia supports their newer GPUs well enough, so you may not have any problems for a while. But once they decide to end support for a product line, it’s basically a death sentence for that hardware. That’s what happened to me recently with the 470 driver. Older GPU worked fine until a kernel update broke the driver. There’s nobody fixing it anymore, and they won’t open-source even obsolete drivers.
I JUST ran into this issue myself. I’m running Proxmox on an old laptop and wanted to use its 750M… which is one of those legacy cards now, which I guess means I’d need to downgrade the kernel to use it?
I’m not knowledgeable enough to know the risks or work I’d be looking at to get it working so for now, it’s on hiatus.
You might be able to use the Nouveau driver with the 750M. Performance won’t be great, but might be sufficient if it’s just for server admin.
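If the proprietary packages are still installed, a minimal sketch of forcing the fallback (assuming the usual module names; the file path follows common distro convention) is to blacklist the Nvidia modules so nouveau can bind at boot:

```
# /etc/modprobe.d/blacklist-nvidia.conf
# Keep the proprietary modules from loading so the in-kernel
# nouveau driver takes the GPU instead.
blacklist nvidia
blacklist nvidia_drm
blacklist nvidia_modeset
blacklist nvidia_uvm
```

Then rebuild the initramfs (e.g. `update-initramfs -u` on Debian-based systems like Proxmox) and reboot.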
Similar for me. For all the talk about what software Linux couldn’t handle, I didn’t learn that Linux is incompatible with Nvidia until AFTER I upgraded my GPU. I don’t want to buy another GPU after less than a year, but Windows makes me want to do a sudoku in protest… but also my work and design software won’t run properly on Linux, and all anybody can talk about is browsers and games.
I’m damned whether I switch or not.
Linux hates Nvidia
got that backwards
Linus openly hated Nvidia, but I suspect Nvidia started it
If you only suspect then you never heard the entire quote and only know the memes.
My point is they don’t work together. I can believe Nvidia ‘started’ it, but it doesn’t matter or help me solve my problem. I’ve decided I want to try Linux, but I can’t afford another card, so I’m doing what I can.
You somehow still learned it wrong, and I don’t understand how. Nvidia not working well with Linux is so widely known and talked about that I knew about it, and the actual reason (which is the reverse of what you think), for several years before switching. I feel like you must never have tried to look anything up or spent any time somewhere like Lemmy or any Linux-focused forum, and must have decided to keep yourself in some bubble of ignorance with no way to learn anything.
This is an uncharitable interpretation of what I said.
Nvidia doesn’t tell me it doesn’t work; Linux users do. When I first used Linux for coding all those years ago, my GPU wasn’t relevant, nobody mentioned it during my coding bootcamp or computer science certification several years ago, and Ubuntu and Kubuntu both booted fine.
When I upgraded my GPU, I got Nvidia. It was available and I knew what to expect. Simple as.
Then as W10 (and W11) got increasingly intolerable, I came to Linux communities to learn about using Linux as a Windows replacement, looking into distros like Mint and Garuda, and behold: I came across users saying Linux has compatibility issues with Nvidia. Perhaps because it is ‘so well known’, most don’t think to mention it; I learned about it from a random comment on a meme about gaming.
I also looked into tutorials on getting Affinity design software to work on various distros, and the best I could find was shit like: I finally got it to run, so long as I don’t [do these three basic functions].
I don’t care who started it; I can already believe it’s the for-profit company sucking up to genAI. But right now that doesn’t help me. I care that it’s true and that’s the card I have, and I’m still searching for a distro that will let me switch and meet my work needs, not just browsing or games.
I’m here now, aware that they don’t work, still looking for the best solution I can afford, because I did look up Linux.
Nvidia’s poor Linux support has been a thing for decades.
If anything, the situation has only recently improved, and that only after high-profile Linux developers told Nvidia to get their shit together.
Even now, CUDA is the gold standard for data science / ML / AI research and development. AMD is slowly bringing around their ROCm platform, and Vulkan is gaining steam in that area. I’d love to ditch my Nvidia cards and go exclusively AMD, but Nvidia supporting CUDA on consumer cards was a seriously smart move that AMD needs to catch up with.
Sorry for prying for details, but why exactly do you need NVIDIA?
CUDA is an Nvidia technology and they’ve gone out of their way to make it difficult for a competitor to come up with a compatible implementation. With cross-vendor alternatives like OpenCL and compute shaders, they’ve not put resources into achieving performance parity, so if you write something in both CUDA and OpenCL, and run them both on an Nvidia card, the CUDA-based implementation will go way faster. Most projects prioritise the need to go fast above the need to work on hardware from more than one vendor. Fifteen years ago, an OpenCL-based compute application would run faster on an AMD card than a CUDA-based one would run on an Nvidia card, even if the Nvidia card was a chunk faster in gaming, so it’s not that CUDA’s inherently loads faster. That didn’t give AMD a huge advantage in market share as not very much was going on that cared significantly about GPU compute.
Also, Nvidia have put a lot of resources over the last fifteen years into adding CUDA support to other people’s projects, so when things did start springing up that needed GPU compute, a lot of them already worked on Nvidia cards.
People buy Nvidia for different reasons, but not everyone faces any issues with it in Linux, and so they see no reason to change what they’re already familiar with.
I had an old Nvidia GTX 970 on my previous machine when I switched to Linux, and it was the source of 95% of my problems.
It died earlier this year so I finally upgraded to a new machine and put an Intel Arc B580 in it as a stop gap in hopes that video cards prices would regain some sanity eventually in a year or two. No problems whatsoever with it since then.
Now that AI is about to ruin the GPU market again I decided to bite the bullet and get myself an AMD RX 9070 XT before the prices go through the roof. I ain’t touching NVidia’s cards with a 10 foot pole. I might be able to sell my B580 for the same price I originally bought it for in a few months.
Oh, haha I was confused for a second about the Pascal language 🤭
I haven’t noticed as I have zero NVidia GPUs among my computers and servers.
Gnawh Penisburg. I’m still sporting a laptop with a 1070 from about a decade ago…
Yes indeed, this happened to my coworker yesterday and it took them an hour to get everything fixed. This is not OK.
Nah, the distro that openly says updates may cause breakages and tells you to check whether you need manual intervention before updating, following their long-established normal procedure, is fine.
Pascal… Now there’s a name I haven’t heard in a long time… A long time
The 1060 is still hanging around ~2% usage on steam hardware survey, so it’s not completely irrelevant yet.
Delphi
Jeeez, lol
I can’t believe they would do this to poor Borland. I guess I’ll just need to use an AMD GPU for my Turbo Pascal fun.
I really dodged a bullet upgrading from my 1070 Ti to the latest AMD (9070 XT, if I’m not misremembering) for Black Friday. Lowest price of the generation, just before RAM prices skyrocketed.
My SO is not so lucky…
Maybe we could use those cards for under-the-TV computers with Windows… sadly?
That newer open source driver is still far behind but is progressing. Those graphics cards will have a great new life with modern kernels someday
except for the no reclocking thing, which cripples them
“Brodie” mentioned. To be fair on the Arch side, they are clear the system could break with an update and that you should always read the Arch news in case manual intervention is needed. You can’t fault Arch Linux for users not following the instructions. This is pretty much what Arch stands for.
And IMO if anything this is Nvidia’s doing, arch is just being arch, like it sucks but I also don’t see a problem with arch in this instance.
Brodie
Thinking Forth was a great book! I’m surprised it came up here though.
I’m running PopOS and Debian with Nvidia cards that should be affected by this, if I’m understanding this right. Ugh.
What a pain.
Nah, you’re on Pop and Debian, you’ll be fine for a while.
My son was going to switch to Linux this week. He has a GTX 1060.
He just needs to stay on the 580 driver. Bazzite is handling that transparently and won’t update you to the 590 driver if you have an unsupported GPU.
Then next time round, buy an AMD or Intel GPU. They tend to treat their customers better.
Nice, maybe bazzite it is then.
deleted by creator
I guess he can’t say he uses arch btw
It’s not that bad as I understand it. If you are using arch with a Pascal GPU you can switch to the legacy proprietary branch. https://archlinux.org/news/nvidia-590-driver-drops-pascal-support-main-packages-switch-to-open-kernel-modules/
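Roughly, the switch looks like this (a sketch only: the 580xx legacy package names come from that news post, and which main-repo packages you remove depends on the variant you had installed, so double-check the post before running anything):

```shell
# Remove the main-repo driver packages that dropped Pascal support
# (use "nvidia" / "nvidia-dkms" instead if that's what you had installed)
sudo pacman -Rns nvidia-open-dkms nvidia-utils

# Build and install the legacy 580xx branch from the AUR
# (paru shown here as an example AUR helper)
paru -S nvidia-580xx-dkms nvidia-580xx-utils

# Regenerate the initramfs so the legacy module is picked up, then reboot
sudo mkinitcpio -P
```

The downside is that you’re now on an AUR package, so it rebuilds locally on kernel updates instead of coming prebuilt from the repos.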
He wants to use Mint. This is what is called planned obsolescence. I say what Linus Torvalds says.
I have a 1050 Ti running the 580 driver under Linux Mint; it works fine.
Might be able to use Mint Debian Edition.
Nouveau might be good enough by now for most games that will run on a 1060, maybe worth a try.
AFAIK they still don’t support reclocking on anything older than Turing, meaning the GPU is stuck at the lowest clock frequency and therefore runs very slowly.
Am I the only one that can’t manage to make their Nvidia GTX 1060 run correctly on Linux? It has way worse performance than on Windows, even with the proprietary drivers.
I’ve tried both Kubuntu and Linux Mint.
I’ve got my 1060 running OK on Kubuntu, though it was my wife’s when it was running Windows, so I can’t compare the performance. But I’m able to stream Cyberpunk in 1080p via Sunshine to Moonlight on my Apple TV, and it runs just fine.