Well I am shocked, SHOCKED I say! Well, not that shocked.

  • sp3ctr4l@lemmy.dbzer0.com · 1 month ago

    In the US, a new RTX 5090 currently costs $2899 at NewEgg, and has a max power draw of 575 watts.

    (Lowest price I can find)

    … That is a GPU, with roughly the cost and power usage of an entire, quite high end, gaming PC from 5 years ago… or even just a reasonably high end PC from right now.

    The entire move to the realtime raytracing paradigm, which has enabled AAA game devs to get very sloppy with development by not really bothering to optimize any lighting, nor textures… which has necessitated the invention of intelligent temporal frame upscaling and frame generation… the whole, originally advertised point of all this was to make high-fidelity 4K gaming an affordable reality.

    This reality is a farce.

    Meanwhile, if you jump down to 1440p, well, I’ve got a future build plan sitting in a NewEgg wishlist right now.

    RX 9070 (220 W) + Minisforum BD795i SE (mobo + non-removable, high-end AMD laptop CPU with performance comparable to a 9900X at about half the wattage draw)… so far my pretax total for the whole build is under $1500, and, while I need to double- and triple-check this, I think the math on the power draw works out to a 650 watt power supply being all you’d need… potentially with enough room to also add in some extra internal HDD storage drives, i.e., you’ve got leftover wattage headroom (rough sketch below).
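
    (Back-of-the-envelope version of that power math; the wattages below are my rough assumptions, not measured numbers:)

        # Rough PSU headroom check for the build above.
        # All wattages are ballpark assumptions, not measurements.
        parts = {
            "RX 9070 (board power)": 220,
            "BD795i SE (laptop CPU + board)": 100,
            "RAM, NVMe, fans, misc": 60,
        }
        load = sum(parts.values())
        psu = 650
        print(f"estimated load: {load} W")                  # ~380 W
        print(f"headroom on a {psu} W PSU: {psu - load} W")
        # A couple of extra 3.5" HDDs (~10 W each under load) still fit easily.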

    If you want to go a bit over the $1500 mark, you could fit this all in a console sized ITX case.

    That is almost half the cost of the RTX 5090 alone, and will get you over 90 fps in almost all modern games at ultra settings at 1440p. You will have to futz around with intelligent upscaling and frame gen if you want realtime raytracing at similar framerates, and realistically you’ll probably have to wait another quarter or two for AMD driver support and FSR 4 to become a bit more mature and properly implemented in said games.

    Or you could swap in maybe a 5070 (non-Ti; the Ti is $1000 more) Nvidia card, but seeing as I’m building a Linux gaming PC, you know, for the performance boost from not running Windows, AMD’s Mesa drivers are where you wanna be.

    • CybranM@feddit.nu · 1 month ago

      The entire move to the realtime raytracing paradigm, which has enabled AAA game devs to get very sloppy with development by not really bothering to optimize any lighting, nor textures

      You clearly don’t know what you’re talking about here. Ray tracing has nothing to do with textures, and very few games force you to use RT. What is “allowing” devs to skimp on optimization (which is also questionable; older games weren’t perfect either) is DLSS and other dynamic resolution + upscaling tech.

      • lordbritishbusiness@lemmy.world · 1 month ago

        Doom: The Dark Ages is possibly what they’re referring to. id skipped traditional lighting in favour of having ray tracing do it.

        Bethesda also has a tendency to use HD textures on features like grass and terrain, which could safely be low-res.

        There is a fair bit of inefficient code floating around because optimisation is considered more expensive than throwing more hardware at a problem, and not just in games. (Bonus points if you outsource the optimisation to someone else’s hardware or the modding community.)

        • sp3ctr4l@lemmy.dbzer0.com · 1 month ago

          That is a prominent example of forced RT… basically, as I described with the TAA example in my other reply…

          idTech 8 seems to be the first engine that just literally requires RT for its entire render pipeline to work.

          They could theoretically build another version of it on a Vulkan base without the RT requirement, to let you turn RT off… but that would likely be a massive amount of work.

          On the bright side… at least the idTech engines are actually well coded, and they put a lot of time into making the engine actually very good.

          I didn’t follow the marketing ecosystem for Doom: The Dark Ages, but it would have been really shitty if they did not make clear that you need a GPU with RT cores.

          On the other end of the engine spectrum:

          Bethesda… yeah, they have entirely lost control of their engine; it is a mangled mess of nonsense. The latest Oblivion remaster just uses UE to render things, slapped on top of Gamebryo, because no one at Bethesda can actually code worth a damn.

          Compare that to oh I dunno, the Source engine.

          Go play Titanfall 2. It’s a 10-year-old game now, built on a modified version of the Portal 2 Source engine.

          Still looks great, runs very efficiently, can scale down to older hardware.

          Ok, now go play HL Alyx. If you don’t have VR, there are mods that do a decent job of converting it into M+K.

          Looks great, runs efficiently.

          None of them use RT.

          Because you don’t need to, if you take the time to actually optimize both your engine and game design.

      • sp3ctr4l@lemmy.dbzer0.com · 1 month ago

        I meant they also just don’t bother to optimize texture sizes, didn’t mean to imply they are directly related to ray tracing issues.

        Also… more and more games are clearly being designed, and marketed, with ray tracing in mind.

        Sure, it’s not absolutely forced on in too many games… but TAA often is forced on, because no one can run raytracing without temporal intelligent upscaling and frame gen…

        …and a lot of games just feed the pixel motion vectors from their older TAA implementations into the DLSS / FSR implementations, and don’t bother to recode the TAA into just giving the motion vectors as an optional API that doesn’t actually do AA…

        … and they often don’t do that because they designed their entire render pipeline to only work with TAA on, and half the game’s post-processing effects would have to be recoded to work without TAA.

        So if you summarize all that: the ‘design for raytracing support’ standard is why many games do not let you turn off TAA.
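
        (A toy sketch of that coupling; the function names and numbers are hypothetical, just to show why exposing motion vectors separately from TAA is the ‘recode’ in question:)

            # Hypothetical illustration: if motion vectors only come out of the
            # TAA pass, a DLSS/FSR-style upscaler drags TAA along with it.

            def taa_pass(curr, prev):
                """Temporal AA: blends frames AND is the only source of motion vectors."""
                motion_vectors = [c - p for c, p in zip(curr, prev)]       # stand-in math
                blended = [0.9 * c + 0.1 * p for c, p in zip(curr, prev)]
                return blended, motion_vectors

            def motion_vector_pass(curr, prev):
                """Decoupled version: vectors only, no temporal blending of the image."""
                return [c - p for c, p in zip(curr, prev)]

            def upscale(frame, motion_vectors):
                return frame  # placeholder for the actual DLSS/FSR reconstruction

            curr, prev = [0.5, 0.7, 0.2], [0.4, 0.7, 0.3]
            blended, mv = taa_pass(curr, prev)                    # coupled: TAA tags along
            out_coupled = upscale(blended, mv)
            out_decoupled = upscale(curr, motion_vector_pass(curr, prev))  # TAA optional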

        That being said: in many (not all, but many) situations, ray tracing only really makes a significant visual difference… if you have very high-res textures.

        If you don’t, older light rendering methods work almost as well, and run much, much faster.

        Ray tracing involves… you know, light rays, bouncing off of models, with textures on them.

        Like… if you have a car with a glossy finish that reflects the entire scene around it in its paint… well, if the reflection map being layered over the base car texture is very low-res, because it is generated from a world of low-res textures… you might as well just use the old cube map method (or other methods) and not bother turning every reflective surface into a ray-traced mirror.
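
        (Toy numbers of my own, not from any engine, to make that concrete: the angular detail a reflection can show is capped by the resolution of whatever is being reflected, whether you ray trace it or sample a baked cube map.)

            def reflect(view, normal):
                """Mirror reflection: r = v - 2*(v.n)*n."""
                d = sum(v * n for v, n in zip(view, normal))
                return tuple(v - 2 * d * n for v, n in zip(view, normal))

            def degrees_per_texel(env_face_res):
                """Very rough angular detail of one cube map face (~90 degrees wide)."""
                return 90.0 / env_face_res

            print(reflect((0.0, -1.0, 0.0), (0.0, 1.0, 0.0)))      # (0.0, 1.0, 0.0)
            print(f"low-res source (128 px):   {degrees_per_texel(128):.2f} deg/texel")
            print(f"high-res source (2048 px): {degrees_per_texel(2048):.2f} deg/texel")
            # If the reflected surroundings only carry ~128 px of detail, a ray
            # traced mirror resolves nothing a cheap cube map lookup wouldn't.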

        Or, if you’re doing accumulated lighting in a scene with different colors of lights… that effect is going to be more dramatic, more detailed, more noticeable in a scene with higher-res textures on everything being lit.

        I could write a 60 page report on this topic, but no one is paying me to, so I’m not going to bother.

    • CheeseNoodle@lemmy.world · 1 month ago

      Saved up for a couple of years and built the best (consumer grade) non-Nvidia PC I could: 9070 XT, 9950X3D, 64 GB of RAM. Pretty much top-end everything that isn’t Nvidia or just spamming redundant RAM for no reason. The whole thing still costs less than a single RTX 5090 and on average draws less power too.

        • CheeseNoodle@lemmy.world · 1 month ago

          I tried Mint and Ubuntu, but Linux dies a horrific death trying to run newly released hardware, so I ended up on Ghost Spectre.
          (I also assume you’re being sarcastic, but I’m still salty about wasting a week trying various pieces of advice to make Linux goddamn work.)

          • bitwolf@sh.itjust.works · 1 month ago

            Level1Techs had relevant guidance:

            Kernel 6.14 or greater, Mesa 25.1 or greater.

            I don’t think Ubuntu and Mint have those yet, hence your difficult time.
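
            (Quick way to sanity-check those two versions, assuming a Linux box with glxinfo from mesa-utils installed:)

                import platform
                import re
                import subprocess

                # Check kernel >= 6.14 and Mesa >= 25.1 (the requirements above).
                def version_tuple(text):
                    m = re.search(r"(\d+)\.(\d+)", text)
                    return tuple(int(x) for x in m.groups()) if m else (0, 0)

                kernel = version_tuple(platform.release())
                glx = subprocess.run(["glxinfo", "-B"], capture_output=True, text=True).stdout
                mesa_line = next((ln for ln in glx.splitlines() if "OpenGL version" in ln), "")
                mesa = version_tuple(mesa_line.split("Mesa")[-1]) if "Mesa" in mesa_line else (0, 0)

                print("kernel", kernel, "ok" if kernel >= (6, 14) else "too old")
                print("mesa  ", mesa, "ok" if mesa >= (25, 1) else "too old")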

      • sp3ctr4l@lemmy.dbzer0.com · 1 month ago

        Yep, that’s gonna be significantly more powerful than my planned build… and likely somewhere between $500 and $1000 more expensive… but yep, that is how absurd this is: all of that is still less expensive than an RTX 5090.

        I’m guessing you could get all of that to work with a 750 W PSU, 850 W if you also want to have a bunch of storage drives or a lot of cooling, but yeah, you’d only need that full wattage for running raytracing in 4k.

        Does that sound about right?

        Either way… yeah… imagine an alternate timeline where marketing and industry direction isn’t bullshit, where people actually admit things like:

        Consoles cannot really do what they claim to do at 4K… at actual 4K.

        They use checkerboard upscaling, so they’re basically running at 2K and scaling up, and it’s actually less than 2K in demanding raytraced games, because they’re using FSR or DLSS as well. Oh, and the base graphics settings are a mix of what PC gamers would call medium and high, but console gamers aren’t shown real graphics settings menus, so they don’t know that.
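
        (Rough pixel math on that, assuming ‘proper 4K’ means 3840×2160 and ‘2K’ means 2560×1440:)

            native_4k = 3840 * 2160           # 8,294,400 pixels
            render_1440p = 2560 * 1440        # 3,686,400 pixels
            checkerboard_4k = native_4k // 2  # checkerboarding shades ~half the pixels per frame

            print(f"native 4K:       {native_4k:,} px")
            print(f"checkerboard 4K: {checkerboard_4k:,} px shaded (~{checkerboard_4k / native_4k:.0%})")
            print(f"native 1440p:    {render_1440p:,} px (~{render_1440p / native_4k:.0%} of 4K)")
            # A "4K" checkerboarded frame shades roughly half the pixels of real 4K,
            # i.e. only a bit more than a native 1440p frame.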

        Maybe, maybe we could have focused on perfecting frames-per-watt and frames-per-dollar efficiency at 2K, instead of being baffled with marketing BS claiming we can just leapfrog to 4K, and more recently that 8K displays make any goddamned sense at all, when in 95% of home setups, of any kind, they have no physically possible perceptible gains.
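
        (The 8K point checks out on the back of an envelope too; the 65″ screen and 2.5 m couch distance are my assumptions, and ~60 pixels per degree is the usual figure for 20/20 acuity:)

            import math

            def pixels_per_degree(horizontal_px, screen_width_m, distance_m):
                fov_deg = 2 * math.degrees(math.atan(screen_width_m / (2 * distance_m)))
                return horizontal_px / fov_deg

            width_65in = 1.44   # approx. horizontal width of a 65" 16:9 TV, in metres
            distance = 2.5      # assumed couch distance, in metres

            for label, px in [("4K", 3840), ("8K", 7680)]:
                print(f"{label}: {pixels_per_degree(px, width_65in, distance):.0f} px/deg")
            # 4K already lands around ~120 px/deg here, double the ~60 px/deg a
            # 20/20 eye resolves, so 8K's extra pixels buy nothing perceptible.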

        • CheeseNoodle@lemmy.world · 1 month ago

          1000W PSU for theoretical maximum draw of all components at once with a good safety margin. But even when running a render I’ve never seen it break 500W.

  • Deflated0ne@lemmy.world · 30 days ago

    Ah capitalism…

    Endless infinite growth forever on a fragile and very much finite planet where wages are suppressed and most money is intentionally funneled into the coffers of a small handful of people who are already so wealthy that their descendants 5 generations down the line will still be some of the richest people on the planet.

  • MystikIncarnate@lemmy.ca · 30 days ago

    I was gifted a 2080 Ti about a year or so ago and I have no intention of upgrading anytime soon. The former owner of my card is a friend who had it in their primary gaming rig; back when SLI wasn’t dead, he had two.

    So when he built a new main rig with a single 4090 a few years back he gifted me one and the other one he left in his old system and started using that as a spare/guest computer for having impromptu LANs. It’s still a decent system, so I don’t blame him.

    In any case, that upgraded my primary computer from a 1060 3GB… so it was a welcome change to have sufficient video memory again.

    The cards keep getting more and more power hungry and I don’t see any benefit in upgrading… Not that I can afford it… I haven’t been in school for a long time, and lately, I barely have time to enjoy YouTube videos, nevermind a full assed game. I literally have to walk away from a game for so long between sessions that I forget the controls. So either I can beat the game in one sitting, or the controls are similar enough to the defaults I’m used to (left click to fire, right click to ADS, WASD for movement, ctrl or C for crouch, space to jump, E to interact, F for flashlight, etc etc…); that way I don’t really need to relearn anything.

    This is a big reason why I haven’t finished some titles that I really wanted to, like TLoU, or Doom Eternal… Too many buttons to remember. It’s especially bad with doom, since if you don’t remember how, and when to use your specials, you’ll run out of life, armor, ammo, etc pretty fast. Remembering which special gives what and how to trigger it… Uhhh … Is it this button? Gets slaughtered by an imp … Okay, not that button. Reload let’s try this… Killed by the same imp not that either… Hmmm. Goes and looks at the key mapping ohhhhhh. Okay. Reload I got it this time… Dies anyways due to other reasons

    Whelp. Quit maybe later.

  • gravitas_deficiency@sh.itjust.works · 30 days ago

    Nvidia doesn’t really care about the high-end gamer demographic nearly as much as they used to, because it’s no longer their bread and butter. Nvidia’s cash cow at this point is supplying hardware for ML data centers. It’s an order of magnitude more lucrative than serving the consumer + enthusiast market.

    So my next card is probably gonna be an RX 9070XT.

    • ameancow@lemmy.world · 30 days ago

      Even the RX 9070 is running around $900 USD; I cannot fathom affording even state-of-the-art gaming from years ago at this point. I am still using a GTX 1660, playing games from years ago I never got around to, and having a grand time. Most adults I know are in the same boat and are either not even considering upgrading their PC or playing their kid’s console games.

      Every year we say “Gonna look into upgrading” but every year prices go up and wages stay the same (or disappear entirely as private-equity ravages the business world, digesting every company that isn’t also a private equity predator) and the prices of just living and eating are insane, so at this rate, a lot of us might start reading again.

      • jacksilver@lemmy.world · 30 days ago

        It makes me wonder if this will bring more people back to consoles. The library may be more limited, but when a console costs less than just a GPU, it’ll be more tempting.

  • Anomalocaris@lemm.ee · 1 month ago

    Plus, I have a 3060, and it’s still amazing.

    don’t feel the need to upgrade at all.

    • adarza@lemmy.ca · 1 month ago

      Me neither. My best is a 1070. I don’t play newer ‘demanding’ games, nor do I have a system ‘worthy’ of a better card anyway.

    • SuiXi3D@fedia.io · 1 month ago

      Yeah, my 2080ti can run everything sans ray traced stuff perfectly, though I also haven’t had any issues with Indiana Jones or Doom: The Dark Ages.

      • altkey@lemmy.dbzer0.com · 1 month ago

        Akschually, Doom: The Dark Ages needs to have raytracing enabled at all times, and your card is in the first Nvidia gen that has it. While the 10xx and 20xx series haven’t shown much of a difference, and both are still okay for average gaming, this is the planned divide card producers wanted. The ‘RTX ON’ ads’ visuals were fancy at best (imho) while consuming too many resources, and now there’s the first game that doesn’t function without it, pushing consumers to either upgrade their hardware or miss out on big hits. Not the first time it has happened, but it gives a sense of why there was a lot of media noise about that technology in the beginning.

  • Allemaniac@lemmy.world · 1 month ago

    I’m sitting on a 3060 TI and waiting for the 40-series prices to drop further. Ain’t no universe where I would pay full price for the newest gens. I don’t need to render anything for work with my PC, so a 2-3 year old GPU will do just fine

    • Warehouse@lemmy.ca · 30 days ago

      I’m pretty sure that production of the 40 series has stopped. The 50 series uses the same node.

    • okmko@lemmy.world · 1 month ago

      Exact same here, and I’m not upgrading any time soon. MH Wilds runs like ass no matter the card, and I’m back to playing HoI4 in the meantime.

  • Phoenicianpirate@lemm.ee · 1 month ago

    I bought my most expensive dream machine last year (when the RTX 4090 was still the best) and I am proud of it. I hope it’ll be my rig for at least 10 years.

    But it was expensive.

    • Psythik@lemm.ee · 1 month ago

      Also built a dream machine in 2022. I have a 4090, a 7700X, 32GB of DDR5 6000, and 8TB of NVME storage. It’s got plenty of power for my needs; as long as I keep getting 90+ FPS @ 4K and programs keep opening instantly, I’m happy. And since I bought into the AM5 platform right at the beginning of it, I can still upgrade my CPU in a few years and have a brand new, high end PC again for just a few hundred bucks.

  • GrindingGears@lemmy.ca · 1 month ago

    The PC industry has turned into a scuzzy hellscape for average Joes who just want decent options at realistic prices. They don’t even care about gaming anymore; it’s about YouTube and BitcoinBruhzz now.

    I’ve still got a pretty decent setup (5800X3D, 4070 Ti), but it’s the last stand for this guy, I’m afraid. Looking over the past decade or so, I’ve honestly had better gaming experiences on consoles for mere fractions of the price of a PC build, mods and PC-master-race nonsense aside. Sure, you don’t need a subscription for online PC play (I rarely play online), but you can barely get a processor for what a PS5 costs anymore, let alone a video card, which is upwards of a lot of people’s take-home pay for a month, the way things are going.

  • localhost443@discuss.tchncs.de · 1 month ago

    Bought a 5700 XT on release for £400 and ran that ’til last year, when the 7900 GRE released in the UK. Can’t remember what I paid, but it was a lot less than the flagship 7900, and I foresee it lasting many years as I have no desire to go above 2K.

    AMD GPUs have been pretty great value compared to nvidia recently as long as you’re not tying your self worth to your average FPS figures.

    • vividspecter@lemm.ee · 1 month ago

      It doesn’t help that the gains have been smaller, and the prices higher.

      I’ve got an RX 6800 I bought in 2020, and nothing but the 5090 is a significant upgrade, and I’m sure as fuck not paying that kind of money for a video card.

      • GrindingGears@lemmy.ca · 1 month ago

        Not to mention the cards have gotten huge and you just about need a nuclear reactor to power them. Melting cables and all.

      • arudesalad@sh.itjust.works · 1 month ago

        I have a 6700 XT and 5700X, and my PC can do VR and play Star Citizen; those are the most demanding things I do on it. Why should I spend almost £1000 to get a 5070 or 9070 and an AM5 board + processor?

      • harxanimous@lemmy.today · 30 days ago

        Well that depends on your definition of significant. Don’t get me wrong, the state of the GPU market is not consumer friendly, but even an RX 9070 provides over a 50% performance uplift over the RX 6800.

      • AndyMFK@lemmy.dbzer0.com · 1 month ago

        I just picked up a used RX 6800 XT after doing some research and comparing prices.

        The fact that a GPU this old can outperform or match most newer cards at a fraction of the price is insane, but I’m very happy with my purchase. Solid upgrade from my 1070 Ti.

      • ByteJunk@lemmy.world · 1 month ago

        I’m in the same boat.

        In general, there’s just no way I could ever justify buying an Nvidia card in terms of bang per buck; it’s absolutely ridiculous.

        I’ll fork over four digits for a graphics card when salaries go up by a digit as well.

    • Jesus_666@lemmy.world · 1 month ago

      When did it just become expected that everybody would upgrade GPUs every year and that’s supposed to be normal?

      Somewhere around 1996 when the 3dfx Voodoo came out. Once a year was a relatively conservative upgrade schedule in the late 90s.

        • Jesus_666@lemmy.world · 1 month ago

          That’s still not cheap when you account for inflation. Of course there’s a world of difference between “not cheap” and what they charge these days.

    • Sixty@sh.itjust.works · 1 month ago

      Sticking with 1440p on desktop has gone very well for me. 2160p isn’t worth the costs in money or perf.

    • 474D@lemmy.world · 1 month ago

      “When did it just become expected that everybody would upgrade GPUs every year and that’s supposed to be normal?” - that’s a really good question, because I don’t think normal PC gamers have ever been, or still are, like that. It’s basically part of the culture to stretch your GPU as long as it’ll go, so idk who you’re complaining about. Yeah, GPU prices are bullshit rn, but let’s not make stuff up.

      • zurohki@aussie.zone · 1 month ago

        Nah, there was a time when you’d get a new card every two years and it’d be twice as fast for the same price.

        Nowadays the new cards are 10% faster for 15% more money.
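
        (Quick math with those made-up round numbers, just to show why it feels worse than it sounds:)

            # 10% faster for 15% more money means perf-per-dollar goes DOWN.
            old_perf, old_price = 100, 500              # arbitrary baseline units
            new_perf, new_price = old_perf * 1.10, old_price * 1.15

            old_ppd = old_perf / old_price
            new_ppd = new_perf / new_price
            print(f"old: {old_ppd:.3f} perf/$   new: {new_ppd:.3f} perf/$")
            print(f"change: {new_ppd / old_ppd - 1:+.1%}")   # about -4.3%
            # The old cadence (twice the speed, same price) was +100% perf/$.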

        I bought a new card last year after running a Vega 64 for ages and I honestly think it might last me ten years because things are only getting worse.

    • Cethin@lemmy.zip · 1 month ago

      It’s never been normal to upgrade every year, and it still isn’t. Every three years is probably still more frequent than normal. The issue is there haven’t been reasonable prices for cards for like 8 years, and it’s worse more recently. People who are “due” for an upgrade aren’t because it’s unaffordable.

      • Robust Mirror@aussie.zone · 1 month ago

        If consoles can last 6-8 years per gen so can my PC.

        Your PC can run 796 of the top 1000 most popular games listed on PCGameBenchmark - at a recommended system level.

        That’s more than good enough for me.

        I don’t remember exactly when I built this PC but I want to say right before covid, and I haven’t felt any need for an upgrade yet.

    • missingno@fedia.io · 1 month ago

      I don’t think they’re actually expecting anyone to upgrade annually. But there’s always someone due for an upgrade, however long it’s been for them. You can compare what percentage of users upgraded this year to previous years.

      • dditty@lemmy.dbzer0.com · 1 month ago

        I just finally upgraded from a 1080 Ti to a 5070 Ti. At high refresh-rate 1440p the 1080 Ti was definitely showing its age and certain games would crash (even with no GPU overclock). Fortunately I was able to get a PNY 5070 Ti for only ~$60 over MSRP at the local Microcenter.

        5000 series is a pretty shitty value across the board, but I got a new job (and pay increase) and so it was the right time for me to upgrade after 8 years.

    • qweertz (they/she)@programming.dev · 25 days ago

      Still rocking a GTX 1070 and I plan on using my Graphene OS Pixel 8 Pro till 2030 (only bought it (used ofc) bc my Huawei Mate 20 Pro died on me in October last year 😔)

  • Alphane Moon@lemmy.world · 1 month ago

    It seems like gamers have finally realized that the newest GPUs by NVIDIA and AMD are getting out of reach, as a new survey shows that many of them are skipping upgrades this year.

    Data on GPU shipments and/or POS sales showing a decline would be much more reliable than a survey.

    Surveys can at times suffer from capturing what respondents want to say, as opposed to what they actually do.

      • Alphane Moon@lemmy.world · 1 month ago

        That’s why it’s best to focus on absolute unit shipment numbers/POS.

        If total units increased compared to the previous generation launch, then people are still buying GPUs.

          • Alphane Moon@lemmy.world · 1 month ago

            Shipment/POS don’t tell you anything about unfulfilled demand or “unrealized supply”.

            It’s just how many units were shipped into the channel and sold at retail, respectively.

            These are the best data points we have to understand demand dynamics.

            Gamers are also a notoriously dramatic demographic that often doesn’t follow through on what it says.

    • MudMan@fedia.io · 1 month ago

      I mean, as written the headline statement is always true.

      I am horrified by some of the other takeaways, though:

      Nearly 3 in 4 gamers (73%) would choose NVIDIA if all GPU brands performed equally.
      
      57% of gamers have been blocked from buying a GPU due to price hikes or scalping, and 43% have delayed or canceled purchases due to other life expenses like rent and bills.
      
      Over 1 in 4 gamers (25%) say $500 is their maximum budget for a GPU today.
      
      Nearly 2 in 3 gamers (62%) would switch to cloud gaming full-time if latency were eliminated, and 42% would skip future GPU upgrades entirely if AI upscaling or cloud services met their performance needs.
      
      • GrindingGears@lemmy.ca · 1 month ago

        That last one is especially horrifying. You don’t own games when you cloud game; you simply lease them. We all know what that’s done for the preservation of games. Not to mention encouraging the massive amounts of shovelware that we get flooded with.

        • MudMan@fedia.io · 1 month ago

          I don’t know that cloud gaming moves shovelware in either direction, but it really sucks to see the percentage of people that don’t factor ownership into the process at all, at least on paper.

        • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 🇮 @pawb.social · 1 month ago

          You don’t own games when you cloud game, you simply lease them.

          That’s also how it is with a game you purchased to play on your own PC, though. Unless you have it on physical media, your access could be revoked at any time.

      • xep@fedia.io · 1 month ago

        if latency were eliminated

        I’m sure we’d all switch to room temperature fusion for power if we could, too, or use superconductors in our electronics.

        • MudMan@fedia.io · 1 month ago

          That’s the problem with surveys, isn’t it? What’s “latency being eliminated”? In principle it’d mean your streamed game responds as quickly as a local game, which is entirely achievable if your baseline is a 30 fps client on a handheld device versus 60 fps gameplay streamed from a much more powerful server. We can do that now (rough numbers below).
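
          (Back-of-envelope numbers for that comparison; the 2.5 frames of pipeline latency, 25 ms round trip and 10 ms encode/decode are assumptions, not measurements:)

              def pipeline_ms(fps, frames_of_latency=2.5):
                  return frames_of_latency * 1000.0 / fps

              local_handheld_30fps = pipeline_ms(30)        # ~83 ms input-to-photon
              streamed_60fps = pipeline_ms(60) + 25 + 10    # ~77 ms with network + codec

              print(f"local 30 fps handheld: ~{local_handheld_30fps:.0f} ms")
              print(f"streamed 60 fps:       ~{streamed_60fps:.0f} ms")
              # Against a 30 fps handheld baseline the stream really can feel
              # "latency free"; against a 240 Hz desktop (~10 ms) it can't.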

          But is that “latency free” if you’re comparing it to running something at 240 Hz on your gaming PC? With or without frame generation and upscaling? 120 Hz raw? 60 Hz on console?

          The question isn’t whether you can get latency free; the question is at what point in that chain the average survey-answering gamer starts believing the hype about “latency free streaming”?

          Which is irrelevant to me, because the real problem with cloud gaming has zero to do with latency.

    • snoons@lemmy.ca · 1 month ago

      It really depends on whether they hired a professional cognitive psychologist to write the survey for them. I doubt they did…

      • Lord Wiggle@lemmy.world · 1 month ago

        Nvidia is one of the most evil companies out there, responsible for killing nearly all other GPU producers and destroying the market.

        • GrindingGears@lemmy.ca · 1 month ago

          So is AMD, with literally three video cards in stock for all of North America at launch, which in turn just fuels the scalpers. Downvote this all you want, guys: AMD is just as complicit in all of this; they’ve fuelled this bullshit just as much.

          • Lord Wiggle@lemmy.world · 1 month ago

            Nvidia is singlehandedly responsible for killing all competition but AMD. They destroyed all other GPU companies with the nastiest tactics to dominate the market; only AMD has been able to survive. You can’t blame AMD for chip shortages; that’s the aftershock of the COVID pandemic. Never has there been higher demand for chips, especially thanks to the rising EV market.

            You can’t say AMD is as bad as Nvidia, as Nvidia is the sole reason the market got ruined in the first place. They are the worst of the worst.

            And don’t forget diaper Donny, who destroyed international trade with his fucking tariff wars.

    • qweertz (they/she)@programming.dev · 1 month ago

      I’ve been on Linux since 2018 (my PC is from 2016), and my next GPUs will always be AMD, unless Intel somehow manages to produce an on-par GPU.