• merthyr1831@lemmy.ml · ↑7 · 10 hours ago

      Nvidia is just doing what every monopoly does, and AMD is just playing into it like they did against Intel on CPUs. They’ll keep competing on price-to-performance for a few years, then drop something that puts them back on top (or at least near it).

    • VindictiveJudge@lemmy.world · ↑2 ↓10 · 2 days ago

      Unfortunately, that’s the anti-scalper countermeasure. Crippling their crypto mining potential didn’t impact scalping very much, so they increased the price with the RTX 40 series. The RTX 40s were much easier to find than the RTX 30s were, so here we are for the RTX 50s. They’re already on the edge of what people will pay, so they’re less attractive to scalpers. We’ll probably see an initial wave of scalped 5090s for $3500-$4000, then it will drop off after a few months and the market will mostly have un-scalped ones with fancy coolers for $2200-$2500 from Zotac, MSI, Gigabyte, etc.

      • b34k@lemmy.world · ↑3 · 8 hours ago

        The switch from proof of work to proof of stake in ETH right before the 40 series launch was the primary driver of the increased availability.

      • nova_ad_vitum@lemmy.ca · ↑3 · 11 hours ago

        The existence of scalpers means demand exceeds supply. Pricing them this high is a countermeasure against scalpers… in that Nvidia wants to make the money that scalpers would have made.

      • SaltySalamander@fedia.io · ↑1 · 9 hours ago

        No, it’s a direct result of observing the market during those periods and seeing the lemmings beating down doors to pay 600-1000 dollars over MSRP. They realized the market is stupid and will bear the extra cost.

      • MDCCCLV@lemmy.ca · ↑3 · 13 hours ago

        Not really a countermeasure, but the scalping certainly proved that there are a lot of people willing to buy their stuff at high prices.

  • geneva_convenience@lemmy.ml · ↑49 ↓1 · 2 days ago

    “By rendering only 25% of the frames, we made DLSS 4 100% faster than DLSS 3, which only renders 50% of the frames!” - NVIDIA, unironically

    • ZeroHora@lemmy.ml · ↑20 ↓1 · 2 days ago (edited)

      You’re living in the past; rendering 100% of the frames is called Brute Force Rendering, and that’s for losers.

      With only 2k trump coins our new graphics card can run Cyberpunk 2077, a game from 4 years ago, at 30 fps with RTX ON. But you see, with DLSS and all the other crap magic we can run it at 280 FPS!!! Everything is blurry and ugly as fuck, but look at the numbers!!!

  • KamikazeRusher@lemm.ee · ↑59 · 2 days ago

    Maybe I’m stuck in the last decade, but these prices seem insane. I know we’ve yet to see what a 5050 (lol) or 5060 would be capable of, or its price point. However, launching your lowest card at $549 means a significant share of the consumer base won’t be able to buy any of these.

      • KamikazeRusher@lemm.ee · ↑1 · 9 hours ago

        Yeah, I keep forgetting how much time has passed.

        Bought my first GPU, an R9 Fury X, for MSRP when it launched. The R9 300 series and GTX 900 series seemed fairly priced then (aside from the Titan X). Bought another for Crossfire and mining, holding on until I upgraded to a 7800 XT.

        Comparing prices, all but the 5090 are within $150 of each other when accounting for inflation. The 5090 is stupid expensive. A $150 increase in price over a 10-year period probably isn’t that bad.
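
        For anyone curious, the rough math looks like this. A minimal sketch, assuming the Fury X’s ~$649 launch MSRP and roughly 34% cumulative US inflation from 2015 to 2025 (both are assumptions for illustration, not official figures), next to the announced RTX 50-series MSRPs:

        ```python
        # Rough sketch of the inflation comparison above. The Fury X MSRP and the
        # cumulative-inflation factor are assumptions for illustration only.
        INFLATION_2015_TO_2025 = 1.34     # assumed ~34% cumulative US inflation
        FURY_X_2015_MSRP = 649            # assumed R9 Fury X launch price in USD

        rtx_50_msrp = {"RTX 5070": 549, "RTX 5070 Ti": 749, "RTX 5080": 999, "RTX 5090": 1999}

        adjusted = FURY_X_2015_MSRP * INFLATION_2015_TO_2025
        print(f"R9 Fury X: $649 in 2015 is roughly ${adjusted:.0f} in 2025 dollars")
        for card, price in rtx_50_msrp.items():
            print(f"{card}: ${price}")
        ```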

        I’m still gonna complain about it and embrace my inner “old man yells at prices” though.

    • Stovetop@lemmy.world · ↑26 · 2 days ago (edited)

      Sadly I think this is the new normal. You could buy a decent GPU, or you could buy an entire game console. Unless you have some other reason to need a strong PC, it just doesn’t seem worth the investment.

      At least Intel are trying to keep their prices low. Until they either catch on, in which case they’ll raise prices to match, or they fade out and leave everyone with unsupported hardware.

      • GoodEye8@lemm.ee · ↑20 ↓1 · 2 days ago

        Actually AMD has said they’re ditching their high end options and will also focus on budget and midrange cards. AMD has also promised better raytracing performance (compared to their older cards), so I don’t think it will be the new norm if AMD also prices their cards competitively with Intel. The high end cards will be overpriced, as it seems the target audience doesn’t care that they’re paying a shitton of money. But budget and midrange options might slip away from Nvidia and get cheaper, especially if the upscaler crutch breaks and devs have to start doing actual optimization for their games.

        • moody@lemmings.world · ↑14 ↓2 · 2 days ago

          Actually AMD has said they’re ditching their high end options

          Which means there’s no more competition in the high-end range. AMD was lagging behind Nvidia in terms of pure performance, but the price/performance ratio was better. Now they’ve given up a segment of the market, and consumers lose out in the process.

          • GoodEye8@lemm.ee · ↑16 · 2 days ago

            The high end crowd showed there’s no price competition, only performance competition, and they’re willing to pay whatever to get the latest and greatest. Nvidia isn’t putting a $2k price tag on the top-of-the-line card because it’s worth that much; they’re putting that price tag on it because they know the high end crowd will buy it anyway. The high end crowd has caused this situation.

            You call that a loss for the consumers; I’d say it’s a positive. The high end cards make up maybe 15% of the market (and I’m probably being generous here). AMD dropping the high end and focusing on mid-range and budget cards is much more beneficial for most users. Budget and mid-range cards make up the majority of the PC market. If the mid-range and budget cards are affordable, that’s much more worthwhile to most people than having high end cards be “affordable”.

            • moody@lemmings.world · ↑8 ↓1 · 2 days ago

              But they’ve been selling mid-range and budget GPUs all this time. They’re not adding to the existing competition there, because they already have a share of that market. What they’re doing is pulling out of a segment where there was (a bit of) competition, leaving a monopoly behind. If they do that, we can only hope that Intel puts out high-end GPUs to compete in that market, otherwise it’s Nvidia or nothing.

              Nvidia already had the biggest share of the high-end market, but now they’re the only player.

              • GoodEye8@lemm.ee · ↑7 · 2 days ago

                It’s already Nvidia or nothing. There’s no point fighting with Nvidia in the high end corner because unless you can beat Nvidia in performance there’s no winning with the high end cards. People who buy high end cards don’t care about a slightly worse and slightly cheaper card because they’ve already chosen to pay premium price for premium product. They want the best performance, not the best bang for the buck. The people who want the most bang for the buck at the high end are a minority of a minority.

                But on the other hand, by dropping high end cards AMD can focus more on making their budget and mid-range cards better instead of diverting some of their focus on the high end cards that won’t sell anyway. It increases competition in the budget and mid-range section and mid-range absolutely needs stronger competition from AMD because Nvidia is slowly killing mid-range cards as well.

                • Naz@sh.itjust.works · ↑1 · 1 day ago

                  TIL, I’m a minority of a minority.

                  Overclocked an $800 AMD 7900 XTX to 3.4 GHz core with a +15% overvolt (1.35V), total power draw of 470W at an 86°C hotspot temp under 100% fan duty cycle.

                  Matches the 3DMark score in Time Spy for an RTX 4090D almost to the number.

                  63 FPS @ 1440p Ray Tracing: Ultra (Path Tracing On) in CP2077

    • simple@lemm.ee (OP) · ↑21 ↓2 · 2 days ago

      They’ll sell out anyways due to lack of good competition. Intel is getting there but still has driver issues, and AMD hasn’t announced their GPU prices yet, but their entire strategy is following Nvidia and lowering the price by 10% or something.

      • TonyOstrich@lemmy.world · ↑2 ↓2 · 2 days ago

        Weird completely unrelated question. Do you have any idea why you write “Anyway” as “Anyways”?

        It’s not just you, it’s a lot of people, but unlike most grammar/word modifications it doesn’t really make sense to me. Most of the time the modification shortens the word in some way rather than lengthening it. I could be wrong, but I don’t remember people writing or saying “anyway” with an added “s” in any way but ironically 10-15 years ago, and I’m curious where it may be coming from.

        • emeralddawn45@discuss.tchncs.de · ↑1 · 8 hours ago

          https://grammarist.com/usage/anyways/

          Although considered informal, anyways is not wrong. In fact, there is much precedent in English for the adverbial -s suffix, which was common in Old and Middle English and survives today in words such as towards, once, always, and unawares. But while these words survive from a period of English in which the adverbial -s was common, anyways is a modern construction (though it is now several centuries old).

        • Blisterexe@lemmy.zip · ↑4 · 1 day ago

          I also write “anyways” that way, and so does everyone I know. I think it’s a regional thing.

        • simple@lemm.ee (OP) · ↑3 · 2 days ago

          I guess I’m just used to saying it, since I spent a long time not knowing it was considered incorrect.

    • tburkhol@lemmy.world · ↑7 · 2 days ago

      So much of nvidia’s revenue is now datacenters, I wonder if they even care about consumer sales. Like their consumer level cards are more of an advertising afterthought than actual products.

  • inclementimmigrant@lemmy.world · ↑40 ↓1 · 2 days ago

    This is absolutely 3dfx levels of screwing over consumers, and it’s all about faking frames to get their “performance”.

    • Breve@pawb.social · ↑30 · 2 days ago

      They aren’t making graphics cards anymore, they’re making AI processors that happen to do graphics using AI.

        • Breve@pawb.social · ↑5 ↓1 · 1 day ago

          Oh yeah, for sure. I’ve run Llama 3.2 on my RTX 4080 and it struggles, but it’s not obnoxiously slow. I think they are betting more software will ship with integrated LLMs that run locally on users’ PCs instead of relying on cloud compute.
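
          That bet is already easy to prototype. A minimal sketch of a program calling a locally running LLM, assuming an Ollama server on its default port with a llama3.2 model pulled (the model name and endpoint follow Ollama’s documented /api/generate route; adjust for your setup):

          ```python
          # Minimal sketch: ask a locally running Llama 3.2 for text through Ollama's
          # HTTP API. Assumes `ollama serve` is running on the default port and
          # `ollama pull llama3.2` has been done; fields follow the /api/generate route.
          import json
          import urllib.request

          def ask_local_llm(prompt: str, model: str = "llama3.2") -> str:
              payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
              req = urllib.request.Request(
                  "http://localhost:11434/api/generate",
                  data=payload,
                  headers={"Content-Type": "application/json"},
              )
              with urllib.request.urlopen(req) as resp:
                  return json.loads(resp.read())["response"]

          print(ask_local_llm("In one sentence, what is DLSS frame generation?"))
          ```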

      • daddy32@lemmy.world · ↑2 · 2 days ago

        Except you cannot use them for AI commercially, or at least not in a data center setting.

        • Breve@pawb.social · ↑2 · 1 day ago

          Data centres want the even beefier cards anyhow, but I think nVidia envisions everyone running local LLMs on their PCs because it will be integrated into software instead of relying on cloud compute. My RTX 4080 can struggle through Llama 3.2.

  • The Hobbyist@lemmy.zip · ↑62 · 2 days ago (edited)

    The performance improvement claims are a bit shady, as they compare the old frame generation (FG) technique, which creates only one frame for every legit frame, with the next-gen FG, which can generate up to 3.

    All the Nvidia performance plots I’ve seen only mention this at the bottom, making the comparison look very favorable to the 5000-series GPUs when it isn’t apples to apples.
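
    A back-of-the-envelope illustration of why that flatters the new cards; the 30 FPS base is a made-up number, and it assumes both GPUs render the same number of real frames:

    ```python
    # Illustration: if both GPUs rendered the same number of "real" frames,
    # most of the advertised gap would come from the frame-gen multiplier,
    # not raw performance. The 30 FPS base is a made-up number.
    def displayed_fps(rendered_fps: float, generated_per_rendered: int) -> float:
        """Frames shown per second when N extra frames are generated per rendered frame."""
        return rendered_fps * (1 + generated_per_rendered)

    base = 30                          # hypothetical natively rendered FPS, same on both cards
    old_fg = displayed_fps(base, 1)    # old FG: one generated frame per real frame -> 60
    new_fg = displayed_fps(base, 3)    # new FG: up to three generated frames -> 120
    print(f"old FG: {old_fg} FPS, new FG: {new_fg} FPS, 'uplift': {new_fg / old_fg:.0%}")
    ```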


  • bitjunkie@lemmy.world · ↑16 · 2 days ago

    I’m sure these will be great options in 5 years when the dust finally settles on the scalper market and they’re about to roll out RTX 6xxx.

    • frezik@midwest.social · ↑15 ↓3 · 2 days ago (edited)

      Scalpers were basically nonexistent in the 4xxx series. They’re not some boogeyman that always raises prices. They work under certain market conditions, conditions which don’t currently exist in the GPU space, and there’s no particular reason to think this generation will be much different from the last.

      Maybe on the initial release, but not for long after.

      • bitjunkie@lemmy.world · ↑1 · 9 hours ago

        Scalpers were basically nonexistent in the 4xxx series.

        Bull fucking shit. I was trying to buy a 4090 for like a year. Couldn’t find anything even approaching retail. Most were $2.3k+.

      • Critical_Thinker@lemm.ee · ↑10 · 1 day ago (edited)

        The 4090 basically never went for MSRP until Q4 2024… and now it’s OOS everywhere.

        Nobody scalped the 4080 because it was shit price/perf. It was 75% of the price of a 4090 too… so why not just pay the extra 25% and get the best?

        The 4070 Ti (aka the base 4080) was too pricey to scalp, given that once you start cranking up the price, why not just pay the scalper fee for a 4090.

        Things below that are not scalp worthy.

        • SaltySalamander@fedia.io · ↑1 ↓1 · 9 hours ago

          The 4090 basically never went for MSRP until Q4 2024

          This had nothing to do with scalpers though. Just pure corporate greed.

    • Subverb@lemmy.world · ↑4 ↓1 · 1 day ago

      About two months ago I upgraded from a 3090 to a 4090. On my 1440p monitor I basically couldn’t tell the difference. I play mostly MMOs and ARPGs.

      • Critical_Thinker@lemm.ee · ↑2 · 18 hours ago (edited)

        Those genres aren’t really known for having brutal performance requirements. You have to play the bleeding-edge stuff that adds prototype graphics post-processing in its ultra or optional settings.

        When you compare non RT performance the frame delta is tiny. When you compare RT it’s a lot bigger. I think most of the RT implementations are very flawed today and that it’s largely snake oil so far, but some people are obsessed.

        I will say you can probably undervolt / underclock / power throttle that 4090 and get great frames per watt.
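
        If anyone wants to try the power-throttling part programmatically, here’s a rough sketch using NVML via the nvidia-ml-py bindings; the 300 W target is an arbitrary example (not a recommendation for any specific card), and changing the limit typically needs admin/root. Undervolting and underclocking are separate steps done in your tuning tool of choice.

        ```python
        # Rough sketch: cap GPU board power for better frames-per-watt using NVML
        # (pip install nvidia-ml-py). Needs admin/root; 300 W is an arbitrary example.
        import pynvml

        pynvml.nvmlInit()
        try:
            gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
            min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(gpu)
            current_mw = pynvml.nvmlDeviceGetPowerManagementLimit(gpu)
            print(f"current limit {current_mw // 1000} W, allowed {min_mw // 1000}-{max_mw // 1000} W")

            target_mw = 300_000                         # hypothetical 300 W cap, in milliwatts
            if min_mw <= target_mw <= max_mw:
                pynvml.nvmlDeviceSetPowerManagementLimit(gpu, target_mw)
                print(f"power limit set to {target_mw // 1000} W")
        finally:
            pynvml.nvmlShutdown()
        ```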

  • Zarxrax@lemmy.world · ↑41 ↓2 · 2 days ago

    LOL, their demo shows Cyberpunk running at a mere 27fps on the 5090 with DLSS off. Is that supposed to sell me on this product?

    • Blackmist@feddit.uk · ↑5 · 2 days ago

      Their whole gaming business model now is encouraging devs to stick in features that have no hope of rendering quickly, in order to sell this new frame generation rubbish.

          • WereCat@lemmy.world · ↑1 · 40 minutes ago

            Whether it’s 50% or 200%, it’s pointless if the avg FPS can’t even reach the bare minimum of 30.

          • Kat@orbi.camp · ↑3 ↓3 · 1 day ago (edited)

            Not every game has frame gen… and not everybody wants to introduce input lag. So that 50% is 100% sketchy marketing. You can keep your 7 frames; Imma wait for the 6090.

            • Poopfeast420@discuss.tchncs.de · ↑1 · 1 day ago

              Both figures are without DLSS, FG, whatever. Just native 4k with Path Tracing enabled, that’s why it’s so low.

              The sketchy marketing is comparing a 5070 with a 4090, but that’s not what this is about.

              • Kat@orbi.camp · ↑3 ↓2 · 1 day ago

                Like I said, I judge performance without frame gen. The 5090 is not twice as powerful as a 4090 without frame gen, which is what they advertise.

  • kingthrillgore@lemmy.ml · ↑12 ↓1 · 2 days ago

    Two problems, and they are big ones:

    1. The hardware is expensive for a marginal improvement
    2. The games coming out that best leverage features like ray tracing are also expensive and not good
    • frezik@midwest.social · ↑4 ↓3 · 2 days ago

      Nvidia claims the 5070 will give 4090 performance. That’s a huge generational uplift if it’s true. Of course, we’ll have to wait for independent benchmarks to confirm it.

      The best ray tracing games I’ve seen are applying it to older games, like Quake II or Minecraft.

      • lazynooblet@lazysoci.al · ↑8 · 1 day ago

        I expect they can claim it achieves that because, under the hood, DLSS 4 gives it more performance when enabled.

        But is that a fair comparison?

        • Poopfeast420@discuss.tchncs.de · ↑1 · 11 hours ago (edited)

          They’ve already said it’s all because of DLSS 4. The 5070 needs the new 4x FG to match the 4090, although I don’t know if the 4090 has the “old” 2x FG enabled, probably not.

  • deur@feddit.nl · ↑24 ↓5 · 2 days ago

    Okay losers, time for you to spend obscene amounts to do your part in funding the terrible shit company Nvidia.

  • caut_R@lemmy.world · ↑11 · 2 days ago

    My last new graphics card was a 1080. I’ve bought second-hand since then and will keep doing that, cause these prices are…

  • moonlight@fedia.io · ↑9 · 2 days ago

    I think it’s going to be a long time before I upgrade my graphics card with these prices.