• ryper@lemmy.ca · 7 months ago

    The full tweet:

    Majority of gamers are still playing at 1080p and have no use for more than 8GB of memory. Most played games WW are mostly esports games. We wouldn’t build it if there wasn’t a market for it. If 8GB isn’t right for you then there’s 16GB. Same GPU, no compromise, just memory options.

    I don’t think he’s that far off; eSports games don’t have the same requirements as AAA single-player games.

    • BombOmOm@lemmy.world · 7 months ago

      Seriously.

      All AMD had to do here was create a 12GB and a 16GB version (instead of 8GB and 16GB), then gesture at all the reviews calling the RTX 5060 8GB DOA because of its very limiting VRAM quantity.

      8GB VRAM is not enough for most people. Even 1080p gaming is pushing the limits of an 8GB card. And this is all made worse when you consider people will have these cards for years to come.

      Image (and many more) thanks to Hardware Unboxed testing

      • BlameTheAntifa@lemmy.world · 7 months ago

        Exactly. Even if you accept their argument that 8GB is usually enough today for 1080p (and we all know that is only true for high-performance esports-focused titles), it is not true for tomorrow. That makes buying one of those cards today a really poor investment.

      • dormedas@lemmy.dormedas.com · 7 months ago

        Even worse when you consider the cost difference between 8GB and 16GB can’t be that high. If they ate the cost difference and marketed 16GB as the new “floor” for a quality card, they might have eaten NVIDIA’s lunch where they can compete (the low end).

    • inclementimmigrant@lemmy.world (OP) · 7 months ago

      I mean, honestly, yeah. With a simple extra 4 GB of memory they could have won the low end and not screwed over gamers.

      They really seem to have forgotten their roots in the GPU market, which is a damn shame.

  • SharkAttak@kbin.melroy.org · 7 months ago

    So the ones who made video cards do more and more stuff, like they were small separate PCs, and pushed for “1440p Ultra Gaeming!!!1!” are now telling us that nah, 8GB is enough?

  • CrowAirbrush@lemmy.world · 7 months ago

    I just ditched my 8GB card because it wasn’t doing the trick well enough at 1080p, and especially not at 1440p.

    So if I get this straight, AMD agrees that they need to optimize games better.

    I hate upscaling and frame gen with a passion; it never feels right and often looks messy too.

    The First Descendant became a 480p mess when there were a bunch of enemies, even though I have a 24GB card and a pretty decent PC to accompany it.

    I’m now back to heavily modded Skyrim, and damn do I love the lack of upscaling and frame gen. The Oblivion stutters were a nightmare and made me ditch the game within 10 hours.

    • BlameTheAntifa@lemmy.world · 7 months ago

      FSR4 appears to solve a lot of problems with both upscaling and frame gen, not just in FSR but generally. It appears they’ve fixed disocclusion trails, which is a problem even DLSS suffers from.

  • Einar@lemm.ee · 7 months ago

    I wish.

    Send one of these guys by my place. I’ll show them what 8GB cannot do…

    • DriftingLynx@lemmy.ca · 7 months ago

      Or send them to gamers who insist on playing these unoptimized games at max settings: $80 for the game, and then $1000 on a GPU that can run it.

  • je_skirata@lemmy.today · 7 months ago

    I personally think anything over 1080p is a waste of resolution, and I still use a card with 8GB of VRAM.

    That being said, lots of other people want a 16GB card, so let them give you money, AMD!

    • IngeniousRocks (They/She) @lemmy.dbzer0.com · 7 months ago

      My gaming rig is also my media center hooked up to a 4k television. I sit around 7 feet away from it. Anything less than 1440p looks grainy and blocky on my display.

      I can’t game at 4K because of hardware limitations (a 3070 just can’t push it at good framerates), but I wouldn’t say it’s a waste to go above 1080p; use case is an important factor.

        • AndyMFK@lemmy.dbzer0.com · 7 months ago

          Pixel density is pixel density. It doesn’t matter if it’s a TV or a monitor.

          Sure, monitors typically have less input lag, and there are reasons one might choose a monitor over a TV, but the reverse is also true. I chose a 55" TV for my sim racing setup that sits maybe a meter from my face, and there’s no problem with that setup.

            • AndyMFK@lemmy.dbzer0.com · 7 months ago

              Not sure what you think PPI means or how it’s calculated, but it has nothing to do with being a TV or a monitor. It’s the relationship between the number of pixels and the physical size.

              A 34" 1440p monitor will have a lower PPI than a 4K TV of the same size.
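The PPI comparison above is easy to sanity-check with a few lines. This is just a sketch: the 1440p monitor is assumed to be a 3440×1440 ultrawide, and the 4K panel is a hypothetical 34" 16:9 display.

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal length in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# 34" 3440x1440 ultrawide vs. a hypothetical 34" 3840x2160 (4K) panel
print(round(ppi(3440, 1440, 34), 1))  # ~109.7
print(round(ppi(3840, 2160, 34), 1))  # ~129.6
```

Same diagonal, more pixels, higher PPI; whether the panel is sold as a TV or a monitor never enters the formula.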

    • stoy@lemmy.zip · 7 months ago

      1440p on a 27" monitor is the best resolution for work and for gaming.

    • Exec@pawb.social · 7 months ago

      I personally think anything over 1080p is a waste of resolution

      But but Nvidia said at the RTX 3000 announcement that we can now have 8K gaming

  • xploit@lemmy.world · 7 months ago

    Oh, so it’s not that many players are FORCED to play at 1080p because AMD’s and Novideo’s “affordable” garbage can’t cope with anything more to make a game seem smooth? Or, better yet, the game detected we’re running on a calculator here, so it took pity on us and set the graphics bar low.

    • insomniac_lemon@lemmy.cafe · 7 months ago

      Hey, give a little credit to ~~our public schools~~ (poorly-optimized eye-candy) new games! (where 10–20 GiB is now considered small)

  • kbal@fedia.io · 7 months ago

    If he’d chosen his words more carefully and said “many” rather than “most”, nobody would have a reason to disagree.

  • Fizz@lemmy.nz · 7 months ago

    I’ve got 16GB of VRAM and a 2K monitor, and this tracks pretty accurately. I almost never use over 8GB. The only games where I can break 10GB are ones with a setting (designed for old PCs) that loads all the textures into VRAM.

    • FeelzGoodMan420@eviltoast.org · 7 months ago

      Weird. You must be playing old games. Most modern games go over 8GB at 1440p, no problem, and have been for at least a few years now.

      • DtA@lemmy.ca · 7 months ago

        KSP uses RAM, not VRAM. I play RP-1 with 8GB of VRAM no problem; 32GB of RAM isn’t enough, though.

          • DtA@lemmy.ca · 7 months ago

            I don’t think I’ve ever seen a game use more RAM than KSP with mods, though. Holy moly.

            • FeelzGoodMan420@eviltoast.org · 7 months ago

              Cyberpunk with 4K texture packs has entered the chat.

              Edit: also the AI-upscaled texture pack for Starfield, and the official 4K texture pack for Warhammer 40,000: Space Marine 2. All go over 16GB of VRAM, even at 1440p.