Do PC gamers feel 12GB of VRAM is simply not enough for the money in 2024?

  • Kazumara@feddit.de · 10 months ago

    $600 for a card without 16 GB of VRAM is a big ask. I think an RX 7800 XT for $500 will serve you well for longer.

    • NIB@lemmy.world · 10 months ago

      12 GB of VRAM is not a bottleneck in any current game at reasonable settings. There is no playable game/settings combination where the 7800 XT's 16 GB offers any advantage. Or do you think a 15 fps average is more playable than a 5 fps average (because the 4070 Super is VRAM-bottlenecked)? Is this indicative of future bottlenecks? Maybe, but I wouldn't be so sure.

      The 4070 Super offers significantly better ray-tracing performance, much lower power consumption, better streaming/encoding features, and even slightly better rasterization than the 7800 XT. Are those worth giving up for €100 less and 4 GB more VRAM? For most people they aren't.

      AMD's offerings are competitive, not better. And the internet should stop sucking their dick, especially when most of the internet, including tech-savvy people, doesn't even use AMD GPUs. Hell, LTT even made a series of videos about how they had to “suffer” using AMD GPUs, yet they usually join the Nvidia-shitting circlejerk.

      • Kazumara@feddit.de · 10 months ago

        Is this indicative of future potential bottlenecks? Maybe but i wouldnt be so sure.

        This is exactly what I expect. I saw what happened to my friends with their GTX 970 when 3.5 GB of VRAM wasn't enough anymore. Even though the cards were still rasterizing quickly enough, they weren't usable for certain games anymore. That's why I now make sure I get enough VRAM to extend the useful service life of my cards.

        And I'm not just talking about buying AMD, I actually do buy them. I started with an HD 5850 with 1 GB, then got my friend's HD 5870, also with 1 GB (I don't remember whether I ran it in CrossFire or just swapped it in). Then two of my friends each sold me their HD 7850 with 2 GB for cheap and I ran CrossFire, then I bought a new R9 380 with 4 GB when a game that mattered to me at the time couldn't handle CrossFire well, then a used RX 580 with 8 GB, and finally the RX 6800 with 16 GB two years ago.

        At some point I also bought a used GTX 960 because we were doing some CUDA work at university, but that was pretty late, when they weren't current anymore, and it only ever ran in my Linux server.

  • ReallyActuallyFrankenstein@lemmynsfw.com · 10 months ago

    Yep, it’s the RAM, but also just a mismatched value proposition.

    I think it’s clear at this point Nvidia is trying to have it both ways and gamers are sick of it. They used pandemic shortage prices as an excuse to inflate their entire line’s prices, thinking they could just milk the “new normal” without having to change their plans.

    But when you move the x070 series out of the mid-tier price bracket ($250-450, let's say), you'd better meet a more premium standard. Instead, they're throwing mid-tier RAM into a premium-priced product that most customers still feel should be mid-tier priced. It also doesn't help that it's at a time when people generally just have less disposable income.

  • FiskFisk33@startrek.website · 10 months ago

    GPUs haven’t been reasonably priced since the 1000 series.

    And now there’s no coin mining promising some money back.

    • Sibbo@sopuli.xyz · 10 months ago

      You mean Nvidia GPUs? I got my 6750XT for 500€, and I think it’s a good price for the performance I get.

    • 9488fcea02a9@sh.itjust.works · 10 months ago

      The new mining is AI. TSMC is at max capacity; they're not going to waste too many wafers on gaming GPUs when AI accelerators are selling for $30k each.

  • wooki@lemmynsfw.com · 10 months ago

    If they don't drop the price by at least 50%, goodbye Nvidia.

    So no more Nvidia. Hello Intel.

    • lemmyvore@feddit.nl · 10 months ago

      I don't think they care. In fact I think they're going to exit the consumer market eventually; it's just peanuts to them, and the only reason they still cater to it is to use it for field testing (and you're paying them for the privilege, which is quite ironic).

      • Kyrgizion@lemmy.world · 10 months ago

        This. Corporations are lining up in droves for GPUs to run AI applications. Nvidia doesn't care about regular consumers because we aren't even their primary market anymore, just a bonus to be squeezed.

        • wewbull@feddit.uk · 10 months ago

          If Nvidia pivots completely out of the consumer space, which I can totally see coming, they make the company entirely dependent on the AI hype train. That's a fairly precarious position in my eyes. I've yet to see an actual application it solves with enough reliability to be more than a curiosity.

          • willis936@lemmy.world · 10 months ago

            They leaned pretty hard into mining when that was on the table. They chase trends and alienate their base for sure; any way to juice near-term profits, and they will. It's working out for them right now, so surely it will forever.

  • caseyweederman@lemmy.ca · 10 months ago

    Remember when EVGA decided they would rather leave the market entirely than spend one more day working with Nvidia?

  • CosmoNova@lemmy.world · 10 months ago

    I mean, yeah, when I'm searching for GPUs I specifically filter out anything with less than 16 GB of VRAM. I wouldn't even consider buying one for that reason alone.

  • Dra@lemmy.zip · 10 months ago

    I haven't paid attention to GPUs since I got my 3080 on release day back in Covid. Why has the acceptable amount of VRAM suddenly doubled versus 4 years ago? I don't struggle to run any single game on max settings at high frame rates @ 1440p, so what's the benefit of 20 GB of VRAM?

    • Hadriscus@lemm.ee · 10 months ago

      Perhaps not the biggest market, but consumer cards (especially Nvidia's) have been the preferred hardware in the offline rendering space (i.e. animation and VFX) for a good few years now. They're the most logical investment for freelancers and small-to-mid studios thanks to hardware ray tracing. CUDA and later OptiX may be anecdotal on the gaming front, but they completely changed the game over here.

    • Asafum@feddit.nl · 10 months ago

      Lmao

      We have your comment: “what am I doing with 20 GB of VRAM?”

      And one comment down: “it's actually criminal there's only 20 GB of VRAM.”

    • AlijahTheMediocre@lemmy.world · 10 months ago

      If only game developers optimized their games…

      The newest hardware is getting powerful enough that devs are banking on people just buying better cards to play their games.

    • Eccitaze@yiffit.net · 10 months ago

      An actual technical answer: apparently it's because, while the PS5 and Xbox Series X are technically regular x86-64 machines, their design lets the GPU and CPU share a single pool of memory with no loss in performance. That makes it easy to allocate a huge amount of RAM for the GPU to store textures very quickly. But it also means that as the games industry shifts from developing for the PS4/Xbox One first (both of which have separate pools of memory for CPU and GPU) to the PS5/XSX first, VRAM requirements are spiking, because it's much easier to port to PC if you keep the assumption that the GPU can hold 10-15 GB of texture data at once than to refactor your engine to reduce VRAM usage.
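      A back-of-the-envelope sketch shows why 10-15 GB of resident textures is plausible; every count below is a made-up illustrative number, not taken from any real engine:

```python
# Rough estimate of resident texture memory for one scene.
# Illustrative only: real engines stream mips and share textures.

def texture_bytes(width: int, height: int, bytes_per_pixel: float,
                  mip_overhead: float = 4 / 3) -> float:
    """One texture plus its full mipmap chain (~33% extra)."""
    return width * height * bytes_per_pixel * mip_overhead

# BC7 block compression stores roughly 1 byte per pixel.
one_4k = texture_bytes(4096, 4096, 1.0)

maps_per_material = 4      # albedo, normal, roughness/metallic, AO (assumed)
materials_resident = 150   # hypothetical count for a dense scene

total_gb = one_4k * maps_per_material * materials_resident / 2**30
print(f"~{total_gb:.1f} GB of textures resident")  # ~12.5 GB
```

      Halve the texture resolution and the total drops to a quarter, which is roughly the refactoring work a last-gen port had already done and a current-gen port skips.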

    • Blackmist@feddit.uk · 10 months ago

      Current gen consoles becoming the baseline is probably it.

      As games running on last gen hardware drop away, and expectations for games rise above 1080p, those Recommended specs quickly become an Absolute Minimum. Plus I think RAM prices have tumbled as well, meaning it’s almost Scrooge-like not to offer 16GB on a £579 GPU.

      That said, I think the pricing is still much more of an issue than the RAM. People just don’t want to pay these ludicrous prices for a GPU.

    • Space_Racer@lemm.ee · 10 months ago

      I’m maxed on VRAM in VR for the most part with a 3080. It’s my main bottleneck.

    • AProfessional@lemmy.world · 10 months ago

      They clearly believe customers will always buy Nvidia over AMD, so why bother competing? Just make an annoyingly segmented lineup.

  • trackcharlie@lemmynsfw.com · 10 months ago

    Less than 20 GB of VRAM in 2024?

    The entire 40-series line of cards should be used as evidence against Nvidia in a lawsuit over the intentional creation of e-waste.

    • BorgDrone@lemmy.one · 10 months ago

      The real tragedy is that PCs still have to make do with discrete graphics cards that have separate VRAM.

  • Altima NEO@lemmy.zip · 10 months ago

    The RAM is so lame. It really needed more.

    Performance exceeds the 3090, but it's limited by 12 GB of RAM.

  • Binthinkin@kbin.social · 10 months ago

    You should all check prices comparing dual-fan 3070s to 4070s; they're a $40 difference on Amazon. Crazy to see. They completely borked their pricing scheme trying to get whales and crypto miners to suck their 40 series dry, and wound up getting blue-balled hard.

    Aren’t they taking the 4080 completely off the market too?

    • elvith@feddit.de · 10 months ago

      I have a 2060 Super with 8 GB. The VRAM is currently enough for FHD gaming, or at least isn't the bottleneck, so 12 GB might be fine for that use case. BUT I'm also toying around with AI models, and some current models already ask for 12 GB of VRAM to run the complete model. It's not that I'd never get a 12 GB card as an upgrade, but you can be sure I'd research all the alternatives first, and it wouldn't be my first choice but a compromise, since it wouldn't future-proof me in this regard.
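      The arithmetic on the AI side is easy to sketch; the 7-billion-parameter model below is a hypothetical example, not any specific release:

```python
# VRAM needed just to hold a model's weights. Activations, KV cache and
# framework overhead come on top, so real requirements are higher.

BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "int8": 1, "int4": 0.5}

def weight_vram_gb(params_billion: float, dtype: str) -> float:
    """GiB occupied by the weights alone at a given precision."""
    return params_billion * 1e9 * BYTES_PER_PARAM[dtype] / 2**30

# A hypothetical 7-billion-parameter model:
print(f"fp16: {weight_vram_gb(7, 'fp16'):.1f} GB")  # ~13.0 GB -- spills past 12 GB
print(f"int8: {weight_vram_gb(7, 'int8'):.1f} GB")  # ~6.5 GB -- fits on an 8 GB card
```

      This is why quantized versions of the same model exist: dropping precision halves the footprint each step down, at some cost in quality.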