Do PC gamers feel 12GB of VRAM is simply not enough for the money in 2024?
$600 for a card without 16 GB of VRAM is a big ask. I think getting an RX 7800 XT for $500 will serve you well for longer.
12 GB of VRAM is not a bottleneck in any current game on reasonable settings. There is no playable game/settings combination where the 7800 XT's 16 GB offers any advantage. Or do you think a 15 fps average is more playable than a 5 fps average (because the 4070 Super is VRAM-bottlenecked)? Is this indicative of potential future bottlenecks? Maybe, but I wouldn't be so sure.
The 4070 Super offers significantly better ray tracing performance, much lower power consumption, better streaming/encoding features, and even slightly better rasterization performance than the 7800 XT. Are those things worth sacrificing for €100 less and 4 GB more VRAM? For most people they aren't.
AMD's offerings are competitive, not better. And the internet should stop sucking their dick, especially when most of the internet, including tech-savvy people, don't even use AMD GPUs. Hell, LTT even made a series of videos about how they had to "suffer" using AMD GPUs, yet they usually join the circlejerk of shitting on Nvidia.
$100 less IS the advantage.
Is this indicative of potential future bottlenecks? Maybe, but I wouldn't be so sure.
This is exactly what I expect. I saw what happened to my friends with their GTX 970s when 3.5 GB of VRAM wasn't enough anymore. Even though the cards were still rasterizing quickly enough, they weren't useful for certain games anymore. That's why I now make sure to go for enough VRAM to extend the useful service life of my cards.
And I'm not just talking about buying AMD, I actually do buy them. I first had the HD 5850 with 1 GB, then got my friend's HD 5870, also with 1 GB (don't remember if I ran it in CrossFire or just replaced the 5850), then two of my friends each sold me their HD 7850 with 2 GB for cheap and I ran CrossFire, then I bought a new R9 380 with 4 GB when a game that was important to me at the time couldn't deal with CrossFire well, then I bought a used RX 580 with 8 GB, and finally the RX 6800 with 16 GB two years ago.
At some point I also bought a used GTX 960 because we were doing some CUDA stuff at university, but that was pretty late, when it wasn't current anymore, and it only went into my Linux server.
Yep, it’s the RAM, but also just a mismatched value proposition.
I think it’s clear at this point Nvidia is trying to have it both ways and gamers are sick of it. They used pandemic shortage prices as an excuse to inflate their entire line’s prices, thinking they could just milk the “new normal” without having to change their plans.
But when you move the x070 series out of the mid-tier price bracket ($250-450, let's say), you'd better meet a more premium standard. Instead, they're throwing mid-tier RAM into a premium-priced product that most customers still feel should be mid-tier priced. It also doesn't help that this comes at a time when people generally have less disposable income.
GPUs haven’t been reasonably priced since the 1000 series.
And now there’s no coin mining promising some money back.
You mean Nvidia GPUs? I got my 6750 XT for €500, and I think it's a good price for the performance I get.
The new mining is AI… TSMC is at max capacity. They're not going to waste too many wafers making gaming GPUs when AI accelerators are selling for $30k each.
If they don't drop the price by at least 50%, goodbye Nvidia.
So no more Nvidia. Hello, Intel.
I don't think they care. In fact, I think they're going to exit the consumer market eventually; it's just peanuts to them, and the only reason they're still catering to it is to use it as field testing (and you're paying them for the privilege, which is quite ironic).
This. Corporations are lining up in droves for GPUs to run AI applications. Nvidia doesn't care about regular consumers because we aren't even their primary market anymore, just a bonus to be squeezed.
If Nvidia pivots completely out of the consumer space, which I can totally see coming, they're making the company totally dependent on the AI hype train. That's a fairly precarious position in my eyes. I've yet to see an actual application that it solves reliably enough to be more than a curiosity.
They leaned pretty hard into mining when that was on the table. They absolutely chase trends and alienate their base; any way to juice near-term profits and they'll take it. It's working out for them right now, so surely it will forever.
Remember when eVGA decided they would rather leave the market entirely than spend one more day working with Nvidia?
I wish they would have started putting out AMD products. PowerColor just doesn't feel like a flagship partner the way EVGA was to Nvidia.
I would’ve actually switched to AMD if EVGA did
Really?
Yup. It was something like 90% of their revenue, but 25% of their profit.
And now they have 0 revenue and 0 profit.
Yeah sadly they weren’t gonna be able to stay the same with their remaining products being expensive niche case motherboards and good power supplies. Hopefully the employees got good gigs elsewhere at least.
They still exist. However their website also says they’re “America’s #1 NVIDIA partner,” so…
They do seem to be winding down operations as a whole, though. It’s a deliberate choice on the owner’s part.
4070 is $600. That seems like total shit to me. That’s why.
Is this the one they nerfed so they could sell it in China to get around the US AI export rules?
Nope, that’s the 4090.
I mean, yeah, when I'm searching for GPUs I specifically filter out anything with less than 16 GB of VRAM. I wouldn't even consider buying it for that reason alone.
And here I'm thinking upgrading from two 512 MB cards to a GTX 1660 SUPER with 6 GB of VRAM is going to be good for another 10 years. The heck does someone need 16 gigs for?
Gaming in 4K, or AI (e.g. Stable Diffusion or language models).
@Hypx @technology GPUs are still too expensive for me
I haven't paid attention to GPUs since I got my 3080 on release day back in Covid. Why has the acceptable level of VRAM suddenly doubled compared to 4 years ago? I don't struggle to run a single game on max settings at high frame rates at 1440p, so what's the benefit of 20 GB of VRAM?
Perhaps not the biggest market, but consumer cards (especially Nvidia's) have been the preferred hardware in the offline rendering space (i.e. animation and VFX) for a good few years now. They're the most logical investment for freelancers and small-to-mid studios thanks to hardware ray tracing. CUDA and later OptiX may be anecdotal on the gaming front, but they completely changed the game over here.
GPU rendering and AI.
Lmao
We have your comment: what am I doing with 20 GB of VRAM?
And one comment down: it's actually criminal that there is only 20 GB of VRAM.
Lol
If only game developers optimized their games…
The newest hardware is getting powerful enough that devs are banking on people just buying better cards to play their games.
An actual technical answer: apparently it's because, while the PS5 and Xbox Series X are technically regular x86-64 machines, their design lets the GPU and CPU share a single pool of memory with no loss in performance. That makes it easy to allocate a shitload of RAM for the GPU to store textures very quickly. It also means that as the games industry shifts from developing for the PS4/Xbox One first (both of which had far less memory to go around, about 8 GB total) to the PS5/XSX first, VRAM requirements are spiking, because it's a lot easier to port to PC if you just keep assuming the GPU can hold 10-15 GB of texture data at once instead of refactoring your code to reduce VRAM usage.
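If you want a rough sense of how texture data alone eats VRAM, here's a back-of-the-envelope sketch (the texture count, resolution, and compression figures are illustrative assumptions, not numbers from any particular game):

```python
# Rough texture VRAM estimate; all inputs are illustrative assumptions.
def texture_vram_gib(count, width=4096, height=4096, bytes_per_texel=1.0, mip_overhead=4 / 3):
    """Approximate VRAM footprint in GiB for `count` resident textures.

    bytes_per_texel=1.0 roughly matches BC7 block compression;
    uncompressed RGBA8 would be ~4.0. mip_overhead of ~4/3 accounts
    for a full mipmap chain.
    """
    return count * width * height * bytes_per_texel * mip_overhead / 1024**3

# ~500 unique 4K compressed textures resident at once:
print(f"{texture_vram_gib(500):.1f} GiB")  # ≈ 10.4 GiB, before meshes, buffers, and render targets
```

That's how you land in the 10-15 GB ballpark without doing anything exotic.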
Perfect answer thank you!
Current gen consoles becoming the baseline is probably it.
As games running on last gen hardware drop away, and expectations for games rise above 1080p, those Recommended specs quickly become an Absolute Minimum. Plus I think RAM prices have tumbled as well, meaning it’s almost Scrooge-like not to offer 16GB on a £579 GPU.
That said, I think the pricing is still much more of an issue than the RAM. People just don’t want to pay these ludicrous prices for a GPU.
I’m maxed on VRAM in VR for the most part with a 3080. It’s my main bottleneck.
Wait, they didn’t put the 4070 super at 16 GB?
They clearly believe customers will always buy Nvidia over AMD, so why bother competing? Just make an annoyingly segmented lineup.
Less than 20 GB of VRAM in 2024?
The entire 40-series line of cards should be used as evidence against Nvidia in a lawsuit over the intentional creation of e-waste.
The real tragedy is that PCs still have to make do with discrete graphics cards that have separate VRAM.
The RAM is so lame. It really needed more.
Performance exceeding the 3090, but limited by 12 gigs of VRAM.
You all should compare prices of dual-fan 3070s and 4070s; there's only a $40 difference on Amazon. Crazy to see. They completely borked their pricing scheme trying to get whales and crypto miners to suck their 40 series dry, and wound up getting blue-balled hard.
Aren’t they taking the 4080 completely off the market too?
My RTX 4060 Ti has 16 GB of VRAM. What on earth makes them think people would go for 12 GB?
I have a 2060 Super with 8 GB. The VRAM is currently enough for FHD gaming, or at least isn't the bottleneck, so 12 GB might be fine for that use case. BUT I'm also toying around with AI models, and some current models already ask for 12 GB of VRAM to run the complete model. It's not that I would never get a 12 GB card as an upgrade, but you can be sure I'd research all the alternatives first, and it wouldn't be my first choice, just a compromise, since it wouldn't future-proof me in this regard.
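For context on the AI side, a quick sketch of the arithmetic (the model sizes and precisions are illustrative assumptions; real requirements also depend on context length, activations, and framework overhead):

```python
# Rough VRAM needed just to hold a model's weights; ignores activations and KV cache.
def weights_vram_gib(params_billion, bytes_per_param=2.0):
    """2.0 bytes/param ~ fp16/bf16, 1.0 ~ int8, 0.5 ~ 4-bit quantization."""
    return params_billion * 1e9 * bytes_per_param / 1024**3

for name, params in [("7B model", 7), ("13B model", 13)]:
    print(f"{name}: fp16 ≈ {weights_vram_gib(params):.1f} GiB, "
          f"4-bit ≈ {weights_vram_gib(params, 0.5):.1f} GiB")
# A 7B model in fp16 (~13 GiB) already overflows a 12 GB card; 13B needs ~24 GiB.
```

Which is why 16 GB feels like the safer floor if you dabble in local models at all.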
Do you think there is a large overlap of people who buy $600-$900 cards and like 1080p?