Wednesday, April 12th 2023
AMD Plays the VRAM Card Against NVIDIA
In a blog post, AMD has pulled the VRAM card against NVIDIA, telling potential graphics card buyers that they should consider AMD over NVIDIA, because current and future games will require more VRAM, especially at higher resolutions. It's no secret that at least some of the PC gaming crowd has reached something of a consensus that NVIDIA is being too stingy when it comes to VRAM on its graphics cards, and AMD is clearly trying to cash in on that sentiment with its latest blog post. AMD shows the VRAM usage in games such as Resident Evil 4—with and without ray tracing at that—The Last of Us Part I and Hogwarts Legacy, all games that use 11 GB of VRAM or more.
AMD does have a point here, but as the company has yet to launch anything below the Radeon RX 7900 XT in the 7000-series, AMD is mostly comparing its 6000-series cards with NVIDIA's 3000-series cards, most of which are getting hard to purchase and are potentially less interesting for those looking to upgrade their system. That said, AMD also compares its two 7000-series cards to the NVIDIA RTX 4070 Ti and the RTX 4080, claiming up to a 27 percent lead over NVIDIA in performance. Based on TPU's own tests of some of these games, albeit most likely using different test scenarios, the figures provided by AMD don't seem to reflect real-world performance. It's also surprising to see AMD claim its RX 7900 XTX beats NVIDIA's RTX 4080 in ray tracing performance in Resident Evil 4 by 23 percent, where our own tests show NVIDIA in front by a small margin. Make of this what you will, but one thing is fairly certain: future games will require more VRAM, and the need for a powerful GPU most likely isn't going away either.
Source:
AMD
218 Comments on AMD Plays the VRAM Card Against NVIDIA
Having reliable drivers like they do now will make me give them a try.
The things stopping me: better support for DLSS and better RT on NVIDIA; DLSS 2.1 looks better than FSR 2.1, and NVIDIA has Frame Generation for CPU bottlenecks in some games.
Reflex can be replicated by capping the framerate (rough sketch below), so that is a non-issue.
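(A toy illustration of what I mean by capping, in Python; my own sketch, nothing like a real engine. Keeping the framerate just below what the GPU can deliver stops frames from queuing ahead of the display, which is where most of the Reflex-style latency saving comes from.)

import time

TARGET_FPS = 141                  # example cap, just under a 144 Hz refresh
FRAME_BUDGET = 1.0 / TARGET_FPS   # seconds allotted to each frame

def capped_loop(simulate_and_render, frames=1000):
    # Sleeping away the leftover budget keeps the GPU from running ahead
    # and queuing pre-rendered frames, which is what adds input latency.
    for _ in range(frames):
        start = time.perf_counter()
        simulate_and_render()   # placeholder for the game's per-frame work
        leftover = FRAME_BUDGET - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)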
Also, AMD sometimes has quirky performance in some games, usually RT performance in new titles, and has multiple outliers at times which only get fixed some way down the line.
NVIDIA, meanwhile, has always dialed in optimization for all games, even on beta launch days.
The only major caveat with NVIDIA for me at the moment is the lacking VRAM, and a minor con is the pricing.
This is the first time I am considering AMD since the ATI days, so well over a decade.
AMD had better capitalise on NVIDIA's big mistake with the lacking VRAM.
I am sure I am not the only one who thinks this way.
Currently that is only the RX 7900, which is out of my price range, as I am looking at spending at most €700 incl. tax.
Waiting for the RX 7800 in June.
That is, as long as NVIDIA doesn't release a refresh by then to address the VRAM, with a bigger chip coming down a tier.
They can do that when they are selling what is at best a 60-tier card as a 70-tier one.
Seriously, if you're looking for an upgrade, consider all the above. ;)
That's about it... tho.
The only thing I forgot: I am really excited about RTX Remix for old games.
That's one thing that is keeping me away from going with AMD.
I am debating the RX 6950 XT, but I may just wait and see until June for the RX 7800.
I've seen both DLSS 2.0/2.1 and FSR 2.0 first-hand, and can't tell the difference. You can also check out game reviews here on TPU and see if there's any difference yourself. Just be careful: you may spot small details on a still image (I'm not saying that you will, though) which would be unrecognisable in a live game. Yeah, the tech looks promising. I'm just not sure what will become of it. We'll see. The 6950 XT is a fine card, and its drivers have matured to the point where you can expect zero issues. Waiting for future releases is always a gamble, in my opinion.
- Low performance/€ improvement (so far)
- Growing tensions between China and Taiwan
Bonus: the 7700/7700 XT will likely have similar performance to the 6800 XT, but with (only) 12 GB VRAM
AMD had better back up the glad tidings with abundant VRAM offerings in its 7600/7800-tier cards. Hopefully, NVIDIA will.... nah, toss that, "obsolescence" pays well. The green captain is seeing too much green to be concerned with the minority.
I'm still praying for some messianic intervention ... 16GB+ XTX/4080s rolling for $700/$800. Intel the messiah? Perhaps AMD playing santa with weighty discounts? Or maybe the Green monster with its pandemic-prolonged-price-strategy catching the covid bug (they're certainly asking for it)? who knows.
In just 3 months, we've had 4 large games where 8 GB started being a serious problem. I predict that the trend isn't going to stop at all during the next two years.
We'll see just how long the 10 GB 3080 lasts, and I think the 12 GB 4070/Ti will be worse. At least with the 3080 you had 2 good years of PS4-era holdovers until the requirements climbed hard. The 4070s feel sufficient "for now," I'm sure, but will they even last 2 years? I highly doubt it.
GPUs increase their performance drastically every gen, while CPUs do not. In 3 months, 4 console ports have been released (all of them AMD-sponsored, as someone on the forum would say, but I think it's a coincidence); all of them were broken at launch, all of them got or will get a patch to make them run properly, and none of them justify the VRAM or any other requirement for the graphics they deliver.
At the same time, we had games like A Plague Tale: Requiem and CP77 (even in Overdrive) that require less than half the VRAM of the aforementioned games and look light years better.
When Crysis released, no one could play it and no one had a problem with that.
Because we all knew it was a game where the dev just put stuff into a graphics engine that couldn't handle it, just to show off what they could achieve. Even today, Crysis or ARMA 3 etc. cannot be played at high fps because of their crap engines.
NVIDIA doesn't put in much VRAM because their cards have a ton of uses apart from gaming.
AMD's VRAM is useless to nearly all professional software out there, so it's a cost-free advantage and a line in the presentations and ads.
The term "slow" is not accurate, though. The truth is that the CPUs are not fully utilised either, because of development issues.
You're right, most games add RT shadows, reflections, AO etc. when it's not necessary.
E.g. RDR2, TLOU, God of War, Uncharted, A Plague Tale: Innocence/Requiem, etc. You don't need RT if you have the time/money and develop the game properly.
But at the same time, you can't ask for a truckload of VRAM without delivering a next-gen result.
Also, in games that heavily use RT (where the RT actually makes a difference), the 4080 is 25-45% faster.
DLSS 3 is far superior to FSR.
In addition, the 4080 has lower power consumption.
In Cyberpunk, for example, with RT Ultra at 4K (DLSS + FG), the 4080 gets 110 fps and the 7900 XTX just 20 fps at the same graphics settings; that is 5.5 times faster, and that is the real-world difference between the two. As you can see, the 4080 offers so much more for only 10% more money; this gen, AMD managed to be greedier than NVIDIA.
Personally, I consider FG frame rates irrelevant for comparison. Not really.
Fair enough - I mostly shop at Scan, that's why I was comparing prices from there. A 5% difference might actually be worth it.
Right, it's the same things I've been reading to the point of complete distaste, so I'm just going to put it in a nice image:
Since the denialists are in full force right now, and that logic isn't very useful against them, I'm just going to ignore these pointless ramblings and let time run its course.
Within about a year or so, I expect that all of them will have repeated "bad pc port" or "another AMD sponsored title" until their tongues fall off.
And then they'll do as we've been advocating for all this time: buy a 16 GB VRAM card and end this charade.
Oh, and BTW, I've been told that I'm extremely optimistic about 16 GB being enough in 3-4 years' time. It's more of an "at least 16 GB" situation according to some.
(also no need to tell me about the graph's inaccuracy, I didn't make it accurate because it's not about the numbers but the trend, and the trend is very, very clear)
www.pugetsystems.com/recommended/Recommended-Systems-for-Adobe-Premiere-Pro-CC-143/Hardware-Recommendations
The RTX 4080's 16 GB VRAM is sufficient, like the professional RTX A4000's 16 GB VRAM.
An "8 GB VRAM" card is just a toy video card.
Your "AMD's VRAM is useless to nearly any professional software out there" claim is flawed, since professional video cards like the GA104-based RTX A4000 have 16 GB VRAM.
VRAM affects the game's primary artwork quality. For the RTX 4060 Ti / 4070 / 4070 Ti's price range, I wouldn't be looking at NVIDIA. I can afford the RTX 4080's 16 GB VRAM. Lesser NVIDIA SKUs with 8 GB VRAM didn't deliver the minimum PS5 experience, e.g. in TLOU Part I. TLOU Part I and Resident Evil 4 Remake are not the only games that have killed 8 GB VRAM.
The RTX 4080's 16 GB VRAM matches the minimum for professional cards like the RTX A4000 16 GB. The RTX 4080's AD103 chip is reused for mobile RTX 4090 SKUs.
The RTX 4080's 16 GB VRAM debunks the pro-NVIDIA shills' "8 GB VRAM is enough" arguments. VRAM affects the game's primary artwork quality. The PS5 / XSX console GPU equivalent on the PC is at least the RX 6700 XT 12 GB; the 6500 XT is below the PS5's specs. NVIDIA didn't combine the best aspects of the RTX 3070 Ti's 6144 CUDA cores with the RTX 3060's 12 GB VRAM, since that would be close to the GA104-based RTX A4000 16 GB VRAM.
The RTX 3060's 12 GB comes from using 2 GB-density memory chips (quick math below).
There are 16 GB VRAM mods for the RTX 3070, and they work.
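(My own back-of-the-envelope sketch of how that works: GDDR6/GDDR6X chips each have a 32-bit interface, so the bus width fixes the chip count, and the chip density fixes the total capacity.)

def vram_capacity_gb(bus_width_bits, chip_density_gb):
    # Each GDDR6/GDDR6X chip sits on 32 bits of the memory bus,
    # so total VRAM = (bus width / 32) chips x per-chip density.
    chips = bus_width_bits // 32
    return chips * chip_density_gb

print(vram_capacity_gb(192, 2))  # RTX 3060: 6 chips x 2 GB = 12 GB
print(vram_capacity_gb(256, 1))  # RTX 3070 stock: 8 chips x 1 GB = 8 GB
print(vram_capacity_gb(256, 2))  # RTX 3070 with the 2 GB-chip mod = 16 GB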
The issue is really simple: it doesn't matter how much VRAM a video card has, anyone can come up with a texture that won't fit. Hardware limits will always be surpassed by software, simply because software changes faster (see the quick texture math below). It is perfectly reasonable for me to require 4, 6 or 8 GB VRAM, while you won't accept anything under 16 GB. It doesn't make either of us wrong (unless we buy those cards to play Solitaire, maybe).
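(To put numbers on that, my own example: the raw size of a single uncompressed RGBA8 texture grows quadratically with resolution, so a texture that blows any given VRAM budget is trivial to construct.)

def texture_mib(width, height, bytes_per_texel=4, mipmaps=True):
    # Uncompressed RGBA8 is 4 bytes per texel; a full mip chain
    # adds roughly one third on top of the base level.
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if mipmaps else base
    return total / (1024 ** 2)

for size in (2048, 4096, 8192, 16384):
    print(f"{size}x{size}: ~{texture_mib(size, size):.0f} MiB")
# 2048: ~21 MiB, 4096: ~85 MiB, 8192: ~341 MiB, 16384: ~1365 MiB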
No need to test your sarcasm on me in the meantime, I'm afraid the time for sarcasm is past and now it's only about waiting for Irony to come and establish its kingdom yet again.
Source: MEGAsizeGPU
This time nobody "had the balls" to do it.
www.techpowerup.com/gpu-specs/?generation=GeForce+500&sort=generation
The GTX 570 SKU uses a defect-recovered (cut-down) GF110 chip.
TechSpot just did a new review of VRAM usage, posted about 50 minutes ago.
AMD is such good value, man. I would never buy an 8 GB VRAM card in today's gaming world. Wild that NVIDIA still does this.