Wednesday, April 12th 2023
AMD Plays the VRAM Card Against NVIDIA
In a blog post, AMD has pulled the VRAM card against NVIDIA, telling potential graphics card buyers that they should consider AMD over NVIDIA because current and future games will require more VRAM, especially at higher resolutions. It's no secret that at least part of the PC gaming crowd considers NVIDIA too stingy with VRAM on its graphics cards, and AMD is clearly trying to cash in on that sentiment with its latest blog post. AMD shows VRAM usage in games such as Resident Evil 4 (with and without ray tracing, at that), The Last of Us Part I and Hogwarts Legacy, all games that use 11 GB of VRAM or more.
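For readers who want to check how their own system compares with these figures, VRAM allocation can be polled while a game runs. Below is a minimal sketch using the pynvml bindings (NVIDIA cards only, and our illustration rather than anything from AMD's post); keep in mind that allocated VRAM is not the same as the amount a game strictly requires.

    # Snapshot of current VRAM allocation on the first NVIDIA GPU.
    # Requires the nvidia-ml-py package: pip install nvidia-ml-py
    import pynvml

    pynvml.nvmlInit()
    try:
        handle = pynvml.nvmlDeviceGetHandleByIndex(0)
        name = pynvml.nvmlDeviceGetName(handle)
        if isinstance(name, bytes):  # older bindings return bytes
            name = name.decode()
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # values in bytes
        gib = 1024 ** 3
        print(f"{name}: {mem.used / gib:.1f} GiB allocated of {mem.total / gib:.1f} GiB")
    finally:
        pynvml.nvmlShutdown()

Running this while a game is loaded gives a rough upper bound on what the game has grabbed, though engines often allocate more than they actively use.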
AMD does have a point here, but as the company has yet to launch anything below the Radeon RX 7900 XT in the 7000-series, AMD is mostly comparing its 6000-series cards with NVIDIA's 3000-series cards, most of which are getting hard to purchase and are potentially less interesting for those looking to upgrade their system. That said, AMD also compares its two 7000-series cards to the NVIDIA RTX 4070 Ti and the RTX 4080, claiming up to a 27 percent lead over NVIDIA in performance. Based on TPU's own tests of some of these games, albeit most likely using different test scenarios, the figures provided by AMD don't seem to reflect real-world performance. It's also surprising to see AMD claim its RX 7900 XTX beats NVIDIA's RTX 4080 in ray tracing performance in Resident Evil 4 by 23 percent, where our own tests show NVIDIA in front by a small margin. Make of this what you will, but one thing is fairly certain: future games will require more VRAM, though the need for a powerful GPU isn't going to go away either.
Source:
AMD
218 Comments on AMD Plays the VRAM Card Against NVIDIA
The PS4 launched around the same time as the 290X and the 780 Ti. The PS4 has 8 GB of RAM, of which more than 4 GB is used as VRAM. The 290X has slightly better legs, and the 8 GB variants have much better legs.
Three to four years into the gen, once cross-gen games stop getting made, VRAM requirements spike above 4 GB, so stuff like the 780 Ti gets left behind by its Titan variant and the 980 gets left behind by the 390X, while the 980 Ti holds on far better than the Fury X.
NV then releases a new gen (the 1000 series back then, the 5000 series I expect this time) that addresses the VRAM thing properly, and those cards will be fine for an above-console experience until we are past the cross-gen phase of the PS6 generation, when requirements will increase yet again.
EDIT: Then the people who went with the 3090/4090/7900 XTX will be sitting smugly as those on 16 GB cards start to struggle, just like 1080 Ti and 2080 Ti owners sit smugly as those who went with 8 GB cards or got a 3070 are struggling now.
I personally would still prefer a 4080 over a 7900 XTX because I use DLSS Balanced + Reflex at 4K in most games, which ends up being faster and generally better looking than the 7900 XTX with FSR2 Quality.
But I wouldn't really turn down a 7900XTX either -- it's a great product if you don't really care about the current state of "Ray tracing" and don't really use upscaling.
I built a few 5700 and 6700 XT 12 GB rigs for friends recently and they're as happy as clams.
The problem as I see it is when you have to choose between a weaker GPU with more VRAM and a stronger GPU with less VRAM. Given that VRAM issues are easily fixable by turning textures from "very high" to "high" with differences only Photoshop will spot, I'd always go for the latter.
Guess I should just buy a console huh?
Also, your dichotomy is false. Frequently it's two GPUs with similar performance, like the 3070 vs the 6800 or the 3060 Ti vs the 6700 XT, where the latter performs about the same but has more VRAM. Those who chose the 3060 Ti or 3070 over the similarly priced AMD cards for the ray tracing are suffering in the latest games at settings the 6800 or 6700 XT can play perfectly fine.
I always treat numbers from manufacturers the same way I treat synthetic benchmarks: numbers that measure something very specific, with a weak connection to real life performance.
TPU's numbers tell me VRAM is rarely an issue, and when it is, it's probably a lame console port. If you expect a card you buy today to still handle games in 5+ years, by all means, buy 12+ GB of VRAM. Now, if you buy the absolute best, like a 4090 or a 7900 XTX, you wouldn't expect those to come with 8 GB of VRAM. But if you buy something more down to earth, 8-10 GB is probably just fine.
VRAM USAGE RISES. It's been rising since forever. The reason GPUs got away with 8 GB for six years is that the PS4 limited how much games could actually eat.
Once the PS5 generation starts in earnest, give it about two years for proper PS5 games to come out, and you can expect the 8 GB baseline to become 16 GB everywhere. It's not rocket science.
I am already so tired of Nvidia fanboys throwing out this pathetic excuse: "I didn't make a bad decision, I would never; Nvidia didn't skimp on VRAM, never; and if VRAM usage keeps rising, it's all bad PC ports, just keep compressing until it fits into 8 GB." And do so until 2050?
Oh wait, next it's going to be the shit-flinging of "these devs suck" or the even more amazing "this is an AMD-sponsored title, AMD is forcing devs to use more VRAM to make my amazing Nvidia GPU crash!!!" (not a joke, I REALLY had at least two different drones say that over at Reddit).
It gets even worse when it spills over into RT territory, by the way. RT swallows a solid 1 to 2 GB in the games that have it. So you can be sure that within a year or so, VRAM-choked "muh ray tracing" buyers will be crying about "bad PC ports" for games that will then run about as well or better on RDNA 2 cards, which have notoriously poor RT. All because daddy Nvidia was always cheap with VRAM, as if it were a super expensive luxury, and this time they've really underestimated the risk and are going to pay big time in public image.
But don't worry, the drones will be all over the internet echoing "bad PC port", "AMD conspiracy" and "damn lazy devs" until Nvidia finally releases the 5000 series with 12 GB minimum and 16+ GB for everything else, and then it'll have been "a necessary teething period for VRAM to become cheap enough to put in such quantities, thank you Nvidia".
Nvidia drones:
Nvidia is never wrong. We are not a cult. Nvidia is never wrong. We are not a cult. Nvidia is never wrong. We are not a cult. Nvidia is never wrong. We are not a cult. Nvidia is never wrong. We are not a cult. Nvidia is never wrong. We are not a cult. Nvidia is never wrong. We are not a cult. Nvidia is never wrong. We are not a cult. Nvidia is never wrong. We are not a cult. Nvidia is never wrong. We are not a cult.
The 4070 Ti can scarcely be found at the same price as the XT in my country, and the only models that can be found are Zotac/PNY. So you think 16 GB will not suffice? Why? The PS5 has 16 GB of unified RAM. Accounting for the PS5 OS and obvious CPU-side needs, I'd expect that even on a PC with multiple monitors, 16 GB would be enough.
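To make that back-of-envelope reasoning concrete, here is a minimal sketch of the arithmetic; the 16 GB total is the public spec, but the OS reservation and CPU-side share are assumed illustrative values, not official figures.

    # Rough split of the PS5's 16 GB unified memory (total is the public spec;
    # the OS reservation and CPU-side share are assumptions for illustration).
    total_unified_gb = 16.0  # PS5 unified GDDR6
    os_reserved_gb = 2.5     # assumed OS/system reservation
    cpu_side_gb = 3.5        # assumed game data kept CPU-side (logic, audio, ...)

    gpu_budget_gb = total_unified_gb - os_reserved_gb - cpu_side_gb
    print(f"Approximate VRAM-equivalent budget: {gpu_budget_gb:.1f} GB")  # ~10 GB

Under those assumptions, a console game has roughly 10 GB to spend on graphics, which is why a 16 GB PC card looks comfortable by comparison.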
For your price it will probably be the 7600 XT, which will likely come with around 12 GB, but I have seen no leaks yet. AMD is probably still working on fixing its performance issues, hoping to resolve them before launching the lower SKUs.
We might actually see better progress on VRAM amounts now that the crypto scene has died down significantly. I don't imagine demand at its peak helped any with the pricing or sourcing of DRAM chips.
I personally think 12 GB should be the bare minimum (entry-level) these days, but I come from a generation where VRAM doubled every two years up until 2016... lol