Wednesday, April 12th 2023

AMD Plays the VRAM Card Against NVIDIA

In a blog post, AMD has pulled the VRAM card against NVIDIA, telling potential graphics card buyers that they should consider AMD over NVIDIA, because current and future games will require more VRAM, especially at higher resolutions. It's no secret that there has been something of a consensus among at least part of the PC gaming crowd that NVIDIA is being too stingy when it comes to VRAM on its graphics cards, and AMD is clearly trying to cash in on that sentiment with its latest blog post. AMD is showing the VRAM usage in games such as Resident Evil 4—with and without ray tracing at that—The Last of Us Part I and Hogwarts Legacy, all of which use 11 GB of VRAM or more.

AMD does have a point here, but as the company has yet to launch anything below the Radeon RX 7900 XT in the 7000 series, AMD is mostly comparing its 6000-series cards with NVIDIA's 3000-series cards, most of which are getting hard to purchase and are potentially less interesting for those looking to upgrade their systems. That said, AMD also compares its two 7000-series cards to the NVIDIA RTX 4070 Ti and the RTX 4080, claiming up to a 27 percent performance lead over NVIDIA. Based on TPU's own tests of some of these games, albeit most likely using different test scenarios, the figures provided by AMD don't seem to reflect real-world performance. It's also surprising to see AMD claim that its RX 7900 XTX beats NVIDIA's RTX 4080 in ray tracing performance in Resident Evil 4 by 23 percent, where our own tests show NVIDIA in front by a small margin. Make of this what you will, but one thing is fairly certain: future games will require more VRAM, though the need for a powerful GPU isn't going away either.
Source: AMD

218 Comments on AMD Plays the VRAM Card Against NVIDIA

#26
N3utro
Yes, having more VRAM is a good thing at equal performance, but in ray tracing games AMD GPUs are far behind Nvidia, and it will stay this way because Nvidia invests massively in the producers of new AAA games so that they use its technology, making most new games run better on Nvidia hardware. That's why Nvidia GPUs are over 90% of the total GPU market. AMD will keep falling behind until they also invest like this.
#27
btk2k2
This was the same thing that happened during the PS4 gen.

The PS4 launched around the same time as the 290X and the 780Ti. The PS4 has 8GB of RAM, of which more than 4GB is used as VRAM. The 290X has slightly better legs, and the 8GB variants have much better legs.

Three or four years into the gen, once the cross-gen games stop getting made, the VRAM requirements spike above 4GB, so stuff like the 780Ti gets left behind by its Titan variant and the 980 gets left behind by the 390X, while the 980Ti holds on far better than the Fury X.

NV then releases a new gen (the 1000 series back then; the 5000 series, I expect, this time) which addresses the VRAM thing properly, and those cards will be fine for an above-console experience until we are past the cross-gen phase of the PS6 generation, when requirements will increase yet again.

EDIT: Then the people who went with the 3090/4090/7900XTX will be sitting smugly as those on 16GB cards start to struggle. Just like 1080Ti and 2080Ti owners sit smugly as those who went with 8GB cards or got a 3070 are struggling now.
#28
phanbuey
N3utro: Yes, having more VRAM is a good thing at equal performance, but in ray tracing games AMD GPUs are far behind Nvidia, and it will stay this way because Nvidia invests massively in the producers of new AAA games so that they use its technology, making most new games run better on Nvidia hardware. That's why Nvidia GPUs are over 90% of the total GPU market. AMD will keep falling behind until they also invest like this.
I think this type of marketing is really a step in the right direction. They really did well with the RX 6000 series marketing, but the 7000 series, on both the CPU and GPU side, is kind of a mess.

I personally would still prefer a 4080 over a 7900 XTX, because I use DLSS Balanced + Reflex at 4K in most games, which ends up being faster and generally better looking than the 7900 XTX with FSR2 Quality.

But I wouldn't really turn down a 7900XTX either -- it's a great product if you don't really care about the current state of "Ray tracing" and don't really use upscaling.

I built a few 5700 + 6700xt 12GB rigs for friends recently and they're as happy as clams.
#29
bug
Of course more memory is better, that's not really up for debate.

The problem as I see it is when you have to choose between a weaker GPU with more VRAM and a stronger GPU with less VRAM. Given that VRAM issues are easily fixable by turning textures from "very high" to "high" with differences only Photoshop will spot, I'd always go for the latter.
#30
Space Lynx
Astronaut
It's a good card to play on. I love my 16GB of VRAM and 32GB of RAM. I never have to check shit and everything is smooth as butter.
#31
bearClaw5
Vayra86: Devs already are pushing in that area and even Nvidia is helping them push it that way ;) So what segment of the market's offerings is going to step on the brakes here? It's going to be (and already is) solved by dialing down settings or dynamically dropping image quality. Fan-tas-tic!

The only fools that don't seem to want to get that are the ones who bought into low VRAM GPUs at a premium. Sucks to be them. They've been warned a million times. But going by their own cognitive dissonance, they don't care, so all is well in the world isn't it? They're not skipping games from devs that push over 12GB requirements either, so who's really losing here?

I'm just sittin here enjoying my popcorn right now.
A few months ago people were still selling 2nd-hand 3080 10GBs at (way) over 700 EUR over here. Try that now. Lmao. Ampere and also most of Ada are going to go down as the least future-proof gens in a long, long time. Now you know why Ampere's stack is a full-blown mess of too-low initial VRAM capacities and far-too-high ones in the rebound. Nvidia knew it, they're still pushing for planned obsolescence, and people are still in denial.
Damn.

Guess I should just buy a console huh?
#32
Dazz023
And people were hoping that AMD's marketing wouldn't be as abysmal as it was under Hallock: we don't have competitive products, we get owned in sales, so let's fabricate numbers and spout some marketing BS that everyone outside of blind fanboys sees through. Same old, same old...
#33
TheLostSwede
News Editor
kondamin: With DRAM prices collapsing like now there is no reason for Nvidia to not make the 4070 a 16GB card
DRAM and VRAM are not the same though, so the pricing doesn't correlate.
#34
btk2k2
bug: Of course more memory is better, that's not really up for debate.

The problem as I see it is when you have to choose between a weaker GPU with more VRAM and a stronger GPU with less VRAM. Given that VRAM issues are easily fixable by turning textures from "very high" to "high" with differences only Photoshop will spot, I'd always go for the latter.
Given that the 3070 seems to struggle at 720p in some games, I don't think turning textures down one notch really fixes the issue.

Also, your dichotomy is false. Frequently it is two GPUs with similar performance, like the 3070 vs the 6800 or the 3060Ti vs the 6700XT, where the latter, while performing about the same, has more VRAM. Those who chose the 3060Ti or 3070 over the similarly priced AMD cards for the ray tracing are suffering in the latest games at settings the 6800 or 6700XT can play perfectly fine.
#35
bug
Dazz023: And people were hoping that AMD's marketing wouldn't be as abysmal as it was under Hallock: we don't have competitive products, we get owned in sales, so let's fabricate numbers and spout some marketing BS that everyone outside of blind fanboys sees through. Same old, same old...
I don't think the numbers are fabricated. Just cherry-picked.
I always treat numbers from manufacturers the same way I treat synthetic benchmarks: numbers that measure something very specific, with a weak connection to real-life performance.

TPU's numbers tell me VRAM is rarely an issue, and when it is, the game is probably a lame console port. If people think that's indicative of future performance and that a card they buy today will still handle games in 5+ years, by all means, buy 12+ GB of VRAM. Now, if you buy the absolute best, like a 4090 or a 7900XTX, you wouldn't expect those to come with 8GB of VRAM. But if you buy something more down-to-earth, 8-10GB is probably just fine.
#36
BoboOOZ
bug: Of course more memory is better, that's not really up for debate.

The problem as I see it is when you have to choose between a weaker GPU with more VRAM and a stronger GPU with less VRAM. Given that VRAM issues are easily fixable by turning textures from "very high" to "high" with differences only Photoshop will spot, I'd always go for the latter.
There is no such problem; usually AMD gives you both a stronger card (higher raster performance) and more VRAM for the same amount of money as Nvidia. Most users still pick Nvidia because of mindshare, hence this type of article.
tvshacker: Now I'm really curious about the amount of VRAM the RX 7700 series will bring. Let's see if AMD is all talk and no show...
Safe to say it will be between 12 and 16GB, more likely 16. Wait, do you mean the real 7700, aka fake 7800, or the fake 7700?
#37
bug
BoboOOZ: There is no such problem; usually AMD gives you both a stronger card (higher raster performance) and more VRAM for the same amount of money as Nvidia. Most users still pick Nvidia because of mindshare, hence this type of article.
You see? Even you know there is such a problem.
#38
Vya Domus
phanbuey: coil whine
Nvidia cards don't have coil whine? :kookoo:
#39
Mahboi
TheDeeGee: All they're showing is how shit PC ports are.
How many more times am I going to have to read this moronic excuse? Or is it the same 5 people who post this everywhere?
VRAM USAGE RISES. It's been rising since forever. The reason GPUs got away with 8GB for 6 years is that the PS4 limited how much games could actually eat.
Once the PS5 gen properly starts, give it about 2 years for proper PS5 games to come out, and you can expect the 8GB limitation to become 16GB everywhere. It's not rocket science.

I am already so tired of Nvidia fanboys throwing out this pathetic excuse. "I didn't make a bad decision, I would never, Nvidia didn't skimp on VRAM, never, and if VRAM keeps rising, it's all bad PC ports, just keep compressing until it fits into 8GB". And do so until 2050?
Oh wait, next it's going to be the shit-flinging of "these devs suck" or the even more amazing "this is an AMD-sponsored title, AMD is forcing devs to take more VRAM to make my amazing Nvidia GPU crash!!!" (not a joke, I REALLY had at least 2 different drones say that over at Reddit).

It gets even worse when it spills over into RT territory, btw. RT swallows a solid 1 to 2GB in the games it's in. So you can be sure that within a year or so, VRAM-choked "muh Raytracing" buyers will be crying about "bad PC ports" for games that will then run about as well or better on RDNA 2 cards with their notoriously poor RT. All because Nvidia daddy was always cheap with VRAM, treating it like a super expensive luxury, and this time they've really underestimated the risk and are going to pay big time in public image.

But don't worry, the drones will be all over the internet echoing "bad PC port", "AMD conspiracy" and "damn lazy devs" until Nvidia finally releases the 5000 series with 12GB minimum and 16GB+ for everything else, and then it'll have been "a necessary teething period for VRAM to become cheap enough to put in such quantities, thank you Nvidia".
#40
BoboOOZ
bug: You see? Even you know there is such a problem.
Of course, I added the parenthesis just in case you are one of those who define "stronger card" by "green magic", in which case you are definitely right in your universe.
#41
tvshacker
BoboOOZ: Safe to say it will be between 12 and 16GB, more likely 16. Wait, do you mean the real 7700, aka fake 7800, or the fake 7700?
Whatever they put out (hopefully) with:
  • similar or better performance than the 6800 (non xt)
  • Improved power consumption
  • Similar price to the 6700xt (<450€)
#42
Mahboi
ymdhis: Translation: our cards suck, but we put a lot of VRAM in them because that was the only thing we could do to make them look more valuable.
chart shows clear AMD cost per frame superiority
Nvidia drones:
Nvidia is never wrong. We are not a cult. Nvidia is never wrong. We are not a cult. Nvidia is never wrong. We are not a cult. Nvidia is never wrong. We are not a cult. Nvidia is never wrong. We are not a cult. Nvidia is never wrong. We are not a cult. Nvidia is never wrong. We are not a cult. Nvidia is never wrong. We are not a cult. Nvidia is never wrong. We are not a cult. Nvidia is never wrong. We are not a cult.
#43
docnorth
AMD should have done this with the 4070 Ti. The 4070 non-Ti probably isn't powerful enough to be limited by 12 GB of VRAM. The 4060 (Ti), on the other hand, with 8 GB and a 128-bit bus... :nutkick:
#44
Mahboi
Daven: Is this the first official posting of the 7900xt at a lower price ($849)?
Probably... and honestly it's lower than that. AMD again with the top-tier marketing of lowering prices but telling nobody, lest anyone actually buy.
The 4070 Ti can scarcely be found at the same price as the XT in my country, and the only models that can be found are Zotac/PNY.
btk2k2: EDIT: Then the people who went with the 3090/4090/7900XTX will be sitting smugly as those on 16GB cards start to struggle. Just like 1080Ti and 2080Ti owners sit smugly as those who went with 8GB cards or got a 3070 are struggling now.
So you think 16GB will not suffice? Why? The PS5 has 16GB of unified RAM. Counting the PS5 OS and the obvious CPU-side needs, I'd expect that even on a PC with multiple monitors, 16GB would be enough.
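As a rough back-of-envelope version of that reasoning (the ~3.5GB OS reservation is the commonly reported PS5 figure, leaving games roughly 12.5GB; the CPU-side share is purely a guess):

16GB unified RAM − ~3.5GB OS reserve ≈ 12.5GB available to a game
12.5GB − 2 to 4GB of CPU-side data ≈ 8.5 to 10.5GB for GPU assets

which would indeed sit comfortably under a 16GB card, even allowing for some PC-side overhead.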
#45
BoboOOZ
tvshacker: Whatever they put out (hopefully) with:
  • similar or better performance than the 6800 (non xt)
  • Improved power consumption
  • Similar price to the 6700xt (<450€)
Well, I imagine they will keep their silly naming going down the line, so the 7800 will probably have 16GB but will not cost less than 450 USD.

For your price it will probably be the 7600XT, which will be more around 12GB, but I have seen no leaks yet. AMD are probably still working on fixing their performance issues, hoping to sort them out before launching the lower SKUs.
#46
InVasMani
Wirko: Let's see what DirectStorage (along with DX12 Upload Heaps) can do for games - provided it ever becomes widely used.
One of the things DS can do is decompression from memory. Compressed game assets can be stored in system RAM, then transferred to VRAM and decompressed with fast algorithms on the GPU on the fly. Ideally this could reduce the need for more VRAM, but it takes additional effort to optimise for the PC platform.
That will only help where games don't require a certain baseline of VRAM capacity to be met. Yes, it could help juggle allocation a bit, and compression itself can help a bit too, but only so far. It's not exactly a cure-all, but it won't hurt and is an improvement (see the sketch at the end of this post).

We might actually see better progress on VRAM on cards now that the crypto scene has died down significantly. I don't imagine demand as high as it was at its peak helped any with the pricing and sourcing of DRAM chips.
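For the curious, here's a minimal sketch of the flow Wirko describes, using the DirectStorage 1.1 GDeflate GPU-decompression path; the asset path, sizes and destination buffer are hypothetical placeholders, and error handling plus fence synchronization are omitted:

```cpp
// Minimal sketch of a DirectStorage 1.1 read with GPU (GDeflate) decompression.
// Assumptions: the asset path, sizes, and the pre-created destBuffer are
// hypothetical placeholders; error handling and fence sync are omitted.
#include <d3d12.h>
#include <dstorage.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

void LoadCompressedAsset(ID3D12Device* device, ID3D12Resource* destBuffer)
{
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    // A queue that streams from files straight into GPU resources.
    DSTORAGE_QUEUE_DESC queueDesc = {};
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Device     = device;
    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"assets/texture.gdeflate", IID_PPV_ARGS(&file)); // hypothetical path

    // The request carries compressed bytes; the runtime decompresses them
    // on the GPU on the way into the destination buffer.
    DSTORAGE_REQUEST request = {};
    request.Options.SourceType        = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType   = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Options.CompressionFormat = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE;
    request.Source.File.Source = file.Get();
    request.Source.File.Offset = 0;
    request.Source.File.Size   = 4u * 1024 * 1024;   // compressed size on disk (placeholder)
    request.UncompressedSize   = 16u * 1024 * 1024;  // size after GPU decompression (placeholder)
    request.Destination.Buffer.Resource = destBuffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = request.UncompressedSize;

    queue->EnqueueRequest(&request);
    queue->Submit(); // completion is signaled via an ID3D12Fence set with EnqueueSignal (omitted)
}
```

The point of the sketch is the CompressionFormat field: the bytes stay compressed until they reach the GPU, so the same VRAM footprint is filled using less bandwidth and less system-RAM staging — it doesn't shrink the footprint itself, which is exactly the caveat above.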
#47
TheDeeGee
Vayra86: That 4070ti you got there is going to run into trouble sooner rather than later.
I mainly play boomer shooters anyway, and the 4070 Ti has been great for RetroArch emulation so far, so it has its uses :)
#48
Zaqq
They've put Cyberpunk 2077 without RT in the marketing material, despite it being one of the games where ray tracing makes quite a difference. I wonder why...
#49
jaszy
AMD has been trying to push VRAM since the Radeon VII, as that was the only leverage they had at the time... The card ran hot and underperformed.

I personally think 12GB should be the bare minimum these days (entry level), but I come from a generation where VRAM doubled every two years up until 2016... lol
#50
blacksea76
AMD has crap prices just like Nvidia now; they can push whatever rhetoric they want. The next GPU I buy will be value for money even if it is Intel, and the same goes for processors. I used to support AMD; now I support my wallet.