Wednesday, April 12th 2023

AMD Plays the VRAM Card Against NVIDIA

In a blog post, AMD has pulled the VRAM card against NVIDIA, telling potential graphics card buyers that they should consider AMD over NVIDIA, because current and future games will require more VRAM, especially at higher resolutions. It's no secret that there has been something of a consensus among at least some of the PC gaming crowd that NVIDIA is being too stingy when it comes to VRAM on its graphics cards, and AMD is clearly trying to cash in on that sentiment with its latest blog post. AMD is showing the VRAM usage in games such as Resident Evil 4—with and without ray tracing at that—The Last of Us Part I and Hogwarts Legacy, all games that use 11 GB of VRAM or more.

AMD does have a point here, but as the company has yet to launch anything below the Radeon RX 7900 XT in the 7000-series, AMD is mostly comparing its 6000-series of cards with NVIDIA's 3000-series of cards, most of which are getting hard to purchase and potentially less interesting for those looking to upgrade their system. That said, AMD also compares its two 7000-series cards to the NVIDIA RTX 4070 Ti and the RTX 4080, claiming up to a 27 percent lead over NVIDIA in performance. Based on TPU's own tests of some of these games, albeit most likely using different test scenarios, the figures provided by AMD don't seem to reflect real-world performance. It's also surprising to see AMD claim its RX 7900 XTX beats NVIDIA's RTX 4080 in ray tracing performance in Resident Evil 4 by 23 percent, where our own tests show NVIDIA in front by a small margin. Make of this what you will, but one thing is fairly certain: future games will require more VRAM, but the need for a powerful GPU most likely isn't going away either.
Source: AMD

218 Comments on AMD Plays the VRAM Card Against NVIDIA

#151
btk2k2
bugTheir numbers for Hogwarts Legacy or RE4 don't match TPU's. Probably some others, too, can't check them all atm.
Obviously, games are big and different reviewers use different scenes. Expecting the numbers to match between different sites that test different parts of a game is folly.
Posted on Reply
#152
ValenOne
bugTheir numbers for Hogwarts Legacy or RE4 don't match TPU's. Probably some others, too, can't check them all atm.
From www.techpowerup.com/review/resident-evil-4-benchmark-test-performance-analysis/4.html

For RE4
Techpowerup's 1080p Ultra RT
6800 XT = 105.8 fps
3070 = 88

Techspot's 1080p "Max" with RT
6800 = 91 fps
3070 = crashed

Techpowerup's 1440p Ultra RT
6800 XT = 85 fps
3070 = crashed

Techspot's 1440p "Max" with RT
6800 = 77 fps
3070 = crashed

Techpowerup used a faster RX 6800 XT model while Techspot used a lesser RX 6800 model. Techspot's RX 6800 vs Techpowerup's RX 6800 XT numbers are close.
Posted on Reply
#153
bug
ValenOneFrom www.techpowerup.com/review/resident-evil-4-benchmark-test-performance-analysis/4.html

For RE4
Techpowerup's 1080p Ultra RT
6800 XT = 105.8 fps
3070 = 88

Techspot's 1080p "Max" with RT
6800 = 91 fps
3070 = crashed

Techpowerup's 1440p Ultra RT
6800 XT = 85 fps
3070 = crashed

Techspot's 1440p "Max" with RT
6800 = 77 fps
3070 = crashed

Techpowerup used a faster RX 6800 XT model while Techspot used a lesser RX 6800 model. Techspot's RX 6800 vs Techpowerup's RX 6800 XT numbers are close.
I was talking about tests that passed. They're pretty close together on TPU and wide apart on Techspot. And that's despite Techspot using a slower card, as you noted.
Posted on Reply
#154
FeelinFroggy
Honestly, VRAM usage really depends on the resolution. There are gaming cards with 20 and 24 GB of VRAM. That is such a waste and only makes the cards more expensive. Typically, the top-end cards are not that much faster than the next card in the tier, so they load them up with unneeded extras to justify the higher price and the 90, Ti, XTX designations. All that additional VRAM just sits idle doing nothing for the life of a gaming card. Money well spent.

My recommendation is that if you are buying a new card in 2023 and play at resolutions higher than 1080p, get a card with 12-16 GB of VRAM. For 99% of games on the market, 12 GB is enough. Game developers don't want high system requirements, because fewer people will be able to buy and play their games.

There will always be some poorly optimized console ports that run badly and demand unreasonable system resources. And there will always be a game or two that pushes the envelope and makes us ask, "Can it run Crysis?"
Posted on Reply
#155
jarpe
AusWolfThat is an opinion, not fact. Besides, FSR 3.0 is just around the corner, I'd wait for it before passing judgement.


Whether you like FG or not depends on how sensitive you are to input latency. The technology is not without issues at the moment, especially when you generate extra frames from a low frame rate situation.

Personally, I consider FG frame rates irrelevant for comparison.


Not really.



Fair enough - I mostly shop at Scan, that's why I was comparing prices from there. A 5% difference might actually be worth it.
It is not an opinion; in many cases the quality mode of FSR looks worse than the performance mode of DLSS.

Also, FG is paired with Reflex to compensate for latency, and you don't use FG to hit 60 fps; you use it to hit 100+ fps, where the latency is very good. I was skeptical at first, but after trying FG in several games I can say it is a game changer, and every demanding game needs to have it.

That 16% RT lead includes games that barely use RT; in games that use RT heavily and to meaningful visual effect, the RT cores on the 7900 XTX get overwhelmed, which is why the 4080 is 25-45% faster under heavy RT use.
Posted on Reply
#156
AusWolf
jarpeIt is not an opinion; in many cases the quality mode of FSR looks worse than the performance mode of DLSS.
I haven't tried the latest versions of DLSS since my 2070 died about 6 months ago, so I'll take your word for it. The picture you posted may be an isolated case, but DLSS does look better there, I'll give you that.
jarpeAlso, FG is paired with Reflex to compensate for latency, and you don't use FG to hit 60 fps; you use it to hit 100+ fps, where the latency is very good. I was skeptical at first, but after trying FG in several games I can say it is a game changer, and every demanding game needs to have it.
That's the thing... I don't need 100+ FPS. I need 60, or at least 40-45 minimum.
jarpeThat 16% RT lead includes games that barely use RT; in games that use RT heavily and to meaningful visual effect, the RT cores on the 7900 XTX get overwhelmed, which is why the 4080 is 25-45% faster under heavy RT use.
I hope both AMD and Nvidia focus the development of their next architectures on RT. Maybe more RT cores with the same number of traditional shader cores. Raster performance is already at a maximum, imo.
Posted on Reply
#157
Prima.Vera
MahboiThe keyword being "yet".
In just 3 months, we've had 4 large games where 8 GB started being a serious problem. I predict that the trend isn't going to stop at all during the next two years.
We'll see just how long the 10 GB 3080 lasts, and I think the 12 GB 4070/Ti will fare worse. At least with the 3080 you had 2 good years of PS4-era holdovers until the requirements climbed hard. The 4070s feel sufficient "for now," I'm sure, but will they even last 2 years? I highly doubt it.
No man. You're just talking about broken console ports, which work fine even on 8 GB VRAM cards. It's not that all of a sudden they just start adding 8K resolution textures.
Plus, some of the game engines out there fill all available VRAM as a cache, even if you have 24 or 32 GB.
Posted on Reply
#158
chrcoluk
Prima.VeraThis is so much BS that it is ridiculous. I have yet to play a game that requires more than 10 GB of VRAM. Even the crappiest of all ports ever released, "The Last of Us Part I," is smooth as butter on Ultra with G-Sync on, even if the so-called in-game VRAM usage is around 12.8 GB.
This kind of post just reads "I don't get it".

We're not talking frames per second; we're talking about texture quality, textures going *poof*, and games crashing. Some games also stutter due to excessive asset swapping (caused by low VRAM).

Nowadays many games have dynamic engines that adjust to available VRAM on the fly, so the effect of low VRAM is not as obvious as it could be.

Some of us are not OK with PS2/PS3-quality textures in 2023.

It's not important to you personally, and that's fine, but that doesn't mean it's BS.
Posted on Reply
#160
Mahboi
tvshacker2 reasons to not wait:
  1. Low performance/€ improvement (so far)
  2. Growing tensions between China and Taiwan
Bonus:
7700/7700XT will likely have similar performance to the 6800XT, but with (only) 12G VRAM
Unlikely. The Angstronomics leak, which was very complete, mentioned a 256-bit bus for Navi 32. Navi 31 has 384.

Logically, if the 384-bit bus was for 24 GB, 256-bit should be for 16 GB.
No dice on Navi 33, still 128-bit, and unless they double the RAM for certain models, it's still going to be an 8 GB card.
Posted on Reply
#161
Winssy
tvshackerHow does this one manage 20GB on a 160bit bus then?
There is a bilateral ("clamshell") placement of chips here: two 2 GB chips share each 32-bit memory controller, one placed on the front side of the board and the other on the back. 160-bit / 32-bit = 5 controllers; 5 controllers × 2 sides = 10 places for 2 GB GDDR6 chips = 20 GB of VRAM. But since it is very expensive, we definitely won't see this layout on mid-range cards. The only gaming card with such a chip placement was the expensive 3090, with 24 GDDR6X chips of 1 GB each, 12 on each side (at the time of the 3090's release, there were no 2 GB GDDR6X chips available).
For example, here is a photo of the Asus Strix 3090.
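That arithmetic can be sketched in a few lines, if it helps (a hypothetical helper I made up, not anything from a real driver or tool):

```python
# Sketch of the VRAM arithmetic above. GDDR6/GDDR6X chips use a 32-bit
# interface, so each 32-bit memory controller hosts one chip position;
# clamshell mode adds a second chip on the back of the board per
# controller, doubling capacity at the same bus width.

def vram_gb(bus_width_bits: int, chip_capacity_gb: int = 2,
            clamshell: bool = False) -> int:
    controllers = bus_width_bits // 32       # one controller per 32 bits of bus
    chips = controllers * (2 if clamshell else 1)
    return chips * chip_capacity_gb

# 160-bit bus, 2 GB chips, clamshell: 5 controllers * 2 sides * 2 GB = 20 GB
print(vram_gb(160, clamshell=True))                      # 20
# 3090-style: 384-bit bus, 1 GB GDDR6X chips, clamshell: 12 * 2 * 1 = 24 GB
print(vram_gb(384, chip_capacity_gb=1, clamshell=True))  # 24
```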
Posted on Reply
#162
tvshacker
WinssyThere is a bilateral placement of chips here. Two 2GB chips are placed on each memory controller. One chip is placed on the front side, the other on the back. 160bit / 32bit = 5 controllers , 5 controllers * 2 sides = 10 places for the 2GB gddr6 chips = 20GB of VRAM. But since it is very expensive, we definitely won't see this layout on mid-range cards. The only gaming card with such a chip placement was the expensive 3090, with 24 GDDR6X chips of 1GB each, 12 on each side (at the time of the 3090 release, there were no 2GB GDDR6X chips available).
For example, here is a photo of the Asus Strix 3090.
Does it have to be done for all the chips? I.e., in the case of the 4070, would we have a 24 GB version, or could they just install enough chips on the back to get 16 GB?
Posted on Reply
#163
Winssy
tvshackerDoes it have to be done for all the chips? I.e. in the case of the 4070, would we have a 24gb version or could they just install enough chips in the back to get 16gb?
As far as I know, dual chip placement must be done for all controllers at once. In other words, the 4070 can have either 12 GB or 24 GB of VRAM.
In my opinion, there is only one way to make a 4070 with 16 GB using dual chip placement, which is to use 4 memory controllers. But in that case, the memory bus would be 128-bit, which would negatively impact performance.
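The 12/24 GB constraint falls out of the same arithmetic; assuming 2 GB chips, the capacities reachable per controller count can be enumerated (an illustrative sketch, not from any spec sheet):

```python
# With all controllers populated identically, a card's only options are
# single-sided (one chip per controller) or clamshell (two per controller).
# Assumes 2 GB chips throughout.

def capacities(controllers: int, chip_gb: int = 2) -> set[int]:
    return {controllers * chip_gb, controllers * 2 * chip_gb}

for n in (4, 5, 6):
    print(f"{n * 32}-bit bus: {sorted(capacities(n))} GB")
# A 192-bit (6-controller) 4070 lands on 12 or 24 GB; hitting 16 GB with
# 2 GB chips requires dropping to a 128-bit (4-controller) clamshell layout.
```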
Posted on Reply
#164
bug
tvshackerDoes it have to be done for all the chips? I.e. in the case of the 4070, would we have a 24gb version or could they just install enough chips in the back to get 16gb?
Like he said, the arrangement is expensive, so probably not for mid-range cards. Even if the difference wasn't big, who would pay more for a 4070?
Posted on Reply
#165
BoboOOZ
WinssyAs far as I know, dual chip placement must be done for all controllers at once. In other words, the 4070 can have either 12GB or 24GB of VRAM.
In my opinion, there is only one way to make the 4070 with 16GB using dual chip placement, which is to use 4 memory controllers. But in this case, the memory bus will be 128-bit, which will negatively impact the performance.
Actually, it can be done, but it will result in uneven bandwidths for different parts of the memory, which is as bad an idea as it sounds:
www.pcworld.com/article/415858/nvidia-agrees-to-geforce-gtx-970-false-advertising-settlement-offers-30-refunds.html
So we won't be seeing anything like that this time.
Posted on Reply
#166
TheDeeGee
I'd rather have Nvidia than a headache, thank you very much.
Posted on Reply
#167
tvshacker
bugEven if the difference wasn't big, who would pay more for a 4070?
I wouldn't. But maybe they would keep it at $600 and lower the standard 12 GB model to $500.
Posted on Reply
#168
chrcoluk
bugLike he said, the arrangement is expensive, so probably not for mid-range cards. Even if the difference wasn't big, who would pay more for a 4070?
Well, if they're not profiteering, it might cost an extra $50-100 to bump a 4070 to 16 GB. I would probably be more likely to buy a $700 16 GB 4070 than a $600 12 GB 4070.

But knowing NVIDIA, if they released a 16 GB model, it would cost an extra $300.
Posted on Reply
#169
Prima.Vera
chrcolukThis kind of post just reads "I don't get it".

We're not talking frames per second; we're talking about texture quality, textures going *poof*, and games crashing. Some games also stutter due to excessive asset swapping (caused by low VRAM).

Nowadays many games have dynamic engines that adjust to available VRAM on the fly, so the effect of low VRAM is not as obvious as it could be.

Some of us are not OK with PS2/PS3-quality textures in 2023.

It's not important to you personally, and that's fine, but that doesn't mean it's BS.
You understood nothing of what I wrote.
I said I was using ultra settings in that game, INCLUDING textures, and didn't have any crashes, stuttering, sudden pop-in effects or buffering.
Posted on Reply
#170
chrcoluk
Prima.VeraYou understood nothing of what I wrote.
I said I was using ultra settings in that game, INCLUDING textures, and didn't have any crashes, stuttering, sudden pop-in effects or buffering.
This is so much BS that it is ridiculous. I have yet to play a game that requires more than 10 GB of VRAM. Even the crappiest of all ports ever released, "The Last of Us Part I," is smooth as butter on Ultra with G-Sync on, even if the so-called in-game VRAM usage is around 12.8 GB.
Ok boss.
Posted on Reply
#171
bug
chrcolukOk boss.
He's talking about how games preload stuff if there's VRAM available. It's done to minimize IO, but it doesn't result in a meaningful performance impact.
Posted on Reply
#172
chrcoluk
bugHe's talking about how games preload stuff if there's VRAM available. It's done to minimize IO, but it doesn't result in a meaningful performance impact.
Preloading can likely prevent stutters. It's a good thing; I'd rather have my assets preloaded than loaded on the fly, causing stutters.

I miss the days of loading everything into RAM before you play, with no live loading of assets in the background. I wonder what prevents them from doing that now? Hmm.
Posted on Reply
#173
bug
chrcolukPreloading can likely prevent stutters. It's a good thing; I'd rather have my assets preloaded than loaded on the fly, causing stutters.
Well, we're kinda drifting from "must have 16 GB VRAM". Unless you're playing something like Rage, textures only change during level changes or when moving from one area to another. Not exactly the part of the game that would be ruined by a few stutters.
chrcolukI miss the days of loading everything into RAM before you play, with no live loading of assets in the background. I wonder what prevents them from doing that now? Hmm.
There is no way to do that. Try as you may to fit everything into VRAM, there's a kid somewhere with Photoshop and time on their hands who will take your textures, apply 2x scaling on both axes, call that an HD texture pack and boom! You're out of VRAM again. Also, 16 GB of VRAM is the same as the typical amount of system RAM in a PC; that's pretty imbalanced.
I think what developers/engines do right now is pretty well thought out: set a baseline and, if they find more VRAM than that, try to load some more. Preloading is guesswork though, because you never know which area the player will visit next. You can try to predict the next area based on which "exit" the player approaches, but if the player changes their mind, you're preloading things that won't be used and initiating IO that may lead to other performance drops.
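That baseline-plus-headroom behaviour can be sketched roughly like this (hypothetical engine logic; all names and sizes are illustrative, not any real engine's API):

```python
# Rough sketch of a VRAM budgeting heuristic: always load the baseline
# asset set, then opportunistically preload predicted next areas while
# headroom remains.

def plan_loads(total_vram_mb: int, baseline_mb: int,
               predicted_areas: list[tuple[str, int]]):
    """predicted_areas: (area_name, size_mb) pairs, most likely first."""
    loads = ["baseline"]
    used = baseline_mb
    for name, size in predicted_areas:
        if used + size <= total_vram_mb:
            loads.append(name)   # preload: cheap win if the guess is right
            used += size
        # otherwise skip: streaming it later risks a stutter, but evicting
        # the baseline set to make room would be worse
    return loads, used

# 8 GB card, 6 GB baseline: only the most likely area fits in the headroom.
loads, used = plan_loads(8192, 6000, [("north_exit", 1500), ("south_exit", 1500)])
print(loads, used)   # ['baseline', 'north_exit'] 7500
```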
Posted on Reply
#174
chrcoluk
bugWell, we're kinda drifting from "must have 16 GB VRAM". Unless you're playing something like Rage, textures only change during level changes or when moving from one area to another. Not exactly the part of the game that would be ruined by a few stutters.

There is no way to do that. Try as you may to fit everything into VRAM, there's a kid somewhere with Photoshop and time on their hands who will take your textures, apply 2x scaling on both axes, call that an HD texture pack and boom! You're out of VRAM again. Also, 16 GB of VRAM is the same as the typical amount of system RAM in a PC; that's pretty imbalanced.
I think what developers/engines do right now is pretty well thought out: set a baseline and, if they find more VRAM than that, try to load some more. Preloading is guesswork though, because you never know which area the player will visit next. You can try to predict the next area based on which "exit" the player approaches, but if the player changes their mind, you're preloading things that won't be used and initiating IO that may lead to other performance drops.
Ahh, so you now acknowledge VRAM capacity can be an issue then?
Posted on Reply
#175
bug
chrcolukAhh so you now acknowledged VRAM capacity can be an issue then?
Of course it can be, you wouldn't buy a card with 1MB of VRAM for your PC in 2023. That wasn't what we were talking about.
Posted on Reply