Sunday, June 7th 2020

AMD Declares That The Era of 4GB Graphics Cards is Over

AMD has declared that the era of 4 GB graphics cards is over and that users should "Game Beyond 4 GB". AMD tested its 4 GB and 8 GB RX 5500 XT cards to see how much of a difference VRAM can make to gaming performance. The cards ran a variety of games at 1080p high/ultra settings on a Ryzen 5 3600X with 16 GB of 3200 MHz RAM; on average, the 8 GB model performed ~19% better than its 4 GB counterpart. With next-gen consoles featuring 16 GB of combined memory and developers showing no sign of slowing down, it will be interesting to see what happens.
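The ~19% figure is an average across the tested titles; as a rough illustration of the arithmetic (the per-game FPS numbers below are hypothetical, not AMD's data):

```python
# Hypothetical per-game average FPS for a 4 GB vs. 8 GB card (illustration
# only, not AMD's data); the mean uplift here lands near the quoted ~19%.
fps_4gb = {"Game A": 52, "Game B": 61, "Game C": 38}
fps_8gb = {"Game A": 60, "Game B": 70, "Game C": 49}

uplifts = [(fps_8gb[g] / fps_4gb[g] - 1) * 100 for g in fps_4gb]
print(f"average uplift: {sum(uplifts) / len(uplifts):.1f}%")  # ~19.7%
```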
Source: AMD

31 Comments on AMD Declares That The Era of 4GB Graphics Cards is Over

#1
Searing
Great, then I hope to hear they cancel their stupid 4GB models.... (apparently there will be no 4GB cards for Navi this fall?)
Posted on Reply
#2
Lionheart
That's great, AMD, now let's walk the walk. :pimp:
Posted on Reply
#3
R0H1T
Do we get free 4GB memory or what o_O

What's the point of showing the benefits of 8 GB memory on a low(er)-end card?
Posted on Reply
#4
arbiter
R0H1TDo we get free 4GB memory or what o_O

What's the point of showing the benefits of 8 GB memory on a low(er)-end card?
Because lower-end cards are the ones likely to get 4 GB, not the higher-end ones.
Posted on Reply
#5
ipo3nk
Great, but I'm still happy with my Ivy Bridge Pentium / GTX 660 2 GB for FIFA 20 and PES 2020. No need to upgrade soon.
Posted on Reply
#6
FordGT90Concept
"I go fast!1!11!1!"
PCIe 3.0 x8 + out of VRAM = not enough bandwidth to fall back on system RAM. If it had x16, it wouldn't be nearly as bad.
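For reference, PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding, so the rough per-direction bandwidth works out as follows:

```python
# PCIe 3.0: 8 GT/s per lane, 128b/130b encoding, 8 bits per byte.
per_lane_gb_s = 8 * (128 / 130) / 8  # ~0.985 GB/s per lane, per direction
print(f"x8 : {8 * per_lane_gb_s:.2f} GB/s")   # ~7.88 GB/s
print(f"x16: {16 * per_lane_gb_s:.2f} GB/s")  # ~15.75 GB/s
```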
Posted on Reply
#7
ixi
Can I get the source for "declares that the era of 4GB graphics cards is over"?

So entry-level and mid-range cards will always have more than 4 GB from now on? I doubt it.
Posted on Reply
#8
Hyderz
So 6 GB low-end cards are coming in the next GPU lineup :)
Here I am playing Doom Eternal on a 2 GB 1050 on my other machine.
Posted on Reply
#9
BSim500
A bit sensationalist. In many regions the 4 GB 5500 XT is already more expensive than the 4 GB GTX 1650 Super. 8 GB pushes the price up into the range of the ~20% faster 6 GB GTX 1660 Super, at which point it's hardly a "free" upgrade vs 4 GB models. And I assume AMD haven't cancelled their new VRAM-limited 2 GB APUs?...

Likewise, aside from many budget gamers simply not caring about the latest crappily optimized AAA titles, most low-end gamers in general also tend to have more common sense than benchmarkers. E.g., rather than artificially cripple frame rates down to 20 fps with Mirror's Edge Catalyst-style "Ultra Mega Super Hyper" presets, they'll just bump the presets down a notch or two and enjoy the game. For most games, High is where actual optimization starts, with Ultra being more like "let's see how much over-exaggerated post-processing cr*p we can fill this with", and personally I turn half that junk off anyway, even without performance / VRAM limitations, simply because I want to actually see what I'm playing...
Posted on Reply
#10
delshay
I said 4 GB was dead some time ago here on TPU. In some games you can't max out the settings @1080p, even in some older titles. It leads to stutter, long pauses & all sorts of weird things, including crashes. With faster cards coming to the market you need 16 GB. The reason is that as cards get faster, users will push for 4K & above, because that is what users do. No point in having an ultra-fast card & running it @1080p. 4K will start to become standard, with higher resolutions being optional. HDMI 2.1 & the latest PCIe standard will help accelerate 4K as the new standard.
Posted on Reply
#11
GoldenX
Sure, let's test with a PCIe x8 card that will show a bigger delta once VRAM is full. Totally not biased testing.
Posted on Reply
#12
ARF
delshayI said 4 GB was dead some time ago here on TPU. In some games you can't max out the settings @1080p, even in some older titles. It leads to stutter, long pauses & all sorts of weird things, including crashes. With faster cards coming to the market you need 16 GB. The reason is that as cards get faster, users will push for 4K & above, because that is what users do. No point in having an ultra-fast card & running it @1080p. 4K will start to become standard, with higher resolutions being optional. HDMI 2.1 & the latest PCIe standard will help accelerate 4K as the new standard.
16 GB is too much and not needed yet. It leads to quite expensive cards, which are already expensive as is.
As you can see in the following analyses, the maximum is 8 GB of VRAM at Ultra 3840x2160.
11 GB or 12 GB would suffice.

www.techpowerup.com/review/red-dead-redemption-2-benchmark-test-performance-analysis/4.html
www.techpowerup.com/review/gears-tactics-benchmark-test-performance-analysis/5.html
www.techpowerup.com/review/resident-evil-3-benchmark-test-performance-analysis/4.html
www.techpowerup.com/review/control-benchmark-test-performance-nvidia-rtx/5.html
Posted on Reply
#13
delshay
ARF16 GB is too much and not needed yet. It leads to quite expensive cards, which are already expensive as is.
As you can see in the following analyses, the maximum is 8 GB of VRAM at Ultra 3840x2160.
11 GB or 12 GB would suffice.

www.techpowerup.com/review/red-dead-redemption-2-benchmark-test-performance-analysis/4.html
www.techpowerup.com/review/gears-tactics-benchmark-test-performance-analysis/5.html
www.techpowerup.com/review/resident-evil-3-benchmark-test-performance-analysis/4.html
www.techpowerup.com/review/control-benchmark-test-performance-nvidia-rtx/5.html
In one of those games you can see the 8 GB of memory is almost maxed out. OK, now try it with an ultrawide screen @4K.
Most users never get near maxing out their system RAM, so my personal opinion is VRAM should be treated the same. There's also not enough memory to go beyond 4K in Resident Evil 3.

A user some time ago posted a screenshot of Resident Evil 2 needing nearly 14 GB of VRAM. There's a thread on this here on TPU.
Posted on Reply
#14
R0H1T
People buying low/mid-range GPUs will not play at 4K with maxed-out settings; at least they shouldn't. The ones playing at high resolutions (max settings) &/or high fps buy upper-mid-range or high-end GPUs anyway. Basically, 8 GB doesn't help much with a low/mid-tier GPU, & the tiers above will feature more VRAM anyway. Pretty sure we've had this debate multiple times in the past, though 3/4/6 GB (VRAM) might've been the talking point then.
Posted on Reply
#15
ObscureAngelPT
Honestly, as an owner of a 1650 Super, I would have to agree with AMD, but only for most future games or for people maxing out their games.
At the settings I play with on the 1650 Super, there's only a small share of games where I had to drop texture quality.
The one I remember was Doom Eternal, where I dropped the texture quality from Ultra Nightmare to High so it would fit inside the VRAM budget; all the rest of the settings were maxed and the experience was 1080p at >60 FPS.

With new consoles arriving, VRAM and RAM usage will go up, logically!
Cheers
Posted on Reply
#16
ARF
ObscureAngelPTHonestly, as an owner of a 1650 Super, I would have to agree with AMD, but only for most future games or for people maxing out their games.
At the settings I play with on the 1650 Super, there's only a small share of games where I had to drop texture quality.
The one I remember was Doom Eternal, where I dropped the texture quality from Ultra Nightmare to High so it would fit inside the VRAM budget; all the rest of the settings were maxed and the experience was 1080p at >60 FPS.

With new consoles arriving, VRAM and RAM usage will go up, logically!
Cheers
It depends on how the particular engine operates - I have the impression that Unreal Engine 5 will stream textures directly from the super-duper-fast SSD, bypassing the heavy need for VRAM.
Posted on Reply
#17
Caring1
R0H1TDo we get free 4GB memory or what o_O
Sure, you can download it with the next update. :roll:
Posted on Reply
#18
R0H1T
Waiting for more "I bricked my GPU (doing this)" threads :D
P.S.
I know it's a joke.
Posted on Reply
#19
candle_86
delshayI said 4 GB was dead some time ago here on TPU. In some games you can't max out the settings @1080p, even in some older titles. It leads to stutter, long pauses & all sorts of weird things, including crashes. With faster cards coming to the market you need 16 GB. The reason is that as cards get faster, users will push for 4K & above, because that is what users do. No point in having an ultra-fast card & running it @1080p. 4K will start to become standard, with higher resolutions being optional. HDMI 2.1 & the latest PCIe standard will help accelerate 4K as the new standard.
No, not true at all. A lot of users play the long game. Look at the people who bought a 980 Ti but kept a 1080p screen: they're still happy, while the guy who bought 4K has already had to upgrade.
Posted on Reply
#21
efikkan
The main reason why a game would gain performance from more VRAM is that it's swapping; generally speaking, more VRAM wouldn't increase performance if everything else remains the same. It's also kind of pointless to push a low-end card to this extreme just to find a bottleneck.

In theory I wouldn't mind consoles having 4x the amount of VRAM, if it didn't drive up cost significantly and games utilized it in a sensible way. It is possible to use more memory to add more detail in backgrounds etc., but this is the chicken-and-egg problem once again.
FordGT90ConceptPCIe 3.0 x8 + out of VRAM = not enough bandwidth to fall back on system RAM. If it had x16, it wouldn't be nearly as bad.
If the game is at the point where it's swapping heavily, even x16 wouldn't save it, as latency would also be a huge problem, forcing the framerate to a crawl.

x8 is enough for resource streaming, though, if it's done properly.
delshayMost users never get near maxing out their system RAM, so my personal opinion is VRAM should be treated the same.
Why should you pay for VRAM that you don't need?
Granted, you should have a little margin, but beyond that, what's the point?
delshayA user some time ago posted a screenshot of Resident Evil 2 needing nearly 14 GB of VRAM. There's a thread on this here on TPU.
There is a difference between allocating and actually needing; some games allocate huge buffers.
Also, was this a measurement of the game's own usage, or of total usage? Background tasks can consume a lot, especially Chrome.
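On the measurement question, NVML can at least separate the game's own allocation from everything else on the GPU; a minimal sketch (assumes an NVIDIA GPU and the nvidia-ml-py package):

```python
# Minimal sketch: device-wide VRAM usage vs. per-process usage via NVML.
# Assumes an NVIDIA GPU and the nvidia-ml-py package (import name: pynvml).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

# Device-wide counter: includes every app touching the GPU, not just the game.
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"total: {mem.used / 2**20:.0f} MiB used of {mem.total / 2**20:.0f} MiB")

# Per-process breakdown separates the game from background tasks like Chrome.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    mib = (proc.usedGpuMemory or 0) / 2**20  # may be None on some platforms
    print(f"pid {proc.pid}: {mib:.0f} MiB")

pynvml.nvmlShutdown()
```

Note the per-process figure is still what the process *allocated*, not what it strictly needs.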
ObscureAngelPTI would have to agree with AMD, but only for most future games or for people maxing out their games.
Buying extra VRAM for "future proofing" has rarely if ever paid off in the past. Generally speaking, the need for performance increases just as much (at least given how games are commonly balanced), so the card is obsolete long before you get to enjoy that extra VRAM for gaming.
I, for instance, have a GTX 680 4 GB in one machine and a GTX 1060 3 GB in another. Guess which one plays games better?
ARFI have the impression that Unreal Engine 5 will work and stream textures directly from the super duper fast SSD bypassing the heavy need for VRAM.
It will be interesting to see what they utilize it for.
But if a game is going to have ~50 GB of data per level (uncompressed), just a couple of such games will eat up an entire SSD.

Generally it would make more sense to store the assets with lossy compression at about a 1:10 ratio, which usually retains good enough detail for grainy textures, then decompress on the CPU and send uncompressed data to the GPU. The data needs to be prefetched and ready, though, but that's not a problem for a well-crafted game engine.
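A minimal sketch of that prefetch-and-decompress idea (an assumption of how it could look, not any real engine's code; zlib stands in for a real texture codec):

```python
# Sketch: decompress assets on a CPU worker ahead of need, hand the GPU
# uncompressed data. zlib is a stand-in; a real engine would use a texture
# codec and an actual GPU upload instead of print().
import queue
import threading
import zlib

prefetch_q: "queue.Queue[bytes]" = queue.Queue(maxsize=8)  # bounded budget

def prefetch_worker(compressed_assets):
    # Decompress ahead of time, while there is spare time/capacity.
    for blob in compressed_assets:
        prefetch_q.put(zlib.decompress(blob))  # blocks when the budget is full

# Hypothetical on-disk assets: three compressed blobs.
assets = [zlib.compress(b"texture-data" * 10_000) for _ in range(3)]
threading.Thread(target=prefetch_worker, args=(assets,), daemon=True).start()

for _ in assets:
    texture = prefetch_q.get()  # already decompressed and ready
    print(f"upload {len(texture)} bytes of uncompressed data to the GPU")
```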
Posted on Reply
#22
Vayra86
delshayI said 4 GB was dead some time ago here on TPU. In some games you can't max out the settings @1080p, even in some older titles. It leads to stutter, long pauses & all sorts of weird things, including crashes. With faster cards coming to the market you need 16 GB. The reason is that as cards get faster, users will push for 4K & above, because that is what users do. No point in having an ultra-fast card & running it @1080p. 4K will start to become standard, with higher resolutions being optional. HDMI 2.1 & the latest PCIe standard will help accelerate 4K as the new standard.
You started well... but then you jump straight to 16 GB. Shame! The market doesn't work like that. Here's a rewind of how 4 GB slowly turned obsolete: it started with 3 GB cards quickly falling off in comfortable use and in the performance charts. 3 GB meant cutting things down while the core could probably pull them easily. The same now applies to 4 GB. It's not a great balance unless you're at the absolute bottom end of performance... but then you won't run high/ultra either.

6 GB is now the safe zone, I'd say, for 1080p ultra, which even mid-range cards can easily push now. Ampere/RDNA2 might push that up to 8 GB by the end of their generation. At that point, 8 GB will still be the norm for any res up to 1080p, and even most of 1440p, which is still where the vast majority game - no matter how fat their GPUs are. Many, many years go by before these bars move, and it just follows the common denominator in both the games and the GPUs available. The new consoles and GPUs will push the envelope... but first you need content to utilize it, and only AFTER that is a new baseline established.

Turing is a good example of that. We're still not looking at a lot of RT content, but by the next gen the hardware will be widely available. These things happen slowly and with generational leaps.

Another thing: this isn't system RAM where you need to immediately double up either. 8 GB > 16 GB makes for an expensive GPU. We have 1080 Tis with 11 GB. There are MANY steps between 8 and 16 with coherent VRAM buses. 4K is also not standard, despite the push from manufacturers/the market. A few unhappy souls jumped into it and are now constantly tweaking to get the desired performance and pixel density :) It's not really gaining traction just yet.

@John Naylor are you watching this? ;) Live and learn...
Posted on Reply
#23
$ReaPeR$
Since AMD makes the console SoCs, they know what they're talking about, especially since the latest consoles rely heavily on memory.
Posted on Reply
#24
_Flare
The "can´t run DOOM Eternal in 1080p" thing is new and tested on a i9-9900K, where the missing PCIe4 isn´t a error in the statement.
The performance numbers beside that single statement are maked as:
Testing done by AMD performance labs 11/29/2019 on Ryzen 5 3600X, 16GB DDR4-3200MHz, ASROCK X570 TAICHI
and the whole context of the blogpost seems to only tackle 1080p

So only 8 lanes but PCIe4 instead of PCIe3, and still the crippling to only 8 lanes affects the 4GB card more than the 8GB card,
wich is obvious because the smaller overall buffer leads to the need of more frequent data movement even if all actually needed textures can be saved in the 4GB,
in the moment when new or refreshment of data is needed the PCIe x8 can be an early bottelneck.
On the 8GB card the driver and/or game can make use of the bigger buffer by transmitting some maybe needed data in advance when the PCIe interface has spare time/capacity, when the 4GB card is already saturated with the momentarily needed data.
This also underlines the sometimes obvious advantage of the cleverer nvidia driver when acting in or near VRAM saturation, AMD seems to have a not as clever data movement strategy sometimes.
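A toy model of that effect (all numbers are assumptions for illustration, using the x8 bandwidth figure from earlier in the thread):

```python
# Toy model (assumed numbers): how much of a scene's working set can't stay
# resident in VRAM and must repeatedly re-cross the PCIe bus.
WORKING_SET_GB = 6.0   # hypothetical assets a scene touches repeatedly
BUS_GB_S = 7.9         # ~PCIe 3.0 x8 per-direction bandwidth

def spill(vram_gb: float) -> float:
    """Data exceeding the resident budget that must be re-streamed."""
    return max(0.0, WORKING_SET_GB - vram_gb)

for vram_gb in (4.0, 8.0):
    s = spill(vram_gb)
    print(f"{vram_gb:.0f} GB card: ~{s:.1f} GB re-streamed "
          f"(~{s / BUS_GB_S * 1000:.0f} ms of bus time per full pass)")
```

With these assumed numbers the 4 GB card re-streams ~2 GB per pass while the 8 GB card re-streams nothing, which is the asymmetry _Flare describes.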
Posted on Reply
#25
ixi
efikkanBuying extra VRAM for "future proofing" has rarely if ever paid off in the past. Generally speaking, the need for performance increases just as much (at least given how games are commonly balanced), so the card is obsolete long before you get to enjoy that extra VRAM for gaming.
I, for instance, have a GTX 680 4 GB in one machine and a GTX 1060 3 GB in another. Guess which one plays games better?
Which one plays games better? Easy answer: RX 5700 XT :peace:.

Jokes aside, you're comparing a GPU several generations older against a newer one? :pimp: Not exactly a fair comparison. GTX 680 still chopping le lumber, impressive!
Posted on Reply