Wednesday, April 24th 2024

AMD's RDNA 4 GPUs Could Stick with 18 Gbps GDDR6 Memory

Today, we have the latest round of leaks suggesting that AMD's upcoming RDNA 4 graphics cards, expected to launch as the RX 8000 series, might continue to rely on GDDR6 memory modules. According to Kepler on X, AMD's next-generation GPUs are expected to feature 18 Gbps GDDR6 memory, making RDNA 4 the fourth consecutive RDNA architecture to employ this memory standard. While GDDR6 does not offer the same bandwidth as the newer GDDR7 standard, this decision does not necessarily imply that RDNA 4 GPUs will be slow performers. AMD's choice to stick with GDDR6 is likely driven by factors such as meeting specific memory bandwidth targets and keeping PCB designs cost-effective. However, if the rumor of 18 Gbps GDDR6 memory proves accurate, it would represent a slight step back from the 18-20 Gbps GDDR6 used in AMD's current RDNA 3 offerings, such as the RX 7900 XT and RX 7900 XTX GPUs.

AMD's first-generation RDNA used GDDR6 at 12-14 Gbps, RDNA 2 came with GDDR6 at 14-18 Gbps, and the current RDNA 3 uses 18-20 Gbps GDDR6. Without a jump to a newer memory generation, the leak suggests speeds will top out at 18 Gbps. However, it is crucial to remember that leaks should be treated with skepticism, as AMD's final memory choices for RDNA 4 could change before the official launch. The decision to use GDDR6 versus GDDR7 could have significant implications in the upcoming battle between AMD's, NVIDIA's, and Intel's next-generation GPU architectures. If AMD indeed opts for GDDR6 while NVIDIA pivots to GDDR7 for its "Blackwell" GPUs, it could create a disparity in memory bandwidth between the competing products. All three major GPU manufacturers—AMD, NVIDIA, and Intel with its "Battlemage" architecture—are expected to unveil their next-generation offerings in the fall of this year. As we approach these highly anticipated releases, more concrete details on specifications and performance capabilities will emerge, providing a clearer picture of the competitive landscape.
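To put these memory speeds in perspective, theoretical peak bandwidth is simply the per-pin data rate multiplied by the bus width. A minimal sketch in Python; the RDNA 3 figures match the shipping RX 7900 XT/XTX, while the RDNA 4 bus widths are purely illustrative assumptions, not leaked specifications:

```python
def gddr_bandwidth(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Theoretical peak memory bandwidth in GB/s.

    Each pin transfers `data_rate_gbps` gigabits per second; multiply by
    the bus width (in bits) and divide by 8 to convert bits to bytes.
    """
    return data_rate_gbps * bus_width_bits / 8

# Known RDNA 3 parts for reference:
print(gddr_bandwidth(20, 384))  # RX 7900 XTX (20 Gbps, 384-bit): 960.0 GB/s
print(gddr_bandwidth(20, 320))  # RX 7900 XT (20 Gbps, 320-bit): 800.0 GB/s

# Hypothetical RDNA 4 configurations at the rumored 18 Gbps
# (bus widths here are illustrative assumptions):
print(gddr_bandwidth(18, 256))  # 576.0 GB/s
print(gddr_bandwidth(18, 192))  # 432.0 GB/s
```

The takeaway: at a fixed bus width, dropping from 20 Gbps to 18 Gbps costs 10% of peak bandwidth, which AMD could offset with larger caches or wider buses.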
Sources: @Kepler_L2 (on X), via Tom's Hardware

114 Comments on AMD's RDNA 4 GPUs Could Stick with 18 Gbps GDDR6 Memory

#101
Nordic
evernessince: Most gamers are in fact not on the ray tracing bandwagon. Both HWUB and GamersNexus did a poll, and less than 30% of enthusiasts consider ray tracing an essential factor when purchasing a video card.
Would you be willing to provide a link to those polls?
Posted on Reply
#102
evernessince
Nordic: Would you be willing to provide a link to those polls?
Not the exact polls I was referencing but I was able to pull up some that I've voted on in the past via my history:






www.youtube.com/channel/UCI8iQa1hv7oV_Z8D35vVuSg/community?lb=UgkxUw1eV4quZuFTSY7v1jJeDNdIVtNKOpyN
www.youtube.com/channel/UCI8iQa1hv7oV_Z8D35vVuSg/community?lb=UgkxwOHbJ6uIcR9EyUrPdhxw5F2_A3FMJFAi
www.youtube.com/channel/UChIs72whgZI9w6d6FhwGGHA/community?lb=UgkxABGVuOK5yBlV-rFQROFWNwfSIQ3cIbOD

According to the GN poll 31% of people didn't have an RT capable card and a further 41% didn't even bother enabling it. This is among the enthusiast community, the numbers are going to be much more against RT in the broader PC gaming population.
Posted on Reply
#103
AusWolf
64K: It depends on what you are looking for a GPU to be able to perform well on. Most gamers today are on the ray tracing bandwagon, and the NVIDIA GPUs are superior to the AMD GPUs in that respect.
If by superior, you mean slightly less shit, but still pretty much unusable, then you're right. Nothing except for the 4090 can ray trace properly, so arguing who has the upper hand is pointless media anger.
Posted on Reply
#104
Makaveli
64K: Most gamers today are on the ray tracing bandwagon
You cannot make this claim without numbers to back it up.

Who are "most gamers"? Do you mean mobile gamers? Esports gamers?
Posted on Reply
#105
Panther_Seraphin
Ray tracing is definitely still a "niche" area in gaming. Just have a quick gander at the Steam charts: of the top 9 NVIDIA GPUs on the charts, at most 2 have a decent chance of regularly running ray tracing of any form (the 3070 and 4070, in 9th and 5th place). Two of them cannot run ray tracing of any kind without turning into PowerPoint Simulator 2024 (the 1650 and 1060), and the rest would be extremely crippled in performance by enabling it.



So of the ~34% of gamers surveyed, 6.5% have a good chance of regularly running ray tracing. Still not amazing market penetration for a tech that is now nearly 6 years old.
Now I will admit it's only with the 40xx and upcoming 50xx series that we have really had the performance in RT to turn it on without asking what major downgrades the rest of the settings need to make it work regularly in AAA games.
I would argue that most of the people who have the capability turn it on to see what it looks like, but then either turn it right down or off so as not to impact performance.

Also, all the hype around DLSS/FSR/XeSS etc. is just a complete crutch for game developers to skip optimising their games to the same level as before, removing the need for optimisation tweaks in patches so they can focus on DLC/skins/microtransactions from day 0. Like NVIDIA coming out with marketing materials going "LOOK AT THIS 5X PERFORMANCE UPLIFT GEN TO GEN" and then in 0.5 font "when using DLSS 3.5".
It's like a car manufacturer coming out and saying "We have now released a 155 mph capable car while maintaining 100+ MPG on your daily trips", and then you read the fine print to notice it's while traversing the side of Mount Everest for the gravity assist or some other bullshit.
Posted on Reply
#106
Bagerklestyne
Panther_Seraphin: Also, all the hype around DLSS/FSR/XeSS etc. is just a complete crutch for game developers to skip optimising their games to the same level as before, removing the need for optimisation tweaks in patches so they can focus on DLC/skins/microtransactions from day 0. Like NVIDIA coming out with marketing materials going "LOOK AT THIS 5X PERFORMANCE UPLIFT GEN TO GEN" and then in 0.5 font "when using DLSS 3.5".
It's like a car manufacturer coming out and saying "We have now released a 155 mph capable car while maintaining 100+ MPG on your daily trips", and then you read the fine print to notice it's while traversing the side of Mount Everest for the gravity assist or some other bullshit.
Yeah I think they managed the speed and mileage by pushing it out the back of a C-130 @ 10000 feet.

You're not wrong about it being niche, of the people I've helped build systems for in the last 12 months, I've had to explain ray tracing to them in about 2/3 of the cases.

I really want to know what AMD (and this forum) thinks will tempt me to 'upgrade' from a 6800 XT.
Posted on Reply
#107
AusWolf
Bagerklestyne: I really want to know what AMD (and this forum) thinks will tempt me to 'upgrade' from a 6800 XT.
I think there's zero reason to upgrade from a 6800 XT. ;)

The only reason I'm thinking about replacing my 7800 XT is that it eats almost as much power playing a video as my CPU does under full load. Its performance is more than enough, and I don't care about RT or DLSS.
Posted on Reply
#108
Bagerklestyne
AusWolf: I think there's zero reason to upgrade from a 6800 XT. ;)

The only reason I'm thinking about replacing my 7800 XT is that it eats almost as much power playing a video as my CPU does under full load. Its performance is more than enough, and I don't care about RT or DLSS.
Same boat re DLSS/RT.

I've said this before, but if it isn't 7900 XTX-ish performance with better power consumption and at a healthy price, I won't be interested. Scarily, I could even consider moving to team green, which I'd prefer not to do.
Posted on Reply
#109
Vayra86
evernessince: Not the exact polls I was referencing but I was able to pull up some that I've voted on in the past via my history:






www.youtube.com/channel/UCI8iQa1hv7oV_Z8D35vVuSg/community?lb=UgkxUw1eV4quZuFTSY7v1jJeDNdIVtNKOpyN
www.youtube.com/channel/UCI8iQa1hv7oV_Z8D35vVuSg/community?lb=UgkxwOHbJ6uIcR9EyUrPdhxw5F2_A3FMJFAi
www.youtube.com/channel/UChIs72whgZI9w6d6FhwGGHA/community?lb=UgkxABGVuOK5yBlV-rFQROFWNwfSIQ3cIbOD

According to the GN poll 31% of people didn't have an RT capable card and a further 41% didn't even bother enabling it. This is among the enthusiast community, the numbers are going to be much more against RT in the broader PC gaming population.
Ironically, the moment RT works best is when the performance tax is low enough for people to just 'turn it on', and when it ubiquitously looks better despite that performance cost. Neither is the case today. Yes, sometimes RT makes a marked difference. But it is highly scene dependent, to the point where it's not much different from looking at hand-crafted scenes with well placed rasterized/dynamic lighting. RT doesn't do much that is truly new or couldn't be done already, if used sparingly. Now you can splash RT all over the scene and still waste tons of resources, but the specific hardware makes it somewhat manageable.

We've still got a long way to go, and brute forcing the whole scene like they're doing now isn't the way; the real deal will come from engine developments like Nanite that are software based before being hardware specific. In the end, hand crafting will still matter, much as it did before. Lots of new games on new engines don't look all that great. There's no love in them, just high-fidelity assets. That alone won't make the scene or the game. Evidently, graphics don't make the game.
Posted on Reply
#110
rv8000
Vayra86: Ironically, the moment RT works best is when the performance tax is low enough for people to just 'turn it on', and when it ubiquitously looks better despite that performance cost. Neither is the case today. Yes, sometimes RT makes a marked difference. But it is highly scene dependent, to the point where it's not much different from looking at hand-crafted scenes with well placed rasterized/dynamic lighting. RT doesn't do much that is truly new or couldn't be done already, if used sparingly. Now you can splash RT all over the scene and still waste tons of resources, but the specific hardware makes it somewhat manageable.

We've still got a long way to go, and brute forcing the whole scene like they're doing now isn't the way; the real deal will come from engine developments like Nanite that are software based before being hardware specific. In the end, hand crafting will still matter, much as it did before. Lots of new games on new engines don't look all that great. There's no love in them, just high-fidelity assets. That alone won't make the scene or the game. Evidently, graphics don't make the game.
Most RT implementations aren’t path tracing, which is still too demanding for current hardware. This results in most RT games being gimmick settings while murdering your performance and providing little to no visual benefit; there are a select handful of games where the visual impact is actually enough to justify the performance hit - absolutely a niche.

Ray Traced effects and RTX are almost as bad as the PhysX nonsense. Modern hardware and consoles alike simply aren’t powerful enough for real “ray” (path) tracing.
Posted on Reply
#111
THU31
rv8000: Most RT implementations aren’t path tracing, which is still too demanding for current hardware. This results in most RT games being gimmick settings while murdering your performance and providing little to no visual benefit; there are a select handful of games where the visual impact is actually enough to justify the performance hit - absolutely a niche.

Ray Traced effects and RTX are almost as bad as the PhysX nonsense. Modern hardware and consoles alike simply aren’t powerful enough for real “ray” (path) tracing.
I'm currently playing Hogwarts Legacy. Runs really well without RT, no stutter or big drops outside of a few cutscenes. 4K60 DLSS Quality with high settings.
I tried RT AO and reflections. Had to drop to DLSS Performance, but even with that the game was hitching a lot. Doesn't seem VRAM related, as the usage was between 10-11 GB. Could be CPU related, but that's still just a bad implementation (all CPUs have terrible 1% lows in this game with RT).

RT reflections look good, I love that they don't disappear. But RTAO doesn't look much better, there are still some disocclusion artifacts. It's absolutely not worth the hit to performance.

RT is pretty much pointless on anything but the top-end GPU. I can't imagine what happens when the 5090 comes out and they start "optimizing" games for that. But it is kind of like future-proofing: when you play the game 5 or 10 years later, you can enable all the eye candy.
But I would still rather get more shaders instead of RT cores. I do want DLSS, though; it's extremely useful on a 4K TV.
Posted on Reply
#112
AusWolf
THU31: I'm currently playing Hogwarts Legacy. Runs really well without RT, no stutter or big drops outside of a few cutscenes. 4K60 DLSS Quality with high settings.
I tried RT AO and reflections. Had to drop to DLSS Performance, but even with that the game was hitching a lot. Doesn't seem VRAM related, as the usage was between 10-11 GB. Could be CPU related, but that's still just a bad implementation (all CPUs have terrible 1% lows in this game with RT).

RT reflections look good, I love that they don't disappear. But RTAO doesn't look much better, there are still some disocclusion artifacts. It's absolutely not worth the hit to performance.

RT is pretty much pointless on anything but the top-end GPU. I can't imagine what happens when the 5090 comes out and they start "optimizing" games for that. But it is kind of like future-proofing: when you play the game 5 or 10 years later, you can enable all the eye candy.
But I would still rather get more shaders instead of RT cores. I do want DLSS, though; it's extremely useful on a 4K TV.
I'm playing Hogwarts Legacy, too. I haven't encountered any hitching with RT on, but it tanks performance while providing no visual upgrade that I could notice. If I have to enable FSR to be able to play with a few extra effects that I can't notice anyway, then I say no thanks.
Posted on Reply
#113
THU31
AusWolf: I'm playing Hogwarts Legacy, too. I haven't encountered any hitching with RT on, but it tanks performance while providing no visual upgrade that I could notice. If I have to enable FSR to be able to play with a few extra effects that I can't notice anyway, then I say no thanks.
The 7000X3D chips seem to be the only CPUs that can run this game with RT, so I'm not surprised you don't get hitching. ;)

Posted on Reply
#114
AusWolf
THU31: 7000X3D seem to be the only CPUs that can run this game with RT, so I'm not surprised you don't get hitching. ;)

I didn't think RT was so CPU intensive! :eek:

Edit: Which video is your screenshot from?
Posted on Reply