Wednesday, August 28th 2024

AMD RDNA 4 GPU Memory and Infinity Cache Configurations Surface

AMD's next-generation RDNA 4 graphics architecture will see the company focus on the performance segment of the market. The company is rumored not to be making an RDNA 4 successor to the enthusiast-segment "Navi 21" and "Navi 31" chips, and will instead focus on improving performance and efficiency in the highest-volume segments, much like the original RDNA-powered generation, the Radeon RX 5000 series. Two chips in the new RDNA 4 generation have hit the rumor mill: the "Navi 48" and the "Navi 44." The "Navi 48" is the faster of the two, powering the top SKUs in this generation, while the "Navi 44" is expected to be the mid-tier chip.

According to Kepler_L2, a reliable source of GPU leaks, and VideoCardz, which connected the tweet to the RDNA 4 generation, the top "Navi 48" silicon is expected to feature a 256-bit wide GDDR6 memory interface—so there's no upgrade to GDDR7. The top SKU based on this chip, the "Navi 48 XTX," will feature a memory speed of 20 Gbps, for 640 GB/s of memory bandwidth. The next-best SKU, codenamed "Navi 48 XT," will run a slightly lower 18 Gbps memory speed on the same bus width, for 576 GB/s of memory bandwidth. The "Navi 44" chip has a respectable 192-bit wide memory bus, and its top SKU will feature a 19 Gbps speed, for 456 GB/s of bandwidth on tap.
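For readers who want to sanity-check these figures, peak GDDR bandwidth is simply bus width (in bytes) multiplied by the per-pin data rate. A quick sketch in Python, using the rumored configurations above:

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * data rate in Gbps.
def bandwidth_gb_s(bus_width_bits: int, speed_gbps: float) -> float:
    return bus_width_bits / 8 * speed_gbps

# Rumored RDNA 4 configurations (bus width in bits, memory speed in Gbps):
skus = {
    "Navi 48 XTX": (256, 20),
    "Navi 48 XT": (256, 18),
    "Navi 44 top SKU": (192, 19),
}
for name, (bus, speed) in skus.items():
    print(f"{name}: {bandwidth_gb_s(bus, speed):.0f} GB/s")
```

This reproduces the 640, 576, and 456 GB/s figures quoted above.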
Another set of rumors from the same sources also points to the Infinity Cache sizes of these chips. "Navi 48" comes with 64 MB of Infinity Cache, which will be available on both the "Navi 48 XTX" and "Navi 48 XT," while the "Navi 44" silicon comes with 48 MB. We are hearing from multiple sources that the "Navi 4x" GPU family will stick to traditional monolithic silicon designs, and not venture into chiplet disaggregation like the company did with the "Navi 31" and the "Navi 32."

Yet another set of rumors, these from Moore's Law is Dead, suggests that AMD's design focus with RDNA 4 will be to excel at performance, performance-per-watt, and reducing the performance cost of ray tracing, in the segments of the market where NVIDIA moves the most volume, if not makes the most margin. MLID points to the likelihood of the ray tracing performance improvements riding on there being not one, but two ray accelerators per compute unit, with a greater degree of fixed-function acceleration for the ray tracing workflow (i.e., less of it will be delegated to the programmable shaders).
Sources: Kepler_L2 (memory speeds), Wccftech, VideoCardz (memory speeds), Kepler_L2 (cache size), VideoCardz (cache size), Moore's Law is Dead (YouTube)

104 Comments on AMD RDNA 4 GPU Memory and Infinity Cache Configurations Surface

#26
windwhirl
TheinsanegamerNBecause the 7800xt has roughly the same core config as a 6900xt, and near identical raytracing performance in most games.
It does not.

The RX 6900 XT has 5120 shaders

The RX 7800 XT has 3840 shaders.

The RX 7800 XT performs nearly the same as the RX 6900 XT but with just 75% of the shaders and roughly 80% of the power consumption.
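The shader ratio quoted above, and the per-shader throughput it implies, check out arithmetically (a sketch using only the shader counts cited; clock-speed differences explain much of the gap, not just architecture):

```python
# Shader counts from the spec pages cited in this thread.
shaders_6900xt, shaders_7800xt = 5120, 3840

shader_ratio = shaders_7800xt / shaders_6900xt
print(f"shader ratio: {shader_ratio:.0%}")  # 3840/5120 = 75%

# Near-equal performance from 75% of the shaders implies roughly
# 1/0.75 = 1.33x the effective work per shader.
print(f"implied per-shader uplift: {1 / shader_ratio:.2f}x")
```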
Posted on Reply
#27
hsew
DemonicRyzen6667900 xtx in rasterization matches +/- 5% 4090
7900 xtx enabled raytracing +/-5% to a 3090

Adding another ray tracing unit does not seem wise, as RDNA3 has a hard time filling up its improved RT units with its BVH traversal additions. Besides that, it doesn't even bother using its 2-issue-per-clock addition either unless specifically coded for it.
7900XTX is much closer to 4080S raster than it is to the 4090.

RT, it’s between the 3090 and 3090Ti, which have an RT gap of about 12% between them.
Posted on Reply
#28
Colddecked
TheinsanegamerNWhich rDNA3 are you referring to? Because the 7800xt has roughly the same core config as a 6900xt, and near identical raytracing performance in most games.
As the kids like to say, cap.
Posted on Reply
#29
Imsochobo
TheinsanegamerNrDNA3 was supposed to improve ray tracing, but it failed to do so, with RT perf per CU being almost identical to rDNA2.

Now we're seeing rDNA4 will have the same cache and memory config as rDNA3, with a similar core count.

Where is this improvement coming from?
If you look at the memory system vs NVIDIA's, yeah, you see NVIDIA does a lot more.
256-bit vs 384-bit.
64 MB vs 96 MB, L2 vs L3.
+ AMD has a big L2 of 6 MB (you know... that's as big as the last-level cache of a 3090 Ti...)

and large L1/L0, so they're using bandwidth terribly; it's the architecture being extremely narrow, overfed, and badly utilized.
Maybe getting a tile-based renderer working would help some scenarios; getting the double FP issue working better and in more situations; drivers; matrix hardware to help with certain tasks.

So many things. They're lagging behind so badly now, and yet they're trying desperately to make a WGP/dual-CU do everything without adding anything to them. It's like they're trying to push and hammer data (memory bandwidth through cache and bus) into a CU and magically thinking it'll work.
Posted on Reply
#30
evernessince
windwhirlIt does not.

The RX 6900 XT has 5120 shaders

The RX 7800 XT has 3840 shaders.

The RX 7800 XT performs nearly the same as the RX 6900 XT but with just 75% of the shaders and roughly 80% of the power consumption.
Indeed, here are links for people that want to confirm themselves:

www.techpowerup.com/gpu-specs/radeon-rx-7800-xt.c3839
www.techpowerup.com/gpu-specs/radeon-rx-6900-xt.c3481

It's not an earth-shattering increase in RT performance per core but it's certainly noteworthy.
Posted on Reply
#31
mkppo
TheinsanegamerNrDNA3 is an architecture, the 6900xt is a chip.

Which rDNA3 are you referring to? Because the 7800xt has roughly the same core config as a 6900xt, and near identical raytracing performance in most games.

Each CU contained raytracing hardware. more CUs means more raytracing performance. Per CU, rDNA3 had almost no performance improvements. Go look at TPU's 7800xt performance review, average of a whopping 3% faster in RT then the 6900xt.

www.techpowerup.com/review/amd-radeon-rx-7800-xt/34.html
Oh sorry, I didn't specify which one. In the first line I was referring to the 7800xt, which has 3840 shaders vs 5120 for the 6900xt, yet the 7800xt is 4.5% faster in TPU's RT chart at 1920x1080. The second line was the 7900xtx against the 3090.

I think you're confused about the shader counts of the RDNA3 lineup; there hasn't been a massive increase on that front.
Posted on Reply
#32
Dr. Dro
ARFThey should exit the markets, then:
1. Consumer Ryzen;
2. Consumer Radeon;
3. Semi-custom Playstation and Xbox chips.
No. 1. is profitable. 2... they're scaling back; still profitable, but easily the smallest of AMD's current businesses. 3. is contractual, not much money (NVIDIA turned it down), but it has served as a lifeline.

Also, regarding Mindfactory: it's a single store in the EU, an AMD-friendly market. I maintain my point.
R0H1TWhat feature set? Running local LLM with 8GB VRAM or fake frames with "fake" reflections :wtf:
Yet fake frames with fake reflections are some of the star features they've chosen to copy...
64KYou can't compare specs from Nvidia with AMD to understand a performance level comparison. The internet is full of benchmarks that make it clear that the 7900 XTX is the competitor to the 4080. You are going to have to do some research or you can't have a grasp of what is going on in the tech world and you just waste everyone's time trying to explain obvious things to you.
The RTX 4090 came out first and it was clear they could not match it. This "it's positioned against the 4080" together with a price reduction is literally the only thing AMD could do in that situation.

The fact that 7900 XTX is a much larger processor with a clearly higher bill of materials remains unchanged.
DemonicRyzen6667900 xtx in rasterization matches +/- 5% 4090
7900 xtx enabled raytracing +/-5% to a 3090

Adding another ray tracing unit does not seem wise, as RDNA3 has a hard time filling up its improved RT units with its BVH traversal additions. Besides that, it doesn't even bother using its 2-issue-per-clock addition either unless specifically coded for it.
The 7900 XTX does not come even close to the 4090 in raster performance. It's 1-4% vs. 4080, 1-2% vs 4080 Super.
Posted on Reply
#33
GoldenX
R0H1TWhat feature set? Running local LLM with 8GB VRAM or fake frames with "fake" reflections :wtf:
Like FSR3 with framegen, AFMF, and selling 8GB products at a higher price than RDNA2 while offering no performance benefit?

RDNA3 has been the most boring release AMD has ever done, not broken like RDNA1, but offering no plus over RDNA2 and only increasing prices enough to not make everyone angry.
Focusing on the mid and low end (the market they literally abandoned right now) when it's obvious they are in no place to match XX90 NVIDIA products in hardware or software at the moment is the right move.
Posted on Reply
#34
Marcus L
GoldenXLike FSR3 with framegen, AFMF, and selling 8GB products at a higher price than RDNA2 while offering no performance benefit?

RDNA3 has been the most boring release AMD has ever done, not broken like RDNA1, but offering no plus over RDNA2 and only increasing prices enough to not make everyone angry.
Focusing on the mid and low end (the market they literally abandoned right now) when it's obvious they are in no place to match XX90 NVIDIA products in hardware or software at the moment is the right move.
But hey, go and buy a 4070, which would have been a **60-class GPU, for $600 instead of the $300 it would have been before :kookoo: Or if you want the best in class, shell out $2K for a 4090. Who the fuck wants to spend used-car money on a fricken GPU that will become obsolete in 2 years, then get bummed for another $2K when the 5090 comes out? This is not the majority of buyers, and Nvidia is taking the piss. Now more people are waiting multiple gens to upgrade, because $500-$600 for shit mid-gen GPUs is not realistic in the real world. I have an RX 6800 I bought for £350, and nothing new comes close to it in terms of performance/£ even after 2 years of newer GPUs. I am going to hodl this until at least the Radeon 10 series or Nvidia 70**. They can charge their overpriced BS money for the same performance class all they want, both of them; I won't be spending a dime on either.
Posted on Reply
#35
kapone32
Dr. DroNo, 1. is profitable, 2... they're scaling back. Still profitable, but easily the smallest of AMD's current businesses... 3. is contractual, not much money (NVIDIA turned this down), but has served as a lifeline

Also regarding Mindfactory... Single store in the EU, AMD friendly market. I maintain it.



Yet fake frames with fake reflections are some of the star features they've chosen to copy...



The RTX 4090 came out first and it was clear they could not match it. This "it's positioned against the 4080" together with a price reduction is literally the only thing AMD could do in that situation.

The fact that 7900 XTX is a much larger processor with a clearly higher bill of materials remains unchanged.



The 7900 XTX does not come even close to the 4090 in raster performance. It's 1-4% vs. 4080, 1-2% vs 4080 Super.
Yep and costs double too.
Posted on Reply
#36
sepheronx
Marcus LBut hey, go and buy a 4070 which would have been an **60 class of GPU for $600 instead of the $300 it would have been before :kookoo: or if you want the best in class shell out $2K for a 4090, who the fuck wants to spend used car money on a fricken GPU that will become obsolete in 2 years and get bummed for another $2K when the 5090 comes out? this is not the majority of buyers, and Nvidia is taking the piss, now more people are waiting multiple gens to upgrade cause $5-$600 for shit mid-gen GPU's is not realistic in the real world, I have a RX 6800 I bought for £350 and nothing new comes close to it in terms of performance/£ even after 2 years of newer GPU's, I am going to hodl this until at least Radeon 10 series or Nvidia 70**, they can charge their overpriced BS money for the same performance class all they want, both of them, I won't be spending a dime on either
We got members on this forum from Vietnam who will purchase 1 of these high end GPU's every release. Unless he ran out of organs to pay for it.
Posted on Reply
#37
Dr. Dro
Marcus LBut hey, go and buy a 4070 which would have been an **60 class of GPU for $600 instead of the $300 it would have been before :kookoo: or if you want the best in class shell out $2K for a 4090, who the fuck wants to spend used car money on a fricken GPU that will become obsolete in 2 years and get bummed for another $2K when the 5090 comes out? this is not the majority of buyers, and Nvidia is taking the piss, now more people are waiting multiple gens to upgrade cause $5-$600 for shit mid-gen GPU's is not realistic in the real world, I have a RX 6800 I bought for £350 and nothing new comes close to it in terms of performance/£ even after 2 years of newer GPU's, I am going to hodl this until at least Radeon 10 series or Nvidia 70**, they can charge their overpriced BS money for the same performance class all they want, both of them, I won't be spending a dime on either
1. And have you asked yourself, for one single moment, why this is so? (hint: it isn't Nvidia's greed)
2. Best-selling GPU model of the generation. It outsold the entire Radeon stack. Demand is so extreme that there are shortages for some of its board components.
3. The RX 6800 is 4 years old, not two. So you got a previous-generation card at a discount; that doesn't really count. MSRP for MSRP (they launched at roughly the same cost), even with all the cuts in the 4070's configuration, the 4070 comes out far ahead as a product: it is faster, its drivers are of superior quality (a studio branch for productivity is also offered at no extra cost), and it is more power efficient. In 2 years it'll be the same age and just as devalued as your RX 6800 is today. In other words, there is nothing special or praiseworthy about your graphics card; it's just an earlier-generation card past its prime that obviously still does its job, as it always has.
kapone32Yep and costs double too.
Supply and demand
Posted on Reply
#38
eidairaman1
The Exiled Airman
64KYou can't compare specs from Nvidia with AMD to understand a performance level comparison. The internet is full of benchmarks that make it clear that the 7900 XTX is the competitor to the 4080. You are going to have to do some research or you can't have a grasp of what is going on in the tech world and you just waste everyone's time trying to explain obvious things to you.
Just ignore the troll
Posted on Reply
#39
DaemonForce
Vya DomusPolaris based cards were one of their most successful products in this last decade lol.
This is the unfortunate reality. Every time I have checked out the GPU list here for the past 5 years or more: the 580 is right there.

Same story in the Steam Hardware Survey: it's quite the scroll to find it, but damn.

And of course, any time I have some kind of graphics problem and need to default to something that just works: BAM!

This thing is everywhere. It's so everywhere that it became the new religion on the Chinese chopping block of chips for a number of years, and it's now so long in the tooth that we're only just starting to see the same happen to other cards, either because the performance now sucks, or DX feature lockout is kicking people out of the gaming hobby, or the crap encoder is finally forcing everyone out of service.
Marcus LBut hey, go and buy a 4070 which would have been an **60 class of GPU for $600 instead of the $300 it would have been before :kookoo: or if you want the best in class shell out $2K for a 4090, who the fuck wants to spend used car money on a fricken GPU that will become obsolete in 2 years and get bummed for another $2K when the 5090 comes out?
This is the hard reality. At the end of the day, people don't give a damn about GPUs. They just want a display adapter that can play the game. That means not playing around with voltage settings for weeks trying to find the sweet spot where it doesn't crash (if there is one); not struggling with balancing fans and noise when tessellation breaks and turns one squirrely mesh into pure nightmare fuel until reboot; not dicking around with a bunch of Bitcoin-miner market bullshit and playing momentum monkey with drops just trying to acquire ONE of these cards. And that's all assuming they're not fighting to diagnose one dud after another, like I had to deal with on the 7000 series. What I'm saying is there isn't a good price.

The same fiasco has just kicked off with the 5000 series and will happen to the 6000 series very soon. I'm not saying it hasn't already; I'm talking MASS SCALE. This is the only thing keeping me in a loop of second-guessing, which keeps me out of the second-hand market for chips. It's bad enough that every delivery of 7000-series cards seems to be a bomb. I can only hope the 8000 series arrives as expected (functional) and delivers in good enough quantity and at a good price, because it took several months to find anything good in previous generations and I still don't have a card.

The performance should be worlds better than what I have now. That's the bar. Don't really care where they fit between 7000 series SKUs, just that enough of them hit the shelves before all the scalpers figure out AMD pulled a massive disinfo campaign to finally get these cards in the hands of real customers.
Posted on Reply
#40
Solaris17
Super Dainty Moderator
Does this mean they aren't coming out with an 8th-gen counterpart to the 7900 XTX, or simply that they don't plan on focusing on increased performance as much?
Posted on Reply
#41
mkppo
GoldenXLike FSR3 with framegen, AFMF, and selling 8GB products at a higher price than RDNA2 while offering no performance benefit?

RDNA3 has been the most boring release AMD has ever done, not broken like RDNA1, but offering no plus over RDNA2 and only increasing prices enough to not make everyone angry.
Focusing on the mid and low end (the market they literally abandoned right now) when it's obvious they are in no place to match XX90 NVIDIA products in hardware or software at the moment is the right move.
RDNA3 is no jump over RDNA2? A quick look at the 7900GRE review shows the 7900xtx being 43% faster than the 6900xt at 2560x1440. Hell, I was super tempted to switch over from a 3090 because it's just that much faster but the lack of side ported waterblocks was the only deterrent.

I also tend to play some Warzone nowadays and for whatever reason RDNA3 is stupid fast in that game, faster even than the 4090 at most resolutions.

I would say being 43% faster is a plus..
Solaris17Does this mean they arent coming out with an 8th gen counterpart to the 7900XTX? or simply that they dont plan on focusing on increased performance as much?
I think they will at least match the 7900xtx but most seem to think otherwise. I guess we'll see soon enough
Posted on Reply
#42
R0H1T
Dr. DroYet fake frames with fake reflections are some of the star features they've chosen to copy.
Copy what, exactly? There's certainly feature parity between major vendors on the good/important(?) features, and I doubt you'd go anywhere if that weren't the case. Did Intel copy AMD when they went the x64 route, with the IMC, chiplets, and pretty surely a million other things? Nvidia with Mantle/Vulkan, shared memory, Hairworks et al?
Posted on Reply
#43
ARF
64KYou can't compare specs from Nvidia with AMD to understand a performance level comparison. The internet is full of benchmarks that make it clear that the 7900 XTX is the competitor to the 4080.
Nonsense. The 4080 uses DLSS to artificially lift its framerates, and that's why its performance looks close. It's an oranges-vs-bananas comparison.

The real competitor for RTX 4080 is the RX 7900 GRE.

www.techpowerup.com/gpu-specs/radeon-rx-7900-gre.c4166
www.techpowerup.com/gpu-specs/geforce-rtx-4080.c3888
mkppoRDNA3 is no jump over RDNA2
You must compare shader-to-shader. 6144 vs 5120.
Posted on Reply
#44
Tomorrow
TheinsanegamerNIt sure sounds to me like AMD is gonna stick to 7800xt performance and leave high end buyers out to dry.
Most people don't need another $1,000+ card. Most people want previous-gen high-end performance at a lower price.
ARFWhat was the AMD's outlook last time when they tried this with the RX 580 and RX 5700 XT?
RX 6000 series that followed RX 5000 was very successful. The outlook is that RX 9000 series will do the same after RX 8000.
Vya DomusPolaris based cards were one of their most successful products in this last decade lol.
As were RX 6000 series that followed the same midrange RX 5000 series cards that RX 8000 seems to be.
Dr. DroThe fact that 7900 XTX is a much larger processor with a clearly higher bill of materials remains unchanged.
I assume you mean vs 4080(S)? You have to also remember that Nvidia manufactures their dies on a more expensive node. I would not confidently say that AMD's MCM approach had higher BoM cost.
Dr. DroThe 7900 XTX does not come even close to the 4090 in raster performance. It's 1-4% vs. 4080, 1-2% vs 4080 Super.
Considering it's 18% slower according to TPU's chart while costing only ~60% as much as the 4090, I'd say it's not too bad. The loss in RT performance is much bigger, though; in raster, in terms of price/performance, it's actually better than the 4090:
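That price/performance claim can be roughed out from launch MSRPs ($1,599 vs $999) and the ~18% deficit cited above; treat it as a back-of-envelope sketch, since street prices move around:

```python
# Launch MSRPs in USD and the ~18% overall raster deficit cited above.
price_4090, price_7900xtx = 1599, 999
perf_4090 = 1.0
perf_7900xtx = 1.0 / 1.18  # ~18% slower overall

perf_per_dollar_4090 = perf_4090 / price_4090
perf_per_dollar_xtx = perf_7900xtx / price_7900xtx

ratio = perf_per_dollar_xtx / perf_per_dollar_4090
print(f"7900 XTX raster perf per dollar vs 4090: {ratio:.2f}x")
```

That works out to roughly 1.36x the raster performance per dollar at MSRP.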

Posted on Reply
#45
AusWolf
Good. Add improved idle and video playback power consumption into the mix, and they've got a buyer.
Posted on Reply
#46
mkppo
ARFYou must compare shader-to-shader. 6144 vs 5120.
Sure, 20% more shaders for 43% performance gain?
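Worked out, that trade looks like this (shader counts from the spec pages cited earlier; the 43% figure is from the review discussion above, so treat the split as illustrative):

```python
# 6900 XT vs 7900 XTX: shader-count scaling vs overall performance scaling.
shaders_6900xt, shaders_7900xtx = 5120, 6144
perf_gain = 1.43  # 7900 XTX vs 6900 XT at 2560x1440, per the review cited

shader_gain = shaders_7900xtx / shaders_6900xt  # 1.20x
per_shader_gain = perf_gain / shader_gain
print(f"gain per shader: {per_shader_gain:.2f}x")  # ~1.19x
```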
Posted on Reply
#47
londiste
ARFNonsense. The 4080 uses DLSS to artificially lift the framerates, and look that its performance is close. It is oranges vs. bananas comparison.

The real competitor for RTX 4080 is the RX 7900 GRE.

www.techpowerup.com/gpu-specs/radeon-rx-7900-gre.c4166
www.techpowerup.com/gpu-specs/geforce-rtx-4080.c3888
Really? Take a look at any TPU GPU review performance charts, for example:
XFX Radeon RX 7900 XTX Magnetic Air Review - Relative Performance | TechPowerUp

4080 is 23%, 28% and 35% faster than 7900GRE at 1080p, 1440p and 2160p respectively.
On the other hand, 7900XTX is 2%, 3% and 5% faster than 4080 at 1080p, 1440p and 2160p respectively.
Edit: 4090 is 13%, 19% and 23% faster than 7900XTX in the same graphs.

DLSS - or FSR for that matter - works on top of that.
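Those pairwise deltas chain multiplicatively; normalizing to the 7900 GRE at 2160p (using the 35%, 5%, and 23% figures above) gives a rough single ladder:

```python
# Relative raster performance at 2160p, normalized to the 7900 GRE = 1.00.
gre = 1.00
rtx_4080 = gre * 1.35       # 4080 is 35% faster than the GRE
xtx_7900 = rtx_4080 * 1.05  # 7900 XTX is 5% faster than the 4080
rtx_4090 = xtx_7900 * 1.23  # 4090 is 23% faster than the 7900 XTX

for name, perf in [("7900 GRE", gre), ("RTX 4080", rtx_4080),
                   ("7900 XTX", xtx_7900), ("RTX 4090", rtx_4090)]:
    print(f"{name}: {perf:.2f}x")
```

That puts the 4090 at roughly 1.74x the 7900 GRE in this set of charts.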
Posted on Reply
#48
AusWolf
ARFNonsense. The 4080 uses DLSS to artificially lift the framerates, and look that its performance is close. It is oranges vs. bananas comparison.

The real competitor for RTX 4080 is the RX 7900 GRE.

www.techpowerup.com/gpu-specs/radeon-rx-7900-gre.c4166
www.techpowerup.com/gpu-specs/geforce-rtx-4080.c3888



You must compare shader-to-shader. 6144 vs 5120.
The only competition that matters is in price. With that in mind, the only true competition to the 4080 is the 7900 XTX. If performance is on par within the same price bracket, then we've got some fair competition. If it isn't, then one is a better buy than the other. It's that simple. Shaders, memory bus and any other arbitrary number don't matter when one is talking about competition.
Posted on Reply
#49
londiste
There is a pretty good RDNA2 > RDNA3 comparison out there - 6800 vs 7800XT, both are 60CU, 256-bit.
What complicates things is the dual-issue ALU arrangement, although it seems to have helped even less than the equivalent change did in Nvidia's case.
There is less Infinity Cache, but given the evolution/optimization of cache sizes in both AMD's and Nvidia's newer generations, this has negligible impact.
Other than that, clocks are up 6-15% and VRAM bandwidth is up 22%.

Based on the latest TPU GPU review, the 7800XT is overall 22-23% faster than the 6800, which basically matches expectations based on specs.
In RT it's 27-34% faster, with the gap increasing with resolution. There's a nice little jump there; AMD clearly did improve RT performance.
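A crude way to square those specs with the measured result (illustrative only: real scaling depends on workload, and a geometric mean is just one way to blend the two uplifts):

```python
# Same 60 CUs on both cards, so expected gains come from clocks (~6-15%)
# and memory bandwidth (+22%). Blend the two spec uplifts crudely.
clock_uplift = 1.15       # upper end of the quoted 6-15% range
bandwidth_uplift = 1.22

estimate = (clock_uplift * bandwidth_uplift) ** 0.5  # geometric mean
print(f"spec-based estimate: {estimate:.2f}x faster")

measured = 1.225  # TPU: 7800 XT is ~22-23% faster than the RX 6800 overall
print(f"measured: {measured:.2f}x faster")
```

The spec-based estimate lands in the same ballpark as the measured 22-23%, which supports the "matches expectations" reading.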
Posted on Reply