Wednesday, August 28th 2024

AMD RDNA 4 GPU Memory and Infinity Cache Configurations Surface

AMD's next-generation RDNA 4 graphics architecture will see the company focus on the performance segment of the market. AMD is rumored not to be developing an RDNA 4-based successor to the enthusiast-segment "Navi 21" and "Navi 31" chips, and will instead focus on improving performance and efficiency in the highest-volume segments, much like the original RDNA-powered generation, the Radeon RX 5000 series. Two chips in the new RDNA 4 generation have hit the rumor mill: the "Navi 48" and the "Navi 44." The "Navi 48" is the faster of the two, powering the top SKUs in this generation, while the "Navi 44" is expected to be the mid-tier chip.

According to Kepler_L2, a reliable source of GPU leaks, and VideoCardz, which connected the tweet to the RDNA 4 generation, the top "Navi 48" silicon is expected to feature a 256-bit wide GDDR6 memory interface—so there is no upgrade to GDDR7. The top SKU based on this chip, the "Navi 48 XTX," will feature a memory speed of 20 Gbps, for 640 GB/s of memory bandwidth. The next-best SKU, codenamed "Navi 48 XT," will feature a slightly lower 18 Gbps memory speed on the same bus width, for 576 GB/s of memory bandwidth. The "Navi 44" chip has a respectable 192-bit wide memory bus, and its top SKU will feature a 19 Gbps memory speed, for 456 GB/s of bandwidth on tap.
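
Those bandwidth figures follow directly from bus width and per-pin data rate; here is a minimal Python sketch of the arithmetic, using the rumored (and unconfirmed) configurations from the leak:

```python
def peak_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Rumored RDNA 4 configurations (not confirmed by AMD)
for name, bus, rate in [("Navi 48 XTX", 256, 20), ("Navi 48 XT", 256, 18), ("Navi 44 top SKU", 192, 19)]:
    print(f"{name}: {peak_bandwidth_gbps(bus, rate):.0f} GB/s")
# Navi 48 XTX: 640 GB/s, Navi 48 XT: 576 GB/s, Navi 44 top SKU: 456 GB/s
```
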
Another set of rumors from the same sources also point to the Infinity Cache sizes of these chips. "Navi 48" comes with 64 MB of it, which will be available on both the "Navi 48 XTX" and "Navi 48 XT," while the "Navi 44" silicon comes with 48 MB of it. We are hearing from multiple sources that the "Navi 4x" GPU family will stick to traditional monolithic silicon designs, and not venture out into chiplet disaggregation like the company did with the "Navi 31" and the "Navi 32."

Yet another set of rumors, these from Moore's Law is Dead, says that AMD's design focus with RDNA 4 is to ace performance, performance-per-Watt, and the performance cost of ray tracing in the market segments where NVIDIA moves the most volume, if not earns the highest margins. MLID points to the likelihood of the ray tracing performance improvements coming from not one, but two ray accelerators per compute unit, with a greater degree of fixed-function acceleration for the ray tracing workflow (i.e. less of it will be delegated to the programmable shaders).
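
To illustrate why doubling the per-CU ray accelerators matters, here is a deliberately simplified, hypothetical peak-throughput estimate; the CU count, clock, and tests-per-clock figures below are illustrative placeholders, not leaked RDNA 4 specs, and real-world gains also depend on BVH traversal, shader work, and memory behavior:

```python
def peak_ray_tests_per_second(compute_units: int, accelerators_per_cu: int,
                              tests_per_clock: int, clock_ghz: float) -> float:
    """Toy upper bound on intersection tests per second; ignores everything except raw unit count."""
    return compute_units * accelerators_per_cu * tests_per_clock * clock_ghz * 1e9

# Hypothetical 64-CU part at 2.5 GHz with 4 box tests per accelerator per clock
print(peak_ray_tests_per_second(64, 1, 4, 2.5))  # one ray accelerator per CU
print(peak_ray_tests_per_second(64, 2, 4, 2.5))  # rumored two per CU: exactly double the peak
```
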
Sources: Kepler_L2 (memory speeds), Wccftech, VideoCardz (memory speeds), Kepler_L2 (cache size), VideoCardz (cache size), Moore's Law is Dead (YouTube)

104 Comments on AMD RDNA 4 GPU Memory and Infinity Cache Configurations Surface

#51
AusWolf
londisteThere is a pretty good RDNA2 > RDNA3 comparison out there - 6800 vs 7800XT, both are 60CU, 256-bit.
What complicates things is the double-ALU thing, although it seems to have helped even less than the same approach did in Nvidia's case.
There is less Infinity Cache, but given how both AMD and Nvidia have evolved/optimized its size in newer generations, this has negligible impact.
Other than that, clocks are up 6-15% and VRAM bandwidth is up 22%.

Based on the last TPU GPU review, the 7800XT is overall 22-23% faster than the 6800, which basically matches the expectations based on specs.
In RT it's 27-34%, with the gap increasing with resolution. There is a nice little jump there - AMD clearly did improve the RT performance.
The usefulness of the double ALU depends on the game you play. It shows a nice uplift in some of them, with the 7800 XT matching or exceeding the 6900 XT, while it does absolutely nothing in others.
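
A back-of-the-envelope model makes that game-to-game variance easy to see. This is a toy sketch, not how RDNA 3's compiler or scheduler actually works; the `paired_fraction` value simply stands in for how much of a given game's shader code ends up dual-issued:

```python
def effective_fp32_tflops(base_tflops: float, paired_fraction: float) -> float:
    """Toy model: dual-issued instructions count double, everything else runs at the base rate."""
    return base_tflops * (1 + paired_fraction)

base = 20.0  # illustrative single-issue FP32 throughput in TFLOPS
print(effective_fp32_tflops(base, 0.40))  # shader mix that pairs well: ~28 TFLOPS effective
print(effective_fp32_tflops(base, 0.05))  # shader mix that barely pairs: ~21 TFLOPS, little gain
```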

RT in RDNA 3 is still quite meh in my opinion. I hope rumours about doubling the RT cores per shader in RDNA 4 are true, and we'll see some actual uplift that's worth talking about.
ARFThe best comparison is RX 6650 XT -> RX 7600

Performance is equal.

www.techpowerup.com/gpu-specs/radeon-rx-6650-xt.c3898
www.techpowerup.com/gpu-specs/radeon-rx-7600.c4153
Yes, that's a pretty poor upgrade from AMD. With that mindset, though, you can pick good and bad examples from practically everywhere.
#52
ARF
AusWolfThe usefulness of the double ALU depends on the game you play. It shows a nice uplift in some of them, with the 7800 XT matching or exceeding the 6900 XT, while it does absolutely nothing in others.
This is quite a poor development path for AMD to take. The same applies to the Zen 5 Ryzens.
They need to develop so that the improvements show up everywhere, not only under specific loads, apps, or software support.
#53
londiste
ARFThe best comparison is RX 6650 XT -> RX 7600

Performance is equal.

You are right, that is an even better comparison. But I really do not understand what AMD did there with these two cards. It really is like nothing changed. Same spec lines down to the anemic PCIe link, even same frequencies in practice, same performance, same RT performance. RX7600 is supposedly on RDNA3 with at least the dual ALU and improved RT things but it simply does not show anywhere. Wow.
#54
Ruru
S.T.A.R.S.
ARFWhat was AMD's outlook last time, when they tried this with the RX 580 and RX 5700 XT?

Maybe there is a major problem in AMD's management, and they have to sit down with the Board of Directors and decide what actions are necessary to fix the abnormally poor execution of the graphics division.

The 6800 XT, which is faster in some games, has those 128 MB.
At least Polaris cards were hella popular even though they were mid-range cards.

Though it's still weird if they're not gonna compete in the high end; the 6800/6900 and 7900 series have been great.
#55
AusWolf
londisteYou are right, that is an even better comparison. But I really do not understand what AMD did there with these two cards. It really is like nothing changed. Same spec lines down to the anemic PCIe link, even same frequencies in practice, same performance, same RT performance. RX7600 is supposedly on RDNA3 with at least the dual ALU and improved RT things but it simply does not show anywhere. Wow.
It shows in price at least... Oh wait... :wtf:

Personally, I just call RDNA 3 an RDNA 2 refresh (just like Ada is an Ampere refresh).
#56
_roman_
ARFMindfactory is not a "single store". It is the largest retailer in Europe, and sells tons and tons of inventory. Be respectful.
Sells only to Germany. There are tricks to circumvent this, but still. You have been warned: do not read the strike-through. Bad, bad word -> Mindfactory = racists.

Alternate pulls the same joke with alternate.at, which has different prices from alternate.de. Austria and Germany share a common border in Central Europe, yet the inventories differ and the .de shop refuses to sell to .at.
Proshop also distinguishes between countries.
nbb.com

I would have preferred if you had written that Mindfactory is a retailer in GERMANY, not Europe!
There are more countries in Europe than just Germany.
#57
las
I feel like RDNA4 is pointless; AMD called it a bugfix after all and will target the mid-range only. Let's hope the pricing is very aggressive, but I guess not, considering AMD wants to chase AI and enterprise market share instead (as they should).

RDNA5 is going to be the next somewhat exciting release, but it's not even close: late 2025 or even 2026.

AMD probably spends 2% of its R&D funds on consumer GPUs, tops; they are getting worse and worse, sadly. MCM pretty much failed here. RDNA4 looks to be monolithic.

Ray tracing should not be a focus for AMD. Improving FSR and Frame Gen should be the prime focus; it's the reason Nvidia cards are selling like hotcakes. DLDSR, DLSS, DLAA and Frame Gen are the true magic of RTX, not ray tracing or path tracing, and no, AMD is not even close. I've tried tons of AMD cards, including several SKUs in both the 6000 and 7000 series, and had a 6800XT as my primary card before I got a 4090.

AMD needs to release some market share-grabbing cards, meaning top performance per dollar with FSR and Frame Gen close to or matching Nvidia's solutions. AMD lags behind on all features today. Anyone who denies this has not tried both recently.

www.tomshardware.com/pc-components/gpus/nvidias-grasp-of-desktop-gpu-market-balloons-to-88-amd-has-just-12-intel-negligible-says-jpr

Remove iGPUs and AMD is already below 10% dGPU market share; Nvidia dominates without even trying (its focus is AI).

I guess AMD and Intel can fight for the low-end dGPU market, maybe the mid-range, but the high end was lost long ago.
#58
Tomorrow
lasI feel like RDNA4 is pointless
I would not call a card that brings 7900XT+ performance down from $700+ ($900 initially) to <$500 pointless. That's exactly what most people have been asking for. Only enthusiasts want a "4090 killer" at any price or power consumption, but very few people buy such expensive cards - even among Nvidia's users.
Having a halo card certainly has its benefits in terms of marketing and the overall image of a series, but financially it's pretty expensive, and getting more expensive every year.
laslets hope price is very aggressive
I hope so too, but looking at AMD's recent pricing, they always seem to shoot themselves in the foot with bad launch prices and thus negative reviews, only for the price to fall immediately after launch.
lasAMD probably spends 2% of R&D funds on consumer GPUs tops, they are getting worse and worse, sadly.
How are they getting worse and worse?
lasMCM pretty much failed here.
At least they tried. I would not say they failed. They fell short of their own expectations and of those who thought this was going to be a "4090 killer", but they're still competitive in most aspects from the 4080S down.
lasRay Tracing should not be a focus from AMD. Improving FSR and Frame Gen should be a prime focus, its the reason Nvidia cards are selling like hotcakes. DLDSR, DLSS, DLAA, Frame Gen is the true magic of RTX, not Ray Tracing or Path Tracing,
You say that as if the two are mutually exclusive. They are not. Why can't they improve both RT in hardware and FSR/FG in software at the same time?
It's not like one takes resources away from the other. Engineers who design RT units in hardware are not coding FSR/FG the next day, and vice versa.

I would also say FSR FG was pretty good right out of the gate (despite being late), with wider compatibility - even with Nvidia's own 20 and 30 series cards, which were deprived of a feature that clearly could have worked on them (despite what Nvidia said). Even reviewers critical of FSR's upscaling portion praised FSR FG as nearly indistinguishable from DLSS FG.
lasAMD needs to release some marketshare-grapping cards, meaning top performance per dollar with FSR and Frame Gen close to or matching Nvidia's solutions. AMD lacks behind on all features today. Anyone who denies this fact, have not tried both recently.
Nvidia lagged behind in terms of its driver control panel for a long time. AMD's was unified and modern, whereas Nvidia's was fragmented and disjointed.
Only last year did they start developing the Nvidia App, which, while still in beta, has shown great progress towards unification.
You also say performance per dollar but then lambast AMD for not releasing a halo card - but a halo card is almost never top performance per dollar.
lasRemove iGPUs and AMD is already below 10% dGPU marketshare, Nvidia dominates without even trying (AI focus)
Nvidia also dominates largely because of "old fat", i.e. older cards like the 30 series, not their latest and greatest.
#59
las
TomorrowI would not call a card that brings 7900XT+ performance from 700+ (900 initially) to <500 as pointless. That's exactly what most people have been asking for. Only enthusiasts want a "4090 killer" at any price or power consumption. But very few people by such expensive cards - even among Nvidia's users.
Having a Halo card certainly has it's benefits in terms of marketing and overall image of a series, but financially it's pretty expensive and getting more expensive every year.

I hope so too but seeing AMD's recent pricing they always seem to be able to shoot themselves in the foot with bad pricing and thus negative reviews only for the price to fall immideatly after launch.

How are they getting worse and worse?

At least they tried. I would not say they failed. They fell short of their own expectations and of those who though this was going to be a "4090 killer" but they're still competitive in most aspects from 4080S and down.

You say as if the two are mutually exclusive. They are not. Why cant they improve both RT in hardware and FSR/FG in software at the same time?
It's not like one is taking resources away from each other. Engineers who design RT units in hardware are not coding FSR/FG the next day and vice-versa.

I would also say FSR FG was pretty good right out of the gate (despite being late) with wider compatibility. Even with Nvidia's own 20 and 30 series cards that were deprived of a feature that clearly could have worked on those cards (despite what Nvidia said). Even reviewers critical of FSR upscaling portion praised FSR FG as nearly indistinguishable from DLSS FG.

Nvidia lacked behind in terms of driver control panel for a long time. AMD's was unified and modern where as Nvidia's was fragmented and disjointed.
Only last year they started developing Nvidia App that while still in Beta has shown great progress towards unification.
You also say performance per dollar but then lambast AMD for not releasing a halo card - but a halo card is almost never top performance per dollar.

Nvidia dominates also largely because of "old fat" ie older cards like 30 series. Not their latest and greatest.
You won't get 7900XT performance for 499 LMAO, just wait and see.

Clearly you are out of touch with the actual market. I do B2B sales for a living; Nvidia completely crushes AMD in terms of GPU sales - gaming, AI, enterprise, doesn't matter, Nvidia is the king.

Techpowerup has 50+ FSR vs DLSS/DLAA tests and Nvidia wins every time. They also have superior Frame Gen, without artifacts and ghosting.

Nvidia has superior drivers, by far. Nvidia runs flawlessly no matter which game you open: early access, betas, emulation, Nvidia does it all without a problem. AMD has wonky drivers, and I know this for sure since I am coming from a 6800XT and have built 100+ mid to high-end rigs in the last 5 years, minimum. 9 out of 10 people want Nvidia; that's the hard reality for you.

AMD GPUs have gotten worse and worse over the last few generations; their focus has shifted away from dGPUs, which shows.

Nvidia dominates because 9 out of 10 want Nvidia, it's as simple as that. Many tried AMD at some point but came rushing back to Nvidia.

AMD is cheaper for a reason. If they were actually good, they would gain market share, not lose it year after year. They have improved nothing in the last many generations. Rushed features that are cheap knockoffs of Nvidia's tech are what they do.

DLDSR beats VSR
DLSS/DLAA beats FSR
Nvidia Frame Gen beats AMD Frame Gen.
Reflex beats Anti Lag+ (and AL+ got people steam banned haha)
Nvidia have longer support, even GTX 600 series from 15 years ago still get drivers, meanwhile AMD pulled Polaris and Vega support
Nvidia cards can use RT and even Path Tracing
ShadowPlay beats ReLive

Every single feature, Nvidia invented it and AMD tried to copy it, but failed.

Also, AMD uses more power and has lower resale value; you save nothing by going with an AMD GPU in the end.

That's why AMD GPUs are cheaper, and still don't sell.
#60
Tomorrow
lasYou won't get 7900XT performance for 499 LMAO, just wait and see.
Exhibit A: Fury to RX 480. From 549 to 229 at 92% of the performance. Plus double the VRAM despite most other specs being downgraded.
Exhibit B: Vega 56 to 5700 XT. From 399 to 399 at 121% of the performance. Most specs were downgrades, but performance actually increased.
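
Worked out as performance per dollar, using the launch prices and relative performance quoted above (a rough sketch; it ignores inflation, VRAM differences, and feature changes):

```python
def perf_per_dollar_gain(old_price: float, new_price: float, relative_perf: float) -> float:
    """Ratio of the new card's perf/$ to the old card's; relative_perf is new perf vs old (1.0 = equal)."""
    return (relative_perf / new_price) / (1.0 / old_price)

print(perf_per_dollar_gain(549, 229, 0.92))  # Fury -> RX 480: ~2.2x the performance per dollar
print(perf_per_dollar_gain(399, 399, 1.21))  # Vega 56 -> 5700 XT: ~1.21x
```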

Both are cases where AMD released mid-range cards after its high-end cards failed. Fury failed against the Maxwell-based 900 series and Vega failed against the Pascal-based 10 series.

And history is about to repeat for the third time. But sure, believe what you want to believe, in the hope that there's no way history would repeat itself so soon, or ever.
lasClearly you are out of touch with the actual market. I do b2b sales for a living, Nvidia completely crushes AMD in terms of GPU sales, gaming, AI, enterprise, don't matter, Nvidia is the king.
And is this crushing based exclusively on 4090 sales?
I was not arguing that people don't want, or don't buy, Nvidia.
I was arguing that most people want cheaper cards with better performance, not faster cards at even higher prices.
lasTechpowerup has like 50+ FSR vs DLSS/DLAA tests and Nvidia wins every time. They also have superior Frame Gen without artifacts and ghosting.
I was not talking about upscaling. I was talking about frame generation.
From TPU's own conclusion on FSR FG vs DLSS FG:
"The image quality of FSR 3.1 Frame Generation is excellent. In Horizon Forbidden West, when using DLSS as the base image for both Frame Generation solutions, we didn't see any major differences in image quality between AMD's and NVIDIA's Frame Generation solutions, which is a very good thing. The only exception is a slightly softer overall image in motion with FSR 3.1 Frame Generation, specifically at 1080p resolution."
lasNvidia have superior drivers, by far. Nvidia runs flawlessly no matter which game you open. Early access, Betas, Emulation, Nvidia does it all without a problem.
Spoken like a fanboy. Not a single card, no matter how "superior" its drivers are, runs "flawlessly" in every game.
Just open the Nvidia forums and you'll see plenty of people with driver problems. It's true that Nvidia has fewer issues than AMD, or especially Intel, but I never claimed otherwise.

Nvidia lists known issues in their driver releases every time, and often they stay there for months on end before finally (I presume) getting fixed.
Nvidia historically has also had worse drivers on Linux. You know, the OS most of the world uses (in enterprise, embedded, smartphones, etc.)?
Only recently have they started to improve their Linux drivers by opening up more previously closed source code.
lasAMD has wonky drivers and I know this for sure since I am coming from a 6800XT and built like 100+ mid to high-end rigs in the last 5 years, minimum. 9 out of 10 people want Nvidia, thats the hard reality for you.
I too have AMD boxes in addition to Nvidia, and I've yet to see these "wonky drivers" you speak of. Granted, I only use WHQL versions.
I have friends who have AMD cards and they don't complain to me about "wonky drivers".
If you search the internet, there are plenty of driver problems with every product, no matter the manufacturer.
lasAMD GPUs gotten worse and worse in the last few generations, their focus shifted away from dGPUs, which shows.
Again I ask: how? You talk about drivers - I assume you mean those? Or is it features?
lasAMD is cheaper for a reason. If they were actually good, they would gain marketshare, not lose it, year after year. They have improved nothing in the last many generations. Rushed features that are cheap knockoffs of Nvidia's tech is what they do.
Again, spoken like a fanboy failing to see any progress from "cheaper" competitors who no doubt are worse and keep getting worse every year. Keep that positive outlook going, buddy...
lasDLDSR beats VSR
DLSS/DLAA beats FSR
Nvidia Frame Gen beats AMD Frame Gen.
Reflex beats Anti Lag+ (and AL+ got people steam banned haha)
Nvidia have longer support, even GTX 600 series from 15 years ago still get drivers, meanwhile AMD pulled Polaris and Vega support
Nvidia cards can use RT and even Path Tracing
ShadowPlay beats ReLive
Have I said they don't?

Most of those features are also exclusive to Nvidia's own cards or even their latest series, screwing over their previous series customers.

Longer support? When we look at the latest drivers, quarterly driver releases for Vega are not "pulling support". This is a myth that started to spread and keeps spreading. People who keep repeating this lie never actually bother to visit AMD's site and check for themselves, because that would be too hard and disrupt their narrative.

Nvidia with their current drivers actively supports 900 series and newer. Released in 2014.
AMD supports 400 series and newer. Released in 2016.
The difference is between 10 vs 8 years.
So Nvidia has active support for 2 years more, not 4 years like you claim.

The Vega series has very recent drivers from March of this year, as does the 400/500 series. Only the very old R9 200/300/Fury series is on legacy drivers from a few years back.

R9 200/300/Fury:
Adrenalin 22.6.1 WHQL
Release Date: 2022-06-23

Radeon VII/Vega 56 & 64 + RX 400/500:
Adrenalin 24.3.1 WHQL
Release Date: 2024-03-20

So it seems Nvidia supports their oldest series for up to ten years. Meaning 900 series support will likely be dropped next year. AMD seems to support their older series for 7-8 years.

Next time educate yourself, instead of spouting random nonsense you might have read or heard on the internet.

Hilarious that you say Nvidia can use RT, and AMD can't?
PT is a total non-issue (how many games actually use it?), as even the 4090 struggles with it and needs every performance-enhancing toggle enabled to get playable framerates. People who buy a $1700+ card to play at 60 fps with upscaling and FG enabled in a handful of games are idiots.
PT is essentially a tech demo of what will one day be possible. Today it's a tech demo.
lasEvery single feature, Nvidia invented and AMD tried to copy it, but failed.
AMD seems to be focusing more on hardware, not software features.

Who came up with MCM GPUs first? Nvidia has not even tried to copy it yet. Arguably they don't need to, but one day they will have to by necessity, as a huge monolithic chip on ever more expensive wafers is a big loss if it has any defects. They already do it to some degree with Blackwell, where two big dies are joined by a high-speed interconnect, not too dissimilar to AMD's Infinity Fabric. It's only a matter of time before all three manufacturers move to MCM GPUs, at least for high-end cards.

Who introduced ReBAR first and who copied it?
Historically, AMD has also been the first to use a new generation of VRAM. They did it with GDDR4, and with HBM and HBM2, etc.
AMD cards are also more forward-looking (in terms of hardware), with more VRAM out of the box, newer display outputs, hardware scheduling, async compute, etc.
lasAlso AMD uses more power and has lower resell value, you save nothing by going AMD GPU in the end.
Is this the old "AMD is hot and loud" argument again? I thought this had died in the R9 300 era but apparently not.
I see AMD cards reselling for quite some money. If you were right, I should be able to pick up high-end cards for pennies.
#61
Dr. Dro
Solaris17Does this mean they aren't coming out with an 8th-gen counterpart to the 7900XTX? Or simply that they don't plan on focusing on increased performance as much?
The former; they are targeting RTX 4080 performance at the power footprint of a 7800 XT or 4070 Ti.

I think they'll be great cards if they can pull it off and if the price is right, but the high end will go uncontested.
#62
sepheronx
I want a card I can replace my RTX 3080 with: one with less heat, less power draw, and better RT, while being great to run on Linux.

So get cracking at it, AMD.
#63
GoldenX
mkppoRDNA3 is no jump over RDNA2? A quick look at the 7900GRE review shows the 7900xtx being 43% faster than the 6900xt at 2560x1440. Hell, I was super tempted to switch over from a 3090 because it's just that much faster but the lack of side ported waterblocks was the only deterrent.

I also tend to play some Warzone nowadays and for whatever reason RDNA3 is stupid fast in that game, faster even than the 4090 at most resolutions.

I would say being 43% faster is a plus..

I think they will at least match the 7900xtx but most seem to think otherwise. I guess we'll see soon enough
Check lower end product numbers. The 7600 manages to be slower than the 6600 XT.
Marcus LBut hey, go and buy a 4070 which would have been a **60-class GPU, for $600 instead of the $300 it would have been before :kookoo: Or, if you want the best in class, shell out $2K for a 4090. Who the fuck wants to spend used-car money on a fricken GPU that will become obsolete in 2 years, and get bummed for another $2K when the 5090 comes out? This is not the majority of buyers, and Nvidia is taking the piss. Now more people are waiting multiple gens to upgrade, because $500-$600 for shit mid-gen GPUs is not realistic in the real world. I have an RX 6800 I bought for £350 and nothing new comes close to it in terms of performance/£ even after 2 years of newer GPUs. I am going to hodl this until at least the Radeon 10 series or Nvidia 70**; they can charge their overpriced BS money for the same performance class all they want, both of them, I won't be spending a dime on either
Yet you didn't upgrade to RDNA3.

NVIDIA is expensive, everyone knows that; the point is that AMD copying those prices without offering anything in return made RDNA3 a terrible release.
#64
Marcus L
GoldenXCheck lower end product numbers. The 7600 manages to be slower than the 6600 XT.


Yet you didn't upgrade to RDNA3.

NVIDIA is expensive, everyone knows that, the point is that AMD copying prices without offering anything in return made RDNA3 s terrible release.
If it's good enough for the goose, it's good enough for the gander. Why the fuck should AMD be expected to cut their prices because Nvidia is "better"? They will be priced accordingly; so if a 7900 XT beats a 4080 in raster, it will be priced accordingly. Where has the $200-$300 market gone? The one that has been the go-to for gamers for years? Now all of a sudden we're expected to pay $600 for this class of card? It's ridiculous. I have never paid more than $400 for a GPU and never will. Looks like I'm relegated to waiting 3-4 gens for a mid-class GPU that was supposed to be $300 but was inflated to $600, and to buying used last-gen parts. The bubble's gonna burst, and I will be waiting with bated breath when it does. Fuck their AI and enterprise.
#65
GoldenX
Marcus Lif it's good enough for the goose it's good enough for the gander, why the fuck should AMD be expected to cut their prices because Nvidia is "better" they will be priced acordingly, so if a 7900 XT beats a 4080 in raster then it will be priced accordingly, where has the $200-$300 market gone? the one that has been the go-to for gamers for years, now all of a sudden we're expected to pay $600 for these class of cards? it's ridiculous, I have never paid more than $400 for a GPU and never will, looks like I'm relegated to waiting 3-4 gens for a mid class GPU that was supposed to be $300 but was inflated to $600 and buying used last gen parts, bubble's gonna burst, I will be waiting with baited breath when it does, fuck their AI and enterprise
Because it's 2024, not 2008; raster is no longer the sole metric for measuring a GPU.

Leaving the RT discussion aside, AMD is at a disadvantage in encoding and decoding; in compute software quality, stability, and the hardware support for it; in a tensor-core equivalent and the software that takes advantage of it; and in system stability, particularly on Linux, given the lack of hardware support for GPU resets; etc.

It's a product that can only game well; it's lower quality at anything else, and that merits a lower price. AMD themselves know this, and this is how they intend to tackle the problem.
#66
Minus Infinity
TheinsanegamerNIt sure sounds to me like AMD is gonna stick to 7800xt performance and leave high end buyers out to dry.

Unfortunate but perhaps expected by this point.
Huh? All leaks indicate the 8800XT will offer 7900XT/XTX levels of raster and be much faster in RT. It will be much stronger than the rubbish 7700XT, erm, 7800XT, at lower power.
ARFThe best comparison is RX 6650 XT -> RX 7600

Performance is equal.

www.techpowerup.com/gpu-specs/radeon-rx-6650-xt.c3898
www.techpowerup.com/gpu-specs/radeon-rx-7600.c4153
Good to see AMD adding new meaning to the word "progress". See, if they had called this the 7500XT and lowered the price by $30, it would have been much better received. The 7600XT is even more disappointing.
#67
Dr. Dro
Minus InfinityHUh, all leaks indicate 8800XT will be 7900XT/XTX levels of raster and much faster in RT. It will be much stronger than rubbish 7700XT erm 7800XT at lower power.

Good to see AMD really adding new meaning to the word "progress". See if they had of called this 7500XT, lowered price $30 it would have been much better received. 7600XT is even more disappointing.
It's worth noting that the RX 7600/7600 XT (Navi 33) doesn't actually support some of the backend features, nor does it contain some of the chip-level architectural improvements of the Navi 31 and 32 silicon, which makes these cards utterly redundant in the face of their RDNA 2-based predecessors. Progress indeed.

For all of Nvidia's faults... it could be a lot worse. Their competition is completely misguided as usual, and Arc isn't quite there yet. Things will heat up once BMG arrives, but Alchemist is a done deal at this point.
#68
AusWolf
Marcus Lif it's good enough for the goose it's good enough for the gander, why the fuck should AMD be expected to cut their prices because Nvidia is "better" they will be priced acordingly, so if a 7900 XT beats a 4080 in raster then it will be priced accordingly, where has the $200-$300 market gone? the one that has been the go-to for gamers for years, now all of a sudden we're expected to pay $600 for these class of cards? it's ridiculous, I have never paid more than $400 for a GPU and never will, looks like I'm relegated to waiting 3-4 gens for a mid class GPU that was supposed to be $300 but was inflated to $600 and buying used last gen parts, bubble's gonna burst, I will be waiting with baited breath when it does, fuck their AI and enterprise
Same here. The 7800 XT was the most expensive GPU I've ever bought (although I sold it quickly to a friend). Everybody keeps bringing up "the market" and "current trends" as reasons why I should spend more, but honestly, I don't care. My salary doesn't reflect "the market" in any way, so if £300 was good enough for a GPU ten years ago, then £4-500 should be more than enough for one now. AMD not targeting higher segments with RDNA 4 doesn't affect me in the slightest. If they can improve on RT and the efficiency, and bring video playback power consumption down to acceptable levels, they'll have a buyer.
#69
mkppo
GoldenXCheck lower end product numbers. The 7600 manages to be slower than the 6600 XT.
That's one model, but you can't take that and say RDNA3 offers nothing over RDNA2. The performance numbers say otherwise.

Boring release, yada yada, that's fine. But it offers a substantial uplift at the top end.
#70
las
TomorrowExhibit A: Fury to RX 480. From 549 to 229 at 92% of the performance. Plus double the VRAM despite most other specs being downgraded.
Exhibit B: Vega 56 to 5700 XT. From 399 to 399 at 121% of the performance. Most specs were downgrades, but performance actually increased.

Both cases where AMD released a mid-range cards after high-end cards failed. Fury failed against Maxwell based 900 series and Vega failed against Pascal based 10 series.

And the history is about to repeat the third time. But sure. You believe what you want to believe in the hopes that no way history would repeat itself so soon, or ever.

And is this crushing based on exclusively 4090 sales?
I was not arguing that people dont want, or dont buy Nvidia.
I was arguing that most people want cheaper cards with better performance, not faster cards at even higher prices.

I was not talking about upscaling. I was talking about Frame generation.
From TPU's on conclusion of FSR FG vs DLSS FG:


Spoken like a fanboy. Not a single card, no matter how "superior" it's drivers are runs "flawlessly" in every game.
Just open Nvidia forums and you'll see plenty of people with driver problems. It's true that Nvidia has less issues than AMD or especially Intel but i never claimed otherwise.

Nvidia lists known issues in their driver releases every time and often they stay there for months on end before finally (i presume) getting fixed.
Nvidia historically also has had worse drivers in Linux. You know the OS most of the world uses? (in enterprise, embedded, smartphones etc).
Only recently have they been starting to improve their Linux drivers by opening up more previously closed source code.

I too have AMD boxes in addition to Nvidia and i've yet to see these "wonky drivers" you speak of. Granted i only use WHQL versions.
I have friends who have AMD cards and they dont complain to me about "wonky drivers".
If you search the internet then there are plenty drivers problems with every product, no matter the manufacturer.

Again i ask how? You speak about drivers. I assume you mean that? Or is it features?

Again spoken like a fanboy failing to see any progress from "cheaper" competitors who no doubt are worse and keep getting worse every year. Keep this positive outlook going buddy...

Have i said they dont?

Most of those features are also exclusive to Nvidia's own cards or even their latest series, screwing over their previous series customers.

Longer support? When we look at latest drivers then quarterly driver releases for Vega is not "pulling support". This is a myth that started to spread and keeps spreading. People that keep repeating this lie never actually bother to visit AMD's site and check for themselves because that would be too hard and disrupt their narrative.

Nvidia with their current drivers actively supports 900 series and newer. Released in 2014.
AMD supports 400 series and newer. Released in 2016.
The difference is between 10 vs 8 years.
So Nvidia has active support for 2 years more, not 4 years like you claim.

Vega series has very recent drivers from March of this year as does 400/500 series. Only the very old R9 200/300/Fury series are using legacy drivers from a few years back.

R9 200/300/Fury:
Adrenalin 22.6.1 WHQL
Release Date: 2022-06-23

Radeon VII/Vega 56 & 64 + RX 400/500:
Adrenalin 24.3.1 WHQL
Release Date: 2024-03-20

So it seems Nvidia supports their oldest series for up to ten years. Meaning 900 series support will likely be dropped next year. AMD seems to support their older series for 7-8 years.

Next time educate yourself, instead of spouting random nonsense you might have read or heard on the internet.

Hilarious that you say Nvidia can use RT? And AMD cant?
PT is a total non-issue (how many games actually use it?) as even 4090 struggles with it and needs every performance enhancing toggle enabled to get playable framerates. People who buy a 1700+ card to play at 60fps with upscaling and FG enabled in a handful of games are idiots.
PT is essentially a tech demo of what will one day be possible. Today it's a tech demo.

AMD seems to be focusing more on hardware, not software features.

Who came up with MCM GPU's first? Nvidia has not even tried to copy it yet. Arguably they dont need to but one day they will have to by necessity as making huge monolithic chips on ever more expensive wafers is a big loss if it has any defects. They already do it to some degree with Blackwell where two big dies are joined together by a high speed interconnect. Not too dissimilar to AMD Infinity Fabric. It's only a matter of time before all three manufacturers move to MCM GPU's. At least for high-end cards.

Who introduced ReBAR first and who copied it?
Historically AMD has also been the first the use a new generation of VRAM. They did it with GDDR4, they did it HBM and HBM2 etc.
AMD cards are also more forward looking (in terms of hardware) with more VRAM out of the box, newer display outputs, hardware scheduling, async compute etc.

Is this the old "AMD is hot and loud" argument again? I thought this had died in the R9 300 era but apparently not.
I see AMD cards reselling for quite some money. If you would be right i should be able to pick up high-end cards for pennies.
Who came up with MCM GPUs, and failed miserably? Yeah, AMD. Going MCM and STILL losing in performance per watt and scalability was an utter fail.
Nvidia beats AMD with ease using monolithic dies; no need to go MCM.

Yeah, AMD used HBM first and failed big time as well: 4GB on the Fury series, DoA before they even launched, and the 980 Ti absolutely wrecked the Fury X. Especially with OC - the 980 Ti gained massive performance there while the Fury X barely gained 1% and its power usage exploded. The worst GPU release ever. Lisa Su even called the Fury X an overclocker's dream, which has to be the biggest joke ever. I still laugh hard when I watch the video.

AMD seems to be focusing on CPUs, like they should. They are a CPU company first. They barely make a dime on consumer GPUs and target AI and enterprise now, yet Nvidia is king of AI. AMD wants a piece of the pie here; they don't care about gaming GPUs, which shows. They are already below 10% dGPU market share and their offerings are meh.

RDNA4 will be a joke, just wait and see. AMD spent no money developing it; it's merely an RDNA3 bugfix with improved ray tracing, which is pointless since AMD can't do ray tracing, and FSR/Frame Gen won't help them here either, because it's mediocre as well.

AMD thinks a 110°C hotspot temp is acceptable, so yeah, AMD runs hotter and also uses more power. Low demand means low resale value. You save nothing buying an AMD GPU in the end.

You are the fanboy here, obviously. Everything I state is fact. AMD's features are mediocre, AMD's drivers are wonky, game support is meh. AMD spends most of its time improving performance in the games that get benchmarked, so they look decent in reviews; that's why most early access games, betas, and less popular games in general tend to run like crap on AMD GPUs. Zero focus from AMD, zero focus from devs, because 9 out of 10 use Nvidia.

I use an AMD CPU. Why? Because they make good CPUs. I don't use an AMD GPU. Why? Because their GPUs are crap - worse than ever, pretty much. I miss ATi.
#71
GoldenX
mkppoThats one model but you can't take that and say RDNA3 offers nothing over RDNA2. The performance numbers say otherwise.

Boring release yada yada that's fine. But it offers a substantial uplift at the top end
That's the minimum it has to do, and if it fails to do it across the whole stack, it's a ridiculous release. The top end is dominated by better products from the competition any way you look at it, and the low end is a sidegrade or outright downgrade. Great job! You just failed the top-end users who aren't married to the brand, and the value and low-budget chasers who are the bulk of your sales.

Gotta love forgetting about the 7600 XT and 7700 XT while at it.
#72
AusWolf
lasWho came up with MCM GPUs, and failed miserably? Yeah AMD. Going MCM and STILL loosing in performance per watt and scalability was an utter fail.
Nvidia beats AMD with ease using monolithic, no need to go MCM.
If you think the MCM design is about performance, or saving power, then you're hugely mistaken. It's all about controlling defects (smaller chips have better yields) and thus, manufacturing costs. Check your CPU's idle power consumption if you don't believe me.
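
The yield point can be made concrete with the standard Poisson defect model; the die areas and defect density below are illustrative placeholders, not actual Navi figures:

```python
import math

def poisson_yield(die_area_mm2: float, defects_per_mm2: float) -> float:
    """Fraction of dies expected to be defect-free under a simple Poisson defect model."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

d0 = 0.001  # assumed defect density in defects per mm^2 (illustrative)
print(poisson_yield(520, d0))      # one large monolithic die: ~59% defect-free
print(poisson_yield(300, d0))      # smaller graphics die: ~74%
print(poisson_yield(37, d0) ** 6)  # six small cache/memory dies all good: ~80% combined
```
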
lasRDNA4 will be a joke, just wait and see. AMD spent no money developing it, its merely a RDNA3 bugfix with improved ray tracing, which is pointless since AMD can't do ray tracing and FSR/Frame Gen won't help them here either, because its mediocre as well.
If AMD is bad in RT, then should they spend nothing on improving it, and just give up altogether? I don't see your logic here.
#73
ARF
lasWho came up with MCM GPUs, and failed miserably? Yeah AMD. Going MCM and STILL loosing in performance per watt and scalability was an utter fail.
Nvidia beats AMD with ease using monolithic, no need to go MCM.

Yeah AMD used HBM first and failed big time as well. 4GB on Fury series, DoA before they even launched and 980 Ti absolutely wrecked Fury X. Especially with OC, 980 Ti gained massive performance here and Fury X barely gained 1% while watt usage exploded. The worst GPU release ever. Lisa Su even called Fury X an overclockers dream, which has to be the biggest joke ever. Still laugh hard when I watch the video.

AMD seems to be focusing on CPUs like they should. They are a CPU company first. They barely makes a dime on consumer GPUs and target AI and Enterprise now yet Nvidia is king of AI. AMD wants a piece of the pie here, they don't care about gaming GPUs. Which shows. Already below 10% dGPU marketshare and their offerings are meh.

RDNA4 will be a joke, just wait and see. AMD spent no money developing it, its merely a RDNA3 bugfix with improved ray tracing, which is pointless since AMD can't do ray tracing and FSR/Frame Gen won't help them here either, because its mediocre as well.

AMD thinks 110C hot spot temp is acceptable so yeah, AMD is hotter, also uses more power. Low demand means low resell value. You save nothing buying an AMD GPU in the end.

You are the fanboy here, obviously. Everything I state is fact. AMDs features are mediocre, AMDs drivers are wonky, game support is meh. AMD spends most of their time improving performance in games that gets benchmarked, so they look decent in reviews, thats why most early access games, betas and just lesser popular games in general, tends to run like crap on AMD GPUs. Zero focus from AMD. Zero focus from dev's because 9 out of 10 uses Nvidia.

I use AMD CPU, why? Because they make good CPUs. I don't use AMD GPU, why? Because their GPUs are crap. Worse than ever pretty much. Miss ATi.
If AMD were a normal company, Lisa Su's "head" would have rolled a long, long time ago, precisely because the GPU department is not working.
AMD must be a GPU-centric, GPU-first company in order to generate the money it should.
Stupid, stupid...
#74
las
AusWolfIf you think the MCM design is about performance, or saving power, then you're hugely mistaken. It's all about controlling defects (smaller chips have better yields) and thus, manufacturing costs. Check your CPU's idle power consumption if you don't believe me.


If AMD is bad in RT, then should they spend nothing on improving it, and just give up altogether? I don't see your logic here.
MCM is about scalability; it always has been. AMD said this officially. Usually MCM has better performance per watt; AMD's GPUs don't, but their CPUs do.

My CPU's low power consumption is mainly due to low clock speeds; the 3D cache is fragile. It has nothing to do with MCM, since it's a single CCD. I wanted the best gaming chip, and sadly for AMD, the 7800X3D beats both the 7900X3D and 7950X3D here. Dual CCD is just not very good for gaming due to latency issues, and it doesn't help that only one CCD has 3D cache either. The 7900X3D in particular is bad, since it has only 6 cores with 3D cache.
#75
Tomorrow
lasWho came up with MCM GPUs, and failed miserably? Yeah AMD. Going MCM and STILL loosing in performance per watt and scalability was an utter fail.
Nvidia beats AMD with ease using monolithic, no need to go MCM.
So coming in 18% below the 4090 with a card that costs only 60% as much is failing miserably now?
I have a feeling that even if AMD were faster and cheaper, you'd make up some crap about their "faults".
lasYeah AMD used HBM first and failed big time as well. 4GB on Fury series, DoA before they even launched and 980 Ti absolutely wrecked Fury X. Especially with OC, 980 Ti gained massive performance here and Fury X barely gained 1% while watt usage exploded. The worst GPU release ever. Lisa Su even called Fury X an overclockers dream, which has to be the biggest joke ever. Still laugh hard when I watch the video.
Yes, 4GB was too little. That being said, the 980 Ti was 6GB - not exactly earth-shattering capacity either. I guess at that point it was deemed enough.
The 900 series were good cards; they improved over the 700 series on the same node. Unfortunately, this was also the last gen to allow BIOS editing - after that, they locked it down.
lasRDNA4 will be a joke, just wait and see. AMD spent no money developing it, its merely a RDNA3 bugfix with improved ray tracing, which is pointless since AMD can't do ray tracing and FSR/Frame Gen won't help them here either, because its mediocre as well.
Oh, I will wait and see, believe me. AMD's current cards can do RT as well as a 3090 Ti, so you're effectively telling me that the 3090 Ti can't do RT.
AMD even does RT on consoles - something I thought was impossible this early in the generation on that hardware.
Like I showed earlier, their FG is pretty good; it's you who keeps denying reality. Yes, the upscaling part is not as good, but as we've established already, it does not matter how good it is. As an Nvidia fanboy you can't accept that anyone but Nvidia can be competent or make a competitive product.
lasAMD thinks 110C hot spot temp is acceptable so yeah, AMD is hotter, also uses more power. Low demand means low resell value. You save nothing buying an AMD GPU in the end.
Show me one AMD card that actually reaches it. TPU's latest review of the 7900 XTX clearly shows that most cards reach around 80°C: www.techpowerup.com/review/xfx-radeon-rx-7900-xtx-magnetic-air/37.html
All GPUs and CPUs have max temp limits near 100°C or higher, as do capacitors and VRMs - even higher. Using this as some sort of "own" against AMD shows you have zero clue what that number actually represents, and that in the real world no one actually reaches it.

The age-old "AMD is hotter/uses much more power" myth refuses to die because dimwits like you don't bother reading a couple of reviews.
4090 hotspot: ~75°C.
7900 XTX hotspot: ~80°C.
Both are well within air-cooling limits. As for power: 360W. The 4090 uses over 400W; even the 4080S uses over 300W.
Again, both are acceptable for high-end cards. It's Nvidia who has a 600W BIOS for the 4090 and was planning (and subsequently canceled) a massive cinder-block cooler for its 600W+ monstrosity. But AMD uses 360W - oh noes.
lasYou are the fanboy here, obviously.
Ah yes. The one using actual, factual sources for their arguments is the fanboy, but the one spewing nonsensical, laughable arguments is not. Sure, sure.
lasEverything I state is fact.
I have already exposed multiple of your lies in this thread. You seem to be well short of "facts" to back up your fanboyish comments here.
Just ten-year-old BS arguments about issues that have since mostly been resolved.
lasI use AMD CPU, why? Because they make good CPUs. I don't use AMD GPU, why? Because their GPUs are crap. Worse than ever pretty much. Miss ATi.
And you don't see the hypocrisy in this statement? You say AMD is hot and power hungry, that its drivers are bad, etc., and then you bring up ATI, which was way worse in those areas. Shows you have zero clue about history.
lasUsually MCM has better performance per watt
Wrong again. Idle power especially is higher on all MCM designs, due to the energy needed to move data between dies.
And as was said before, MCM is absolutely about making smaller dies and lowering defect rates.