Monday, January 29th 2024

Top AMD RDNA4 Part Could Offer RX 7900 XTX Performance at Half its Price and Lower Power

We've known since way back in August 2023 that AMD is rumored to be retreating from the enthusiast graphics segment with its next-generation RDNA 4 graphics architecture, which means we likely won't see successors to the RX 7900 series squaring off against the upper end of NVIDIA's fastest GeForce RTX "Blackwell" series. What we'll get instead is a product stack closely resembling that of the RDNA-based RX 5000 series, with its top part providing a highly competitive price-performance mix around the $400 mark. A more recent report by Moore's Law is Dead sheds more light on this part.

Apparently, the top Radeon RX SKU based on the next-gen RDNA4 graphics architecture will offer performance comparable to that of the current RX 7900 XTX, but at less than half its price (around the $400 mark). It is also expected to achieve this performance target using smaller, simpler silicon with significantly lower board cost, which feeds into that price. What's more, there could be energy efficiency gains from the switch to a newer 4 nm-class foundry node and from the RDNA4 architecture itself, which could hit its performance target with fewer compute units than the RX 7900 XTX's 96.
When it came out, the RX 5700 XT offered an interesting performance proposition, beating the RTX 2070 and forcing NVIDIA to refresh its product stack with the RTX 20-series SUPER and the resulting RTX 2070 SUPER. Things could go down slightly differently with RDNA4. Back in 2019, ray tracing was a novelty, and AMD could surprise NVIDIA in the performance segment even without it. There is no such advantage now; ray tracing is relevant, so AMD could count on timing its launch before the Q4-2024 debut of the RTX 50-series "Blackwell."
Sources: Moore's Law is Dead (YouTube), Tweaktown

396 Comments on Top AMD RDNA4 Part Could Offer RX 7900 XTX Performance at Half its Price and Lower Power

#351
Dr. Dro
kapone32Can you even use a 3080 for 4K gaming with new games?
You cannot use a 6800 XT either, since it performs so poorly. Despite having only 10 GB, the 3080 still outperforms it, going by W1zz's latest game performance review.

Posted on Reply
#352
ARF
Dr. DroYou cannot use a 6800 XT either
Not true. Also, the card was released 4 years ago. :D
Posted on Reply
#353
SailorMan-PT
the54thvoid@SailorMan-PT - please post in English, or your next posts will be removed. This is an English speaking forum, thank you.

Please post in English - for writing in the German language
AusWolfDoes that mean that my jaw has to drop in front of the 4090's performance with no consideration towards its price or power consumption? Sorry, not gonna happen.

Since you mentioned it, even the 7900 XTX is way above the limit of what I consider a sensible price for a GPU. That's why I'm not affected by AMD's decision of not making a halo RDNA 4 GPU. If they can pull off a decent midrange card, I'm game.
Well, after watching and studying several benchmarks, there are cards that serve the mid-range market. In terms of technical value relative to price, I would consider the AMD Radeon GPUs better, because of their advantage in larger VRAM and a wider memory bus: 16 GB + 256-bit vs. 12 GB + 192-bit.
Those are the Radeon RX 7800 XT and RX 7900 GRE. I think I don't have to mention the respective NVIDIA GPUs, since they're well known.
SailorMan-PTThose are the Radeon RX 7800 XT and RX 7900 GRE. I think I don't have to mention the respective NVIDIA GPUs, since they're well known
the54thvoid@SailorMan-PT - please post in English, or your next posts will be removed. This is an English speaking forum, thank you.

Please post in English.
Yeah, I understand, but my smartphone translated the English into German; when I typed the reply, it appeared in German. I thought I had written my reply in English. It was one big misunderstanding.
Posted on Reply
#354
ARF
AMD may lose a golden opportunity to beat Nvidia this year
A year and a half after the launch of RDNA 3, AMD’s graphics card lineup has grown a little stagnant — as has Nvidia’s. We’re all waiting for a new generation, and according to previous leaks, AMD was getting ready to release RDNA 4 later this year. Except that now, we’re hearing that it might not happen until CES 2025, which is still six months away.
Launching the new GPUs in the first quarter of 2025 is a decision that could easily backfire, and it’s never been more important for AMD to get the timing right. In fact, if AMD really decides to wait until January 2025 to unveil RDNA 4, it’ll miss out on a huge opportunity to beat Nvidia.
...
But AMD’s gaming segment appears to be bleeding money. Its gaming revenue dropped by 48% year-over-year, and even AMD itself doesn’t expect it to get better.
Jean Hu, AMD’s CFO, recently talked about how the subpar sales of its Radeon GPUs affected the gaming segment in a big way. The company predicts that the revenue in that segment will continue to decline. Where’s AMD going to make money then? It’s simple: From its data center and client segment.
www.digitaltrends.com/computing/amd-is-missing-out-on-a-golden-opportunity-with-rdna-4/
Posted on Reply
#356
TeamMe
NVIDIA ray tracing is a con; I mean, even Cyberpunk on a 4090 looks nowhere near as good as the CGI in an old movie like Lord of the Rings. AMD shouldn't bother with ray tracing; even on NVIDIA, the performance degradation is too big. AMD should keep on being a Rasta man and use HDR like The Last of Us Part 1 on PC. AMD should focus on getting higher polygon count models and more detailed textures…
Posted on Reply
#357
Dr. Dro
TeamMeNVIDIA ray tracing is a con; I mean, even Cyberpunk on a 4090 looks nowhere near as good as the CGI in an old movie like Lord of the Rings. AMD shouldn't bother with ray tracing; even on NVIDIA, the performance degradation is too big. AMD should keep on being a Rasta man and use HDR like The Last of Us Part 1 on PC. AMD should focus on getting higher polygon count models and more detailed textures…
Following this absurd logic, every single API feature developed in the past 20 years is a con, since you can get 70% of the job done with DirectX 9.0c anyway.
Posted on Reply
#358
Makaveli
Dr. DroYou cannot use a 6800 XT either, since it performs so poorly. Despite having only 10 GB, the 3080 still outperforms it, going by W1zz's latest game performance review.

That is at max settings.

If you drop them, you can use a 3080 or a 6800 XT at 4K.
Posted on Reply
#359
Dr. Dro
MakaveliThat is at max settings.

If you drop them, you can use a 3080 or a 6800 XT at 4K.
That much seems obvious; the argument there (in this necro'd thread) was that the 6800 XT is superior for 4K gaming, and it's not.
Posted on Reply
#360
TeamMe
Dr. DroFollowing this absurd logic, every single API feature developed in the past 20 years is a con, since you can get 70% of the job done with DirectX 9.0c anyway
The effect/difference ray tracing adds is subtle, whereas the performance degradation is anything but; that statement is beyond reproach…
Posted on Reply
#361
Dr. Dro
TeamMeThe effect/difference ray tracing adds is subtle, whereas the performance degradation is anything but; that statement is beyond reproach…
I didn't specifically mention RT to begin with, but I disagree. It's not something that can be truly appreciated unless you're already playing at 4K or higher, IMO. The atmospherics in Metro Exodus, for example, or more recently the full path tracing in Wukong, look absolutely phenomenal on the right setup.

At the end of the day, people make a big deal of RT because their hardware can't handle it and they cannot afford a suitable upgrade.
Posted on Reply
#362
Punkenjoy
Dr. DroI disagree. It's not something that can be truly appreciated unless you're already playing at 4K or higher, IMO. The atmospherics in Metro Exodus, for example, or more recently the full path tracing in Wukong, look absolutely phenomenal on the right setup.

At the end of the day, people make a big deal of RT because their hardware can't handle it and they cannot afford a suitable upgrade.
It's good advice to actually play something instead of just peeping at screenshots.

Because the best-looking thing we could do for a screenshot is very high-quality baked lighting that isn't dynamic at all. It doesn't matter; screenshots don't move. This is also why so many old games still look good and seem to have good lighting in screenshots: it's all static.

When you get to dynamic lighting, the non-RT techniques have many flaws. They can look somewhat good, but they easily fall apart. RT does a much better job in those kinds of scenarios. It makes the lighting much more realistic, which at the same time makes it less obvious. That's a strange thing, but it's the case. Like I said previously, it's a style you go for.

Right now, the main problem I see with path tracing and RT in general is not really the performance impact; it's the quality of the denoiser. Movies just brute-force that issue with many more rays and heavy offline denoising algorithms.

DLSS 3.5 is a bit better on that front, but it still has many flaws. Having a quality denoiser will be key to making RT look like what we see in movies.


As for the performance impact, that is just today's reality. Most of the shaders we use now would have destroyed the first few generations of GPUs that supported them. In 2-3 generations, even mid-range cards should have plenty of power to run RT at 1080p.
Posted on Reply
#363
Dristun
If MLID or RGT say something good, then at least 80% of the time it's guaranteed to be late and slow again. Only when these two declare that AMD has lost it should we all start waiting for AMD to pull out a card with 9700XT/7970 levels of greatness.
Posted on Reply
#364
Makaveli
Dr. Drothe 6800 XT is superior for 4K gaming, and it's not.
Neither of them is good for 4K without dropping settings, so you are correct.
Posted on Reply
#365
ARF
DristunIf MLID or RGT say something good, then at least 80% of the time it's guaranteed to be late and slow again. Only when these two declare that AMD has lost it should we all start waiting for AMD to pull out a card with 9700XT/7970 levels of greatness.
The current situation is so bad anyway, because the current lineup is already 2 years old and needs optimisations and updates for 2024/2025 usage.
The only thing that matters now is to release something (anything) new, no matter if it's good or bad, then adjust the pricing accordingly, so that at least some stock moves off the warehouse shelves.
Posted on Reply
#366
Chrispy_
ARFThe current situation is so bad anyway, because the current lineup is already 2 years old and needs optimisations and updates for 2024/2025 usage.
The only thing that matters now is to release something (anything) new, no matter if it's good or bad, then adjust the pricing accordingly, so that at least some stock moves off the warehouse shelves.
Why are you so bad at super-basic, incredibly easy-to-check facts, Arf?



Nothing in the current AMD 7000-series lineup is 2 years old yet; most of it isn't even 1 year old - and the AMD 7000 series is younger than the RTX 40 series and the Intel Arc series.
Posted on Reply
#367
AusWolf
Chrispy_Why are you so bad at super-basic, incredibly easy-to-check facts, Arf?

Nothing in the current AMD 7000-series lineup is 2 years old yet; most of it isn't even 1 year old - and the AMD 7000 series is younger than the RTX 40 series and the Intel Arc series.
Also, what if it's old? What matters is that it still works, right?
Posted on Reply
#368
Broken Processor
Dr. DroThat much seems obvious, the argument there (in this necro'd thread) is that the 6800 XT was superior for 4K gaming, it's not.
I own a 6800 XT with the snot overclocked out of it. It benchmarks faster than a 7900 GRE, and what you are saying is correct.
Posted on Reply
#369
Vayra86
Broken ProcessorI own a 6800 XT with the snot overclocked out of it. It benchmarks faster than a 7900 GRE, and what you are saying is correct.
Throw a UE5 game like Black Myth at a 4090 and it's not a real 4K killer either... it manages, at best ;) OTOH, throw Minesweeper at a 6800 XT and it'll do 4K120...

4K cards don't exist and never have, nor will. There is only a game and its performance. GPUs move along doing what they can, especially now with dynamic 'anything' in graphics... if you put up with high latency and lower IQ, you can have 4K on many cards...
Posted on Reply
#370
ARF
Chrispy_Nothing in the current AMD 7000-series lineup is 2 years old
20 months old, give or take 3-4 months. And the design process can be traced back at least 3 or 4 years earlier.
So, a 2018 thing. :D

Posted on Reply
#371
Broken Processor
Vayra86Throw a UE5 game like Black Myth at a 4090 and it's not a real 4K killer either... it manages, at best ;) OTOH, throw Minesweeper at a 6800 XT and it'll do 4K120...

4K cards don't exist and never have, nor will. There is only a game and its performance. GPUs move along doing what they can, especially now with dynamic 'anything' in graphics... if you put up with high latency and lower IQ, you can have 4K on many cards...
True, it's a weird situation; current hardware seems off. I'm hoping to upgrade to Blackwell, but I'd want 30% more performance than a 4090 to feel comfortable parting with the cash a 5090 will cost. The 6800 XT is a hell of a card, so much so that I kept it after going through three 7900 XTX cards that all had issues with drivers and latency. Software is too important, and my 6800 XT plus four months of hell with the 7900 XTX have shown me AMD still sucks in that department; the Adrenalin software is terrible in so many ways. I'm going back to NVIDIA; the premium is worth it IMO. But I'll be putting my 6800 XT on the wall with my other favourite cards.
Posted on Reply
#372
Chrispy_
ARF20 months old, give or take 3-4 months. And the design process can be traced back at least 3 or 4 years earlier.
So, a 2018 thing. :D
Yes, I literally posted a table with all of the launch dates for you, ranging from 6 to 20 months. 20 months is not the 'already 2 years old' that you wrote, which quite unambiguously means >24 months.

As for the design process dating back to 2018, what does that have to do with anything? Every CPU and GPU in the last 30 years has been in development for multiple years before launch. Once again, you're spouting nonsense - please stop, or at least do a basic sanity check on whether what you type is sane, relevant, or worthwhile.
Posted on Reply
#373
ARF
Chrispy_Yes, I literally posted a table with all of the launch dates for you, ranging from 6 to 20 months. 20 months is not the 'already 2 years old' that you wrote, which quite unambiguously means >24 months.

As for the design process dating back to 2018, what does that have to do with anything? Every CPU and GPU in the last 30 years has been in development for multiple years before launch. Once again, you're spouting nonsense - please stop, or at least do a basic sanity check on whether what you type is sane, relevant, or worthwhile.
The one spouting nonsense is you.
You are wrongly counting the period between a product going into the wild and the present moment, when the right way to count how old a product is is to measure from its tape-out, or the moment its feature set was set in stone. Between that moment in time and the physical release into the wild, there can be multiple other milestones: feature-set updates, etc.
AusWolfAlso, what if it's old? What matters is that it still works, right?
Wrong. AMD's market share is close to non-existent in the OEM market, which means that while the products do indeed physically work, they are market failures.

88% of all GPU shipments are NVIDIA; the rest is AMD and Intel.


www.jonpeddie.com/news/shipments-of-graphics-add-in-boards-decline-in-q1-of-24-as-the-market-experiences-a-return-to-seasonality/
Posted on Reply
#374
Chrispy_
ARFthe right way to count how old a product is is to measure from its tape-out, or the moment its feature set was set in stone.
Your insanity and delusional nonsense continue to entertain.

Tape-out and product launch are, at a bare minimum, 6 months apart, and have been 2+ years apart on several notable occasions where redesigns were required after the initial tape-out. Tape-out is one of the product development phases, but there's no way to know whether any given tape-out will be final until the silicon comes back and is tested months later. Saying that tape-out is the time at which a product is set in stone proves that you have absolutely no idea what you're talking about.

By your metric, Nvidia Blackwell GPUs, slated to launch in early 2025, are already 18 months old!

Here's another shovel if you want to keep digging...
Posted on Reply
#375
AusWolf
ARFWrong. AMD's market share is close to non-existent in the OEM market, which means that while the products do indeed physically work, they are market failures.

88% of all GPU shipments are NVIDIA; the rest is AMD and Intel.


www.jonpeddie.com/news/shipments-of-graphics-add-in-boards-decline-in-q1-of-24-as-the-market-experiences-a-return-to-seasonality/
What has the age of RDNA 3 got to do with market share?

Also, what's this big deal about market share disparity with Nvidia? Why do you think a much smaller company should match a much bigger one in market share? :kookoo:
Posted on Reply