
Radeon RX 7800 XT Based on New ASIC with Navi 31 GCD on Navi 32 Package?

My dude, I said my piece months ago when I had the card. If you only have 60Hz screens and/or a single monitor, there's nothing to note. I'm not repeating the whole writeup again. Go look back in the owners club or my project log.
If you looked as hard as you did for issues and only came up with the three I just read, I wouldn't call that a shocking amount relative to Intel and Nvidia GPUs, but that's me. Fan stop and multi-monitor idle power really seemed to bother you above all else; one of those doesn't even register on my radar most of the time. As for stuttering, I don't use your setup, you do, and some of that stuttering was down to your personal setup, i.e. cables. Glad you got it sorted by removing the card entirely, but for many other users, silent and some vocal, few of your issues applied.

And more importantly, this is made by the same people, but it isn't your card, so personally I think your issues might not apply to this rumour.
 
On par with the 4070 Ti, so 10% slower than the 7900 XT. How many cards do they plan on releasing if they stack them every 10%?!

I guess no 7800 XTX then, or it would just be hilarious :roll:

The 7800XT will more than likely tie with the 6900XT. AMD has excess volume of Navi31, so they have no choice but to sell cut-down Navi31 as the 7800XT, or even the 7700XT :roll:
 
There is no 10%; RDNA scales pretty linearly. 33% fewer units means 33% less performance, so slightly better than a 4070, but it could force Nvidia to release the 16GB variant. As for power savings, most people surely set the driver to prefer performance mode, where it's probably using the higher 3D clock, as opposed to power-saving mode, where the lower clock is applied instead.
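For what it's worth, here's the back-of-the-envelope behind that, as a minimal sketch. The 33% figure is the one from the post above; the 4070 comparison at the end is a rough assumption, not a measured result.

```python
# Back-of-the-envelope: if RDNA3 performance scales roughly linearly
# with shader count, a part with 33% fewer units than the 7900 XTX
# should land near 67% of its performance.
xtx_relative_perf = 100.0   # 7900 XTX as the 100% baseline
cut = 0.33                  # 33% fewer units (figure from the post above)

estimated = xtx_relative_perf * (1 - cut)
print(f"Estimated relative performance: {estimated:.0f}% of a 7900 XTX")
# ~67% of a 7900 XTX is in the same ballpark as a 4070 in raster,
# which matches the "slightly better than a 4070" guess.
```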
 
I won't even consider any AMD GPU until they fix the low load power draw.
 
Obviously we are exchanging opinions. And as Harry Callahan once said... :p

I don't know what a GPU needs to do to keep feeding a high-refresh-rate monitor or a dual-monitor setup. But close to 100W power consumption in video playback is unacceptable in 2023. While that's an opinion, I also believe it is logical to say it. The smartphone in your pocket or on your desk can probably play back the same video files at 5-10W. In fact, even AMD's own APUs play video while keeping power consumption for the whole SoC lower than 15W. So, something went wrong with RDNA 3 here. And even if we consider this still an opinion, then there is this chart
[chart: RX 7900 XTX video playback power consumption, from the TPU review linked below]
where a 6900XT is at half the power consumption.
AMD Radeon RX 7900 XTX Review - Disrupting the RTX 4080 - Power Consumption | TechPowerUp
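To put that complaint in rough numbers, a trivial sketch below takes the ~88W playback figure from that chart (quoted later in this thread too) and the ~15W APU SoC figure above at face value, rather than re-measuring anything.

```python
# Extra energy burned per two-hour video: the ~88 W playback draw
# quoted in this thread vs the ~15 W whole-SoC figure for an APU.
gpu_watts = 88   # RX 7900 XTX video playback, launch-review figure
apu_watts = 15   # AMD APU whole-SoC figure quoted above
hours = 2        # one movie

extra_wh = (gpu_watts - apu_watts) * hours
print(f"Extra energy per movie: {extra_wh} Wh")               # 146 Wh
print(f"Ratio: {gpu_watts / apu_watts:.1f}x the APU's draw")  # ~5.9x
```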

I own a 7900XT and I never see it using more than 30-35W during video playback, even with a ton of other things open in the background, and I use two monitors as well.
 
The 7800XT will more than likely tie with the 6900XT. AMD has excess volume of Navi31, so they have no choice but to sell cut-down Navi31 as the 7800XT, or even the 7700XT :roll:
Yep...
 
88W to watch a video is insane. :kookoo:
 
The 3GHz? It's not that. This power consumption in multi-monitor and video playback is unacceptable in 2023.
[chart: RX 7900 XTX multi-monitor and video playback power draw]
This is from TechPowerUp's original review, but I think it remains a problem even today.


Probably.

In my opinion it is. Look at the 6650XT and 7600 specs side by side and then at the lack of performance change in games; that's a problem.
Why are you referencing the original review on TPU when you can look at the most recent GPU review and see that those power consumption numbers for multi-monitor and video playback are around half of what they were at release?

There's nothing wrong with the 7600, aside from price (it should be in the $200-225 range, max). Even its naming scheme left out the "XT" to differentiate it from the previous generation. Sure, it's basically on par with the 6650XT in terms of performance, but it's also around 20% more efficient at delivering that performance.
 
Can't blame anyone who chose a 4090 over any AMD GPU: it's faster, more feature-complete and better maintained. It's simply a product that AMD currently can't hope to match. They have neither the engineering nor the software development resources required to release a product of its caliber at this time.



I thought the same until I saw the RX 7600, and nope, RDNA 3 is practically irredeemable at this point. It's just bad beyond belief if you stack it against Ada Lovelace. And from my standpoint as someone with a high-end Ampere card, it's not for me either. Its marginal gains in raster and equal RT performance, at the cost of losing the whole feature set of the RTX ecosystem and Nvidia's superior driver software, make it not even worth considering as a replacement. This generation has failed, and short of massive price cuts, I don't think any of these cards are worth buying.

Unfortunately no heads will roll. Radeon needs new blood, innovative people to develop its products and the brand. The boomers who developed these GPUs in the 2000s and linger in upper management need to go. They can't keep up with the industry anymore.

You don't think chiplet GPUs are innovative?
 
The RTX 4090 is a status symbol for someone who can't afford a Tesla or a Mercedes. If you want to spend 70 percent more money for 20 percent more performance, go brag to your mommy.

Meanwhile the rest of us are waiting for further improved performance per dollar, and AMD is always the leader there. Hopefully the 7800 XT can impress. It's almost fall 2023; time for refreshed products, never mind the first release.

I own a 7900XT and I never see it using more than 30-35W during video playback, even with a ton of other things open in the background, and I use two monitors as well.
You have to spend most of your time responding to the imaginary product in their heads, not the one you actually own.

----------

Sad to realize that the most hated GPU release from Nvidia, the 4060 Ti, is the only card with decent perf/dollar, building on the last gen but lacking VRAM. EVERY other Nvidia GPU is much worse. Oh well.
 
You don't think chiplet GPUs are innovative?

Another thing people are overlooking is that AD102 has 30% more transistors, and Navi31 dedicates more of its transistor budget to cache. GPUs scale very close to linearly; once you realize that, there is absolutely nothing impressive about it.

More transistors = more compute units = more performance. Performance per compute unit is comparable between Ada and RDNA3.
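A quick sanity check on both claims, using commonly cited die figures; the 1.3x raster gap used for the per-unit comparison is an assumed round number, not a measurement.

```python
# Transistor-count claim: AD102 (RTX 4090) vs Navi 31 (7900 XTX,
# GCD plus six MCDs combined). Commonly cited figures, approximate.
ad102_transistors = 76.3e9
navi31_transistors = 57.7e9
print(f"AD102: {ad102_transistors / navi31_transistors - 1:.0%} more transistors")

# Performance per unit, assuming the 4090 is ~1.3x a 7900 XTX in raster.
sm_4090, cu_xtx = 128, 96
perf_4090, perf_xtx = 1.30, 1.00
print(f"4090 perf/SM: {perf_4090 / sm_4090:.4f}")
print(f"XTX  perf/CU: {perf_xtx / cu_xtx:.4f}")
# The per-unit numbers land within a few percent of each other,
# which is the "scales close to linearly" point.
```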
 
The 7800XT will more than likely tie with the 6900XT. AMD has excess volume of Navi31, so they have no choice but to sell cut-down Navi31 as the 7800XT, or even the 7700XT :roll:

But will people buy more just because it's called the 7800XT? I don't think so. Unless of course they make a big price gap, but in that case no one will buy the 7900, obviously, so I think they have the same problem but now make even less money.
 
The problem isn't RDNA3, it's the competitive edge. The whole RDNA3 'stack' (three SKUs, lol) just doesn't lead and also fails to cover a relevant stack position in price/perf. Ada realistically has it beat: it's ever so slightly more power efficient, has a bigger feature set, and the raster perf/$ it sacrifices compared to RDNA3 is compensated for by DLSS3, if that's what you're willing to wait for. All aspects that command a premium. RDNA3 should have competed on price. The only advantage it has is VRAM, but RDNA2 has that too.
I would actually argue that RDNA3 was not intended to have a VRAM advantage. AMD successfully failed into that.

OK, the RTX 4090 is Nvidia's successful halo product at the top, but the RTX 4080 should not be competing with the 7900XT, much less the 7900XTX. It should be competing with the upcoming 7800XT.
76SM vs 84CU, a 256-bit vs a 320-bit memory bus, 64MB vs 80MB of LLC.
For whatever reason AMD had to move the product stack down a notch and as a result ended up with more VRAM at the same product stack/price/performance levels.

It's high. But there are other examples with strange gaps. Look at the 3090 Ti compared to a 3090: a 13W difference at the same VRAM capacity. Mkay?
Double the GDDR6X chips: 12x2GB on the 3090 Ti vs 24x1GB on the 3090. That comes with a nasty tradeoff in power consumption.
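Working the post's own numbers, the implied cost per extra memory package is easy to eyeball; this crude division ignores clock and density differences between the two boards.

```python
# 3090: 24 x 1 GB GDDR6X packages (both sides of the PCB).
# 3090 Ti: the same 24 GB from 12 x 2 GB packages.
chips_3090, chips_3090ti = 24, 12
gap_watts = 13   # the power gap quoted in the post above

extra_chips = chips_3090 - chips_3090ti
print(f"Implied draw per extra package: {gap_watts / extra_chips:.1f} W")  # ~1.1 W
```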
 
I would actually argue that RDNA3 was not intended to have a VRAM advantage. AMD successfully failed into that.

OK, the RTX 4090 is Nvidia's successful halo product at the top, but the RTX 4080 should not be competing with the 7900XT, much less the 7900XTX. It should be competing with the upcoming 7800XT.
76SM vs 84CU, a 256-bit vs a 320-bit memory bus, 64MB vs 80MB of LLC.
For whatever reason AMD had to move the product stack down a notch and as a result ended up with more VRAM at the same product stack/price/performance levels.

AMD has been using higher VRAM as a selling point for years; it was like that in the Polaris days almost a decade ago, and they've kept it up ever since. It's not a new move or a miscalculation with RDNA3.
 
You don't think chiplet GPUs are innovative?

Since they don't measure up in performance, and AMD currently offers me fewer features and worse software for my money than Nvidia, I don't care whether it's monolithic, chiplet, 3D-stacked, whatever really.

I paid about the same price an RX 6750 XT costs nowadays for my Radeon VII, which I bought brand new in box (on sale). GPU prices are high because the companies realized that they sell at those inflated prices regardless - and it's interesting to note that the VII was considered a "poor value" choice at $699 when you could have the 5700 XT with RDNA architecture (but less memory and a sliver less performance at the time - though it's faster now) for less money indeed...
 
I own a 7900XT and I never see it using more than 30-35W during video playback, even with a ton of other things open in the background, and I use two monitors as well.
That's nice.
Why are you referencing the original review on TPU when you can look at the most recent GPU review and see that those power consumption numbers for multi-monitor and video playback are around half of what they were at release?

There's nothing wrong with the 7600, aside from price (it should be in the $200-225 range, max). Even its naming scheme left out the "XT" to differentiate it from the previous generation. Sure, it's basically on par with the 6650XT in terms of performance, but it's also around 20% more efficient at delivering that performance.
Another person already pointed to the newer numbers, but you probably didn't read the later posts. That being said, the 40+W power consumption in TPU's latest review is still pretty high. Please read the rest of the posts before replying; I'm not going to repeat what is already written.

As for the 7600, you are missing the point about performance. More efficient? What do you base this on?
You have to spend most of your time responding to the imaginary product in their heads, not the one you actually own.
We are using those... imaginary numbers from TPU reviews. Maybe the reviews are flawed? What do you say?
 
AMD has been using higher VRAM as a selling point for years; it was like that in the Polaris days almost a decade ago, and they've kept it up ever since. It's not a new move or a miscalculation with RDNA3.
Yes, but they genuinely added more VRAM in RDNA2 vs Ampere. That was an interesting comparison, because both made different technical tradeoffs in search of a faster memory system.
- AMD went heavily into LLC and got the chance to cut memory bus widths as a result without significant loss in performance.
- Nvidia wanted faster VRAM and went for GDDR6X, with the primary problem that it was only available (probably against predictions) in 1GB chips. This led to VRAM sizes on Ampere being lower than on RDNA2 despite wider memory buses.

In that comparison Nvidia either failed or maybe just had less luck this time around.
This generation, RDNA3 vs Ada, Nvidia followed AMD's example of adding a large cache and cutting the memory bus width.
 
Would you call a 33% difference "marginal"?
I would. When the newer card is more expensive (as per current prices), getting +33% performance is marginal, and not a single living soul should care how small the price difference is. 33% can't justify anything.
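As a sketch of the value argument (and of the "70 percent more money" jab earlier in the thread), with street prices as rough assumptions rather than quotes:

```python
# Perf-per-dollar check on the "+33% isn't worth it" argument.
# Prices are rough street figures, assumed for illustration only.
price_xtx, price_4090 = 1000, 1600   # USD, approximate
perf_xtx, perf_4090 = 1.00, 1.33     # relative raster performance

ppd_xtx = perf_xtx / price_xtx
ppd_4090 = perf_4090 / price_4090
print(f"7900 XTX perf per $1000: {ppd_xtx * 1000:.2f}")   # 1.00
print(f"RTX 4090 perf per $1000: {ppd_4090 * 1000:.2f}")  # ~0.83
# At these prices the 4090 delivers ~17% less raster per dollar,
# the premium its buyers pay for the halo part.
```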

RDOA3 has earned its name for a solid reason. More of the same is almost the complete opposite of what AMD should have been doing if they're aiming for gains and not losses. The 7900XTX is a complete mess driver-wise and RT-wise. The 7900XT is even more of a nonsense because it shares the 7900XTX's problems and is even worse money-wise (even though you'd never have expected that to be possible). And... the 7600 is a marginally overclocked 6650XT marketed as something new. Since the whole line-up is a 10-outta-10 failure, how do you expect those 33% to justify anything? Those who already have a 3090 will at least buy a 4090 or, more likely, wait till something can really beat it effortlessly, aka 2.5x the performance. 2.5x, not 1.33x.

Those whose best card is at most a 3070/6700 XT will still upgrade (if they've decided to do it now, which is senseless, but I take it) to a 4070 Ti or a 4090, because the former has DLSS and better RT and the latter really provides massive performance gains over an old GPU.

Nothing, not even big discounts, can help RDOA3. The x700 and x800 segment is doomed because almost everyone who wanted a card of that performance has already got one. And the leather jacket boy will make even worse products in his RTX 5000 line-up: an 8-PCIe-lane 5080 and an $800 5060, just because nothing competes.
 
Yep, this is definitely an AMD-tilted GPU thread. You would think these cards are an absolute failure. There is one thing the 7900 series cards can do that no 6000 card can: both can run 4K 144Hz panels no problem. There is no game, not even TWWH3 (not the benchmark), that cannot be enjoyed at 4K with those cards. Idle monitor usage and video playback issues are so small that people forget the message before the cards were launched was to get a nice beefy PSU to avoid any issues with power, and I hope none of the people whining about power draw own an OLED panel. There is no real information about this card yet, so saying it will be X, Y or Z is just conjecture. I will say that the last time AMD showed confidence the Internet barbecued them, and some of the negative hype is directly from the propaganda campaign that is Nvidia's "advantage". This card should absolutely kick butt at 1440p anyway. Let's keep in mind that the 6700XT is not far away from a 6800 at 1440p.
 
Yep, this is definitely an AMD-tilted GPU thread. You would think these cards are an absolute failure. There is one thing the 7900 series cards can do that no 6000 card can: both can run 4K 144Hz panels no problem. There is no game, not even TWWH3 (not the benchmark), that cannot be enjoyed at 4K with those cards. Idle monitor usage and video playback issues are so small that people forget the message before the cards were launched was to get a nice beefy PSU to avoid any issues with power, and I hope none of the people whining about power draw own an OLED panel. There is no real information about this card yet, so saying it will be X, Y or Z is just conjecture. I will say that the last time AMD showed confidence the Internet barbecued them, and some of the negative hype is directly from the propaganda campaign that is Nvidia's "advantage". This card should absolutely kick butt at 1440p anyway. Let's keep in mind that the 6700XT is not far away from a 6800 at 1440p.
You don't get it. Nvidia has put in a negative effort, yet their Ada products are still better than RDOA3. That is enough to ignore every other detail.
 
If the 7900 XTX hadn't failed, it would have been matching the 4090, just as the 6900 XT once matched the 3090.
No. Simply, no. GPUs are not so complicated that you can't make a performance estimate for them.

The 6900XT and the 3090 were roughly equal (SKU positioning aside; the 6900XT itself looks like AMD's reaction to the 3090):
- 80CU vs 82SM, roughly the same number of transistors and shader units. Nvidia had a slight disadvantage from being half a node behind.
- AMD bet on LLC to make up for its 256-bit memory bus vs the 384-bit one on the 3090. A successful bet, in hindsight.

This is simply not the case for the 4090 vs the 7900XTX: 128SM vs 96CU on the same process node, the same memory bus width, and similar enough LLC.
There are definitely cases where the 7900XTX can get close, mostly when power or memory becomes the limiting factor.
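Spelled out, the unit-count arithmetic behind those two matchups looks like this:

```python
# Shader-array ratios for the two flagship matchups discussed above.
matchups = {
    "RTX 3090 vs RX 6900 XT": (82, 80),    # SMs vs CUs
    "RTX 4090 vs RX 7900 XTX": (128, 96),
}
for name, (nv_units, amd_units) in matchups.items():
    print(f"{name}: Nvidia fields {nv_units / amd_units - 1:+.1%} more units")
# 3090 vs 6900 XT: +2.5%, so rough parity was plausible.
# 4090 vs 7900 XTX: +33.3%, so parity was never on the table
# if performance scales close to linearly with unit count.
```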
 
I would. When the newer card is more expensive (as per current prices), getting +33% performance is marginal, and not a single living soul should care how small the price difference is. 33% can't justify anything.

RDOA3 has earned its name for a solid reason. More of the same is almost the complete opposite of what AMD should have been doing if they're aiming for gains and not losses. The 7900XTX is a complete mess driver-wise and RT-wise. The 7900XT is even more of a nonsense because it shares the 7900XTX's problems and is even worse money-wise (even though you'd never have expected that to be possible). And... the 7600 is a marginally overclocked 6650XT marketed as something new. Since the whole line-up is a 10-outta-10 failure, how do you expect those 33% to justify anything? Those who already have a 3090 will at least buy a 4090 or, more likely, wait till something can really beat it effortlessly, aka 2.5x the performance. 2.5x, not 1.33x.

Those whose best card is at most a 3070/6700 XT will still upgrade (if they've decided to do it now, which is senseless, but I take it) to a 4070 Ti or a 4090, because the former has DLSS and better RT and the latter really provides massive performance gains over an old GPU.

Nothing, not even big discounts, can help RDOA3. The x700 and x800 segment is doomed because almost everyone who wanted a card of that performance has already got one. And the leather jacket boy will make even worse products in his RTX 5000 line-up: an 8-PCIe-lane 5080 and an $800 5060, just because nothing competes.
Yep, that's why the 6700XT is the best-selling card on Newegg Canada. All this card has to beat is the 6800XT; it is not a 7900XT. And what driver issue are you talking about? You must mean the 3 months AMD spent making sure that console ports sing with RDNA3. I guess you would have to own one to appreciate it. Just read some of the posts in the 7000 Owners Club and you will understand. It is all about pricing.

You don't get it. Nvidia has put in a negative effort, yet their Ada products are still better than RDOA3. That is enough to ignore every other detail.
Yep, the 4070 is the same price in Canada as the 7900XT.



So which would a knowledgeable gamer buy in a world of 4K benchmarks?

Nvidia has gone all-in for greed and is paying the price. In some ways it is the same as Intel. The issue is the hubris of Nvidia fanboys who quote high power draw in a world of burning connectors and use desultory words to describe something they have no real experience with.

 
what driver issue are you talking about?
I'm talking about the aforementioned absurdly high wattage in video playback and multi-monitor usage. It doesn't matter if it's fixed; the only thing that matters is that they launched a product unable to perform efficiently. You can afford to take money from customers for a pre-alpha testing privilege when you're miles ahead of your competition, not when you're stone-age behind and not competition at all.

The 7900 XTX launched marginally cheaper than the 4080 and has nothing to brag about. +8 GB of VRAM does nothing when the 4080 can turn DLSS3 on and run away from the 7900 XTX. The "sooper dooper mega chiplet arch" also does nothing when the 4080 can hold 60ish framerates with RT on whilst the 7900 XTX goes for a shambly 30ish FPS with permanent stuttering inbound. More raw power per buck? WHO CARES?
 
Another thing people are overlooking is that AD102 has 30% more transistors, and Navi31 dedicates more of its transistor budget to cache. GPUs scale very close to linearly; once you realize that, there is absolutely nothing impressive about it.

More transistors = more compute units = more performance. Performance per compute unit is comparable between Ada and RDNA3.
In both RDNA3 and Ada, the purpose of the LLC is not to augment the compute units; it is to augment the memory controllers and increase effective bandwidth. The memory bus width is the same for the 4090 and the 7900XTX, so yes, AD102 dedicates less of its budget to LLC. On the other hand, the transistor budget that the 7900XTX dedicates to LLC is the budget that does not matter: that cache sits next to the memory controllers on the MCDs.
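To illustrate what "augmenting the memory controllers" means in numbers, here is the standard cache-bandwidth arithmetic; the 50% hit rate below is a made-up illustrative figure, not a measured one.

```python
# Every LLC hit is a request that never touches GDDR, so if a fraction
# h of traffic hits the cache, the DRAM bus only sees (1 - h) of it and
# effective bandwidth becomes raw / (1 - h).
raw_gbps = 960    # GB/s, 7900 XTX raw GDDR6 bandwidth (20 Gbps x 384-bit)
hit_rate = 0.50   # assumed LLC hit rate, purely illustrative

effective = raw_gbps / (1 - hit_rate)
print(f"Effective bandwidth at {hit_rate:.0%} hit rate: {effective:.0f} GB/s")
# With half the traffic served on-die, the bus behaves as if it were
# twice as wide, which is the tradeoff both vendors now make with big caches.
```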
 
I'm talking about the aforementioned absurdly high wattage in video playback and multi-monitor usage. It doesn't matter if it's fixed; the only thing that matters is that they launched a product unable to perform efficiently. You can afford to take money from customers for a pre-alpha testing privilege when you're miles ahead of your competition, not when you're stone-age behind and not competition at all.

The 7900 XTX launched marginally cheaper than the 4080 and has nothing to brag about. +8 GB of VRAM does nothing when the 4080 can turn DLSS3 on and run away from the 7900 XTX. The "sooper dooper mega chiplet arch" also does nothing when the 4080 can hold 60ish framerates with RT on whilst the 7900 XTX goes for a shambly 30ish FPS with permanent stuttering inbound. More raw power per buck? WHO CARES?
That's the thing with a 7900XT: you don't need any of those. Just turn the colour and contrast up, enable FreeSync and you are good. Please tell me that is not the case.
 