
AMD Radeon RX 7900 XT

You're comparing the 2nd-tier model with last generation's flagship and disappointed that it's only 22% faster?

And correspondingly more expensive. This is not how things work, or how they should work in the first place.

Because if the trend continues, RX 8900 will cost $1900 and RX 9900 will cost $2900.
:D
 
Pricing is definitely very weird on this one...

Why the insane multi-monitor and video playback power consumption?
This bug lingers from Vega times; during RDNA1 it was a very prominent issue, and sadly AMD doesn't know how to fix it. The behavior depends on the particular monitor: with some, the GPU downclocks correctly, with others it doesn't. One fix can be to use CRU, but it's not guaranteed - sometimes when you find a CRU spot where the GPU downclocks, the monitor has issues and resets. I wrote about it two years ago on their forum and there's still no fix to this day - one of many reasons I sold my 5700 XT.
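If you want to check whether your own card is affected, the memory clock behavior is easy to observe; here's a rough sketch (assuming Linux with the amdgpu driver - card0 and the pp_dpm_mclk path are assumptions and may differ on your system):

```python
import time
from pathlib import Path

# Watch the VRAM clock state via amdgpu's sysfs interface.
# Assumes the GPU is card0; adjust if you have more than one.
MCLK = Path("/sys/class/drm/card0/device/pp_dpm_mclk")

def current_mclk_state() -> str:
    # The file lists the memory clock states; the active one is marked
    # with '*', e.g. "0: 96Mhz *" when idle, "3: 2000Mhz *" when maxed.
    for line in MCLK.read_text().splitlines():
        if line.strip().endswith("*"):
            return line.strip()
    return "unknown"

# Sample for a minute while the desktop is idle; if the memory clock
# never leaves the highest state, you're seeing the multi-monitor bug.
for _ in range(12):
    print(current_mclk_state())
    time.sleep(5)
```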
 
This bug lingers from Vega times; during RDNA1 it was a very prominent issue, and sadly AMD doesn't know how to fix it.

Weird indeed. Why don't they simply ask someone in the know, maybe their fellow NVIDIA engineers? :kookoo:
 
And correspondingly more expensive. This is not how things work, or how they should work in the first place.

Because if the trend continues, RX 8900 will cost $1900 and RX 9900 will cost $2900.
:D
Pricing is definitely very weird on this one...
Honestly, both AMD and Nvidia are milking the high-end right now, but it's hurting the PC gaming market.

When you can buy a PS5 or XBSX for less than the price of a 2-year-old midrange GPU, more and more of the market is moving to consoles. Not here on TPU, the hardware enthusiast site - but absolutely 100% in the rest of the gaming industry.

At some point, if this trend of raising GPU prices far beyond console prices continues, PC gaming will simply stop existing. Half a decade ago, you could build a low-end PC that matched a console for around 50% more money than the console. Currently, a PC that can game as well as a current-gen console costs at least twice as much as that console. By the time it costs 5x more to game on PC than on console, 99% of gamers will have moved over to consoles.
 
That's exactly how I test, intentionally. The thing is that this same test works perfectly fine on NVIDIA with lower power draw, so it must be possible.
It does not "have" to run the memory at full speed, that's just the way AMD runs their card, which is suboptimal.
Ah, okay, so you intentionally use a setup that uses more power. I suspected as much, thanks for confirming.
 
Ah, okay, so you intentionally use a setup that uses more power. I suspected as much, thanks for confirming.
You make it sound like I'm doing the wrong thing. I'm intentionally not using the "easy" case. My work setup for 15 years has been one big screen + a smaller one on the side, for e-mail etc., so this is a realistic setup.
 
You make it sound like I'm doing the wrong thing. I'm intentionally not using the "easy" case. My work setup for 15 years has been one big screen + a smaller one on the side, for e-mail etc., so this is a realistic setup.
Do you like that setup? I still have one of my previous monitors (23” 1080P) I’m thinking of adding back to my desk next to my new monitor (27” 4K) purely for productivity reasons. I just feel like it would bother me having two different monitors.
 
I'd argue this might be a 7800 XT really.
 
You make it sound like I'm doing the wrong thing. I'm intentionally not using the "easy" case.
You are intentionally using a case where one card always performs better, so at the very least you should add a disclaimer that the results were achieved under such settings. Or even better, test it both ways. Because it IS possible to get drastically lower power usage by adjusting them to matching refresh rates, so you can have a scenario where two users with different multi-monitor setups will have drastically different power usages. And then TPU will get a reputation for publishing unreliable numbers, because they have the same card, same drivers, but get far better values than what you published - which means your tests are either incompetent, or purposefully skewed.

I mean, how do you think it looks when you publish 40-70 W power usage, and then someone with the same card gets 5 W?

Moreover, sometimes things like driver or Windows updates can mess with the refresh rate settings. So if you don't inform the user about this fact, it will be another thing the drivers mysteriously break or fix, when it is in fact just a single setting they need to adjust. It's no different than if a new driver enforced a different antialiasing or vsync setting that broke performance - only this one is rather more obscure.

My work setup for 15 years has been one big screen + a smaller one on the side, for e-mail etc., so this is a realistic setup.
It's not the screen size you have to match, it's the refresh rate. Also having multiple identical monitors is an equally realistic setup...
 
Do you like that setup? I still have one of my previous monitors (23” 1080P) I’m thinking of adding back to my desk next to my new monitor (27” 4K) purely for productivity reasons. I just feel like it would bother me having two different monitors.
It's the best, doesn't take up a ton of space, but multi-monitor makes a HUGE difference for productivity. I can move my review data around, Excel, reviews editor, etc etc

You are intentionally using a case where one card always performs better
Always? AMD just needs to fix their shit? How hard can it be?

Also having multiple identical monitors is an equally realistic setup...
Agreed. So you prefer I test two identical monitors, and let this one slide for AMD?
 
Always? AMD just needs to fix their shit? How hard can it be?
How many processors or device drivers have you engineered, that you can ask "how hard can it be"?

Agreed. So you prefer I test two identical monitors, and let this one slide for AMD?
No, I'm saying that you should provide numbers which give accurate results. If two different software settings give you different results, then you should test both to be fair, otherwise you risk your tests either being highly inaccurate or biased towards one manufacturer. Right now you don't list the specific refresh rates on the test setup, even though this is a factor known to affect multi-monitor power usage.

I understand that it is not possible to test every different use case, but setting your monitors to a matching refresh rate isn't very difficult at all - neither is writing a small disclaimer that "the multi-monitor test was done with monitors A and B running at resolutions X and Y, and refresh rates U and V".
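Checking whether your monitors actually run at matching refresh rates is easy to script, by the way. A minimal sketch (assuming Linux with xrandr installed - on Windows the same information comes from EnumDisplaySettings):

```python
import re
import subprocess

# List each connected output's current mode and refresh rate, so you can
# confirm they match. Output format can vary slightly between xrandr
# versions, so treat this as a starting point.
out = subprocess.run(["xrandr"], capture_output=True, text=True).stdout

rates = {}
current_output = None
for line in out.splitlines():
    if " connected" in line:
        current_output = line.split()[0]
    elif current_output and "*" in line:
        # Mode lines look like "   2560x1440  143.86*+  120.00  59.95";
        # the rate marked '*' is the one currently in use.
        mode = line.split()[0]
        match = re.search(r"(\d+\.\d+)\*", line)
        if match:
            rates[current_output] = (mode, float(match.group(1)))

print(rates)
if len({rate for _, rate in rates.values()}) > 1:
    print("Refresh rates differ - this can keep VRAM clocked at maximum.")
```

Even a small mismatch like 59.95 Hz vs 60.00 Hz can be enough to keep the memory from downclocking, which is exactly why the disclaimer matters.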
 
Well, despite my earlier reservations, I got frustrated that the XTX was already sold out at 6:02, so I ordered the XT and used the $100 left over for a 1TB Samsung 980 Pro to replace my tired 1TB SATA boot drive.
 
Also regarding this:
Always? AMD just needs to fix their shit?

A simple Google search shows that high multi-monitor power usage also happens very often on GeForce cards. There's even a specific third-party tool called "Multi Display Power Saver" (part of NVIDIA Inspector) to get around it, except it sometimes causes even more issues.
 
@W1zzard Can you post an editorial article about graphics card prices? I think the topic needs detailed research into the industry. Is it lunatic profiteering by AMD, or has this particular segment seen a very severe increase in component costs?

Current pricing in euro in Germany:

Radeon RX 6400 - 132.24
Radeon RX 6500 XT - 168.51
Radeon RX 6600 - 268.95
Radeon RX 6600 XT - 339.00
Radeon RX 6650 XT - 328.99
Radeon RX 6700 XT - 425.61
Radeon RX 6750 XT - 459.00
Radeon RX 6800 - 578.90
Radeon RX 6800 XT - 689.00
Radeon RX 6900 XT - 799.00
Radeon RX 6950 XT - 874.90
Radeon RX 7900 XT - 1049.00
Radeon RX 7900 XTX - 1199.00

It doesn't get better, despite some of these cards being very, very old now.
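To put a number on the generational jump, here's the quick arithmetic on the prices listed above (the tier-to-tier mapping is my own assumption):

```python
# Compare the German street prices (EUR) listed above, RDNA2 vs RDNA3.
prices = {
    "RX 6900 XT": 799.00,
    "RX 6950 XT": 874.90,
    "RX 7900 XT": 1049.00,
    "RX 7900 XTX": 1199.00,
}

for old, new in [("RX 6900 XT", "RX 7900 XT"), ("RX 6950 XT", "RX 7900 XTX")]:
    increase = (prices[new] / prices[old] - 1) * 100
    print(f"{old} -> {new}: +{increase:.0f}%")

# Prints roughly:
#   RX 6900 XT -> RX 7900 XT: +31%
#   RX 6950 XT -> RX 7900 XTX: +37%
```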
 
How many processors or device drivers have you engineered, that you can ask "how hard can it be"?
Drivers? About 10 in 15 years. Also various electronics projects, using MCUs, Arm, Atmel AVR, various embedded systems, portable devices. I also make software that's installed on millions of PCs.

If two different software settings give you different results, then you should test both to be fair
I understand that it is not possible to test every different use case, but setting your monitors to a matching refresh rate isn't very difficult at all - neither is writing a small disclaimer that "the multi-monitor test was done with monitors A and B running at resolutions X and Y, and refresh rates U and V".
Can't test all multi-monitor scenarios, there are literally infinite combinations

Like this?
 
Oh OK, sorry, I missed that. I was looking at the test setup page, which did not list monitors.
 
@W1zzard Can you add 8K 4320p tests for these cards?
I see that the performance drop from 1080p to 1440p to 2160p is in some cases barely visible, so why not add new, higher resolutions? :)


AMD Radeon RX 7900 XT Review - Borderlands 3 | TechPowerUp
 
@W1zzard
You probably won't have time for it, but perhaps you have already experimented with the -10% power limit setting or with undervolting? It seems to me that not a single reviewer has spent time on it (which I totally understand, by the way). It's always interesting to see how this impacts performance, temperatures and power usage for people who live in hot countries or in countries where energy is very expensive.
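For anyone who wants to try it in the meantime, lowering the power limit is a small change; here's a rough sketch (assuming Linux with the amdgpu driver and root access - card0 and the hwmon layout are assumptions; on Windows, the power limit slider in Adrenalin does the same thing):

```python
from pathlib import Path

# Drop the GPU power cap by 10% via amdgpu's hwmon interface (Linux,
# needs root). Assumes the GPU is card0; the hwmon index varies per
# system, hence the glob. All values are in microwatts.
hwmon = next(Path("/sys/class/drm/card0/device/hwmon").glob("hwmon*"))

default_cap = int((hwmon / "power1_cap_default").read_text())
new_cap = int(default_cap * 0.9)  # -10% power limit

print(f"Default: {default_cap / 1e6:.0f} W -> new: {new_cap / 1e6:.0f} W")
(hwmon / "power1_cap").write_text(str(new_cap))
```

The setting resets on reboot, so it's safe to experiment with.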
 
Send me a monitor.

I'd love to, but I think your organisation can afford to test everything on an 8K TV or monitor, which are becoming more mainstream now.

No seriously, testing another resolution just takes too long, and it's just too slow in most games. Borderlands 3 is getting kicked out next round of retesting btw

It's the same story in Hitman 3:


AMD Radeon RX 7900 XT Review - Hitman 3 | TechPowerUp

Actually, 1440p is faster than 1080p.

I think the test suite should also include something very popular like CS:GO.
 
I'd love to, but I think your organisation can afford to test everything on an 8K TV or monitor, which are becoming more mainstream now.



It's the same story in Hitman 3:

AMD Radeon RX 7900 XT Review - Hitman 3 | TechPowerUp

Actually, 1440p is faster than 1080p.

I think the test suite should also include something very popular like CS:GO.

I managed a hardware review site for a short period of time, then sold it. I think you are misjudging the huge amount of time and effort these reviews take versus how little money they bring in.
 
Hitman 3 will be replaced too next time, probably. Depends on what games come out, etc. I am definitely adding The Callisto Protocol, Mount and Blade Bannerlord, Plague Tale Requiem, Spiderman. Probably Evil West and Uncharted 4. So 4-6 titles need to go

I think the test suite should also include something very popular like CS:GO.
CS:GO will probably run completely CPU-limited? Do you know of any other sites that test CS:GO, so I can take a look at their data?
 
The XT is way too close in price to the XTX to even consider it. The XT should be more like $500.
 
This card exists for one reason and one reason only... To make the RX 7900 XTX look better.

It's so stupid too, because AMD had a real hit with the RX 6800 XT compared to the RX 6900 XT. The better value should always be with the lower-tier card, but both NVIDIA and AMD have bucked that trend. I foresee the RX 7900 XT dropping to around $600 before the end of 2023.

As things are now, nobody should buy this card. This went through my mind as soon as Lisa Su showed off the pricing at their "let's talk a lot of BS" event. It was like seeing three Jensen Huangs on stage, and all I could do was cringe.
 