# AMD Radeon RX 7900 XT



## W1zzard (Dec 12, 2022)

AMD Radeon RX 7900 XT is built on the same Navi 31 GPU as the XTX, but with fewer chiplets, which helps bring down the price. In our review this $900 graphics card is able to match the performance of GeForce RTX 3090 Ti that launched at $2000 just a few months ago.

*Show full review*


----------



## Vecix6 (Dec 12, 2022)

Small typo at page 32 - Relative performance:



> I also combined all per-game results at 4K into a single chart, comparing against the GeForce RTX 3090 Ti. The more natural comparison would be the 4070 Ti, but that launches only next year.



The graph says RTX 4080, so either the graph or the text has a typo.


----------



## wolf (Dec 12, 2022)

Well, the 7900XTX is 19% faster and 10% more expensive... gotta sell defective dies somehow right, may as well make the top product look like better value.


----------



## bob3002 (Dec 12, 2022)

I don't immediately understand why the Relative Performance at 4k (Page 32) shows the RTX 4080 at 114% of this. The sum of the red bars is slightly larger than the sum of the green bars on the graph at the bottom. Only one game (Civ VI) has the Nvidia card with an advantage of >14% over the 7900 XT, so surely the average can't be that skewed... right?

Unless this is some really, really, extreme example of Simpson's Paradox...

Edit: Chart title fixed, comparison object was the 3090 Ti not the 4080.


----------



## Al Chafai (Dec 12, 2022)

This is why I've been telling people to lower their expectations. If Nvidia lowers the 4080's price to $999, no one will touch the 7900 XTX.
Seems like AMD still has a lot of work to do.


----------



## erocker (Dec 12, 2022)

Ehh... I'd think $799 would be the upper limit for this card.


----------



## W1zzard (Dec 12, 2022)

Vecix6 said:


> Small typo at page 32 - Relative performance:
> 
> 
> 
> The graph says RTX 4080, so either the graph or the text has a typo.


Fixed the chart title



bob3002 said:


> I don't immediately understand why the Relative Performance at 4k (Page 32) shows the RTX 4080 at 114% of this. The sum of the red bars is slightly larger than the sum of the green bars on the graph at the bottom. Only one game (Civ VI) has the Nvidia card with an advantage of >14% over the 7900 XT, so surely the average can't be that skewed... right?
> 
> Unless this is some really, really, extreme example of Simpson's Paradox...


The "relative performance" is calculated slightly differently to the "average FPS" (which the new per-game chart is based on).

Relative performance looks at each test, scales the tested card to 100%, and the others accordingly. Then it averages all tests.
Average FPS is based on averaging the FPS for each card across all tests and then these are compared.

Makes sense? Thoughts on which is better? Bonus points for "why" 
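A minimal sketch of the two methods, using made-up FPS numbers (not review data), might look like this:

```python
# Hypothetical per-game FPS results for two cards across three tests.
fps = {
    "7900 XT": [100.0, 60.0, 140.0],
    "3090 Ti": [95.0, 66.0, 133.0],
}

def relative_performance(fps, card, baseline):
    """Scale the baseline card to 100% in each test, then average the ratios."""
    ratios = [c / b * 100 for b, c in zip(fps[baseline], fps[card])]
    return sum(ratios) / len(ratios)

def average_fps_ratio(fps, card, baseline):
    """Average each card's FPS across all tests first, then compare the means."""
    mean = {name: sum(vals) / len(vals) for name, vals in fps.items()}
    return mean[card] / mean[baseline] * 100

print(relative_performance(fps, "3090 Ti", "7900 XT"))  # weights every game equally
print(average_fps_ratio(fps, "3090 Ti", "7900 XT"))     # weights high-FPS games more
```

With these numbers the two methods disagree (100% vs. 98%): the per-test method treats each game equally, while averaging raw FPS first lets high-framerate games dominate the result.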

Edit: oh wait, the chart title was wrong, it's supposed to be "vs 3090 Ti", not "vs 4080" (forgot to update the XLS manually). Haven't had time to macro this chart, I had the idea for it this morning


----------



## Acesbong (Dec 12, 2022)

Performance is meh, price isn't great. I'd be surprised if these don't rot on shelves like the 4080. Just not impressive/exciting.
Also, having bugs and driver problems in a $1K card is unacceptable. AMD thinks it can carry on its practices yet charge a premium. The GPU division is reminding me of the CPU division back in the Bulldozer days, when they spent all the money on marketing.
I was going to pick one up day one, but I think I'll pass. The 7800 XT series is going to be what, 10% faster than the previous gen? These cards are 7800 XT performance with a 7900 XT price tag.


----------



## THU31 (Dec 12, 2022)

So this card is even less attractive than the 4080 relative to the 4090. The value would be questionable even at $800.

But anyone who can spend $900 can spend $1000. They are obviously counting on this.

Jensen was right. The days of getting more performance at the same price are over.


----------



## mechtech (Dec 12, 2022)

Looks like the TSMC process helped the 4080, along with good BIOS and driver implementations for power and efficiency. The 7900 series is OK, but I was expecting better given their performance/watt claims. At the end of the day it comes down to price. The 4080 in Canada is about $2000 w/o the 13% tax. Glad I got a good deal on a 2nd-hand 6800. Have to wait now for the 7600 and 4060 series to come out.


----------



## cosminmcm (Dec 12, 2022)

Acesbong said:


> Performance is meh, price isn't great. I'd be surprised if these don't rot on shelves like the 4080. Just not impressive/exciting.
> Also, having bugs and driver problems in a $1K card is unacceptable. AMD thinks it can carry on its practices yet charge a premium. The GPU division is reminding me of the CPU division back in the Bulldozer days, when they spent all the money on marketing.
> I was going to pick one up day one, but I think I'll pass. The 7800 XT series is going to be what, 10% faster than the previous gen? These cards are 7800 XT performance with a 7900 XT price tag.


Bulldozer/Piledriver had decent prices. These do not.


----------



## Bomby569 (Dec 12, 2022)

Disappointing price. The rest (overall disappointment ) is just a consequence of this.

Nvidia will walk all over AMD as usual.


----------



## LFaWolf (Dec 12, 2022)

wolf said:


> Well, the 7900XTX is 19% faster and 10% more expensive... gotta sell defective dies somehow right, may as well make the top product look like better value.


I am disappointed in this as well. I would seriously consider it at $700 or even $800, but pricing for GPUs nowadays is out of whack. We will have to wait and see what happens next.


----------



## Acesbong (Dec 12, 2022)

cosminmcm said:


> Bulldozer/Piledriver had decent prices. These do not.


I just recall them releasing a "black edition" of some old CPU for $1k back in the day, when in reality it got curb-stomped by a 2600K or 3770K. They thought that if they made it a "limited edition" they could charge Intel-like money for it. That's the reason I mentioned the marketing. They are selling 7800 XT cards here for 6900 XT prices, imo. Delusional.


----------



## lightning70 (Dec 12, 2022)

erocker said:


> Ehh... I'd think $799 would be the upper limit for this card.


True, $799 would be much better.


----------



## Blaeza (Dec 12, 2022)

Seeing as I just spent £650 on a 6900 XT, I'm glad the 7900 XT is priced at $899 - less buyer's remorse, lol. I think AMD did a good job, and I'm looking forward to the rest of the lineup to piss all over Nvidia's 4060 and so on.


----------



## 1d10t (Dec 12, 2022)

On par with the RTX 3090 Ti despite a narrower bus and slower, smaller VRAM. Still, at that price and power consumption, I don't think I'll ever buy this card.


----------



## bug (Dec 12, 2022)

I wouldn't call the "no need for a 16 pin cable" a pro. I mean, it still ties up the same two 8 pin connectors from your PSU.

In other news, at the asking price, I wouldn't care about this card if it played the newly released Portal RTX at 4K at 1,000 fps.


----------



## Pumper (Dec 12, 2022)

This will be $749-$799 in a month.


----------



## AirplaneA1 (Dec 12, 2022)

I presume the 7900 XT is Navi 31 silicon salvaged from defective dies, set at a slightly lower price than the flagship and put up for sale. The purpose is to highlight the 7900 XTX's price/performance ratio. This behavior goes too far, the same as NVIDIA's. I guess this thing will rot on the shelf.


----------



## WhoDecidedThat (Dec 12, 2022)

Nvidia has no excuse to not release a 4080 variant which is the same size as a 7900XT.


----------



## bug (Dec 12, 2022)

Pumper said:


> This will be $749-$799 in a month.


I think it depends. If the XTX is priced fairly and the XT is just an attempt to cash grab, yes these could see a price cut (also depending on where the 4070 and 4070Ti land). If, on the other hand, it's the XT that is priced fairly and the XTX is selling at a discount, to undercut the 4080, AMD may not be happy with two discounted top-end SKUs on their hands at the same time.


----------



## swirl09 (Dec 12, 2022)

This is the wrong price for this card.

I think the 7900 XTX, even with a few titles performing worse than expected (perhaps a driver update will correct this), is decent. But I don't see who is buying this card. It's worse than the "value" of a 4080, which was already poor.


----------



## GhostRyder (Dec 12, 2022)

Pretty disappointing honestly; the drop in performance is a lot bigger than I was expecting, and its performance is further below the 4080 than I expected. Not a bad card, but it does not live up to what was expected of it at this price, I am sorry to say. I would be interested in it for $100 less.


----------



## damric (Dec 12, 2022)

I think this $900 price is only temporary to help clean out remaining high end RX 6000 cards. I don't see many people buying this model until it's down well below $700, and I'm sure AMD is ok with that. Anyways, I'm very happy with my 6900XTs that I bought late in the cycle for cheap so I'll most likely skip this generation. I also had to order an RX 6600 for just over $200 yesterday to upgrade my son's RX 480 4GB. I'm actually very interested in benching (with MPT) that one before I hand it over.


----------



## W1zzard (Dec 12, 2022)

damric said:


> I think this $900 price is only temporary to help clean out remaining high end RX 6000 cards. I don't see many people buying this model until it's down well below $700, and I'm sure AMD is ok with that.


That's a very good point you're making. Let me mention that in the conclusion


----------



## gupsterg (Dec 12, 2022)

As always, thank you W1zzard for the great reviews. A bounty of data to feast on.


----------



## ZoneDymo (Dec 12, 2022)

why the insane multi monitor and video playback power consumption?


----------



## shovenose (Dec 12, 2022)

Thank you for the review. I was pretty sure I was going to get a 7900XT but based on this review, I’m not so sure, and I might either go for a used 6900XT or a new 7900XTX… Although, since I don’t get paid til Thursday I might not be buying much of anything if I do decide to go new given they’ll probably be all sold out by then.

All that said, the XT should be $849 or $799 not $899 IMO. The XTX at $999 is fine in todays world I just don’t think the price on the XT is right in relation to how much performance is lost compared to the XTX.

Anyone have Microsoft Flight Sim 2020 benchmarks for these? Please?


----------



## BigMack70 (Dec 12, 2022)

wolf said:


> Well, the 7900XTX is 19% faster and 10% more expensive... gotta sell defective dies somehow right, may as well make the top product look like better value.



It's pretty disgusting that both AMD and Nvidia are price gouging everyone horribly so that their most insanely overpriced products are the ones with the most "value". 

I hope these rot on shelves alongside the 4080.


----------



## tabascosauz (Dec 12, 2022)

shovenose said:


> Thank you for the review. I was pretty sure I was going to get a 7900XT but based on this review, I’m not so sure, and I might either go for a used 6900XT or a new 7900XTX… Although, since I don’t get paid til Thursday I might not be buying much of anything if I do decide to go new given they’ll probably be all sold out by then.
> 
> All that said, the XT should be $849 or $799 not $899 IMO. The XTX at $999 is fine in todays world I just don’t think the price on the XT is right in relation to how much performance is lost compared to the XTX.
> 
> Anyone have Microsoft Flight Sim 2020 benchmarks for these? Please?



+1 especially since the 5800X3D walks all over everything else in MSFS, definitely want to see how both 13th gen and 7000 series do there

On one hand, the reference cooler package is just perfect to me, temps are low, power is acceptable, no more 6900XT power spikes, performance is ok to me in relation to a product I'll never consider buying (4080 - hate the FE's size, hate the connector, hate the fact there are no more decent Nvidia AIBs)

On the other hand, this thing is going to be $1300-1400 CAD. That's nuts.

I guess I'll wait for the price to drop like 6900XT. Not paying $1400 just for MW2.

Also, @W1zzard during what test are the temps taken? Surprising that the XT and XTX came out identical, HWUB had the XTX at 9°C higher in game load


----------



## ARF (Dec 12, 2022)

The performance is shockingly low.

It only manages to tie the 2-year-old RTX 3000 top card, while standing very poorly against the new RTX 4000 generation.

What a disgusting AMD move! 

This thingie should cost no more than $500, and even then I would be generous to pay that much for all the flaws that come with it!


----------



## BigMack70 (Dec 12, 2022)

ARF said:


> This thingie should cost no more than $500, and even then I would be generous to pay that much for all the flaws that come with it!


Yup. This is obviously a $500 GPU, and the XTX and 4080 are $700 GPUs. 

But greedy corporations are gonna greed I suppose. It's a shame, and this move makes NO sense from AMD, who really need a good generation of GPUs to regain market share. Not gonna happen at these price points. I guess they're aiming to get down into single digit market share.


----------



## wheresmycar (Dec 12, 2022)

Thanks for the review @W1zzard 

A special note to AMD and NVIDIA (OUR SAVIOURS) as always nowadays... great cards!! but for the price NO THANK YOU VERY VERY MUCH.... stick it where the sun don't shine!


----------



## ModEl4 (Dec 12, 2022)

The RX 7900 XT should have been ≤$849.
If the 4070 Ti has the same specs as the "unlaunched" 4080 12GB, then at $799 the 4070 Ti will have better performance/$ at 1440p for those interested in high-refresh QHD gaming (for those interested in 4K there is the 7900 XTX, and if we take ray tracing into account, the 4070 Ti will have better performance/$ with ray tracing enabled even at 4K).
Just buy a 7900 XTX for 4K, or wait for the 7800 XT (full N32, I mean) for high-refresh QHD gaming.


----------



## ARF (Dec 12, 2022)

ModEl4 said:


> The RX 7900 XT should have been ≤$849.
> If the 4070 Ti has the same specs as the "unlaunched" 4080 12GB, then at $799 the 4070 Ti will have better performance/$ at 1440p for those interested in high-refresh QHD gaming (for those interested in 4K there is the 7900 XTX, and if we take ray tracing into account, the 4070 Ti will have better performance/$ with ray tracing enabled even at 4K).
> Just buy a 7900 XTX for 4K, or wait for the 7800 XT (full N32, I mean) for high-refresh QHD gaming.



Nah, this "7900 XT" is the same stupid shenanigan as Nvidia's infamous attempt to deceive everyone with the "RTX 4080 12GB".
This thingie should have been called the 7800, in all honesty, and its memory pool lowered to 16 GB, because 16 GB is plenty for this junk of a cut-down chip.


----------



## ModEl4 (Dec 12, 2022)

ARF said:


> Nah, this "7900 XT" is the same stupid shenanigan as Nvidia's infamous attempt to deceive everyone with the "RTX 4080 12GB".
> This thingie should have been called the 7800, in all honesty, and its memory pool lowered to 16 GB, because 16 GB is plenty for this junk of a cut-down chip.


Sure, I said the same thing 40 days ago (it should have been the 7800 XT at $649-$699, since it's just a cut-down N31) when AMD announced the models. I just wanted to point out that even an uninspiring value like an RTX 4070 Ti at $799 will make this look like a bad value...


----------



## holyprof (Dec 12, 2022)

AMD copied Nvidia with these cards


| Card name | What it really is |
| --- | --- |
| 4080 | 4070 Ti |
| 4070 Ti | 4070 |
| 7900 XTX | 7900 XT |
| 7900 XT | 7900 |

I'm still having some hope for 4070 (Ti) or 7800 (XT) priced within my budget, but not holding my breath. Giving a full skip to all those $800+ (€1200+ in my country) GPUs.


----------



## zlobby (Dec 12, 2022)

Frametime results, finally! Great review!


----------



## TheinsanegamerN (Dec 12, 2022)

BigMack70 said:


> Yup. This is obviously a $500 GPU, and the XTX and 4080 are $700 GPUs.
> 
> But greedy corporations are gonna greed I suppose. It's a shame, and this move makes NO sense from AMD, who really need a good generation of GPUs to regain market share. Not gonna happen at these price points. I guess they're aiming to get down into single digit market share.


They're already there; Nvidia sits at 88 percent, AMD at 8, and Intel at 4 in more recent sales surveys. It's not 100 percent accurate, but it does indicate AMD's attempt at being a "premium brand" is backfiring.


----------



## BigMack70 (Dec 12, 2022)

TheinsanegamerN said:


> They're already there; Nvidia sits at 88 percent, AMD at 8, and Intel at 4 in more recent sales surveys. It's not 100 percent accurate, but it does indicate AMD's attempt at being a "premium brand" is backfiring.


I haven't understood anything AMD has done for over 10 years. Their last truly competitive top end product was the HD 7970. But they keep trying to play "me too" with their marketing and pricing, as if they can compete with Nvidia's top products. The market recognizes this for the obvious nonsense that it is, and nobody buys AMD's graphics cards.

They need to completely rebrand and reprice their products as the gamer-friendly alternative to Nvidia, with aggressive pricing that exposes Nvidia's greedy anti-consumer practices.

What do they do instead? Market themselves as the superior 8k gaming option and price their cards so high that nobody is going to care to buy them. Their GPU division - at least those in charge of pricing and marketing - is a joke.


----------



## Juventas (Dec 12, 2022)

> Fantastic energy efficiency


It's less efficient than the RTX 4080. Isn't that its primary competitor, at least in this review?


----------



## Chrispy_ (Dec 13, 2022)

As predicted by many people last month based on specs alone, the price of this card is wrong for its specs and performance.

The specs are cut down from the XTX by almost exactly 1/6th, and it performs at close enough to 5/6th of the XTX. It should cost no more than 5/6th of an XTX ($829) and arguably even that is too much because at that price there's no performance/$ penalty for skipping it and just getting the XTX instead. Realistically, it's a $749 card if it is going to coexist with a $999 XTX. I guess we'll see over the coming days just how realistic a $999 XTX is though....
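As a quick sketch of the arithmetic above (the prices and the 5/6 performance fraction are the ones from this thread; the function itself is purely illustrative):

```python
def parity_price(flagship_price, perf_fraction):
    """Price at which a cut-down card's performance/$ matches the flagship's."""
    return flagship_price * perf_fraction

# ~5/6 of the XTX's performance against a $999 XTX:
print(parity_price(999, 5 / 6))  # anything above this is worse performance/$
```

At roughly $832, the XT would merely match the XTX's performance per dollar; it has to be cheaper than that to give buyers any reason to step down.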

The biggest problem with the 7900XT's MSRP isn't the price relative to the XTX, it's the 6900XT you can buy new for $650.


----------



## Minus Infinity (Dec 13, 2022)

Honestly both cards are hugely underwhelming: the 7900 XT is barely 22% faster than the 6900 XT at 1440p and 4K, there's huge power draw in multi-monitor setups, overall worse efficiency than the 4080, and p!ss-poor RT performance. The hype was massive, and not even getting anywhere near the 50% AMD promised is a joke. Maybe the drivers are garbage, but I doubt it for most games. If Nvidia drops the 4080's price by $200, AMD will be forced into major price cuts themselves. Also disappointing is again seeing how relative performance drops in most games at 4K despite the larger bus widths. At this stage I'll keep waiting, but it's looking more like a 6800 XT or 3080 upgrade for my 1080 Ti, although I want to move to a 4K monitor for productivity work and those cards won't be ideal for newer games.


----------



## ymdhis (Dec 13, 2022)

Regarding the multi-monitor tests, how did you test that? There's a known "not-a-bug" where, if you use screens with different refresh rates, the card can't lock into the vertical sync period to do its work in the minimum time, and instead has to run the memory at max speed to be able to refresh properly at the mismatched rates. Since it has to run the memory at full speed, the card ends up drawing much more power. Both my Vega and Polaris behaved like that, until I realized I could set them to matching refresh rates, and multi-monitor power usage fell to the same level as the single-monitor one (note: this is with 60 Hz monitors only, with the secondary one sometimes dropping to 59.94 because of HDMI connections capping the max refresh rate on my amplifier).
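The timing problem can be illustrated with a rough simulation (all numbers below are assumptions for illustration, not measurements): with matched refresh rates, the vertical blanking intervals of both displays stay aligned, giving the driver a recurring window for memory reclocking, while mismatched rates let the intervals drift apart.

```python
def shared_vblank_fraction(f1_hz, f2_hz, vblank_ms=0.5, sim_ms=2000.0, step_ms=0.01):
    """Fraction of simulated time during which both displays are in
    vertical blanking simultaneously (both assumed in phase at t=0)."""
    t1 = 1000.0 / f1_hz  # frame period of display 1, ms
    t2 = 1000.0 / f2_hz  # frame period of display 2, ms
    hits = total = 0
    t = 0.0
    while t < sim_ms:
        # Each display is "in vblank" for the first vblank_ms of its frame.
        if (t % t1) < vblank_ms and (t % t2) < vblank_ms:
            hits += 1
        total += 1
        t += step_ms
    return hits / total

matched = shared_vblank_fraction(60.0, 60.0)
mismatched = shared_vblank_fraction(60.0, 59.94)
print(matched, mismatched)  # the shared window shrinks once rates mismatch
```

Even the tiny 60 vs. 59.94 Hz mismatch makes the blanking intervals drift out of alignment within a fraction of a second, which is consistent with the behavior described above.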


----------



## cityuser (Dec 13, 2022)

And suddenly 12-bit color depth, 8K VC-1 decode/encode, and DisplayPort 2.1 became unimportant, when these are the advantages AMD is equipped with?


----------



## W1zzard (Dec 13, 2022)

tabascosauz said:


> Also, @W1zzard during what test are the temps taken? Surprising that the XT and XTX came out identical, HWUB had the XTX at 9°C higher in game load














Thermal Analysis chart. If you report "max" temp, then you'll get a higher number, but that's unrealistic. I report the long-term stable temperature, as that's what matters for gaming.



ymdhis said:


> Regarding the multi-monitor tests, how did you test that? There's a known "not-a-bug" where, if you use screens with different refresh rates, the card can't lock into the vertical sync period to do its work in the minimum time, and instead has to run the memory at max speed to be able to refresh properly at the mismatched rates. Since it has to run the memory at full speed, the card ends up drawing much more power.


That's exactly how I test, intentionally. The thing is that this same test works perfectly fine on NVIDIA with lower power draw, so it must be possible. 
It does not "have" to run the memory at full speed, that's just the way AMD runs their card, which is suboptimal.


----------



## bearClaw5 (Dec 13, 2022)

ARF said:


> It only manages to tie the 2-year-old RTX 3000 top card, while standing very poorly against the new RTX 4000 generation.


The 3090ti came out earlier this year.


----------



## ARF (Dec 13, 2022)

bearClaw5 said:


> The 3090ti came out earlier this year.



The very same GA102 chip, which launched back in 2020, exactly 2 years and 3 months ago and counting.






----------



## Chrispy_ (Dec 13, 2022)

Minus Infinity said:


> Honestly both cards are hugely underwhelming: the 7900 XT is barely 22% faster than the 6900 XT at 1440p and 4K, there's huge power draw in multi-monitor setups, overall worse efficiency than the 4080, and p!ss-poor RT performance.


You're comparing the 2nd-tier model with last generation's flagship and disappointed that it's only 22% faster?

How is that different to Nvidia, or previous AMD generations going back at least half a decade?





As for multi-monitor power draw, that's likely a bug with launch-day drivers and should be patched soon. Neither the drivers nor the reference card's power delivery look great, but at the same time it's not as if either of Nvidia's last two launch generations have been problem-free either. That's why we get driver updates!

Efficiency is a big one that won't be solved with software or drivers; The fact it's not as efficient as Nvidia is potentially down to the chiplet design, which adds energy cost overheads and is one of the main reasons we only get monolithic AMD CPUs for laptops. Chiplet design increases the physical distance between bits of silicon that have to communicate with each other, and the additional interfaces between chiplets all have some internal resistance. It's small, but it adds up. Even if it doesn't beat the 4080, it's still MUCH more efficient than the previous generations.

You have to remember that every design decision has implications/drawbacks. Chiplet design reduces costs at the expense of some efficiency (among other things), and with RDNA3 we _are_ seeing cheaper cards. Unlike the 6900 series, the XTX is ~$400 less than the cheapest 4080 cards, and the XT, even at its "incorrect" price, offers better performance/$ than the 4080 by a decent margin. We'll have to see what price Nvidia launches the 4080 12GB 4070 Ti at to truly compare the 7900 XT against the competition, though.


----------



## ARF (Dec 13, 2022)

Chrispy_ said:


> You're comparing the 2nd-tier model with last generation's flagship and disappointed that it's only 22% faster?



And it's correspondingly more expensive. This is not how things work, or how they should be, in the first place.

Because if the trend continues, the RX 8900 will cost $1900 and the RX 9900 will cost $2900.


----------



## RainingTacco (Dec 13, 2022)

Pricing is definitely very weird on this one...



ZoneDymo said:


> why the insane multi monitor and video playback power consumption?


This bug lingers from the Vega days; during RDNA1 it was a very prominent issue, and sadly AMD doesn't know how to fix it. The behavior depends on the particular monitor: with some, the GPU downclocks correctly; with others it doesn't. One fix could be to use CRU, but it's not guaranteed; sometimes when you find the CRU spot where the GPU downclocks, the monitor has issues and resets. I wrote about it 2 years ago on their forum and there's still no fix to this day - one of many reasons I sold my 5700 XT.


----------



## ARF (Dec 13, 2022)

RainingTacco said:


> This bug lingers from the Vega days; during RDNA1 it was a very prominent issue, and *sadly AMD doesn't know how to fix it*.



Weird indeed. Why don't they simply ask someone in the know, maybe the fellow nvidia engineers?


----------



## Chrispy_ (Dec 13, 2022)

ARF said:


> And it's correspondingly more expensive. This is not how things work, or how they should be, in the first place.
> 
> Because if the trend continues, the RX 8900 will cost $1900 and the RX 9900 will cost $2900.





RainingTacco said:


> Pricing is definitely very weird on this one...


Honestly, both AMD and Nvidia are milking the high-end right now but it's hurting the PC gaming market.

When you can buy a PS5 or XBSX for less than the price of a two-year-old midrange GPU, more and more of the market is moving to consoles. Not here on TPU, the hardware enthusiast site - but absolutely 100% in the rest of the gaming industry.

At some point, if this trend of raising GPU prices far beyond console prices continues, PC gaming will simply stop existing. Half a decade ago, you could build a low-end PC that matched a console for around 50% more money than the console. Currently a PC that can game as well as a current-gen console costs at least twice as much as that console. By the time it costs 5x more to game on PC than console, 99% of the gamers will have moved over to console.


----------



## ymdhis (Dec 13, 2022)

W1zzard said:


> That's exactly how I test, intentionally. The thing is that this same test works perfectly fine on NVIDIA with lower power draw, so it must be possible.
> It does not "have" to run the memory at full speed, that's just the way AMD runs their card, which is suboptimal.


Ah, okay, so you intentionally use a setup that uses more power. I suspected as much, thanks for confirming.


----------



## W1zzard (Dec 13, 2022)

ymdhis said:


> Ah, okay, so you intentionally use a setup that uses more power. I suspected as much, thanks for confirming.


You make it sound like I'm doing the wrong thing. I'm intentionally not using the "easy" case. My work setup for 15 years has been one big screen + a smaller one on the side, for E-Mail etc .. so this is a realistic setup


----------



## shovenose (Dec 13, 2022)

W1zzard said:


> You make it sound like I'm doing the wrong thing. I'm intentionally not using the "easy" case. My work setup for 15 years has been one big screen + a smaller one on the side, for E-Mail etc .. so this is a realistic setup


Do you like that setup? I still have one of my previous monitors (23” 1080P) I’m thinking of adding back to my desk next to my new monitor (27” 4K) purely for productivity reasons. I just feel like it would bother me having two different monitors.


----------



## Arco (Dec 13, 2022)

I'd argue this might be a 7800 XT really.


----------



## ymdhis (Dec 13, 2022)

W1zzard said:


> You make it sound like I'm doing the wrong thing. I'm intentionally not using the "easy" case.


You are intentionally using a case where one card always performs better, so at the very least you should add a disclaimer that the results were achieved under such settings. Or even better, test it both ways. Because it IS possible to get drastically lower power usage by adjusting them to matching refresh rates, so you can have a scenario where two users with different multi-monitor setups will have drastically different power usages. And then TPU will get a reputation for publishing unreliable numbers, because they have the same card, same drivers, but get far better values than what you published - which means your tests are either incompetent, or purposefully skewed.

I mean, how do you think it looks when you publish 40-70 W power usage, and then someone with the same card gets 5 W?

Moreover, sometimes things like driver or windows updates can mess with the refresh rate settings. So if you don't inform the user about this fact, it will be another thing the drivers mysteriously break or fix, when it is in fact just a single setting they need to adjust. Not any different than if a new driver enforced a different antialiasing or vsync setting, that would break performance - only that this one is rather more obscure.



> My work setup for 15 years has been one big screen + a smaller one on the side, for E-Mail etc .. so this is a realistic setup


It's not the screen size you have to match, it's the refresh rate. Also having multiple identical monitors is an equally realistic setup...


----------



## W1zzard (Dec 13, 2022)

shovenose said:


> Do you like that setup? I still have one of my previous monitors (23” 1080P) I’m thinking of adding back to my desk next to my new monitor (27” 4K) purely for productivity reasons. I just feel like it would bother me having two different monitors.


It's the best, doesn't take up a ton of space, but multi-monitor makes a HUGE difference for productivity. I can move my review data around, Excel, reviews editor, etc etc



ymdhis said:


> You are intentionally using a case where one card always performs better


Always? AMD just needs to fix their shit? How hard can it be?



ymdhis said:


> Also having multiple identical monitors is an equally realistic setup...


Agreed. So you prefer I test two identical monitors, and let this one slide for AMD?


----------



## ymdhis (Dec 13, 2022)

W1zzard said:


> Always? AMD just needs to fix their shit? How hard can it be?


How many processors or device drivers have you engineered, that you can ask "how hard can it be"?



> Agreed. So you prefer I test two identical monitors, and let this one slide for AMD?


No, I'm saying that you should provide numbers which give accurate results. If two different software settings give you different results, then you should test both to be fair, otherwise you risk your tests either being highly inaccurate or biased towards one manufacturer. Right now you don't list the specific refresh rates on the test setup, *even though this is a factor known to affect multi-monitor power usage.*

I understand that it is not possible to test every different use case, but setting your monitors to a matching refresh rate isn't very difficult at all - neither is writing a small disclaimer that "the multi-monitor test was done with monitors A and B running at resolutions X and Y, and refresh rates U and V."


----------



## shovenose (Dec 13, 2022)

Well, despite my earlier reservations I got frustrated that the XTX was already sold out at 6:02 so I ordered the XT and used the $100 left over for a 1TB Samsung 980 Pro to replace my tired 1TB SATA boot drive.


----------



## ymdhis (Dec 13, 2022)

Also regarding this:


> Always? AMD just needs to fix their shit?



A simple google search shows that high multi-monitor power usage also happens very often on geforce cards. They even have a specific tool called "Multi Display Power Saver" to get around it, except it sometimes causes even more issues.


----------



## ARF (Dec 13, 2022)

@W1zzard Can you post an editorial article about graphics card prices? I think it needs detailed research of the industry. Is it lunatic profiteering by AMD, or does this particular segment face a very severe increase in component costs?

Current pricing in euro in Germany:

Radeon RX 6400 - 132.24
Radeon RX 6500 XT - 168.51
Radeon RX 6600 - 268.95
Radeon RX 6600 XT - 339.00
Radeon RX 6650 XT - 328.99
Radeon RX 6700 XT - 425.61
Radeon RX 6750 XT - 459.00
Radeon RX 6800 - 578.90
Radeon RX 6800 XT - 689.00
Radeon RX 6900 XT - 799.00
Radeon RX 6950 XT - 874.90
Radeon RX 7900 XT - 1049.00
Radeon RX 7900 XTX - 1199.00

It doesn't get better, despite some of these cards becoming very, very old now.


----------



## W1zzard (Dec 13, 2022)

ymdhis said:


> How many processors or device drivers have you engineered, that you can ask "how hard can it be"?


Drivers? About 10 in 15 years. Also various electronics projects, using MCUs, Arm, Atmel AVR, various embedded systems, portable devices. I also make software that's installed on millions of PCs.



ymdhis said:


> If two different software settings give you different results, then you should test both to be fair





ymdhis said:


> I understand that it is not possible to test every different use case, but setting your monitors to a matching refresh rate isn't very difficult at all - neither is writing a small disclaimer that "multi-monitor testing was done with monitors a and b running at resolutions x and y, and refresh rates u and v".


Can't test all multi-monitor scenarios; there are literally infinite combinations



http://imgur.com/iDjq8uN

Like this ?


----------



## ymdhis (Dec 13, 2022)

oh ok, sorry, I missed that. I was looking at the test setup page which did not list monitors.


----------



## ARF (Dec 13, 2022)

@W1zzard Can you add 8K 4320p tests for these cards?
I see that the performance drop from 1080p to 1440p to 2160p is barely visible in some cases, so why not add new, higher resolutions?





AMD Radeon RX 7900 XT Review - Borderlands 3 | TechPowerUp


----------



## JRMBelgium (Dec 13, 2022)

@W1zzard 
You probably won't have time for it, but perhaps you have already experimented with the -10% power limit setting or with undervolting? Seems to me that not a single reviewer has spent time on it (which I totally understand, by the way). It's always interesting to see how this impacts performance, temperatures and power usage for people who live in hot countries or in countries where energy is very expensive.


----------



## W1zzard (Dec 13, 2022)

ARF said:


> @W1zzard Can you add 8K 4320p tests for these cards?
> I see that the performance drop from 1080p to 1440p to 2160p in some cases is barely visible, so why not adding new higher resolutions?
> 
> View attachment 274341
> AMD Radeon RX 7900 XT Review - Borderlands 3 | TechPowerUp


Send me a monitor. No seriously, testing another resolution just takes too long, and it's just too slow in most games. Borderlands 3 is getting kicked out next round of retesting btw


----------



## ARF (Dec 13, 2022)

W1zzard said:


> Send me a monitor.



I'd love to, but I think your organisation can afford to test everything on an 8K TV or monitor, which are becoming more mainstream now.



W1zzard said:


> No seriously, testing another resolution just takes too long, and it's just too slow in most games. Borderlands 3 is getting kicked out next round of retesting btw



It's the same story in Hitman 3:




AMD Radeon RX 7900 XT Review - Hitman 3 | TechPowerUp

Actually, 1440p is faster than 1080p.

I think the test suite should also include something very popular like CS:GO.


----------



## bearClaw5 (Dec 13, 2022)

ARF said:


> The very same GA102 chip which was launched back in 2020, exactly 2 years and 3 months ago and counting
> 
> View attachment 274272
> NVIDIA GA102 GPU Specs | TechPowerUp GPU Database


Yeah, but it wasn't available in a consumer card until this year. I'm not defending this card, but it does outperform, even if only slightly, the GA102 cards we could get until this year.


----------



## shovenose (Dec 14, 2022)

ARF said:


> I'd love to but I think that your organisation can afford to test everything on an 8K TV or monitor which are becoming more mainstream now.
> 
> 
> 
> ...



I managed a hardware reviews site for a short period of time, then I sold it. I think you are misjudging the huge amount of time and effort these reviews take versus how little money they bring in.


----------



## W1zzard (Dec 14, 2022)

ARF said:


> Hitman 3


Hitman 3 will be replaced too next time, probably. Depends on what games come out, etc. I am definitely adding The Callisto Protocol, Mount and Blade Bannerlord, Plague Tale Requiem, Spiderman. Probably Evil West and Uncharted 4. So 4-6 titles need to go



ARF said:


> I think the test suit should also include something very popular like CS:GO.


CS:GO will probably run completely CPU-limited? Do you know of any other sites that test CS:GO, so I can take a look at their data?


----------



## murr (Dec 14, 2022)

The XT is way too close in price to the XTX to even consider it. The XT should be more like $500.


----------



## Avro Arrow (Dec 14, 2022)

This card exists for one reason and one reason only...  To make the RX 7900 XTX look better.

It's so stupid too because AMD had a real hit with the RX 6800 XT when compared to the RX 6900 XT.  The better value should _always_ be with the lower-tier card but both nVidia and AMD have bucked that trend.  I foresee the RX 7900 XT being dropped to about ~$600 before the end of 2023.

As things are now, _nobody _should buy this card.  This is something that went through my mind as soon as Lisa Su showed off the pricing at their "let's talk a lot of BS event".  It was like seeing three Jensen Huangs on stage and all I could do was cringe.


----------



## JRMBelgium (Dec 14, 2022)

Avro Arrow said:


> This card exists for one reason and one reason only...  To make the RX 7900 XTX look better.
> 
> It's so stupid too because AMD had a real hit with the RX 6800 XT when compared to the RX 6900 XT.  The better value should _always_ be with the lower-tier card but both nVidia and AMD have bucked that trend.  I foresee the RX 7900 XT being dropped to about ~$600 before the end of 2023.
> 
> As things are now, _nobody _should buy this card.  This is something that went through my mind as soon as Lisa Su showed off the pricing at their "let's talk a lot of BS event".  It was like seeing three Jensen Huangs on stage and all I could do was cringe.



For me it's the only GPU that I can buy for my 650watt system. I refuse to buy any GPU with TDP over 300 watts.


----------



## candymancan21 (Dec 15, 2022)

Ok so I've read all the replies here and many reviews. I'll put in my 2 cents.

I've been a gamer for 34 years. Been building computers, overclocking and watercooling for 23 years now. I've seen it all: AMD Duron/Athlon CPUs, dual-core 64-bit CPUs, GeForce 2s and ATI cards, all the way up to the stuff now.

Video cards back then were $400 or so for top of the line, like a 9700 Pro. Then the 8800 GTX came out, and prices started skyrocketing. This all started with Nvidia. People may not remember this, but Nvidia got sued over the price gouging and they got fined for it. It was a big stink for years. Then suddenly everyone forgot.

Then the 900 series came out, and then the 10 series. And that's what I have, a 1080 Ti. For the last 4 years I've enjoyed the amazing performance this card has put out. What I didn't enjoy was spending $870 for it. That's the first time I ever spent that much on a card; normally I'd spend maybe $500-600.

Now suddenly Nvidia wants $1300+ for a video card. I'm shocked they aren't being sued again. I'm shocked anyone would even buy a video card for that much. But hey, I spend $1800 on guns or car stuff or other hobbies, but all those items last forever. A video card is done in 3-4 years tops.

Which is where I'm at now. My 1080 Ti is too slow for 2K gaming. I'm not a fan of being at 40-80 fps in Total War: Warhammer 3 or other games. So it's time to upgrade.

What will I get? Well, it won't be an Nvidia card. Do I want to spend $1300 for 140 fps over 110 fps? No I don't. Do I want to spend $900 for a card that isn't #1 like my 1080 Ti was when I got it? No I don't.

The 7900 XT is overpriced. I'm disappointed in this. I'm disappointed in AMD/ATI. Ever since their success with Ryzen they have become just like Intel and Nvidia and charge a premium now.

Nvidia's excuse for their prices being so high is talk about the complexity of making the GPU. That's total BS; anyone who falls for that has no common sense. A GeForce 3 or 4 was complex as well at the time, but there was no price gouging.

Of course now we have miners and scalpers. Scalping wasn't much of a thing back then. Part of this scalping trend started with video games and in-game currency, and now it's spread from games to real life because it works. It's a shame.

So what will I do? I'm not sure. I have a 3800X and an X570 board. I'll probably upgrade to the best CPU my X570 can support instead of buying new RAM and a motherboard. And it looks like I'm going to be forced to buy a 7900 XT. But for $100 more, a 7900 XTX just seems like the smarter thing to do. AMD knew this and that's why they did it. However, some people may not be able to afford that $100 more; $900 is pushing it for me.

But I also believe these prices are there to move the 6000 series inventory out. I predict the prices will probably go down in 2-3 months or so. But who knows.

So in the end, is spending $900 worth it? Yes, for me it is. The 1080 Ti is aged, slow to a point, no ray tracing. The 7900 XT is light years faster. So I think the people buying this will be those with 900/10-series cards. On the other hand, why am I spending $1000 to play a bunch of unfinished video games these days? Nothing is finished or polished anymore. In 2 years a 7900 XT will probably be outdated and slow. Is it worth getting for this price? I'm not sure. But for someone with a 1080 Ti and no options, this seems like the path to take.

Another factor: I have a 750 W PSU with three 8-pin connectors. I'm not upgrading my PSU for 16-pin connectors. But they probably make adapters for that anyway. They always do.

So where I'm standing currently: I either have to buy a new PSU and a $1200 video card, and maybe upgrade to a new motherboard, RAM and CPU if I want to stay at the top. Or just get a new CPU and a $900 video card.

Seems I'll be picking option 2.


----------



## JDUNWIN (Dec 15, 2022)

shovenose said:


> Do you like that setup? I still have one of my previous monitors (23” 1080P) I’m thinking of adding back to my desk next to my new monitor (27” 4K) purely for productivity reasons. I just feel like it would bother me having two different monitors.



People have been using mismatched monitors since ancient times; the side monitor is their old monitor after the upgrade lol

I've been using 3 for the longest time now; productivity just is better with natural window/task divisions. It's the doorway effect for the brain: you forget things when walking through doorways because your brain switches context and dumps short-term memory, and keeping it all up on screen without having to switch desktops prevents this.

Good monitor arms make all the difference, or desk risers if you have a terrible monitor without VESA mounts.


As for power consumption, that is just damning, especially with power prices the way they are. I have a modest 12700 + 3060 system because I've been betting on chiplet GPUs driving next-generation GPU prices down; guess I lost that bet. Running 3 monitors, the 3060 drops down as low as 210 MHz GPU / 202 MHz memory and 20 W power draw in GPU-Z, as it should.

It's hard to make the case for a GPU without a full feature set at luxury prices in a post-GPU-scarcity world. If you can afford a $1k toy, you can afford a $1.2k toy. Saying ray tracing doesn't make sense is like saying playing video games doesn't make sense; it's all for amusement.


----------



## candymancan21 (Dec 15, 2022)

Not sure anyone should complain about power consumption when they spend $1000 on a GPU.

50 watts for 12 hours a day at 13 cents per kWh is only $2.50 a month. 400 watts for 12 hours is about $20 a month. Chipotle is now $15 for a chicken burrito with guac. I mean, just saying.

Has anyone actually done the math on PC power consumption compared to other things in their house? A coffee maker or dryer uses 2-3x more power.
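The back-of-the-envelope math above can be written as a tiny calculator. The wattages, 12 h/day duty cycle, and $0.13/kWh rate are just the figures from this post, not measurements; plug in your own rates:

```python
# Monthly electricity cost for a constant power draw.
# Inputs mirror the post above: 50 W or 400 W, 12 h/day, $0.13/kWh.

def monthly_cost(watts, hours_per_day=12, days=30, price_per_kwh=0.13):
    """Dollars per month for a constant power draw at a given kWh price."""
    kwh = watts / 1000 * hours_per_day * days  # energy used per month
    return kwh * price_per_kwh

print(round(monthly_cost(50), 2))   # -> 2.34  (near-idle draw)
print(round(monthly_cost(400), 2))  # -> 18.72 (gaming draw)
```

At the 75 cents/kWh rate mentioned later in the thread, the same 400 W for 12 hours a day works out to $108 a month, which is why the idle multi-monitor draw matters so much more in some countries.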


----------



## SOAREVERSOR (Dec 15, 2022)

candymancan21 said:


> Not sure anyone should complain about power consumption when they spend $1000 on a GPU.
> 
> 50 watts for 12 hours a day at 13 cents per kWh is only $2.50 a month. 400 watts for 12 hours is about $20 a month. Chipotle is now $15 for a chicken burrito with guac. I mean, just saying.
> 
> Has anyone actually done the math on PC power consumption compared to other things in their house? A coffee maker or dryer uses 2-3x more power.



All true, yet missing a key point.   For people that use a hair dryer it's needed.  It's only on for a moment.   A fridge is needed.   Most of the big power hogs are needed and that output is justified.   A "good"  gaming computer is purely a bragging rights luxury for almost everyone who has one.   There are a rare fraction of gamers who are doing any serious work on these beasts from home.   That power consumption is only needed for gaming same with the parts unless you double as an independent contractor for work or in your off time in which case you can write off the 2000 buck card as well.  If you are not an independent contractor, and at times even if you are, you will have the shit kicker 9000 workstation at work which will suck power and smash things but work foots the bill.

PC gaming has also crashed into the wall where it's largely inferior to consoles now.  All the high points of it for graphics, refresh rates, physics, sound, and all the other arguments where it would stomp consoles in the ground are only actually there if you are willing to go up to the stupidly high end.  For most people a PS5 is the better option by leaps and bounds.  They'll get better frames, better details, and on and on.  At a fraction of the price and less power for better performance.

Which is why if you look at PC gaming hardware stats, it's mostly people who have no business doing it and are making a stupid decision and living cheaply in other areas.  Which LOL, but the cost of being the master race is less performance, higher power draw, for more money, a crappier screen, and eating bad for you food rather than good for you food.  Truly good stuff!


----------



## ZoneDymo (Dec 15, 2022)

these cards are actually really disappointing imo. If you look at the RX 6900 XT vs RX 7900 XT, you basically get a 20 fps improvement with more power consumption, and basically the same RT performance... how does that work?


----------



## JDUNWIN (Dec 15, 2022)

candymancan21 said:


> Not sure anyone should complain about power consumption when they spend $1000 on a GPU.
> 
> 50 watts for 12 hours a day at 13 cents per kWh is only $2.50 a month. 400 watts for 12 hours is about $20 a month. Chipotle is now $15 for a chicken burrito with guac. I mean, just saying.
> 
> Has anyone actually done the math on PC power consumption compared to other things in their house? A coffee maker or dryer uses 2-3x more power.



The critique can easily be flipped, though: if you can easily afford it, why tolerate less functionality at higher power draw?



SOAREVERSOR said:


> All true, yet missing a key point.   For people that use a hair dryer it's needed.  It's only on for a moment.   A fridge is needed.   Most of the big power hogs are needed and that output is justified.   A "good"  gaming computer is purely a bragging rights luxury for almost everyone who has one.   There are a rare fraction of gamers who are doing any serious work on these beasts from home.   That power consumption is only needed for gaming same with the parts unless you double as an independent contractor for work or in your off time in which case you can write off the 2000 buck card as well.  If you are not an independent contractor, and at times even if you are, you will have the shit kicker 9000 workstation at work which will suck power and smash things but work foots the bill.
> 
> PC gaming has also crashed into the wall where it's largely inferior to consoles now.  All the high points of it for graphics, refresh rates, physics, sound, and all the other arguments where it would stomp consoles in the ground are only actually there if you are willing to go up to the stupidly high end.  For most people a PS5 is the better option by leaps and bounds.  They'll get better frames, better details, and on and on.  At a fraction of the price and less power for better performance.
> 
> Which is why if you look at PC gaming hardware stats, it's mostly people who have no business doing it and are making a stupid decision and living cheaply in other areas.  Which LOL, but the cost of being the master race is less performance, higher power draw, for more money, a crappier screen, and eating bad for you food rather than good for you food.  Truly good stuff!



Yes and no. For those who can't be bothered with any settings/setup, consoles are very user-friendly. But for people who would need a desktop anyway, the cost is more than made up for by PCs being useful for so much more, while consoles are just an entertainment black hole, never mind console game prices.

A PC is your car; a console is a jet ski.


----------



## candymancan21 (Dec 15, 2022)

SOAREVERSOR said:


> All true, yet missing a key point.   For people that use a hair dryer it's needed.  It's only on for a moment.   A fridge is needed.   Most of the big power hogs are needed and that output is justified.   A "good"  gaming computer is purely a bragging rights luxury for almost everyone who has one.   There are a rare fraction of gamers who are doing any serious work on these beasts from home.   That power consumption is only needed for gaming same with the parts unless you double as an independent contractor for work or in your off time in which case you can write off the 2000 buck card as well.  If you are not an independent contractor, and at times even if you are, you will have the shit kicker 9000 workstation at work which will suck power and smash things but work foots the bill.
> 
> PC gaming has also crashed into the wall where it's largely inferior to consoles now.  All the high points of it for graphics, refresh rates, physics, sound, and all the other arguments where it would stomp consoles in the ground are only actually there if you are willing to go up to the stupidly high end.  For most people a PS5 is the better option by leaps and bounds.  They'll get better frames, better details, and on and on.  At a fraction of the price and less power for better performance.
> 
> Which is why if you look at PC gaming hardware stats, it's mostly people who have no business doing it and are making a stupid decision and living cheaply in other areas.  Which LOL, but the cost of being the master race is less performance, higher power draw, for more money, a crappier screen, and eating bad for you food rather than good for you food.  Truly good stuff!


You're forgetting one thing: a keyboard and mouse is superior to any controller lol.

You also cannot run a multi-screen setup on a console, browsing the internet while gaming, or be on good comms like Discord, and many other things. A controller and no real OS is very limiting.

No dual-boxing games either. Lots and lots of great games are PC-only as well.


----------



## dullahan29 (Dec 15, 2022)

How is this card an "Editor's Choice"? This is literally the worst-value card of this generation, and yes, I am counting the 4080.
100 W idle with multi-monitor is a crime; this card should be banned and dumped straight into a landfill.

I guess W1zzard just sings praises for every single card because he is afraid of getting on the bad side of AMD/Nvidia.
I have been a TPU fan for years, but after this I would just go to HUB/GN for honest, unbiased reviews.


----------



## SOAREVERSOR (Dec 15, 2022)

JDUNWIN said:


> The critique can be easily flipped though, if you can easily afford it, why tolerate less functionality at higher power draw.
> 
> 
> 
> ...



Most people do not need a desktop and it's a stupid buy. Unless you are doing editing, rendering, or have a reason to run a ton of VMs, a desktop is a very, very bad buy. And if you do all that, odds are you have one at work that stomps what you have at home. Which is why more and more people are moving to laptops, which, unless you are doing said workloads, is just a smarter and more practical choice.

Most PC gamers aren't bothering with settings and setup either. This isn't the late 90s. Everything auto-configures itself, things turbo boost, RAM has profiles; it's plug and play in the silliest way possible.

All the arguments for PC gaming have been shot through unless you either need a multi-thousand-dollar machine at home for workloads which only a tiny portion of the population uses (are you doing SolidWorks? if not, LOL), or simply want to throw thousands upon thousands at a machine. The last niche case is if you're an LoL/DOTA esports type, in which case a potato PC works and the cost of video cards doesn't matter.

You sort of have to look at where things are moving as a market. Most people are buying laptops now, even gamers. Laptops never got hit with the GPU price hikes even when inflation and shipping issues happened; it was all a minor thing. Consoles are getting closer and closer to high-end desktops, beating them in some cases. Desktop is rapidly going "go ultra high end or go home" and that trend is not changing until the market crashes and PC gaming goes to cloud-based services. Trying to fight the PC being first into the cloud is pointless.


candymancan21 said:


> You're forgetting one thing: a keyboard and mouse is superior to any controller lol.
> 
> You also cannot run a multi-screen setup on a console, browsing the internet while gaming, or be on good comms like Discord, and many other things. A controller and no real OS is very limiting.
> 
> No dual-boxing games either. Lots and lots of great games are PC-only as well.



This is all somewhat true and also utterly wrong.

1.  For what games? This is a flat-out lie. There are games where KBM is better, but there are ways to enable that on a console now. But tons of games are better with an arcade stick, controller, flight stick, racing wheel, and so on, and you can use those on the PC as well. The input device argument has been a dead horse for a while.

2.  This is kinda true and yet utterly false for anybody awake. You can totally be on Discord on a laptop or a MacBook on the side while doing team games on a console. Even a crappy netbook. All while also hearing the console players as well. This combination now costs less than the GPU, CPU, and soon SSD combo needed to make PC gaming impressive.

3.  There are lots and lots of great games that are console-exclusive as well.

All these points are sort of null and void unless you are playing RTS or CS:GO at 240 Hz, or have some 4090 monster with an $800 board. I have a monster at home, one box sitting here with 18 cores and a 3090, and I really just play Quake 1/QuakeWorld on my PC. The SO and her sisters play Beat Saber VR on it. That's the point of it. And I have hundreds of games on it. Maybe if there's something I really want at 240 Hz or maybe in ultrawide, but that's rare as all hell. I can talk totally fine over Discord even when playing on a Switch with people. For Street Fighter or KOF I'm able to play on either the PS or the PC; it's just swapping the arcade stick (of which I have a few). I'll take the console exclusives over the PC ones any day at the moment. Demon's Souls has been great. I play Elden Ring on the PS5 over the PC. There's also more rampant cheating and hate speech on the PC.


----------



## bobmeix (Dec 15, 2022)

@W1zzard
Hi, I've just noticed a small typo on the conclusion page: GPU cores is 5736 (vs 6144 on the XTX) / should have been 5376.


----------



## W1zzard (Dec 15, 2022)

bobmeix said:


> @W1zzard
> Hi, I've just noticed a small typo on the conclusion page: GPU cores is 5736 (vs 6144 on the XTX) / should have been 5376.


Nice find, fixed


----------



## candymancan21 (Dec 15, 2022)

SOAREVERSOR said:


> Most people do not need a desktop and it's a stupid buy.  Unless you are doing editing, rendering, or have a reason to run a ton of VMs a desktop is a very very very bad buy.  And if you do all that, odds are you have one at work that stomps what you have at home.  Which is why more and more people are moving to laptops which unless you are doing said work loads is just a smarter and more practicle choice.
> 
> Most PC gamers aren't bothering with settings and setup either.  This isn't the late 90s.  Everything auto configures itself, things turbo boost, RAM has profiles it's plug and play in the silliest way possible.
> 
> ...




Why would I want to buy a laptop to be on Discord??? Also, no: I'll challenge anyone on an FPS game, controller vs keyboard and mouse. You don't have the sensitivity and accuracy on a controller; this is why so many FPS games have auto-aim for consoles lol. The input device thing is only dead in your head. You won't beat me in any game that requires accuracy with a thumbstick. The wrist is more versatile and accurate.

Not that many games on consoles are exclusive. Most MMOs, past and current, are still PC-only or better played on PC; even older MMOs with huge populations are still PC-only. World of Warcraft, EVE Online... Planetside 2 I think has a console version, but not with the populations on PC lol. And so many others. Every Total War game, all PC-only. The Steam library is mostly PC-only as well; any small indie game is always PC-only. Consoles are limited to big publishers only. You're missing out on hundreds of great games, literally hundreds of games.

You're limiting yourself on a console lol. I gave up on consoles after the first Xbox. So much more advantage on a PC.

I don't game at 240 Hz btw lol. I don't care about a 4090 either.


----------



## bobmeix (Dec 15, 2022)

@W1zzard 
I'm pestering you right now, but: Radeon RX 7900 XTX that also releases today is 19% faster, the RTX 4080 is 14% ahead and the RTX 4090 is 25% faster. Shouldn't the 25% be 45%?


----------



## W1zzard (Dec 15, 2022)

bobmeix said:


> @W1zzard
> I'm pestering you right now, but: Radeon RX 7900 XTX that also releases today is 19% faster, the RTX 4080 is 14% ahead and the RTX 4090 is 25% faster. Shouldn't the 25% be 45%?


Indeed, fixed


----------



## JDUNWIN (Dec 15, 2022)

SOAREVERSOR said:


> Most people do not need a desktop and it's a stupid buy.  Unless you are doing editing, rendering, or have a reason to run a ton of VMs a desktop is a very very very bad buy.  And if you do all that, odds are you have one at work that stomps what you have at home.  Which is why more and more people are moving to laptops which unless you are doing said work loads is just a smarter and more practicle choice.
> 
> Most PC gamers aren't bothering with settings and setup either.  This isn't the late 90s.  Everything auto configures itself, things turbo boost, RAM has profiles it's plug and play in the silliest way possible.
> 
> ...



Most people aren't gamers and can get by with an iPad, so what? We're talking about a subgroup regardless, and it's not that hard to admit that consoles are heavily limiting walled gardens, basically the iPads of the gaming world. Gamers who want to install unsanctioned mods or tweaks have to use a PC.

I seriously doubt you type your long responses on a console; the case for the PC makes itself, as you were going to buy one anyway. And for those who don't have one, it's often not because they don't want to; it's because they already spent the budget elsewhere: laptop/mobile/console/handheld gaming.


----------



## candymancan21 (Dec 15, 2022)

JDUNWIN said:


> Most people aren't gamers and can get by with an ipad, so what, we're talking about a sub group regardless, and its not that hard to admit that consoles are heavily limiting gated gardens, basically the ipads of the gaming world.  Gamers who want to install unsanctioned mods or tweaks have to use a pc.
> 
> I seriously doubt you type your long responses on a console, the case for pc makes itself, as you were going to buy one anyways.  And for those who don't have one, its often not because they don't want to, its because they spent the budget else where already, laptop/mobile/console/handheld gaming.


He's also trying to say a PS5 is as powerful as a PC... you don't need a 4090 to get better graphics and performance than a PS5. Look at God of War on the PS5 and PC: the graphics on the PS5 are limited, the shadows suck, facial details and so forth aren't that great. For $500 I guess it's a good deal against an $800 video card, not counting everything else. But to say the PS5 is better is meh.


----------



## Arco (Dec 15, 2022)

If you were able to use your own OS and actually be able to do stuff then consoles would be a very good option. Unfortunately, consoles are sold like a printer. Games and online subscriptions are where they get their money back. Also, they are locked down tight. You can play games, watch content, and do little else on a console. Meanwhile, a PC has so many more options.

I also agree with @candymancan21 regarding controls. Luckily this is mostly whatever. Both consoles and PCs can use each other's inputs.

"2. This is kinda true and yet utterly false for anybody awake. You can totally be on discord on a laptop or a macbook on the side while doing team games on a console. Even a crappy netbook. All while also hearing the console players as well. This combination costs less now than the GPU, CPU, and soon SSD combo to make PC gaming impressive."

This is VERY clunky, to say the least.


----------



## bobmeix (Dec 15, 2022)

@W1zzard
You're surely busy too, but this will be my last comment for today. 
45, not 49.

Also, according to the previous info (and graphs), the following should be 19%, not 16%:
At 4K, buying the XTX will give you a +16% performance uplift—definitely worth considering.
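A quick sketch of the arithmetic behind these corrections, since it trips people up: "X% faster" comes from the ratio of two relative-performance scores, not their difference. The 84% and 69% inputs below are illustrative chart values, not numbers taken from the review:

```python
# Converting a relative-performance chart entry into an "X% faster" figure.
# If the slower card scores `relative_pct`% on the faster card's chart,
# the faster card's advantage is (100 / relative_pct - 1) * 100.

def percent_faster(relative_pct):
    """Return the faster card's advantage in percent, given the slower
    card's score (in percent) relative to the faster card."""
    return 100.0 / relative_pct * 100.0 - 100.0

print(round(percent_faster(84)))  # -> 19 (a card at 84% is ~19% behind)
print(round(percent_faster(69)))  # -> 45 (at 69%, the gap is ~45%, not 31%)
```

This ratio-vs-difference distinction is exactly why a chart entry in the high 60s corresponds to a "45% faster" claim rather than something in the low 30s.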


----------



## JRMBelgium (Dec 15, 2022)

candymancan21 said:


> Not sure anyone should complain about power consumption when they spend $1000 on a GPU.
> 
> 50 watts for 12 hours a day at 13 cents per kWh is only $2.50 a month. 400 watts for 12 hours is about $20 a month. Chipotle is now $15 for a chicken burrito with guac. I mean, just saying.
> 
> Has anyone actually done the math on PC power consumption compared to other things in their house? A coffee maker or dryer uses 2-3x more power.



Try 75 cents per kwh in my country at the moment...


----------



## gupsterg (Dec 15, 2022)

@W1zzard 

Any chance of access to latest beta GPU-Z?

Also, what settings did you use for Heaven? Just wanna compare my card to your results, cheers.


----------



## shovenose (Dec 15, 2022)

Well that's a mighty strange lookin' CPU haha!


----------



## ARF (Dec 15, 2022)

shovenose said:


> Well that's a mighty strange lookin' CPU haha!



It isn't funny, it is actually very disturbing. What you actually see is degradation in the empire of evil.


----------



## shovenose (Dec 15, 2022)

ARF said:


> It isn't funny, it is actually very disturbing. What you actually see is degradation in the empire of evil.



Nah, this isn't the Red Devil card, it's only halfway in the empire of evil haha. However, maybe this is how the 7900 XT is actually a great value: maybe it comes bundled with a Ryzen 7000 series CPU. Although that would be a huge waste, because we all know the reason the RX 7900 series is underwhelming is that it doesn't use PCI-E 5.0. When they launched the Ryzen 7000s they told us HOW AMAZING it is that they support PCI-E 5.0, right? Like, it TOTALLY matters when we aren't even using the full bandwidth, but nah, you don't want that JUNK Intel platform without PCI-E 5.0. DUH!

Edit: yay, my XT arrived! It's great


----------



## santio1 (Dec 21, 2022)

Is the benchmark in Red Dead Redemption 2 run with FSR on or off? I assume it's on; with FSR off the average is around 70 fps. I tested it.


----------



## W1zzard (Dec 21, 2022)

santio1 said:


> Is the benchmark in Red Dead Redemption 2 run with FSR on or off? I assume it's on; with FSR off the average is around 70 fps. I tested it.


No FSR or DLSS in any of those tests. We're not using the integrated benchmark, but our own test scene.


----------



## Space Lynx (Dec 23, 2022)

My new build uses a MSI 6800 XT Trio...






This is with an undervolt on the GPU through the performance section of AMD's official drivers. How does such a giant cooler (this thing is a massive 6800 XT) still manage a 99 °C hotspot playing God of War, no matter what I do? High fan curve, stock fan curve, I tried it all.

I think something is wrong with it; it just doesn't seem right. So I am refunding it and putting the store credit towards a 7900 XT for $899; there is an aftermarket model (not the AMD reference design) I can get at that price. I will gain 40-45 fps in God of War and lose probably 25 °C on the hotspot. (This card is also drawing 280-290 W in gaming, so I won't even be using more electricity for both of these gains.)

For $300 more, that's honestly not a bad deal.

Since it has an aftermarket cooler as well, I don't think I will run into any of the issues listed in many reviews. AMD has also said a driver fix for media playback power draw is done or almost done. So the 7900 XT gets a lot of hate, but $899 for a superior aftermarket version is a fair deal, honestly. I am gaining 40-45 fps in several titles at 1440p.


----------



## watzupken (Dec 27, 2022)

bug said:


> I think it depends. If the XTX is priced fairly and the XT is just an attempt to cash grab, yes these could see a price cut (also depending on where the 4070 and 4070Ti land). If, on the other hand, it's the XT that is priced fairly and the XTX is selling at a discount, to undercut the 4080, AMD may not be happy with two discounted top-end SKUs on their hands at the same time.


Objectively, if people think that the RX 7900 XT is a cash grab, then isn't the RTX 4070 Ti at $899 worse? In Nvidia's own benchmarks, the 4080 12GB (now renamed to 4070 Ti) is about 20% slower than the RTX 4080, so the 4070 Ti should perform close to the RTX 3090 Ti in most cases. I feel it will fall short at 4K, though, as the lower memory bandwidth is going to have an impact on both RT and higher-resolution performance. It's got a bigger cache to make up for the narrower memory bus, but as we have seen with AMD cards, the cache is mostly sized for the target resolution, i.e. 1440p, and once the cache is insufficient, there is a noticeable performance penalty.


----------



## breaken (Jan 4, 2023)

> USB-C with DP 1.2 passthrough


Little typo there.


----------



## Theo74 (Jan 4, 2023)

Does anyone know if RDNA3 now supports VRR on the LG OLED C9? (This specific model, not later ones.)


----------

