# ASUS Radeon RX 7900 XTX TUF OC



## W1zzard (Dec 13, 2022)

The ASUS Radeon RX 7900 XTX offers fantastic overclocking potential. Thanks to a triple 8-pin power input, the XTX is no longer power-limited, and the amazing quad-slot cooler ensures the card stays cool and quiet at all times. After manual overclocking, the performance uplift over the RTX 4080 was an impressive 23.1% in Cyberpunk 2077.

*Show full review*


----------



## Ferrum Master (Dec 13, 2022)

I'll say it again. I want to see this SKU on a water block. For the first time in a long while we have a card that has a reason to run cool and scales with it. Nvidia is already pushed to the max, with not much to gain from a custom loop.

But here... I'll quote... automagically


----------



## spnidel (Dec 13, 2022)

ok, this is actually starting to look interesting now
looks like these cards will clock even higher under water - not exactly excited that they pull 400 W+, but looking at the OC page, the card being so close to the 4090 is... not bad at all now


----------



## Pumper (Dec 13, 2022)

Good guy AMD allowing AIBs room to provide more than just better cooling for their more expensive GPUs. That's a nice performance bump.


----------



## P4-630 (Dec 13, 2022)

Good news if you OC and play CP2077 @ 4K... that is without ray tracing, though...


----------



## wolf (Dec 13, 2022)

iiiinteresting. What are the power limits of the card? I feel like that section is missing, or not where it usually is. It would seem that the 7900 XTX can stretch its legs decently further with more power; I would love to see a 450 W limit vs the 4090 @ 450 W.


----------



## N3M3515 (Dec 13, 2022)

These GPUs seem to overclock better than Nvidia's. STILL OVERPRICED, though.


----------



## spnidel (Dec 13, 2022)

P4-630 said:


> Good news if you OC and play CP2077 @ 4K... that is without ray tracing, though...


yeah, by comparison, playing at 4K with ray tracing on an RTX 4090 nets you 20 fps - a fantastic deal for nvidia users, I personally love slideshow gaming
oh, 4K too much for a 4090? sorry, I meant I love buying a $1600 gpu to play my games at 720p with ray tracing enabled


----------



## xcazy (Dec 13, 2022)

Looks good, hope the price is lower than the RTX 4080 where I live.


----------



## P4-630 (Dec 13, 2022)

But still, why is a recent AMD GPU using *88 watts* for just playing a 1080p YouTube video...


----------



## TheEndIsNear (Dec 13, 2022)

Great, looks like they're all sold out at Newegg except for the crappy XT. The new normal: make as few as possible and watch the prices rise. Duopolies are just as bad as monopolies. Man, I really wanted one.


----------



## P4-630 (Dec 13, 2022)

TheEndIsNear said:


> Great, looks like they're all sold out at Newegg except for the crappy XT. The new normal: make as few as possible and watch the prices rise. Duopolies are just as bad as monopolies. Man, I really wanted one.



You'll be ok, just get one before the end is near.


----------



## W1zzard (Dec 13, 2022)

wolf said:


> What are the power limits of the card? I feel like that section is missing, or not where it usually is.


You're thinking of the NVIDIA reviews... this information isn't as readily available on AMD as on NVIDIA.



P4-630 said:


> But still, why is a recent AMD GPU using *88 watts* for just playing a 1080p YouTube video...


4K video, but yes


----------



## Fouquin (Dec 13, 2022)

Ferrum Master said:


> I'll say it again. I want to see this SKU on a water block. For the first time in a long while we have a card that has a reason to run cool and scales with it. Nvidia is already pushed to the max, with not much to gain from a custom loop.
> 
> But here... I'll quote... automagically



ASRock has a waterblock variant, so look out for reviews on that.



TheEndIsNear said:


> Great, looks like they're all sold out at Newegg except for the crappy XT. The new normal: make as few as possible and watch the prices rise. Duopolies are just as bad as monopolies. Man, I really wanted one.



I was able to get two different XTXs into my cart and to the payment screen, but after ~4 minutes everything was sold out. So, still better than before. For the record, I did not attempt to actually PAY for one, because I'm not keen on spending $1,150 (after local tax) on a GPU.


----------



## the54thvoid (Dec 13, 2022)

Hmm, this card seems waaaayyyyy better than the XFX one. But again, price. I'm drawn to the more efficient 4080, but not at that price point. And this card looks to be heading toward that point too, though the TUF cards are usually cheaper.


----------



## watzupken (Dec 13, 2022)

I think AMD delivered a solid card, but the lagging RT performance and fairly limited FSR 2.0 adoption may dampen demand for their cards. The other problem is the very high power draw with multi-monitor setups and some non-3D workloads, which I hope can be rectified by updated drivers. The good thing is that the cards are cheaper than Nvidia's Ada lineup so far. I think it may be worth waiting to see if prices drop over time before buying one. After all, AMD is more willing to drop prices.


----------



## Acesbong (Dec 13, 2022)

Got into the amd.com sale; the 7900 XTX was in stock for €1,163. I'm glad I didn't bother. If you're going to spend stupid money, it looks like partner cards are the way to go.


----------



## watzupken (Dec 13, 2022)

the54thvoid said:


> Hmm, this card seems waaaayyyyy better than the XFX one. But again, price. I'm drawn to the more efficient 4080, but not at that price point. And this card looks to be heading toward that point too, though the TUF cards are usually cheaper.


Asus tax hits very hard starting with the TUF series. In my country, the RTX 4080 TUF costs as much as a lower-end RTX 4090 from Zotac or Galax, and a Strix RTX 4080 costs the same as a mid-range RTX 4090. So I am not expecting the RX 7900 TUF models to cost any less.


----------



## gridracedriver (Dec 13, 2022)

3 GHz Edition with N31+ in June?


----------



## Valantar (Dec 13, 2022)

Thanks for another great review! Two things: can you please add power draw measurements to your OC section? And is there any chance of TPU reviews moving to a dynamic score presentation for percentage-based scores any time soon, rather than just static images? In other words giving us the ability to change (clicking? mouse hover?) which GPU in the graph is the 100% baseline in graphs like the relative performance comparison, with other scores adjusting to match? (Something like what Notebookcheck does in their laptop reviews, though hopefully with TPU's much simpler and clearer visual style.) That would be extremely useful IMO - especially as percentages are confusing and tricky to most people - allowing for easy visualizations of things like "how much faster is this GPU compared to my current one" etc.


----------



## sector-z (Dec 13, 2022)

W1zzard said:


> You're thinking of the NVIDIA reviews... this information isn't as readily available on AMD as on NVIDIA.
> 
> 
> 4K video, but yes


Yes, it would have been cool to know what each AIB card's +15% power limit amounts to in the end.

Very cool that you added Cyberpunk. You should add 3 or 4 games in OC mode too; it would give us a better idea whether the $200 we pay on top of the $999 reference model gives us enough for our money.

Or, like some other reviewers, put the OC section at the beginning and add the OC scores in a distinct color to every game in the review. At the same time, if the OC passes your whole game test suite, that confirms it was stable.


----------



## Crackong (Dec 13, 2022)

Looking forward to the version with a pre-fitted waterblock.


----------



## W1zzard (Dec 13, 2022)

Valantar said:


> Thanks for another great review! Two things: can you please add power draw measurements to your OC section? And is there any chance of TPU reviews moving to a dynamic score presentation for percentage-based scores any time soon, rather than just static images? In other words giving us the ability to change (clicking? mouse hover?) which GPU in the graph is the 100% baseline in graphs like the relative performance comparison, with other scores adjusting to match? (Something like what Notebookcheck does in their laptop reviews, though hopefully with TPU's much simpler and clearer visual style.) That would be extremely useful IMO - especially as percentages are confusing and tricky to most people - allowing for easy visualizations of things like "how much faster is this GPU compared to my current one" etc.





sector-z said:


> Yes, it would have been cool to know what each AIB card's +15% power limit amounts to in the end.
> 
> Very cool that you added Cyberpunk. You should add 3 or 4 games in OC mode too; it would give us a better idea whether the $200 we pay on top of the $999 reference model gives us enough for our money.
> 
> Or, like some other reviewers, put the OC section at the beginning and add the OC scores in a distinct color to every game in the review. At the same time, if the OC passes your whole game test suite, that confirms it was stable.


The problem is that there is only so much time I have, and I'm already testing A LOT of other things.

No plans for dynamic charts. The simplicity of static images helps us so much on social media... I see our charts everywhere, because they are easy to post, even for computer noobs.


----------



## wNotyarD (Dec 13, 2022)

Looking forward to the Sapphire versions, especially if they make a Toxic one.


----------



## Hattu (Dec 13, 2022)

I think AMD did well this time, but 1,550€ for this Asus card is "a bit" too much... It's better than Nvidia (price-wise), but still way too expensive. Like all recent GPUs; every price is about doubled in euros.

I just hope my 2060/6GB lasts at least 5 more years. And no, I don't game. I would've bought AMD last time, but the card I bought was about the only reasonable choice at 400€ amid the crypto craziness (sometime in 2018/2019, can't remember). Everything was unavailable or too expensive.


----------



## AnotherReader (Dec 13, 2022)

Great overclocking by today's standards. The high clocks seen also lend some weight to the rumour that RDNA3 missed its clock targets.


----------



## The Quim Reaper (Dec 13, 2022)

Lol, the only 2 games I'm interested in for RT performance (Control & CP 2077) happen to be the 2 games these cards totally suck at with RT enabled @4K.


----------



## Osaka2407 (Dec 13, 2022)

> The problem is that there is only so much time I have and I'm already testing A LOT of other things.


I just made an account here because I was *really* surprised by the OC results. It pulls really hard with the OC. Did you have to raise the power limits for such an OC? If so, do you recall how many watts the card needed?

Also, many people report high power usage with multi-monitor setups or video playback, including you. I wonder if limiting memory to the minimum possible frequency on the desktop does something in this regard. Is it even possible, as it was on previous gens?

Memory clocks don't seem to drop under light loads. They should go up slightly with multiple monitors or during video playback, but not to full clocks as they seem to now, so that may be a relatively easy workaround until AMD fixes it.


----------



## ARF (Dec 13, 2022)

P4-630 said:


> But still, why is a recent AMD GPU using *88 watts* for just playing a 1080p YouTube video...



Because AMD doesn't know how to lower the clocks in an almost-idle state.

This issue has been with Radeon since forever and ever now.


----------



## P4-630 (Dec 13, 2022)

ARF said:


> Because AMD doesn't know how to lower the clocks in an almost-idle state.
> 
> This issue has been with Radeon since forever and ever now.



ATi may have done better... but, alas.


----------



## ARF (Dec 13, 2022)

P4-630 said:


> ATi may have done better... but, alas.



I remember it in 2009 with my Radeon HD 4890, so maybe it's ATi legacy...

Later I sold it to buy an improved Radeon HD 6870, which had the idle issue fixed.


----------



## AnotherReader (Dec 13, 2022)

ARF said:


> Because AMD doesn't know how to lower the clocks in an almost-idle state.
> 
> This issue has been with Radeon since forever and ever now.


There have been some exceptions in the past. Vega 64 and Radeon VII were well behaved when it came to these scenarios. The table below is for the Radeon VII from TPU's review:


| | GPU Clock | Memory Clock | GPU Voltage |
| --- | --- | --- | --- |
| Desktop | 25 MHz | 350 MHz | 0.713 V |
| Multi-Monitor | 25 MHz | 350 MHz | 0.712 V |
| Blu-ray Playback | 21 MHz | 350 MHz | 0.713 V |
| 3D Load | 1766-1794 MHz | 1000 MHz | 1.068-1.081 V |


----------



## Osaka2407 (Dec 13, 2022)

AnotherReader said:


> There have been some exceptions in the past. Vega 64 and Radeon VII were well behaved when it came to these scenarios. The table below is for the Radeon VII from TPU's review:
> 
> 
> GPU
> ...


And here is the table from the 5700 XT review:




EDIT:// I actually found different memory clock behavior between TPU's reviews for the 6900 XT and the 6950 XT.
6900 XT:



6950 XT:



And sure enough, it shows in power consumption: with a multi-monitor setup, the 6950 XT uses almost 10 watts more, while at the same time using less power at idle. Given both are essentially the same card, there may be power management bugs at the driver or VBIOS level. I doubt there is anything wrong with the cards themselves, at least the RDNA2 ones. If this bug (if it is a bug at all) went unnoticed with RDNA2 because it wasn't causing any massive issues for single-die cards, it may continue with RDNA3. But these cards use MCDs, so the power overhead for memory operations is higher.


----------



## Xex360 (Dec 13, 2022)

The infamous 3 GHz... it would be great if Sapphire or someone else just released a 3 GHz version. The card seems to have potential; unfortunately, that doesn't save it, given the stupid pricing and underwhelming performance compared to the 4080.


----------



## Denver (Dec 13, 2022)

The Quim Reaper said:


> Lol, the only 2 games I'm interested in for RT performance (Control & CP 2077) happen to be the 2 games these cards totally suck at with RT enabled @4K.


RT in all its splendor isn't playable on any GPU though.


----------



## ARF (Dec 13, 2022)

Denver said:


> RT in all its splendor isn't playable on any GPU though.



Well, technically it never will be, because purely ray-traced scenes run at 0.2 FPS, and the more natural they make it, the harder it becomes and the lower the FPS get...
Full ray tracing is possible only on supercomputers.


----------



## Neo_Morpheus (Dec 13, 2022)

Yes, I know it's just one test, but I must say, it's really interesting how it's only 8 FPS away from a GPU that it's not even competing with, priced way higher.








Oh yes, before anyone throws out the "oH bUt DLSS is sUpErIoR" line: nah, both DLSS and FSR are a "cheat code" for fake frames. Give me TAA and we're talking.

And to the same drones in here who continue with the RT nonsense, I have to ask: how many of the games you are playing, or going to play, have RT with an implementation that is visible right away and adds anything to the gameplay?

Anyways, really happy with the results, and my only complaints are:

1- These GPUs need to be renamed; the 7900 XTX should be an XT and the 7900 XT should be a 7800 XT.

2- The correctly named 7900 XT should then be priced between US$750 and 850, and the 7800 XT between US$550 and 650.

Anyways, will wait until after the holidays for discounts.


----------



## spnidel (Dec 13, 2022)

Neo_Morpheus said:


> Give me TAA and we are talking.


TAA is dogshit; give me the ability to use FSR 2.0 at 100% render scale as a far superior anti-aliasing solution instead of this blurry TAA garbage


----------



## TheinsanegamerN (Dec 13, 2022)

P4-630 said:


> ATi may have done better... but, alas.


ATi may have gotten idle power down, but it would have crashed on every 3rd game you ran.


----------



## Neo_Morpheus (Dec 13, 2022)

spnidel said:


> TAA is dogshit; give me the ability to use FSR 2.0 at 100% render scale as a far superior anti-aliasing solution instead of this blurry TAA garbage


According to this video you are wrong, and mind you, this dude is as pro-Nvidia as you can get.











Anyways.


----------



## Vayra86 (Dec 13, 2022)

Well, that confirms my earlier thoughts from seeing the vanilla card.

Nice, this speaks very well for the lower parts too. There is a lot in the AIB tank!

But when that comes at a 10% premium... we still need a massive price drop. The XTX does, however, compare much more favorably against the 4090 now; that is very impressive.


----------



## P4-630 (Dec 13, 2022)

TheinsanegamerN said:


> ATi may have gotten idle power down, but it would have crashed on every 3rd game you ran.


Well, I have never experienced serious issues when I had ATi GPUs, though; maybe you played the wrong games..

My last ATi GPU was a 4870.


----------



## btk2k2 (Dec 13, 2022)

So for 10% more than the MBA 7900 XTX you can get 15% more performance out of it when OC'd. And this is just the TUF model, let alone the Strix, Nitro+, Red Devil and LC editions. That actually makes paying the AIB premium potentially worthwhile.
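As a back-of-the-envelope check on that value argument (the ~$999 reference price, ~10% AIB premium, and ~15% OC uplift are ballpark figures from this thread, not measured data):

```python
# Rough perf-per-dollar comparison: reference 7900 XTX vs. an OC'd AIB card.
# All inputs are approximate figures quoted in this thread, not benchmarks.
ref_price = 999.0      # USD, reference (MBA) card
aib_price = 1099.0     # USD, ~10% AIB premium
oc_uplift = 0.15       # AIB card ends up ~15% faster once overclocked

premium = aib_price / ref_price - 1.0
perf_per_dollar_ratio = ((1.0 + oc_uplift) / aib_price) / (1.0 / ref_price)

print(f"price premium: {premium:.1%}")                       # ~10.0%
print(f"perf/$ vs. reference: {perf_per_dollar_ratio:.3f}")  # ~1.045
```

Under those assumptions the OC'd AIB card actually comes out slightly ahead on performance per dollar, which is the point being made.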


----------



## W1zzard (Dec 13, 2022)

Osaka2407 said:


> I just made an account here because I was *really* surprised by the OC results. It pulls really hard with the OC. Did you have to raise the power limits for such an OC? If so, do you recall how many watts the card needed?


Yes power limits are at maximum (+15). The card pulls around 450 W in that state



Osaka2407 said:


> I wonder if limiting memory to the minimum possible frequency on the desktop does something in this regard. Is it even possible, as it was on previous gens?


Reducing memory clocks is not possible, the slider only goes upwards.


----------



## TheinsanegamerN (Dec 13, 2022)

Neo_Morpheus said:


> According to this video, you are wrong and mind you, this dude is as pro nvidia as you can get.
> 
> 
> 
> ...


Wait, are we claiming AMDunboxed is pro-Nvidia now? LMFAO.

This is how you know they are good reviewers: they apparently are biased for and against everyone at the same time, depending on the review.


P4-630 said:


> Well, I have never experienced serious issues when I had ATi GPUs, though; maybe you played the wrong games..
> 
> My last ATi GPU was a 4870.


You never experienced the wonder that was the 9800 Pro's PCI/AGP driver conflict?

Lucky...

You also got to totally miss out on the buggy Evergreen 5000/6000 series that AMD just straight up abandoned once GCN sold decently. My HD 6900 homeboys know what's up. There's a reason Maxwell/Pascal sold so well...


----------



## btk2k2 (Dec 13, 2022)

W1zzard said:


> Yes power limits are at maximum (+15). The card pulls around 450 W in that state
> 
> 
> Reducing memory clocks is not possible, the slider only goes upwards.



Given how good that OC result is, what are the chances of getting numbers for the full benchmark suite at 4K?


----------



## Taraquin (Dec 13, 2022)

This is a very good card! Superb thermals and noise (28 dB on the silent BIOS), and 11.5% faster than the reference card at 3200 MHz once overclocked! I'm impressed! If they sort out the terrible consumption with video playback and multi-monitor, this starts to look really good! The lack of VRAM downclocking seems to be to blame; same story with the 5xxx and 6xxx series, but that got resolved with driver updates.


----------



## oxrufiioxo (Dec 13, 2022)

btk2k2 said:


> Given how good that OC result is, what are the chances of getting numbers for the full benchmark suite at 4K?



Probably not worth the effort... Even though this sample does 10%, someone buying the card at retail might get 5% less or 5% more, making the results not very representative.

Much more impressed with this AIB card than the reference, for sure. Also, for anyone who primarily plays Call of Duty, these cards kick @$$.


----------



## rv8000 (Dec 13, 2022)

gridracedriver said:


> 3 GHz Edition with N31+ in June?



I fully expect a binned N31 to be released as a 7950 XTX at some point.


----------



## Osaka2407 (Dec 13, 2022)

TheinsanegamerN said:


> Wait, are we claiming AMDunboxed is pro nvidia now? LMFAO.


Some people will just spin everything in their favour, won't they?
Anyway, FSR2 with native-resolution rendering is great. There actually is a game which lets you do just that: Genshin Impact has no FSR2 presets, just a render resolution scaling slider, so you can enable FSR2 with 150% resolution scaling. So far it's the only game like that I have found.



W1zzard said:


> Yes power limits are at maximum (+15). The card pulls around 450 W in that state


Damn, that's a nice OC. Still behind the 4090 in perf per watt, but it scaled pretty well; 25% more power for 20% more performance seems good enough. The XFX one also scaled well, it seems.


W1zzard said:


> Reducing memory clocks is not possible, the slider only goes upwards.


That's a shame. It could have been a really nice and relatively easy investigation, with some added pressure on AMD to fix it.
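For what it's worth, that scaling estimate (the +25% power / +20% performance ratios above are the poster's ballpark, not measurements) implies only a small efficiency loss:

```python
# Relative perf-per-watt after overclocking, using the rough ratios above.
power_scale = 1.25  # OC draws ~25% more power
perf_scale = 1.20   # and delivers ~20% more performance

relative_perf_per_watt = perf_scale / power_scale
print(f"perf/W relative to stock: {relative_perf_per_watt:.2f}")  # 0.96, ~4% efficiency loss
```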


----------



## P4-630 (Dec 13, 2022)

TheinsanegamerN said:


> You never experienced the wonder that was the 9800 Pro's PCI/AGP driver conflict?


My first own ATi desktop GPU was an X300 SE PCIe.


----------



## rv8000 (Dec 13, 2022)

Neo_Morpheus said:


> According to this video, you are wrong and mind you, this dude is as pro nvidia as you can get.
> 
> 
> 
> ...



TAA, DLSS, and FSR are ALL terrible. Why would any person spending top dollar for halo cards on expensive high refresh monitors want to introduce blurring, ghosting, and visual artifacts to their games?!

I don’t understand this mentality, that any of these upscaling techniques are a good thing. They become valuable on lower end cards far more than high end parts imo.


----------



## spnidel (Dec 13, 2022)

Neo_Morpheus said:


> According to this video, you are wrong


I don't need a video to know that if FSR were to be used as an AA solution rather than an upscaling solution, then it'd soundly beat TAA


----------



## TheinsanegamerN (Dec 13, 2022)

P4-630 said:


> My first own ATi desktop GPU was an X300 SE PCIe.


Oh yeah, you missed it.

The 9000 series had both AGP and PCI drivers, and ATi, in their infinite wisdom, included both in their driver packages. This way, when you installed the driver, both the PCI and AGP drivers would get installed, Windows would default to the PCI driver, and you'd get a nice black screen if you were a child who didn't know how to permanently delete the PCI driver, didn't have stable internet to search for the fix, and had to redo it every time the driver got updated.


rv8000 said:


> TAA, DLSS, and FSR are ALL terrible. Why would any person spending top dollar for halo cards on expensive high refresh monitors want to introduce blurring, ghosting, and visual artifacts to their games?!
> 
> I don’t understand this mentality, that any of these upscaling techniques are a good thing. They become valuable on lower end cards far more than high end parts imo.


The thing that gets me is that if they are OK with these fake frames, there is a free option that works on all games and brands.

It's called running a lower resolution, LMFAO. All this work so they don't have to say "well, I run at 1080p so my pretty dasblinkenlight runs are more than slideshow speed".


----------



## AKBrian (Dec 13, 2022)

W1zzard said:


> Reducing memory clocks is not possible, the slider only goes upwards.



I do wonder if this is a consequence of the Infinity Cache sharing a clock or voltage domain with the VRAM.


----------



## Neo_Morpheus (Dec 13, 2022)

rv8000 said:


> TAA, DLSS, and FSR are ALL terrible. Why would any person spending top dollar for halo cards on expensive high refresh monitors want to introduce blurring, ghosting, and visual artifacts to their games?!
> 
> I don’t understand this mentality, that any of these upscaling techniques are a good thing. They become valuable on lower end cards far more than high end parts imo.


Well, explain it to the likes of Mr. Spnidel.

And maybe I am wrong, but in this case TAA is used for anti-aliasing, not for upscaling, which is what I want, because I really hate jaggies.



TheinsanegamerN said:


> The thing that gets me is that if they are OK with these fake frames, there is a free option that works on all games and brands.
> 
> It's called running a lower resolution, LMFAO. All this work so they don't have to say "well, I run at 1080p so my pretty dasblinkenlight runs are more than slideshow speed".


Agreed wholeheartedly.

Yet you have all the drones going wild over the biggest offender, DLSS 3.


----------



## TheinsanegamerN (Dec 13, 2022)

AKBrian said:


> I do wonder if this is a consequence of the Infinity Cache sharing a clock or voltage domain with the VRAM.


It was like this with RDNA 1 as well. AMD has locked down many of the tweaks one could once do to cards; not as hard as Nvidia, but it's still nowhere near as open as some believe.


----------



## rv8000 (Dec 13, 2022)

Neo_Morpheus said:


> Well, explain it to the likes of Mr. Spnidel.
> 
> And maybe I am wrong, but in this case TAA is used for anti-aliasing, not for upscaling, which is what I want, because I really hate jaggies.
> 
> ...



TAA is an anti-aliasing technique; however, it still introduces blurring. Whether someone personally prefers this method is up to them; imho it reduces image quality and detail and often turns games into a blurry puddle.

The most recent recollection I have of this is Monster Hunter World; its TAA engine implementation is horrible.


----------



## Neo_Morpheus (Dec 13, 2022)

rv8000 said:


> TAA is an anti-aliasing technique; however, it still introduces blurring. Whether someone personally prefers this method is up to them; imho it reduces image quality and detail and often turns games into a blurry puddle.
> 
> The most recent recollection I have of this is Monster Hunter World; its TAA engine implementation is horrible.


Well, whatever removes the blurriness AND the jaggies, I'm game.

I just don't care for the fake FPS boosters called FSR and DLSS.


----------



## sector-z (Dec 13, 2022)

W1zzard said:


> Yes power limits are at maximum (+15). The card pulls around 450 W in that state
> 
> 
> Reducing memory clocks is not possible, the slider only goes upwards.


Oh, thanks a lot for the info. I hope MorePowerTool from Igor'sLAB will work with this generation and let us unlock 600 W or more, plus more voltage, under water.


----------



## mb194dc (Dec 13, 2022)

Seems like the chip has potential, but the first go at it hasn't worked out, and it needs 450 W+ for maximum performance. Maybe there are encouraging signs that a revised version in a few months could be much better.

It'll probably still be insanely expensive, though...


----------



## Steevo (Dec 13, 2022)

So with mods I can see 3.5 GHz under water with this, meaning enthusiasts could have 4090+ levels of performance for $1300 for the card plus a full-coverage block.

Makes sense why AMD released their vanilla version; it's cheap, and the OEM versions with a waterblock should overclock to within a few percent of the 4090.


----------



## Veseleil (Dec 13, 2022)

rv8000 said:


> TAA is an anti-aliasing technique; however, it still introduces blurring. Whether someone personally prefers this method is up to them; imho it reduces image quality and detail and often turns games into a blurry puddle.
> 
> The most recent recollection I have of this is Monster Hunter World; its TAA engine implementation is horrible.


RDR2 as well.

BTW, this card is one of the best looking of the bunch IMO.


----------



## N3M3515 (Dec 13, 2022)

TheinsanegamerN said:


> Wait, are we claiming AMDunboxed is pro nvidia now? LMFAO.


They recommend the 4080 over the 7900 XTX.


----------



## Neo_Morpheus (Dec 13, 2022)

TheinsanegamerN said:


> Wait, are we claiming AMDunboxed is pro nvidia now? LMFAO.


Read again, I clearly said "this dude", but hey, move the goalposts, it's easier to argue that way.


----------



## Veseleil (Dec 13, 2022)

People claiming that Hardware Unboxed is biased, seriously need a chkdsk /r.


----------



## Neo_Morpheus (Dec 13, 2022)

Veseleil said:


> People stating Hardware Unboxed is biased, seriously need a chkdsk /r.


Two quick samples.

Tim, in one of their Q&As, was asked if the 6900 XT at US$700 was a good buy, and he said no, get a 4080 instead.

And in Steve's review of the 7900 XTX, their own "Elite" members complained that the review was biased and that the conclusion was "buy a 4080 because of RT and DLSS".

Anyways, feel free to really watch the wording used in their latest reviews and draw your own conclusions, or not; I wouldn't care either way.


----------



## Veseleil (Dec 13, 2022)

I don't watch their Q&As: boring, time-consuming, and nothing of value. It's at pub-talk level, their way to "show" they are mere mortals with feelings and stuff. Steve Burke (GN), for example, does that in each of his videos. Again, a waste of time.
Though I could agree their (HU) content has been going downhill lately. In the Zen-Zen2 period they were top-notch.


----------



## Denver (Dec 13, 2022)

rv8000 said:


> TAA, DLSS, and FSR are ALL terrible. Why would any person spending top dollar for halo cards on expensive high refresh monitors want to introduce blurring, ghosting, and visual artifacts to their games?!
> 
> I don’t understand this mentality, that any of these upscaling techniques are a good thing. They become valuable on lower end cards far more than high end parts imo.


I think these upscaling technologies are more interesting on low-end GPUs, where performance is often lacking for the target resolution.

Yeah, I also find it ridiculous to rely on them to make games playable with RT on GPUs over $1000...


----------



## RedelZaVedno (Dec 13, 2022)

I was really eager to get this bad boy until I read: _"at *multi-monitor* and *media playback*, just the *graphics card alone consumes 99 W/88 W*"._ WTF, AMD???
You build a gem and then render it basically unusable for people with multi-monitor setups, and I dare say that's probably A LOT of $1000+ GPU buyers, myself included.
Stupid, just stupid, shooting yourself in the foot like that.


----------



## kapone32 (Dec 13, 2022)

RedelZaVedno said:


> I was really eager to get this bad boy until I read: _"at *multi-monitor* and *media playback*, just the *graphics card alone consumes 99 W*"._ WTF, AMD???
> You build a gem and then render it basically unusable for people with multi-monitor setups, and I dare say that's probably A LOT of $1000+ GPU buyers, myself included.
> Stupid, just stupid, shooting yourself in the foot like that.


Want to bet that this will be solved within the next 2 monthly driver updates?


----------



## gupsterg (Dec 13, 2022)

@W1zzard

Multi-monitor power usage may be a bug, depending on which connections are used; see this post.



> If this is like my previous experience with this issue from years ago, it depends on how the monitors are connected, with DP+DP being fine, while DP+HDMI may be problematic.


----------



## kapone32 (Dec 13, 2022)

It is obvious that these cards are tuned in a more nuanced way than the 6000 series. I was watching a video this morning on another AIB card. I do believe these are the closest to ATI-level OC ability since the acquisition.


----------



## RedelZaVedno (Dec 13, 2022)

kapone32 said:


> Want to bet that this will be solved within the next 2 monthly driver updates?


I really hope so, but I'm postponing my purchase until a fix is confirmed. I can live with the 60 W my 5700 XT draws on my 4-monitor setup, but 99 W (probably tested on just a 2-monitor setup) is just too much.


----------



## Lionheart (Dec 13, 2022)

Wasn't expecting that overclocking potential, jeez!!! I guess that's one positive for the AIB partners. That idle/multi-monitor power consumption has to be fixed ASAP, AMD!


----------



## Veseleil (Dec 13, 2022)

kapone32 said:


> It is obvious that these cards are tuned quite differently from the 6000 series. I was watching a video this morning on another AIB card. I do believe these are the closest to ATI-era OC capability since the acquisition.


And don't forget these are the first chiplet-based consumer graphics cards in the world. There's still plenty of room for improvement on the driver side too.

Anyway, their open beta testing is starting soon.


----------



## kapone32 (Dec 13, 2022)

Veseleil said:


> And don't forget these are the first chiplet-based consumer graphics cards in the world. There's still plenty of room for improvement on the driver side too.


I am getting one from Newegg. Hopefully the Gaming OC from Gigabyte will be available next week. I actually want one that has the 3x 8-pin connectors. I can't wait to see what this does with a 5800X3D and Samsung B-die RAM at 3600 14-15-15-36 (XPG Extreme), with an Alphacool block and a 420 rad.


----------



## Dristun (Dec 13, 2022)

I feel like AMD just torpedoed their much more important launch-day reviews by clocking the reference card lower and going with a 2.5-slot cooler and 2x 8-pin. Yeah, well, it would've consumed even more power - but as NVIDIA showed, nobody cares about that if you're faster.


----------



## Veseleil (Dec 13, 2022)

Dristun said:


> AMD just torpedoed their much more important launch-day reviews


They're experts in that field.


----------



## Talon (Dec 13, 2022)

@W1zzard 

For that Cyberpunk 2077 test do you use Ultra Preset, No RT, No Upscale and the built in benchmark? Also why no power consumption shown for the overclock?


----------



## kapone32 (Dec 13, 2022)

Dristun said:


> I feel like AMD just torpedoed their much more important launch-day reviews by clocking the reference card lower and going with a 2.5-slot cooler and 2x 8-pin. Yeah, well, it would've consumed even more power - but as NVIDIA showed, nobody cares about that if you're faster.


For me that is OK. It gives consumers a reason to buy AIB GPUs. That helps to move the market and allows for diversity, unlike the other company, which competes directly with its AIB partners.


----------



## jootn2kx (Dec 13, 2022)

Show us results in Cyberpunk with ray tracing on + overclock against the RTX 4090.
Checking guru3D's results, the 7900 XTX only has RTX 3090-level performance in ray tracing.
So adding a 20% overclock won't get you near a 4090, not by far lol.

While a 20% overclock is impressive in itself, the tests in this article only show pure rasterization.

Edit: Anyway no need to prove anything  already proven here
ASUS TUF Gaming Radeon RX 7900 XTX OC review - (Raytracing) Cyberpunk 2077 (guru3d.com)


----------



## SentinelAeon (Dec 13, 2022)

I never comment on anything, mostly because I just read for fun, and I never own any PC part unless it's like 5+ years old. This OC headroom is probably what made me comment, and also to give a thumbs up to the reviewers.

I usually check all the major review sites. TechPowerUp was actually the last one I started checking, I think around Ryzen 1, and I'm glad to say I find it the best of all the review sites I follow. Probably the most useful thing is the relative performance page. Many sites just offer per-game performance, and to me that's not useful: I just want to know how GPUs/CPUs compare over as many games as possible. The other day I was out drinking coffee and thought I was reading a TechPowerUp review of the 7900 XT, and a bunch of pages were missing, so I got annoyed. Well, turns out I wasn't reading TechPowerUp but some other site without relative performance across all games. So yeah, while I respect all reviewers, because that sounds like insanely tedious work, TechPowerUp is my favourite.

The only thing I would maybe add is, for each card, a page of its relative performance while overclocked, so you could see how GPUs compare both at stock and when overclocked. I know, it's even more tedious work. But I always buy second hand, as cheap as possible, while looking for which piece of tech has the most hidden potential.

Still, reading tech reviews is one of my favourite things to do; it's like watching a sports match, seeing who wins. It also amazes me how passionate the users in the comments are, like soccer fans. When I told my dad about that, he couldn't believe it xD Anyway, keep up the good work, and I'm looking forward to buying this card, second hand, in about 6 years


----------



## gupsterg (Dec 13, 2022)

@W1zzard 

Any chance of the AMD MBA RX 7900 XTX with UV, +15%, 2750 MEM being in the OC chart in the review? Also, OC power figures would be sweet  .


----------



## kapone32 (Dec 13, 2022)

jootn2kx said:


> Show us results in cyberpunk with ray tracing on + overclock against the RTX 4090.
> Checking on guru3D's results the 7900XTX has only a RTX 3090 performance in ray tracing.
> So adding a 20% overclock won't get you near a 4090 not by far lol.
> 
> ...


I am so glad I come from a time when rasterization meant Gaming.


----------



## Psychoholic (Dec 13, 2022)

TheinsanegamerN said:


> You never experienced the wonder what was the 9800 pro's PCI/AGP driver conflict?



Ahh, memories.. I had a 9800 (Non Pro) Flashed to a 9800 Pro back in the day.


----------



## SkullFox (Dec 13, 2022)

I have been searching for the game's settings on the review and I can't find them. what are they?

Preset High?
Preset Ultra?

or something else?


----------



## AnotherReader (Dec 14, 2022)

SkullFox said:


> I have been searching for the game's settings on the review and I can't find them. what are they?


The Test setup page of this review mentions the settings.



> All games are set to their highest quality setting unless indicated otherwise.


----------



## WhoDecidedThat (Dec 14, 2022)

kapone32 said:


> For me that is OK. It gives consumers a reason to buy AIB GPUs. That helps to move the market and allows for diversity, unlike the other company, which competes directly with its AIB partners.


We really need something like Max Q branding in the desktop space.

Those who want 450 watt graphics cards can have them.

Those who want the same GPU to consume 300 watts and maximize power efficiency should be able to get what they want.


----------



## kapone32 (Dec 14, 2022)

WhoDecidedThat said:


> We really need something like Max Q branding in the desktop space.
> 
> Those who want 450 watt graphics cards can have them.
> 
> Those who want the same GPU to consume 300 watts and maximize power efficiency should be able to get what they want.


Well, you already have the choice of 2x 8-pin vs 3x 8-pin, knowing one will definitely give you more power headroom, but I do not disagree.


----------



## Minus Infinity (Dec 14, 2022)

So much money for just better cooling. The performance delta is in no way observable, and pointless given the extra power draw. If only AMD's reference cooling didn't suck.


----------



## Richards (Dec 14, 2022)

Talon said:


> @W1zzard
> 
> For that Cyberpunk 2077 test do you use Ultra Preset, No RT, No Upscale and the built in benchmark? Also why no power consumption shown for the overclock?


We need to know why


----------



## Nkd (Dec 14, 2022)

I really hope you guys improve the OC portion. I like the explanation, but you don't really mention what exactly you did with the card you are testing. Maybe posting your settings would give users an idea lmao. Like what undervolt you set and what maximum you set for the range; maybe that can be a starting point for finding stability.

Also, maybe test a few more games and dump the Unigine thing. Maybe 3-5 top titles for OC, to get a better estimate of the performance range across a few different games.


----------



## Legacy-ZA (Dec 14, 2022)

This performance gain is really impressive. Now, if only AMD and nVidia can come back to reality with their price structures, we can all be happy again.


----------



## wolf (Dec 14, 2022)

Veseleil said:


> People claiming that Hardware Unboxed is biased, seriously need a chkdsk /r.


When they make content that isn't positive for Nvidia, I hear they're AMD shills, when they make content that isn't positive for AMD, I hear they're nvidia shills. 

I just think they throw shade when they want, and praise when they want, and it has nothing to do with favouring either company. 

Examples of both sides are easy to come by.


----------



## terroralpha (Dec 14, 2022)

i want to get excited by this, but after living with a 350W GPU that turns my home office/computer room into a sauna after a couple hours of gaming, i can't.

they really need to make a high end model that's tuned for lower power draw. i'd be perfectly happy with a card with like 85% of the performance that pulls under 200W. maybe make it a 2-slot cooler if they really want to get crazy. that way i don't have to vertically mount or watercool this lump of s*** just to fit other expansion cards without blocking the card's airflow.


----------



## nguyen (Dec 14, 2022)

wolf said:


> When they make content that isn't positive for Nvidia, I hear they're AMD shills, when they make content that isn't positive for AMD, I hear they're nvidia shills.
> 
> I just think they throw shade when they want, and praise when they want, and it has nothing to do with favouring either company.
> 
> Examples of both sides are easy to come by.



Tim probably contributed to the recent shift in HUB's opinions regarding RT and DLSS.
When Steve ran the show alone, all he cared about was price-to-performance and rasterization, which is not only closed-minded but also hypocritical (for a tech-focused channel).


----------



## Jism (Dec 14, 2022)

That's OC potential we haven't seen for a while, lol. I think AMD simply stuck to the power values of a 350 W card.

Imagine planting a nice waterblock on it. 3.3 GHz ~ 3.5 GHz core shouldn't be that difficult.


----------



## W1zzard (Dec 14, 2022)

The machine that I test OC on does not have hardware power measurement capability. Moving to a different box just for the sake of getting OC power numbers seems like a lot of work (time is ultra-limited for those reviews), so I've never considered it.

Seems OC power numbers are really needed, also going by all the comments on Reddit, so I'll probably move the whole OC testing to the power testing rig. I wanted to switch to a different test than Unigine anyway, maybe the new Witcher.

Another option would be to report the card's software power measurements before and after OC.


----------



## Jism (Dec 14, 2022)

W1zzard said:


> The machine that I test OC on does not have hardware power measurement capability. Moving to a different box just for the sake of getting OC power numbers seems like a lot of work (time is ultra-limited for those reviews), so I've never considered it.
> 
> Seems OC power numbers are really needed, also going by all the comments on Reddit, so I'll probably move the whole OC testing to the power testing rig. I wanted to switch to a different test than Unigine anyway, maybe the new Witcher
> 
> Another option could be to report the card's software power measurements before and after OC, which could be an option, too.



Without power measurement equipment, you can still use HWiNFO both at stock and OC. Just record either the average or the maximum.

However, I think the card stands its ground on its own. It's fast enough already.
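For anyone wanting to try the HWiNFO route, a minimal sketch of crunching a logged sensor CSV into average/max board power. This is an illustration, not part of the review's methodology; the column name `"GPU Power [W]"` is an assumption, so check the header row of your own log, since HWiNFO labels sensors differently per card:

```python
import csv

def power_stats(log_path: str, column: str = "GPU Power [W]") -> tuple[float, float]:
    """Return (average, maximum) power from a HWiNFO-style CSV sensor log.

    NOTE: the default column name is an assumption -- verify it against
    your own log's header, as sensor labels vary by card and driver.
    """
    readings = []
    with open(log_path, newline="", encoding="utf-8", errors="replace") as f:
        for row in csv.DictReader(f):
            try:
                readings.append(float(row[column]))
            except (KeyError, TypeError, ValueError):
                continue  # skip rows without a valid numeric reading
    if not readings:
        raise ValueError(f"no readings found for column {column!r}")
    return sum(readings) / len(readings), max(readings)
```

Run it once on a stock log and once on an OC log, and the two (avg, max) pairs give a rough before/after delta, with all the usual caveats about software readings.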


----------



## W1zzard (Dec 14, 2022)

Jism said:


> Without power measurement equipment, you can still use HWiNFO both at stock and OC. Just record either the average or the maximum.


That's the software measurement I mentioned, it's not exactly accurate, but could be better than nothing. Traditionally I do things right, so I'll figure something out


----------



## nguyen (Dec 14, 2022)

W1zzard said:


> That's the software measurement I mentioned, it's not exactly accurate, but could be better than nothing. Traditionally I do things right, so I'll figure something out



A watt-meter is good enough, simple is best 
At the end of the day, people pay for the electricity of the entire PC, not the GPU alone anyways


----------



## W1zzard (Dec 14, 2022)

nguyen said:


> A watt-meter is good enough, simple is best


It will be SUPER confusing if I mix card-only power with full system power on the OC page only


----------



## Jism (Dec 14, 2022)

W1zzard said:


> It will be SUPER confusing if I mix card-only power with full system power on the OC page only



There are other sites that measure everything from the PCIe slot to the PCIe 12V inputs > https://www.tomshardware.com/reviews/amd-radeon-rx-7900-xtx-and-xt-review-shooting-for-the-top/8

Not sure if you want to put that much detail into a card review either. I bet it's time-consuming ramping up every card like that.


----------



## W1zzard (Dec 14, 2022)

Jism said:


> There are other sites that measure everything from the PCIe slot to the PCIe 12V inputs > https://www.tomshardware.com/reviews/amd-radeon-rx-7900-xtx-and-xt-review-shooting-for-the-top/8
> 
> Not sure if you want to put that much detail into a card review either. I bet it's time-consuming ramping up every card like that.


Huh? I do a lot of testing with card-only power, too: https://www.techpowerup.com/review/asus-radeon-rx-7900-xtx-tuf-oc/37.html

Edit: they are using NVIDIA's PCAT, which is too slow imo (only 10 measurements per second)


----------



## lepudruk (Dec 14, 2022)

@W1zzard is there any chance of adding the next-gen Witcher 3 to the test results? I know it just showed up, but it seems to have a very strong impact on GPUs. It would be nice to have a comparison vs the NVIDIA 40x0 series.


----------



## W1zzard (Dec 14, 2022)

lepudruk said:


> @W1zzard is there any chance of adding the next-gen Witcher 3 to the test results? I know it just showed up, but it seems to have a very strong impact on GPUs. It would be nice to have a comparison vs the NVIDIA 40x0 series.


I will definitely use it in future testing, but for these reviews it's not possible .. going on holiday tomorrow for a week, then it's basically xmas, I have a bunch of other NDA reviews coming up, then going on holiday again around NYE

"adding a game" for me means play through at least most of it, make notes of GPU loading, FPS, where good test scenes are, then pick a scene that's "realistic, demanding, not worst-case"


----------



## spnidel (Dec 14, 2022)

welp, after more thought put into this I've come to my personal conclusion that these cards are overpriced because the 7900 XT is really the 7800 XT in disguise, and the XTX is the real 7900 XT

this 7900 XT is AMD's equivalent of nvidia's 4080 12GB, yet I haven't seen them getting as much shit for it - in fact, most I saw is them being praised for being 'pro-consumer' and giving us, unwashed plebs, a X900 card for $100 less than the previous generation's 6900 XT's MSRP of $999 - real fuckin clever of AMD, I'll give them that

the 7900 XT is only 33% faster than a 6800 XT - what will that make the "real" 7800 XT? 15% faster? 20% faster? at what MSRP? I bet it'll be higher than that of 6800 XT's, which further shows how pathetic this generational leap is - we're back to R9 200 -> R9 Fury days LOL

this generation is a total bummer - high-end performance price goes up while carefully marketed as a "better deal than last gen's 6900 XT" to give the illusion that they're not just selling the 7800 XT as a X900 tier card

6800 XT MSRP - $650
3080 MSRP - $700
7900 XT (real: 7800 XT) - $900 (+$250 over last gen high end)
4080 - $1200 (+$500 over last gen high end)

no amount of "but it's new tech" cope will change it - as a person looking to buy a graphics card I DON'T GIVE A SHIT if it's got 6,000,000nm chiplet & fishlet tech, I want a better deal than last gen, not a worse deal

if this trend continues, and if dumbfuck gamers continue buying either lineup from AMD or nvidia, you'll see the 5080 for $1600, the 5090 for $2500, and AMD will follow suit, what with majority shareholder companies being the same in both AMD and nvidia, as pointed out by someone on here
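As a sanity check, the generational jumps in the MSRP list above work out as follows (a quick Python sketch; the figures are the launch MSRPs quoted in the post, nothing more):

```python
def price_jump(old_msrp: int, new_msrp: int) -> tuple[int, float]:
    """Return (absolute increase in $, percentage increase) between two MSRPs."""
    return new_msrp - old_msrp, (new_msrp / old_msrp - 1) * 100

# Launch MSRPs (USD) from the list above.
for gen, (old, new) in {
    "6800 XT -> 7900 XT": (650, 900),
    "3080 -> 4080": (700, 1200),
}.items():
    delta, pct = price_jump(old, new)
    print(f"{gen}: +${delta} ({pct:.0f}%)")
```

That is a +38% jump on the AMD side versus +71% on the NVIDIA side, which matches the +$250 / +$500 deltas in the list.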


----------



## beedoo (Dec 14, 2022)

I don't give a crap what AMD or Nvidia call their cards; they could call them the Ford Focus and Toyota Corolla for all I care. Read the reviews, do your comparisons, and buy if you like, or don't... It's your choice.


----------



## wolf (Dec 14, 2022)

W1zzard said:


> Seems OC power numbers are really needed, also going by all the comments on Reddit, so I'll probably move the whole OC testing to the power testing rig. I wanted to switch to a different test than Unigine anyway, maybe the new Witcher
> 
> Another option could be to report the card's software power measurements before and after OC


I think there's a lot of appetite to look at extended/oc performance per watt numbers, specifically in this instance. It appears RDNA3, specifically so far N31, more so than other architectures in recent years is power constrained in reference form and benefits readily from more power. It could be that rather than being given a power limit at/near the peak of clocks X voltage X stability point, the reference cards wound that back to hit the design/marketing goal of 350w board power, leaving board partners to exploit the headroom.

I'd certainly be very keen to see any testing and analysis you'd be willing to do, tall order 11 sleeps from Christmas though
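The perf-per-watt comparison described above can be sketched in a few lines. All FPS and wattage numbers below are hypothetical placeholders for illustration, not measured data from the review:

```python
def perf_per_watt(fps: float, watts: float) -> float:
    """Frames per second delivered per watt of board power."""
    return fps / watts

def marginal_watts_per_fps(stock_fps: float, stock_w: float,
                           oc_fps: float, oc_w: float) -> float:
    """How many extra watts each additional frame costs when overclocking."""
    return (oc_w - stock_w) / (oc_fps - stock_fps)

# Hypothetical illustration only -- substitute measured figures.
stock_eff = perf_per_watt(100.0, 350.0)   # efficiency at stock power limit
oc_eff = perf_per_watt(118.0, 450.0)      # efficiency after a big OC
extra_cost = marginal_watts_per_fps(100.0, 350.0, 118.0, 450.0)
print(f"stock {stock_eff:.3f} FPS/W, OC {oc_eff:.3f} FPS/W, "
      f"{extra_cost:.1f} W per extra frame")
```

The marginal figure is the interesting one: if the reference card really is wound back below its clock/voltage sweet spot, the first extra watts should buy frames relatively cheaply before the curve flattens out.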


----------



## b1k3rdude (Dec 14, 2022)

spnidel said:


> 6800 XT MSRP - $650 (UK price is £681)
> 3080 MSRP - $700 (UK price is £743)
> 7900 XT (real: 7800 XT) - $900 (+$250 over last gen high end)
> 4080 - $1200 (+$500 over last gen high end)
> ...


I couldn't agree more.

Like a growing number of PC owners, I am utterly puzzled by the NVIDIA/AMD marketing/pricing of "well, it's faster than the old GPU, so you should pay more" b$. To be clear, even the current prices that are at or just below the last-gen flagship MSRPs indicated above are not acceptable either.

I was fortunate with my last 2x GPU upgrades -

In 2020 I traded a 1080 Ti (paid £450 in 2018, exch. val. £400) for a 3070; the upgrade cost £210.
Then 2 years later I traded the 3070 (price £610 in 2020, exch. val. in 2022 £400) for a 3080 12 GB (price £610), which cost £210.
I applied the same trade-in method when I upgraded from LGA1150 to AM4. Adding up all the upgrades, I have carefully spent around £1200 over 4 years. So spending around that amount in one go, on one item? Hard pass...!!!


----------



## WhoDecidedThat (Dec 14, 2022)

W1zzard said:


> The machine that I test OC on does not have hardware power measurement capability. Moving to a different box just for the sake of getting OC power numbers seems like a lot of work (time is ultra-limited for those reviews), so I've never considered it.
> 
> Seems OC power numbers are really needed, also going by all the comments on Reddit, so I'll probably move the whole OC testing to the power testing rig. I wanted to switch to a different test than Unigine anyway, maybe the new Witcher
> 
> Another option could be to report the card's software power measurements before and after OC, which could be an option, too.


software power readings are good enough sir, just make a note that they are software readings


----------



## RainingTacco (Dec 14, 2022)

Can someone explain to me why AMD abandoned curve-based OC tuning like it's done in Afterburner? It was excellent, and now they've returned to these primitive sliders. Why?


----------



## medi01 (Dec 14, 2022)

Holy smoke:






This card deserves OC performance testing in more games.


----------



## kapone32 (Dec 14, 2022)

medi01 said:


> Holy smoke:
> 
> View attachment 274467
> 
> This card deserves OC performance testing in more games.


I am not sure if I should order the card or the water block first. The ASRock Aqua is too expensive for me, so I am getting a block from Alphacool. I wish that, like EK, they provided single-slot backplate adapters. For those of us who want to OC: make sure you get the 3x 8-pin variant. I expect there will be some serious OC profiles for these cards by Feb-Mar. I might add another 360 by getting the new Alphacool Eisbaer. I really like how big the cold plate is, and it has never had to deal with Asetek BS.

I dislike the narrative that OC does nothing to improve ray tracing. As I was playing Grid Legends this morning, I wondered: would I want ray tracing and DLSS, or Unreal Engine 5 and FSR? Then I remembered I was playing a racing game in 4K, so the car model was already stellar at 120 FPS with my 6800 XT. So more FPS in every game? Yes. If they tune this card the same way they have tuned the 6000 series with monthly drivers, we buyers will all be so glad we grabbed one of these. And I didn't want to say it, but if (when) mining becomes a thing again, these cards should be frigging awesome at that too.



RainingTacco said:


> Can someone explain to me why AMD abandoned curve-based OC tuning like it's done in Afterburner? It was excellent, and now they've returned to these primitive sliders. Why?


I suspect that AMD is not going to fully open the software package until they can work on the issues they have identified.


----------



## sector-z (Dec 14, 2022)

W1zzard said:


> The machine that I test OC on does not have hardware power measurement capability. Moving to a different box just for the sake of getting OC power numbers seems like a lot of work (time is ultra-limited for those reviews), so I've never considered it.
> 
> Seems OC power numbers are really needed, also going by all the comments on Reddit, so I'll probably move the whole OC testing to the power testing rig. I wanted to switch to a different test that Unigine anyway, maybe the new Witcher
> 
> Another option could be to report the card's software power measurements before and after OC, which could be an option, too.


Yes, your idea is what I was talking about, but it was more about listing the power limit of each card, like in NVIDIA reviews. For example, the reference 6900 XT was 293 W with the +15%. What I wanted to know is the maximum each card gives after the +15%.


----------



## erek (Dec 14, 2022)




----------



## Veseleil (Dec 14, 2022)

erek said:


>


That voice...


----------



## Neizel (Dec 15, 2022)

wolf said:


> I think there's a lot of appetite to look at extended/oc performance per watt numbers, specifically in this instance. It appears RDNA3, specifically so far N31, more so than other architectures in recent years is power constrained in reference form and benefits readily from more power. It could be that rather than being given a power limit at/near the peak of clocks X voltage X stability point, the reference cards wound that back to hit the design/marketing goal of 350w board power, leaving board partners to exploit the headroom.
> 
> I'd certainly be very keen to see any testing and analysis you'd be willing to do, tall order 11 sleeps from Christmas though


This looks like the Vega OC boom again.

Vega 56/64 was a lot of fun, and the performance gains were huge too.

Power and voltage were limiting those GPUs; most cards hit the power wall well before frequencies stopped scaling.

So excited to see these cards on waterblocks, with volt mods, power limits removed, etc.


----------



## terroralpha (Dec 15, 2022)

Veseleil said:


> That voice...


dude, i know. i stumbled across her channel like 5 years ago. i go watch it for asmr sometimes.


----------



## erek (Dec 15, 2022)

Veseleil said:


> That voice...


like it?


----------



## Veseleil (Dec 15, 2022)

erek said:


> like it?


It's calming alright.  And the accent is great.


----------



## Mussels (Dec 15, 2022)

3090 power consumption with 4080 performance
that's pretty solid

It's surprising to see my 3090 get pushed so far down the lists, so fast


----------



## Pastrav (Dec 15, 2022)

the54thvoid said:


> Hmm, this card seems waaaayyyyy better than the XFX one. But again, price. I'm drawn to the more efficient 4080. But not at the price point. And this card looks to be heading toward that point too, though, the TUF cards are usually cheaper.


I was comparing the two as well. My initial impression is that they're both overwhelmingly similar to the reference design.
The XFX PCB looks shorter; it keeps the 3-phase VRM design for the memory and the vapor chamber on the cooler,
while the ASUS PCB is longer, has 4 phases for memory, and doesn't have a vapor chamber.
I couldn't find any info about other components or indicators of quality.

What makes the ASUS one way better than the XFX?


----------



## the54thvoid (Dec 15, 2022)

Slightly faster on OC but also very much quieter in use. Noise is a deal-breaker for me. Shame it's the price it is.


----------



## qbsterman (Dec 15, 2022)

Great job! I just wonder how much more frequency we can squeeze out when we apply the EK waterblock that just released. Is it Radeon 290X times all over again? I don't really care about RT, but I do love my low frame times, and my FPS lows are higher on AMD cards in high-refresh gaming.


----------



## MArev (Dec 15, 2022)

erek said:


>


Finally, a tech barbie.



qbsterman said:


> Great job! I just wonder how much more frequency we can squeeze out when we apply the EK waterblock that just released. Is it Radeon 290X times all over again? I don't really care about RT, but I do love my low frame times, and my FPS lows are higher on AMD cards in high-refresh gaming.


I thought the same; I want to see the beast unleashed with a MO-RA3.


----------



## erek (Dec 16, 2022)

MArev said:


> Finally, a tech barbie.


opinion?


----------



## rattlehead99 (Dec 19, 2022)

How much power does it consume when overclocked?


----------

