
ASRock Radeon RX 6600 XT Phantom Gaming D

Joined
Mar 28, 2020
Messages
1,779 (1.01/day)
Personally, I feel that if we focus on the card's performance versus its power draw, it's not a bad trade-off. In most cases the card keeps up with or beats a 5700 XT, which draws 220W or more. However, AMD may have fallen short of the 50% efficiency improvement from RDNA to RDNA2 that they advertised. In my opinion, the only card in AMD's RDNA2 lineup I can compare is the RX 6800, which sees a significant step up in performance over the 5700 XT for the power it draws, at around the same TDP.


In any case, I feel the MSRP is very high for a card in this range. The cooler may have been over-engineered for this card, resulting in low load temps.
 
Joined
May 2, 2017
Messages
7,762 (2.74/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Personally, I feel that if we focus on the card's performance versus its power draw, it's not a bad trade-off. In most cases the card keeps up with or beats a 5700 XT, which draws 220W or more. However, AMD may have fallen short of the 50% efficiency improvement from RDNA to RDNA2 that they advertised. In my opinion, the only card in AMD's RDNA2 lineup I can compare is the RX 6800, which sees a significant step up in performance over the 5700 XT for the power it draws, at around the same TDP.


In any case, I feel the MSRP is very high for a card in this range. The cooler may have been over-engineered for this card, resulting in low load temps.
Promises like that are always pretty idealized though - no doubt there's a very specific comparison that manages a 50% improvement, but it won't be a maxed-out SKU with clocks pushed high. The efficiency gap between the 6700 XT and 6800 illustrates that quite nicely. In a similar vein, RDNA(1) covered a huge efficiency span, with the 5700 XT being pretty bad, the 5700 being good, and the 5600 XT being amazing for its time before suddenly receiving a last-minute factory OC pushing it into "still pretty good, but no longer class leading" territory. Specific implementations and tuning matter a lot. As do the test scenario, resolution, etc.

An example: The RX 6800 in TPU's reviews is 61% more efficient than the RX 5700 XT at 2160p, but only 33% more efficient at 1080p. At 1440p it hits the 50% improvement mark dead on. On the other hand, other RDNA(1) SKUs fare quite differently in the same comparison (1080p/1440p/2160p):
Improvement over the 5600 XT (new BIOS): 10% / 27% / 37%
Improvement over the 5700: 11% / 27% / 37%
Improvement over the 5700 XT: 33% / 50% / 61%

For a more like-for-like comparison (high end XT vs. XT): 6800 XT perf/W improvement over 5700 XT: 15% / 33% / 27%

So there's definitely a 50% improvement to be found if you look hard enough, but something slightly above 30% seems far more realistic when comparing the range of implementations of each architecture more broadly across a range of test scenarios. What will be really interesting to see is what the smaller RDNA2 SKUs bring to the table in efficiency at lower resolutions. Sadly TPU doesn't post per-resolution efficiency numbers any more, which likely makes the 6600 XT look less efficient overall, given how much worse it performs at higher resolutions than at lower ones. The 6600 XT Strix review puts it at 18% more efficient than the 5600 XT overall, but it delivers 33% more performance at 1080p, so assuming both GPUs sit at their listed gaming power draw (166W vs. 152W), that's something like a 22% efficiency increase at that resolution. Again, not ground-breaking, and certainly not 50%, but kind of promising for the lower SKUs given how far the 6600 XT is pushed in terms of clocks.
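For anyone who wants to sanity-check that last figure: perf/W gain is the performance ratio divided by the power ratio, not the difference of the two percentages. A quick sketch (performance and power numbers are the ones quoted above; the power figures are listed gaming draws, not measurements from this exact test):

```python
# Perf/W gain = (relative performance) / (relative power draw) - 1
perf_ratio = 1.33        # 6600 XT ~33% faster than the 5600 XT at 1080p
power_ratio = 166 / 152  # listed gaming power draw, 6600 XT vs. 5600 XT

efficiency_gain = perf_ratio / power_ratio - 1
print(f"{efficiency_gain:.1%}")  # ~21.8%
```

Subtracting the percentages (33% faster minus ~9% more power) lands in the same ballpark, but the ratio is the number that actually matches how TPU computes perf/W.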
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
28,122 (3.72/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Sadly TPU doesn't post per-resolution efficiency numbers any more, which likely makes it look less efficient overall due to its poor performance at higher resolutions compared to lower ones.
Great point. Efficiency is now based on Cyberpunk @ 1440p btw
 
Joined
May 2, 2017
Messages
7,762 (2.74/day)
Great point. Efficiency is now based on Cyberpunk @ 1440p btw
Thanks! Btw, not to nag, but is there any chance we'll see a review of the 6600 XT Challenger ITX?
CyberPunk2077 is the new "Crysis", and that's not a bad thing.
True. Though IIRC Crysis did technically work at launch (just not particularly well on most systems), which CP2077 kind of didn't. That being said, things do seem to have improved quite a bit since that debacle, and it's pretty great to once again have one of those "you can't really run this at high resolution Ultra, period" games to test.
 
Joined
Jul 5, 2013
Messages
28,859 (6.83/day)
Though IIRC Crysis did technically work at launch (just not particularly well on most systems), which CP2077 kind of didn't.
I remember that launch well. Crysis had just as many glitches as CP2077. The difference is that back then people were not the whiny cry-babies they are now. Remember, the Crysis launch was a full human generation ago (tell me you don't suddenly feel old...).
 
Joined
May 2, 2017
Messages
7,762 (2.74/day)
I remember that launch well. Crysis had just as many glitches as CP2077. The difference between then and now is that people were not the whiny cry-babies they are now. Remember, the Crysis launch was a full human generation ago(tell me you don't suddenly feel old..).
Isn't a generation typically defined as around 25 years? :p But you're probably right - given how different my understanding of everything computer-related is now vs. then, I probably just accepted it as how things were at the time. Of course, there's also the question of how much bugs can really bother you when you're playing a game at 20 fps in 640x480 :p
 
Joined
May 2, 2017
Messages
7,762 (2.74/day)
Yes, I said there's the 1650 and the 1650S ...
I guess I stand corrected: there are "only" 12 desktop SKUs in Nvidia's previous GPU generation and not 13. :D Still quite the increase from the 5-SKU 900 series though!
 
Joined
Apr 28, 2021
Messages
24 (0.02/day)
OK, enjoy your free GPU; personally I will not buy this POS, and Nvidia is worse. So what? These cards are already crazy expensive over here, 800-900 USD, so I don't really mind paying extra for real efficiency - I have the money - but this? It's a joke and an insult of a GPU.
 
Joined
May 2, 2017
Messages
7,762 (2.74/day)
OK, enjoy your free GPU; personally I will not buy this POS, and Nvidia is worse. So what? These cards are already crazy expensive over here, 800-900 USD, so I don't really mind paying extra for real efficiency - I have the money - but this? It's a joke and an insult of a GPU.
Again: what are you talking about? GPUs are most definitely not free for anyone - you say as much yourself. Though I guess you're implying that the people disagreeing with you are AMD shills, which ... yay. What an innovative and productive line of argumentation. /s

And again: paying extra for "real efficiency" - which GPUs are that? The 3060, which consumes more power and performs worse? Or the 3060 Ti, which consumes even more power but beats this? Nobody here is talking about money; we're talking about your absurd definition of "efficiency", whatever it is. Nobody here is saying what you should or shouldn't buy - we're just trying to inject at least a modicum of rationality into what is clearly a pretty chaotic line of reasoning.

GPUs are crazy expensive everywhere. The 6600 XT has an inflated MSRP and even more inflated street prices - as does literally every GPU on the market today. Whether it's in line with, better than, or worse than the others is essentially random, down to the whims of scalpers and people squeezing out margins in your particular corner of the world. The last thing we as customers need to burden ourselves with in a market like that is further irrational and selective reasoning beyond "get whichever GPU performs best at the best price you can afford". We certainly don't need to be pointing out meaningless, irrelevant-in-practice spec "deficits" and proclaiming them to be the most important characteristic of a GPU - that's just doing ourselves and everyone else a disservice.
 
Joined
Sep 2, 2020
Messages
1,491 (0.92/day)
System Name Chip
Processor Amd 5600X
Motherboard MSI B450M Mortar Max
Cooling Hyper 212
Memory 2x 16GB DDR4 3200MHz
Video Card(s) RX 6700
Storage 5.5 tb hd 220 g ssd
Display(s) Normal monitor
Case something cheap
VR HMD Vive
Def gonna put this GPU in line to upgrade my aging 1060 once I can get it at MSRP
after I upgrade the CPU, ofc
 