
ASRock Radeon RX 6600 XT Phantom Gaming D

W1zzard

The ASRock Radeon RX 6600 XT Phantom Gaming comes with the best cooler of all the RX 6600 XT cards we've tested so far. Fan settings are excellent, too: The card is whisper-quiet and runs only 61°C under full load. In our manual OC testing, we saw excellent results, better than all other RX 6600 XT cards.

 

 
That's a very nice card, and that's the kind of engineering AMD cards need to make them an attractive option to people.
 
Good job TechPowerUp, and good job ASRock on this card.
 
Sacrifice the few to save the many. Now if they'd just halt wafer production for the Karen halo-tier cards and devote all of it to higher-yield cards till supply gets better, things could improve for the majority, not the entitled minority.
 
Gaming power draw of 175 W is "not terribly high"! Nah, just 100 W above my target, nothing. (lol AMD)
 
That's a very nice card, and that's the kind of engineering AMD cards need to make them an attractive option to people.
They need to remove the bus limitation, though.
 
Sacrifice the few to save the many. Now if they'd just halt wafer production for the Karen halo-tier cards and devote all of it to higher-yield cards till supply gets better, things could improve for the majority, not the entitled minority.
So, outside of there being zero financial incentive for AMD to cut production of their high-end parts to feed the entitled majority with cheap silicon, you ignore the reality of the market. AMD already tried this. It was called the RX 400 series: small dies with high yields that sold for low, low prices.

The end result? Nvidia's 1070, a $350 card, outsold the entirety of AMD's 400 series. Granted, so did the 1060, but the mid-to-high-end market that AMD claimed was too small to cater to with a special GPU turned out to be larger than their entire market share. The 1080 ALSO outsold the entire RX 400 lineup.

So which market should AMD abandon: the market willing to pay $1000+ for a GPU, or the market willing to pay $200 while constantly whining about prices? AMD's finances don't lie. Somebody is buying these things, and so long as production is limited, you produce as much of the high-margin stuff as possible. That's just common sense.
 
Yeah. Sucks for everyone waiting for that $200-250 card. It's been since before COVID that we've seen one of those. But corporations are run by people, and the old saying goes...

Get while the getting is good

...because one day mining may crash hard, production may outstrip demand, and they'll be selling them for a lot cheaper.
 
@W1zzard Thanks for another great review! Any word on whether you'll be getting the ITX-sized ASRock 6600 XT in for review? Would love to see that put through its paces.

Gaming power draw of 175 W is "not terribly high"! Nah, just 100 W above my target, nothing. (lol AMD)
What an odd thing to say. "(lol amd)"? Does Nvidia have anything competitive to offer in the 75W range? I mean, yes, it's a crying shame that there haven't been any good 75W GPU options for years, but that's an industry problem, not an AMD problem. Now that AMD is at efficiency parity with (or even ahead of) Nvidia, there's significant hope for much better 75W cards if this shortage ever lets up and lower end GPUs launch. Given the efficiency of this (and assuming a ~120W 6600 non-XT), I could see a 75W 6500 XT or 6500 turn out to be an excellent performer.
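For what it's worth, here's a quick back-of-the-envelope sketch of what a 75 W part derived from this silicon could do. The function and numbers are my own illustration, not from the review; pure linear power-to-performance scaling is a pessimistic floor, since perf/W normally improves as clocks drop down the voltage/frequency curve.

```python
def scaled_perf(base_perf: float, base_power: float, target_power: float,
                efficiency_gain: float = 1.0) -> float:
    """Estimate relative performance at target_power.

    efficiency_gain > 1 models the perf/W improvement you'd expect
    from running lower on the V/f curve (a guess, not measured data).
    """
    return base_perf * (target_power / base_power) * efficiency_gain

# Pessimistic floor: pure linear scaling from ~150 W stock down to 75 W
floor = scaled_perf(100.0, 150.0, 75.0)            # 50.0 (half the performance)

# With a hypothetical 30% perf/W gain from the lower clocks
optimistic = scaled_perf(100.0, 150.0, 75.0, 1.3)  # 65.0
```

Even the pessimistic case would land well above a GTX 1650, which is why a cut-down 75 W SKU looks plausible on paper.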
 
A gimped 8-lane, 128-bit GPU at 175 W. You have to be INSANE to call that an efficient design. It's an overclocked, overpriced notebook chip.
 
(chart: energy efficiency across the test group)


How is that "not efficient"? At the very least, you'd have to draw the line between efficient and not efficient somewhere in the middle of my test group. Actually, my test group heavily favors recent releases; if you compared against older cards, too, it would certainly be "insanely efficient".
 
A gimped 8-lane, 128-bit GPU at 175 W. You have to be INSANE to call that an efficient design. It's an overclocked, overpriced notebook chip.
Whatever you're talking about here, it has no relation to efficiency. Heck, the only way bus width would count into any measure of efficiency would be that doing the same work with a narrower bus is more efficient. Not that I would argue that, as the power savings from the narrower bus is likely a couple of watts at best and thus are immaterial, but... there is no way a narrower bus spec makes this a less efficient design. That just doesn't make any type of logical sense.

You could always argue that it's bottlenecked by the narrower bus, but... it isn't. Sure, if you use a PCIe 3.0 platform and run some very specific games (Doom Eternal, are there others?) it's a bit bottlenecked. But PCIe 4.0 PEG slots have been the norm for AMD for two generations and Intel for one. It's not perfect, but it isn't a huge deal.
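To put numbers on the x8 link, a quick sketch: the per-generation transfer rates and 128b/130b encoding are per the PCIe specs; the helper function itself is just my illustration.

```python
def link_bandwidth_gbps(gen: int, lanes: int) -> float:
    """Theoretical one-direction bandwidth in GB/s for a PCIe link."""
    gt_per_s = {3: 8.0, 4: 16.0}[gen]        # transfer rate per lane, GT/s
    payload = 128.0 / 130.0                  # 128b/130b encoding overhead
    return gt_per_s * payload / 8.0 * lanes  # bits -> bytes, times lanes

gen3_x8 = link_bandwidth_gbps(3, 8)   # ~7.88 GB/s: what the 6600 XT gets on PCIe 3.0
gen4_x8 = link_bandwidth_gbps(4, 8)   # ~15.75 GB/s: same as a full Gen3 x16 slot
```

So on a PCIe 4.0 platform the x8 link carries exactly what a Gen3 x16 slot would, which is why the limitation mostly only shows up on older boards.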
 
A 175 W mid-to-low-end part is not "efficient" by any stretch of the imagination. HEY GUYS, LOOK AT MY "NEW AND EFFICIENT" LED LIGHT BULB... IT ONLY CONSUMES 175 W VS THAT OLD VINTAGE RETRO 60 W LIGHT BULB! But... it's just a crappy lamp that sucks way more power? "Ah no, but look, it's BRIGHTER!"

Absurd.

Regardless of performance, a single watt past 120 W makes it an energy hog. Why? On paper this is a 75 W GPU like the 1050 Ti / 1650, PERIOD. AMD is pushing that 7 nm node advantage to hell, hence the stupidly high 175 W TDP.
 
You have some very, very, very odd ideas about GPU segmentation and efficiency. I mean ... yes, this GPU consumes 175 W. Though that's a pretty heavily factory-OC'd version - the stock ones consume ~150 W. So in terms of power draw, it's in the upper midrange area, though there have been high-end cards not too far from this. That's not too strange - below 200 W, GPUs tend to be tightly stacked. Sure, it consumes a lot more power than the nominally higher-"ranked" RX 570, but ... at the time the RX 570 was relevant, AMD didn't have a single GPU capable of competing above $300 (and that was before GPU prices went bananas).

You're classifying GPUs without considering performance at all. Which is utterly and completely absurd. I mean, what is efficiency? It is the ratio between input and output - the less input and more output, the more efficient, whatever you're looking at. Less results for more work = less efficient. You're saying "this consumes a lot of power, so it's inefficient" yet ... it outperforms the RTX 3060, which consumes more power. Are you railing at RTX 3060 reviews in the same way, shouting how these are inefficient garbage GPUs? I mean, come on man, get your act together.

You also say "on paper, this is a 75W GPU" - how, exactly? Seriously, which specs align with that? And does actual real-world performance reflect that statement in any way, shape or form? I mean, look at the chart @W1zzard posted above, ffs. This performs a full 25% better per watt of power consumed compared to the GTX 1650 Super. So, it consumes 2x the power, but it also performs 2.5x better. That? That's more efficient. And that's for an overclocked SKU with a significantly (25 W, ~16%) increased power draw from stock.
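Spelling out that perf-per-watt arithmetic as a trivial sketch (multipliers taken from the GTX 1650 Super comparison above):

```python
def perf_per_watt_ratio(perf_mult: float, power_mult: float) -> float:
    """Relative efficiency: performance multiplier divided by power multiplier.
    > 1 means the newer card does more work per watt."""
    return perf_mult / power_mult

# 2.5x the performance for 2x the power
ratio = perf_per_watt_ratio(2.5, 2.0)  # 1.25 -> 25% better perf/W
```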

If all you're going from is "this is a x600 tier card" - well, then you need to adjust your frame of reference. GPU numbering counts down from the highest end card. The faster those go, and the broader the range of useful performance, the higher up any given tier below the top will be. In the GTX 900 series we had 5 relevant SKUs (980 Ti, 980, 970, 960, 950). In the 10 series there were 8 (1080 Ti, 1080, 1070 Ti, 1070, 1060 6GB, 1060 3GB, 1050 Ti, 1050). In the RTX 2000/GTX 16 series there were 7 + 6 (2080 Ti, Super and base, 2070 Super and base, 2060 Super and base, plus the GTX 1660 Ti, Super and base, and 1650 Ti, Super and base). See how things are expanding? And in addition to that, AMD's previous GPUs were segmented to make their highest end at the time, which was a midrange GPU (not even upper midrange), look more "high end" by calling it "80". Hence the RX 580 being soundly outperformed by the RX 5600 XT, despite that nominally being two tiers further down the stack.

So no, there is nothing about this GPU that is in the same tier as 75W GPUs. That is obviously not saying that it's not overpriced - it very much is. In a sane world, this would be a $300 GPU. But it's not a $150 75W GPU, and it never will be.

Oh, and another thing about efficiency: for rasterized graphics, closer-to-stock 6600 XTs are the second most efficient GPUs in existence.
(chart: energy efficiency ranking across GPUs)


I mean, if you don't want a GPU that performs to this level, that is perfectly fine. That doesn't make those that do so any less efficient. Your lightbulb analogy makes no sense whatsoever.

Edit: derp
 
The 1650 Ti only exists for mobile. There is no desktop 1650 Ti, only the 1650 and 1650 Super.
 
It's clear to me that the design of this GPU is intended only to cut costs and maximize profits for AMD. This is a laptop solution. In a market starved for GPUs, "in a sane world" this should be a sub-$200 GPU with a 75 to 100 W TDP max, but they think they have gold in their claws, so AMD milks this POS like mad.





 
So tell us then, how would they achieve this 50% drop in power draw? Just reducing clock speeds? Nothing else? I mean, that would kind of be a dream budget GPU, as you could overclock it right back to this level of performance. Wide and slow layouts are also extremely efficient, so it would no doubt perform well. But that isn't how GPUs are made, and has never been. Can you show an example of a single desktop GPU that runs at half the clock speed of others on the same architecture and node? No, because that doesn't exist. I guess it would be equivalent to Nvidia's Max-Q approach, but... those GPUs are really expensive for their level of performance.

And, again, are you equally insistent that Nvidia should have made the 3060 run at 75W and cost much less? Seeing how it consumes more power than the 6600 XT and performs worse across the board, that would only be logical, right?

It's also pretty fascinating to see the utter lack of actual arguments here. Shouting something repeatedly doesn't make it true. And if you're trolling, sadly you're not doing a good job of it.
 