# "Radeon RX 6600 XT is most efficient mining card yet"



## P4-630 (Aug 11, 2021)

https://www.reddit.com/r/Amd/comments/p2ai8r

They can have them, I wouldn't buy one of these anyway...


----------



## nguyen (Aug 11, 2021)

Well, kinda but not really: a 3060 Ti or 3070 (non-LHR) can get 60 MH/s at 100 W (0.6 MH/s per watt) with a bit of undervolting. But let the miners take the 6600 XT.


----------



## Chrispy_ (Aug 11, 2021)

Debatable:

Looking at a single card, then yes, 32 MH/s at 55 W is impressive.
Looking at a rig of 6 cards, it's 192 MH/s for 330 W, plus ~75 W overhead for the motherboard, CPU (~30 W), RAM, fans, SSD, etc., for a net of 405 W for 192 MH/s.

A more powerful card like a vanilla 5700 8GB at 55 MH/s for 110 W can't quite match the hashrate-per-watt on a per-card basis, but you need fewer rigs to produce the same hashrate, so that rig overhead is almost halved.

A rig of 6x 6600 XT is 192 MH/s for 405 W, which is 0.47 MH/s per watt.
A rig of 6x 5700 is 330 MH/s for 735 W, which is 0.45 MH/s per watt.

So it's good, but close enough that halving the number of rigs taking up space, power sockets, network points, and per-rig maintenance/management effort is far more beneficial than squeezing out a 4% efficiency gain. It isn't going to convince miners to switch to these unless they're vastly cheaper than the alternatives and very widely available.
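The rig-level comparison above can be reproduced with a few lines of arithmetic. This is just a sketch of the post's numbers; the 6-card rig size and 75 W overhead figure are the post's own estimates, not constants.

```python
def rig_efficiency(mhs_per_card, watts_per_card, cards=6, overhead_w=75):
    """Return (total MH/s, total W at the rig, MH/s per watt).

    overhead_w covers motherboard, CPU, RAM, fans, SSD, etc.,
    per the estimate in the post above.
    """
    total_hash = mhs_per_card * cards
    total_watts = watts_per_card * cards + overhead_w
    return total_hash, total_watts, total_hash / total_watts

rx6600xt = rig_efficiency(32, 55)    # 192 MH/s, 405 W, ~0.47 MH/W
rx5700 = rig_efficiency(55, 110)     # 330 MH/s, 735 W, ~0.45 MH/W

print(f"6x 6600 XT: {rx6600xt[0]} MH/s / {rx6600xt[1]} W = {rx6600xt[2]:.2f} MH/W")
print(f"6x 5700:    {rx5700[0]} MH/s / {rx5700[1]} W = {rx5700[2]:.2f} MH/W")
```

As the numbers show, the per-rig efficiency gap shrinks to about 4% once overhead is included, which is the whole point being made.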


----------



## Chomiq (Aug 11, 2021)

Let them have it.


----------



## mb194dc (Aug 11, 2021)

So there is a point to its existence!


----------



## R-T-B (Aug 11, 2021)

Chrispy_ said:


> Debatable:
> 
> Looking at a single card then yes, 32MH/s at 55W is impressive
> Looking at a rig of 6 cards it's 192MH/s for 330W + 75W overhead for the motherboard, CPU@~30W, RAM, fans, SSD, etc for a net of 405W for 192MH/s
> ...


This.  Board + CPU overhead is a huge factor people are forgetting...


----------



## Cheese_On_tsaot (Aug 11, 2021)

Just grab an RTX 2060


mb194dc said:


> So there is a point to its existence!


No.
Yes.
Maybe.


----------



## dragontamer5788 (Aug 11, 2021)

RDNA is bad at compute!! CDNA is for compute purposes!

/s

I wonder why this meme never dies.


----------



## Vya Domus (Aug 11, 2021)

dragontamer5788 said:


> RDNA is bad at compute!! CDNA is for compute purposes!


I don't think anybody ever said it's bad but compute is clearly not the main focus either.


----------



## Chrispy_ (Aug 11, 2021)

Vya Domus said:


> I don't think anybody ever said it's bad but compute is clearly not the main focus either.


Polaris/Vega/RDNA1/RDNA2 compute performance is mostly irrelevant here; it's AMD's VRAM controllers that make these cards desirable for ETH. That's why I run all the core clocks as low as possible: core speed has almost no benefit for ETH mining.

The only reason I don't run the cores lower still is that there's no further power/temperature reduction from dropping them while the memory controllers are running an overclock at full load.


----------



## GorbazTheDragon (Aug 11, 2021)

The other thing is that these cards have on the order of twice the ROI time of other cards you can buy...


----------



## Hardcore Games (Aug 13, 2021)

As usual, factor in capital cost allowance for the initial outlay, along with the inevitable hardware failures.

Chia enthusiasts, meanwhile, are spinning their wheels, trashing hard disks by the thousands in the process.


----------



## kruk (Aug 13, 2021)

P4-630 said:


> https://www.reddit.com/r/Amd/comments/p2ai8r
> 
> They can have them, I don't buy one of these anyway...


55 W is misleading, because that's ASIC-only power. The total board power is 20-30 W higher, which slides it down the list to somewhere around the RX 6800.


----------



## Chrispy_ (Aug 13, 2021)

kruk said:


> 55W is misleading, because this is ASIC only power. The total power is 20-30W higher, which means it slides down the list to somewhere around RX 6800.


Assuming you mean GPU-only power, are you sure about that?

Nvidia's driver reports board power and GPU-only power separately in tools like GPU-Z and HWiNFO.
AMD's driver reports only board power for current-gen products. A product with a 180 W total board TDP, like the vanilla 5700, will report 180 W of usage in something like FurMark, not the ~150 W you'd see if it were reporting GPU power only.

I wasn't sure myself until I hooked up a kill-a-watt meter to my (now three) rigs:

18 GPUs @ 107 W
15 fans @ 3 W
3 CPUs @ ~30 W
3 motherboards/SSD/WiFi @ ~20 W
multiply the total by 1/0.92 for 80+ Gold PSU efficiency

Total 'from-the-wall' use (theoretically calculated from GPU-Z readings and TDP estimates of the other parts) = 2305 W.
Actual kill-a-watt meter reading: 2275-2375 W. It fluctuates quite a bit, but I never saw it north of 2400 W.

If I had to add 20-30 W per GPU, I would expect the meter to read more like 2800 W...


----------



## kruk (Aug 14, 2021)

Chrispy_ said:


> Assuming you mean GPU-only power, are you sure about that?



This was first pointed out by uzzi38:

https://twitter.com/i/web/status/1425604704214700032

Then AdoredTV contacted Keith May of PCWorld, and they found that it actually uses 75 W: https://adoredtv.com/no-the-rx-6600-xt-is-not-the-best-mining-gpu/

It kind of makes sense, but after reading your test I'm not really sure, so ...


----------



## nguyen (Aug 14, 2021)

I see lots of uninformed people saying RDNA2 draws 100 W less than Ampere because of this. AMD's driver power measurement doesn't include VRM power loss, which could be 30-40 W at the higher end (6900 XT) and maybe around 20 W on the 6600 XT.


----------



## Chrispy_ (Aug 14, 2021)

To me that doesn't make sense. Google "VRM efficiency" and you see pages upon pages of the same graph, showing that VRMs have an efficiency of 94-95%.

For a 30-40 W loss, that would mean the 6900 XT is drawing 600-800 W of power, which we all know isn't happening, because the PCIe power connectors would melt way before that.
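The reasoning above follows directly from the definition of efficiency: if loss = input x (1 - efficiency), then input = loss / (1 - efficiency). A quick sketch, using the post's 95% figure:

```python
def implied_input_power(loss_w, efficiency=0.95):
    """Input power implied by a given conversion loss.

    loss = input * (1 - efficiency)  =>  input = loss / (1 - efficiency)
    """
    return loss_w / (1 - efficiency)

for loss in (30, 40):
    print(f"{loss} W loss at 95% efficiency -> ~{implied_input_power(loss):.0f} W input")
```

A 30-40 W loss at 95% efficiency implies 600-800 W of input power, hence the "connectors would melt" objection.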


----------



## nguyen (Aug 14, 2021)

Chrispy_ said:


> To me that doesn't make sense. Google "VRM efficiency" and you see pages upon pages of the same graph - showing that VRMs have an efficiency of 94-95%
> 
> For a 30-40W loss on a 6900XT, that would mean that the card is using 600-800W of power, which we all know isn't happening, because the PCIe power connectors would melt way before that.



The higher the current handling, the lower the VRM efficiency. For example, here is the spec sheet of the IR3553 power stage:

*(IR3553 datasheet efficiency/power-loss chart; image not preserved)*

Let's say the 6900 XT uses 240 W for the chip with an 8-phase VRM: each power stage handles 25 A (assuming a Vcore of 1.2 V) and dissipates about 3.5 W, so 8 x 3.5 W = 28 W of power loss for the GPU VRM alone. Higher-end SKUs will use better power stages for higher efficiency.
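The per-phase arithmetic above can be written out explicitly. Note that the ~3.5 W-per-stage loss at 25 A is a value read off the IR3553 datasheet curve by the poster, taken here as a given:

```python
CORE_POWER_W = 240              # assumed 6900 XT chip power from the post
VCORE_V = 1.2                   # assumed core voltage
PHASES = 8                      # assumed VRM phase count
LOSS_PER_STAGE_W = 3.5          # per the IR3553 datasheet curve at 25 A

total_current_a = CORE_POWER_W / VCORE_V       # 240 W / 1.2 V = 200 A
current_per_phase_a = total_current_a / PHASES  # 200 A / 8 = 25 A per stage
total_vrm_loss_w = LOSS_PER_STAGE_W * PHASES    # 8 x 3.5 W = 28 W total

print(f"{current_per_phase_a:.0f} A per phase, ~{total_vrm_loss_w:.0f} W VRM loss")
```

This 28 W figure is a per-stage conduction/switching loss, not a flat (1 - efficiency) fraction of total board power, which is how it can coexist with "95% efficient" headline numbers at lighter loads.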

Go to YouTube and search for Buildzoid's channel; he does a lot of PCB analysis and explains VRM power loss.


----------

