
ASRock Radeon RX 5600 XT Phantom Gaming D3

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
28,637 (3.74/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
ASRock's Radeon RX 5600 XT Phantom Gaming D3 is the first card that was designed from the ground up as an RX 5600 XT; it isn't just an RX 5700 XT design with the RX 5600 XT GPU slapped on. Thanks to a BIOS update, the card runs at the highest spec allowed by AMD yet remains cool and quiet.

 
Just a few little things.

In the power consumption tables there's no reference 5600 or 5600XT.

Something I'd personally love to see is an average framerate table that is broken out for each API.

Great review, though personally I like that AMD makes solid reference designs. Leave the junky reference builds to NV. Since the 5600 is a cut-down 5700, it makes monetary sense that the reference board is basically a 5700's. It also allows the AIBs to build custom cards like this one without a large price penalty.

I'm curious

Could you try setting the minimum fan speed to 10 or 20% and see if the card has that same response or if it ramps normally? That start-up response is suspicious. What's the max fan speed?
 
In the power consumption tables there's no reference 5600 or 5600XT.
There's no 5600 non-XT (supposedly an OEM model exists, but can't be found anywhere). For the XT I'm unsure which card I should designate "reference", I'm leaning towards XFX THICC II.
 
Great review, sir! Nice looking card as well.
 
I'm not sure it's valid to say the 5600 is on the same terms as the 1600 series. Turing, even without the ray tracing FPS killer, is still technologically superior; mesh shaders are the next requirement for future versions of Direct3D and the Khronos APIs (Vulkan/OpenGL), and RDNA1 lacks them.
 
People sing a different tune when it's AMD's hardware that has the advanced feature.
GCN 2 GPUs in 2014: better D3D12 performance and better hardware support, and the response was "who buys for the future? We buy for the games currently available."
HD 5000 series: "D3D11 is not relevant right now."
Six months ago: "who needs D3D12/Vulkan? D3D11 performance is enough." Now the same people are saying they won't buy because the GPUs don't support D3D12 Ultimate.
Ray tracing could become a gimmick IF future AMD GPUs perform better than Nvidia GPUs at ray tracing.
 
People sing a different tune when it's AMD's hardware that has the advanced feature.
GCN 2 GPUs in 2014: better D3D12 performance and better hardware support, and the response was "who buys for the future? We buy for the games currently available."
HD 5000 series: "D3D11 is not relevant right now."
Six months ago: "who needs D3D12/Vulkan? D3D11 performance is enough." Now the same people are saying they won't buy because the GPUs don't support D3D12 Ultimate.
Ray tracing could become a gimmick IF future AMD GPUs perform better than Nvidia GPUs at ray tracing.

There was really no point for DX12 to even exist initially; it brings nothing new aside from async compute, and that is hit or miss. Most games perform worse in the DX12 API than in DX11, with no visual difference. Even the RE3 remake, which just released, performs worse in DX12 than in DX11.
Really, DX12 is more about leveraging CPU performance, which nobody cares about anymore since AMD has brought their CPU performance up to parity with Intel, or better.
DX12 Ultimate, on the other hand, mainly focuses on GPU performance and better visuals; to me, that is what should be called the real DX12.
 
I fail to see how this GPU is as efficient as Turing, considering that this architecture is already running about as efficiently as it can.
Turing, on the other hand, you can overclock and run a 2060 at 127 W and get near (within 5% of) stock performance, at a reduction of 40-45 W from stock, or roughly a 33% efficiency gain.
I highly doubt AMD is even near this efficiency; granted, for mere gaming it wouldn't matter much (other than running hotter and thus boosting to lower FPS).
For deep learning, power consumption is vital. A GPU of this caliber running 24/7/365 under full load consumes about $40 more in electricity per year, and suddenly a 2060 makes much more sense; a 2060 Super, or a 2060 KO if it ran two years consecutively.
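
For reference, a quick sketch of the perf-per-watt arithmetic behind that claim. The ~170 W stock board power is my own assumption; the 127 W and "within 5%" figures are from the post above.

```python
# Rough perf-per-watt comparison for a tuned RTX 2060 vs. stock.
# The ~170 W stock board power is an assumption; the 127 W and
# "within 5%" figures come from the post above.
stock_power_w = 170.0   # assumed stock power draw
tuned_power_w = 127.0   # quoted power draw after tuning
tuned_perf = 0.95       # ~95% of stock performance

stock_eff = 1.0 / stock_power_w          # performance per watt at stock
tuned_eff = tuned_perf / tuned_power_w   # performance per watt when tuned

gain = tuned_eff / stock_eff - 1.0
print(f"Efficiency gain: {gain:.0%}")    # ~27% with these assumptions
```

With those numbers the gain comes out closer to ~27%; the ~33% figure corresponds to keeping performance at stock (170 / 127 ≈ 1.34).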
 
Turing, on the other hand, you can overclock and run a 2060 at 127 W and get near (within 5% of) stock performance, at a reduction of 40-45 W from stock, or roughly a 33% efficiency gain.
What's your result for UV OC on Navi 10?

other than running hotter and thus boosting to lower FPS
Temperature affects boost freq only on NVIDIA, not on AMD, because the algorithms are designed differently

For deep learning [...] suddenly a 2060 makes much more sense.
Not sure if deep learning makes sense on any card in this performance range. I'd also claim that if you do deep learning you'll be using NVIDIA anyway, because of CUDA and the rest of the ecosystem.
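
As a small illustration of the ecosystem point, most mainstream DL frameworks pick up NVIDIA GPUs through CUDA out of the box. PyTorch is used here purely as an example; the thread doesn't name a specific framework.

```python
# Minimal sketch: mainstream DL frameworks expose NVIDIA GPUs via CUDA
# out of the box. PyTorch is used only as an illustration.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Training would run on: {device}")

# AMD cards only show up via separate builds (e.g. ROCm on Linux),
# which is exactly the extra friction being discussed here.
```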
 
I still don't get why these are mentioned as pros:
- PCI-Express 4.0
- 7 nanometer production process.

It would make more sense to say that a card is vegan or kosher. :)

Other than that: you should really consider including all tested cards in the noise graphs.
You're comparing (mostly) high-end AIB coolers with reference designs, which means almost all tested cards land on top of the "competition".
A reader choosing between AIB options has to write down results from multiple reviews to actually learn something useful.

I know it may not be that important for an average TPU reader, but since you've already bothered to buy the B&K 2236...
I mean, the noise part of the review gives the impression that it's there just as a formality - because everyone else does it.
 
I fail to see how this GPU is as efficient as Turing, considering that this architecture is already running about as efficiently as it can.
Turing, on the other hand, you can overclock and run a 2060 at 127 W and get near (within 5% of) stock performance, at a reduction of 40-45 W from stock, or roughly a 33% efficiency gain.
I highly doubt AMD is even near this efficiency; granted, for mere gaming it wouldn't matter much (other than running hotter and thus boosting to lower FPS).
For deep learning, power consumption is vital. A GPU of this caliber running 24/7/365 under full load consumes about $40 more in electricity per year, and suddenly a 2060 makes much more sense; a 2060 Super, or a 2060 KO if it ran two years consecutively.

Cool story, dude, but this is a consumer GPU. If you're buying it for professional or scientific use, you're in the wrong place fiddling with undervolts and overclocks in the first place.
 
I fail to see how this GPU is as efficient as Turing, considering that this architecture is already running about as efficiently as it can.
It's more or less as efficient as Turing on stock settings, which is what matters for most users.
Turing, on the other hand, you can overclock and run a 2060 at 127 W and get near (within 5% of) stock performance, at a reduction of 40-45 W from stock, or roughly a 33% efficiency gain.
You can throw it away for 100% less power consumption. Gaming isn't productive anyway. Read a book.
For deep learning, power consumption is vital. A GPU of this caliber running 24/7/365 under full load consumes about $40 more in electricity per year, and suddenly a 2060 makes much more sense; a 2060 Super, or a 2060 KO if it ran two years consecutively.
A 2060 running non-stop for deep learning seems like such a bad idea. Why would you?
If you're into competitive DL and you need a local GPU, get a more powerful card.
If you're learning DL, move to cloud. Running non-stop on a mid-range card at home doesn't make much sense.

Also, as @Jism already said, for DL Nvidia is the obvious choice right now. Too much fuss to make it work on AMD. Not worth it.
Unless of course you're on a Mac, where you're limited to AMD, so still no choice...
 
Nice cooler with the low noise levels.
 
Nice cooler with the low noise levels.
Well, it's just big. Huge heatsink, 3 fans. Probably roughly the same one they put on 5700XT Phantom Gaming.

Given the price and performance difference, I bet gamers with large ATX systems will go for the 5700XT anyway.
And those who are limited by the size or power consumption will look for a smaller 5600XT.
 
Given the price and performance difference, I bet gamers with large ATX systems will go for the 5700XT anyway.
5700 XT is 20% faster, but 40% more expensive
 
5700 XT is 20% faster, but 40% more expensive
Of course, but will gamers really look at that? I honestly don't know.

The 5700XT is ~$100 more expensive, which is probably around 10% of the price of a typical PC with such a card.
For that 10% you get a GPU that's much more comfortable at 4K and will last for a generation longer.

If I were in that situation and I expected to keep gaming for another 3+ years, I'd go for the 5700XT. :)
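
A rough back-of-envelope version of that system-level argument. The ~$1,000 baseline build price is my own assumption for illustration; the ~$100 card premium and ~20% performance gap are from the posts above.

```python
# Back-of-envelope system-level cost of stepping up to the 5700 XT.
# The ~$1,000 baseline build price is an assumption for illustration;
# the ~$100 premium and ~20% performance gap come from the posts above.
base_system_usd = 1000.0   # assumed price of a typical PC with a 5600 XT
gpu_premium_usd = 100.0    # quoted extra cost of the 5700 XT
perf_gain = 0.20           # quoted performance advantage

system_premium = gpu_premium_usd / base_system_usd
print(f"Extra system cost: {system_premium:.0%}")   # ~10%
print(f"Extra performance: {perf_gain:.0%}")        # ~20%
```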
 
So ASRock designs their own PCB to save costs... then straps a ridiculously, unnecessarily large and expensive cooler on it. If they were smart, they would've used a cooler like Sapphire did with the Pulse, as pretty much every review shows that the 5600 XT is a very cool chip even under load (it also doesn't overclock very much).

Personally I'd prefer to see small 5600XT cards, not these overbuilt monstrosities.
 
Would you consider the lack of other DX12 Ultimate features (besides ray tracing) a major con as well? No Variable Rate Shading, no mesh shaders. That would make Navi gen 1 fare quite poorly in titles that utilize these performance-boosting features of DX12 Ultimate.
 
Well, I think it's NOT faster than the RTX 2060.
The reason is that this 5600 XT is a sky-high, top-OC'd model, but as usual TechPowerUp compares it against the Nvidia reference model, i.e. the RTX 2060 Founders Edition.

So let's take the ASUS Strix Advanced OC model, and then we see that the Phantom isn't even close...
There are tests in TechPowerUp's GPU section...

Also, I think it's clear that the 5600 XT is the same GPU as the RX 5700; even the power consumption shows it: very high, and very, very high for a 7 nm GPU.
Meaning the RTX 2060 is better on power consumption.


The price also ruins it a lot; it was the best option, or was it?


Hmmm, naah.


Hmm #2: let's see when Nvidia also releases 7 nm Ampere; an RTX 3060 will be more than enough, even against the RX 5700 XT... let's see.
 
There's no 5600 non-XT (supposedly an OEM model exists, but can't be found anywhere). For the XT I'm unsure which card I should designate "reference", I'm leaning towards XFX THICC II.

I kind of figured. I know there's a spec for a non-XT, but they don't seem to be getting much air time.

That makes sense, and the XFX does seem to be sporting a very reference-style PCB, so I would support that choice.
 
It's more or less as efficient as Turing on stock settings, which is what matters for most users.

You can throw it away for 100% less power consumption. Gaming isn't productive anyway. Read a book.

A 2060 running non-stop for deep learning seems like such a bad idea. Why would you?
If you're into competitive DL and you need a local GPU, get a more powerful card.
If you're learning DL, move to cloud. Running non-stop on a mid-range card at home doesn't make much sense.

Also, as @Jism already said, for DL Nvidia is the obvious choice right now. Too much fuss to make it work on AMD. Not worth it.
Unless of course you're on a Mac, where you're limited to AMD, so still no choice...


Eh, no? Plenty of researchers use the RTX 2060 for CUDA- or TensorFlow-accelerated work.

NOBODY in their right mind will try to use an AMD GPU for ANY computation work except mining crypto-coins. OpenCL support on Navi is a shit show. On top of that, there are very few toolkits developed on OpenCL. Just no.

 
People sing a different tune when it's AMD's hardware that has the advanced feature.
GCN 2 GPUs in 2014: better D3D12 performance and better hardware support, and the response was "who buys for the future? We buy for the games currently available."
HD 5000 series: "D3D11 is not relevant right now."
Six months ago: "who needs D3D12/Vulkan? D3D11 performance is enough." Now the same people are saying they won't buy because the GPUs don't support D3D12 Ultimate.
Ray tracing could become a gimmick IF future AMD GPUs perform better than Nvidia GPUs at ray tracing.
Ray tracing is already a gimmick, and has been since day one.
People who bought GCN 1 and 2 back in the day still get the best driver support, unlike Kepler owners. That won't happen with RDNA1, because it just seems to be a patched-together GCN, while RDNA2 was already being worked on for the consoles, with mesh shaders and ray tracing.
 
Ray tracing is already a gimmick, and has been since day one.
People who bought GCN 1 and 2 back in the day still get the best driver support, unlike Kepler owners. That won't happen with RDNA1, because it just seems to be a patched-together GCN, while RDNA2 was already being worked on for the consoles, with mesh shaders and ray tracing.
Just like with D3D12, the D3D12 Ultimate games requiring those features will come 5-10 years later.
 
It's still quite a bad idea to name your current-gen cards as something next-gen when they have nothing new, only driver bugs.
 
A 2060 running non-stop for deep learning seems like such a bad idea. Why would you?
If you're into competitive DL and you need a local GPU, get a more powerful card.
If you're learning DL, move to cloud. Running non-stop on a mid-range card at home doesn't make much sense.

Also, as @Jism already said, for DL Nvidia is the obvious choice right now. Too much fuss to make it work on AMD. Not worth it.
Unless of course you're on a Mac, where you're limited to AMD, so still no choice...
I don't run a 2060 (I mean, I did, but now I only run 2080 Tis), but taking the initial purchase price into consideration, the KO/Super or regular 2060 is definitely a better buy at those price points.
Amazon cloud computing would cost me $8k a year (they charge $1 an hour for a regular GPU server, and about $800 for a quad-core CPU server).
A similar system at my home (minus the initial purchase price) runs for less than $500 a year ($200 for the CPU).
The initial purchase price of the CPU server is about $750-800; the GPU box costs about $2.5k-3.5k depending on how many GPUs you run and how cheap you can get them.
Both these systems will be heaps faster than Amazon (or Google).
No licenses or running cost surcharges to pay.
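
To make that comparison concrete, a small sketch using the dollar figures quoted above; the 3-year amortization window is my own assumption.

```python
# Back-of-envelope: cloud GPU rental vs. a home deep-learning box,
# using the dollar figures quoted in the post above. The 3-year
# amortization window is an assumption for illustration.
cloud_per_year = 1.0 * 24 * 365    # $1/hour GPU server -> ~$8,760/year
home_running_per_year = 500.0      # quoted yearly running cost at home
home_hardware = 3000.0             # mid-point of the quoted $2.5k-3.5k build
years = 3                          # assumed useful life of the hardware

cloud_total = cloud_per_year * years
home_total = home_running_per_year * years + home_hardware
print(f"Cloud over {years} years:    ${cloud_total:,.0f}")   # ~$26,280
print(f"Home box over {years} years: ${home_total:,.0f}")    # ~$4,500
```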
I think I'm pretty good with what I have and know.
Made it to the top 20 contributors of FAH, and top 8 on Boinc. That's out of 2-4M clients.
 