
ASRock Radeon RX 5600 XT Phantom Gaming D3

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,870 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
ASRock's Radeon RX 5600 XT Phantom Gaming D3 is the first card that was designed from the ground up as an RX 5600 XT; it isn't just an RX 5700 XT design with the RX 5600 XT GPU slapped on. Thanks to a BIOS update, the card runs at the highest spec allowed by AMD yet remains cool and quiet.

Show full review
 
Joined
Aug 8, 2019
Messages
430 (0.22/day)
System Name R2V2 *In Progress
Processor Ryzen 7 2700
Motherboard Asrock X570 Taichi
Cooling W2A... water to air
Memory G.Skill Trident Z3466 B-die
Video Card(s) Radeon VII repaired and resurrected
Storage Adata and Samsung NVME
Display(s) Samsung LCD
Case Some ThermalTake
Audio Device(s) Asus Strix RAID DLX upgraded op amps
Power Supply Seasonic Prime something or other
Software Windows 10 Pro x64
Just a few little things.

In the power consumption tables there's no reference 5600 or 5600XT.

Something I'd personally love to see is an average framerate table that is broken out for each API.

Great review, though personally I like that AMD makes solid reference designs; leave the junky reference builds to NV. Since the 5600 is a cut-down 5700, it makes financial sense for the reference board to basically be a 5700 board. It also lets AIBs build custom cards like this one without a large price penalty.

I'm curious: could you try setting the minimum fan speed to 10 or 20% and see whether the card shows that same response, or whether it ramps normally? That start-up behavior is suspicious. What's the max fan speed?
 
Last edited:

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,870 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
In the power consumption tables there's no reference 5600 or 5600XT.
There's no 5600 non-XT (supposedly an OEM model exists, but it can't be found anywhere). For the XT I'm unsure which card to designate as "reference"; I'm leaning towards the XFX THICC II.
 
Joined
Oct 1, 2014
Messages
1,983 (0.53/day)
Location
Calabash, NC
System Name The Captain (2.0)
Processor Ryzen 7 7700X
Motherboard Asus ROG Strix X670E-A
Cooling 280mm Arctic Liquid Freezer II, 4x Be Quiet! 140mm Silent Wings 4 (1x exhaust 3x intake)
Memory 32GB (2x16) Kingston Fury Beast CL30 6000MT/s
Video Card(s) MSI GeForce RTX 3070 SUPRIM X
Storage 1x Crucial MX500 500GB SSD; 1x Crucial MX500 500GB M.2 SSD; 1x WD Blue HDD, 1x Crucial P5 Plus
Display(s) Aorus CV27F 27" 1080p 165Hz
Case Phanteks Evolv X (Anthracite Gray)
Power Supply Corsair RMx (2021) 1000W 80-Plus Gold
Mouse Varies based on mood/task; is currently Razer Basilisk V3 Pro or Razer Cobra Pro
Keyboard Varies based on mood; currently Razer Blackwidow V4 75% and Hyper X Alloy 65
Great review, sir! Nice looking card as well.
 
Joined
Oct 2, 2015
Messages
3,152 (0.94/day)
Location
Argentina
System Name Ciel / Akane
Processor AMD Ryzen R5 5600X / Intel Core i3 12100F
Motherboard Asus Tuf Gaming B550 Plus / Biostar H610MHP
Cooling ID-Cooling 224-XT Basic / Stock
Memory 2x 16GB Kingston Fury 3600MHz / 2x 8GB Patriot 3200MHz
Video Card(s) Gainward Ghost RTX 3060 Ti / Dell GTX 1660 SUPER
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB / NVMe WD Blue SN550 512GB
Display(s) AOC Q27G3XMN / Samsung S22F350
Case Cougar MX410 Mesh-G / Generic
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W / Gigabyte P450B
Mouse EVGA X15 / Logitech G203
Keyboard VSG Alnilam / Dell
Software Windows 11
I'm not sure it's valid to say the 5600 is on the same footing as the 1600 series. Turing, even without the ray-tracing FPS killer, is still technologically superior: mesh shaders are the next requirement for future versions of Direct3D and the Khronos APIs (Vulkan/OpenGL), and RDNA1 lacks them.
 
Joined
Nov 24, 2017
Messages
853 (0.33/day)
Location
Asia
Processor Intel Core i5 4590
Motherboard Gigabyte Z97x Gaming 3
Cooling Intel Stock Cooler
Memory 8GiB(2x4GiB) DDR3-1600 [800MHz]
Video Card(s) XFX RX 560D 4GiB
Storage Transcend SSD370S 128GB; Toshiba DT01ACA100 1TB HDD
Display(s) Samsung S20D300 20" 768p TN
Case Cooler Master MasterBox E501L
Audio Device(s) Realtek ALC1150
Power Supply Corsair VS450
Mouse A4Tech N-70FX
Software Windows 10 Pro
Benchmark Scores BaseMark GPU : 250 Point in HD 4600
People sing a different tune whenever it's AMD's hardware that has the advanced features.
GCN 2 GPUs had better D3D12 performance and better hardware support in 2014: "Who buys for the future? We buy for the games currently available."
HD 5000 series: "D3D11 isn't relevant right now."
Six months ago: "Who needs D3D12/Vulkan? D3D11 performance is enough." Now the same people are saying they won't buy because the GPUs don't support D3D12 Ultimate.
Ray tracing could suddenly become a gimmick IF future AMD GPUs perform better than NVIDIA GPUs at ray tracing.
 
Joined
Nov 11, 2016
Messages
3,419 (1.16/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
People sing a different tune whenever it's AMD's hardware that has the advanced features.
GCN 2 GPUs had better D3D12 performance and better hardware support in 2014: "Who buys for the future? We buy for the games currently available."
HD 5000 series: "D3D11 isn't relevant right now."
Six months ago: "Who needs D3D12/Vulkan? D3D11 performance is enough." Now the same people are saying they won't buy because the GPUs don't support D3D12 Ultimate.
Ray tracing could suddenly become a gimmick IF future AMD GPUs perform better than NVIDIA GPUs at ray tracing.

There was really no point for DX12 to even exist initially; aside from async compute, which is hit or miss, it brought nothing new. Most games perform worse under the DX12 API than under DX11, with no visual difference. Even the RE3 remake, which was just released, performs worse in DX12 than in DX11.
Really, DX12 is more about leveraging CPU performance, which nobody cares about anymore now that AMD has brought their CPU performance up to parity with Intel, or beyond.
DX12 Ultimate, on the other hand, mainly focuses on GPU performance and better visuals; for me, that's what should be called the real DX12.
 

ProDigit

New Member
Joined
Jul 10, 2019
Messages
7 (0.00/day)
I fail to see how this GPU is as efficient as Turing, considering that this architecture is already running about as efficiently as it can.
Turing, on the other hand, you can tune: run a 2060 at 127 W and get near-stock performance (within 5% of stock), at a reduction of 40-45 W, or roughly a 33% efficiency gain.
I highly doubt AMD is anywhere near this efficiency, though granted, for mere gaming it wouldn't matter much (other than running hotter and thus boosting lower).
For deep learning, power consumption is vital. A GPU of this caliber running 24/7/365 under full load consumes $40 more in electricity per year, and suddenly a 2060 makes much more sense; a 2060 Super, or a 2060 KO if it runs two years straight.
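Back-of-the-envelope on that 33% figure, as a sketch (the ~160 W stock board power is my assumption; exact draw varies by card and game):

```python
# Rough perf-per-watt math for an undervolted/overclocked RTX 2060.
# Assumed numbers: ~160 W stock board power, tuned profile at 127 W
# retaining ~95% of stock performance.
stock_power = 160.0    # W, assumed stock draw
tuned_power = 127.0    # W, tuned profile
tuned_perf = 0.95      # fraction of stock performance retained

stock_eff = 1.0 / stock_power         # performance per watt at stock
tuned_eff = tuned_perf / tuned_power  # performance per watt tuned

gain = tuned_eff / stock_eff - 1.0
print(f"Efficiency gain: {gain:.0%}")  # ~20% with these assumptions
```

With 160 W stock the gain lands nearer 20%; the full 33% only works out if stock draw is closer to ~175-180 W.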
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,870 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Turing, on the other hand, you can tune: run a 2060 at 127 W and get near-stock performance (within 5% of stock), at a reduction of 40-45 W, or roughly a 33% efficiency gain.
What's your result for UV OC on Navi 10?

other than running hotter and thus boosting lower
Temperature affects boost frequency only on NVIDIA, not on AMD, because the algorithms are designed differently (toy sketch at the end of this post).

For deep learning [...] suddenly a 2060 makes much more sense.
Not sure if deep learning makes sense on any card in this performance range. I'd also claim that if you do deep learning you'll be using NVIDIA anyway, because of CUDA and the rest of the ecosystem.
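On the boost behavior, a toy sketch of the difference (the thresholds and offsets are invented purely for illustration; this is not either vendor's actual algorithm):

```python
# Toy model of the boost-behavior difference described above; the
# numbers are made up purely to show the shape of each behavior.

def nvidia_boost_mhz(base_boost: float, temp_c: float) -> float:
    """GPU Boost-style: clock steps down as temperature rises."""
    if temp_c <= 40:
        return base_boost
    steps = int((temp_c - 40) / 5)   # one (made-up) bin per 5 degC
    return base_boost - 15 * steps   # ~15 MHz per bin

def navi_boost_mhz(base_boost: float, power_w: float,
                   power_limit_w: float) -> float:
    """Navi-style in this model: clock follows the power budget,
    largely independent of temperature below the throttle point."""
    return base_boost * min(1.0, power_limit_w / power_w)

for t in (50, 65, 80):
    print(f"{t} degC: NV ~{nvidia_boost_mhz(1900, t):.0f} MHz, "
          f"AMD ~{navi_boost_mhz(1750, 150, 150):.0f} MHz")
```

The point being: the NVIDIA column drifts down as temperature rises, while the AMD column only moves if the power numbers change.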
 
Joined
Jun 28, 2016
Messages
3,595 (1.17/day)
I still don't get why these are mentioned as pros:
- PCI-Express 4.0
- 7 nanometer production process.

It would make more sense to say that a card is vegan or kosher. :)

Other than that: you should really consider including all tested cards in the noise graphs.
You're comparing (mostly) high-end AIB coolers with reference designs, which means almost all tested cards land on top of the "competition".
A reader choosing between AIB options has to collect results from multiple reviews to actually learn something useful.

I know it may not be that important for the average TPU reader, but since you've already bothered to buy the B&K 2236...
I mean, the noise part of the review gives the impression that it's included just as a formality, because everyone else does it.
 
Joined
Dec 30, 2010
Messages
2,199 (0.43/day)
I fail to see how this GPU is as efficient as Turing, considering that this architecture is already running about as efficiently as it can.
Turing, on the other hand, you can tune: run a 2060 at 127 W and get near-stock performance (within 5% of stock), at a reduction of 40-45 W, or roughly a 33% efficiency gain.
I highly doubt AMD is anywhere near this efficiency, though granted, for mere gaming it wouldn't matter much (other than running hotter and thus boosting lower).
For deep learning, power consumption is vital. A GPU of this caliber running 24/7/365 under full load consumes $40 more in electricity per year, and suddenly a 2060 makes much more sense; a 2060 Super, or a 2060 KO if it runs two years straight.

Cool story, dude, but this is a consumer GPU. If you're buying it for professional or scientific usage, you're in the wrong place to begin with, undervolts and overclocks or not.
 
Joined
Jun 28, 2016
Messages
3,595 (1.17/day)
I fail to see how this GPU is as efficient as Turing, considering that this architecture is already running about as efficiently as it can.
It's more or less as efficient as Turing on stock settings, which is what matters for most users.
Turing, on the other hand, you can tune: run a 2060 at 127 W and get near-stock performance (within 5% of stock), at a reduction of 40-45 W, or roughly a 33% efficiency gain.
You can throw it away for 100% less power consumption. Gaming isn't productive anyway. Read a book.
For deep learning, power consumption is vital. A GPU of this caliber running 24/7/365 under full load consumes $40 more in electricity per year, and suddenly a 2060 makes much more sense; a 2060 Super, or a 2060 KO if it runs two years straight.
A 2060 running non-stop for deep learning seems like such a bad idea. Why would you?
If you're into competitive DL and you need a local GPU, get a more powerful card.
If you're learning DL, move to cloud. Running non-stop on a mid-range card at home doesn't make much sense.

Also, as @Jism already said, for DL Nvidia is the obvious choice right now. Too much fuss to make it work on AMD. Not worth it.
Unless of course you're on a Mac, where you're limited to AMD, so still no choice...
 
Joined
Apr 8, 2010
Messages
1,011 (0.19/day)
Processor Intel Core i5 8400
Motherboard Gigabyte Z370N-Wifi
Cooling Silverstone AR05
Memory Micron Crucial 16GB DDR4-2400
Video Card(s) Gigabyte GTX1080 G1 Gaming 8G
Storage Micron Crucial MX300 275GB
Display(s) Dell U2415
Case Silverstone RVZ02B
Power Supply Silverstone SSR-SX550
Keyboard Ducky One Red Switch
Software Windows 10 Pro 1909
Nice cooler with the low noise levels.
 
Joined
Jun 28, 2016
Messages
3,595 (1.17/day)
Nice cooler with the low noise levels.
Well, it's just big. Huge heatsink, 3 fans. Probably roughly the same one they put on 5700XT Phantom Gaming.

Given the price and performance difference, I bet gamers with large ATX systems will go for the 5700XT anyway.
And those who are limited by the size or power consumption will look for a smaller 5600XT.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,870 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Given the price and performance difference, I bet gamers with large ATX systems will go for the 5700XT anyway.
5700 XT is 20% faster, but 40% more expensive
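In perf-per-dollar terms (just arithmetic on those two numbers):

```python
# Value ratio implied by the numbers above:
# 5700 XT = 1.20x the performance at 1.40x the price of this 5600 XT.
perf_ratio = 1.20
price_ratio = 1.40
print(f"Relative perf per dollar: {perf_ratio / price_ratio:.2f}x")  # ~0.86x
```

So the 5700 XT is roughly 14% worse value per dollar, looking at the card alone.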
 
Joined
Jun 28, 2016
Messages
3,595 (1.17/day)
5700 XT is 20% faster, but 40% more expensive
Of course, but will gamers really look at that? I honestly don't know.

The 5700 XT is ~$100 more expensive, which is probably around 10% of the cost of a typical PC built around such a card (rough numbers at the end of this post).
For that 10% you get a GPU that's far more confident at 4K and will last a generation longer.

If I were in that situation and I expected to keep gaming for another 3+ years, I'd go for the 5700XT. :)
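As rough numbers (the $1,000 total build cost is just my round figure):

```python
# The same $100 delta viewed at whole-system level.
system_cost = 1000.0   # assumed total build cost, my round figure
gpu_delta = 100.0      # 5700 XT premium over this 5600 XT
perf_gain = 0.20       # ~20% faster per the review

print(f"Total cost up {gpu_delta / system_cost:.0%} "
      f"for {perf_gain:.0%} more GPU performance")
```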
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
So ASRock designs their own PCB to save costs... then straps a ridiculously, unnecessarily large and expensive cooler onto it. If they were smart they would've used a cooler like Sapphire did with the Pulse, as pretty much every review shows that the 5600 XT is a very cool chip even at load (and it doesn't overclock very much either).

Personally I'd prefer to see small 5600 XT cards, not these overbuilt monstrosities.
 
Joined
Mar 18, 2008
Messages
5,717 (0.94/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
Would you consider the lack of the other DX12 Ultimate features (besides RTX) a major con as well? No Variable Rate Shading, no mesh shaders. That would make Navi Gen 1 fare quite poorly in titles that utilize these performance-boosting DX12 Ultimate features.
 
Joined
Mar 21, 2020
Messages
77 (0.04/day)
Well, I think it's NOT faster than the RTX 2060.
The reason is that this 5600 XT is a sky-high, top-OC'd model, yet as usual TechPowerUp compares it against NVIDIA's reference model, i.e., the RTX 2060 FE.

So let's take an Asus Strix Advanced OC model instead, and then we see that this Phantom isn't even close... the tests are in TechPowerUp's GPU section.

Also, I think it's clear that the 5600 XT is the same GPU as the RX 5700; even the power consumption shows it: very high, and very, very high for a 7 nm GPU. Meaning the RTX 2070 comes out better on power consumption.

The price also ruins it a lot; it's the best option, or rather it was.

Hmm, nah.

Hmm #2: let's see when NVIDIA also releases 7 nm Ampere; an RTX 3060 will be more than enough, even against the RX 5700 XT... let's see.
 
Joined
Aug 8, 2019
Messages
430 (0.22/day)
System Name R2V2 *In Progress
Processor Ryzen 7 2700
Motherboard Asrock X570 Taichi
Cooling W2A... water to air
Memory G.Skill Trident Z3466 B-die
Video Card(s) Radeon VII repaired and resurrected
Storage Adata and Samsung NVME
Display(s) Samsung LCD
Case Some ThermalTake
Audio Device(s) Asus Strix RAID DLX upgraded op amps
Power Supply Seasonic Prime something or other
Software Windows 10 Pro x64
There's no 5600 non-XT (supposedly an OEM model exists, but it can't be found anywhere). For the XT I'm unsure which card to designate as "reference"; I'm leaning towards the XFX THICC II.

I kind of figured; I know there's a spec for a non-XT, but they don't seem to be getting much airtime.

That makes sense, and the XFX does seem to be sporting a very reference-style PCB, so I would support that choice.
 
Joined
Mar 18, 2008
Messages
5,717 (0.94/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
It's more or less as efficient as Turing on stock settings, which is what matters for most users.

You can throw it away for 100% less power consumption. Gaming isn't productive anyway. Read a book.

A 2060 running non-stop for deep learning seems like such a bad idea. Why would you?
If you're into competitive DL and you need a local GPU, get a more powerful card.
If you're learning DL, move to cloud. Running non-stop on a mid-range card at home doesn't make much sense.

Also, as @Jism already said, for DL Nvidia is the obvious choice right now. Too much fuss to make it work on AMD. Not worth it.
Unless of course you're on a Mac, where you're limited to AMD, so still no choice...


Eh, no? Plenty of researchers use the RTX 2060 for CUDA- or TensorFlow-accelerated work.

NOBODY in their right mind will try to use an AMD GPU for ANY computational work except mining crypto-coins. OpenCL support on Navi is a shit show. On top of that, there are very few toolkits developed on OpenCL. Just no.

 
Joined
Oct 2, 2015
Messages
3,152 (0.94/day)
Location
Argentina
System Name Ciel / Akane
Processor AMD Ryzen R5 5600X / Intel Core i3 12100F
Motherboard Asus Tuf Gaming B550 Plus / Biostar H610MHP
Cooling ID-Cooling 224-XT Basic / Stock
Memory 2x 16GB Kingston Fury 3600MHz / 2x 8GB Patriot 3200MHz
Video Card(s) Gainward Ghost RTX 3060 Ti / Dell GTX 1660 SUPER
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB / NVMe WD Blue SN550 512GB
Display(s) AOC Q27G3XMN / Samsung S22F350
Case Cougar MX410 Mesh-G / Generic
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W / Gigabyte P450B
Mouse EVGA X15 / Logitech G203
Keyboard VSG Alnilam / Dell
Software Windows 11
People sing a different tune whenever it's AMD's hardware that has the advanced features.
GCN 2 GPUs had better D3D12 performance and better hardware support in 2014: "Who buys for the future? We buy for the games currently available."
HD 5000 series: "D3D11 isn't relevant right now."
Six months ago: "Who needs D3D12/Vulkan? D3D11 performance is enough." Now the same people are saying they won't buy because the GPUs don't support D3D12 Ultimate.
Ray tracing could suddenly become a gimmick IF future AMD GPUs perform better than NVIDIA GPUs at ray tracing.
Ray tracing is already a gimmick; it has been since day one.
People who bought GCN 1 and 2 back in the day still get the best driver support, unlike Kepler owners. That doesn't happen with RDNA1, because it just seems to be patched-together GCN, while RDNA2 was already up and running for the consoles, with mesh shaders and ray tracing.
 
Joined
Nov 24, 2017
Messages
853 (0.33/day)
Location
Asia
Processor Intel Core i5 4590
Motherboard Gigabyte Z97x Gaming 3
Cooling Intel Stock Cooler
Memory 8GiB(2x4GiB) DDR3-1600 [800MHz]
Video Card(s) XFX RX 560D 4GiB
Storage Transcend SSD370S 128GB; Toshiba DT01ACA100 1TB HDD
Display(s) Samsung S20D300 20" 768p TN
Case Cooler Master MasterBox E501L
Audio Device(s) Realtek ALC1150
Power Supply Corsair VS450
Mouse A4Tech N-70FX
Software Windows 10 Pro
Benchmark Scores BaseMark GPU : 250 Point in HD 4600
Ray tracing is already a gimmick; it has been since day one.
People who bought GCN 1 and 2 back in the day still get the best driver support, unlike Kepler owners. That doesn't happen with RDNA1, because it just seems to be patched-together GCN, while RDNA2 was already up and running for the consoles, with mesh shaders and ray tracing.
Just like with D3D12, games requiring those D3D12 Ultimate features will come 5-10 years later.
 
Joined
Oct 2, 2015
Messages
3,152 (0.94/day)
Location
Argentina
System Name Ciel / Akane
Processor AMD Ryzen R5 5600X / Intel Core i3 12100F
Motherboard Asus Tuf Gaming B550 Plus / Biostar H610MHP
Cooling ID-Cooling 224-XT Basic / Stock
Memory 2x 16GB Kingston Fury 3600MHz / 2x 8GB Patriot 3200MHz
Video Card(s) Gainward Ghost RTX 3060 Ti / Dell GTX 1660 SUPER
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB / NVMe WD Blue SN550 512GB
Display(s) AOC Q27G3XMN / Samsung S22F350
Case Cougar MX410 Mesh-G / Generic
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W / Gigabyte P450B
Mouse EVGA X15 / Logitech G203
Keyboard VSG Alnilam / Dell
Software Windows 11
It's still quite a bad idea to name your current-gen cards as something next-gen when they bring nothing new, only driver bugs.
 

ProDigit

New Member
Joined
Jul 10, 2019
Messages
7 (0.00/day)
A 2060 running non-stop for deep learning seems like such a bad idea. Why would you?
If you're into competitive DL and you need a local GPU, get a more powerful card.
If you're learning DL, move to cloud. Running non-stop on a mid-range card at home doesn't make much sense.

Also, as @Jism already said, for DL Nvidia is the obvious choice right now. Too much fuss to make it work on AMD. Not worth it.
Unless of course you're on a Mac, where you're limited to AMD, so still no choice...
I don't run a 2060 (I mean, I did, but now I only run 2080 Tis), but taking the initial purchase price into consideration, the KO, Super, or regular 2060 is definitely a better buy at those price points.
Amazon cloud computing would cost me $8k a year (they charge $1 an hour for a regular GPU server, and about $800 for a quad-core CPU server).
A similar system at my home (minus the initial purchase price) runs for less than $500 a year ($200 for the CPU); see the sketch below.
The initial purchase price of the CPU server is about $750-800; the GPUs cost about $2.5k-3.5k depending on how many you run and how cheap you can get them.
Both of these systems will be heaps faster than Amazon (or Google).
No licenses or running-cost surcharges to pay.
I think I'm pretty good with what I have and know.
I made it to the top 20 contributors on Folding@home, and top 8 on BOINC. That's out of 2-4M clients.
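The yearly math behind the cloud-vs-home comparison, as a sketch (the cloud rate is the one quoted above; the home power draw and electricity price are my assumptions):

```python
# Rough yearly running-cost comparison for 24/7 compute.
HOURS_PER_YEAR = 24 * 365              # 8760

cloud_gpu_rate = 1.00                  # $/h, GPU server rate quoted above
cloud_gpu_year = cloud_gpu_rate * HOURS_PER_YEAR

home_draw_w = 550                      # assumed average rig draw at load
kwh_price = 0.10                       # assumed $/kWh
home_gpu_year = home_draw_w / 1000 * HOURS_PER_YEAR * kwh_price

print(f"Cloud GPU: ~${cloud_gpu_year:,.0f}/yr")        # ~$8,760
print(f"Home electricity: ~${home_gpu_year:,.0f}/yr")  # ~$482
```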
 