
SAPPHIRE Launches Their RX Vega Nitro+ Series of Graphics Cards

Joined
Mar 7, 2010
Messages
994 (0.18/day)
Location
Michigan
System Name Daves
Processor AMD Ryzen 3900x
Motherboard AsRock X570 Taichi
Cooling Enermax LIQMAX III 360
Memory 32 GiG Team Group B Die 3600
Video Card(s) Powercolor 5700 xt Red Devil
Storage Crucial MX 500 SSD and Intel P660 NVME 2TB for games
Display(s) Acer 144htz 27in. 2560x1440
Case Phanteks P600S
Audio Device(s) N/A
Power Supply Corsair RM 750
Mouse EVGA
Keyboard Corsair Strafe
Software Windows 10 Pro
Many previous testers claim exactly what I wrote, but I found a new video with the Strix 56 tested. Watch carefully:


But efficiency goes down for Pascal when doing that, while Vega's goes up. And Vega 56 can gain 10-20% in performance depending on the game or application. Search and you will find that around the web. Why do you make me write the same things again and again? I am sure you can understand what I write. The truth won't change just because one person prefers one company's product over another's.
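The perf-per-watt claim above can be sketched in a few lines. All the fps and wattage figures here are invented for illustration, not measurements from any review:

```python
# Hypothetical numbers only -- not measured data.
# Illustrates the claim: a tuned Vega gains performance while drawing
# less power, so efficiency improves on both axes at once.

def perf_per_watt(fps: float, watts: float) -> float:
    """Simple efficiency metric: frames per second per watt."""
    return fps / watts

stock = perf_per_watt(fps=75.0, watts=220.0)   # assumed stock Vega 56
tuned = perf_per_watt(fps=85.0, watts=190.0)   # assumed after UV + power-limit tune

gain = (tuned / stock - 1) * 100
print(f"efficiency gain: {gain:.0f}%")  # ~31% with these assumed numbers
```

The point is only that a simultaneous fps gain and power drop compounds in the perf/watt ratio, which is why tuned-Vega efficiency numbers look so much better than stock.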

I just can't watch that guy, he takes so long to get to the point.
 
Joined
Jun 28, 2016
Messages
3,595 (1.15/day)
As for the efficiency increase from tuning being the same for both Vega and NVIDIA GPUs: in Vega's case you will get MORE performance for LESS power draw when tuned properly, while with NVIDIA you LOSE performance when you lower the consumption. So, while Vega isn't well optimised by AMD and has more potential than its stock settings show, NVIDIA is already very close to optimal efficiency. A big difference, IMHO, due to the R&D money they have available. I wonder how many times that needs to be explained...
So you're essentially saying that AMD didn't know how to optimize their products, but home geeks can do that better.
That is... brave.

I have a different theory for you:
AMD, in order to lower production costs, has been releasing products with high quality variance (compared to more expensive competition).
So yes, this means that the factory voltage and clocks are on the safe side. And you can often pull a lot more from your AMD product.
Obviously, some samples won't cope very well with this.
The big difference between overclocking and undervolting is that the former is much easier to control: if the card runs too hot, you just ease off the OC (or the card does it automatically).
Undervolting results in operational errors and instability, which is... well... much worse. :)
 
Joined
Aug 6, 2017
Messages
7,412 (2.74/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
Like I said in my edited post, that's because the card ships with lower-than-advertised clocks due to the higher voltage.

What? No. You need some practical experience instead of reading benches, forums and reviews, so you can see this in action.
The TPU database says 1590MHz is the boost clock for Vega 56, yet the Vega 56 Strix review I've been referring to says it's 1499 out of the box. It only reaches the advertised 1590MHz after voltage tuning, so I'm absolutely right. What practical experience do I need to read two numbers and tell which one is lower?

So you're essentially saying that AMD didn't know how to optimize their products, but home geeks can do that better.
That is... brave.

I have a different theory for you:
AMD, in order to lower production costs, has been releasing products with high quality variance (compared to more expensive competition).
So yes, this means that the factory voltage and clocks are on the safe side. And you can often pull a lot more from your AMD product.
Obviously, some samples won't cope very well with this.
The big difference between overclocking and undervolting is that the former is much easier to control: if the card runs too hot, you just ease off the OC (or the card does it automatically).
Undervolting results in operational errors and instability, which is... well... much worse. :)

exactly.
 
Last edited:
Joined
Apr 30, 2011
Messages
2,719 (0.54/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
So you're essentially saying that AMD didn't know how to optimize their products, but home geeks can do that better.
That is... brave.

I have a different theory for you:
AMD, in order to lower production costs, has been releasing products with high quality variance (compared to more expensive competition).
So yes, this means that the factory voltage and clocks are on the safe side. And you can often pull a lot more from your AMD product.
Obviously, some samples won't cope very well with this.
The big difference between overclocking and undervolting is that the former is much easier to control: if the card runs too hot, you just ease off the OC (or the card does it automatically).
Undervolting results in operational errors and instability, which is... well... much worse. :)

They were late in releasing Vega and were in a rush; the yields weren't good, the HBM2 supply wasn't good, and the R&D budget is a tenth of NVIDIA's. So many reasons not to be able to calibrate many samples properly to a safe voltage level. They chose to go for a higher voltage than needed, and buyers now have to tune it properly to get the best out of Vega GPUs. Not too complicated to me, but everyone has an opinion when things like this aren't published by official sources and we only learn pieces by searching the web. The end product clearly shows that something went wrong with the voltage levels; every piece of evidence points to that conclusion. Around the web, almost every Vega owner who tried could undervolt and overclock to 10-15% higher stable core clocks once stable drivers were out.
 
Joined
Dec 30, 2010
Messages
2,202 (0.43/day)
For some reason, AMD's products always have slightly higher voltages, on both CPUs and GPUs. Why this is done I am not sure, but probably to guarantee absolute stability in every possible situation.

The 9570 could shave a good 60 watts off at the wall simply by undervolting the chip. The RX 480 had the same issue as well (20 up to 40 watts depending on chip quality). Vega copes with the heat created by a pretty high voltage, which hampers the maximum boost clock.

So yeah, undervolting might shave off some watts here and there, and it's a fun thing, yes, as tinkering with AMD products always has been, but it's actually not our task or job.
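The savings described above follow from the usual dynamic-power approximation, P ≈ C·V²·f. A sketch with made-up values (the stock wattage and voltages are placeholders, not figures for any specific card):

```python
# Switching power scales roughly with C * V^2 * f. The quadratic voltage
# term is why a modest undervolt shaves a disproportionate number of watts.
# All values below are invented for illustration.

def dynamic_power(base_watts: float, v: float, v_stock: float,
                  f: float, f_stock: float) -> float:
    """Scale a known stock power figure by (V/V0)^2 * (f/f0)."""
    return base_watts * (v / v_stock) ** 2 * (f / f_stock)

stock_w = 210.0   # assumed stock board power
uv_w = dynamic_power(stock_w, v=1.00, v_stock=1.10, f=1.0, f_stock=1.0)
print(f"saved ~{stock_w - uv_w:.0f} W")  # ~36 W from a 0.10 V undervolt alone
```

Even at identical clocks, dropping 1.10 V to 1.00 V cuts the dynamic term by about 17%, which is in the same ballpark as the per-card savings quoted in the post.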
 
Joined
Apr 15, 2009
Messages
1,038 (0.18/day)
Processor Ryzen 9 5900X
Motherboard Gigabyte X570 Aorus Master
Cooling ARCTIC Liquid Freezer III 360 A-RGB
Memory 32 GB Ballistix Elite DDR4-3600 CL16
Video Card(s) XFX 6800 XT Speedster Merc 319 Black
Storage Sabrent Rocket NVMe 4.0 1TB
Display(s) LG 27GL850B x 2 / ASUS MG278Q
Case be quiet! Silent Base 802
Audio Device(s) Sound Blaster AE-7 / Sennheiser HD 660S
Power Supply Seasonic Vertex PX-1200
Software Windows 11 Pro 64
So it's another paper launch for Vega. At the time of writing, these still can't be found for sale anywhere in the US. There are zero Vega models available for purchase at newegg. Four reference models of Vega 64 can be found available on Amazon... the least expensive listing at $1,388. None of Sapphire's partners listed in the US have any availability of their RX Vega Nitro+. /golfclap
 
Joined
Jun 28, 2016
Messages
3,595 (1.15/day)
They were late in releasing Vega and were in a rush; the yields weren't good, the HBM2 supply wasn't good, and the R&D budget is a tenth of NVIDIA's.
You're listing this as if the whole world was against them, while in fact these are all their own mistakes. They shouldn't have gone for exclusive HBM2, and they shouldn't have rushed anything. And most importantly, they should simply increase their R&D budget if that's what's holding them back.
So many reasons not to be able to calibrate many samples properly to a safe voltage level.
WTF? A company releases an unfinished product and instead of being furious, you're inventing excuses for them.
Not too complicated to me, but everyone has an opinion when things like this aren't published by official sources and we only learn pieces by searching the web.
Voltage is something users should never touch. It's not LEGO. You're not buying parts for some DIY fun project. It's complicated consumer electronics. It should work out of box.
What's next? You expect industrial clients to undervolt/overclock EPYC and Radeon Instinct? :)
Around the web, almost every Vega owner who tried could undervolt and overclock to 10-15% higher stable core clocks once stable drivers were out.
Around the web, every gun owner who tried to commit suicide failed.
So it's another paper launch for Vega. At the time of writing, these still can't be found for sale anywhere in the US. There are zero Vega models available for purchase at newegg. Four reference models of Vega 64 can be found available on Amazon... the least expensive listing at $1,388. None of Sapphire's partners listed in the US have any availability of their RX Vega Nitro+. /golfclap
Apple has just released their new iMac Pro with a Vega GPU. Apple was most likely building up stock and using all Vega parts they could lay their hands on. And it will get even worse when the Mac Pro arrives. Everyone thinking about buying a Vega for their PC should just look for another option.
 
Joined
Sep 19, 2005
Messages
157 (0.02/day)
System Name Homebrew
Processor Intel I5 4690k>Ryzen3900x
Motherboard Gigabyte Z97X-Gaming 5>Aorus Elite X570
Cooling CPU: (Prolimatek Megahelim 2+1 exhaust fan)>X72 Kraken, GPU: 570lx 240mm AIO w/kraken G10 + Ramsinks
Memory Kingston Hyper-X 2400 Savage 2 x 8Gb>16GB Corsair Vengence Pro M2Z(for Ryzen)>32GB PatriotViper 4400
Video Card(s) Sapphire RX Vega 56 HBM to 930Mhz from 800Mhz for Gaming & 850 for Gigapixel AI
Storage 850 evo 250Gb, 860 1TB, HGST HDN726040ALE640, WD10EACS 1Tb, WD20EARS 2Tb, WD80EMAZ, WD140EMFZ
Display(s) Benq GW2265>C24FG73 Samsung 144Hz
Case Antec P180>Lian-Li 011 Dynamic XL
Audio Device(s) Asus Xonar DX
Power Supply Corsair HX850
Mouse Gigabyte M6900> Razer Deathadder V2 (optical switches)
Keyboard Logitech G15 + Wolfking circular gaming KB
Software Windows 7 SP1>Windows 10 Pro N fully activated with gatherosstate.exe
Benchmark Scores http://www.doomiii.pwp.blueyonder.co.uk/system-spec.html < Need a new host
I've got a Sapphire RX Vega 56 with a G10 bracket and a custom-filed heatsink for the MOSFETs. I've not tried flashing the BIOS yet. I'm interested in getting a Nitro Vega 56 BIOS, but I've not found it in the TPU database.
 
Joined
Dec 22, 2011
Messages
3,890 (0.82/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
So it's another paper launch for Vega. At the time of writing, these still can't be found for sale anywhere in the US. There are zero Vega models available for purchase at newegg. Four reference models of Vega 64 can be found available on Amazon... the least expensive listing at $1,388. None of Sapphire's partners listed in the US have any availability of their RX Vega Nitro+. /golfclap

Indeed, Vega stock has almost completely vanished; what the hell is going on? The massively overpriced iMac Pro has just hit the market anyway... for those so inclined.

I suspect trying to compete with a 314 mm² GPU using a much larger 486 mm² HBM2 GPU isn't ideal either way.

But hey... I'm no businessman.
 
Joined
Oct 2, 2004
Messages
13,791 (1.86/day)
So you're essentially saying that AMD didn't know how to optimize their products, but home geeks can do that better.
That is... brave.

I have a different theory for you:
AMD, in order to lower production costs, has been releasing products with high quality variance (compared to more expensive competition).
So yes, this means that the factory voltage and clocks are on the safe side. And you can often pull a lot more from your AMD product.
Obviously, some samples won't cope very well with this.
The big difference between overclocking and undervolting is that the former is much easier to control: if the card runs too hot, you just ease off the OC (or the card does it automatically).
Undervolting results in operational errors and instability, which is... well... much worse. :)

AMD doesn't have time to fiddle with one card for two months like geeks do. That's the major difference. If AMD could pre-tweak each card to geek level in ten seconds and ship it to stores, they would; the voltages are set so conservatively precisely because they can't. And there is a level of consistency: one user doesn't mind losing 3 fps if that means 50 W less, while another will go ballistic if his Radeon doesn't achieve the maximum possible framerate regardless of consumption. That's why they prefer to run cards slightly hotter with higher voltages rather than lower ones that may run slower in some cases. Which is what NVIDIA cards are doing beyond the base clock: GPU Boost 3.0 is a lottery. It may go very high on its own, or not, compared to the same model from another user. But unlike AMD, which advertises the maximum guaranteed boost, NVIDIA advertises one thing and users get something else (usually higher than advertised, since they state the boost at a guaranteed level to avoid legal issues).
 
Joined
Jun 28, 2016
Messages
3,595 (1.15/day)
AMD doesn't have time to fiddle with one card for two months like geeks do. That's the major difference. If AMD could pre-tweak each card to geek level in ten seconds and ship it to stores, they would; the voltages are set so conservatively precisely because they can't. And there is a level of consistency: one user doesn't mind losing 3 fps if that means 50 W less, while another will go ballistic if his Radeon doesn't achieve the maximum possible framerate regardless of consumption. That's why they prefer to run cards slightly hotter with higher voltages rather than lower ones that may run slower in some cases. Which is what NVIDIA cards are doing beyond the base clock: GPU Boost 3.0 is a lottery. It may go very high on its own, or not, compared to the same model from another user. But unlike AMD, which advertises the maximum guaranteed boost, NVIDIA advertises one thing and users get something else (usually higher than advertised, since they state the boost at a guaranteed level to avoid legal issues).
No one is talking about optimizing each card. They don't. No one does. No one can, actually. They're selling a particular product for a particular price, so all samples need to have an identical specification.
What is being discussed is the variance of products.
If variance is low, the manufacturer can apply factory setting near the average.
If variance is high, they have to configure them a lot below the average, because too many samples wouldn't survive the higher performance.

And this is exactly the problem with AMD. As a result some product samples have a lot of OC headroom. But some don't. You never know what you'll get.
We've seen this with Ryzen lately. Some can get to 4.1GHz, some are stuck at 3.9. And that's all fine.
The problem arises when someone is asking for CPU advice and what he gets is: "buy a Ryzen, OC it to 4.1GHz and it'll be awesome".
Now, Vega is something else, because it's already pushed pretty far by the factory. So now it's more about undervolting than OC.
And of course some Vega samples will happily undervolt, some won't.
The issue is, as mentioned above, the nature of these modifications: OC stability is easy to test; for undervolting it's not that obvious.
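The variance argument above can be sketched numerically. The distributions below are entirely invented; the point is only that a wider spread of per-die minimum stable voltages forces a higher factory setting, which then shows up as undervolting headroom on an average sample:

```python
# Toy model of the variance argument: the factory voltage must cover
# nearly every sample, so higher silicon variance forces a bigger margin.
# Distribution parameters are made up for illustration.
import random

random.seed(42)

def factory_voltage(samples, safety_quantile=0.99):
    """Pick a voltage high enough that `safety_quantile` of samples are stable."""
    ordered = sorted(samples)
    return ordered[int(len(ordered) * safety_quantile)]

# minimum stable voltage per die: low-variance vs high-variance process
low_var  = sorted(random.gauss(1.00, 0.01) for _ in range(10_000))
high_var = sorted(random.gauss(1.00, 0.05) for _ in range(10_000))

print(f"low variance  -> factory sets ~{factory_voltage(low_var):.2f} V")
print(f"high variance -> factory sets ~{factory_voltage(high_var):.2f} V")
# with these toy numbers the high-variance line needs roughly 0.1 V extra,
# which is the headroom a typical sample can then undervolt away
```

It also shows the flip side raised in the post: the average sample has headroom, but the worst samples have none, so "just undervolt it" is not universal advice.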
 
Joined
Apr 30, 2011
Messages
2,719 (0.54/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
You're listing this as if the whole world was against them, while in fact these are all their own mistakes. They shouldn't have gone for exclusive HBM2, and they shouldn't have rushed anything. And most importantly, they should simply increase their R&D budget if that's what's holding them back.

WTF? A company releases an unfinished product and instead of being furious, you're inventing excuses for them.

Voltage is something users should never touch. It's not LEGO. You're not buying parts for some DIY fun project. It's complicated consumer electronics. It should work out of box.
What's next? You expect industrial clients to undervolt/overclock EPYC and Radeon Instinct? :)

Around the web, every gun owner who tried to commit suicide failed.

Apple has just released their new iMac Pro with a Vega GPU. Apple was most likely building up stock and using all Vega parts they could lay their hands on. And it will get even worse when the Mac Pro arrives. Everyone thinking about buying a Vega for their PC should just look for another option.

Don't bother telling anyone what he should do with his own property. Especially when undervolting an electronic device extends its lifespan without risking anything, given that all Vegas have a dual BIOS. And where did you see me write that AMD was correct in what they did with Vega? I just wrote about the possible causes of this bad release of a product more potent than what they got to market. Ignorance/fanboyism is spread all over your post.
 
Last edited:
Joined
Oct 2, 2004
Messages
13,791 (1.86/day)
@notb

OC potential is not being advertised by anyone. Ever. They always guarantee a) base clocks and b) boost clocks when minimum required conditions are met (usually thermal and power delivery). Anything beyond that is pure luck and it can be 5MHz or 200MHz.
 
Joined
Jan 8, 2017
Messages
9,525 (3.26/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Joined
Oct 2, 2004
Messages
13,791 (1.86/day)
It's interesting how something like the 7700K was always quoted as reaching 5GHz no problem, guaranteed, but the thermals were atrocious, and in reality not everyone could do it without serious cooling or delidding.

Not by Intel. They only state specified clocks. The only thing I'd say should always work is that "Multicore optimization" which ramps all cores up to the boost clock. That should always work, at the expense of higher consumption and heat. I mean, cores are designed to always reach the boost clock, just not all together, for other reasons (like heat and TDP limits). The rest are general averages: if 90% of samples reach 5GHz no problem, you can safely assume most will, and it becomes somewhat of a general rule. But that's never guaranteed by Intel. Never.
 
Joined
Jan 8, 2017
Messages
9,525 (3.26/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Not by Intel. They only state specified clocks. The only thing I'd say should always work is that "Multicore optimization" which ramps all cores up to the boost clock. That should always work, at the expense of higher consumption and heat. I mean, cores are designed to always reach the boost clock, just not all together, for other reasons (like heat and TDP limits). The rest are general averages: if 90% of samples reach 5GHz no problem, you can safely assume most will, and it becomes somewhat of a general rule. But that's never guaranteed by Intel. Never.

I wasn't talking about Intel but rather random people on the internet. Of course no manufacturer will ever guarantee this sort of thing.
 
Joined
Dec 28, 2012
Messages
3,987 (0.91/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
Indeed, Vega stock has almost completely vanished; what the hell is going on? The massively overpriced iMac Pro has just hit the market anyway... for those so inclined.

I suspect trying to compete with a 314 mm² GPU using a much larger 486 mm² HBM2 GPU isn't ideal either way.

But hey... I'm no businessman.
AMD screwed the pooch hard with Vega and completely missed the "make a 3072- or 4096-core Polaris" train. It wouldn't surprise me if they were cutting retail stock hard to serve Apple and their juicy margins, so they can make a few more percent on the expensive Vega GPUs.
 
Joined
Jun 28, 2016
Messages
3,595 (1.15/day)
I wasn't talking about Intel but rather random people on the internet. Of course no manufacturer will ever guarantee this sort of thing.
Random people like a lot on this forum? :)

BTW: I needed info about Radeon Instinct today and what a surprise.
MI6 is a Polaris. MI8 is based on Fiji! Only the MI25 variant is based on Vega.
How is that possible? How bad is Vega? It's like AMD admitting they've totally failed on the efficiency front.

Now I'm slightly worried about the Intel+Vega MCM. I really wanted that to be in my next notebook...
 
Joined
Mar 7, 2017
Messages
38 (0.01/day)
Location
Houston, TX
Processor AMD Ryzen 1700X
Motherboard ASRock X370 Taichi
Cooling EKWB Custom Loop
Memory 16GB (2x8GB) G.Skill Trident Z RGB 3000Mhz
Video Card(s) MSI Gaming X GTX 1080 EKWB Nickel WB
Storage 256GB WD Black M.2 O/S & 1TB WD Black Storage
Case Anidees Crystal Cube RGB
Power Supply Corsair RM 850 W/ Cable Mod Carbon Pro Cables
Mouse Razer Deathadder
Keyboard Razer Black Widow
Software Win 10
I want to love this card so much, but given that you can get a 1080 Ti for close to the same price with better performance, it just can't happen.
 
Joined
Nov 21, 2010
Messages
2,355 (0.46/day)
Location
Right where I want to be
System Name Miami
Processor Ryzen 3800X
Motherboard Asus Crosshair VII Formula
Cooling Ek Velocity/ 2x 280mm Radiators/ Alphacool fullcover
Memory F4-3600C16Q-32GTZNC
Video Card(s) XFX 6900 XT Speedster 0
Storage 1TB WD M.2 SSD/ 2TB WD SN750/ 4TB WD Black HDD
Display(s) DELL AW3420DW / HP ZR24w
Case Lian Li O11 Dynamic XL
Audio Device(s) EVGA Nu Audio
Power Supply Seasonic Prime Gold 1000W+750W
Mouse Corsair Scimitar/Glorious Model O-
Keyboard Corsair K95 Platinum
Software Windows 10 Pro
AMD has to pull off a miracle with Navi architecture to catch up with Nvidia. Both in performance and power efficiency.

Not really: since power efficiency is a derivative of performance, they just need to close the efficiency gap, and they'd achieve both goals. Against a Titan V the Vega 64 gap is 38%, whereas against the 1080 Ti it's 53%. If anything, it's NVIDIA who is moving backwards.
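For reference, a "perf/watt gap" like the 38% and 53% figures quoted above is just the ratio of the two cards' efficiencies. The fps and wattage inputs below are placeholders, not the actual review data behind those numbers:

```python
# How a perf/watt gap percentage is computed from review-style figures.
# Input numbers are placeholders for illustration only.

def efficiency_gap(amd_fps: float, amd_w: float,
                   nv_fps: float, nv_w: float) -> float:
    """Percentage by which the NVIDIA card's perf/watt exceeds the AMD card's."""
    return ((nv_fps / nv_w) / (amd_fps / amd_w) - 1) * 100

gap = efficiency_gap(amd_fps=100, amd_w=295, nv_fps=120, nv_w=250)
print(f"gap: {gap:.0f}%")  # ~42% with these placeholder numbers
```

Note the gap compounds both the performance delta and the power delta, which is why closing the efficiency gap would narrow both comparisons at once, as the post argues.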
 
Joined
Sep 13, 2017
Messages
78 (0.03/day)
System Name project silver poison
Processor Intel Xeon x5650
Motherboard Asus x58 sabertooth
Cooling Enermax Liqfusion 240mm
Memory 24gb ripjaws ddr3
Video Card(s) Sapphire R9 fury Nitro
Storage 256gb sata ssd & 1tb seagate white label
Display(s) Asus VP248QG
Case Lian li lancool one
Audio Device(s) Logitech z207 & audio techica m40x
Power Supply Enermax platimax df 1200w
Mouse Corsair Harpoon rgb
Keyboard Coolermaster Masterkeys pro l rgb
The PCB looks like something ASRock would do.
 
Joined
Dec 22, 2011
Messages
3,890 (0.82/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
Joined
Sep 19, 2005
Messages
157 (0.02/day)
System Name Homebrew
Processor Intel I5 4690k>Ryzen3900x
Motherboard Gigabyte Z97X-Gaming 5>Aorus Elite X570
Cooling CPU: (Prolimatek Megahelim 2+1 exhaust fan)>X72 Kraken, GPU: 570lx 240mm AIO w/kraken G10 + Ramsinks
Memory Kingston Hyper-X 2400 Savage 2 x 8Gb>16GB Corsair Vengence Pro M2Z(for Ryzen)>32GB PatriotViper 4400
Video Card(s) Sapphire RX Vega 56 HBM to 930Mhz from 800Mhz for Gaming & 850 for Gigapixel AI
Storage 850 evo 250Gb, 860 1TB, HGST HDN726040ALE640, WD10EACS 1Tb, WD20EARS 2Tb, WD80EMAZ, WD140EMFZ
Display(s) Benq GW2265>C24FG73 Samsung 144Hz
Case Antec P180>Lian-Li 011 Dynamic XL
Audio Device(s) Asus Xonar DX
Power Supply Corsair HX850
Mouse Gigabyte M6900> Razer Deathadder V2 (optical switches)
Keyboard Logitech G15 + Wolfking circular gaming KB
Software Windows 7 SP1>Windows 10 Pro N fully activated with gatherosstate.exe
Benchmark Scores http://www.doomiii.pwp.blueyonder.co.uk/system-spec.html < Need a new host
That's a ridiculous price. I paid £380 @ OcUK for my Sapphire Vega 56, which I'm using with a modified G10 bracket.
 
Joined
Jun 28, 2016
Messages
3,595 (1.15/day)
That's a ridiculous price. I paid £380 @ OcUK for my Sapphire Vega 56, which I'm using with a modified G10 bracket.
The cheapest Vega 56 @OCUK is 580 GBP at the moment (and it's pre-order). Nitro+ one is 630 GBP (and available).
These are the prices we should compare and, no offense, watercooling doesn't look very sensible (unless you're really devoted to the idea).
 