
SAPPHIRE Launches Their RX Vega Nitro+ Series of Graphics Cards

Many previous testers claim exactly what I wrote, but I found a new video in which the Strix 56 is tested. Watch it carefully:


But efficiency goes down for Pascal when doing that, while Vega's goes up. And Vega 56 can gain 10-20% in performance depending on the game or application. Search and you will find that around the web. Why do you make me write the same things again and again? I am sure you can understand what I write. The truth won't change just because one person prefers a company's product over another's.

I just can't watch that guy, he takes so long to get to the point.
 
As for the efficiency increase from tuning being the same for both Vega and nVidia GPUs: in Vega's case you will get MORE performance for LESS power draw when tuned properly, while for nVidia you LOSE performance when you lower the consumption. So, while Vega isn't well optimised by AMD and has more potential than its stock settings show, nVidia is already very close to optimal efficiency. Big difference imho, due to the R&D money available to each. I wonder how many times that needs to be explained...
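To make the perf-per-watt arithmetic behind this claim concrete, here is a minimal sketch; every figure in it is a hypothetical placeholder, not a measurement from any review:

# Illustrative perf-per-watt arithmetic only -- every number below is a
# hypothetical placeholder, not a measurement from any review.

def perf_per_watt(fps: float, watts: float) -> float:
    """Efficiency expressed as frames per second per watt."""
    return fps / watts

# Hypothetical Vega 56: undervolting lets it hold higher clocks, so
# performance rises while board power falls.
vega_stock = perf_per_watt(fps=75.0, watts=220.0)     # ~0.34 fps/W
vega_tuned = perf_per_watt(fps=83.0, watts=190.0)     # ~0.44 fps/W

# Hypothetical Pascal card: power-limiting saves watts but costs frames,
# so the efficiency gain is smaller.
pascal_stock = perf_per_watt(fps=90.0, watts=180.0)   # ~0.50 fps/W
pascal_capped = perf_per_watt(fps=84.0, watts=150.0)  # ~0.56 fps/W

print(f"Vega:   {vega_stock:.2f} -> {vega_tuned:.2f} fps/W")
print(f"Pascal: {pascal_stock:.2f} -> {pascal_capped:.2f} fps/W")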
So you're essentially saying that AMD didn't know how to optimize their products, but home geeks can do that better.
That is... brave.

I have a different theory for you:
AMD, in order to lower production costs, has been releasing products with high quality variance (compared to more expensive competition).
So yes, this means that the factory voltage and clocks are on the safe side. And you can often pull a lot more from your AMD product.
Obviously, some samples won't cope very well with this.
The big difference between overclocking and undervolting is: the former is much easier to control. If the card runs too hot, you just ease off on the OC (or it does it automatically).
Undervolting results in operational errors and instability, which is... well... much worse. :)
 
Like I said in my edited post, that's because the card comes with lower than advertised clocks due to higher voltage.

What? No. You need some practical experience instead of reading benches, forums and reviews, so you can see this in action.
The TPU database says 1590 MHz is the boost clock for Vega 56, yet the review of the Vega 56 Strix that I've been referring to says it's 1499 out of the box. It only reaches the advertised speed of 1590 MHz after voltage tuning, so I'm absolutely right. What practical experience do I need to read two numbers and tell which one is lower?

So you're essentially saying that AMD didn't know how to optimize their products, but home geeks can do that better.
That is... brave.

I have a different theory for you:
AMD, in order to lower production costs, has been releasing products with high quality variance (compared to more expensive competition).
So yes, this means that the factory voltage and clocks are on the safe side. And you can often pull a lot more from your AMD product.
Obviously, some samples won't cope very well with this.
The big difference between overclocking and undervolting is: the former is much easier to control. If the card runs too hot, you just ease off on the OC (or it does it automatically).
Undervolting results in operational errors and instability, which is... well... much worse. :)

exactly.
 
So you're essentially saying that AMD didn't know how to optimize their products, but home geeks can do that better.
That is... brave.

I have a different theory for you:
AMD, in order to lower production costs, has been releasing products with high quality variance (compared to more expensive competition).
So yes, this means that the factory voltage and clocks are on the safe side. And you can often pull a lot more from your AMD product.
Obviously, some samples won't cope very well with this.
The big difference between overclocking and undervolting is: the former is much easier to control. If the card runs too hot, you just ease off on the OC (or it does it automatically).
Undervolting results in operational errors and instability, which is... well... much worse. :)

They were late in releasing Vega and were in a rush, the yields weren't good, the HBM2 supply wasn't good, and the R&D budget is a tenth of nVidia's. So many reasons for not being able to calibrate many samples properly in order to set a safe voltage level. They chose to go for a higher voltage than needed, and buyers now have to tune it properly to get the best out of Vega GPUs. Not too complicated to me, but everyone has an opinion when things like this aren't published by official sources and we learn pieces by searching around the web. The end product clearly shows that something went wrong with the voltage levels. All the evidence points to that conclusion. Around the web, almost every Vega owner who tried could OC and undervolt, gaining 10-15% higher stable core clocks once stable drivers were out.
 
For some reason, AMD's products always have a slightly higher voltage, in both CPUs and GPUs. Why this is done I am not sure, but probably to guarantee absolute stability in every situation possible.

The 9570 could have 60 watts shaved off at the wall simply by undervolting the chip. The RX 480 had the same issue as well (20 up to 40 watts, depending on chip quality). Vega copes with the heat created by a pretty high voltage, which hampers the maximum boost clock.

So yeah, undervolting might shave off some power here and there, and it's a fun thing, as OC'ing AMD products has always been, but it's actually not our task or job.
 
So it's another paper launch for Vega. At the time of writing, these still can't be found for sale anywhere in the US. There are zero Vega models available for purchase at newegg. Four reference models of Vega 64 can be found available on Amazon... the least expensive listing at $1,388. None of Sapphire's partners listed in the US have any availability of their RX Vega Nitro+. /golfclap
 
They were late in releasing Vega and were in a rush, the yields weren't good, the HBM2 supply wasn't good, and the R&D budget is a tenth of nVidia's.
You're listing this as if the whole world was against them, while in fact these were all their own mistakes. They shouldn't have gone for exclusive HBM2, and they shouldn't have rushed anything. And most importantly, they should simply increase their R&D budget if that's what's holding them back.
So many reasons for not being able to calibrate many samples properly in order to set a safe voltage level.
WTF? A company releases an unfinished product and instead of being furious, you're inventing excuses for them.
Not too complicated to me, but everyone has an opinion when things like this aren't published by official sources and we learn pieces by searching around the web.
Voltage is something users should never touch. It's not LEGO. You're not buying parts for some DIY fun project. It's complicated consumer electronics. It should work out of the box.
What's next? You expect industrial clients to undervolt/overclock EPYC and Radeon Instinct? :-)
Around the web, almost every Vega owner who tried could OC and undervolt, gaining 10-15% higher stable core clocks once stable drivers were out.
Around the web, every gun owner, who tried to commit suicide, failed.
So it's another paper launch for Vega. At the time of writing, these still can't be found for sale anywhere in the US. There are zero Vega models available for purchase at newegg. Four reference models of Vega 64 can be found available on Amazon... the least expensive listing at $1,388. None of Sapphire's partners listed in the US have any availability of their RX Vega Nitro+. /golfclap
Apple has just released their new iMac Pro with a Vega GPU. Apple was most likely building up stock and using all Vega parts they could lay their hands on. And it will get even worse when the Mac Pro arrives. Everyone thinking about buying a Vega for their PC should just look for another option.
 
I've got a Sapphire RX Vega 56 with a G10 bracket and a custom-filed heatsink for the MOSFETs. I've not tried flashing the BIOS yet. I am interested in getting a Nitro Vega 56 BIOS, but I've not found it in the TPU database.
 
So it's another paper launch for Vega. At the time of writing, these still can't be found for sale anywhere in the US. There are zero Vega models available for purchase at newegg. Four reference models of Vega 64 can be found available on Amazon... the least expensive listing at $1,388. None of Sapphire's partners listed in the US have any availability of their RX Vega Nitro+. /golfclap

Indeed, Vega stock has almost completely vanished. What the hell is going on? The massively overpriced iMac Pro has just hit the market anyway... for those so inclined.

I suspect trying to compete against a 314 mm² GPU with a much larger 486 mm² HBM2 GPU isn't ideal either way.

But hey.... I'm no business man.
 
So you're essentially saying that AMD didn't know how to optimize their products, but home geeks can do that better.
That is... brave.

I have a different theory for you:
AMD, in order to lower production costs, has been releasing products with high quality variance (compared to more expensive competition).
So yes, this means that the factory voltage and clocks are on the safe side. And you can often pull a lot more from your AMD product.
Obviously, some samples won't cope very well with this.
The big difference between overclocking and undervolting is: the former is much easier to control. If the card runs too hot, you just ease off on the OC (or it does it automatically).
Undervolting results in operational errors and instability, which is... well... much worse. :)

AMD doesn't have time to fiddle with one card for 2 months like geeks do. That's the major difference. If AMD could pre-tweak each card to geek level in 10 seconds and send it to stores, they would. That's why they set voltages so conservatively high: because they can't. And there is a level of consistency. Some people don't mind losing 3 fps if that means 50 W less. Others will go ballistic if their Radeon doesn't achieve the maximum possible framerate regardless of consumption. That's why they prefer to run them slightly hotter with higher voltages rather than lower voltages that may run slower in some cases. Which is what NVIDIA cards are doing beyond the base clock. GPU Boost 3.0 is a lottery. It may go very high on its own, or not, compared to the same model from some other user. But unlike AMD, which advertises the maximum guaranteed boost, NVIDIA advertises one thing and users get something else (usually higher than advertised, since they want to ensure they don't have legal issues, so they state the boost at a guaranteed level).
 
AMD doesn't have time to fiddle with one card for 2 months like geeks do. That's the major difference. If AMD could pre-tweak each card to geek level in 10 seconds and send it to stores, they would. That's why they set voltages so conservatively high: because they can't. And there is a level of consistency. Some people don't mind losing 3 fps if that means 50 W less. Others will go ballistic if their Radeon doesn't achieve the maximum possible framerate regardless of consumption. That's why they prefer to run them slightly hotter with higher voltages rather than lower voltages that may run slower in some cases. Which is what NVIDIA cards are doing beyond the base clock. GPU Boost 3.0 is a lottery. It may go very high on its own, or not, compared to the same model from some other user. But unlike AMD, which advertises the maximum guaranteed boost, NVIDIA advertises one thing and users get something else (usually higher than advertised, since they want to ensure they don't have legal issues, so they state the boost at a guaranteed level).
No one is talking about optimizing each card. They don't. No one does. No one can, actually. They're selling a particular product, for a particular price. All samples need to have identical specifications.
What is being discussed is the variance of products.
If variance is low, the manufacturer can apply factory setting near the average.
If variance is high, they have to configure them a lot below the average, because too many samples would not survive higher performance.

And this is exactly the problem with AMD. As a result some product samples have a lot of OC headroom. But some don't. You never know what you'll get.
We've seen this with Ryzen lately. Some can get to 4.1GHz, some are stuck at 3.9. And that's all fine.
The problem arises when someone is asking for CPU advice and what he gets is: "buy a Ryzen, OC it to 4.1GHz and it'll be awesome".
Now, Vega is something else, because it's already pushed pretty far by the factory. So now it's more about undervolting than OC.
And of course some Vega samples will happily undervolt, some won't.
The issue is, as mentioned above, the nature of these modifications. OC stability is easy to test. For undervolting it's not that obvious.
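As a rough sketch of that variance argument (all numbers below are invented purely for illustration): if the minimum stable voltage differs a lot from sample to sample, the factory setting has to chase the worst chips, which is exactly what leaves the average chip with undervolting headroom.

# A rough sketch of the variance argument; all numbers are invented and only
# illustrate why high sample-to-sample variance forces a conservative (high)
# factory voltage.
import random

random.seed(0)

def minimum_stable_voltage() -> float:
    """Hypothetical per-sample minimum stable voltage, in volts."""
    return random.gauss(1.05, 0.04)  # assumed mean 1.05 V, sigma 40 mV

samples = [minimum_stable_voltage() for _ in range(10_000)]

# The factory setting has to keep (nearly) every sample stable, so it is
# dictated by the worst chips, not by the average one.
factory_voltage = max(samples) + 0.01  # plus a small guard band
average_headroom = factory_voltage - sum(samples) / len(samples)

print(f"factory voltage:  {factory_voltage:.3f} V")
print(f"average headroom: {average_headroom * 1000:.0f} mV")

Increase the sigma and the average headroom (what a typical chip could undervolt) grows; shrink it and the factory setting can sit much closer to the typical sample.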
 
You're listing this as if the whole world was against them, while in fact these were all their own mistakes. They shouldn't have gone for exclusive HBM2, and they shouldn't have rushed anything. And most importantly, they should simply increase their R&D budget if that's what's holding them back.

WTF? A company releases an unfinished product and instead of being furious, you're inventing excuses for them.

Voltage is something users should never touch. It's not LEGO. You're not buying parts for some DIY fun project. It's complicated consumer electronics. It should work out of the box.
What's next? You expect industrial clients to undervolt/overclock EPYC and Radeon Instinct? :)

Around the web, every gun owner, who tried to commit suicide, failed.

Apple has just released their new iMac Pro with a Vega GPU. Apple was most likely building up stock and using all Vega parts they could lay their hands on. And it will get even worse when the Mac Pro arrives. Everyone thinking about buying a Vega for their PC should just look for another option.

Don't bother telling anyone what he should do with his property. Especially when undervolting an electronic device extends its lifetime without risking anything, given that all Vegas have dual BIOS. And where did you see me write that AMD were correct in what they did with Vega? I just wrote about the possible causes of this release being worse than the more potent product they could have brought to market. Ignorance/fanboyism is spread all over your post.
 
@notb

OC potential is not being advertised by anyone. Ever. They always guarantee a) base clocks and b) boost clocks when minimum required conditions are met (usually thermal and power delivery). Anything beyond that is pure luck and it can be 5MHz or 200MHz.
 
It's interesting how something like the 7700K was always quoted as reaching 5GHz no problem, guaranteed, but the thermals were atrocious and in reality not everyone could get there without serious cooling or delidding.

Not by Intel. They only state specified clocks. The only thing I'd say should always work is that "Multicore optimization" which ramps up all cores to the boost clock. That should always work, at the expense of higher consumption and heat. I mean, cores are designed to always reach the boost clock. Just not all together, for other reasons (like heat and TDP limits). The rest are general averages. If 90% of samples reach 5GHz no problem, you can safely assume most will reach it and it becomes somewhat of a general rule. But that's never guaranteed by Intel. Never.
 
Not by Intel. They only state specified clocks. The only thing I'd say should always work is that "Multicore optimization" which ramps up all cores to the boost clock. That should always work, at the expense of higher consumption and heat. I mean, cores are designed to always reach the boost clock. Just not all together, for other reasons (like heat and TDP limits). The rest are general averages. If 90% of samples reach 5GHz no problem, you can safely assume most will reach it and it becomes somewhat of a general rule. But that's never guaranteed by Intel. Never.

I wasn't talking about Intel but rather random people on the internet. Of course no manufacturer will ever guarantee this sort of thing.
 
Indeed, Vega stock has almost completely vanished. What the hell is going on? The massively overpriced iMac Pro has just hit the market anyway... for those so inclined.

I suspect trying to compete against a 314 mm² GPU with a much larger 486 mm² HBM2 GPU isn't ideal either way.

But hey.... I'm no business man.
AMD screwed the pooch hard with Vega and completely missed the "make a 3072 or 4096 core Polaris" train. It wouldn't surprise me if they were cutting stock hard to meet Apple's demand and their juicy margins, so they can make a few more % on the expensive Vega GPUs.
 
I wasn't talking about Intel but rather random people on the internet. Of course no manufacturer will ever guarantee this sort of thing.
Random people like a lot on this forum? :)

BTW: I needed info about Radeon Instinct today and what a surprise.
MI6 is a Polaris. MI8 is based on Fiji! Only the MI25 variant is based on Vega.
How is that possible? How bad is Vega? It's like AMD admitting they've totally failed on the efficiency front.

Now I'm slightly worried about the Intel+Vega MCM. I really wanted that to be in my next notebook...
 
I want to love this card so much, but given that you can get a 1080 Ti for close to the same price with better performance, it just can't happen.
 
AMD has to pull off a miracle with the Navi architecture to catch up with Nvidia, both in performance and power efficiency.

Not really. Since power efficiency is a derivative of performance, they just need to close the efficiency gap and they'd achieve both goals. Vega 64 vs. a Titan V is a 38% gap, whereas vs. the 1080 Ti it is 53%. If anything it's NVIDIA who is moving backwards.
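For what it's worth, here is how such an efficiency-gap percentage would typically be computed from performance and board-power figures; the numbers in the example are placeholders, not the actual TPU results behind the 38%/53% figures quoted above.

# How an efficiency-gap percentage like those quoted above would typically be
# computed. The example figures are placeholders, not the actual TPU numbers.

def efficiency_gap(vega_fps: float, vega_w: float,
                   other_fps: float, other_w: float) -> float:
    """Percentage by which the other card's perf/W exceeds Vega's."""
    vega_eff = vega_fps / vega_w
    other_eff = other_fps / other_w
    return (other_eff / vega_eff - 1.0) * 100.0

# Made-up example: equal performance, different board power.
print(f"{efficiency_gap(100, 295, 100, 250):.0f}% gap")  # -> 18% gap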
 
The PCB looks like something ASRock would do.
 
That's a ridiculous price. I paid £380 at OcUK for my Sapphire Vega 56, which I am using with a modified G10 bracket.
 
That's a ridiculous price. I paid £380 at OcUK for my Sapphire Vega 56, which I am using with a modified G10 bracket.
The cheapest Vega 56 at OcUK is 580 GBP at the moment (and it's a pre-order). The Nitro+ one is 630 GBP (and available).
These are the prices we should compare and, no offense, watercooling doesn't look very sensible (unless you're really devoted to the idea).
 