# AMD Radeon RX Vega 64 8 GB



## W1zzard (Aug 14, 2017)

Our AMD Radeon RX Vega 64 review confirms that the company achieved major performance improvements over their last-generation Polaris and Fiji cards: Vega is faster than the GTX 1080. We tested six different performance configurations of the Vega 64, with surprising results.

*Show full review*


----------



## qubit (Aug 14, 2017)

So another meh graphics card from AMD that only matches the performance of the year-old GTX 1080, with lots of power draw and noise, especially irritating coil whine. I'll pass. Also, dunno why they bother with that expensive HBM when GDDR5X does just fine.

Let's hope AMD can finally leapfrog NVIDIA sometime not too long in the future and give us some competition.


----------



## Frogger (Aug 14, 2017)

About as expected. Thanks for the fine read, W1zz


----------



## Kissamies (Aug 14, 2017)

Better than I expected. But when the drivers get better..


----------



## opojare (Aug 14, 2017)

I rate this 3/8.


----------



## jabbadap (Aug 14, 2017)

Damn, those game-to-game standings are all over the place. In some games even the RX Vega 56 beats the GTX 1080, and in others the RX Vega 64 loses even to the GTX 1070. All in all, good review as always


----------



## lemkeant (Aug 14, 2017)

Thanks W1zzard, great review


----------



## hapkiman (Aug 14, 2017)

Nice review.  Vega is not the impressive product that Ryzen is, that's for sure.  AMD just can't seem to figure out this power consumption issue.  Terrible on these cards.


----------



## B-Real (Aug 14, 2017)

Thanks for the reviews!

The Vega 56 beats the 1070 with cooler reference models (at higher noise) despite the +70W power consumption. Paired with a sync monitor, it is the better choice. The Vega 64 also nearly reaches the 1080. Of course the 1080 Ti remains unchallenged, but most consumers don't buy in that price range. Not to mention the drivers coming later for HBCC and so on. If you are going for sync monitors, and if NVIDIA doesn't drop prices, both the 1070 and 1080 have competitors.


----------



## TheinsanegamerN (Aug 14, 2017)

9700 Pro said:


> Better than I expected. But when the drivers get better..


It won't matter, because by then Volta will be out.

AMD "fine wine" can't fix a bad arch. Vega, for all it can do, seems to be more of a Maxwell competitor than a Pascal competitor efficiency-wise. Drivers may result in Vega consistently beating the 1080 by a bit instead of being all over the place, but that doesn't cover the fact that AMD has still failed to learn the lessons the Fury X provided.



B-Real said:


> Thanks for the reviews!
> 
> The Vega 56 beats the 1070 with cooler reference models (at higher noise) despite the +70W power consumption. Paired with a sync monitor, it is the better choice. The Vega 64 also nearly reaches the 1080. Of course the 1080 Ti remains unchallenged, but most consumers don't buy in that price range. Not to mention the drivers coming later for HBCC and so on. If you are going for sync monitors, and if NVIDIA doesn't drop prices, both the 1070 and 1080 have competitors.


Depends on what you mean by better. It beats the 1070 in some games and loses a bit in others; they are very similar performance-wise. But Vega's MSRP is $50 more, and miners are going to drive that up.

And given how well AMD's big GPUs overclocked in the past, I wouldn't expect Vega 56 to keep up with the 1070 OC-to-OC.


----------



## birdie (Aug 14, 2017)

The hype train has crashed spectacularly.

I'm not surprised at all, but I'm extremely dumbfounded as to why AMD tried to mock Volta while the RX Vega 64 cannot even reach the performance level of the 18-month-old GTX 1080. And its mining performance is not what AMD fans have been expecting: a pair of RX 470s will be a lot faster and cheaper (and have a better ROI).

In short, Vega has turned out to be a huge dud.

We all wanted a healthy competition but NVIDIA now has to compete only against itself. Darn. 

Let's just close this page of Radeon Graphics and expect AMD to step up its game with Navi.


----------



## TheinsanegamerN (Aug 14, 2017)

birdie said:


> The hype train has crashed spectacularly.
> 
> I'm not surprised at all, but I'm extremely dumbfounded as to why AMD tried to mock Volta while the RX Vega 64 cannot even reach the performance level of the 18-month-old GTX 1080. And its mining performance is not what AMD fans have been expecting: a pair of RX 470s will be a lot faster and cheaper (and have a better ROI).
> 
> ...



Isn't that a good thing? It means AMD fans will actually be able to buy Vega, as opposed to miners buying up all of them.


----------



## birdie (Aug 14, 2017)

TheinsanegamerN said:


> isnt that a good thing? It means AMD fans will actually be able to buy vega, as opposed to miners buying up all of them.



That could have been a good thing if the GTX 1070/1080 didn't exist. You must be an extremely devoted AMD fan to choose the RX Vega 56/64 over the GTX 1070/1080.

However, the Steam Hardware Survey clearly shows that actual AMD fans do ... not exist. Most people play on NVIDIA's GPUs or use AMD's GPUs to mine.


----------



## Fluffmeister (Aug 14, 2017)

Hard to find a compelling reason to pick this over the GTX 1080, packed math support may pay dividends down the line, but with Volta on the horizon threatening to bring even more performance per watt gains over Pascal it's gonna be an uphill fight regardless.

Vega 56 does seem the better balanced product.


----------



## B-Real (Aug 14, 2017)

TheinsanegamerN said:


> depends on what you mean by better. It beats the 1070 in some games, looses a bit in others, they are very similar performance wise. But vega's MSRP is $50 more, and miners are going to drive that up.
> 
> And given how well AMDs big GPU oced in the past, I wouldnt expect vega 56 to keep up with the 1070 OC to OC.


Well, overall the Vega 56 is a bit faster than the 1070. But if you consider that the Vega cards will keep getting better with drivers, and that the drivers supporting HBCC are yet to come, you can easily say that the Vega 56 is faster than the 1070. And if you give credit for FreeSync monitors being a hell of a lot cheaper than G-Sync ones, it is obviously the better choice now. Yep, the 1070 was obviously better than a Fury or Fury X, but now a Vega 56 seems obviously better than the 1070.
And on price, it is true that Vega's MSRP is $50 more, but right now the 1070 starts from $440. If not bothered by the miners, Vega 56 AIBs should start around $430-450.


----------



## Nihilus (Aug 14, 2017)

Very nice review, but I don't understand the 'Highly Recommended' part. Zero reason to get this over a GTX 1080: same performance, louder, hotter, more power consumption, and likely very limited OC. The Vega 56 performed much better.


----------



## KainXS (Aug 14, 2017)

I wouldn't call the Vega 64 a dud; I'd call it sub-par and would definitely not buy it. The Vega 56 is not too bad, but between a 1070 and a Vega 56 today I would buy a 980 Ti and overclock it, and maybe pick a 56 if I couldn't find a Ti. Hype pretty much ruined the launch, but AMD's "Poor Volta" nonsense didn't help either; it should go down as how not to market a card.


----------



## B-Real (Aug 14, 2017)

KainXS said:


> I wouldn't call Vega64 a dud, I'd call it sub-par and would definitely not buy it. The Vega56 is not that too bad but between a 1070 and Vega 56 today, I would buy a 980Ti and overclock it and maybe pick a 56 if I couldn't find a Ti. Hype pretty much ruined the launch but AMD's Poor Volta nonsense didn't help either, it should go down as how to not market a card.



Well, we should wait for those HBCC FPS increases, but yeah, absolutely, that was failed marketing.


----------



## birdie (Aug 14, 2017)

B-Real said:


> But if you consider the Vega cards will keep getting better with drivers, and the drivers supporting HBCC are yet to come, you can easily say that Vega56 is faster than the 1070.



"Fine wine" is such a lame myth. People need to get sober and realize that the mythical performance increase over time will be minimal and will not allow AMD GPUs to move up a tier.



> AMD claimed its new AMD Radeon Software Crimson ReLive Edition would net between 4-8% performance increases depending on the game. We can say this is pretty much on the money, at 5-6% in the games we tested AMD’s claims seem valid. That means both the AMD Radeon R9 Fury X and AMD Radeon RX 480 have received a cumulative performance upgrade via drivers, but it is quite small in the grand scheme of performance.
> 
> Think of it this way, at 60 FPS a 5% advantage is only maybe 3 FPS? Free performance is great, we’ll take all we can get, *but anything under 10% is impossible to notice in the real-world while gaming*. This goes to show a couple of important facts. *Drivers are not going to be the miracle answer to a video card’s performance over time.* When a video card is launched, that launch performance is probably within 10% of where it is going to end up in terms of cumulative performance updates, at least as far as AMD GPUs go.
> 
> ...


----------



## medi01 (Aug 14, 2017)

birdie said:


> Vega RX 64 cannot even reach the performance level of the 18 months old GTX 1080


Oh, please, get back to earth.


----------



## medi01 (Aug 14, 2017)

Fluffmeister said:


> Hard to find a compelling reason to pick this over the GTX 1080



No $200 adaptive-sync tax, and a solid lead in most modern games.
Averages are misleading, thanks to games like Civ 4 or Fallout.


----------



## Vya Domus (Aug 14, 2017)

TheinsanegamerN said:


> it wont matter, because by then volta will be out.
> 
> AMD finewine cant fix a bad arch. VEGA, for all it can do, seems to be more of a maxwell competitor then a pascal competitor efficiency wise. Drivers may result in vega consistently beating the 1080 by a bit instead of being all over the place, but it doesnt cover the fact that AMD has still failed to learn from the lessons Fury X provided.





birdie said:


> The hype train has crashed spectacularly.
> 
> I'm not surprised at all but I'm extremely dumbfounded as to why AMD tried to mock Volta while Vega RX 64 cannot even reach the performance level of the 18 months old GTX 1080. And its mining performance is not what AMD fans have been expecting - a pair of RX470 will be a lot faster and cheaper (and have a better ROI).
> 
> ...



A dud? Bad architecture? Not at all; everyone is missing the point. Is it unimpressive in gaming? Yes, as of now, but as much as you all don't like it, gaming is an afterthought these days for silicon manufacturers. Vega is not the gaming card people wanted, but it's the compute card AMD needs. Look at it this way: currently the only card that even gets close to Vega 64 in FP16 and compute in general is the Tesla P100, a $5,000 card. Yeah, it is certainly not a dud in this respect, and it has Volta in its sights. Vega is for the datacenters, not for gaming rigs. Disappointing for one market, a godsend for another.


----------



## TheinsanegamerN (Aug 14, 2017)

B-Real said:


> Well overall, the Vega56 is faster than the 1070 a bit. But if you consider the Vega cards will keep getting better with drivers, and the drivers supporting HBCC are yet to come, you can easily say that Vega56 is faster than the 1070. And if you give credit for the Freesync monitors being a hell cheaper than the G-Sync ones, it is obviously a better choice now. Yepp, the 1070 was obviously better than a Fury or Fury X, but now a Vega56 seems obviously better than the 1070.
> And for the price, it is true that Vega MSRP is 50$ more, but right now, the 1070 starts from 440$. IF not bothered by the miners, Vega56 AIBs should start around 430-450$.


HBCC, TBR, DX12, Vulkan, Mantle, Windows 10, CrossFire: there is always something people think will make AMD super fast.

AMD finally optimizing their drivers may, at best, offer 10%. And given it is AMD, that is going to be really hit-or-miss depending on the title. Drivers won't fix an outdated arch.



Vya Domus said:


> A dud ? Bad architecture ? Not at all , everyone is missing the point. Is it unimpressive in gaming ? Yes it is as of now , but as much as you all don't like this gaming is an afterthought these days for silicon manufactures. Vega is not the gaming card people wanted , but it's the compute card AMD needs. Look at it this way : currently the only card that even gets close to Vega 64 in FP16 and compute in general is the Tesla P100 , a 5000$ card. Yeah , it is certainly not a dud in this aspect and it has Volta in the sights. Vega is for the datacenters not for gaming rigs. Disappointing for a market  , a god send for another.


If it is the compute card AMD needs, why are they trying to sell it as a gaming card and not a FirePro card?


----------



## ERazer (Aug 14, 2017)

As expected. If you're an AMD fan and have been waiting, sure, go ahead and buy one, but most of us will be waiting to see what Volta has to offer.


----------



## Vya Domus (Aug 14, 2017)

TheinsanegamerN said:


> If it is the compute card AMD needs, why are they trying to sell it as a gaming card, and not a fire pro card?



It's a purely formal matter; for gaming, all they have to do is make it clear that they are not out of the game. That's all Vega is: a reminder that they are still active in the high-end gaming market. Trust me, Vega is for the datacenter.


----------



## B-Real (Aug 14, 2017)

birdie said:


> "Fine wine" is such a lame myth people need to get sober and realize that the mythical performance increase over time will be minimal and will not allow AMD GPUs to move a tier higher.


That 50% average / 100% minimum FPS increase seems a dream, yeah. But the raw facts say the Vega 56 is overall a bit faster than the 1070. With a sync monitor, it's an absolute winner. And with the drivers coming in the next weeks and months, a ~10% performance increase without HBCC doesn't seem impossible if you check the RX 480 then-and-now reviews.


"And given how well AMDs big GPU oced in the past, I wouldnt expect vega 56 to keep up with the 1070 OC to OC."

Not everyone is using OC on cards, btw.


----------



## Fluffmeister (Aug 14, 2017)

medi01 said:


> No $200 adaptive-sync tax, and a solid lead in most modern games.
> Averages are misleading, thanks to games like Civ 4 or Fallout.



It was clear that was their selling point, being slower and all.

Buy Vega and enjoy it; it was never going to be the NV killer that Captain Tom and gang had hoped for.


----------



## KainXS (Aug 14, 2017)

B-Real said:


> Well we should wait for that HBCC fps increases, but yeah, absolutely, that was a failed marketing.



The "wait" was one of the problems with this launch. Every single time, when faced with the reality of its performance, the defense has been "well, wait for this, wait for that."

No . . . . . just no.

You will be waiting for so long that by the time it's optimized the next GeForce series will come out, and then you will say "wait for Navi."


----------



## bug (Aug 14, 2017)

So no miracle last minute software optimizations or hardware tweaks and the FE was actually a good indicator of the overall performance? Who would have thought?

Also, the 300W this burns through in games is nearly twice the 160W of a GTX 1080. I mean, TDP may not be all that important, but come on...


----------



## chaosmassive (Aug 14, 2017)

qubit said:


> So another meh graphics card from AMD that only matches the performance of the year old GTX 1080 with lots of power draw and noise, especially irritating coil whine. I'll pass. Also, dunno why they bother with that expensive HBM, when GDDR5X does just fine.
> 
> Let's hope AMD can finally leapfrog NVIDIA sometime not too long in the future and give us some competition.





Fluffmeister said:


> Hard to find a compelling reason to pick this over the GTX 1080, packed math support may pay dividends down the line, but with Volta on the horizon threatening to bring even more performance per watt gains over Pascal it's gonna be an uphill fight regardless.
> 
> Vega 56 does seem the better balanced product.





birdie said:


> The hype train has crashed spectacularly.
> 
> I'm not surprised at all but I'm extremely dumbfounded as to why AMD tried to mock Volta while Vega RX 64 cannot even reach the performance level of the 18 months old GTX 1080. And its mining performance is not what AMD fans have been expecting - a pair of RX470 will be a lot faster and cheaper (and have a better ROI).
> 
> ...





hapkiman said:


> Nice review.  Vega is not the impressive product that Ryzen is, that's for sure.  AMD just can't seem to figure out this power consumption issue.  Terrible on these cards.



Told ya



chaosmassive said:


> Despite all the spotlights on Vega, performance leaks showing it still trading blows with the GTX 1080 are a big turnoff





chaosmassive said:


> Well, more often than not the actual real-world performance won't stray far from what we've seen in leaked benchmarks (esp. the Vega FE benchmarks flying around).
> Maybe 10-15% deviation from what we already know; feel free to correct me if I am wrong though.



A card equipped with HBM2 memory that is still just on par with the GTX 1080; it's not funny at all.


----------



## TheinsanegamerN (Aug 14, 2017)

KainXS said:


> The "wait" was one of the problems with this launch. Every single time when faced with the reality of its performance the defense has been well wait for this, wait for that.
> 
> No . . . . . just no.
> 
> You will be waiting for so long that by the time its optimized the next Geforce series will come out and then you will say wait for Navi.


Always Massive Delays strikes again.

Like you said, it doesn't matter if AMD finally delivers after two years, because NVIDIA left them in the dust.



Vya Domus said:


> It's a purely formal matter , for gaming all they have to do is make it clear that they are not out of the game. That's all Vega is , a reminder they are still active in the high end gaming market. Trust me Vega is for the datacenter.


Where Quadro and Tesla would slaughter it in efficiency, or in outright performance if we are talking Vega 56.

Vega may show they are not "out of the game," but it also clearly shows they are not on the ball either.


----------



## medi01 (Aug 14, 2017)

TheinsanegamerN said:


> HBCC, TBR, DX12, Vulkan, mantle, windows 10, crossfire, there is always something people think will make AMD super fast.



Just recalculate the performance summary, removing weirdo games like Civ 4 or Fallout, and think again.



Fluffmeister said:


> It was never going to be a Nv killer you Captain Tom and gang had hoped for.


It would be so cool if you wouldn't put words in my mouth, whoever that guy you're mistaking me for is.



Fluffmeister said:


> being slower and all


Vega 56 ($399) is faster than the 1070; Vega 64 Air ($499) is on par with the 1080.


----------



## Champ (Aug 14, 2017)

Impressive from an AMD standpoint, but the power consumption is too high. At full throttle this equals two 1080s. I'd rather have two 1080s.


----------



## _Flare (Aug 14, 2017)

Thank you very much for the review, with all power settings tested.
Really interesting how the 2nd BIOS in power-saving mode is only 5-8% slower with a third of the consumption cut off.

And where a GTX 1080 is way faster, the +100W doesn't give enough value for the exploding wattage.

Great job, W1zzard


----------



## TheinsanegamerN (Aug 14, 2017)

medi01 said:


> Just recalculate perf summary, removing weirdo games like Civ 4 or Fallout and think again.


O rly? I could remove a few games and recalculate, showing Vega is universally slower than the 1070. It would be just as disingenuous as what you just wrote.


----------



## Vya Domus (Aug 14, 2017)

TheinsanegamerN said:


> Where quadro and tesla would slaughter it in efficiency, or outright performance if we are talking vega 56.



If you think datacenters care only about power consumption, you are wrong; they care about overall cost.

Have I not been clear: currently the only competitor for the Radeon Instinct MI25 is the P100 (which is a tad weaker), a card that costs *5-10 grand*. Nvidia will come out with Volta soon, but the question is how much cheaper it will have to be in order to be worth it. The V100 with its Tensor Cores looks nice, but here's the catch: 4x V100s cost $70,000. Ouch.


----------



## medi01 (Aug 14, 2017)

Champ said:


> Impressive from an amd standpoint, but power consumption too high. This full throttle is equal to 2 1080s. I rather have 2 1080s


You think it's wise to compare stock card power consumption with OCed card's power consumption?

It's bad, but not as bad as people make it out to be; one could basically cut a third of the power by reducing performance by about 10%.


TheinsanegamerN said:


> O rly? I could remove a few games and recalculate showing VEGA is universally slower then the 1070. it would be just as disingenuous as what you just wrote.


Civ and Fallout results are not just "games"; these are games with outlandish results.


----------



## Prima.Vera (Aug 14, 2017)

AMD....Ouch! 

1. Power consumption..._*Jeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeezuuuuuuuuuuuuuuuuus!*_
2. Price. Meh. They cost as much as NVIDIA's counterparts, while consuming almost double with fluctuating performance... The Vega 56 would have been a great deal if priced $50 less.
3. Performance. Same or lower than year-old cards... Meh.


----------



## Frick (Aug 14, 2017)

birdie said:


> In short, Vega has turned out to be a huge dud.



Arguably the Vega 56 is a good card for the price tho.


----------



## Steevo (Aug 14, 2017)

Great review. Does tile-based rendering work?

Just as I predicted, it's going to be mostly miners that buy this card, and I'm sure AMD is fine with that, as they can provide a mediocre set of drivers for a sold-out product and keep moving forward.


----------



## Fluffmeister (Aug 14, 2017)

medi01 said:


> It would be so cool if you wouldn't put words in my mouth, whoever that guy you're mistaking me for is.
> 
> Vega 56 ($399) is faster than the 1070; Vega 64 Air ($499) is on par with the 1080.



I said the Vega 56 was a compelling choice, no need to get your knickers in a twist defending your fave brand.

Go Vega and enjoy, now off to Mindfactory!


----------



## B-Real (Aug 14, 2017)

bug said:


> So no miracle last minute software optimizations or hardware tweaks and the FE was actually a good indicator of the overall performance? Who would have thought?
> 
> Also, the 300W this burns through in games is nearly twice the 160W of a GTX 1080. I mean, TDP may not be all that important, but come on...


The GTX 1080 draws 166W. Balanced mode (so not OC) with the 2nd BIOS draws 262W on the Vega 64. So it's not 160 vs. 300 but nearly 170 vs. 260. Which is still a huge difference, but much smaller than your numbers.



Prima.Vera said:


> AMD....Ouch!
> 
> 1. Power consumption..._*Jeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeezuuuuuuuuuuuuuuuuus!*_
> 2. Price. Meh. They cost as nVidia's counterparts, while consuming almost double and performance fluctuating...  The VEGA 56 would have been a great deal if priced 50$ less.
> 3. Performance. Same or lower than 1 years old cards... Meh.



1. No, they don't consume almost double. Based on several reviews, the gap in gaming power consumption between the 1070 and Vega 56 is between 50 and 70 watts. That is more like 40-50% more power, not the 100% you claim. For the 1080 and Vega 64, it's 166W vs. 262W, which is around 60% more. Still not 100%.

2. The Vega 56 performs better than the 1070 on the all-game average. The Vega 64, even in balanced mode, performs better than the 1080. So overall, both are a bit better than their NV counterparts, and not "same or lower"...
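The percentage arithmetic above is easy to sanity-check. A minimal sketch, using only the wattage figures quoted in this thread (treat them as approximate):

```python
# Sanity check of the power-draw percentages quoted in this thread.
# All wattages are figures posted above; treat them as approximate.

def pct_more(a_watts: float, b_watts: float) -> float:
    """How much more power a draws than b, in percent."""
    return (a_watts - b_watts) / b_watts * 100

gtx1080 = 166          # W, gaming
vega64_balanced = 262  # W, Vega 64, 2nd BIOS, balanced mode
vega64_stock = 300     # W, stock figure quoted earlier in the thread

print(f"balanced vs 1080: +{pct_more(vega64_balanced, gtx1080):.0f}%")  # ~ +58%
print(f"stock    vs 1080: +{pct_more(vega64_stock, gtx1080):.0f}%")     # ~ +81%
```

So "around 60%" holds for the balanced-BIOS figures, while "nearly double" only comes close against the stock 300W number.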


----------



## _Flare (Aug 14, 2017)

The value of an "average" gets better with more data/information.

I'm still stunned by the 300W move by AMD; there is no single game where it can overtake the GTX 1080 by any meaningful step with the extra +100W.
In the games with a general NVIDIA advantage, the +100W is wasted, and in games where AMD has a general advantage or plenty of FPS, it doesn't make any sense either.

For the 5-8% gain from +100W extra, it's like killing an ant with a big hammer.


----------



## DeathtoGnomes (Aug 14, 2017)

Thanks for the review, @W1zzard. AMD cleverly positioned the card at the right price point: 2nd-place performance with the usual lower price. Where are the arguments about "new, not-yet-optimized drivers"? I'd like to see this card reviewed under water, without thermal pads and with real TIM instead.

I expect retail pricing to be $200+ higher on day one. If I can get this card for $499, that would leave room for a 2nd card or a TR/R7 build with it.


----------



## okidna (Aug 14, 2017)

Vya Domus said:


> Vega is not the gaming card people wanted , but it's the compute card AMD needs.


----------



## xkm1948 (Aug 14, 2017)

So overall a ~20% performance increase from the Fury X to Vega 64 meltdown mode. Meh.

Feels like a major failure to me. 290X to Fury X was a better improvement performance-wise.

Also, where is that magical shader-discard or whatever stuff? Wasn't that what loads of people were claiming would boost Vega by a HUGE percentage? And the gaming performance is pretty much on par with the Vega FE. So much for "not a gaming card."

Calling @RejZoR, your card has arrived. Since mining performance is bad, the price should be OK.


----------



## Khonjel (Aug 14, 2017)

I wish AMD did more to improve on UE4 titles. Work more closely with Epic.


----------



## buggalugs (Aug 14, 2017)

bug said:


> So no miracle last minute software optimizations or hardware tweaks and the FE was actually a good indicator of the overall performance? Who would have thought?
> 
> Also, the 300W this burns through in games is nearly twice the 160W of a GTX 1080. I mean, TDP may not be all that important, but come on...



I have to agree. I could live with the performance level, but not when it consumes nearly twice the power of the competitor.

It's not good enough, AMD, after all this time. There's no point having all these new features if your card draws double the power for the same performance.


----------



## warup89 (Aug 14, 2017)

I was an AMD fanboy; then I grew up. I realized I'm a customer and I want whoever gives the best performance, so I went to NVIDIA this time and am very happy with it.

Amazing to see people all over the net defending this card like there's no tomorrow; the power draw alone makes it a subpar card for the general customer, plus the price will go up due to mining.

_To each their own..._


----------



## EarthDog (Aug 14, 2017)

medi01 said:


> Oh, please, get back to earth.


Fine... it trades punches with or barely beats a card released 444 days ago... all the while using a lot more power and being noisier.



Vya Domus said:


> Trust me Vega is for the datacenter.


Interesting spin.....

Let us know when FP16 is worth its weight for consumers...another item too far ahead of its time.


----------



## Tsukiyomi91 (Aug 14, 2017)

362W in Furmark with the Standard Boost BIOS. Really, AMD?? Is beating a one-year-old GTX 1080 non-Ti worth raising Vega's voltage limit for that? NO. It's the R9 3xx-series era of power consumption all over again. A $500 asking price for that kind of performance? No thanks. I'd rather pick the GTX 1080 non-Ti for its power efficiency.


----------



## xkm1948 (Aug 14, 2017)

Hey, here is an idea. Can you do a clock-for-clock performance analysis between the Fury X and Vega 56/64, @W1zzard? I'd really like to know, if clocked the same, where Vega's performance and power consumption would land.


----------



## Vya Domus (Aug 14, 2017)

okidna said:


> View attachment 91091



Oh please, spare me the dank maymays; this is TPU, not Wccftech.

Not to mention it ain't got anything to do with what I said.


----------



## medi01 (Aug 14, 2017)

EarthDog said:


> Fine... it trades punches with or barely beats a card released 444 days ago... all the while using a lot more power and being noisier.



Devil is in the details:

Vega is a flop not because of "444 days" etc. (seriously, are you blaming an underdog on the verge of bankruptcy for not having yet another parallel project?) but because, being that big (484 mm² vs. 314 or 334, I was told), it rather lacks on the performance front.
About 10% slower than a similarly sized NVIDIA card would be OK.

Power consumption is also OK-ish in power-saving mode (in which it is 10% slower than at max power mode).


----------



## EarthDog (Aug 14, 2017)

Oh yeah... do tell what you are trying to show there...

See your edit... what can you say to that... really... sure, the power comes back to earth, yet it's still more than a 1080 when power saving is on and it loses 10% performance...

Yeah, those SLEW of DX12/Vulkan titles that are out or coming out save it... that is, if production stopped from its competition. It takes looking at things through DX12 eyeglasses. So let me go ahead and pay for this thing now and HOPE that DX12/Vulkan titles start to saturate the market... meanwhile, 2+ years later, there will be new-gen cards out. LOL. Can't F with that infallible logic.

Hey, listen, I don't cut anyone slack... behind is behind after 444 days. This attempt is a bit disappointing unless the prices are able to stay low.

Oy...


----------



## bug (Aug 14, 2017)

xkm1948 said:


> So overall ~20% performance increases from FuryX to Vega64 meltdown mode. Meh.



Vega64 is clocked 90% higher than Fury X, which makes me wonder what's under the hood.


----------



## ERazer (Aug 14, 2017)

For a runner-up the pricing ain't right; it needs to be around $50 cheaper to make sense.


----------



## bug (Aug 14, 2017)

EarthDog said:


> Oh yeah....... do tell what you are trying to show there...
> 
> See your edit... What can you say to that... really... sure the power comes back to earth, yet still more than a 1080, when power savings is on and it loses 10% performance...
> 
> ...



And NVIDIA keeps current prices, which they have little reason to do after milking the market for over a year.


----------



## B-Real (Aug 14, 2017)

warup89 said:


> I was a AMD fanboy, then I grew up. I realized Im a customer and I want whoever gives best performance, thus went to Nvidia this time and very happy with it.
> 
> Amazing to see people all over the net defending this card like there's no tomorrow, the power draw alone makes it a subpar card for the general customer, plus the price will go up due to mining.
> 
> _To each their own..._


There is price/performance ratio, you know. I think you should still grow up.


----------



## medi01 (Aug 14, 2017)

bug said:


> Vega64 is clocked 90% higher than Fury X, which makes me wonder what's under the hood.


About 60% higher.
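For reference, a quick check of that figure. The clocks below are the commonly cited reference values (my assumption, not from the review text above): Fury X around 1050 MHz, Vega 64 boost around 1677 MHz.

```python
# Clock-speed comparison behind the "about 60%" correction.
# Assumed reference clocks: Fury X ~1050 MHz, Vega 64 boost ~1677 MHz.
fury_x_mhz = 1050
vega64_boost_mhz = 1677

increase_pct = (vega64_boost_mhz / fury_x_mhz - 1) * 100
print(f"Vega 64 boost is ~{increase_pct:.0f}% higher than Fury X")  # ~60%
```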


----------



## Captain_Tom (Aug 14, 2017)

It really goes to show that it depends on which games you play. I was seeing massive win after massive win, and then got confused by the "Performance Summary."

If you play Fallout 4, BF1, and Deus Ex, Vega is a completely different card compared to Watch Dogs 2, lol. By as much as 30%!


----------



## Bjorn_Of_Iceland (Aug 14, 2017)

Let us wait for proper drivers, guys. The same way we waited for Vega.

I mean, check it out! 5-7 FPS faster than a GTX 1080 in Deus Ex!


----------



## Vya Domus (Aug 14, 2017)

EarthDog said:


> and HOPE that DX12/Vulkan titles start to saturate the market...



Honestly, with the way things are, I hope DX12 doesn't become the norm with the way it is used right now. At this point it is clear what the deal with DX12 is. Many developers have stated that working with DX12 is a burden and that sometimes it is difficult to even match DX11 performance. Remedy Entertainment said they "feel at home with DX11," which says a lot, and we know how much of a clusterfuck Quantum Break was on DX12. Most DX12 games we have either run the same as on DX11 or worse, on both AMD and NVIDIA.

MS needs to seriously consider stepping in and doing something about it.


----------



## B-Real (Aug 14, 2017)

xkm1948 said:


> Hey here is an idea. Can you do a clock to clock performance analysis between FuryX and Vega56/64 @W1zzard ? Really want to know if clocked the same where Vega's performance and power consumption level would be.


The Vega 56 is 18% faster than the Fury X with 8% lower power consumption.
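Taken together, those two figures imply a sizeable efficiency gain. A quick sketch of the math, using only the percentages quoted above:

```python
# Combine a +18% performance delta with a -8% power delta
# into a single performance-per-watt figure.
perf_ratio = 1.18    # +18% performance vs. Fury X
power_ratio = 0.92   # -8% power consumption vs. Fury X

perf_per_watt_gain = perf_ratio / power_ratio - 1
print(f"Performance per watt improves by ~{perf_per_watt_gain * 100:.0f}%")
```

In other words, the two deltas compound to roughly a 28% perf/watt improvement over the Fury X.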


----------



## Manu_PT (Aug 14, 2017)

Is that power consumption really accurate? Wow

You don't want to live in a country where electricity is expensive. Shocking numbers.


----------



## efikkan (Aug 14, 2017)

@W1zzard
Can you please explain your scale?
How can a terrible product like this get an 8.6 out of 10? And how bad does a product have to be to get something like a 5? What is the point of a 1-10 scale when there is barely any difference between a bad card like this and a good card like the GTX 1080 (which got a 9.1)?

In my book, a 5 should mean a mediocre score: a completely decent, OK product. Vega 64 fails to deliver in terms of performance, efficiency, noise, etc. It deserves a score of 3, 4 at most.

-----

Who in their right mind would buy Vega 56/64 for gaming? GTX 1070/1080 are clearly better options.


----------



## DarkOCean (Aug 14, 2017)

Lol, these are the worst cards in AMD/ATI's history; even the 2900 XT was better than this 
It's time to find another hobby, I guess.


----------



## Manu_PT (Aug 14, 2017)

Hobby? Just switch to nvidia and keep doing your hobby!


----------



## bug (Aug 14, 2017)

Captain_Tom said:


> It really goes to show that it depends on which games you play.   I was seeing massive win after massive win, and then got confused by the "Performance Summary."
> 
> 
> 
> If you play Fallout 4, BF1, and DEUS EX - Vega is a completely different card compared to Watch Dogs 2 lol.   By as much as 30%!


Either you were reading a different review, or you have a very personal definition of "massive win".
Because on TPU, when it doesn't outright lose, the Vega 64 is within 10% of the GTX 1080 (save for Deus Ex and Doom, titles that have always run better on AMD hardware).


----------



## bug (Aug 14, 2017)

efikkan said:


> @W1zzard
> Can you please explain your scale?
> How can a terrible product like this get a 8.6 out of 10? And how bad does a product have to be to get something like a 5? What is the point of a scale 1-10 when there is barely any difference from a bad card like this vs. a good card like GTX 1080 (which got 9.1).
> 
> ...


This has been asked in the past.

W1zzard can't outright admit it, but he has hinted that if he scores lower, he's not going to get samples for much longer. I stopped looking at that badge ages ago; even the pros and cons section can be a little tricky to read sometimes.


----------



## efikkan (Aug 14, 2017)

I'm just waiting for people to start blaming "immature drivers" and "missing optimizations".

This is the worst top model AMD has released in years. And don't forget that Pascal is near the end of its life cycle; Vega will have to compete with Volta.


----------



## Vya Domus (Aug 14, 2017)

Everyone is extremely polarized, throwing sweeping arguments left and right: "this is a failure", "worst thing ever", "massive win", "it's a flop", "just switch to nvidia".

This has derailed into a fanboy showdown, just as expected.


----------



## the54thvoid (Aug 14, 2017)

It's hard to even make a comment without thinking it's going to annoy someone but I genuinely do not understand what RTG do all day to create so much hype about a card with so much technology that isn't better.  Seriously.  How?  It's way faster than Fury X in clockspeed but it's not correspondingly faster in performance.  Do RTG think of a million complex things to throw on a PCB and just hope it's all going to work?  They need to refine, not increase the complexity.

No, sorry, wait, refining a process is boring. People would rather see lots of new things that RTG can hype up that don't translate to real-world gains. Has anyone noticed news about Navi's AI stuff is already leaking out?

RTG, already moving on and forgetting about Vega.

And before folk say it's the game developers' fault for not implementing RTG's stuff, that's a wee bitty hypocritical, because the same people argue Nvidia are bad for working with devs. So again, for the company that has the Xbox and PS4 under its wing, why does their top-range gfx card not do better when the consoles code for their chips?


----------



## EarthDog (Aug 14, 2017)

efikkan said:


> @W1zzard
> Can you please explain your scale?
> How can a terrible product like this get a 8.6 out of 10? And how bad does a product have to be to get something like a 5? What is the point of a scale 1-10 when there is barely any difference from a bad card like this vs. a good card like GTX 1080 (which got 9.1).
> 
> ...


It's subjective, sadly. It's why most sites have, rightfully so, ditched the point system, as there isn't anything quantitative attached to the values.



Vya Domus said:


> Everyone is extremely polarizing throwing comprehensive arguments  left and right such as : "this is a failure" , "worst thing ever" , "massive win" , "it's a flop" , "just switch to nvidia".
> 
> This has derailed into a fanboy showdown just as expect.


You aren't new here. Are you surprised?


----------



## Steevo (Aug 14, 2017)

bug said:


> This has been asked in the past.
> 
> W1zzard cannot outright admit, has has hinted that if he scores lower, he's not going to get samples for much longer. I've stopped looking at that badge ages ago, even the pros and cons section can be a little tricky to read sometimes.




That has nothing to do with it. Read the review without your personal bias. The frame pacing, pricing, and power consumption when Chill and other features are turned on make it the *decent* card that it is.

It's not a huge win, it's more like the special Olympics winner that showed up to run at the regular Olympics.... juice box required.

AMD will probably fix half the issues with games and improve performance by 5%, while miners buy the card and they profit some, because HBM2 and the die aren't cheap.


----------



## Hokum (Aug 14, 2017)

My Fury X can breathe a sigh of relief, as I see no reason to upgrade from it, except for the 8 GB of memory.


----------



## Nima (Aug 14, 2017)

Hype hype and even more hype but nothing interesting when it comes to actual performance, as always with all AMD GPUs.


----------



## qubit (Aug 14, 2017)

chaosmassive said:


> Told ya


Not me, I've been saying this all along.


----------



## Vya Domus (Aug 14, 2017)

the54thvoid said:


> And before folk say it's the game developers fault for not implementing RTG's stuff, that's a wee bitty hypocritical because the same people argue Nvidia are bad for working with dev's.  So again, for the company that have the Xbox and PS4 in their wings, why does their top range gfx card not do better when the consoles code for their chips?



It's not that straightforward. Most people on the AMD side argue that developers are not using DX12/Vulkan, not some specific software from AMD. If you use these APIs to make games run faster, they'll run faster universally, not just on AMD. Whereas Nvidia did some clearly discriminatory things in the past when they worked with developers, like it or not. AMD and Nvidia both work with developers all the time; the end result isn't what you would desire every single time, unfortunately.
Just because AMD makes hardware for consoles doesn't mean that will translate into brilliant performance on their desktop products; it's a different environment, after all.

So once again, it's not that straightforward.



EarthDog said:


> You are't new here. Are you surprised?



Not really, which is why I said "just as expected". Well, it's not as bad as on some other forums, honestly.


----------



## Air (Aug 14, 2017)

> Unfortunately AMD did not include the idle-fan stop feature with their card, which puzzles me, since it has become a basic feature on nearly all boards on the market today, except for NVIDIA reference designs, which fail here, too.



That's because it probably can't be done on blower coolers. On open-air coolers the heatsinks are always relatively big and exposed to ambient air. On blowers, the heatsink is much smaller and enclosed in a "pipe", which is necessary for the radial fan but terrible for natural convection. So it would probably hit temperatures that are too high even at idle.


----------



## notb (Aug 14, 2017)

Frick said:


> Arguably the Vega 56 is a good card for the price tho.


They're not bad, in the same way a 980 Ti is not a bad card and a 2500K is not a bad CPU. Even today.
So AMD is finally making a high-end GPU (performance-wise).

It's just that this is not the product that was promised. Not a card we would expect in 2017 (looking at what NV can do).

In many ways this looks like a brute-force answer to more innovative competitors, a bit like what Intel did with Skylake-X to counter Zen (possibly clocking it way higher than they originally planned).
It's just that the performance/power-draw relation is even worse than in the CPU battle. And of course AMD had literally years to prepare Vega, and it was meant to be a revolution. Now it looks like an interim solution and we're already being fed Navi hype.
As some have already noted: a dual Polaris could be just as effective and most likely cheaper.
Or from another point of view: think how quick Pascal would be if NVIDIA calibrated it for Vega's power consumption...

And a couple important notes:
1) Just how bad will the Vega APU be? Is there any sense in making it? Will a ~30W Vega be noticeably faster than a ~30W Polaris?
2) Will we see a mobile RX Vega?



Vya Domus said:


> Vega is for the datacenters not for gaming rigs.


I don't know where you got this idea from.
This is not a computation card, it will not be used in datacenters. Not happening. Ever.
AMD is targeting RX lineup at gamers. They have other cards designed for professionals.


----------



## crazyeyesreaper (Aug 14, 2017)

So basically I guessed performance would be similar to two RX 470s in CrossFire, when CrossFire actually works. End result: Vega performs pretty much somewhere between RX 470 and RX 480 CrossFire performance on average. Go figure.


----------



## the54thvoid (Aug 14, 2017)

Vya Domus said:


> Just because AMD makes hardware for consoles doesn't mean that will translate into brilliant performance on their desktop products ,



We need to go back in time because I'm pretty sure (not you!) a lot of AMD people were spouting the demise of Nvidia because AMD got the consoles.  It was as good as over, you know, AMD chips in Xbox and PS4, it'll be DX12 everywhere and Nvidia will be finished.

Guess what?  Didn't happen and even then, Nvidia can do DX12 and Vulkan just fine.

Too many people forget the last launch... and the one before that.  What's really sad is that I left Nvidia to go to Radeon (HD 5850's) because the Fermi power issue was hilarious.  I mean, Fermi, this compute monster rolling in with CUDA and being so hot they couldn't make 512 cores work - had to stop at 480 (or something).  It was a grill.  Now it's insane that power is not an issue anymore because it's AMD who can't control it properly.  The historical hypocrisy is sublime. 

I'll say it over and over.  It's a gaming card and RTG need to work on streamlining it, not making it more and more complex.


----------



## Octopuss (Aug 14, 2017)

The power consumption is beyond ridiculous. I'm not putting that crap into my PC.


Also, it's funny how in Guru3D's review the card consistently beats the 1080 by a noticeable amount and is praised like it was something amazing. I wonder...


----------



## B-Real (Aug 14, 2017)

Octopuss said:


> The power consumption is beyond ridiculous. I'm not putting that crap into my PC.
> 
> 
> Also, it's funny how in Guru3d's review the card consistently beats 1080 by noticeable amount and is praised like it was something amazing. I wonder...


I wonder why you are only thinking about the power consumption side of these GPUs. Why not think about Sync monitors, why not think about the Radeon Packs? Such a biased "review" from you.



Nima said:


> Hype hype and even more hype but nothing interesting when it comes to actual performance, *as always with all AMD GPUs*.



Hehe, yeah sure. Pretty little liar.



Manu_PT said:


> Is that power consumption really accurate? Wow
> 
> You dont want to live in a country where eletricty is expensive. Shocking numbers



Yeah sure, this is the time when power consumption is important. What about Intel's X CPUs?


----------



## Octopuss (Aug 14, 2017)

B-Real said:


> Wondering why are you only thinking about the power consumption part of the GPUs. Why not thinking about Sync monitors, why not thinking about the Radeon Packs? So biased "review" from you.


Review? Stop taking drugs. This is a discussion forum.


----------



## B-Real (Aug 14, 2017)

Octopuss said:


> Review? Stop taking drugs. This is discussion forum.


"review" You know... lolz


----------



## Vya Domus (Aug 14, 2017)

the54thvoid said:


> Too many people forget the last launch... and the one before that.  What's really sad is that I left Nvidia to go to Radeon (HD 5850's) because the Fermi power issue was hilarious.  I mean, Fermi, this compute monster rolling in with CUDA and being so hot they couldn't make 512 cores work - had to stop at 480 (or something).  It was a grill.  Now it's insane that power is not an issue anymore because it's AMD who can't control it properly.  The historical hypocrisy is sublime.



I don't think anyone disagrees that the power consumption is huge, and if they do, they're not hypocritical, they're just ignorant. But is it an actual issue? Remember that Fermi wasn't just a power hog: it would blow up if you messed with the voltages too much, and it would run at 90-100°C, on the brink of death, with the stock cooler. Vega, as far as I can see, doesn't blow up, because nowadays power consumption is very strictly monitored and controlled, unlike back then. It also runs at acceptable temperatures. High power consumption for me isn't that important; I only consider it an issue when it has catastrophic implications like it did with Fermi.



the54thvoid said:


> We need to go back in time because I'm pretty sure (not you!) a lot of AMD people were spouting the demise of Nvidia because AMD got the consoles.  It was as good as over, you know, AMD chips in Xbox and PS4, it'll be DX12 everywhere and Nvidia will be finished.
> 
> Guess what?  Didn't happen and even then, Nvidia can do DX12 and Vulkan just fine.



Like I said in one of my previous posts, DX12 is apparently not that great to work with, and Vulkan sees very little adoption. These APIs could have given AMD an edge, but they didn't, because the PC game development platform is a different beast; they don't really have the same control here.



the54thvoid said:


> I'll say it over and over.  It's a gaming card and RTG need to work on streamlining it, not making it more and more complex.



It was over for AMD from the moment Vega missed the proper release window to fight off Pascal. It was obvious that if it was going to be dropped into an awkward moment between the releases of Pascal and Volta, it would fail to achieve much on the gaming side of things. But unfortunately Vega as an architecture is not for gaming. Just because they slapped an RX badge on it doesn't mean that under the hood it is an actual gaming card. It's not; it's for compute, a clear competitor to Tesla, not GeForce, forced into different clothes.


----------



## Durvelle27 (Aug 14, 2017)

Hmmmmm


----------



## EarthDog (Aug 14, 2017)

Vya Domus said:


> Well it's not as bad as on some other forums honestly.


Really? I feel bad for those forums, because here it's turning into a goddamn joke with some people.


----------



## Vya Domus (Aug 14, 2017)

Oh boy, have I seen some true cesspools. Trust me, things are still adequate on here.


----------



## the54thvoid (Aug 14, 2017)

Vya Domus said:


> High power consumption for me isn't that important



Which is cool (not literally), but it does matter to the OEMs wanting to use small form factors, and for mobile etc.  We have laptops now with GTX 1080s in them. Not going to happen with Vega. AMD are also fading from the APU scene (the last release wasn't even Zen-based), and that was their super niche. It's hard to see Vega being used effectively on a mobile platform.

Also, the next talk from the web is Navi incorporating AI hardware onto the chip (which is what Volta has already done). But the reason many folk (jumping on a single source, mind you) say Volta isn't the next Nvidia consumer chip is because it's the AI that makes Volta, well, Volta. And not for quite a while will games need such heavy AI lifting. It's that 'unnecessary' shove-it-all-on-the-chip mindset that is hindering RTG. Hell, the next RTG chip may as well incorporate a climate-modelling architecture, just in case. It's no longer moar cores but moar unincorporated techy stuff.


----------



## RejZoR (Aug 14, 2017)

xkm1948 said:


> So overall ~20% performance increases from FuryX to Vega64 meltdown mode. Meh.
> 
> Feels like a major failure to me. 290X to FuryX had better improvement performance wise.
> 
> ...



After intense consideration after 3 hours of fiddling with stores, prices and shit I've decided to pull the trigger on AORUS GTX 1080Ti. Reason? RX Vega 64 Liquid costs the same as this beefed up air cooled one (and I've specifically taken this one because it actually has tons of thermal pads underneath backplate and VRM is connected to actual heatsink, not some crappy little alu heatsink screwed on top of VRM, proper air cooling).
RX Vega 56 is actually by far the best deal, but it won't cost 400€ in Europe for sure and quite frankly, I'm sick and tired of endless waiting for aftermarket cooled ones to arrive god knows when in autumn 2017. The reference one is crap, just like all blower style cards so I'm certainly not buying that either.

Everyone can call me an AMD fanboy for being optimistic about RX Vega, but in the end, I'm not stupid. For someone who wasn't waiting endlessly and just happens to be changing stuff now, I guess it's still viable, but for me, this is it.


----------



## dwade (Aug 14, 2017)

1950x + Vega = nuclear level


----------



## Vya Domus (Aug 14, 2017)

the54thvoid said:


> Which is cool (not literally) but it does matter to the OEM's wanting to use small form factor and for mobile etc.  We have laptops now with GTX1080's in them.  Not going to happen with Vega.  AMD are also diminishing from the APU scene (last release wasn't even Zen based) and that was their super niche.  It's hard to see Vega being used effectively on a mobile platform.



But look how much more efficient the RX 56 is compared to the 64. With the right clocks/shader count, some nifty binning and more aggressive power saving, I don't see why it wouldn't make its way into high-end laptops. 

As for OEMs, meh, I don't know. The market for high-end desktop GPUs is already a small portion of the whole thing, and the cards that go into OEM systems are an even smaller chunk. In the end I don't think AMD should worry too much about it.



the54thvoid said:


> It's that 'unnecessary' shove it all on the chip mindset that is hindering RTG.  Hell, the next RTG chip may as well incorporate a climate modelling architecture, just in case.  It's no longer moar cores but moar unincorporated techy stuff.



It is unnecessary, but they can't deal with it in any other way. They simply can't afford different designs like Nvidia does. 

Volta really does look like Pascal with extra compute features. If Nvidia wants to come up with something soon, I suspect it'll be another Kepler-gen2-type product: slightly larger dies with more shaders, maybe slightly higher clocks at a reduced cost, and that's about it. Nothing outstanding.

Unless AMD pushes hard on Navi, things will slow down big time on the GPU side of things.


----------



## the54thvoid (Aug 14, 2017)

RejZoR said:


> After intense consideration after 3 hours of fiddling with stores, prices and shit I've decided to pull the trigger on AORUS GTX 1080Ti. Reason? RX Vega 64 Liquid costs the same as this beefed up air cooled one (and I've specifically taken this one because it actually has tons of thermal pads underneath backplate and VRM is connected to actual heatsink, not some crappy little alu heatsink screwed on top of VRM, proper air cooling).
> RX Vega 56 is actually by far the best deal, but it won't cost 400€ in Europe for sure and quite frankly, I'm sick and tired of endless waiting for aftermarket cooled ones to arrive god knows when in autumn 2017. The reference one is crap, just like all blower style cards so I'm certainly not buying that either.
> 
> Everyone can call me AMD fanboy for being optimistic about RX Vega, but in the end, I'm not stupid. For someone who wsn't waiting endlessly and just happens to change stuff now, I guess it's still viable, but for me, this is it.



It'll be just your luck if they release Volta in 1 month 

but.... going on the Vega 64 performance, I think we can all wait till Summer/Autumn 2018 for the next high end Nvidia GPU.


----------



## RejZoR (Aug 14, 2017)

We already know there won't be any Volta any time soon.


----------



## Jism (Aug 14, 2017)

The problem with AMD's GPU design is that it's a four-lane highway where the max speed is 75 mph. On Nvidia's side, that same highway is back to two lanes with a max of 150 mph. This gives Nvidia an advantage in loads such as gaming. On the other hand, you can see the effect of AMD's approach in, for example, GPU encoding, mining and that sort of stuff that really requires brute-force compute power. So the review you are reading here proves that AMD might be the better card for general workloads and particular games, but it is Nvidia that takes the upper hand with this two-lanes/150 mph approach. AMD needs a lot of power to even achieve that speed compared to Nvidia. Mweh. I'll still take it, though. I'm sure driver updates will fix many games, and the power issues can be addressed. Remember, AMD doesn't spend much time actually fine-tuning their GPUs' power behavior.


----------



## Recus (Aug 14, 2017)

Spoiler: :)


----------



## xkm1948 (Aug 14, 2017)

RejZoR said:


> After intense consideration after 3 hours of fiddling with stores, prices and shit I've decided to pull the trigger on AORUS GTX 1080Ti. Reason? RX Vega 64 Liquid costs the same as this beefed up air cooled one (and I've specifically taken this one because it actually has tons of thermal pads underneath backplate and VRM is connected to actual heatsink, not some crappy little alu heatsink screwed on top of VRM, proper air cooling).
> RX Vega 56 is actually by far the best deal, but it won't cost 400€ in Europe for sure and quite frankly, I'm sick and tired of endless waiting for aftermarket cooled ones to arrive god knows when in autumn 2017. The reference one is crap, just like all blower style cards so I'm certainly not buying that either.
> 
> Everyone can call me AMD fanboy for being optimistic about RX Vega, but in the end, I'm not stupid. For someone who wsn't waiting endlessly and just happens to change stuff now, I guess it's still viable, but for me, this is it.




Nah, you are no fanboy, you are a hypeboy.


----------



## ERazer (Aug 14, 2017)

the54thvoid said:


> It'll be just your luck if they release Volta in 1 month
> 
> but.... going on the Vega 64 performance, I think we can all wait till Summer/Autumn 2018 for the next high end Nvidia GPU.



Pretty sure whoever bought a 1080 Ti will be waiting for the next Volta Ti to upgrade.


----------



## xkm1948 (Aug 14, 2017)

the54thvoid said:


> It'll be just your luck if they release Volta in 1 month
> 
> but.... going on the Vega 64 performance, I think we can all wait till Summer/Autumn 2018 for the next high end Nvidia GPU.



Going green next round. Ya know, higher efficiency for a better planet.


----------



## Mr.Newss (Aug 14, 2017)

Most of these cons can be fixed with non-reference models. For me, that model of the card will be a solid 9.5/10.


----------



## MrMilli (Aug 14, 2017)

I see everybody's bashing Vega's power inefficiency but are you guys seeing the same review as I am?

The Vega 64 in PwrSave mode is <2% slower than a GTX 1080 while consuming 48 W more.
This is huge progress from AMD. Nobody's forcing you to use the obviously ridiculous Turbo mode, or even the standard mode.

Considering these are first release drivers, things can only get better from here on.


----------



## EarthDog (Aug 14, 2017)

MrMilli said:


> I see everybody's bashing Vega's power inefficiency but are you guys seeing the same review as I am?
> 
> Vega64 PwrSave <2% slower than GTX1080 while consuming 48W more.
> This is huge progress from AMD. Nobody's forcing you to use the obvious ridiculous Turbo mode or even std mode.
> ...


Sure... but stock its still much worse performance /watt.


----------



## Fluffmeister (Aug 14, 2017)

AMD are in quite a privileged position, really: people are prepared to wait 15 months for the same performance as a GTX 1080, accept the need for different BIOS profiles to rein in power consumption, AND, after all the waiting, equally accept that the drivers probably aren't up to speed yet either... 

They literally cannot fail.


----------



## MrMilli (Aug 14, 2017)

EarthDog said:


> Sure... but stock its still much worse performance /watt.



And why is that important? If someone cares about the power consumption, the option to lower it is right there and easy to change. It's like two clicks. 
If someone doesn't care, then it doesn't matter.


----------



## the54thvoid (Aug 14, 2017)

Recus said:


> Spoiler: :)



I'm not disagreeing with the post as such, but (and I may be wrong) I think the demo done at Capsawhatchamacallit limited video RAM to 2 GB or something to show how HBCC would work in a memory-limited environment. So in that case what was demoed was valid. HOWEVER, when it is used on a card with 8 GB of HBM2, it is an absolute waste of time.


----------



## EarthDog (Aug 14, 2017)

MrMilli said:


> And why is that important? If someone cares about the power consumption, the option to lower it is right there and easy to change. It's like two clicks.
> If someone doesn't care then nothing.


The important part of my post is that you are FORCED to lower performance (Std mode is the default, right?) for more reasonable power consumption. I think that's asinine... two clicks or not, it's two clicks I don't have to worry about with other cards. It's lowering performance to fit in a power envelope... no thanks. 

So, I have to slow my card down, which then puts it 2% slower while still using what, 30% more power than a 1080? Naa....


----------



## the54thvoid (Aug 14, 2017)

EarthDog said:


> The important part of my post is that you are FORCED to lower performance for more reasonable power consumption. I think that's asinine.



Yeah, I can run at 60fps v-sync on BF1 and use like <50% power.......


----------



## MrMilli (Aug 14, 2017)

EarthDog said:


> The important part of my post is that you are FORCED to lower performance for more reasonable power consumption. I think that's asinine... two clicks or not, Its two clicks I don't have to worry about using different cards.



Not disagreeing with you there. AMD went way beyond the sweet spot of the silicon for 2% extra performance. But at least they give you the option to run the card at its right setting without too many hoops to jump through. No BIOS editing, RivaTuner, ... And at its right setting, it's a hell of a card.


----------



## EarthDog (Aug 14, 2017)

We definitely part ways on the "huge progress" from AMD sentiment though.


----------



## neatfeatguy (Aug 14, 2017)

qubit said:


> So another meh graphics card from AMD that only matches the performance of the year old GTX 1080 with lots of power draw and noise, especially irritating coil whine. I'll pass. Also, dunno why they bother with that expensive HBM, when GDDR5X does just fine.
> 
> Let's hope AMD can finally leapfrog NVIDIA sometime not too long in the future and give us some competition.



If they had these cards 1-2 years ago, that would have been a hell of a game changer and really pushed competition. The Vega 56 is mostly on par with a 1070 but draws more power. I could really push the OC on my 980 Ti and not be far behind the Vega 56 (even the Fury X isn't far behind), and my card is already 2 years old. It looks like AMD _appears_ to be almost 2 years behind Nvidia.

It's good to see AMD keeping in there, but unless they have something up their sleeves in the next year or so, Nvidia might really put the hurt on them. Navi is rumored to be out in 2019, but will that be fast enough to keep up with Nvidia? They don't have anything to answer the 1080 Ti right now. Volta most likely won't be around till mid-2018, based on rumors that production of consumer cards won't start until the end of this year (if we're lucky), but if all this holds true, Nvidia will have a 6+ month jump on AMD with the next generation. I wonder when AMD will truly catch up to Nvidia; right now that seems like a long way off.


----------



## EarthDog (Aug 14, 2017)

RejZoR said:


> After intense consideration after 3 hours of fiddling with stores, prices and shit I've decided to pull the trigger on AORUS GTX 1080Ti. Reason? RX Vega 64 Liquid costs the same as this beefed up air cooled one (and I've specifically taken this one because it actually has tons of thermal pads underneath backplate and VRM is connected to actual heatsink, not some crappy little alu heatsink screwed on top of VRM, proper air cooling).
> RX Vega 56 is actually by far the best deal, but it won't cost 400€ in Europe for sure and quite frankly, I'm sick and tired of endless waiting for aftermarket cooled ones to arrive god knows when in autumn 2017. The reference one is crap, just like all blower style cards so I'm certainly not buying that either.
> 
> Everyone can call me AMD fanboy for being optimistic about RX Vega, but in the end, I'm not stupid. For someone who wsn't waiting endlessly and just happens to change stuff now, I guess it's still viable, but for me, this is it.


I'll laugh because you said you were going to buy it no matter what though (right?)....


----------



## MrMilli (Aug 14, 2017)

EarthDog said:


> We definitely part ways on the "huge progress" from AMD sentiment though.



I see you edited your post. No problem. 
I didn't think two clicks was such a hassle for a tweaker, but it seems so.
Anyhow, let's not forget that this card is cheaper than a GTX 1080 while still being as fast in PwrSave mode.
I'm pretty sure 99% of the world doesn't care about 48 W, which is less than a normal light bulb.

Just because you see it negatively doesn't mean we all do. I'm sure there are more who share my opinion.
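To put the 48 W delta in perspective, a rough cost sketch. The hours of use and electricity price below are illustrative assumptions, not figures from the thread:

```python
# Annual electricity cost of an extra 48 W of draw while gaming,
# assuming 4 hours/day of load and 0.30 EUR/kWh (both placeholders).
extra_watts = 48
hours_per_day = 4
price_per_kwh = 0.30  # EUR

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(f"~{kwh_per_year:.0f} kWh/year, ~{cost_per_year:.0f} EUR/year extra")
```

Under these assumptions the delta works out to roughly 70 kWh, or about 20 EUR, per year; the conclusion obviously shifts with local prices and usage.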


----------



## RejZoR (Aug 14, 2017)

EarthDog said:


> I'll laugh because you said you were going to buy it no matter what though (right?)....



I said I MIGHT buy it no matter what. Out of that "MIGHT" I've decided not to. Deal with it.


----------



## EarthDog (Aug 14, 2017)

RejZoR said:


> I said I MIGHT buy it no matter what. Out of that "MIGHT" I've decided not to. Deal with it.


----------



## Captain_Tom (Aug 14, 2017)

EarthDog said:


> We definitely part ways on the "huge progress" from AMD sentiment though.



I have to agree with you as well. Now look, I suspect Vega will receive several large performance increases from drivers/game optimizations over the course of the next year.

Having said that, right now Vega is not at all impressive for gaming from a technology-demonstration perspective. It's good for the price, given how ridiculous the market is right now, and for professional uses it's somewhat disruptive. However, this is not some big stepping stone for Radeon gaming, not even CLOSE. A big Polaris would probably have beaten or matched this while being cheaper to produce (and launching a year ago lol).


----------



## Manu_PT (Aug 14, 2017)

B-Real said:


> Wondering why are you only thinking about the power consumption part of the GPUs. Why not thinking about Sync monitors, why not thinking about the Radeon Packs? So biased "review" from you.
> 
> 
> 
> ...



Intel X CPU? I don't care about them, and I think whoever does actually makes money from using it with all those threads (editing, design, rendering, etc.). Irrelevant to this debate about a piece of hardware to play videogames.

Power consumption is always important; maybe for US users it isn't. In my country electricity is expensive, and the Vega 64 consumes almost as much as a GTX 1080 SLI. That's not acceptable in 2017. I don't want that crap in my machine just to play videogames. Deal with it.


----------



## Agony (Aug 14, 2017)

So... the big news is... Summer vacations VS OC = Vacations WIN


----------



## laszlo (Aug 14, 2017)

quite good cards, both of them; good pricing also 

those who want new tech/an upgrade/need to have the latest stuff will buy it of course, if available in stock... those who don't like it or hate red will stick with green 

one small request... don't start the useless war... nothing to win, no giveaway announced, but if i recall correctly @Tatty_One has a lot of free holiday passes....


----------



## EarthDog (Aug 14, 2017)

MrMilli said:


> I see you edited your post. No problem.
> I didn't think two clicks was such a hassle for a tweaker but it seems so.
> Anyhow, let's not forget that this card is cheaper than a GTX 1080 while still being as fast in PwrSave mode.
> I'm pretty sure 99% of the world doesn't care about 48W which is less than a normal light bulb.
> ...


I can pick up an aftermarket 1080 for $510 (a giga reference blower for $500). While the air cooled version with reference blower RX V64 is $500+ (one is $500 the rest are gouged at $599).

The point, as I said, isn't that two clicks is a problem, it is that some are forced to do it. Big deal or not (not). The fact that it has to lower performance from default to accomplish that is the issue. Users who do that are not getting what they paid for. I don't find that to be an acceptable option when they can get the same performance with less power, less cost, less noise, WITHOUT touching a thing. 

That is 48W with power savings enabled. What is the difference out of the box, where people normally run the cards (default)? Not a huge deal either, but all ingredients make up the pie...


----------



## Bjorn_Of_Iceland (Aug 14, 2017)

MrMilli said:


> ... And at its right setting, it's *hell *of a card.


I see what you did there.


----------



## efikkan (Aug 14, 2017)

Captain_Tom said:


> Now look, I suspect Vega will be receiving several large performance increases from drivers/game optimizations over the course of the next year.


And there we have a winner! Don't worry, the drivers will come and fix this product…
This product is already three months late; most driver optimizations are already baked in. These promises of future driver improvements never pan out.



Captain_Tom said:


> Having said that, right now Vega is not at all impressive for gaming from a technology demonstration perspective.   It's good for the price given how ridiculous the market is right now, and for professional uses it's somewhat disruptive.  However this is not some big stepping stone for Radeon gaming, not even CLOSE.   Big Polaris would have probably beaten or met this while being cheaper to produce (And launching a year ago lol).


Vega looks pretty bad considering it has a node shrink and so many more transistors and higher clock than Fiji.


----------



## v12dock (Aug 14, 2017)

MAYBE AIB cards will be interesting if they can hold 1.7 GHz+ clocks and get a year of driver updates. Otherwise it's time for AMD to try something new next gen.


----------



## _Flare (Aug 14, 2017)

I made a chart to see in which games the pwr-saving mode makes sense and where it doesn't.







@W1zzard: I hope I had permission to use your data


----------



## Fluffmeister (Aug 14, 2017)

Captain_Tom said:


> I have to agree with you as well.   Now look, I suspect Vega will be receiving several large performance increases from drivers/game optimizations over the course of the next year.
> 
> Having said that, right now Vega is not at all impressive for gaming from a technology demonstration perspective.   It's good for the price given how ridiculous the market is right now, and for professional uses it's somewhat disruptive.  However this is not some big stepping stone for Radeon gaming, not even CLOSE.   Big Polaris would have probably beaten or met this while being cheaper to produce (And launching a year ago lol).



The joys of reality eh?

I wish it was more too: https://www.techpowerup.com/forums/...a-gpu-architecture.229266/page-7#post-3647165


----------



## Steevo (Aug 14, 2017)

efikkan said:


> And there we have a winner! Don't worry, the drivers will come and fix this product…
> This product is already three months late; most driver optimizations are already baked in. These promises of future driver improvements never pan out.
> 
> 
> Vega looks pretty bad considering it has a node shrink and so many more transistors and higher clock than Fiji.




Vega is a compute chip that can also play some games. If anything it's great mining hardware.

AMD is known for their vaporware; sure, it has some cool features, but those don't cut it when an overclocked 980Ti beats it.


----------



## notb (Aug 14, 2017)

Vya Domus said:


> But look how much more efficient the RX 56 is compared to the 64. With the right clocks/shader count, some nifty binning, and more aggressive power saving, I don't see why it wouldn't make its way into high-end laptops.


By "high end" you mean large, heavy, noisy and usable only plugged in, right?

Check the latest ASUS Zephyrus with a GTX 1080. A powerful gaming notebook of 2017 is quite a bit smaller than a mid-range business notebook from 5 years ago. This is progress.
Of course you can make a huge notebook with a 350W GPU,  but why would you want that?



MrMilli said:


> Considering these are first release drivers, things can only get better from here on.



Or worse. We might soon learn that they have segmentation faults or something like that. And each fix will steal a bit of the performance.

But most likely nothing will change.
As people have already pointed out: AMD delayed this release for so long that the drivers should be pretty polished by now. In fact, the performance we see in reviews is better than in some of the early leaks. Obviously, a lot of work on drivers has already been done, so the potential should be way smaller than you hope.


----------



## qubit (Aug 14, 2017)

neatfeatguy said:


> If they had these cards 1-2 years prior, that would have been a hell of a game changer and really pushed competition. The Vega 56 is mostly on par with a 1070, but draws more power. I could really push the OC on my 980Ti and not be far behind the Vega 56 (even the Fury X isn't far behind) and my card is already 2 years old. It looks like AMD _appears_ to be almost 2 years behind Nvidia.
> 
> It's good to see AMD keeping in there, but unless they have something up their sleeves in the next year or so, Nvidia might really put the hurt on them. Navi is rumored to be out 2019, but will that be fast enough to keep up with Nvidia? They don't have anything to answer the 1080Ti right now. Volta most likely won't be around till mid 2018 based on rumors that production of consumer cards won't start until the end of this year (if we're lucky), but if all this holds true, Nvidia will have that 6+ month jump on AMD with the next generation. I wonder when AMD will truly catch up to Nvidia, right now that seems like a long way off.


Yes, it does sound a bit like they are using old technology, which is quite depressing when it's actually their latest and greatest. It looks to me like they're actually overclocking Vega significantly to reduce the framerate deficit against NVIDIA which is at least partly why the card runs so hot and loud. That coil whine is an indicator of just how much juice it's sucking down to do it, hence the 100W higher figures than NVIDIA. It's the usual lack of competition like this that lets NVIDIA take its time in releasing the consumer variant of Volta, milking the maximum amount of profit from Pascal.

I just don't get why they persist with this HBM when it clearly does nothing for them. Why not design around a GDDR5X or the next gen and have a cheaper to make card? At least they can compete better on price then.

I remember how ATI lagged NVIDIA at the time AMD bought them out 11 years ago, but incredibly, it hasn't done anything for them even with all those extra resources available to them. How depressing.


----------



## mastershake575 (Aug 14, 2017)

So it's basically the same price/performance as the 1080 while having more noise/power consumption? 

We waited a year for them to basically offer nothing new/exciting to the market? (solid work AMD !!!)


----------



## mrthanhnguyen (Aug 14, 2017)

Winter is coming and AMD is selling space heaters.


----------



## W1zzard (Aug 14, 2017)

_Flare said:


> I hope I had permission to use your data


always. for such kind of analysis, that goes beyond just copying what i wrote. gj


----------



## efikkan (Aug 14, 2017)

Steevo said:


> Vega is a compute chip that can also play some games. If anything it's great mining hardware.


So, we are still making up excuses?
No, Vega10 is the gaming chip. The compute chip is known as Vega20 and is coming next year.


----------



## BiggieShady (Aug 14, 2017)

Fluffmeister said:


> They literally can not fail.


Raja, the world's greatest slacker


----------



## Durvelle27 (Aug 14, 2017)

W1zzard said:


> always. for such kind of analysis, that goes beyond just copying what i wrote. gj


When will we get OC results?


----------



## GhostRyder (Aug 14, 2017)

Well, I guess it fell exactly where it was hinted at. It seems as though the only real winner is the Vega 56, as the higher-priced variant is not worth it, at least in my book (especially the WC variant). Oh well... Guess I will be sticking with Nvidia for a while; think I'll just bite the bullet and get that 165Hz G-Sync monitor I have had my eyes on for months on end.


----------



## Vya Domus (Aug 14, 2017)

BiggieShady said:


> Raja, the world's greatest slacker



AMD should include cuban cigars with their limited edition cards and Nvidia should give one of those neat leather jackets Jensen always wears with every Titan.


----------



## notb (Aug 14, 2017)

Durvelle27 said:


> When will we get OC results?


I think we already got them.


----------



## _Flare (Aug 14, 2017)

Made me a little proud, W1zzard, thanks.
Now I made the same chart for the Std. BIOS @ pwr-saving ... huge gains, and even less understanding for the 292W @ "balanced"




You can count on five fingers where the extra power goes with overclocking,
if you only lose 5% when slashing 27% of the energy.
What happens if using the Std. BIOS @ pwr-saving with undervolting? Maybe the performance difference goes to 0-2% because the average clocks go up again.

I honestly think that's the way Vega is good and efficient,
the way we Germans with our 25-28 ct/kWh should use it
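The back-of-the-envelope math is easy to sanity-check. A minimal sketch, using the ~48W power-save delta quoted earlier in the thread and the 25-28 ct/kWh German prices; the 4 hours of gaming per day is an illustrative assumption, not a review figure:

```python
# Rough yearly electricity cost of a GPU's extra power draw.
# Inputs: extra draw in watts, price in EUR/kWh, daily gaming hours
# (the 4 h/day default is an assumption for illustration only).

def yearly_cost_eur(extra_watts: float, eur_per_kwh: float,
                    hours_per_day: float = 4.0) -> float:
    """Cost of the extra draw over one year, in euros."""
    kwh_per_year = extra_watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * eur_per_kwh

# ~48 W extra at German prices, 4 h/day:
print(round(yearly_cost_eur(48, 0.25), 2))  # 17.52
print(round(yearly_cost_eur(48, 0.28), 2))  # 19.62
```

So the 48W gap works out to roughly 17-20 EUR per year under those assumptions; whether that matters is exactly the argument being had in this thread.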


----------



## EarthDog (Aug 14, 2017)

So... it won't go over the 2nd BIOS turbo?


----------



## bug (Aug 14, 2017)

Steevo said:


> That has nothing to do with it. Read the review without your personal Bias. The frame pacing, pricing, and power consumption when chill and other features are turned on make it the *decent* card that it is.
> 
> It's not a huge win, it's more like the special Olympics winner that showed up to run at the regular Olympics.... juice box required.
> 
> AMD will probably fix half the issues with games, and improve performance by 5% while miners buy the card and they profit, some, cause HBM2 and the die isn't cheap.


I don't know. After two years, in which time AMD effectively dismissed the high end because apparently that's not where the money is, AMD now returns to the high end to give me... a slightly worse GTX 1080? Call that biased, but I just don't see this as an 86% card.


----------



## BiggieShady (Aug 14, 2017)

Vya Domus said:


> neat leather jackets Jensen always wears with every Titan


Jensen's increasingly fancy leather jackets he sweats in under the lights, and the whole steering into the auto industry, are only the starting symptoms of a mid-life crisis. I expect it to culminate with him riding a motorcycle directly onto the stage for the Volta release.


----------



## the54thvoid (Aug 14, 2017)

Durvelle27 said:


> When will we get OC results?



W1zzard said this:



> Overclocking simply does not work on AMD's press driver. No matter what setting was chosen, the actual frequencies did not change. Apparently nobody tested overclocking before declaring the driver ready to give to the press.
> 
> Two days ago, AMD provided an updated driver for overclocking testing only, which claims to address this, but it came in too late, when I had already left for my summer vacation.



Which seems so blatantly AMD these days (I was a Ryzen early adopter; I speak from experience). The company seems disjointed with these things, which is really worrying: this is their biggest graphics release in 2 years and they still can't release working drivers for overclocking? Guru3D had the same issues as TPU, but where he could overclock, he stated that it downclocked very quickly due to a possible power issue. It's very possible the liquid cards are heavily binned to provide the higher clocks. It could be Fury X all over again on the OC front, which might mean an HBM speed issue?

Either way, unlike Fury X and 980ti, this time round the AMD part is way off the lead.







Vega is 30% away from 1080ti (stock).

Fury X was 10% max away from 980ti (stock).

Going back the 290X was only 5-10% away from the 780ti.

What happened?


----------



## _Flare (Aug 14, 2017)

The HD7970 (Q1/2012) was 6% slower than the GTX680 (Q1/2012):
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_680/27.html

The HD6970 (Q4/2010) was 13% slower than the GTX580 (Q4/2010):
https://www.techpowerup.com/reviews/HIS/Radeon_HD_6970/29.html

The last win for AMD was the HD5870 (Q3/2009); the much older GTX285 (Q4/2008) was 17% slower when the HD5870 was released.
That was only possible because Nvidia and TSMC had production problems with the 400 series.
https://www.techpowerup.com/reviews/ATI/Radeon_HD_5870/30.html

And in Q1/2010 it was beaten by 10% by the GTX480, which made a 28% generational jump from the GTX285, which was over a year old at that time.
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/32.html

(the percentages are for the "all resolutions" graphs)


----------



## Kissamies (Aug 14, 2017)

Like I posted before, better wait for some driver optimizations etc.

Though I'm fine with my overclocked GTX 970 SLI, Vega 56 still seems pretty interesting. I'm not in a hurry to upgrade my graphics (and yeah, the CPU comes first), but at least I can wait for a few driver updates and other things first.

Power consumption hasn't ever been a problem for me. And let's see what those custom models will be like.


----------



## Captain_Tom (Aug 14, 2017)

_Flare said:


> HD7970(Q1/2012) 6% slower than GTX680(Q1/2012)
> https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_680/27.html
> 
> HD6970(Q4/2010) 13% slower than GTX580(Q4/2010)
> ...




LOL ok buddy.  You are really playing fast and loose with the truth in those summaries of the past lol.   In fact, can you even read your own links:

https://tpucdn.com/reviews/ATI/Radeon_HD_5870/images/perfrel_1920.gif

The 5870 is 24% faster than the GTX 285, and then still traded blows with the newer GTX 480 that used over double the energy and cost $100 more.   I don't have time to pick apart the rest of your BS, but that one's the funniest point you made imo.


----------



## 0x4452 (Aug 14, 2017)

AMD approved review score to actual score formula

Score = (Official Score - 6) * 2.5


----------



## efikkan (Aug 14, 2017)

the54thvoid said:


> Vega is 30% away from 1080ti (stock).
> 
> Fury X was 10% max away from 980ti (stock).
> 
> ...


AMD stagnated since they gave up actually developing something new, while Nvidia keeps pushing forward.
Imagine a shrunk Fiji bumped about 300 MHz, then it should become obvious how little Vega really improves.


----------



## bug (Aug 14, 2017)

efikkan said:


> AMD stagnated since they gave up actually developing something new, while Nvidia keeps pushing forward.
> Imagine a shrunk Fiji bumped about 300 MHz, then it should become obvious how little Vega really improves.


On top of that, HardOCP seems to have found a weakness in Vega already: MSAA or SSAA performance. I'm taking that with a pinch of salt for now, because they only have 3 games that prove it (and none that prove otherwise), while TPU doesn't say which test uses which level of AA. But I'm definitely keeping an eye out for more reviews.


----------



## Captain_Tom (Aug 14, 2017)

efikkan said:


> AMD stagnated since they gave up actually developing something new, while Nvidia keeps pushing forward.
> Imagine a shrunk Fiji bumped about 300 MHz, then it should become obvious how little Vega really improves.



Vega is the first arch designed when AMD had absolutely no money left. Even Fiji benefited from respectable R&D numbers, but not Vega.

Remember how the 7970 had features that allowed it to mature incredibly well, even 3 years after it came out?    At the end of the day, Vega was designed to last Radeon at least 5 years.   Therefore they had to design an arch that not only had substantial legs to grow into future tech (HBC, RPM, FP16, DSBR), but also had to be OK at compute and AI.  That's a very tall order.   Right now the tally is:


Great at compute and certain professional workloads
Serviceable gaming performance with accommodation for the tech future games will use.
Serviceable AI performance for the price.  Certainly FAR better than GCN, but nowhere near Volta.   Probably the only use-case where I think it's fair to say that Vega is a stepping stone.


----------



## Ravenas (Aug 14, 2017)

This card is not designed to compete with the 1080 Ti. It is designed to compete with the 1080. Any research put towards AMD's statement, or W1zzard's statement at the end of his review will reveal this. 

It holds its own against the 1080, and then some. Power draw is slightly higher, but on the tune of around 20 W higher... That's really going to hurt your electrical bill.

Great review W1zzard.


----------



## Captain_Tom (Aug 14, 2017)

Ravenas said:


> This card is not designed to compete with the 1080 Ti. It is designed to compete with the 1080. Any research put towards AMD's statement, or W1zzard's statement at the end of his review will reveal this.
> 
> It holds its own against the 1080, and then some. Power draw is slightly higher, but on the tune of around 20 W higher... That's really going to hurt your electrical bill.
> 
> Great review W1zzard.



I think it was designed to compete with Volta, and then AMD changed their mind late 2016 when it became clear that wasn't gonna happen lol.


----------



## Steevo (Aug 15, 2017)

bug said:


> I don't know, after two years, in which time AMD effectively dismissed high end last year because apparently that's not where the money is, now AMD returns to the high end to give me... a slightly worse GTX 1080? Call that biased, but I just don't see this as a 86% card.




I think they were expecting more from process-node development than they got, and Vega was the result, coupled with trying to make a one-size-fits-all chip, much like Tonga performed worse than Tahiti.

AMD has had, and still has, a serious fabrication issue; now it's more about commitments to MS and Sony, how much they needed Zen to work (and they seemed to hit just off target with it), and mining drying up their Polaris dies. It's a great position they are in, but they just seem to keep screwing it up just enough that no one takes them seriously, and they are clawing for market share by offering these at such low MSRPs while allowing retailers to gouge the public in favor of cryptominers. 

For example, if they had come out and said Vega is a compute chip from the get-go, designed to use larger memory sizes for mining and many other tasks... but instead they keep pitching it as a great gaming card, despite making close to nothing by the time they pay for the silicon, interposer, and HBM. There has to be a reason the boards are designed so hardy, and it's not to overclock for gaming. Tile-based rendering only helped some, and if they have been polishing the drivers with final silicon for the last few months, they probably have a relatively short laundry list of bugs that won't massively improve performance. 



efikkan said:


> So, we are still making up excuses?
> No, Vega10 is the gaming chip. The compute chip is known as Vega20 and is coming next year.



Where is an excuse? The card performs exactly where they said it would, and its just as mediocre as I thought it would be. Same shit AMD always pulls, they could have made way more money off this release by pricing them twice as high and calling them mining cards with how they are designed and built, instead they used poor marketing and allow retailers to screw buyers. Things like Rapid Packed Math can be used for shader function, but that requires more money and developer interaction than AMD will put in, but it can be used for compute and mining, the whole architecture is compute based, with deeper pipelines and what appears to be more CPU heavy driver interaction. The only redeeming features appear to be software things they could have given to Polaris cards and made them worth more as gaming cards.


----------



## efikkan (Aug 15, 2017)

Steevo said:


> Where is an excuse? The card performs exactly where they said it would, and its just as mediocre as I thought it would be.



I was referring to:


Steevo said:


> Vega is a compute chip that can also play some games.


We've heard this excuse many times the last two months, ever since people started to realize it would suck at gaming. Many have said things along the lines of "since it sucks in gaming, it was never meant for gaming". But no, Vega10 is primarily targeting gaming.

And it didn't quite land where AMD said it would. Where is the 4× performance per watt?



Steevo said:


> Same shit AMD always pulls, they could have made way more money off this release by pricing them twice as high and calling them mining cards with how they are designed and built, instead they used poor marketing and allow retailers to screw buyers.


Mining has nothing to do with it, that's just another excuse.


----------



## B-Real (Aug 15, 2017)

On the Amazon best-selling CPU list, the Ryzen 1700x jumped to the 2nd position (from maybe a top20 or a bit above). Guess why.


----------



## EarthDog (Aug 15, 2017)

It's cheap.


----------



## Steevo (Aug 15, 2017)

efikkan said:


> I was referring to:
> 
> We've heard this excuse many times the last two months, ever since people started to realize it would suck at gaming. Many have said things along the lines of "since it sucks in gaming, it was never meant for gaming". But no, Vega10 is primarily targeting gaming.
> 
> ...




When is it not an excuse and instead a clear reflection of reality? 

AMD has known for years that compute was big money, watched Nvidia, and put their money in that pot. 

I put my money on AMD stock and pulled out before Vega and made money. Is that an excuse?


----------



## B-Real (Aug 15, 2017)

EarthDog said:


> It's cheap.



As I said, the 1700X was always around the top20 or a bit outside of it. On the day of the Vega launch, it jumped to 2nd place...


----------



## Agony (Aug 15, 2017)

Take the 1080Ti, put on an "AMD La Vega 72" sticker, sell it a year later a little cheaper and a little warmer, and everyone is happy....


----------



## AsRock (Aug 15, 2017)

qubit said:


> So another meh graphics card from AMD that only matches the performance of the year old GTX 1080 with lots of power draw and noise, especially irritating coil whine. I'll pass. Also, dunno why they bother with that expensive HBM, when GDDR5X does just fine.
> 
> Let's hope AMD can finally leapfrog NVIDIA sometime not too long in the future and give us some competition.



HAHA, then the power usage would be even higher?


----------



## wolf (Aug 15, 2017)

From my perspective this is too little, too late from old mate AMD. There are compelling aspects to this card, like price/performance and mining, but too many drawbacks, like heat, power, and being so late to bring this performance level to the playing field.

GTX1080 owners have had this performance available to them for over 14 months now, and running cooler, using less power.

I really wanted AMD to nail this one, but I fear it just cements them as ~1.5 generations behind still (more specifically looking at being able to be the top dog, and having a sound, efficient architecture). The way the landscape looks right now they will not be able to compete well with the outright performance of a new Volta card (considering they can't compete with a 1080Ti), presuming that typically the "xxx80" card is marginally faster than the previous gen's Ti (making say a GTX1180 faster than a 1080Ti with lower power consumption), let alone do it in the ballpark of performance per watt.

They need to Ryzen their GFX segment, back to the drawing board, complete overhaul.

For the meantime we get some competitive prices and shake up in this segment, and that's better than nothing, especially for those invested in freesync.


----------



## johnnyfiive (Aug 15, 2017)

The only reason these cards are compelling is if you happen to have a FreeSync monitor.


----------



## sweet (Aug 15, 2017)

johnnyfiive said:


> The only reason these cards are compelling is if you happen to have a FreeSync monitor.



I have actually been holding off on buying one because no AMD card could properly handle the FreeSync range at 4K. This Vega release will enable more sales than you expect, mate.


----------



## ViperXTR (Aug 15, 2017)

johnnyfiive said:


> The only reason these cards are compelling is if you happen to have a FreeSync monitor.


I have a FreeSync monitor, though I also have an over-a-year-old 1070. Back then I was thinking of selling it because of the mining craze and getting myself a Vega 64, but the savings from purchasing a FreeSync monitor will be nulled by buying a new power supply. I don't want my ancient PSU to run this power-hungry beast :O


----------



## sebabal (Aug 15, 2017)

I am sorry, but the conclusion of this review does not make any sense. How can this card be highly recommended when it then reads...

_"Price-wise, the Radeon RX Vega 64 clocks in at $499, which is not unreasonable. *It is basically priced the same as the GTX 1080, which does offer much better power/heat/noise levels."*_

Basically, the pricing IS PRETTY MUCH UNREASONABLE, although I understand the reason: just look at the GPU die size; it must cost an arm and a leg to produce those chips. I mean, it is even bigger than the 1080Ti's and still can't match it.

Just stay away from this card; either get a 1080 (you can even get pretty good deals on eBay since the card is like 18 months old) or wait for Volta in a few months.


----------



## xkm1948 (Aug 15, 2017)

I wonder how long it will take before RTG starts to use AI to design their next chips. You know, limit the input conditions and let it run millions of possible scenarios for performance/power consumption. In the end the design engineers will just sit there and choose what looks best to them



johnnyfiive said:


> The only reason these cards are compelling is if you happen to have a FreeSync monitor.



Or need a personal space heater during winter and too lazy to turn on the A/C


----------



## squallheart (Aug 15, 2017)

Ravenas said:


> This card is not designed to compete with the 1080 Ti. It is designed to compete with the 1080. Any research put towards AMD's statement, or W1zzard's statement at the end of his review will reveal this.
> 
> It holds its own against the 1080, and then some. Power draw is slightly higher, but on the tune of around 20 W higher... That's really going to hurt your electrical bill.
> 
> Great review W1zzard.



I have always felt that the human mind is fascinating and one thing in particular is cognitive dissonance.

In order to assert your claim that the power draw is "slightly higher", not only did you have to compare under the most favorable conditions with the smallest differences, but you also based the calculation on power consumption in power-save mode with its extra-low power draw. All the benchmarks have demonstrated that in this power-save mode, performance is not equal to the 1080.

Just how much mental gymnastics do you have to do in your head to say that power draw _"is slightly higher, but on the tune of around 20 W higher"_ with a straight face?

It's pretty obvious here (https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_64/29.html) that the power consumption is significantly higher than that. Cognitive dissonance is a fascinating thing indeed.

Oh, and FYI, I am a fan of neither Team Red nor Team Green. In fact, I was hoping Vega would be better, because that would force Nvidia to also offer more competitive products, which ultimately benefits consumers.


----------



## xkm1948 (Aug 15, 2017)

Dug through a whole lot of reviews. Here is what I get:

1. Most of the extra transistors were used on improving frequency 

2. RTG sacrificed efficiency (IPC) from Fiji to improve frequency 

3. Everything else remained relatively the same.

4. GloFo manufacturing simply cannot tame the power consumption at such high frequency.


So in the end, RTG was actually hoping to design a super pumped-up Fiji while keeping the major components (4 Async Engines, 64 CUs, ROPs) the same as Fiji. Unfortunately this resulted in horrible efficiency. So RTG kept pumping up the clocks, hoping it would at least look good in performance numbers. However, the release of the 1080Ti and Titan Xp crushed that hope as well. Just like Faildozer, GCN is simply too outdated for the current crop of applications. Simply pumping up MHz won't save it. RTG badly needs a fresh new design.


----------



## Frick (Aug 15, 2017)

And the Vega 64 is a bad overclocker. Again the Vega 56 is the decent card of the two.


----------



## Captain_Tom (Aug 15, 2017)

sebabal said:


> I am sorry but the conclusion on this review does not make any sense, how can this card be highly recommended but then it reads...
> 
> _"Price-wise, the Radeon RX Vega 64 clocks in at $499, which is not unreasonable. *It is basically priced the same as the GTX 1080, which does offer much better power/heat/noise levels."*_



I actually don't think that's fair.  Depending on the games you play (and somewhat on the resolution), the Vega 64 can easily be an average of 10%+ stronger than the 1080.   Then factor in FreeSync, multiple use cases (mining, compute, professional programs), and the enormous amount of technology baked into the card that will undoubtedly yield some decent performance gains over time, and you have a card that I would say _is_ a better choice if priced a tad below the 1080 (or even at the same price).

Not to mention Freesync makes Nvidia straight up not an option for many people.   For instance I have some friends that look at the $200 troll toll for G-Sync and how horrifically bad the 780 aged, and they just will not buy an Nvidia card for at least another generation.


----------



## Captain_Tom (Aug 15, 2017)

Frick said:


> And the Vega 64 is a bad overclocker. Again the Vega 56 is the decent card of the two.



I have actually read the exact opposite TBH.


----------



## dwade (Aug 15, 2017)

Every generation AMD lowers the bar. If this continues, a maxed-out Navi will only compete with the GTX 2060.


----------



## _Flare (Aug 15, 2017)

@Captain_Tom , first, I wrote about the "all resolutions" graph, and yes, I put it the wrong way round for the HD 5870. It should be: the GTX 285 was about 17% slower (100% vs. 83%); that's correct.


----------



## BiggieShady (Aug 15, 2017)

xkm1948 said:


> I wonder how long it will take before RTG start to use AI to design their next chips. You know limit input conditions and let it run millions of possible scenario in terms of performance/power consumption. In the end the design engineers will be just sitting there and choose what looks best to them


That's not really all that conclusive: the Bulldozer die was laid out by an automated process, and the Zen die was laid out by hand ... input conditions are already limited (you ask the AI for max efficiency, min latency, and max stable clock); however, the variations in transistor layout form a really huge output space, so there's not much help from standard AI like neural networks or genetic algorithms.


----------



## uuuaaaaaa (Aug 15, 2017)

B-Real said:


> On the Amazon best-selling CPU list, the Ryzen 1700x jumped to the 2nd position (from maybe a top20 or a bit above). Guess why.



RX Vega bundles?


----------



## Bjorn_Of_Iceland (Aug 15, 2017)

You people talking about Freesync / Gsync like it is a huge leap in your gaming experience, where in fact it is not. It pretty much feels just like regular sync really. Of course, I am talking about if you are consistently getting that high fps on your screen (which pretty much is the case if you have that shiny high end GPU inside). Gsync / Freesync is overrated.


----------



## Liviu Cojocaru (Aug 15, 2017)

Bjorn_Of_Iceland said:


> You people talking about Freesync / Gsync like it is a huge leap in your gaming experience, where in fact it is not. It pretty much feels just like regular sync really. Of course, I am talking about if you are consistently getting that high fps on your screen (which pretty much is the case if you have that shiny high end GPU inside). Gsync / Freesync is overrated.


For me GSync is a very big plus


----------



## BiggieShady (Aug 15, 2017)

Bjorn_Of_Iceland said:


> Gsync / Freesync is overrated.


I guess it's useful if you want a playable game on a 4K screen.


----------



## Tatty_One (Aug 15, 2017)

Just checked availability and pricing in the UK (Scan): the cheapest Vega 64 is £110 more expensive than the cheapest 1080, and those have aftermarket cooling. I am seriously hoping these prices drop once things have settled.


----------



## Prima.Vera (Aug 15, 2017)

Bjorn_Of_Iceland said:


> You people talking about Freesync / Gsync like it is a huge leap in your gaming experience, where in fact it is not. It pretty much feels just like regular sync really. Of course, I am talking about if you are consistently getting that high fps on your screen (which pretty much is the case if you have that shiny high end GPU inside). Gsync / Freesync is overrated.


Without G-Sync, on my 3440x1440 monitor anything below 60 fps would become a stutter party. With G-Sync enabled I can play even at 30 fps with maximum fluidity.


----------



## Bjorn_Of_Iceland (Aug 15, 2017)

Indeed, it is smooth and stutter free. However, imo, people can live without it.

Personally, I have gsync turned on all the time even with competitive games like Overwatch and PUBG.






But to opt for a power-inefficient and hot card just for the sake of gaming on a Freesync (or G-Sync, as the case may be) monitor is not worth it imo.


----------



## Imsochobo (Aug 15, 2017)

Bjorn_Of_Iceland said:


> Indeed, it is smooth and stutter free. However, imo, people can live without it.
> 
> Personally, I have gsync turned on all the time even with competitive games like Overwatch and PUBG.
> 
> But to opt for a power inneficient and hot card just for the sake of gaming on a freesync is not worth it imo.



You should try Enhanced Sync with Freesync, or just Enhanced Sync on its own.
I tested "Chill" alongside it, and I am very impressed with the trio.

We also tested some die-hard G-Sync fans with Enhanced Sync, and there was not a peep from those "lag" complainers.
Specifically with Enhanced Sync and Freesync, it seems AMD has it sorted. Their hardware needs drivers and optimizations from game devs to tap into the new features and methods one can use for performance, or maybe that never happens.
Or better: make hardware that fits the gaming market better.

I am pro AMD's supporting features, not their hardware; on the latter I'd rather not give my honest opinion.

Also, Wattman seems to be a nightmare.


----------



## arbiter (Aug 15, 2017)

As much as people like to think this is a miner's delight of a card, I kinda wonder if that's the case given the MH/s per watt. For example, a Vega 64 can draw up to 300 W, which is probably what it takes to get that 33 MH/s, yet at half that power a 1070 is listed at 26.5 MH/s. You could get two 1070s: yes, it costs more, but you get 53 MH/s for the same power envelope. So at stock, at least, the 64 might be a problem for miners; the 56 might be the one they go for instead.
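The hash-per-watt argument above works out with quick arithmetic. A minimal sketch; the 33 MH/s and 26.5 MH/s figures are the rough numbers quoted in this thread, not measured benchmarks:

```python
# Rough hash-per-watt comparison using the figures quoted above.
# These are thread estimates, not measured benchmarks.
cards = {
    "Vega 64": {"watts": 300, "mhs": 33.0},
    "GTX 1070": {"watts": 150, "mhs": 26.5},
}

# Efficiency of each card in MH/s per watt
for name, c in cards.items():
    print(f"{name}: {c['mhs'] / c['watts']:.3f} MH/s per watt")

# Two 1070s fit in the same ~300 W envelope as one Vega 64:
dual_mhs = 2 * cards["GTX 1070"]["mhs"]
dual_watts = 2 * cards["GTX 1070"]["watts"]
print(f"2x GTX 1070: {dual_mhs} MH/s at ~{dual_watts} W")
```

On these numbers the 1070 lands around 0.177 MH/s per watt versus roughly 0.110 for the Vega 64 at stock, which is the poster's point.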


----------



## efikkan (Aug 15, 2017)

Steevo said:


> When is it not an excuse and instead a clear reflection of reality?


You were the one claiming "Vega is a compute chip that can also play some games", while Vega10 is their gaming chip.



Bjorn_Of_Iceland said:


> You people talking about Freesync / Gsync like it is a huge leap in your gaming experience, where in fact it is not. It pretty much feels just like regular sync really. Of course, I am talking about if you are consistently getting that high fps on your screen (which pretty much is the case if you have that shiny high end GPU inside). Gsync / Freesync is overrated.


Then you're doing it wrong.
G-Sync is the greatest improvement in gaming for years. Smooth gaming is more important than framerate or resolution.


----------



## jdubo (Aug 15, 2017)

AMD, what is the point?  You can't even buy one because they are all sold out.  The in-stock prices are $100 more than announced, so a Vega 64 Liquid is $699.  I'm not sure why you would spend 700 bucks on a Vega 64 Liquid when you can get a 1080 Ti for 750 bucks.  When will it actually be available to purchase and play games with?


----------



## jabbadap (Aug 15, 2017)

efikkan said:


> @W1zzard
> Can you please explain your scale?
> How can a terrible product like this get a 8.6 out of 10? And how bad does a product have to be to get something like a 5? What is the point of a scale 1-10 when there is barely any difference from a bad card like this vs. a good card like GTX 1080 (which got 9.1).
> 
> ...



Well, a little late to respond, but I just went through some old reviews and saw your quite valid question. The old mighty "fermistor" GTX 480 got 8.2, which was not really that bad compared to the competition (which got 9.5). The GTX 590 blew up and got 7.0. And the fixed-Fermi GTX 580 got 9.0, while its direct competitor reached a low point of 8.0...

All in all, W1zzard is usually quite consistent with scores. E.g. he gave the GTX 1080 FE a 9.1, which is higher than what this card gets (8.6). And then there's the score difference between reference and custom cards. The highest score for a custom GTX 1080 went to the Zotac GTX 1080 AMP! Extreme at 9.9 (which I don't really get, with that crappy VRM cooling it has). I'm quite confident there will be a custom RX Vega⁶⁴ card that scores higher than the GTX 1080 FE, and for good reason too: better cooling, no price premium, and better performance, so a better card all together (power and heat aside, but those can be managed).


----------



## Jeffredo (Aug 15, 2017)

This card is of no use to gamers.  It just matches the competition's card, which has been out for a year, and does so only while consuming 125 W+ more power.  Still, miners will snatch up every one they can make, so I guess it's still a win for AMD.


----------



## SPLWF (Aug 15, 2017)

I had a chance to purchase an RX Vega 64 at Microcenter, but I declined and told them to sell it to someone else.  I know heat is always an issue with reference cards, but even if I sold an RX Vega 64 for $200-$300 more than MSRP, that $200-$300 isn't going to get me ahead in life.  I quite frankly don't need the money.  I will wait for the AIB cards, like I mentioned in other posts.


----------



## EarthDog (Aug 15, 2017)

Miners aren't going to touch the V64... too much power draw for the current hash rate.


----------



## efikkan (Aug 15, 2017)

xkm1948 said:


> 1. Most of the extra transistors were used on improving frequency
> 2. RTG sacrificed efficiency (IPC) from Fiji to improve frequency
> <cut>
> 4. GloFo manufacturing simply cannot tame the power consumption at such high frequency.


1 and 2 make no sense. Making the cores more complex will require higher voltages, which in the end will limit the frequency. Regarding 4, the problem is not the node: AMD had the same problems at 28 nm, when they were using the same node as Nvidia. Vega is power limited; since its circuit design is less efficient, it needs more power to make the transistors respond quickly enough to maintain stability, which in the end results in high energy consumption and throttling.


----------



## hapkiman (Aug 16, 2017)

Why would I buy this Vega card when, for ~$499, I can get a used GTX 1080 on eBay which equals it and saves me about $100 a year in electricity (if not more)? (I actually saw a different Gigabyte 1080 for $475.)

I'm all for an AMD comeback riding on the coat tails of Ryzen's success, but this is like arriving too late to a party.  With warm beer.  

As for the Miners.  Go for it.  Not interested.

http://www.ebay.com/itm/GigaByte-Wi...357924&hash=item440974e951:g:aXUAAOSwgxxZk2zR
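The yearly electricity figure being thrown around can be sanity-checked with a back-of-envelope formula. The ~125 W delta is the one cited in this thread; the hours per day and price per kWh below are purely illustrative assumptions, not numbers from any post:

```python
# Back-of-envelope yearly cost of an extra power draw.
# extra_watts ~125 W is the delta cited in this thread;
# hours_per_day and usd_per_kwh are illustrative assumptions.
def yearly_cost(extra_watts, hours_per_day, usd_per_kwh):
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

# Moderate use at a typical US rate: a couple dozen dollars a year
print(round(yearly_cost(125, 4, 0.13), 2))
# Heavy use at a high European rate: closer to the $100 figure
print(round(yearly_cost(125, 8, 0.30), 2))
```

So whether the "$100 a year" claim holds depends heavily on how many hours you game and what you pay per kWh.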


----------



## xkm1948 (Aug 16, 2017)

efikkan said:


> 1 and 2 makes no sense. Making the cores more complex will require higher voltages which in the end will limit the frequency. Regarding 4, the problem is not the node. AMD had the same problems when they were using the same node as Nvidia for 28nm. Vega is power limited, since its circuit design is less efficient, it needs more power to make the transistors respond quickly enough to maintain stability, which in the end results in high energy consumption and throttling.



My source: http://www.anandtech.com/show/11717/the-amd-radeon-rx-vega-64-and-56-review/2

Regarding not expanding the actually important part, the CU arrays:



			
Anandtech said:
			
		

> Talking to AMD’s engineers about the matter, they haven’t taken any steps with Vega to change this. They have made it clear that 4 compute engines is not a fundamental limitation – they know how to build a design with more engines – however to do so would require additional work. In other words, the usual engineering trade-offs apply, *with AMD’s engineers focusing on addressing things like HBCC and rasterization as opposed to doing the replumbing necessary for additional compute engines in Vega 10*
> 
> Not shown on AMD’s diagram, but confirmed in the specifications, is how the CUs are clustered together within a compute engine. On all iterations of GCN, AMD has bundled CUs together in a shader array, with up to 4 CUs sharing a single L1 instruction cache and a constant cache. For Vega 10, that granularity has gone up a bit, and now only 3 CUs share any one of these cache sets. As a result there are now 6 CU arrays per compute engine, up from 4 on Fiji.



Regarding the extra transistors


			
Anandtech said:
			
		

> That space is put to good use however, as it contains a staggering 12.5 billion transistors. This is 3.9B more than Fiji, and still 500M more than NVIDIA’s GP102 GPU. So outside of NVIDIA’s dedicated compute GPUs, the GP100 and GV100, Vega 10 is now the largest consumer & professional GPU on the market.
> 
> Given the overall design similarities between Vega 10 and Fiji, this gives us a very rare opportunity to look at the cost of Vega’s architectural features in terms of transistors. Without additional functional units, the vast majority of the difference in transistor counts comes down to enabling new features.
> 
> Talking to AMD’s engineers, what especially surprised me is where the bulk of those transistors went; the single largest consumer of *the additional 3.9B transistors was spent on designing the chip to clock much higher than Fiji.* Vega 10 can reach 1.7GHz, whereas Fiji couldn’t do much more than 1.05GHz. Additional transistors are needed to add pipeline stages at various points or build in latency hiding mechanisms, as electrons can only move so far on a single (ever shortening) clock cycle; this is something we’ve seen in NVIDIA’s Pascal, not to mention countless CPU designs. Still, what it means is that those 3.9B transistors are serving a very important performance purpose: allowing AMD to clock the card high enough to see significant performance gains over Fiji.


----------



## jabbadap (Aug 16, 2017)

hapkiman said:


> Why would I buy this Vega card when I can get a used GTX 1080 on eBay which equals it, and saves me about $100 a year in electricity (if not more) for ~$499 (actually saw a diff Gigabyte 1080 for $475).
> 
> I'm all for an AMD comeback riding on the coat tails of Ryzen's success, but this is like arriving too late to a party.  With warm beer.
> 
> ...



At least you might get it that cheap. Here in Europe it costs 650€ for vanilla blower one, while you can get custom gtx1080ti for 699€ and custom gtx1080 for under 550€...


----------



## Bjorn_Of_Iceland (Aug 16, 2017)

jabbadap said:


> At least you might get it that cheap. Here in Europe it costs 650€ for vanilla blower one, while you can get custom gtx1080ti for 699€ and custom gtx1080 for under 550€...


Vega would probably be in the same overpriced scenario.


----------



## Imsochobo (Aug 16, 2017)

Bjorn_Of_Iceland said:


> Vega would probably be in the same overpriced scenario.



I bought an RX 64 for 1070 money here in Norway.
The next day, the RX 64 was at almost 1080 Ti price (I have no idea why, really)...


----------



## photonboy (Aug 16, 2017)

Enhanced Sync:
This contains NOTHING that NVidia does not have, though I think the article implies it does at the end. At best it works the same. There are however Freesync monitors with no LFC whereas GSYNC always supports this.

Enhanced Sync is equivalent to NVidia's:
FAST SYNC + Adaptive VSync + GSync (again with the LFC caveat)

(Adaptive VSync is VSYNC ON and OFF automatically. It is not used if GSYNC is working. Same on Freesync)

AMD did a video where they made it sound simple. For example, they talked about the "last frame created" once you go over the top of Freesync range (so FAST SYNC) but what they failed to mention is that it is pointless to do that unless you can generate over 2x the FPS otherwise you never get a 2nd frame that allows you to drop the first frame. For example on a 144Hz monitor with a worst-case 48Hz to 90Hz Freesync range it is:

0 to 48FPS (VSYNC OFF or ON; thus screen tear or stuttering)
48FPS to 90FPS (FASTSYNC; ideal tear-free, minimal lag zone)
90FPS to 144FPS (VSYNC OFF or ON; if VSYNC ON then stutter as you are not synching to 144Hz)
144FPS (VSYNC ON; if chosen by default no screen tear, not much latency)
144FPS to approx 300FPS (VSYNC OFF needed; screen-tear may or may not be obvious but "twitch" shooters may prefer to 144FPS VSYNC ON)
300FPS+ (if you choose "Enhanced" I guess it doesn't cap but works like FAST SYNC so no screen tear as it draws 144FPS but draws only the latest full frame. So similar to 144FPS VSYNC ON but slightly less lag. Very, very minor so only the very BEST twitch shooters could tell)

*See how CONFUSING that setup is (again, a worst case, but the Freesync range can be hard to find). On a 144 Hz G-Sync monitor it is always THIS, or close:

0 to 30FPS (each frame doubled to stay in GSYNC mode. So 29FPS becomes "58FPS")
30FPS to 144FPS (GSYNC)

.. above 144FPS options same as Freesync

**Again though, some of the good Freesync monitors are close. Some are 40Hz to 144Hz so you have a nice range and the driver supports LFC so drops below 40FPS are not a big deal.
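The worst-case range breakdown above can be written as a small lookup. A sketch only: the band edges follow the post's 48-90 Hz example on a 144 Hz panel, and the labels are my shorthand paraphrase of the post's bands (the in-range band is what the post calls the "FASTSYNC" zone), not driver settings:

```python
# Which sync behavior applies at a given FPS, per the worst-case
# 48-90 Hz Freesync range on a 144 Hz panel described above.
# Band edges and labels paraphrase the post; nothing official.
def sync_mode(fps, range_lo=48, range_hi=90, refresh=144):
    if fps < range_lo:
        return "below range: VSYNC ON/OFF (tearing or stutter)"
    if fps <= range_hi:
        return "in range: adaptive sync (tear-free, minimal lag)"
    if fps < refresh:
        return "above range: VSYNC ON/OFF (stutter if ON)"
    return "at/above refresh: VSYNC ON or uncapped FAST SYNC-style"

for fps in (30, 60, 120, 200):
    print(fps, "->", sync_mode(fps))
```

Walking a few FPS values through the function makes the post's complaint concrete: on this monitor the behavior changes four times as the frame rate climbs.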

GSync:
May cost more, but it is SUPERIOR overall. There are some Freesync monitors with LFC, as discussed, that are very good, but it is hit and miss. Even the better ones may not be able to maintain color/blur as well, since they use OVERDRIVE, which is problematic with changing frame times (unless you, for example, add a hardware module to help with that).

FREESYNC 2/HDR:
This makes it closer to GSYNC by requiring LFC support, but with variable frame times and a wider range of colors/brightness due to HDR, it is much harder to make this work. The price may be a big jump up from normal Freesync, whereas on G-Sync 2 the add-on module should reduce the monitor manufacturers' R&D considerably; if they can get the price of the modules down, GSYNC and FREESYNC 2 should get closer in price, with GSYNC 2 likely to remain superior.

OTHER:
My main issue with the REVIEW, which was mostly excellent, is that I saw no reference to what a top-end GTX 1080 could do, or even which card was used. Later we need to compare two Asus Strix (3-fan) models, GTX 1080 vs. VEGA64, and see how they do in terms of performance and noise. Liquid cooling seems mostly pointless if it costs close to a GTX 1080 Ti that beats it in every way.

GAMERS NEXUS noted that the VEGA56 has a voltage/power cap which is currently impossible to overcome, but there does appear to be enough headroom left to nearly MATCH the VEGA64 (or at least the air-cooled VEGA64, which has temperature throttling).


----------



## photonboy (Aug 16, 2017)

RX-VEGA my two cents:
After looking at other reviewers with different results, it appears it is best to conclude that VEGA56 is nearly identical to the GTX1070 and VEGA64 close to the GTX1080 on AVERAGE.

I expect VEGA to age better due to FP16, ACE, and due to the fact that the basic GCN architecture is in the consoles.

Now, many people said "so what, that's in the FUTURE and by then... blah blah". Well, a lot of people buy a graphics card and keep it for 3+ years, so guessing how the card will age is very important. I have seen the "FINE WINE" info for AMD vs. Nvidia before and was not really impressed, but I do think it is completely DIFFERENT now, because the software never really had a chance to optimize for GCN before, since DX12/Vulkan was required to implement the best features.

But... on the other hand NVidia tends to do a better job at more timely drivers.

Power (HEAT) is another issue. In particular for the VEGA64 since I can NOT use that card in my small room as the room temperature would be far too hot. An extra 100Watts or so makes a HUGE difference. VEGA56 is more reasonable though I'd still get something like an Asus Strix.

VEGA64 solves the heat issue (update: I mean temperature issue not heat) with liquid cooling but then charges so much that you should just get a GTX1080Ti instead.

None of this matters for cheaper VEGA64 and VEGA56 unless the PRICE is right and that may be a big issue until mining is no longer an issue AND stock is sufficient that resellers don't overcharge.

*So in general there are pros and cons, but I think VEGA56 in particular will be the best value mostly due to its FUTURE improvements relative to the GTX1070, and assuming the price is nearly IDENTICAL to a GTX1070 with the same cooler.

FEATURES: most people don't use extra features but it should be looked at if interested. How well does RECORDING compare, or features like ANSEL for 2D and 3D screenshots (in only a few titles so far). There is also VR SUPPORT and frankly I don't know how they compare. AMD's asynchronous architecture in theory should be better but NVidia tends to do better with their software support.

AMD has been improving in software quite a bit, to the point where they MATCH Nvidia most of the time, but I wouldn't say they are quite as good yet.

If an Asus Strix VEGA56 card was priced at roughly $450USD today that would be an excellent buy IMO.

(I do not see any advantage to having the HBCC for gaming unless the game needs more than 8GB, and also can't normally swap the data around in a timely fashion. HBM2 though does appear to help at higher resolutions, though possibly not enough to justify the cost since AMD could probably have dropped prices more so the VALUE proposition might have been better with say GDDR5x instead of HBM2 and maybe a $349 RX-VEGA56 MSRP)


----------



## Steevo (Aug 16, 2017)

photonboy said:


> RX-VEGA my two cents:
> After looking at other reviewers with different results, it appears it is best to conclude that VEGA56 is nearly identical to the GTX1070 and VEGA64 close to the GTX1080 on AVERAGE.
> 
> I expect VEGA to age better due to FP16, ACE, and due to the fact that the basic GCN architecture is in the consoles.
> ...





photonboy said:


> Enhanced Sync:
> This contains NOTHING that NVidia does not have, though I think the article implies it does at the end. At best it works the same. There are however Freesync monitors with no LFC whereas GSYNC always supports this.
> 
> Enhanced Sync is equivalent to NVidia's:
> ...



Why should I take any advice from someone 

1) That can't figure out how to NOT double post?
2) That doesn't have the card to see how it works compared to the competition
3) That only has theory and a wall of text explaining their uninformed ideas.

W1zz has the card, tried the features, and found them to be better than the Nvidia implementation. 

As to your "overdrive" idea: overdrive is OVERCLOCKING the pixel clock, causing the refresh rate to go above spec, which has nothing to do with below-refresh-rate Freesync. My TV has 6 Gb of memory and an AMD chip in it, and I only experience tearing if I don't turn on max frame rates, as it already interleaves frames adaptively. I don't think you understand how "overdrive" works vs. just syncing frames to the front or back porch signal http://lmgtfy.com/?q=HDMI+front+porch , and the fact that many TVs already perform interleaving of frames (2:3 pulldown) on 24 Hz sources to prevent backlight and frame-flickering issues; all you have to do is turn the feature on, and even low frame rates don't cause stuttering or tearing. Freesync was just an extension of that and used technology already in use.


----------



## BiggieShady (Aug 16, 2017)

Vega irony is that if it was good for gamers it would be perfect for miners. Either way no gamers would use it.


----------



## jabbadap (Aug 16, 2017)

Bjorn_Of_Iceland said:


> Vega would probably be in the same overpriced scenario.



Hmm, not sure if you understood my point: the RX Vega⁶⁴ is that 650€ card, while I was talking about Nvidia customs. Well, I could have been more precise.

Just one request for @W1zzard, if you don't mind. I think I requested a more VPU-taxing power test than Blu-rays some time ago. Would it be possible to start testing some HEVC 4K movies/trailers to see the power usage of those? I don't think AMD graphics cards support 4K Netflix yet, so testing with that might be asking too much. But at least all current-gen cards should support 4K HEVC video by now.


----------



## _Flare (Aug 16, 2017)

@W1zzard you have a lot of data; can you confirm that the GTX 980 (which is about 10% more efficient than the 980 Ti) is, at 1080p, about as efficient as the RX Vega 64 "@Std. BIOS power-save"?
So AMD finally managed to build a card better than Nvidia's good old Maxwell: "faster than the Maxwell 980 Ti and as efficient as a GTX 980". Congrats AMD, it was about time ... sadly it's way too expensive, and all serious gamers will have to buy OCed GTX 1080s for 100-150 US$ or euros less.

Good luck, AMD, with switching away from efficient gaming (if it's not a console).


----------



## dalekdukesboy (Aug 17, 2017)

Vya Domus said:


> It's a purely formal matter , for gaming all they have to do is make it clear that they are not out of the game. That's all Vega is , a reminder they are still active in the high end gaming market. Trust me Vega is for the datacenter.



Do you think AMD, with drivers/updates/tweaks etc., can get this well ahead of the 1080 and near 1080 Ti territory? That's the only way I see it worth buying. The temps, even controlled by a great cooler, are a drawback, but the power consumption really sucks, pun intended. It may not matter to some, but being inefficient, burning extra electricity, and turning a room into a swelterbox in summer while gaming/number crunching is a turn-off for some.


----------



## hapkiman (Aug 17, 2017)

dalekdukesboy said:


> Do you think AMD with drivers/updates/tweaks etc can get this well ahead of 1080 and near 1080ti territory? That's only way I see it worth buying, the temps even controlled by great cooler a drawback but the power consumption really sucks, pun intended. May not matter to some, but if nothing else just being inefficient and burning etc electricity and turning a summer room into a swelterbox while gaming/number crunching etc is a turn off for some.



Speculation at this point.  But will it get well ahead of a 1080?  I don't think so.  But it will certainly equal it, and beat it in some games, and lose in others.  I believe it will eventually be on par with the 1080.  Surpass it completely?  Doubtful.

But a Vega 64 "near 1080Ti territory" (as you put it).  No- highly unlikely.  The 1080Ti is considerably more powerful than the Vega64.


----------



## xkm1948 (Aug 17, 2017)

Funny thing, Primitive Shader Discard works perfectly. As a matter of fact, the higher you overclock, the more stuff it discards, like the stuff that is not supposed to be discarded.  Check out the @buildzoid video, starting around 26:00.

So yeah, overclock Vega and you start to lose details in games.


----------



## _Flare (Aug 17, 2017)

So a black screen through OC can be misinterpreted with Vega: it could just be discarding EVERYthing, resulting in a fully black screen and saving a lot of render work and power. It's a WIN, isn't it?


----------



## buildzoid (Aug 17, 2017)

xkm1948 said:


> Funny thing, Primitive Shader Discard works perfectly. As a matter of fact, the higher you overclock the more stuff it discards, like the stuff that are not supposed to be discarded.  Check out @buildzoid video, starting around 26:00
> 
> So yeah, overclocking VEGA and you start to loose details in games.



No, it glitches out when clocked too high. I've seen the issue a few times, but I can't get it to occur consistently.


----------



## LTUGamer (Aug 17, 2017)

Extremely high power draw

Lots of fan noise

High temperatures

Throttling

Coil noise at high FPS
Still "highly recommended". Lol


----------



## Octopuss (Aug 17, 2017)

I'm thinking I might  buy one of the next five Vega rebrands. You know, Vega 64, 74, 84,... Whatever hardware bugs would likely be fixed by then and power draw might be a little lower.


----------



## Vayra86 (Aug 17, 2017)

Very much as expected: a very crappy release. No OC results because of drivers, even after all the delay. It matches Nvidia's stock clocks, so in reality Vega will consistently land a good 8-10% behind even the 1080.

Too little, too late, too hot, and too power hungry. AMD needs a radical redesign of GCN, as this still looks like the DX11 inefficiency all over again. This is Fury X v2, just as I predicted. Shame.


----------



## xkm1948 (Aug 17, 2017)

AMD may have lied about Vega pricing

https://www.hardocp.com/news/2017/08/16/claims_early_pricing_on_vega_64_56_fall_flat_in_usa

This is like adding insult to injury


----------



## Steevo (Aug 17, 2017)

xkm1948 said:


> AMD may have lied about Vega pricing
> 
> https://www.hardocp.com/news/2017/08/16/claims_early_pricing_on_vega_64_56_fall_flat_in_usa
> 
> This is like adding insult to injury




They should have just doubled its launch price and called it the Vega ME, Mining Edition; they still would have flown off the shelves, and AMD could have at least doubled their money on the few thousand cards they actually had after "building up stock".


----------



## Indurain (Aug 17, 2017)

Without the rebates, which according to some were not mentioned in the review materials, the value proposition changes drastically for all these cards. I am wondering if reviewers will 'do the right thing' and add an addendum to their reviews downgrading the price/performance metrics.

@Steevo from what I read on other sites, as an example, Newegg in the US got fewer than 70 cards that could be sold without the software bundle and at the suggested $499 price. So to say thousands might be a bit of a stretch.

I am so glad that the CPU side of AMD doesn't behave this way.

I was all set to buy a Vega 64 and waterblock, even held up my current build two months waiting on Vega. Yesterday ordered a GTX 1080 Ti since all that was left was the bundles with no hope of ever seeing the $499 or even the $599 price ever again.


----------



## Steevo (Aug 17, 2017)

Indurain said:


> Without the rebates, which according to some there was no mention of in the review materials, the value proposition changes drastically for all these cards. I am wondering if reviewers will 'do the right thing' and add an addendum to their reviews decreasing the price/performance metrics.
> 
> @Steevo from what I read on other sites as an Example Newegg in the US got less than 70 cards that could be sold without the software bundle and at the suggested $499 price. So to say thousands might be a bit of a stretch.
> 
> ...





The profit I made off AMD stock is going to buy me an Nvidia card this time around too. I think they had at least a thousand cards worldwide. I'm debating whether I should get a Zen processor this round, or find a good-clocking generation-old Intel without the memory issues and with better IPC. Since I don't do IT work anymore, and with CUDA support being as good as it is for video processing... does it really matter having extra threads for the 4-5% of the time I might need more than 4 cores or 8 threads?


----------



## EarthDog (Aug 17, 2017)

Intel memory issues? Better IPC last gen? Whaaa? 7700K, man, since you don't need the cores.


----------



## Power Slave (Aug 18, 2017)

xkm1948 said:


> AMD may have lied about Vega pricing
> 
> https://www.hardocp.com/news/2017/08/16/claims_early_pricing_on_vega_64_56_fall_flat_in_usa
> 
> This is like adding insult to injury



Hi, short-time lurker of the forums, long-time news reader though; this is my first post.

That was 2 days ago and based on hearsay that spread rather than facts. AMD gave this response today:

https://videocardz.com/72123/amd-issues-official-statement-regarding-radeon-rx-vega-64-pricing




> Radeon RX Vega 64 demand continues to exceed expectations. AMD is working closely with its partners to address this demand. Our initial launch quantities included standalone Radeon RX Vega 64 at SEP of $499, Radeon RX Vega 64 Black Packs at SEP of $599, and Radeon RX Vega 64 Aqua Packs at SEP of $699. We are working with our partners to restock all SKUs of Radeon RX Vega 64 including the standalone cards and Gamer Packs over the next few weeks, and you should expect quantities of Vega to start arriving in the coming days.




I think the Vega 64 has good performance for the price; however, the heat and the power draw are really high and can't be overlooked. I've got a Nitro Fury right now and am contemplating an upgrade after I see what a Nitro cooler can do on this chip.


----------



## RejZoR (Aug 19, 2017)

xkm1948 said:


> Funny thing, Primitive Shader Discard works perfectly. As a matter of fact, the higher you overclock, the more stuff it discards, including stuff that isn't supposed to be discarded. Check out @buildzoid's video, starting around 26:00.
> 
> So yeah, overclock VEGA and you start to lose details in games.



Don't be stupid, he was talking about LN2. How many of you use LN2 for gaming? No one. Stop spreading FUD.


----------



## Lightofhonor (Aug 19, 2017)

B-Real said:


> On the Amazon best-selling CPU list, the Ryzen 1700x jumped to the 2nd position (from maybe a top20 or a bit above). Guess why.



Because it was the daily deal and was going for under $300.


----------



## Vayra86 (Aug 20, 2017)

xkm1948 said:


> AMD may have lied about Vega pricing
> 
> https://www.hardocp.com/news/2017/08/16/claims_early_pricing_on_vega_64_56_fall_flat_in_usa
> 
> This is like adding insult to injury



Who cares? It's a bad deal regardless. People are going apeshit over a product they'll never buy anyway, because they all own a 1080 Ti or a 1080.

Bottom line, AMD doesn't want to sell this card; it's clear from the timing, the way it launched, its performance/watt/dollar ratios, and the overall marketing surrounding it. I wouldn't be surprised if they're selling HBM at a loss again. Exceeding expectations, they say... rofl. The market's been saturated for at least a year now. GTFO


----------



## Footman (Aug 20, 2017)

Who cares!!! So many haters.....

Bottom line AMD has a high end card again that gives end users a choice. Is Vega perfect? No.

People can buy it or not.........


----------



## xkm1948 (Aug 20, 2017)

If Navi is multiple GCN dies stitched together with IF, it will be another major fail. IF is good tech, but it needs a good core design to work with.



Footman said:


> Who cares!!! So many haters.....
> 
> Bottom line AMD has a high end card again that gives end users a choice. Is Vega perfect? No.
> 
> People can buy it or not.........



What overclock can you get on your Vega FE? Does the Crimson driver work on the FE as well?


----------



## Footman (Aug 20, 2017)

xkm1948 said:


> If Navi is multiple GCN dies stitched together with IF, it will be another major fail. IF is good tech, but it needs a good core design to work with.
> 
> 
> 
> What overclock can you get on your Vega FE? Does the Crimson driver work on the FE as well?



I have a Vega 64 and am working with WattTool to reach a DPM7 state of 1630 MHz, which should be reasonably easy, assuming I don't hit thermal or power limits. That said, so far I have no reliable tool that shows what my actual clock is; all of the reporting software is buggy atm....


----------



## xkm1948 (Aug 21, 2017)

What RTG did here is truly despicable. They intentionally manipulated professional review sites in their favor through fake launch pricing. Multiple sites have caught on to this, and I hope it blows up further. RTG's sketchy move needs to be exposed, as it puts trusted review sites in a really bad position.

https://www.reddit.com/r/Amd/comments/6ux0qn/jayztwocents_ive_been_thinking_about_this_amd/


----------



## Vayra86 (Aug 21, 2017)

xkm1948 said:


> What RTG did here is truly despicable. They intentionally manipulated professional review sites in their favor through fake launch pricing. Multiple sites have caught on to this, and I hope it blows up further. RTG's sketchy move needs to be exposed, as it puts trusted review sites in a really bad position.
> 
> https://www.reddit.com/r/Amd/comments/6ux0qn/jayztwocents_ive_been_thinking_about_this_amd/



Storm in a teacup & AMD 'revolutionary' marketing at work.

Let's move on and forget this ever happened, fast.


----------



## Footman (Aug 21, 2017)

I'll take my class-action money from Nvidia's lies about the memory in their GTX 970s and use it to make up for AMD's lies about Vega pricing.....


----------



## Vayra86 (Aug 21, 2017)

Footman said:


> I'll take my class-action money from Nvidia's lies about the memory in their GTX 970s and use it to make up for AMD's lies about Vega pricing.....



Sounds like a great way to end up losing no matter what


----------



## Footman (Aug 21, 2017)

Vayra86 said:


> Sounds like a great way to end up losing no matter what



Either way the consumer gets screwed; it's not just an AMD issue. Nvidia and Intel are far from innocent!!!


----------



## Artas1984 (Aug 24, 2017)

Well, the EVGA GeForce GTX 1080 SC Gaming I bought (albeit with only a 2-year warranty) is 180 EUR cheaper than the reference Vega 64 while beating it, drawing 190 W max from its single 8-pin connector and making hardly any noise. That I bought the best bang-for-buck high-end card 14 months after its release just shows what kind of regression the AMD video card market has fallen into. The GTX 1080 had no competition when it was released, and it has none now.

Thank you for the review Wiz!


----------



## Footman (Aug 30, 2017)

Just to update this thread, I now have my Vega under water and it is extremely well behaved, assuming it is 'tweaked' appropriately.
Yes, it uses more power than the GTX 1080 (my last video card), and it is no faster than the GTX 1080... but it runs well enough and is smoother than my GTX 1080 in the two games I currently play, Prey and BF1.

For anyone interested, I am using Wattman to adjust global settings.
I have increased the power target by +50%, set DPM6 and DPM7 to 1632 MHz, and undervolted DPM6 from 1150 mV to 1120 mV and DPM7 from 1200 mV to 1120 mV. Full-load temps are a ridiculous 48°C, which is lower than my watercooled GTX 1080! With these settings I am consistently at 1600+ MHz; with the out-of-the-box balanced mode, load frequencies would hover around 1400 MHz. Vega is obviously power starved and needs the additional +50% power target, but that extra power requirement can be offset by simple undervolting.
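As a rough sanity check on why an undervolt like that offsets a higher power target: dynamic power scales roughly with V²·f, so dropping DPM7 from 1200 mV to 1120 mV at the same clock should trim that state's dynamic draw by about 13%. This is a first-order back-of-the-envelope estimate that ignores leakage and board power, not a measured result:

```python
def dynamic_power_ratio(v_new_mv, v_old_mv, f_new_mhz=1.0, f_old_mhz=1.0):
    """Relative dynamic power after a voltage/frequency change.

    Uses the first-order CMOS model P ~ C * V^2 * f; capacitance C
    cancels out of the ratio. Leakage current is ignored.
    """
    return (v_new_mv / v_old_mv) ** 2 * (f_new_mhz / f_old_mhz)

# DPM7 undervolt from 1200 mV to 1120 mV at an unchanged clock
ratio = dynamic_power_ratio(1120, 1200)
print(f"~{(1 - ratio) * 100:.0f}% less dynamic power")  # ~13% less
```

The same model also explains why an overclock that needs extra voltage costs so much: raising both V and f multiplies the V² term by the frequency term.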

My current loop consists of 1x 360 mm rad and 1x 240 mm rad, a Swiftech MCP355, Thermaltake Riing 120 mm fans, an Eisblock XPX waterblock on my Ryzen 1600X (at 4.0 GHz), and the EK Vega waterblock on my Sapphire Vega 64. The motherboard is an ASRock Taichi; I am using the excellent onboard sound and have the Rosewill Cullinan tempered-glass case.


----------

