# AMD Radeon RX Vega 56 8 GB



## W1zzard (Aug 14, 2017)

Radeon RX Vega 56 is the second AMD Vega card launched today. It comes in at an affordable $399 price point, with a slightly reduced shader count that actually improves things greatly: power efficiency now trades blows with some Pascal cards, which means less heat and noise, too.

*Show full review*


----------



## Frogger (Aug 14, 2017)

If I were to buy, it would be this one... will wait for OC results... but I think I'll stay Green for this round...


----------



## B-Real (Aug 14, 2017)

Thanks for the reviews!
The Vega 56 beats the 1070 even with the cooler reference models (at higher noise), despite the +70W power consumption. Paired with a FreeSync monitor, it is the better choice. The Vega 64 also nearly reaches the 1080. Of course the 1080 Ti remains unchallenged, but most consumers don't buy in that price range.


----------



## TheinsanegamerN (Aug 14, 2017)

So temps look good on the stock driver, but the cooler is also much louder. Speaks well for third-party designs.

What does not speak well is the price. Why does this have a $50 higher MSRP than a 1070 when they are nearly identical speed-wise?


----------



## medi01 (Aug 14, 2017)

Vega 56 looks good, especially with NVIDIA's adaptive-sync tax in mind.
Now we need good AIBs at reasonable prices.

Thank god Gibbo's "leak" about the mining performance of Vegas was BS.


----------



## Nihilus (Aug 14, 2017)

Not bad. Slightly better than a 1070 at 2K and right between a 1070 and a 1080 at 4K. Civ 6 definitely needs driver support, as Vega loses to a Fury X in that title.


----------



## Imsochobo (Aug 14, 2017)

TheinsanegamerN said:


> So temps look good on the stock driver, but the cooler is also much louder. Speaks well for third-party designs.
> 
> What does not speak well is the price. Why does this have a $50 higher MSRP than a 1070 when they are nearly identical speed-wise?



I've seen some prices in Europe so far.
The 56 is at 1080+ prices.
The 64 is at 1080 Ti prices....


I need Vega! I need something not NVIDIA, for Linux reasons..
Why is it so hard not to buy NVIDIA?

Do I have to go through yet another 3 years of agonizing workarounds with NVIDIA cards to do what I want, with the performance I need, or can I actually buy an AMD card?

hope this changes, cause this is hilarious...


----------



## ppn (Aug 14, 2017)

A 1070 costs $450-500 now. Six months ago it used to be $350 with a free $60 game included.

By now I was expecting to see the $400 Pascal refresh, equivalent to a 1080.


----------



## TheinsanegamerN (Aug 14, 2017)

Imsochobo said:


> I've seen some prices in Europe so far.
> The 56 is at 1080+ prices.
> The 64 is at 1080 Ti prices....
> 
> ...


Funny you should say that. I've been pounding my head against a wall trying to get AMD's drivers to work, because they still don't support the 4.10 kernel in Ubuntu.


----------



## B-Real (Aug 14, 2017)

TheinsanegamerN said:


> So temps look good on the stock driver, but the cooler is also much louder. Speaks well for third-party designs.
> 
> What does not speak well is the price. Why does this have a $50 higher MSRP than a 1070 when they are nearly identical speed-wise?



1. The GTX 1070 starts at $440 atm.
2. Vega 56 starts from $400 hypothetically, though I don't see preorders yet.
3. Think of the price difference between G-Sync and FreeSync monitors.


----------



## red_stapler (Aug 14, 2017)

Is this finally the year when I replace my 7950?


----------



## TheinsanegamerN (Aug 14, 2017)

B-Real said:


> 1. The GTX 1070 starts at $440 atm.
> 2. Vega 56 starts from $400 hypothetically, though I don't see preorders yet.
> 3. Think of the price difference between G-Sync and FreeSync monitors.


1. My point was that since the 1070 is selling far over MSRP, Vega will too, and Vega's higher MSRP will result in an even higher final price.
2. So if Vega is affected the same way Pascal is, then Vega 56 will be bare minimum $500-$550. Not good compared to a $450 1070.
3. This point is valid, but I don't buy monitors based on GPU. My monitor still has an easy 8-9 years left in it, so I don't really care about FreeSync vs G-Sync. Most people don't replace a monitor until it dies.


----------



## Easo (Aug 14, 2017)

It miiiight look like it will replace my 290X... Maybe.


----------



## B-Real (Aug 14, 2017)

TheinsanegamerN said:


> 1. My point was that since the 1070 is selling far over MSRP, Vega will too, and Vega's higher MSRP will result in an even higher final price.
> 2. So if Vega is affected the same way Pascal is, then Vega 56 will be bare minimum $500-$550. Not good compared to a $450 1070.
> 3. This point is valid, but I don't buy monitors based on GPU. My monitor still has an easy 8-9 years left in it, so I don't really care about FreeSync vs G-Sync. Most people don't replace a monitor until it dies.


Well, if that's the case and you can only find Vega 56 AIBs around $500, the only reason to go that way is really the FreeSync monitors, of course. Let's hope that will not be the case.


----------



## Imsochobo (Aug 14, 2017)

TheinsanegamerN said:


> Funny you should say that. I've been pounding my head against a wall trying to get AMD's drivers to work, because they still don't support the 4.10 kernel in Ubuntu.



You use the closed drivers?
Why not the open-source ones, which work awesomely?

I have a 280X I put in sometimes and it runs perfectly on the 4.11 kernel; 4.15 is when Vega works with open source out of the box.

FYI, the open-source drivers are better at everything except OpenCL/Vulkan at the moment.


----------



## m0nt3 (Aug 14, 2017)

TheinsanegamerN said:


> Funny you should say that. I've been pounding my head against a wall trying to get AMD's drivers to work, because they still don't support the 4.10 kernel in Ubuntu.



Why are you trying to use the proprietary driver? The open-source driver is faster in everything except Vulkan.


----------



## TheinsanegamerN (Aug 14, 2017)

Imsochobo said:


> You use the closed drivers?
> Why not the open-source ones, which work awesomely?
> 
> I have a 280X I put in sometimes and it runs perfectly on the 4.11 kernel; 4.15 is when Vega works with open source out of the box.
> ...


The open-source driver throws errors in Source games when loading textures.
The open-source driver has a tendency to lock up if I have a video playing in one window and a normal website in another.
The open-source driver has a tendency to lock up if I am playing any background video while in a game.
The open-source driver has rendering errors in Medieval II.
The open-source driver has rendering errors in Minecraft.
The open-source drivers perform slow as balls in Borderlands 2 and Civ V.

I could go on. Open source is most certainly not "better at everything" and is far from "works awesomely". Even a cursory glance at Phoronix would show the open-source drivers are still hit or miss, and still have plenty of bugs to be fixed.


----------



## m0nt3 (Aug 14, 2017)

TheinsanegamerN said:


> The open-source driver throws errors in Source games when loading textures.
> The open-source driver has a tendency to lock up if I have a video playing in one window and a normal website in another.
> The open-source driver has a tendency to lock up if I am playing any background video while in a game.
> The open-source driver has rendering errors in Medieval II.
> ...



What video card do you have? I know some 390s have issues. My 480s have not exhibited any of those issues, and Borderlands 2 greatly benefits from mesa_glthread=true in the Steam launch options for the game; I stay around 70-100+ FPS with this option. I know the driver still needs some progress, especially with certain cards; sorry it has not been a good experience for you.
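For anyone who wants to try that launch option, a minimal sketch of how it's usually set (`mesa_glthread` is a Mesa environment variable, and `%command%` is Steam's placeholder for the game's own command line; the terminal example and game binary name are illustrative only):

```shell
# In Steam: right-click the game -> Properties -> Set Launch Options
mesa_glthread=true %command%

# Or when launching an OpenGL game from a terminal (illustrative binary name):
mesa_glthread=true ./borderlands2.bin
```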


----------



## Imsochobo (Aug 14, 2017)

TheinsanegamerN said:


> The open-source driver throws errors in Source games when loading textures.
> The open-source driver has a tendency to lock up if I have a video playing in one window and a normal website in another.
> The open-source driver has a tendency to lock up if I am playing any background video while in a game.
> The open-source driver has rendering errors in Medieval II.
> ...



Try the latest and greatest?
It may be a bit technical, but with the latest they have made absolutely huge improvements, which have helped me a lot, to the point where I don't have any major issues anymore.
And which combo of drivers do you run?
RADV, AMDGPU-PRO, RadeonSI?

The 280X shows no issues for me so far, apart from no HDMI audio.
I haven't tried the closed-source drivers since 2015.

As for the GTX 970, I have issues with all video playback - laggy.
Resizing windows in Unity is not an option; it's not possible as long as one HW-accelerated window is open somewhere, even minimized.
Issues with tearing in video playback (fixed with a script).
Rendering issues with the desktop.
Issues with the fan ramping up and locking at 100%.
Issues with black screen on bootup (no video).
Issues with the installer (I need to put in the 280X to install, for instance, Ubuntu 16.04, as the black screen of death randomly occurs without proprietary drivers).
Issues in games when selecting 4K: I get super low performance (seems to be GTX 970-specific, as the 980, 780, and 1060 don't do this).

Windows doesn't do anything "wrong" with the GTX 970, so it's definitely drivers.


----------



## fizhsmile (Aug 14, 2017)

Hmmm, this is gonna be my next upgrade. Just upgraded to a FreeSync 4K monitor and my RX 480 is sweating like hell XD


----------



## dj-electric (Aug 14, 2017)

Let this set the tone for whatever is happening with AMD right now:


----------



## Imsochobo (Aug 14, 2017)

Dj-ElectriC said:


> Let this set the tone for whatever is happening with AMD right now:




Crap, just crap.
However, their "direction" is the only one they "can" take.
R&D must develop for 2019 tech, and I am sure that in 2019 they will have a really good product with Navi.
NVIDIA has a lot of R&D on HBM and all the things Vega has, because it's the future, but at the moment it isn't required/doesn't do shit.
NVIDIA also has good development on the known and tried methods with GDDR5X and so on, which we can see is working out really well.

On top of that, having a weak R&D budget doesn't help....
There are good reads on why they didn't make a "big" Polaris.
It isn't because the arch doesn't scale; it's because AMD cannot afford GDDR5X memory-controller development alongside the HBM development they clearly need for consoles, laptops, and next-gen computing.

I don't think any AMD fanboy can defend this arch and say it's good for 2017, but I can say why it was done. Not that it makes the product "better", cause the 1070 is a superior product.


----------



## Prima.Vera (Aug 14, 2017)

AMD, make this card at least $50 cheaper if you want sales, come on!


----------



## Tatty_One (Aug 14, 2017)

You possibly won't get one at anywhere near RRP launch prices in any case...... take a look at the mining performance. Looks quite good to me; if they do reach shelves at anywhere near RRP then they may well be a decent bet, as 1070s seem to be heavily overinflated over here at the moment.


----------



## dj-electric (Aug 14, 2017)

Tatty_One said:


> Looks quite good to me


Looks awful next to the RX 580. No miner would want that with this power consumption. Am I missing something?


----------



## dyonoctis (Aug 14, 2017)

Tatty_One said:


> You possibly won't get one at anywhere near RRP launch prices in any case...... take a look at the mining performance. Looks quite good to me; if they do reach shelves at anywhere near RRP then they may well be a decent bet, as 1070s seem to be heavily overinflated over here at the moment.


Taking the power consumption into account, getting two GTX 1060s seems better for miners. However, I'm seeing a lot of miners who are looking forward to a BIOS mod to get the fabulous hash rate that was leaked by OcUK. Some are disappointed that the 100 MH/s prophecy isn't currently true, but they have strong faith in the optimization that will happen in the coming months. Some are saying that they will cancel their pre-orders while waiting for improvements, so it looks like gamers might actually be able to get some at a normal price, until the modders and developers find a solution.


----------



## P4-630 (Aug 14, 2017)

Still no regrets that I bought a _150_ Watt TDP GTX1070 at day 1....


----------



## B-Real (Aug 14, 2017)

Dj-ElectriC said:


> Looks awful next to the RX 580. No miner would want that with this power consumption. Am I missing something?



It consumes about 30W more power than an RX 580. The hash rate (or whatever it is called) is about 300% better than the RX cards, as I remember from the leaked mining performance.


----------



## dyonoctis (Aug 14, 2017)

B-Real said:


> It consumes about 30W more power than an RX 580. The hash rate (or whatever it is called) is about 300% better than the RX cards, as I remember from the leaked mining performance.


So far nobody could confirm the 75-100 MH/s that was leaked. Every review shows the card getting 33-36 MH/s. The miners are currently disappointed (especially by Vega 64) and are waiting for upcoming BIOS mods and software optimizations that would make the card more interesting. Some people are even starting to say that *at the very best the card could* pull 60 MH/s.


----------



## v12dock (Aug 14, 2017)

W1zzard said:
			
		

> Two days ago, AMD provided an updated driver for overclocking testing only, which claims to address this, but it came in too late, when I had already left for my summer vacation.



Where did you go for vacation?


----------



## AlienIsGOD (Aug 14, 2017)

I would like to get this next summer; my RX 480 is barely a year old and does everything I need atm. I'm likely looking to build a new rig and hand down my current one to my kids.


----------



## XiGMAKiD (Aug 14, 2017)

So the performance is okay and the power consumption is also okay; it's just too late and too expensive.


----------



## W1zzard (Aug 14, 2017)

v12dock said:


> Where did you go for vacation?


Sylt, an island here in Germany


----------



## dozenfury (Aug 14, 2017)

Other early overclock reviews with new drivers show quite limited headroom on Vega. It's what I kind of feared and expected based on the power requirements, and AMD's recent history of putting out cards without much headroom. Even the water-cooled version is only about 8% faster in reviews. Mining performance of 40 MH/s, overclocked as far as they can go, is pretty underwhelming too. Newer drivers may help, but I'm extremely skeptical of any magic doubling of performance from a driver. The hardware is what it is.

What I'd really like to see, and I'm sure we'll see very soon, are benches with a 1070 and/or 1080 (which have far more OC headroom) overclocked vs. a Vega 56 or 64 overclocked. Most enthusiasts aren't running their cards stock, so that would be much more of an apples-to-apples comparison. I have a 1070 that overclocks well and runs neck and neck with 1080 stock benchmarks. Granted, overclocking is always a bit of luck on the bin, but once a few unbiased overclocked-vs-overclocked reviews are in, we'll start to get a clear picture of what most people can realistically expect. Even without those, it's a fair bet from the charts that a stock Vega 56 or 64 is slightly faster than a stock 1070, but an overclocked Vega 56 or 64 is likely a bit slower than an overclocked 1070. An overclocked 1080 would be well ahead of an overclocked Vega 64. And the 1080 Ti, even at stock, is in another class from any Vega.

All in all, I'm not really complaining though. As a consumer, having AMD back at a competitive level with NVIDIA in the $400-$500 range of gamer cards is a great thing. I could see plenty of people going red just by way of personal preference at those levels. But if you're spending around $700, for sure the 1080 Ti is the way to go.


----------



## Rahmat Sofyan (Aug 14, 2017)

I hope this is just another step toward a much better performance-per-watt ratio with Navi...

Vega is good but not great; it came too late, after Pascal..

Come on AMD, make Radeon great again ... 

#5870


----------



## Lionheart (Aug 14, 2017)

Seems like the only Vega card worth getting, TBH; I'm definitely considering it. God, I hope coil whine isn't an issue with all of these cards. I have a feeling AMD is holding back the mining performance via a driver so they can get these cards into gamers' hands first; they know they need market share, and just selling to miners won't help with that. I'd be super lucky to pick one up for $399.


----------



## trog100 (Aug 14, 2017)

good enough to tempt those with a positive leaning towards the red team, but that's about all.. but then again, that is all that can be reasonably expected..

the 56 version looks to be the one to go for.. the 64 version is just a desperate attempt to get near 1080 performance, with no overclocking headroom left..

trog


----------



## gr33nbits (Aug 14, 2017)

This is very good news for all; great to see AMD on the GPU high-end side too. 2017 seems to be the start of something big for AMD, and I hope they keep it up, cause I'm loving my Ryzen. My RX 400 series needs an update to keep up with the CPU, but maybe an RX 580 8 GB is enough for my IPS FreeSync 1080p monitor.
Go AMD go.


----------



## Eric3988 (Aug 14, 2017)

I must say, compared to the 64, I am impressed. The profit margins are probably razor thin for RTG, but imagine if the MSRPs for both cards were $50 less. I think the cards would be no-brainers vs. their NVIDIA counterparts. Still, the asking price for the 56 seems worth it, not so much for the 64. I will be picking one up if possible for a reasonable price.


----------



## Chaitanya (Aug 14, 2017)

Looking at those mining tests, it looks like these GPUs are going to get snatched up by miners.


----------



## Durvelle27 (Aug 14, 2017)

When will we get OC results?


----------



## Tatty_One (Aug 14, 2017)

Dj-ElectriC said:


> Looks awful next to the RX 580. No miner would want that with this power consumption. Am i missing something?


Apart from the fact that in some countries there are no 580s to get? Then you have 29W more peak for around a 27% improved rate (that's a guesstimate based on the review; too lazy to actually do the math).
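Doing that math roughly, using the thread's ballpark figures (33-36 MH/s for Vega 56 from the reviews; the RX 580's ~28 MH/s rate and both wattages are illustrative assumptions, not numbers from this review):

```python
# Rough MH/s-per-watt comparison using the thread's ballpark figures.
# The RX 580 hash rate and both board-power numbers are assumptions.
def mh_per_watt(rate_mh, watts):
    return rate_mh / watts

rx580_rate, rx580_watts = 28, 196          # assumed baseline
vega56_rate, vega56_watts = 36, 196 + 29   # +29 W peak, per the post

print(f"RX 580:  {mh_per_watt(rx580_rate, rx580_watts):.3f} MH/s per W")
print(f"Vega 56: {mh_per_watt(vega56_rate, vega56_watts):.3f} MH/s per W")
print(f"Hash-rate gain: {vega56_rate / rx580_rate - 1:.0%}")
```

Under these assumptions Vega 56 actually comes out slightly ahead on efficiency, since the roughly 29% hash-rate gain outruns the roughly 15% power increase.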


----------



## gamerman (Aug 14, 2017)

Well, I think it's clear that Vega 56 is a fail/average GPU and does not earn the Editor's Choice award, that's for sure.

Vega is AMD's newest, brand-new 14nm GPU release (say 9/2017), the latest so-called high-end GPU AMD has released since 6/2015.
Meaning AMD had over 2 years to plan, build, and test Vega against the GTX 1000 series. So what did we get?

1st, these days the must-check point is efficiency, and on that issue all the Vega versions are very lousy; the big brother is also terrible, the biggest should I say banned.
It's the same as when you buy a new TV, washing machine, or even freezer: you check how many energy stars it gets, correct?
I won't go into details (it's all in the TPU review), but for example in Blu-ray playback Vega 56 eats a lot of power, 3 times more than a GTX 1070!
And even if you do nothing with your Vega 56, it eats almost 3 times more power than a GTX 1070 at idle! Huh!

For gaming, the power draw is so high that I can hardly believe it. Vega 56 is built on the latest 14nm technology and AMD built it over a long, long time, yet it's again terrible compared to the GTX 1070.
Example, average gaming:
Vega 56 = 229W!!
GTX 1070 = 145W (for comparison, GTX 1080 = 166W and even GTX 1080 Ti = 231W!!)
So Vega 56 and the GTX 1080 Ti have almost the same power draw, but the GTX 1080 Ti is over 40% faster!!

Once again, remember the GTX 1070 is a 2-year-old GPU on the older 16nm process, so this is the final reason Vega 56 must be dropped from the Editor's Choice award!!

OK, then gaming...
Yes, with that power draw Vega 56 should win every game battle easily against the GTX 1070, but no; it's an even battle in many games, losing some but winning some.
I realize the old GTX 1070 puts up a good result: it loses 16 games and wins 5, several games lost by under 5 FPS, a few by over 10.
Because Vega 56 has such high power draw, the GTX 1070 is the clear winner overall, even though on raw gaming performance the real winner is Vega 56.

Because, as we all know, to play games smoothly we need about 60 FPS, or at least near 50 FPS.
Both Vega 56 and GTX 1070 can manage that in all games easily, both DX11 and DX12, running 60 FPS or over at 2560x1440, so gaming speed is not the deciding point.

The winner of this battle comes down to different issues. Anyway, here are the numbers:

FPS/dollar = Vega 56: 100, GTX 1070: 105 = winner GTX 1070
FPS/watt = Vega 56: 100, GTX 1070: 148 = clear winner GTX 1070
Total performance = Vega 56: 100, GTX 1070: 94 = winner Vega 56
Price = even; the GTX 1070 will be cheaper, we'll see this after a few weeks, and for sure the GTX 1070 will come down further when NVIDIA's Volta releases.
I also included GPU resale value; I guess the most-wanted card to sell at this moment (a lot, btw) is the GTX 1070.

Last but not least, I read a few bad things about Vega 56:
- Higher power draw than comparable NVIDIA cards
- Noisy
- Coil noise at high FPS
- Fan does not stop in idle

The 1st I'm already blaming enough, but those other things, fan noise and coil noise, are at least for me and my friend a big negative.
Both together are terrible.
Notice: btw, I got rid of my old NVIDIA GTX 480 for that coil noise long ago; it's a shocking noise. The guilty part is a capacitor.

Then, OC for Vega 56.... is it possible? Maybe, maybe not; it might be hardware-locked for safety reasons.
I think Vega 56 and Vega 64 are already max-OC'd GPUs with their 'uber' option.
And if you can OC it a little more, the power hunger rises sky high, so maybe AMD put a stop to it... I think we'll know later, like the TPU guys say.
In summary, hmm, I think the TechPowerUp testers gave Vega 56 the Editor's Choice award out of 'good will'; it surely does not deserve it.
The GTX 1070 is the winner and deserves the Editor's Choice award; AMD Vega 56, not. Read its review, and for sure the GTX 1080's too.

Last, I want to say that TechPowerUp is in my top 3 of hardware sites, which I respect a lot!
Excellent reviews, usually 99.9% objective!
Thanks for the test!

My tips:

If you want the best gaming GPU now, before NVIDIA's Volta releases, my tip is:
Zotac GeForce GTX 1080 AMP! Extreme+, 8GB GDDR5X, DVI, HDMI, 3x DisplayPort (ZT-P10800I-10P)

It beats all the AMD Vega GPUs.
It's whisper silent.
It's really fast.
It has great efficiency.
It comes with a good OC out of the box; no need for any stupid program to get it.
No coil noise; the fan idles at 0 dB for temps under 60°C.
Good warranty, and it holds excellent resale value for a long time.

Thanks. I wish you all nice times!


----------



## dj-electric (Aug 14, 2017)

Tatty_One said:


> Apart from the fact that in some countries there are no 580's to get?



When demand is high, the RX 580's problems will pale compared to this. I've heard too many miners say "get me the entire stock when this comes in".


----------



## EntropyZ (Aug 14, 2017)

The Fallout 4 benchmark is appreciated. It's like the worst-case scenario, when the game isn't optimized to take advantage of GCN-based cards. I really want a card that can run these poorly-running titles and keep FreeSync compatibility, but I think I might be staying away from graphics cards for a long time; miners already bought out the whole reference card stock through pre-orders. I don't think other countries will fare any better.

AIB cards only come in September, and even those aren't safe. I think AMD's strategy to keep miners from getting the cards has failed. Also, the cards just came way too late, at the worst possible time during the mining craze. Top kek.


----------



## Jism (Aug 14, 2017)

AMD's biggest problem is R&D. So it has to settle with what it has and basically design a chip that suits both the professional and the gaming market. So they go for another extension of the GCN arch and favor brute force rather than NVIDIA's method.

In parallel workloads, AMD GPUs deliver. That is a fact. Look at video encoding for example, or the recent mining benchmark: it kicks out all the competition and leaves it far behind, even when OC'ed. It's a beauty of a chip, packed with new technology. However, when it comes down to games, parallelism is not the best way, so AMD has to settle for 'brute force' on gaming workloads, causing this power-usage difference that you see between NVIDIA and AMD right now.

You can compare it with this: NVIDIA uses 2 highways to transport goods from one place to another, at a speed of 150 MPH. You have 2 lanes that move small trucks pretty fast from one end to the other. AMD on the other hand offers 4 lanes at a speed of 75 MPH. When a few small trucks pass, they go at half speed. When there's a shitload of trucks all at the same time, that is where the 4 lanes at 75 MPH come into play.

As long as AMD is holding on to the GCN arch, they will always have this problem in the high-end market. Their arch shines, however, on for example the RX 480 or RX 580, which offer the best price/performance ratio in the 150W subrange. It's a hell of a good card. GCN is just not made yet for the high-end market / 1080 Ti territory. But that doesn't make it a bad card. I am working on buying a HEDT platform and I might as well go for a Vega while I'm at it. I don't play hardcore games; I don't need 170 FPS on a 75Hz WQHD screen anyway. You can tweak the card, undervolt it, and use the software to minimize its power usage, similar to a 9570, which could be undervolted as well to shave a rough 60W off at the wall.
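The highway analogy above can be sketched as a toy throughput model (lane counts and speeds are from the analogy, not real GPU numbers; note that with these exact figures both highways tie once every lane is busy, so the difference shows up at light load):

```python
# Toy model of the highway analogy: a few fast lanes vs. many slow lanes.
# Numbers come from the analogy in the post, not from real hardware.
def throughput(lanes, speed_mph, trucks):
    # Each truck occupies one lane; spare lanes sit idle.
    return min(lanes, trucks) * speed_mph

two_fast = dict(lanes=2, speed_mph=150)    # the "NVIDIA" highway
four_slow = dict(lanes=4, speed_mph=75)    # the "AMD" highway

for trucks in (1, 2, 4, 8):
    print(f"{trucks} trucks: "
          f"2x150 -> {throughput(**two_fast, trucks=trucks)}, "
          f"4x75 -> {throughput(**four_slow, trucks=trucks)}")
```

With a single truck the two fast lanes win (150 vs 75); from 4 trucks up, both highways move the same 300 units, which is roughly the point about lightly threaded games vs. highly parallel workloads.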


----------



## r9 (Aug 14, 2017)

I want to see a Ryzen/Vega review.


----------



## ppn (Aug 14, 2017)

Vega 64 carries 40% more transistors (hence the power consumption, maybe) compared to Fury X, but uses them poorly for gaming. Improved clocks are where all the performance gains come from.

Block time will rise to 30 seconds soon enough, up from 20 seconds currently, and then suddenly a Vega 64 earns less than a 1050 Ti. If all the Vega GPUs join, it will be 50 seconds in no time; good luck, miners.


----------



## Jism (Aug 14, 2017)

ppn said:


> Vega 64 carries 40% more transistors (hence the power consumption, maybe) compared to Fury X, but uses them poorly for gaming. Improved clocks are where all the performance gains come from.



It's not that they don't use them properly; it's a design choice. You can't change that with drivers, only optimize it as much as possible for gaming workloads.


----------



## Tatty_One (Aug 14, 2017)

Dj-ElectriC said:


> When demand is high, the RX 580's problems will pale compared to this. I've heard too many miners say "get me the entire stock when this comes in".


And 29W for a 27-28% improved rate? At the end of the day, plenty will buy them if available, if only because many of the already-established mining cards are sold out or in short supply.


----------



## mastershake575 (Aug 14, 2017)

So it's basically the same price/performance as the 1070 while having more noise/power consumption?

We waited a year for them to basically offer nothing new/exciting to the market? (solid work AMD!!!)


----------



## 5DVX0130 (Aug 14, 2017)

Tatty_One said:


> And 29W for a 27-28% improved rate? At the end of the day, plenty will buy them if available, if only because many of the already-established mining cards are sold out or in short supply.


Plenty are already buying them. Well, at least they are trying to.
Not that easy, though. Most sites are out of stock/sold out, and prices are already on their way to the Moon.

Amazon: Best Sellers in Computer Graphics Cards
#1 XFX Radeon Rx Vega 64


----------



## Shatun_Bear (Aug 14, 2017)

mastershake575 said:


> *So it's basically the same price/performance as the 1070* while having more noise/power consumption?
> 
> We waited a year for them to basically offer nothing new/exciting to the market? (solid work AMD!!!)



I bet you hoped it would be 'basically the same' performance as a 1070, but in reality it's clearly faster on day-1 drivers, especially at 1440p and 4K, where it leaves the 1070 in the dust. I think you missed the performance summary.

Anyway, a custom 56 is my next card.


----------



## GoldenX (Aug 14, 2017)

Now we know Zen APUs are going to have good IGP performance.


----------



## Shatun_Bear (Aug 15, 2017)

GoldenX said:


> Now we know Zen APUs are going to have good IGP performance.



Yes, it should have been expected that Vega, operating at much lower frequencies to fit inside an APU, would be vastly more power-efficient than these desktop cards. APUs with Vega + Ryzen cores are going to be amazing.


----------



## Footman (Aug 15, 2017)

I'm torn, TBH. I have a fantastic 27in IPS 2560x1440 FreeSync monitor and an RX 480 trying to keep up. I have a custom loop, so noise and heat are not much of an issue, especially as EK have already released a waterblock for the stock 56 and 64.... I'd rather buy the vanilla 64, but it's not likely that I will find it at $499, and the high power requirements are pushing me to the vanilla 56 now. With water I should be able to get a stable boosted clock that brings me close to the stock 64. As the 56 won't be available to buy for a couple of weeks, I guess I can wait. Perhaps by this time some of the AIBs will be talking about their custom 64 designs... According to Gamers Nexus, the VRM design of the Vega cards is top notch for vanilla stock cards; however, it looks like the 56 is limited to 300W total power delivery, which will limit its overclocking ability compared to the 64....


----------



## Th3pwn3r (Aug 15, 2017)

So far, across everything I've seen: Vega 56 = MAYBE, Vega 64 = DEFINITELY NOT.

This is my current opinion; there will very likely be performance increases, but I just feel like we'll be polishing a turd (Vega 64). Unless some miracles happen, Vega 64 is crap, like a lot of the NVIDIA fanboys were shouting. Vega 56 definitely looks to be the Vega to get if I had to choose between the two; however, Vega 56 would be my backup card, or for my 5-year-old daughter's computer, if I don't opt for a 1070 or 1080.


----------



## ViperXTR (Aug 15, 2017)

The better option vs. Vega 64; finally something that challenges my over-a-year-old GTX 1070 :V, and I can only think that it will get better along the way. With the mining craze, I was tempted to sell my 1070 and get a Vega 64 (I have a FreeSync monitor) or a GTX 1080, but I also realized that I barely play games these days, so it might not be worth it. (Also, the savings I had getting a FreeSync monitor would be nullified by that power draw; I'd have to replace my ancient PSU if I am to get a Vega GPU.)


----------



## Th3pwn3r (Aug 15, 2017)

ViperXTR said:


> The better option vs. Vega 64; finally something that challenges my over-a-year-old GTX 1070 :V, and I can only think that it will get better along the way. With the mining craze, I was tempted to sell my 1070 and get a Vega 64 (I have a FreeSync monitor) or a GTX 1080, but I also realized that I barely play games these days, so it might not be worth it. (Also, the savings I had getting a FreeSync monitor would be nullified by that power draw; I'd have to replace my ancient PSU if I am to get a Vega GPU.)



At least you can get a more efficient psu as well.


----------



## ViperXTR (Aug 15, 2017)

Th3pwn3r said:


> At least you can get a more efficient psu as well.


Not enough reason now, though; my gaming schedule has been reduced and my 1070 is barely getting taxed these days.


----------



## mnemo_05 (Aug 15, 2017)

Maybe a bit OT, but where can I find a GTX 1070 that sells for $350, as mentioned in the table on the first page?

Going back to the topic though, nice to see AMD getting back in the game. These cards may not push GPU technology forward with their performance, but they may force NVIDIA to lower their prices a little. I myself am in the market for either a 1080 or a 1080 Ti =)

Saddens me to see my 980 Ti getting old faster than I anticipated.


----------



## mastershake575 (Aug 15, 2017)

Shatun_Bear said:


> I bet you hoped it would be 'basically the same' performance as a 1070 but it in reality it's clearly faster on day 1 drivers, especially at 1440p and 4k, where it leaves the 1070 in the dust. I think you missed the performance summary.


Most reviews have the Vega 56 as 4 to 5% faster at 1440p. Lmao at "leaving it in the dust". I guarantee you that when it's all said and done, these cards will be "basically the same performance", just like they are now (especially since the third-party 1070s are monster overclockers).

AMD was 14 months late to the party and offered absolutely nothing new or exciting to this segment (hell, it won't even force the smallest price drop). You can't come this late and be this mediocre (at least Ryzen was a success).


----------



## Shatun_Bear (Aug 15, 2017)

mastershake575 said:


> Most reviews have the Vega 56 as 4 to 5% faster at 1440p. Lmao at "leaving it in the dust". *I guarantee you that when it's all said and done, these cards will be "basically the same performance"*, just like they are now (especially since the third-party 1070s are monster overclockers).
> 
> AMD was 14 months late to the party and offered absolutely nothing new or exciting to this segment (hell, it won't even force the smallest price drop). You can't come this late and be this mediocre (at least Ryzen was a success).



I was talking about 4K. So let's break this down for everyone: you think a 6% performance deficit at 1440p and a 9% performance deficit at 4K is 'basically the same performance'?

Even more baffling, what does 'when it's all said and done' mean?? Are you expecting a magic driver from Nvidia to increase performance by 10%, when historically the improvements have come from the other side? This site at launch showed the RX 480 9% slower than a 1060, but the gap is now around 4%.

*'Basically the same'*


----------



## Tatty_One (Aug 15, 2017)

mnemo_05 said:


> maybe a bit OT, but *where can i find a gtx 1070 that sells for $350* as mentioned on the table on the first page?
> 
> going back to the topic though, nice to see amd getting back to the game. these cards may not push the gpu technology forward as for their performance, but it may force nvidia to lower their price a little. i myself is in the market for either a 1080 or a 1080ti =)
> 
> saddens me to see my 980ti getting old faster that i anticipated


Probably the same place you can get one of these for RRP. Otherwise, Newegg has a mini 1070 for $346 or a full-sized one for $361.


----------



## sutyi (Aug 15, 2017)

mastershake575 said:


> So its basically the same price/performance as the 1070 while having more noise/power consumption ?
> 
> We waited a year for them to basically offer nothing new/exciting to the market ? (solid work AMD !!!)



Mind you, a lot of features are still dormant in the Vega arch until the Q4 (late October / early November) time frame, when the interesting stuff gets enabled in the drivers. How much extra performance that will add is still a question, though.

So this is a hard launch with a feature-incomplete driver package, sadly.


----------



## EarthDog (Aug 15, 2017)

sutyi said:


> Mind you, a lot of features are still dormant in the Vega arch until the Q4 (late October / early November) time frame, when the interesting stuff gets enabled in the drivers. How much extra performance that will add is still a question, though.
> 
> So this is a hard launch with a feature-incomplete driver package, sadly.


oh???? What features are those? Links please!


----------



## Th3pwn3r (Aug 15, 2017)

sutyi said:


> So this is a hard launch with a not feature complete driver package sadly.



Is it? They've had plenty of time to get drivers completed.


----------



## Frick (Aug 15, 2017)

EarthDog said:


> oh???? What features are those? Links please!



It has slots for memristors. Yes, single slots for single memristors.


----------



## sutyi (Aug 15, 2017)

EarthDog said:


> oh???? What features are those? Links please!



Take a gander at the Beyond3D forums mostly; ComputerBase (ze Germans) has also reported on this.

HBCC is in, but disabled by default, so probably not all the kinks are ironed out. You can turn it on, though.
DSBR is supposedly inactive, as are Primitive Shaders.

AMD themselves noted that there is a "Radeon Software Crimson ReLive Redux" package in the works, and that is supposed to arrive sometime in Q4. Being optimistic about it, I hope it isn't early December...


----------



## EarthDog (Aug 15, 2017)

Forums... heh. So just a bunch of people speculating?

You brought it up.. don't make us look for it...


----------



## sutyi (Aug 15, 2017)

EarthDog said:


> Forums... heh. So just a bunch of people speculating?
> 
> You brought it up.. don't make us look for it...



I just noted it. Primitive Shaders are disabled for a fact; the first driver batch that has a Preview version enabled in it is 17.320.

As for the rest, these are rumors and speculation, as you have stated. But you know, rumors being rumors, you don't have to believe it from me or from anyone else. So take it with a grain of salt or just glide over it.

Also, I'm sorry, but I don't have the urge to flip through some 30+ pages on other forums for others' convenience.


----------



## EarthDog (Aug 15, 2017)

You had the urge and took the time to post it somewhere else... it would just be nice to support it.

Part of the issue with primitive shaders is getting the programmers behind it and supporting it. AMD is great at bringing bleeding-edge technology to the table. The problem comes when it's rarely used and there's little buy-in... think Mantle... Vulkan so far... HBM/HBM2... etc.

HBCC is a 4K thing as well, according to PCPer. Not to mention, being as fast as a 1080, is it really a good 4K card??


----------



## sutyi (Aug 15, 2017)

EarthDog said:


> You had the urge and took the time to post it somewhere else... it would just be nice to support it.
> 
> Part of the issue with primitive shaders is getting the programmers behind it and supporting it. AMD is great at bringing bleeding-edge technology to the table. The problem comes when it's rarely used and there's little buy-in... think Mantle... Vulkan so far... HBM/HBM2... etc.



Supposedly (again, you know, with a grain of NaCl) Primitive Shaders will be a driver-level function, so no additional work for the game developers as it currently stands.

Last 3-4 pages here: https://forum.beyond3d.com/threads/...-vega-20-rumors-and-discussion.59649/page-179
Conclusion here: https://www.computerbase.de/2017-08/radeon-rx-vega-64-56-test/8/

Also there was some GamersNexus coverage of this whole DSBR / HBCC / Primitive Shaders business from our Hairdo Overlord, Steve. Can't remember if I had read it or if it was mentioned in the video review, though.

Side note: Mantle was the catalyst (no pun intended) that brought close-to-metal APIs to the PC. Without it there would be no DX12 or Vulkan today.


----------



## Footman (Aug 15, 2017)

I invested in a FreeSync monitor and now need something better than my current RX 580 to power it. I will likely just buy the Vega 56 once the custom designs come to market, or buy the stock version and waterblock it, assuming I can find one at $399 USD.

People are citing the increased power requirements as a major showstopper for not buying Vega, so let's run the numbers. Say the full-load power draw goes up by 85W from your old card to Vega: based on my power costs here in Nevada (12c per kWh), if I game 20 hours a week, I end up paying an additional 20 cents per week, using the calculation *wattage x hours used ÷ 1000 x price per kWh = cost of electricity*.

If I refer to the gaming load results at AnandTech, during BF1 there is a difference of 51W between the RX 580 and the Vega 56, and 142W between the RX 580 and the Vega 64. Using the same calculation, the swap to a Vega 56 will cost an additional 12 cents per week and the swap to a Vega 64 an additional 34 cents per week.

Perhaps my math is wrong? While the talk of increased TDP and power requirements is initially alarming, the additional costs look to be minimal. These increased power requirements won't stop me from buying Vega.

Having said this, the additional cost will hurt miners who are running their computers 24/7 to mine.....
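The back-of-the-envelope formula above is easy to sketch in a few lines of Python (the function name is mine; the 20 hours/week, 12c/kWh rate and the 85W/51W/142W deltas are the numbers from the post):

```python
# Sketch of: wattage x hours used / 1000 x price per kWh = cost of electricity
# Values below are the ones quoted in the post (Nevada rate, 20 h/week gaming).

def weekly_power_cost(delta_watts, hours_per_week=20.0, price_per_kwh=0.12):
    """Extra electricity cost per week for a given full-load wattage increase."""
    kwh = delta_watts * hours_per_week / 1000.0  # extra kWh consumed per week
    return kwh * price_per_kwh                   # dollars per week

for label, watts in [("generic 85 W delta", 85),
                     ("RX 580 -> Vega 56", 51),
                     ("RX 580 -> Vega 64", 142)]:
    print(f"{label}: ${weekly_power_cost(watts):.2f}/week")
# prints roughly $0.20, $0.12 and $0.34 per week, matching the post's figures
```

So the arithmetic in the post checks out; at desktop gaming duty cycles the wattage gap really is pennies per week, and only 24/7 mining changes the picture.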


----------



## Th3pwn3r (Aug 15, 2017)

Shatun_Bear said:


> I was talking about 4k. So let's break this down for everyone - you think a 6% performance deficit at 1440p and 9% performance deficit at 4K is 'basically the same performance'?
> 
> Even more baffling,what does 'when its all said and done' mean?? Are you expecting a magic driver from Nvidia to increase performance by 10% when historically improvements have come from the other side? It was this site that at launch showed the RX 480 9% slower than a 1060 but now the gap is around 4%.
> 
> *'Basically the same'*


No reason to discuss the Vega 56 and 4K; per the benchmarks, the frame rates are too low for a great experience.


----------



## Frick (Aug 15, 2017)

Footman said:


> While the talk of increased TDP and power requirements are initially alarming, _the additional costs look to be minimal_. These increased power requirements won't stop me from buying Vega.



For end users (ie not servers) this has always been true, with a few rare exceptions.


----------



## Th3pwn3r (Aug 15, 2017)

Frick said:


> For end users (ie not servers) this has always been true, with a few rare exceptions.


Well, some of us don't want ovens running 24/7 in our rooms. I remember having my PC under my desk by my legs a long time ago, I learned my lesson. I'm all about efficiency.


----------



## mastershake575 (Aug 15, 2017)

Shatun_Bear said:


> I was talking about 4k.


I do not know a single person who has a 4K monitor. If they did, I doubt they would be running a Vega 56 (a single 1080 Ti is already pushing it). I literally have no idea why 4K is in this discussion.

4-6%, yes, is basically the same performance. "All said and done" was in reference to you talking about driver improvements. I would not be surprised if the driver improvements the 56 gets are negated by the huge overclocking potential of the third-party 1070s (third-party 1070s pretty consistently overclock into stock 1080 territory, and I highly doubt third-party 56s, even with improved drivers, are going to significantly leap stock 1080 performance).

Not really seeing the confusion or your obsession with hyping this mediocre card? Are you the AMD white knight of this forum? (The few AMD threads I've read say yes, you are.)

Reviews are showing these cards very close in performance and price. This card offered nothing new or exciting......... (get over it, this was a disappointing launch). I will say, it was definitely better than the 64, but at the end of the day it's too little, too late.

Slightly better performance at a $50 higher MSRP is disappointing for a card that's 14 months late to the party (it offers no price war and nothing intriguing to an already aging market).


----------



## Jeffredo (Aug 15, 2017)

Makes me happy I bought my GTX 1070 almost a year ago for $365.  The miners can have this one.


----------



## Flak (Aug 15, 2017)

Does the Vega 56 not have the same power profile options as the Vega 64?  Vega 64 and power saver profile seems like a win to me.


----------



## Super XP (Aug 16, 2017)

This is a new GPU generation by AMD, and a great attempt. Some might believe it's based on the previous gen, but not according to what AMD has done. I consider Vega the 1st gen of something brand new from the ground up, which explains the power draw versus the 1080s.

Great work AMD, battling on both CPU & GPU fronts. Let the games begin.


----------



## Super XP (Aug 16, 2017)

Eric3988 said:


> I must say compared to the 64, I am impressed. The profit margins are probably razor thin for RTG, but imagine if the MSRP for both cards were $50 less. I think the cards could be no-brainers vs their NVIDIA counterparts. Still, at the asking price for the 56 seems worth it, not so much for 64. I will be picking up one if possible for a reasonable price.



AMD has more than enough room to drop the prices for its VEGA line up and still make a commanding Profit.


----------



## Th3pwn3r (Aug 16, 2017)

Flak said:


> Does the Vega 56 not have the same power profile options as the Vega 64?  Vega 64 and power saver profile seems like a win to me.


Power saver profile will turn your Vega 64 into a 1060 with more power consumption.


----------



## Flak (Aug 16, 2017)

Th3pwn3r said:


> Power saver profile will turn your Vega 64 into a 1060 with more power consumption.




Odd, did you read the Techpowerup review? Didn't seem slower than a 1060 to me.  Seemed faster than the Vega 56 with lower power consumption.  Maybe I was reading the wrong review.


----------



## Flogger23m (Aug 16, 2017)

B-Real said:


> 1. GTX 1070 starts at 440$ atm.
> 2. Vega56 starts from 400$ hypothetically, though I don't see preorders yet.
> 3. Think of the price difference between G-Sync and Freesync monitors.



I suppose if you're starting from a blank slate that makes sense, but I purchased my GTX 1070 almost a year ago, so it doesn't seem like a good deal. At best we're getting a 20% jump; at worst, a 20% decrease, depending on the game. In most games they seem to be within a frame rate or two. And keep in mind my GTX 1070 runs about 10% faster than stock, while most of these benchmarks use stock GTX 1070s, I'd imagine, so those gains will look even worse. Will Vega OC well? Maybe, but seeing the high power draw as it is, I think it may not have as much headroom as Nvidia's lineup. Assuming it does OC just as well, we're still at best 5% better on average (if that), with a maximum of 20% better in a small number of titles.

To me, that isn't worth upgrading to. If Nvidia released a card with similar performance for $400 I'd skip it as well.


----------



## 1Gpi2ZV6Jy (Aug 16, 2017)

I would never pay this money for any GPU, especially here in Australia, where this card is ~AU$800.

This price tag is not going anywhere.


----------



## Dimi (Aug 16, 2017)

I got my GTX 1070 in September LAST year for $410 and bought a 1440p G-Sync 165Hz monitor last month for $399. I'm extremely happy I didn't jump on the Vega hype. I'm running the card at 2050MHz and it plays every game I have flawlessly.

What I don't get is this whole driver debacle: they had the card on show a full year ago playing Doom. How long do these people need to create a good release driver? People shat on Intel for rushing their X299 platform, but ALL of the recent AMD releases are rushed just as much: Ryzen, Threadripper and now Vega. Don't people ever learn?


----------



## Shatun_Bear (Aug 17, 2017)

Dimi said:


> I got my GTX 1070 in September LAST year for $410 and bought a 1440p G-Sync 165Hz monitor last month for $399. I'm extremely happy I didn't jump on the Vega hype. I'm running the card at 2050MHz and it plays every game I have flawlessly.
> 
> What I don't get is this whole driver debacle: they had the card on show a full year ago playing Doom. How long do these people need to create a good release driver? People shat on Intel for rushing their X299 platform, but ALL of the recent AMD releases are rushed just as much: Ryzen, Threadripper and now Vega. Don't people ever learn?



Please enlighten us to how the Threadripper launch was 'rushed'? And Ryzen's issues were hardly severe.


----------



## RejZoR (Aug 19, 2017)

Vega 64 is "Made in Korea"? Samsung? Vega 56 "Made in Taiwan"? TSMC? Hm, that's interesting...


----------



## SAMiN (Aug 20, 2017)

RejZoR said:


> Vega 64 is "Made in Korea"? Samsung? Vega 56 "Made in Taiwan"? TSMC? Hm, that's interesting...


I think it's the other way around: the Vega 56 is made in Korea and the 64 in Taiwan.


----------



## D3mux (Aug 23, 2017)

At hardwareluxx.de they undervolted the Vega 56 and obtained a stable overclock, reaching GTX 1080 performance at lower power consumption (lower than the 1080...). Is that possible?
This would put the Vega 56 above both the 1070 and the 1080 in terms of performance and perf/W.


----------



## wizardfingers (Aug 26, 2017)

Hopefully I can get one pre-ordered, then I'll sell it off to the miners and get a 1080.


----------



## chief-gunney (Aug 29, 2017)

1Gpi2ZV6Jy said:


> I would never pay this money for any GPU and especially in Australia this card is ~Au$800
> 
> this price tag is not going anywhere



Ordered my Vega 56 today for AU$579; the etailer told me stock arrives on 1st Sept. Also ordered an EK waterblock for it.


----------



## Footman (Nov 21, 2017)

I found a vanilla Vega 64 a few weeks ago at $499 USD and slapped on an EK waterblock. Apart from a little coil whine, the card has been amazing. It runs cool and quiet at around 52°C under full load, and I have it undervolted to 1140mV (down from 1200mV) at full load; it still boosts to just over 1600MHz in game. It provides similar performance to my old GTX 1080. I bought this to run with my FreeSync monitor.


----------



## Imsochobo (Nov 21, 2017)

Footman said:


> I found a vanilla Vega 64 a few weeks ago at $499 USD and slapped on an EK waterblock. Apart from a little coil whine, the card has been amazing. It runs cool and quiet at around 52°C under full load, and I have it undervolted to 1140mV (down from 1200mV) at full load; it still boosts to just over 1600MHz in game. It provides similar performance to my old GTX 1080. I bought this to run with my FreeSync monitor.



I shit on a watercooled 1080 with my EK-block Vega 64 (still on air).
I do not come close to a 1080 Ti, so we've got that going for us.

Under water it's better than everything except the 1080 Ti. Not that horrible, not that great.


----------



## Footman (Nov 21, 2017)

Well, based on my comparison of graphics scores in Unigine and Futuremark benchmarks, my Vega 64 does not shit on my old GTX 1080. While these are synthetic tests, I have not seen any huge differences in frame rates in games between these two cards either.


----------



## Shatun_Bear (Nov 22, 2017)

Footman said:


> Well, based on my *comparison between graphic scores on unigine benchmarks and futuremarks*, my Vega 64 does not shit on my old GTX 1080. While these are synthetic tests, I have not seen any huge differences in frame rates in games with these two cards.



Lol, you are not serious. Using Unigine and Futuremark to compare cards? Come on.

Also, game benches between a 1080 and a Vega 64/56 in the biggest new PC releases show the 64 has a pretty large performance lead over the 1080 at 1440p and 4K.

Look at these benches of Battlefront 2 and Wolfenstein 2:

http://www.guru3d.com/articles_page..._pc_graphics_analysis_benchmark_review,5.html

http://www.guru3d.com/articles_page..._pc_graphics_analysis_benchmark_review,5.html


----------



## Footman (Nov 22, 2017)

I used the GTX 1080 for over 6 months and have had the Vega 64 since release and can say that at 1440 resolution I see no realistic difference in my gaming with either card. A few extra fps in Wolfenstein or Battlefront do not translate to a better gaming experience. I have a Freesync monitor capped at 144fps. I can certainly say that I am enjoying the Vega 64 and Freesync combination though.


----------



## Shatun_Bear (Nov 24, 2017)

Footman said:


> I used the GTX 1080 for over 6 months and have had the Vega 64 since release and can say that at 1440 resolution I see no realistic difference in my gaming with either card. A few extra fps in Wolfenstein or Battlefront do not translate to a better gaming experience. I have a Freesync monitor capped at 144fps. I can certainly say that I am enjoying the Vega 64 and Freesync combination though.



Sure but the difference between cards is often just a few FPS. Look at the 1060 vs 480/580. Same for Fury X vs 980 Ti in some cases.


----------



## Footman (Nov 25, 2017)

No arguments there Shatun. I agree that there are no more than a few fps difference between GTX 1080 and Vega 64...


----------



## RejZoR (Nov 26, 2017)

Shatun_Bear said:


> Sure but the difference between cards is often just a few FPS. Look at the 1060 vs 480/580. Same for Fury X vs 980 Ti in some cases.



Psssssh. Don't ruin the fun of people bragging that it's OVER 15% FASTER!!!!!11111 even though that means only, I don't know, a 5fps difference that is already beyond the 60fps threshold to begin with...

Which is why I really don't like percentages: they skew the perception of differences. 15% sounds like a lot, but if someone says the difference is 5fps, it's like, whatever; almost margin of error if we are realistic.


----------



## trog100 (Nov 26, 2017)

RejZoR said:


> Psssssh. Don't ruin the fun of people bragging it's OVER 15% FASTER!!!!!11111 even though that means only, I don't know, 5fps difference which is already beyond 60fps threshold to begin with...
> 
> Which is why I really don't like percentages, because they literally skew the perception of differences. 15% sounds like a lot, but if someone says the difference is 5fps, it's like whatever, margin of error almost if we are realistic.



15% doesn't sound like a lot to me.. but then again I know what it means.. next to f-ck all.. he he

trog


----------



## Mighty-Lu-Bu (Sep 17, 2018)

Since we have gotten a bunch of new Vega 64 and 56 cards, can you review them please?


----------



## cronus75 (Dec 10, 2018)

I have an ASRock Vega 56, undervolted and not overclocked. It easily gets better results than a 1070 Ti and knocks on the door of a 1080. At about $250, it is a very good choice for 1080p gaming and even for 1440p.


----------



## EarthDog (Dec 10, 2018)

RejZoR said:


> Psssssh. Don't ruin the fun of people bragging it's OVER 15% FASTER!!!!!11111 even though that means only, I don't know, 5fps difference which is already beyond 60fps threshold to begin with...
> 
> Which is why I really don't like percentages, because they literally skew the perception of differences. 15% sounds like a lot, but if someone says the difference is 5fps, it's like whatever, margin of error almost if we are realistic.


And using a raw number alone doesn't tell the story either: 5 fps at 50 fps is different from 5 fps at 100 fps; that's 10% vs 5%. Percent tells the story better... in particular when you also know the actual fps value.
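That point is easy to sketch in a couple of lines (a minimal illustration; the helper name is mine):

```python
# The same absolute fps gap is a different relative gain depending on the
# baseline frame rate, which is why percent plus the raw fps tells the
# whole story and either one alone does not.

def percent_gain(base_fps, delta_fps):
    """Relative speedup, in percent, of adding delta_fps on top of base_fps."""
    return round(delta_fps / base_fps * 100.0, 1)

print(percent_gain(50, 5))   # 5 fps on a 50 fps baseline is a 10.0% gain
print(percent_gain(100, 5))  # the same 5 fps on 100 fps is only a 5.0% gain
```

Conversely, a headline "15% faster" claim can mean anything from a few fps to dozens, depending on the baseline.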


----------



## jabbadap (Dec 10, 2018)

cronus75 said:


> I have an Asrock Vega 56 undervolted and not overclocked. Easily gaining better results than 1070ti and knocking the door of a 1080. At about 250$, it is a very good choice for 1080P gaming even for 1440P.



If you don't also have a GTX 1070 Ti, you can't really know that.


----------



## Super XP (Dec 10, 2018)

Shatun_Bear said:


> Please enlighten us to how the Threadripper launch was 'rushed'? And Ryzen's issues were hardly severe.


Agreed. 
Neither was rushed; both were quite successful. Ryzen's very minor issues were quickly rectified. No CPU or GPU manufacturer releases products without issues; just look at Intel CPU launches, all followed by patches and firmware updates. Quite common, actually.


----------

