
AMD Radeon RX Vega 64 8 GB

My Fury X can breathe a sigh of relief, as I see no reason to upgrade from it, except for the 8 GB of memory.
 
Hype, hype, and even more hype, but nothing interesting when it comes to actual performance, as always with AMD GPUs.
 
And before folk say it's the game developers' fault for not implementing RTG's stuff, that's a wee bitty hypocritical, because the same people argue Nvidia is bad for working with devs. So again, for the company that has the Xbox and PS4 under its wing, why does their top-range graphics card not do better when the consoles code for their chips?

It's not that straightforward. Most people on the AMD side argue that developers are not using DX12/Vulkan, not some specific software from AMD. If you use these APIs to make games run faster, they'll run faster universally, not just on AMD. Whereas Nvidia did some clearly discriminatory things in the past when they worked with developers, like it or not. AMD and Nvidia both work with developers all the time; the end result isn't what you would desire every single time, unfortunately.
Just because AMD makes hardware for consoles doesn't mean that will translate into brilliant performance on their desktop products; it's a different environment after all.

So once again, it's not that straightforward.

You aren't new here. Are you surprised?

Not really, which is why I said "just as expected". Well, it's not as bad as on some other forums, honestly.
 
Unfortunately AMD did not include the idle-fan stop feature with their card, which puzzles me, since it has become a basic feature on nearly all boards on the market today, except for NVIDIA reference designs, which fail here, too.

That's because it probably can't be done on blower coolers. On open-air coolers, the heatsinks are always relatively big and exposed to ambient air. On blowers, the heatsink is much smaller and enclosed in a "pipe", which is necessary for radial fans but terrible for natural convection. So it would probably hit temperatures that are too high, even at idle.
 
Arguably the Vega 56 is a good card for the price, though.
They're not bad, in the same way a 980 Ti is not a bad card and a 2500K is not a bad CPU. Even today.
So AMD is finally making a high-end GPU (performance-wise).

It's just that this is not the product that was promised. Not a card we would expect in 2017 (looking at what NV can do).

In many ways this looks like a brute-force answer to more innovative competitors - a bit like what Intel did with Skylake-X to counter Zen (possibly clocking it way higher than they originally planned).
It's just that the performance/power-draw relation is even worse than in the CPU battle. And of course AMD had literally years to prepare Vega, and it was meant to be a revolution. Now it looks like an interim solution, and we're already being fed Navi hype.
As some have already noted: a dual Polaris could be just as effective and most likely cheaper.
Or from another point of view: think how quick Pascal would be if NVIDIA calibrated it for Vega's power consumption...

And a couple of important notes:
1) Just how bad will the Vega APU be? Is there any sense in making it? Will a ~30W Vega be noticeably faster than a ~30W Polaris?
2) Will we see a mobile RX Vega?

Vega is for the datacenters not for gaming rigs.
I don't know where you got this idea from.
This is not a computation card, it will not be used in datacenters. Not happening. Ever.
AMD is targeting RX lineup at gamers. They have other cards designed for professionals.
 
So basically I guessed performance would be similar to two RX 470s in CrossFire, when CrossFire actually works. End result: Vega performs pretty much somewhere between RX 470 and 480 CrossFire performance on average. Go figure.
 
Just because AMD makes hardware for consoles doesn't mean that will translate into brilliant performance on their desktop products

We need to go back in time because I'm pretty sure (not you!) a lot of AMD people were spouting the demise of Nvidia because AMD got the consoles. It was as good as over, you know, AMD chips in Xbox and PS4, it'll be DX12 everywhere and Nvidia will be finished.

Guess what? Didn't happen and even then, Nvidia can do DX12 and Vulkan just fine.

Too many people forget the last launch... and the one before that. What's really sad is that I left Nvidia to go to Radeon (HD 5850s) because the Fermi power issue was hilarious. I mean, Fermi, this compute monster rolling in with CUDA and being so hot they couldn't make 512 cores work - had to stop at 480 (or something). It was a grill. Now it's insane that power is not an issue anymore because it's AMD who can't control it properly. The historical hypocrisy is sublime.

I'll say it over and over. It's a gaming card, and RTG needs to work on streamlining it, not making it more and more complex.
 
The power consumption is beyond ridiculous. I'm not putting that crap into my PC.


Also, it's funny how in Guru3D's review the card consistently beats the 1080 by a noticeable amount and is praised like it was something amazing. I wonder...
 
The power consumption is beyond ridiculous. I'm not putting that crap into my PC.


Also, it's funny how in Guru3D's review the card consistently beats the 1080 by a noticeable amount and is praised like it was something amazing. I wonder...
I wonder why you're only thinking about the power consumption side of these GPUs. Why not think about Sync monitors, why not think about the Radeon Packs? Such a biased "review" from you.

Hype, hype, and even more hype, but nothing interesting when it comes to actual performance, as always with AMD GPUs.

Hehe, yeah sure. Pretty little liar.

Is that power consumption really accurate? Wow

You don't want to live in a country where electricity is expensive. Shocking numbers.

Yeah sure, this is the time when power consumption is important. What about Intel's X CPUs? :)
 
I wonder why you're only thinking about the power consumption side of these GPUs. Why not think about Sync monitors, why not think about the Radeon Packs? Such a biased "review" from you.
Review? Stop taking drugs. This is a discussion forum.
 
Too many people forget the last launch... and the one before that. What's really sad is that I left Nvidia to go to Radeon (HD 5850s) because the Fermi power issue was hilarious. I mean, Fermi, this compute monster rolling in with CUDA and being so hot they couldn't make 512 cores work - had to stop at 480 (or something). It was a grill. Now it's insane that power is not an issue anymore because it's AMD who can't control it properly. The historical hypocrisy is sublime.

I don't think anyone disagrees that the power consumption is huge, and if they do, they're not hypocritical, they're just ignorant. But is it an actual issue? Remember that Fermi wasn't just simply a power hog; it would blow up if you messed with the voltages too much, and it would run at 90-100 °C, on the brink of death, with the stock cooler. Vega, as far as I can see, doesn't blow up, because nowadays power consumption is very strictly monitored/controlled, unlike back then. It also runs at acceptable temperatures. High power consumption for me isn't that important; I only consider it an issue when it has catastrophic implications like it did with Fermi.

We need to go back in time because I'm pretty sure (not you!) a lot of AMD people were spouting the demise of Nvidia because AMD got the consoles. It was as good as over, you know, AMD chips in Xbox and PS4, it'll be DX12 everywhere and Nvidia will be finished.

Guess what? Didn't happen and even then, Nvidia can do DX12 and Vulkan just fine.

Like I said in one of my previous posts, DX12 is apparently not that great to work with, and Vulkan sees very little adoption. These APIs could have given AMD an edge, but they didn't, because the PC gaming development platform is a different beast; they don't really have the same control here.

I'll say it over and over. It's a gaming card, and RTG needs to work on streamlining it, not making it more and more complex.

It was over for AMD from the moment Vega missed the proper release window to fight off Pascal. It was obvious that if it was going to get dropped at an awkward moment between the releases of Pascal and Volta, it would fail to achieve much on the gaming side of things. But unfortunately, Vega as an architecture is not for gaming. Just because they slapped an RX badge on it doesn't mean it's an actual gaming card under the hood. It's not; it's for compute, a clear competitor to Tesla, not GeForce, forced into different clothes.
 
Oh boy, have I seen some true cesspools. Trust me, things are still adequate on here.
 
High power consumption for me isn't that important

Which is cool (not literally), but it does matter to the OEMs wanting to use small form factors, and for mobile etc. We have laptops now with GTX 1080s in them. Not going to happen with Vega. AMD is also fading from the APU scene (the last release wasn't even Zen-based), and that was their super niche. It's hard to see Vega being used effectively on a mobile platform.

Also, the next talk from the web is Navi incorporating AI hardware onto the chip (which is what Volta has already done). But the reason many folk (jumping on a single source, mind you) say Volta isn't the next Nvidia consumer chip is that it's the AI that makes Volta, well, Volta. And games won't need such heavy AI lifting for quite a while. It's that 'unnecessary' shove-it-all-on-the-chip mindset that is hindering RTG. Hell, the next RTG chip may as well incorporate a climate-modelling architecture, just in case. It's no longer moar cores but moar unincorporated techy stuff.
 
So overall, a ~20% performance increase from Fury X to Vega 64 in meltdown mode. Meh.

Feels like a major failure to me. 290X to Fury X was a better improvement performance-wise.

Also, where is that magical shader discard or whatever stuff? Wasn't that what loads of people were claiming was going to boost Vega by a HUGE percentage? And the gaming performance is pretty much on par with Vega FE. So much for "not a gaming card".

Calling @RejZoR, your card has arrived. Since mining performance is bad, the price should be OK.

After intense consideration and 3 hours of fiddling with stores, prices and shit, I've decided to pull the trigger on an AORUS GTX 1080 Ti. Reason? The RX Vega 64 Liquid costs the same as this beefed-up air-cooled one (and I've specifically picked this one because it actually has tons of thermal pads underneath the backplate, and the VRM is connected to the actual heatsink, not some crappy little alu heatsink screwed on top of the VRM - proper air cooling).
The RX Vega 56 is actually by far the best deal, but it won't cost 400€ in Europe for sure, and quite frankly, I'm sick and tired of endlessly waiting for aftermarket-cooled ones to arrive god knows when in autumn 2017. The reference one is crap, just like all blower-style cards, so I'm certainly not buying that either.

Everyone can call me an AMD fanboy for being optimistic about RX Vega, but in the end, I'm not stupid. For someone who wasn't waiting endlessly and just happens to be changing stuff now, I guess it's still viable, but for me, this is it.
 
1950x + Vega = nuclear level
 
Which is cool (not literally), but it does matter to the OEMs wanting to use small form factors, and for mobile etc. We have laptops now with GTX 1080s in them. Not going to happen with Vega. AMD is also fading from the APU scene (the last release wasn't even Zen-based), and that was their super niche. It's hard to see Vega being used effectively on a mobile platform.

But look how much more efficient the RX 56 is compared to the 64. With the right clocks/shader count, some nifty binning, and more aggressive power saving, I don't see why it wouldn't make its way into high-end laptops.

As regards OEMs, meh, I don't know. The market for high-end desktop GPUs is already a small portion of the whole thing. The cards that go into OEM systems are an even smaller chunk. In the end, I don't think AMD should worry too much about it.

It's that 'unnecessary' shove-it-all-on-the-chip mindset that is hindering RTG. Hell, the next RTG chip may as well incorporate a climate-modelling architecture, just in case. It's no longer moar cores but moar unincorporated techy stuff.

It is unnecessary, but they can't deal with it in any other way. They simply can't afford different designs like Nvidia does.

Volta really does look like Pascal with extra compute features. If Nvidia wants to come up with something soon, I suspect it'll be another Kepler-gen2-type product. Slightly larger dies with more shaders, maybe slightly higher clocks at a reduced cost, and that's about it. Nothing outstanding.

Unless AMD pushes hard on Navi, things will slow down big time on the GPU side of things.
 
After intense consideration and 3 hours of fiddling with stores, prices and shit, I've decided to pull the trigger on an AORUS GTX 1080 Ti. Reason? The RX Vega 64 Liquid costs the same as this beefed-up air-cooled one (and I've specifically picked this one because it actually has tons of thermal pads underneath the backplate, and the VRM is connected to the actual heatsink, not some crappy little alu heatsink screwed on top of the VRM - proper air cooling).
The RX Vega 56 is actually by far the best deal, but it won't cost 400€ in Europe for sure, and quite frankly, I'm sick and tired of endlessly waiting for aftermarket-cooled ones to arrive god knows when in autumn 2017. The reference one is crap, just like all blower-style cards, so I'm certainly not buying that either.

Everyone can call me an AMD fanboy for being optimistic about RX Vega, but in the end, I'm not stupid. For someone who wasn't waiting endlessly and just happens to be changing stuff now, I guess it's still viable, but for me, this is it.

It'll be just your luck if they release Volta in 1 month :nutkick:

But going on the Vega 64's performance, I think we can all wait till summer/autumn 2018 for the next high-end Nvidia GPU.
 
We already know there won't be any Volta any time soon.
 
The problem with AMD's GPU design is that it's a 4-lane highway where the max speed is 75 mph. On Nvidia's part, that same highway is back to 2 lanes with a max of 150 mph. This gives Nvidia an advantage in loads such as gaming. On the other hand, you can see the effect of AMD's approach in, for example, GPU encoding, mining, and that sort of stuff that really requires brute-force compute power. So the review you are having here proves that AMD might be the better card for general workloads and particular games, but it is Nvidia that takes the upper hand with this 2-lane/150 mph approach. AMD needs a lot of power to even achieve comparable speed to Nvidia. Meh. I'll still take it, too. I'm sure driver updates will fix many games, and the power issues can be addressed. Remember, AMD doesn't spend much time actually fine-tuning their GPUs on the power front.
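The highway analogy above can be sketched as a toy model (the lane/speed numbers are the analogy's own, not real GPU specs): both designs can move the same total traffic, but a serial, latency-bound task only ever sees one lane's speed.

```python
# Toy model of the wide-and-slow vs narrow-and-fast analogy.
# Illustrative numbers only; not actual hardware figures.

def total_throughput(lanes: int, speed: int) -> int:
    """Aggregate traffic moved per unit time: width times speed."""
    return lanes * speed

amd = total_throughput(lanes=4, speed=75)      # wide and slow
nvidia = total_throughput(lanes=2, speed=150)  # narrow and fast

# Same aggregate capacity either way...
assert amd == nvidia == 300

# ...but a single serial task (one car) only benefits from lane
# speed, which is why gaming-style loads favor the fast lanes,
# while mining/encoding can saturate every lane at once.
single_task_amd, single_task_nvidia = 75, 150
assert single_task_nvidia > single_task_amd
```

The point of the sketch: equal brute-force capacity does not imply equal performance on loads that cannot fill all the lanes.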
 
[Attached images, including an AMD Vega Tomb Raider HBCC demo slide]
 
After intense consideration and 3 hours of fiddling with stores, prices and shit, I've decided to pull the trigger on an AORUS GTX 1080 Ti. Reason? The RX Vega 64 Liquid costs the same as this beefed-up air-cooled one (and I've specifically picked this one because it actually has tons of thermal pads underneath the backplate, and the VRM is connected to the actual heatsink, not some crappy little alu heatsink screwed on top of the VRM - proper air cooling).
The RX Vega 56 is actually by far the best deal, but it won't cost 400€ in Europe for sure, and quite frankly, I'm sick and tired of endlessly waiting for aftermarket-cooled ones to arrive god knows when in autumn 2017. The reference one is crap, just like all blower-style cards, so I'm certainly not buying that either.

Everyone can call me an AMD fanboy for being optimistic about RX Vega, but in the end, I'm not stupid. For someone who wasn't waiting endlessly and just happens to be changing stuff now, I guess it's still viable, but for me, this is it.


Nah, you are no fanboy; you are a hypeboy.
 