
AMD Radeon RX Vega 64 8 GB

I made a chart to see in which games the power-saving mode makes sense and where it doesn't.

rxvega64-pwr-savings-vs-standard666.png


@W1zzard: I hope it's okay that I used your data.
 
I have to agree with you as well. Now look, I suspect Vega will be receiving several large performance increases from drivers/game optimizations over the course of the next year.

Having said that, right now Vega is not at all impressive for gaming from a technology demonstration perspective. It's good for the price given how ridiculous the market is right now, and for professional uses it's somewhat disruptive. However this is not some big stepping stone for Radeon gaming, not even CLOSE. Big Polaris would have probably beaten or met this while being cheaper to produce (And launching a year ago lol).

The joys of reality eh?

I wish it was more too: https://www.techpowerup.com/forums/...a-gpu-architecture.229266/page-7#post-3647165
 
And there we have a winner! Don't worry, the drivers will come and fix this product…
This product is already three months late; most driver optimizations are already baked in. These promises of future driver improvements never pan out.


Vega looks pretty bad considering it has a node shrink and so many more transistors and higher clock than Fiji.


Vega is a compute chip that can also play some games. If anything it's great mining hardware.

AMD is known for vaporware. Sure, it has some cool features, but those don't cut it when an overclocked 980 Ti beats it.
 
But look how much more efficient RX 56 is compared to 64. With the right clocks/shader count, some nifty binning, and more aggressive power saving, I don't see why it wouldn't make its way into high-end laptops.
By "high end" you mean large, heavy, noisy and usable only plugged in, right?

Check the latest ASUS Zephyrus with a GTX 1080. A powerful gaming notebook of 2017 is quite a bit smaller than a mid-range business notebook from 5 years ago. This is progress.
Of course you can make a huge notebook with a 350W GPU, but why would you want that?

Considering these are first release drivers, things can only get better from here on.

Or worse. We might soon learn that they have segmentation faults or something like that. And each fix will steal a bit of the performance.

But most likely nothing will change.
As people have already pointed out: AMD delayed this release for so long that the drivers should be pretty polished by now. In fact, the performance we see in reviews is better than in some of the early leaks. Obviously, a lot of work on the drivers has already been done, so the remaining potential should be way smaller than you hope.
 
If they had these cards 1-2 years prior, that would have been a hell of a game changer and really pushed competition. The Vega 56 is mostly on par with a 1070, but draws more power. I could really push the OC on my 980 Ti and not be far behind the Vega 56 (even the Fury X isn't far behind), and my card is already 2 years old. It looks like AMD is almost 2 years behind Nvidia.

It's good to see AMD keeping in there, but unless they have something up their sleeves in the next year or so, Nvidia might really put the hurt on them. Navi is rumored to be out 2019, but will that be fast enough to keep up with Nvidia? They don't have anything to answer the 1080Ti right now. Volta most likely won't be around till mid 2018 based on rumors that production of consumer cards won't start until the end of this year (if we're lucky), but if all this holds true, Nvidia will have that 6+ month jump on AMD with the next generation. I wonder when AMD will truly catch up to Nvidia, right now that seems like a long way off.
Yes, it does sound a bit like they are using old technology, which is quite depressing when it's actually their latest and greatest. It looks to me like they're actually overclocking Vega significantly to reduce the framerate deficit against NVIDIA which is at least partly why the card runs so hot and loud. That coil whine is an indicator of just how much juice it's sucking down to do it, hence the 100W higher figures than NVIDIA. It's the usual lack of competition like this that lets NVIDIA take its time in releasing the consumer variant of Volta, milking the maximum amount of profit from Pascal.

I just don't get why they persist with this HBM when it clearly does nothing for them. Why not design around a GDDR5X or the next gen and have a cheaper to make card? At least they can compete better on price then.

I remember how ATI lagged NVIDIA at the time AMD bought them out 11 years ago, but incredibly, it hasn't done anything for them even with all those extra resources available to them. How depressing.
 
So it's basically the same price/performance as the 1080 while having more noise/power consumption?

We waited a year for them to basically offer nothing new or exciting to the market? (Solid work, AMD!!!)
 
Winter is coming and AMD is selling space heaters.
 
i hope i had permission to use your data
Always, for this kind of analysis that goes beyond just copying what I wrote. GJ.
 
Vega is a compute chip that can also play some games. If anything it's great mining hardware.
So, we are still making up excuses?
No, Vega10 is the gaming chip. The compute chip is known as Vega20 and is coming next year.
 
Well, I guess it fell exactly where it was hinted at. It seems as though the only real winner is Vega 56, as the higher-priced variant is not worth it, at least in my book (especially the WC variant). Oh well... guess I will be sticking with Nvidia for a while; think I'll just bite the bullet and get that 165 Hz G-Sync monitor I have had my eyes on for months on end.
 
Made me a little proud, W1zzard, thanks :)
Now I made the same chart for the standard BIOS at power-saving ... huge gains, and even less understanding for the 292 W at "balanced".
rxvega64-pwr-savings-vs-standard100.png


You can count on five fingers where the extra power goes with overclocking,
if you only lose 5% performance when slashing 27% of the energy.
What happens if you use the standard BIOS at power-saving with undervolting? Maybe the performance difference drops to 0-2%, because the average clocks go back up.

I honestly think that's the way Vega is good and efficient,
the way we Germans with our 25-28 ct/kWh should use it ;)
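To put rough numbers on that trade-off, here is a small back-of-the-envelope sketch. The 292 W "balanced" figure, the ~27% power cut, the ~5% performance loss, and the ~27 ct/kWh price are taken from this thread; the 4 hours/day gaming load is purely my own illustrative assumption:

```python
# Back-of-the-envelope math for Vega 64's power-saving mode.
# From this thread: 292 W at "balanced", ~27% less power and ~5% less
# performance in power-saving mode, ~0.27 EUR/kWh in Germany.
# The 4 hours/day gaming load is an illustrative assumption.

balanced_w = 292.0
saving_w = balanced_w * (1 - 0.27)   # ~213 W in power-saving mode
perf_balanced = 1.00                 # relative performance, balanced
perf_saving = 0.95                   # ~5% slower in power-saving mode

# Performance per watt improves by roughly 30%.
eff_gain = (perf_saving / saving_w) / (perf_balanced / balanced_w) - 1

# Yearly electricity saved at 4 h of gaming per day.
hours_per_year = 4 * 365
kwh_saved = (balanced_w - saving_w) / 1000 * hours_per_year
euros_saved = kwh_saved * 0.27

print(f"perf/W gain: {eff_gain:.1%}")                 # ~30% better perf/W
print(f"saved: {kwh_saved:.0f} kWh, ~{euros_saved:.0f} EUR per year")
```

So with this thread's numbers, power-saving mode is roughly 30% more efficient per frame, and at German electricity prices the savings are around 30 EUR a year for a heavy gamer; the exact figure obviously depends on how many hours the card actually runs at full load.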
 
That has nothing to do with it. Read the review without your personal bias. The frame pacing, pricing, and power consumption when Chill and other features are turned on make it the *decent* card that it is.

It's not a huge win, it's more like the special Olympics winner that showed up to run at the regular Olympics.... juice box required.

AMD will probably fix half the issues with games and improve performance by 5%, while miners buy the card and they profit some, because HBM2 and the die aren't cheap.
I don't know; after two years, in which time AMD effectively dismissed the high end last year because apparently that's not where the money is, AMD now returns to the high end to give me... a slightly worse GTX 1080? Call that biased, but I just don't see this as an 86% card.
 
neat leather jackets Jensen always wears with every Titan
Jensen's increasingly fancy leather jackets, which he sweats in under the lights, and his whole pivot into the auto industry are only the early symptoms of a midlife crisis. I expect it to culminate with him driving a motorcycle directly onto the stage for the Volta launch.
 
When will we get OC results?

W1zzard said this:

Overclocking simply does not work on AMD's press driver. No matter what setting was chosen, the actual frequencies did not change. Apparently nobody tested overclocking before declaring the driver ready to give to the press.

Two days ago, AMD provided an updated driver for overclocking testing only, which claims to address this, but it came in too late, when I had already left for my summer vacation.

Which seems so blatantly AMD these days (I was a Ryzen early adopter; I speak from experience). The company seems disjointed with these things, which is really worrying when this is their biggest graphics release in two years and they still can't ship working drivers for overclocking. Guru3D had the same issues as TPU, but where they could overclock, they stated that the card downclocked very quickly due to a possible power issue. It's very possible the liquid cards are heavily binned to provide the higher clocks. It could be Fury X all over again on the OC front, which might mean an HBM speed issue.

Either way, unlike Fury X and 980ti, this time round the AMD part is way off the lead.

untitled441.png



Vega is 30% away from 1080ti (stock).

Fury X was 10% max away from 980ti (stock).

Going back the 290X was only 5-10% away from the 780ti.

What happened?
 
The HD 7970 (Q1/2012) was 6% slower than the GTX 680 (Q1/2012):
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_680/27.html

The HD 6970 (Q4/2010) was 13% slower than the GTX 580 (Q4/2010):
https://www.techpowerup.com/reviews/HIS/Radeon_HD_6970/29.html

The last win for AMD was the HD 5870 (Q3/2009); the much older GTX 285 (Q4/2008) was 17% slower when the HD 5870 was released.
That was only possible because NVIDIA and TSMC had production problems with the 400 series.
https://www.techpowerup.com/reviews/ATI/Radeon_HD_5870/30.html

And in Q1/2010 it was beaten by 10% by the GTX 480, which made a 28% generational jump from the GTX 285, which was over a year old at that time.
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/32.html

(The percentages are from the "all resolutions" graphs.)
 
Like I posted before, better wait for some driver optimizations etc.

Though I'm fine with my overclocked GTX 970 SLI, Vega 56 still seems pretty interesting. I'm not in a hurry to upgrade my graphics (and yeah, the CPU comes first), so I can at least wait for a few driver updates and other things first.

Power consumption hasn't ever been a problem for me. And let's see what those custom models will be like. :)
 
The HD 7970 (Q1/2012) was 6% slower than the GTX 680 (Q1/2012):
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_680/27.html

The HD 6970 (Q4/2010) was 13% slower than the GTX 580 (Q4/2010):
https://www.techpowerup.com/reviews/HIS/Radeon_HD_6970/29.html

The last win for AMD was the HD 5870 (Q3/2009), which was 17% faster than the much older GTX 285 (Q4/2008), only possible because NVIDIA and TSMC had production problems with the 400 series.
https://www.techpowerup.com/reviews/ATI/Radeon_HD_5870/30.html

And in Q1/2010 it was beaten by 10% by the GTX 480, which made a 28% generational jump from the GTX 285, which was over a year old at that time.
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/32.html


LOL, okay buddy. You are really playing fast and loose with the truth in those summaries of the past. In fact, can you even read your own links?

https://tpucdn.com/reviews/ATI/Radeon_HD_5870/images/perfrel_1920.gif

The 5870 is 24% faster than the GTX 285, and it still traded blows with the newer GTX 480, which used over double the energy and cost $100 more. I don't have time to pick apart the rest of your BS, but that one's the funniest point you made, imo.
 
AMD approved review score to actual score formula

Score = (Official Score - 6) * 2.5

:peace:
 
Vega is 30% away from 1080ti (stock).

Fury X was 10% max away from 980ti (stock).

Going back the 290X was only 5-10% away from the 780ti.

What happened?
AMD stagnated since they gave up actually developing something new, while Nvidia keeps pushing forward.
Imagine a shrunk Fiji bumped about 300 MHz, then it should become obvious how little Vega really improves.
 
AMD stagnated since they gave up actually developing something new, while Nvidia keeps pushing forward.
Imagine a shrunk Fiji bumped about 300 MHz, then it should become obvious how little Vega really improves.
On top of that, HardOCP seems to have found a weakness in Vega already: MSAA/SSAA performance. I'm taking that with a pinch of salt for now, because they only have three games that show it (and none that shows otherwise), while TPU doesn't state which test uses which level of AA. But I'm definitely keeping an eye out for more reviews.
 