Wednesday, June 28th 2017

AMD Radeon Pro Vega Frontier Edition Unboxed, Benchmarked

A lucky customer has already gotten his hands on one of these coveted, sky-blue AMD graphics cards, and is currently in the process of setting up his system. Given the absence of review samples from AMD to any outlet - something the short Vega Frontier Edition supply ensured - there isn't any other real way to get impressions of this graphics card. As such, we'll be borrowing the posts of Disqus user #define as a way to cover live pics and performance measurements of this card. Expect this post to be updated as new developments arise.

After some glamour shots of the card were taken (which really are justified by its unique color scheme), #define commented on the card's build quality. Having installed the driver package (which, as we've covered today, includes both a developer and a gaming path inside the drivers, granting increased performance in either workload depending on the enabled driver profile), he is now about to conduct some testing in SPECViewperf and 3DMark, with both the gaming and non-gaming profiles.
Specs of the system include an Intel Core i7-4790K (apparently at its stock 4 GHz), an ASUS Maximus VII Impact motherboard, 16 GB (2x8 GB) of Corsair Vengeance Pro Black DDR3 running at 2133 MHz, and a 550 W PSU.

Update 1: #define has posted an update with a screenshot of the card's score in 3DMark's Fire Strike graphics test. The user reported that the Pro drivers' score "didn't make sense", which we assume means they are uncooperative with actual gaming workloads. On the Game Mode driver side, though, #define reports GPU frequencies that are "all over the place". This is probably a result of AMD's announced 1382 MHz typical/base clock and up to 1600 MHz peak/boost clock. It is as yet unknown whether these frequencies scale as much with GPU temperature and power constraints as NVIDIA's Pascal architecture does, but the fact that #define is using a small case along with the Frontier Edition's blower-style cooler could mean the graphics card is throttling heavily. That would also go some way towards explaining the actual 3DMark score of AMD's latest (non-gaming-geared, I must stress) graphics card: a 17,313-point score isn't especially convincing. Other test runs resulted in scores of 21,202; 21,421; and 22,986. However, do keep in mind these are the launch drivers we're talking about, on a graphics card that isn't officially meant for gaming (at least, not in the sense we are all used to). It is also unclear whether there are some configuration hoops that #define failed to jump through.
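For context on those clock figures, here's the back-of-the-envelope FP32 math (a quick sketch in Python; the 4096-stream-processor count is Vega 10's known shader configuration, not something taken from #define's posts):

```python
# Theoretical FP32 throughput at the two rated clocks.
# 4096 stream processors is Vega 10's known shader count (an assumption
# here; it isn't reported in #define's posts).
STREAM_PROCESSORS = 4096
OPS_PER_SP_PER_CLOCK = 2  # one fused multiply-add = 2 FLOPs

for label, mhz in [("typical/base", 1382), ("peak/boost", 1600)]:
    tflops = STREAM_PROCESSORS * OPS_PER_SP_PER_CLOCK * mhz * 1e6 / 1e12
    print(f"{label}: {tflops:.1f} TFLOP/s")

# typical/base: 11.3 TFLOP/s
# peak/boost: 13.1 TFLOP/s
```

In other words, whether the card hovers near its base clock or its boost clock is the difference between roughly 11.3 and 13.1 TFLOP/s of theoretical throughput, which is why the frequency instability matters for these scores.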

Update 2: After fiddling around with Wattman settings, #define managed to run some more benchmarks. Operating frequency should be more stable now, but alas, there still isn't much information regarding frequency stability or the amount of throttling, if any. He reports he had to set Wattman's Power Limit to +30%, however; #define also fiddled with the last three power states in a bid to decrease frequency variability on the card, setting all of them to the 1602 MHz that AMD rated as the peak/boost frequency. Temperature limits were set to their maximum value.
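To put that Power Limit change in perspective, here's what it works out to in watts (a sketch assuming the air-cooled Frontier Edition's 300 W typical board power, a spec #define did not state):

```python
# What a "+30%" Wattman Power Limit means in watts, assuming the
# air-cooled Vega Frontier Edition's 300 W typical board power
# (our assumption; not stated in #define's posts).
BOARD_POWER_W = 300
POWER_LIMIT_OFFSET = 0.30  # Wattman slider at +30%

print(f"Effective power budget: {BOARD_POWER_W * (1 + POWER_LIMIT_OFFSET):.0f} W")
# Effective power budget: 390 W
```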
The latest results put this non-gaming Vega card in the same ballpark as a GTX 1080:
For those in the comments going on about the Vega Frontier Edition's professional performance, I believe the following results will come as a shock. #define tested the card in SPECViewperf with the Pro drivers enabled, and the results... well, speak for themselves.

#define posted some SPECViewperf 12.1 results from NVIDIA's Quadro P5000 and P6000 on Xeon machines, below (in the source as well):
And then proceeded to test the Vega Frontier Edition, which gave us the following results:

So, this is a Frontier Edition Vega, which is neither a professional nor a consumer video card, straddling the line in a prosumer posture of sorts. And as you know, being a jack of all trades usually means you can't be a master of any of them. So let's look at the matchup: here we have a prosumer video card which costs $999, battling a $2,000 P5000 graphics card. Some of its losses are deep, but it still ekes out some wins. Now for the value proposition: averaging results between the Vega Frontier Edition (1014.56 total points) and the Quadro P5000 (1192.23 points), we see the Vega card delivering around 85% of the P5000's performance... for 50% of its price. So if you go with NVIDIA's Quadro P5000, you're trading a roughly 100% increase in purchase cost for an 18% performance increase. You tell me if it's worth it. Comparisons to the P6000 are even more lopsided (though that's usual considering the increase in pricing). The P6000 averages 1338.49 points versus Vega's 1014.56, a performance increase of around 32%, and it comes with a price tag of... wait for it... $4,800, which means that a roughly 32% increase in performance will cost you a 380% increase in dollars.
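Here's that arithmetic laid out (a quick sketch; it uses nothing beyond the SPECViewperf averages and list prices quoted above):

```python
# Performance-per-dollar comparison using the SPECViewperf 12.1 averages
# and the list prices quoted in the article.
cards = {
    "Vega FE":      (1014.56,  999),
    "Quadro P5000": (1192.23, 2000),
    "Quadro P6000": (1338.49, 4800),
}
vega_score, vega_price = cards["Vega FE"]
for name, (score, price) in cards.items():
    print(f"{name}: {score / vega_score:.2f}x the performance at "
          f"{price / vega_price:.1f}x the price "
          f"({score / price:.2f} points per dollar)")

# Vega FE:      1.00x the performance at 1.0x the price (1.02 points per dollar)
# Quadro P5000: 1.18x the performance at 2.0x the price (0.60 points per dollar)
# Quadro P6000: 1.32x the performance at 4.8x the price (0.28 points per dollar)
```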

Update 3:

Next up, #define did some quick testing of the Vega Frontier Edition's actual gaming chops, with the gaming fork of the drivers enabled, in The Witcher 3 (refer to the system specs posted above). He ran the game at 1080p, Über mode with HairWorks off. At those settings, the Vega Frontier Edition was posting around 115 frames per second in open fields, and around 100 FPS in city environments. Setting the last three power states to 1602 MHz seems to have stabilized clock speeds.

Update 4:

#define has now run 3DMark's Time Spy benchmark, which uses a DX12 render path. Even though frequency stability has improved on the Vega Frontier Edition due to the change to the last three power states, the frequency still varies somewhat, though we can't tell how much due to the way data is presented in Wattman. That said, the Frontier Edition Vega manages to achieve 7,126 points in the graphics section of the Time Spy benchmark. This is in the ballpark of a stock GTX 1080, though it still scores a tad lower than most.
Sources: #define @ Disqus, Spec.org, Spec.org

200 Comments on AMD Radeon Pro Vega Frontier Edition Unboxed, Benchmarked

#126
Steevo
GC_PaNzerFIN: After 10 years of similar fan and cooling designs, you expect THIS time it changes everything about how radial coolers have worked? Having reviewed and/or owned probably in excess of 100 graphics cards, very many with radial fan designs, I do find your ignorance of reality a bit insulting.
I too have owned, worked on, installed, and heard many radial blower design cards. Nvidia has managed to keep many of them quiet, and between the fan blade design, shroud, fins, tolerances, and much else, two seemingly identical coolers can be significantly different in noise, and in what pitch or timbre of noise is produced. Thanks for calling me ignorant.
#127
GC_PaNzerFIN
Steevo: I too have owned, worked on, installed, and heard many radial blower design cards. Nvidia has managed to keep many of them quiet, and between the fan blade design, shroud, fins, tolerances, and much else, two seemingly identical coolers can be significantly different in noise, and in what pitch or timbre of noise is produced. Thanks for calling me ignorant.
How many of your NVIDIA cards have had a fan running at over 4000 RPM?

If you want to check the latest similar fan design from AMD, here is TPU's RX 480 review.
Now add 100 W more heat to be transferred off an even beefier fin array.

#128
EarthDog
Hugh Mungus: Quite a few more large games have DX12 options. There are thousands of games released every year, so good luck getting through all of them, but there definitely are more games with DX12 (options) than BF1 and Forza. DX12 is mainstream, Vulkan is niche, and yet there are roughly as many games of each according to Wikipedia. The Vulkan list may be right, but the DX12 list sure as hell isn't. Just about every new game tested seems to have optional DX12 nowadays and barely any of them made it onto the list.

Many of the games without an API stated can run in DX12 mode, it seems.
yawn.. links...
#129
rtwjunkie
PC Gaming Enthusiast
GC_PaNzerFIN: How many of your NVIDIA cards have had a fan running at over 4000 RPM?

If you want to check the latest similar fan design from AMD, here is TPU's RX 480 review.
Now add 100 W more heat to be transferred off an even beefier fin array.

I've always taken W1zzard's noise chart with a grain of salt, and you should too. According to that chart, the 780 I used to have should have been very audible, but I never even heard it, even at full load.
#130
GC_PaNzerFIN
rtwjunkie: I've always taken W1zzard's noise chart with a grain of salt, and you should too. According to that chart, the 780 I used to have should have been very audible, but I never even heard it, even at full load.
No need to trust W1zzard; YouTube has other RX 480 fan speed noise tests.


5000 RPM, 2600 RPM, etc.

The RX 480 doesn't need to push fan speed that far, but Vega FE seems to.
#131
Steevo
GC_PaNzerFIN: How many of your NVIDIA cards have had a fan running at over 4000 RPM?

If you want to check the latest similar fan design from AMD, here is TPU's RX 480 review.
Now add 100 W more heat to be transferred off an even beefier fin array.



It's a longer fin array judging from the pictures, but the fan itself is what creates the noise for the most part, and it looks different from the RX 480's stock fan in small ways. I'm by no means saying it's silent, but the guy with one said clearly that it's not noisy, and he has a water-cooled CPU, so I'm guessing if it were noisy he would know the difference.
rtwjunkie: I've always taken W1zzard's noise chart with a grain of salt, and you should too. According to that chart, the 780 I used to have should have been very audible, but I never even heard it, even at full load.
5870s were quiet for their day, but I hated the way mine sounded before the water block went on.
#132
I No
Aenra: You're not forced to do anything, ever; that's one :)
On top of that, you have zero, absolutely no reason to even think that the gaming-oriented version will cost more than Nvidia's; I don't know where you saw that 'extra', but it goes with what I was saying before. We need to approach this more.. reasonably. Feels like you've added some hyperbole to the equation, then used it to conclude you're right to be disappointed.
Now in terms of the hype.. why did you allow any hype to influence you in the first place? Are we not in the 21st century? Who is it that has yet to learn how marketing is done or, even better, how much better it is to let the customers do the marketing for the company? You've only yourself to blame if you fell for that, and that's the truth.

As to the latter part I quote above; NO. That's you thinking like that; skewed perceptions, a market gone haywire, because that's how they prefer you to think. Do all of you here drive Porsches? Live in $1,000,000 mansions? No. If you have a second toddler and are in need of a new house, what are you gonna do? Buy the mansion or go under the river?
Who is it that said we must want the 'bestest'?
And by the way, because I'm 99% sure I know the reply to this last bit.. if, let's say, it's 5% slower than the 1080 Ti.. is that 'worstest'? 'Fail'? How exactly could you measure anything when all you go by is 'bestest'? Again, rather.. skewed criteria there.

*The same goes for the price-performance ratio.. who told you that's what everyone wants? Some have money to burn and do so just because, some purchase due to an emotional drive, others due to past impressions/familiarity, etc. etc.. this "everyone" is just dead wrong. Keep it limited to what -you- want. And ask why it is you allowed yourself to form an opinion before it was even out (hype).

Take it or leave it, I am not defending them. I am only showing you how this is very, very problematic; and most folks don't even see it anymore.
OK, so basically what you're saying is that I'm not forced to get the product that's best suited to my needs, because I'm not forced to do so? Wait, what? What does this have to do with a segment of the market that's running completely unopposed at the time of writing? (This would be where those "high hopes" I mentioned would fit in.)
Hype is "this will blow X product out of the water". Realism is "they had a whole year to get this to at least OK levels" - and this isn't OK from where I'm sitting. Most users/people/whatever do actually care about price/performance, more so when we're talking about PC parts. But please permit me to doubt that the same chip in a "gaming" version will do a far better job.
Theorycrafting: if it turns out to be 5% below or above the Ti, then good; that means the market will have competition and it will force a reaction, thus pushing the industry further (note Intel's reaction to Ryzen).
People buy whatever is suited to their needs. If they feel the need for a $2000 GPU/CPU/whatever, they will throw money at it.
I don't really care if it's red, blue, green, purple, or has polka dots on it, as long as it is an incentive for innovation.
My bad on the usage of "everyone"; should've stuck with "the majority".
#133
DeathtoGnomes
**put the waders on**

The amount of ...less than spectacular displays of knowledge... posts shows how many are in need of doing research before guess-posting.

In the meantime, please don't forget this card is not a gaming card and the drivers could be considered beta level. :cool:
#134
efikkan
Hugh Mungus: Wasn't even running 1600 MHz like it should. If it was running at something like 1400 MHz, that would mean a 1600 MHz score of about 19.5k at least. Besides, we don't know what support's like, we don't know how well RX Vega will do, and most importantly, Fire Strike isn't an actual game. Lots of ifs, dunnos, and a guy doing the benchmarks who doesn't understand Wattman. Great, now we still don't know how good RX Vega is.
You don't know how boost works.
Makers usually specify the base clock and typical/average boost clock. But this time AMD has chosen to specify the typical clock and the peak boost clock, probably to get a higher peak throughput number to "beat" the competition. However, if we applied the same method to Nvidia, the rated "performance" of the Titan Xp would become 14.2 TFLOP/s (assuming an 1850 MHz peak clock).
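To illustrate with a quick sketch (the 3840 CUDA cores and 1582 MHz official boost are the Titan Xp's public specs; the 1850 MHz peak is the assumption above):

```python
# Quote the peak clock instead of the official boost clock and the
# headline TFLOP/s figure inflates. 3840 CUDA cores and the 1582 MHz
# official boost are the Titan Xp's public specs; 1850 MHz peak is the
# assumption made in the post above.
CUDA_CORES = 3840

for label, mhz in [("official boost (1582 MHz)", 1582),
                   ("assumed peak (1850 MHz)", 1850)]:
    tflops = CUDA_CORES * 2 * mhz * 1e6 / 1e12  # 2 FLOPs per core per clock
    print(f"{label}: {tflops:.1f} TFLOP/s")

# official boost (1582 MHz): 12.1 TFLOP/s
# assumed peak (1850 MHz): 14.2 TFLOP/s
```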
#136
Steevo
efikkan: You don't know how boost works.
Makers usually specify the base clock and typical/average boost clock. But this time AMD has chosen to specify the typical clock and the peak boost clock, probably to get a higher peak throughput number to "beat" the competition. However, if we applied the same method to Nvidia, the rated "performance" of the Titan Xp would become 14.2 TFLOP/s (assuming an 1850 MHz peak clock).
The guy with the card posted that he had to change the Wattman settings to get it to stay at 1600 MHz, and only found that out today when playing The Witcher.
#137
Fluffmeister
When are the consumer gaming orientated versions due?
#138
Basard
So.... It's a Fury X with higher clocks and "better" memory....


Edit:
I No: My point of view:

1. No driver update in this world would boost it by 30-35% (where the hype train showed it would land).
2. Prosumer card or not, it's marketed with a "GAMING MODE". Last time I checked, the Titan didn't perform worse than the "gaming" version, although the Titan is clearly aimed at gaming (no pro driver support); it also has some deep-learning usage where it is 20-25% faster than a 1080 Ti.
3. The price is outrageous for what it can do (don't get me wrong, the Titan is a rip-off as well) in both gaming and/or pro use.
4. AMD did state they won't be supplying the card to reviewers. Yet they were the ones that made a comparison vs a Titan in the first place (mind you, in pro usage, with the Titan on regular drivers - thanks nVidia for that).
5. No sane person would pick this vs a Quadro, or hell, even a WX variant for pro usage.
6. AMD dropped the ball when they showed a benchmark vs a Titan (it doesn't even matter what the object of the bench itself was). nVidia, being them, never showed a slide where the competition is mentioned; they only compare the new gen vs the current gen (tones down the hype).
7. Apart from the 700 series, every Titan delivered what was promised (and that's cuz they didn't use the full chip in the Titan).

Long story short: a rushed, unpolished release. Jack-all support. Denying reviewers a sample. AMD is trying to do what nVidia does with the Titan, but it's clearly not in the position to do so. In other words, AMD just pulled another classic "AMD cock-up". Again, this is my POV; I had high hopes for this... but then again, I should've known better...
Seems pretty much spot on. The Vega will also deliver what was promised. The first "prosumer" card, lol. Ever since ATI became AMD, that's all they've seemed to be able to sell - prosumer cards. The architecture always seems so generic and too future-proof. Too open and reliant on people devoting their free time to help "the cause"...
#139
ShurikN
I'm actually wondering why this card exists.
It's not a gaming card, and at $1000 it can't compete with similar offerings.
It's not a workstation card, as it gets beaten by much smaller (die) WS GPUs...

The only reason I can think of as to why AMD is wasting Vega chips on this abomination (and not on RX) is probably that the early chips were utter crap. Hence the delay between FE and RX Vega.

A jack of all trades, but pretty mediocre at all of them.
#140
Basard
ShurikN: I'm actually wondering why this card exists.
It's not a gaming card, and at $1000 it can't compete with similar offerings.
It's not a workstation card, as it gets beaten by much smaller (die) WS GPUs...

The only reason I can think of as to why AMD is wasting Vega chips on this abomination (and not on RX) is probably that the early chips were utter crap. Hence the delay between FE and RX Vega.

A jack of all trades, but pretty mediocre at all of them.
It's the best they could do with a shrunken Fiji....
Now, if they could hit 2 GHz.... Nvidia seems to have whooped AMD's ass on the way to 2 GHz. Oh well, you can't win ALL of the GHz battles, AMD.
The slogan should have been "Vega--The Bulldozer of GPUs," the RX Vegas will be "RX Vega--The Piledriver of GPUs," and hopefully by the time Navi 2.0 comes out we will have something better than Ryzen to compare it to.
I hate to hate AMD, but I hope the next revision of Ryzen will offer something other than more cores and mediocre IPC.
Done ranting for the night. Later guys! :pimp:
#141
Th3pwn3r
Fluffmeister: When are the consumer gaming orientated versions due?
Are you new here or just to the Vega 'situation'?

And my lord, there are so many terrible, stupid comments filled with BS that I wish I had more faces and palms.
#142
toilet pepper
EarthDog: But wait!!!!!!!!! Drivers aren't mature and this dude potentially didn't set up configurations for it......... Ugh, the speculation to play both sides to the middle........... bleh.


Guys, GPUs don't 'brown out' and get "slower" because of an inadequate PSU... ;)
Uhmmm, JayzTwoCents made a video about that today. Even the cables you use from the PSU mattered in benchmarking. Insignificant, but there was a change.
#143
Fluffmeister
Th3pwn3r: Are you new here or just to the Vega 'situation'?

And my lord, there are so many terrible, stupid comments filled with BS that I wish I had more faces and palms.
I'm new to the V3ga 'situation'!

But i hear ya, it's facepalm galore.
#144
Th3pwn3r
Fluffmeister: I'm new to the V3ga 'situation'!

But i hear ya, it's facepalm galore.
Vega has basically been AMD teasing everyone with the cards for what seems like an eternity. A lot of people don't even think they'll come out with the consumer versions at this point; others said F it and bought 1080s/1080 Tis. I bought a 1080 but am still somewhat thinking about buying Vega, but it's getting to the point where Nvidia will have something that stomps all over it if AMD ever decides to release it.
#145
xkm1948
High Bandwidth Cache may be amazing on paper. The truth is Vega's memory bus is 2048-bit versus the Fury X's 4096-bit, and Vega has less total VRAM bandwidth as well. I wonder whether this has anything to do with the not-so-promising performance of Vega. After all, no applications or games are optimized for HBC yet.
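To put rough numbers on that (a sketch; the per-pin data rates are the cards' public memory specs, not figures from this thread):

```python
# Bus width alone doesn't determine bandwidth; the per-pin data rate
# matters too. Data rates below are the cards' public memory specs
# (an assumption here; they aren't stated in the post): Fury X HBM1
# at 1.0 Gbps/pin, Vega FE HBM2 at ~1.89 Gbps/pin.
def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits * gbps_per_pin / 8  # bits -> bytes

print(f"Fury X:  {bandwidth_gb_s(4096, 1.0):.0f} GB/s")   # 512 GB/s
print(f"Vega FE: {bandwidth_gb_s(2048, 1.89):.0f} GB/s")  # ~484 GB/s
```

So despite the halved bus width, the faster HBM2 brings Vega FE close to, though still slightly under, the Fury X's total bandwidth.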
#146
qubit
Overclocked quantum bit
So AMD's super-duper overpriced flagship card is only as fast as NVIDIA's second-tier card and consumes lots of power. Sounds about right. :rolleyes: Kinda the same thing as Ryzen versus Intel, though at least there the difference isn't always quite so large.

Maybe in another couple of generations AMD will catch up, but I'm not holding my breath.

Perhaps the upcoming gaming oriented RX Vega will be better, but I doubt it.
#147
Divide Overflow
AMD releases a $1000 workstation card that's comparable to a $5000 card from the competition. Nice work, AMD!
Looking forward to RX Vega in a little over a month and some real reviews. Reference AMD cards tend to be hot and loud, so tack on another month for custom cooling solutions to appear.
#148
TheGuruStud
Basard: So.... It's a Fury X with higher clocks and "better" memory....
No. If you believe this ridiculousness, then Fiji is much faster (about the same scoring as a Fury X at 1.25 GHz).

I guess you gotta rile up fanboys for attention, though.

Might as well ignore all posts for a couple months.
#149
xkm1948
TheGuruStud: No. If you believe this ridiculousness, then Fiji is much faster (about the same scoring as a Fury X at 1.25 GHz).

I guess you gotta rile up fanboys for attention, though.

Might as well ignore all posts for a couple months.
I agree. If they keep the 4096-bit width, just switch to HBM2, and use 14 nm to boost clock speeds, then it might be better.

And yeah, I know I know next to squat about GPU electronic engineering, yada yada.
#150
eidairaman1
The Exiled Airman
Sorry, $300 is my cap on any GPU.