Wednesday, June 28th 2017

AMD Radeon Pro Vega Frontier Edition Unboxed, Benchmarked

A lucky customer has already gotten his hands on one of these coveted, sky-blue AMD graphics cards and is currently setting up his system. Given the absence of review samples from AMD to any outlet (a short Vega Frontier Edition supply ensured as much), there isn't any other real way to get impressions of this graphics card. As such, we'll be borrowing Disqus user #define's posts as a way to cover live pics and performance measurements of this card. Expect this post to be updated as new developments arise.

After some glamour shots of the card were taken (which really are justified by its unique color scheme), #define commented on the card's build quality. Having installed the driver package (which, as we've covered today, includes both a developer and a gaming path, granting increased performance in either workload depending on the enabled driver profile), he is now about to conduct some testing in SPECviewperf and 3DMark, with both gaming and non-gaming profiles.
Specs of the system include an Intel Core i7-4790K (apparently at its stock 4.0 GHz), an ASUS Maximus VII Impact motherboard, 16 GB (2x8 GB) of Corsair Vengeance Pro Black DDR3 running at 2133 MHz, and a 550 W PSU.

Update 1: #define has posted a screenshot of the card's score in 3DMark's Fire Strike graphics test. The user reported that the Pro drivers' score "didn't make sense", which we take to mean the Pro drivers are uncooperative with actual gaming workloads. On the Game Mode driver side, though, #define reports GPU frequencies that are "all over the place". This is probably a result of AMD's announced 1382 MHz typical/base clock and up to 1600 MHz peak/boost clock. It is as yet unknown whether these frequencies scale as much with GPU temperature and power constraints as they do on NVIDIA's Pascal architecture, but the fact that #define is using a small case along with the Frontier Edition's blower-style cooler could mean the graphics card is throttling heavily. That would also go some way towards explaining the 3DMark score of AMD's latest (non-gaming-geared, I must stress) graphics card: 17,313 points isn't especially convincing. Subsequent test runs produced higher scores of 21,202, 21,421, and 22,986. However, do keep in mind these are launch drivers, on a graphics card that isn't officially meant for gaming (at least, not in the sense we are all used to). It is also unclear whether there are some configuration hoops that #define failed to jump through.

Update 2: After fiddling around with Wattman settings, #define managed to run some more benchmarks. Operating frequency should be more stable now, but alas, there still isn't much information regarding frequency stability or the amount of throttling, if any. He reports he had to raise Wattman's Power Limit by 30%, however; #define also adjusted the last three power states in a bid to decrease frequency variability, setting all of them to the 1602 MHz that AMD rates as the peak/boost frequency. Temperature limits were set to their maximum value.
The latest results put this non-gaming Vega card in the same ballpark as a GTX 1080:
For those in the comments going on about the Vega Frontier Edition's professional performance, I believe the following results will come as a shock. #define tested the card in SPECviewperf with the Pro drivers enabled, and the results... well, speak for themselves.

#define posted some SPECviewperf 12.1 results from NVIDIA's Quadro P5000 and P6000 on Xeon machines below (also available at the source links):
He then proceeded to test the Vega Frontier Edition, which gave us the following results:

So, this is the Frontier Edition Vega, which is neither a purely professional nor a purely consumer video card, straddling the line in a prosumer posture of sorts. And as you know, being a jack of all trades usually means you can't be a master of any of them. So let's look at the value proposition: here we have a prosumer video card costing $999, battling a $2,000 P5000. Some of its losses are deep, but it still ekes out some wins. Averaging results between the Vega Frontier Edition (1014.56 total points) and the Quadro P5000 (1192.23 points), we see the Vega card delivering around 85% of the P5000's performance... for 50% of its price. So if you go with NVIDIA's Quadro P5000, you're trading roughly a 100% increase in purchase cost for an 18% performance increase. You tell me if that's worth it. Comparisons to the P6000 are even more lopsided (though that's usual considering the increase in pricing). The P6000 averages 1338.49 points versus Vega's 1014.56, so a 32% performance increase from the Quadro P6000 comes with a price tag of... wait for it... $4,800, which means that a 32% increase in performance will cost you a 380% increase in dollars.
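The arithmetic above is easy to reproduce; here is a minimal Python sketch using only the averaged SPECviewperf 12.1 totals and list prices quoted above (the simple percentage-change formula and the dictionary layout are our own, not anything from the benchmark suite):

```python
# Price/performance comparison from the averaged SPECviewperf 12.1 totals
# and list prices quoted in the article.
cards = {
    "Vega Frontier Edition": {"score": 1014.56, "price": 999},
    "Quadro P5000":          {"score": 1192.23, "price": 2000},
    "Quadro P6000":          {"score": 1338.49, "price": 4800},
}

vega = cards["Vega Frontier Edition"]
for name in ("Quadro P5000", "Quadro P6000"):
    rival = cards[name]
    # Percentage increase of the Quadro over the Vega FE, in score and in price.
    perf_gain = (rival["score"] / vega["score"] - 1) * 100
    price_gain = (rival["price"] / vega["price"] - 1) * 100
    print(f"{name}: +{perf_gain:.0f}% performance for +{price_gain:.0f}% price")
```

Run as-is, this yields roughly +18% performance for +100% price on the P5000, and +32% performance for +380% price on the P6000, which is where the figures in the paragraph above come from.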

Update 3:

Next up, #define did some quick testing of the Vega Frontier Edition's actual gaming chops in The Witcher 3, with the gaming fork of the drivers enabled. Refer to the system specs posted above. He ran the game at 1080p, Über mode with HairWorks off. At those settings, the Vega Frontier Edition posted around 115 frames per second in open fields and around 100 FPS in city environments. Setting the last three power states to 1602 MHz seems to have stabilized clock speeds.

Update 4:

#define has now run 3DMark's Time Spy benchmark, which uses a DX12 render path. Even though frequency stability has improved on the Vega Frontier Edition due to the change to the three last power states, the frequency still varies somewhat, though we can't tell by how much due to the way data is presented in Wattman. That said, the Frontier Edition Vega manages 7,126 points in the graphics portion of Time Spy. This is roughly in the ballpark of a stock GTX 1080, though still a tad lower than most.
Sources: #define @ Disqus, Spec.org, Spec.org

200 Comments on AMD Radeon Pro Vega Frontier Edition Unboxed, Benchmarked

#151
sweet
xkm1948I agree. If they keep the 4096-bit width, just switch to HBM2 and use 14 nm to boost clock speeds, then it might be better.

And yeah I know I know next to squat about GPU electronic engineering yadayada.
HBM2 is not cheap, and 4096-bit means 4 stacks, doubling the cost of the VRAM, unfortunately.
Posted on Reply
#152
xkm1948
sweetHBM2 is not cheap, and 4096-bit means 4 stacks, doubling the cost of the VRAM, unfortunately.
Instead of 2 stacks of 8GB, how about 4 stacks of 4GB? Or better, 4 stacks of 2GB? More bandwidth is always good.
Posted on Reply
#153
laszlo
too expensive for what can do...

another review:

quite a fiasco for amd ...
Posted on Reply
#154
sweet
laszlotoo expensive for what can do...

another review:

quite a fiasco for amd ...
For what it can do, it is quite a bargain. We are talking about a $1000 card that is equal to a $5000 one in pro tasks.

It's just wrong if this card is judged solely on its gaming performance.
Posted on Reply
#155
Xzibit
laszlotoo expensive for what can do...

another review:

quite a fiasco for amd ...
Unfortunately I saw most of the video. At 1hr 20min he realizes he's not in gaming mode (I think he gets told by the viewers). Then he has issues with Afterburner and doesn't want to use WattMan because he doesn't really know how.
Posted on Reply
#156
TheGuruStud
XzibitUnfortunately I saw most of the video. At 1hr 20min he realizes he's not in gaming mode (I think he gets told by the viewers). Then he has issues with Afterburner and doesn't want to use WattMan because he doesn't really know how.
And gaming mode doesn't do anything for games lol. It's just chill, wattman and w/e the record shit is called...?
Posted on Reply
#157
laszlo
sweetFor what it can do, it is quite a bargain. We are talking about a $1000 card that is equal to a $5000 one in pro tasks.

It's just wrong if this card is judged solely on its gaming performance.
at some pro task it may deliver; consumer versions will not be better in terms of gaming and they can't ask more than 500 $ for top product... wonder if they manage to cover costs...
Posted on Reply
#158
sweet
laszloat some pro task it may deliver; consumer versions will not be better in terms of gaming and they can't ask more than 500 $ for top product... wonder if they manage to cover costs...
Why are you so sure about that bold statement? Note that AMD themselves said that RX Vega will be much faster than Vega FE in games. They also said that who wants to game should wait for RX Vega as well.

Also, we still don't know if RX Vega comes with HBM or not. If not, the production cost would be significantly reduced and a much cheaper price is quite feasible while maintaining some margin.
Posted on Reply
#159
Xzibit
TheGuruStudAnd gaming mode doesn't do anything for games lol. It's just chill, wattman and w/e the record shit is called...?
He didn't run tests after he installed the updated drivers. Once he was told, he installed them, just loaded W3, and ran around for 10 seconds in the grass, and made his comparison from that. That's a summary of the last 30 minutes of the video. At 1hr 35min+ he says he was getting 32-35 on the previous drivers, then with the new ones he says he gets 36-43 in game mode & pro mode.

The video should be called "I don't know how to work my machine", because he keeps saying "It's gonna crash, I think it's gonna crash!!!" :laugh:

This was a pro benchmarker's move. When he's using the drivers that came with the card, there is a part where he is running a test; he looks at the info display and says "I might have something running in the background? Yeah, I have something running in the background" o_O
Posted on Reply
#160
Unregistered
GC_PaNzerFINAfter 10 years of similar fan and cooling designs you expect THIS time it changes everything about how radial coolers have worked? Having reviewed and/or owned probably in excess of 100 graphics cards, very many with radial fan designs, I do find your ignorance of reality a bit insulting.
EarthDogyawn.. links...
You go look through thousands of games. I have a life.
ShurikNI'm actually wondering why this card exists.
It's not a gaming card, and at $1000 it can't compete with similar offerings.
It's not a workstation card, as it gets beaten by much smaller (die) ws gpus...
Disproven.
The only reason I can think of as to why AMD is wasting Vega chips on this abomination (and not on RX) is probably that the early chips are utter crap. Hence the delay between FE and RX Vega.
Could be, although vega FE isn't crap.
A jack of all trades, but pretty mediocre at all of them.
Again, disproven.
#161
Boosnie
laszlotoo expensive for what can do...

another review:

quite a fiasco for amd ...
My eyes and ears are bleeding.
My brain is bleeding too for the utter lack of competence of this guy.
Posted on Reply
#163
rtwjunkie
PC Gaming Enthusiast
Hugh MungusYou go look through thousands of games. I have a life.
YOU made the claim. Those of us who apparently, by your insinuation, don't have a life :rolleyes: already know the number of DX12 games is minuscule. DX12 has so far turned out to be the most irrelevant DX yet.

When you make a claim like that in a forum, you need to be able to back it up, which you seem to be unable/unwilling to do.
Posted on Reply
#164
Xzibit
RecusThrottling
That's another funny part. He's not using 2 fans to cool it. He has a Mickey Mouse setup: a Noctua (92 mm, I think) fan wrapped in Amazon Prime cardboard pushing air in, when he has a Corsair 120 mm fan right next to it, and a similar Noctua with Amazon Prime cardboard sucking air out of the exhaust.

@1:42:00+

It's funny how he says it's thermal throttling 3 seconds after loading W3, when the screen with Afterburner says the temp is 51°C.
Posted on Reply
#165
qubit
Overclocked quantum bit
I NoMy point of view :

1. No driver update in this world would boost it by 30-35% (where the hype-train showed it would land).
2. Prosumer card or not it's marketed with a "GAMING MODE". Last time I checked the Titan didn't perform worse than the "gaming" version although the Titan is clearly aimed at gaming (no pro driver support) but it also has some deep-learning usage where it is 20-25% faster than a 1080 Ti.
3. The price is outrageous for what it can do (don't get me wrong the Titan is a rip-off as well) in both gaming and/or pro use
4. AMD did state they won't be supplying the card for reviewers. Yet they were the ones that made a comparison vs a Titan in the first place (mind you in pro usage with the Titan on regular drivers - Thanks nVidia for that).
5. No sane person would pick this vs a Quadro or hell even a WX variant for pro usage.
6. AMD dropped the ball when they showed a benchmark vs a Titan (doesn't even matter what was the object of the bench itself). nVidia being them never showed a slide where the competition is mentioned they only compare the new gen vs the current gen (tones down the hype).
7. Apart from the 700 series every Titan delivered what was promised(and that's cuz they didn't use the full chip in the Titan).

Long story short rushed, unpolished release. Jack all support. Denying the reviewers a sample. AMD is trying to do what nVidia does with the Titan but it's clearly not in the position to do so. In other words AMD just pulled another classic "AMD cock-up". Again this is my POV, had high hopes for this ... but then again I should've known better...
Don't worry, none of this will stop the AMD fanboy apologists from defending AMD to the hilt, attacking the likes of you and me. :rolleyes: Weird, as we're not the enemy here, just pointing out the shortcomings of some for-profit company selling substandard stuff, so they should be thanking us instead. Oh and the fanboys don't even get paid for it. :laugh:
Posted on Reply
#166
Liviu Cojocaru
I am not sure why people are so bothered about the gaming side of this card and compare it with the Titan Xp in gaming, it's not made for that is like you would compare a Ferrari with a comfort mode available on it with an S class for the comfort on the normal roads...
Posted on Reply
#167
Imsochobo
KainXSwell he only has a 550W PSU when AMD recommends 850W for this card.(bet 750 would be fine though)
'tis no issue on a single-rail 550 W PSU without OC.
end of discussion.
Posted on Reply
#168
KainXS
Imsochobotis no issue on a singlerail psu 550W without OC.
end of discussion.
The power discussion is over, yes, I know.
KainXSNormally when a GPU does not get enough power from the PSU, it crashes the system or cuts off via its protection, but with that system and a 600 W PSU, the PSU must have been bad or cheaply made. Even with this Vega card, a 550 W PSU is not what I would recommend, but if his PSU is a quality one (RM550 or something) it should run, and the PSU should not be the limiting factor here. Thermaltake years ago made some really shoddy PSUs on their low end; maybe that was your problem.
Posted on Reply
#169
Basard
TheGuruStudNo. If you believe this ridiculousness, then Fiji is much faster (about same scoring as Fury X at 1.25 GHz).

I guess you gotta rile up fanboys for attention, though.

Might as well ignore all posts for a couple months.
Yeah... that's why I come here, to rile up fanboys and to get attention...

So it's a Fury X at 1600 MHz with shittier memory then?

I just call 'em like I see 'em. Based on all of the posts I've seen in the last couple months, I see this as a 1600 MHz Fury X, which isn't bad... just not great.
Posted on Reply
#170
TheGuruStud
BasardYeah... that's why I come here, to rile up fanboys and to get attention...

So it's a Fury X at 1600Mhz with shittier memory then?

I just call em like I see em. Based on all of the posts I've seen in the last couple months, I see this as a 1600Mhz Fury X, which isn't bad... just not great.
I meant "you" as a vague reference to everyone reposting this idiot from Disqus.

A Fury X with the Polaris uarch would be a lot better than this... that's why this is just a joke.
Posted on Reply
#171
Th3pwn3r
qubitDon't worry, none of this will stop the AMD fanboy apologists from defending AMD to the hilt, attacking the likes of you and me. :rolleyes: Weird, as we're not the enemy here, just pointing out the shortcomings of some for-profit company selling substandard stuff, so they should be thanking us instead. Oh and the fanboys don't even get paid for it. :laugh:
Do Nvidia fanboys get paid for their smear campaign? This card is actually really good for what its intended use is.

Before you try to say I'm a fanboy, my best card is a 1080. However, I have two Amd cards and two Nvidia in my four running machines. When the consumer/gaming version of Vega drops I'm either getting one or another 1080ti. I'm not going to sit here bashing without knowing what I'm talking about though like 50% of people here.
Posted on Reply
#173
qubit
Overclocked quantum bit
Th3pwn3rDo Nvidia fanboys get paid for their smear campaign?
I've no idea, you'll have to ask one.
Th3pwn3rThis card is actually really good for what its intended use is.
I think not, and I've explained why, so no point repeating myself. If you don't agree, that's fine by me. You certainly do sound like an AMD apologist/fanboy, and owning an NVIDIA card doesn't change that.
Posted on Reply
#174
EarthDog
Hugh MungusYou go look through thousands of games. I have a life.
Like he said, you made the claim. I went out and provided a link (since I have no life, therefore the time) and here we are again with your lips around our collective rims blowing smoke up it. I don't mind the lips, but your words hold ZERO merit without support.

Support your claims or continue to be grouped with the other clueless muppets blowing smoke.
Posted on Reply
#175
PerfectWave
Why do some people test this card at 1080p? And in non-gaming apps? Really, really...
Posted on Reply