Monday, June 26th 2017

Vega Frontier Ed Beats TITAN Xp in Compute, Formidable Game Performance: Preview

PC World posted a preview of the AMD Radeon Pro Vega Frontier Edition graphics card, and reported some interesting observations about the card ahead of its review NDA. The publication compared the air-cooled Pro Vega Frontier Edition against NVIDIA's fastest consumer graphics card, the TITAN Xp, and revealed performance numbers for the two cards in two compute-heavy tests, SPECviewperf 12.1 and Cinebench R15 (OpenGL test), in which the Vega FE significantly outperforms the TITAN Xp. This shouldn't come as a shocker, because AMD GPUs tend to have a strong footing in GPU compute, particularly with open standards.

It's PC World's comments on the Vega card's gaming performance that might pique your interest. In its report, the publication notes that the Radeon Pro Vega Frontier Edition offers gaming performance faster than NVIDIA's GeForce GTX 1080, but slightly slower than its GTX 1080 Ti. To back its statement, PC World claims to have run the Vega Frontier Edition and TITAN Xp in "Doom" with the Vulkan API, "Prey" with DirectX 11, and "Sniper Elite 4" with DirectX 12. You must also take into account that the Radeon Pro Vega Frontier Edition could command a four-figure price, in the league of the TITAN Xp, and that gamers should look forward to the Radeon RX Vega series, bound for a late-July/early-August launch, at price points more appropriate to its competitive positioning. The RX Vega is also expected to have 8 GB of memory, compared to 16 GB on the Frontier Edition. Watch PC World's video presentation in the source link below.
Sources: PC World, VideoCardz

77 Comments on Vega Frontier Ed Beats TITAN Xp in Compute, Formidable Game Performance: Preview

#26
the54thvoid
Super Intoxicated Moderator
RejZoRBut when AMD makes optimized games, they just optimize their own stuff. When NVIDIA does it, it's optimizing their own and actively nerfing the competition.
No, quite wrong, GameWorks pisses off everyone.

Though to be fair that's due to lazy implementation. And you can turn GW features off in most sensible titles.
Posted on Reply
#27
ZoneDymo
qubitDon't forget it's up to them to prove to us, not us to have faith in them or make excuses for them.

Think about it, if they make fantastic sales by not being quite as good as their competition, then what's to motivate them to beat their competition?
Because they are not beating the competition right now?
Gotta have revenue to develop better tech, and Nvidia and Intel have a LOT more revenue and Intel is doing nothing with it due to no competition.
So invest in AMD, so they have revenue to develop more to compete with what Intel is finally bringing out as a reaction to Ryzen and Nvidia.
Posted on Reply
#29
Steevo
the54thvoidHere are the Quadro results.

Even the Maxwell-based card beats this in compute. So on a hardware level, the pro drivers make Maxwell better than Vega. Obviously the 6000 range is the Gx100 (or GX200) core, but either way, this M6000 only has a 3072 core count (1000 fewer than Vega).



Not sure what RTG are getting at. Titan x/p/xp ali-docious has always been a stupidly expensive gaming card with limited appeal for compute. We all know that. If you want to show your compute prowess, use the M125 card, not Vega FE. I assume RTG feel the FE is the same as the Titan (i.e. a rip-off for the consumer). If so, welcome to the club, RTG/AMD, you're about to piss off your fans.

You folks do understand now, don't you? This is AMD's Titan card, and they'll charge you heavily for it, just like Nvidia do. So, do we expect all those who vehemently denigrated Nvidia for their pricing to come out and have a shot at AMD???

The black shoe called pot is on the other kettle foot now. Or something.
Considering they said it's made for compute, it's little surprise that it falls behind in gaming; Raj or whoever at RTG even said that. It's meant for oil exploration and real-time modeling, meaning there are checks in place and forced high accuracy. Speaking of which, what are your thoughts on Nvidia losing performance with HDR turned on, as it forces their whole pipeline to use 16-bit or higher color and calculations, but AMD is seeing no performance penalty? My 7970 plays HDR-enabled with no loss that I have noticed.
Posted on Reply
#30
punani
ZoneDymo"well should the TDP be an indication of actual power consumption, then this..."victory" is really not all that impressive" then sure.
But you just say "hey it uses a lot of power, its not looking good" which is just in no way linked to this comparison at all.
TDP is an indicator of consumption. Remember Fermi ? Most enthusiast gamers don't want a jet engine in their case and that's why my first post is still valid.
Posted on Reply
#31
El_Diablo
SteevoConsidering they said it's made for compute, it's little surprise that it falls behind in gaming; Raj or whoever at RTG even said that. It's meant for oil exploration and real-time modeling, meaning there are checks in place and forced high accuracy. Speaking of which, what are your thoughts on Nvidia losing performance with HDR turned on, as it forces their whole pipeline to use 16-bit or higher color and calculations, but AMD is seeing no performance penalty? My 7970 plays HDR-enabled with no loss that I have noticed.
My Fury X performs fine.
Ran it in 12-bit color for nearly 2 years now; have not noticed any difference in fps.

Vega is going to be fast, but there's so much misinformation out there.
Posted on Reply
#32
EarthDog
Yikes... no comment on the actual article but... this misinformation is killing me, lol!
ZoneDymoermmm that has nothing to do with anything in this article... and again, TDP is not the same as actual power consumption.
If I put 10 8-pin connectors on an RX 480 the TDP will be through the roof; consumption, however, stays the same.

but yeah, stop posting unrelated stuff.
And TDP has nothing to do with how many power connectors are on a card. TDP is more closely related to board power than the number of power leads on the card.
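As a back-of-the-envelope sketch of that distinction (the slot and connector limits are per the PCIe spec; the board-power figure is just an example number, not a measurement): piling on connectors only raises the theoretical delivery ceiling, it doesn't change what the board is designed to draw.

```c
#include <stdio.h>

/* Rough illustration: the slot and connectors only cap what a card MAY draw
 * (75 W from the PCIe slot, 75 W per 6-pin, 150 W per 8-pin by spec); the
 * board power / TDP is what the design actually targets. The 10x 8-pin RX 480
 * and the 150 W figure are hypothetical example numbers from the quote above. */
int main(void)
{
    int slot_w      = 75;    /* PCIe x16 slot limit              */
    int eight_pin_w = 150;   /* per 8-pin PEG connector          */
    int n_eight_pin = 10;    /* the hypothetical "10 8-pin" card */
    int board_power = 150;   /* example board-power target, unchanged by the wiring */

    int delivery_ceiling = slot_w + n_eight_pin * eight_pin_w;
    printf("Spec'd delivery ceiling: %d W\n", delivery_ceiling);  /* 1575 W */
    printf("Actual board power:      %d W\n", board_power);       /* still 150 W */
    return 0;
}
```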
Posted on Reply
#33
the54thvoid
Super Intoxicated Moderator
SteevoConsidering they said it's made for compute, it's little surprise that it falls behind in gaming; Raj or whoever at RTG even said that. It's meant for oil exploration and real-time modeling, meaning there are checks in place and forced high accuracy. Speaking of which, what are your thoughts on Nvidia losing performance with HDR turned on, as it forces their whole pipeline to use 16-bit or higher color and calculations, but AMD is seeing no performance penalty? My 7970 plays HDR-enabled with no loss that I have noticed.
Having bought a £2300 HDR 4K LG OLED last year and having very few things to actually watch (even now) in HDR, I really don't care too much. I also only game on an LED Dell monitor I bought in 2011 (U2711b) for £550 (top-end back then). I can genuinely say that HDR is good, but on a good TV (mine, for example), HD at 1080p from a good source (Blu-ray) is not that different from 4K HDR (from a good source, UHD Blu-ray).

So I couldn't tell you how Pascal fares on HDR, and I wouldn't be willing to shell out the cash for a better monitor. Always about the hardware.... And the LG is too slow for gaming.

FWIW, HDR is really, really a PR product so far with very little support. Give it a few years...

EDIT: streaming HDR at 4K is also not that great - poor compression even on Netflix and Amazon. An HD Blu-ray still looks better than streamed UHD (where I live anyway). I do also own a UHD Blu-ray player and the UHD discs are far superior to 4K streamed, but not so much better than HD discs.
Posted on Reply
#34
Unregistered
MuhammedAbdoFunny thing is, AMD compares its Vega FE to Quadro price-wise, even though an $800 Quadro P4000 wipes the floor with Vega FE. Another funny thing: AMD displayed fps on Vega demos a year ago with Doom and Battlefront, and then Sniper Elite 4 fps on Vega FE, but suddenly when the TITAN Xp is in the room the fps counters are gone! Yeah, total product confidence alright!
It's a more "pro" TITAN Xp competitor, so more work and less play, but still by no means a Quadro competitor (although it will probably fare well in most Quadro applications). For prosumers it's a bloody good card, though!
Posted on Reply
#35
EarthDog
You define 'bloody good' as performance we don't know and power rumored to be 300 and 375w?

Damn I wish I was selling products to people these days... :roll:
Posted on Reply
#36
Unregistered
EarthDogYou define 'bloody good' as performance we don't know and power rumored to be 300 and 375w?

Damn I wish I was selling products to people these days... :roll:
TDP's of 300W and 375W, probably more like 250W and 300W power usage. We have at least had a few benchmarks, enough to see vega FE can hold its own for its usecase at its pricepoint. Unless you mostly use nvidia optimized software or want more "pro" than a "prosumer" card can give you, vega FE is pretty good.
Posted on Reply
#37
efikkan
SteevoSpeaking of which, what are your thoughts on Nvidia losing performance with HDR turned on, as it forces their whole pipeline to use 16-bit or higher color and calculations, but AMD is seeing no performance penalty? My 7970 plays HDR-enabled with no loss that I have noticed.
FYI: All current games do fragment processing in fp32, which in the end is downsampled into the final framebuffer. Games are already doing HDR internally, but have been using tone mapping until now. Using HDR output should not have a negative impact on performance.
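For what it's worth, a minimal sketch of that tone-mapping step (the Reinhard operator and 2.2 gamma here are example choices, not what any particular game or driver actually uses): the shading math stays in linear fp32 either way; only the final encode differs between an SDR and an HDR output target.

```c
#include <math.h>
#include <stdint.h>
#include <stdio.h>

/* Illustration only: shaders compute lighting in linear fp32 ("HDR internally"),
 * and a final pass maps that into whatever the output target is. Reinhard and
 * a 2.2 gamma are just example choices for this sketch. */
static uint8_t tonemap_to_sdr(float hdr_radiance)
{
    float mapped  = hdr_radiance / (1.0f + hdr_radiance); /* Reinhard: [0,inf) -> [0,1) */
    float encoded = powf(mapped, 1.0f / 2.2f);            /* rough gamma encode */
    return (uint8_t)(encoded * 255.0f + 0.5f);            /* quantise for an 8-bit SDR buffer */
}

int main(void)
{
    const float samples[] = { 0.05f, 0.5f, 1.0f, 4.0f, 16.0f };
    for (size_t i = 0; i < sizeof samples / sizeof samples[0]; i++)
        printf("linear %6.2f -> SDR %3u\n", samples[i], tonemap_to_sdr(samples[i]));
    return 0;
}
```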
Posted on Reply
#38
El_Diablo
the54thvoidHaving bought a £2300 HDR 4K LG OLED last year and having very few things to actually watch (even now) in HDR, I really don't care too much. I also only game on an LED Dell monitor I bought in 2011 (U2711b) for £550 (top-end back then). I can genuinely say that HDR is good, but on a good TV (mine, for example), HD at 1080p from a good source (Blu-ray) is not that different from 4K HDR (from a good source, UHD Blu-ray).

So I couldn't tell you how Pascal fares on HDR, and I wouldn't be willing to shell out the cash for a better monitor. Always about the hardware.... And the LG is too slow for gaming.

FWIW, HDR is really, really a PR product so far with very little support. Give it a few years...

EDIT: streaming HDR at 4K is also not that great - poor compression even on Netflix and Amazon. An HD Blu-ray still looks better than streamed UHD (where I live anyway). I do also own a UHD Blu-ray player and the UHD discs are far superior to 4K streamed, but not so much better than HD discs.
Too slow?
22 ms is fine for gaming for me.
Human reaction time is around 300-500 ms anyway.

HDR = higher bits per color,
that's supposed to eliminate color banding,
so anything below Blu-ray quality that's advertised as HDR is not worth it.
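For rough numbers behind that (just the common bit depths as examples), each extra bit doubles the shades per channel, which is what shrinks the visible steps in a gradient:

```c
#include <stdio.h>

/* Back-of-the-envelope numbers behind "higher bits per color = less banding":
 * an n-bit channel has 2^n distinct levels, so the step between adjacent
 * shades across a full-range gradient shrinks as n grows. */
int main(void)
{
    int depths[] = { 8, 10, 12 };
    for (int i = 0; i < 3; i++) {
        int levels = 1 << depths[i];   /* 2^n levels per channel */
        printf("%2d-bit: %5d levels, step = %.4f%% of full range\n",
               depths[i], levels, 100.0 / (levels - 1));
    }
    return 0;
}
```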
Posted on Reply
#39
EarthDog
Hugh MungusTDP's of 300W and 375W, probably more like 250W and 300W power usage. We have at least had a few benchmarks, enough to see vega FE can hold its own for its usecase at its pricepoint. Unless you mostly use nvidia optimized software or want more "pro" than a "prosumer" card can give you, vega FE is pretty good.
Never saw board power be so much less than the TDP... we have seen business uses, not games... or at least not games that are highly optimized for AMD in the first place. It's more pro than consumer.
Posted on Reply
#40
cdawall
where the hell are my stars
Hugh MungusTDP's of 300W and 375W, probably more like 250W and 300W power usage. We have at least had a few benchmarks, enough to see vega FE can hold its own for its usecase at its pricepoint. Unless you mostly use nvidia optimized software or want more "pro" than a "prosumer" card can give you, vega FE is pretty good.
When in the history of AMD have they released a GPU that offered substantially lower power usage vs TDP?
Posted on Reply
#41
radrok
Meh, too late imo.

Even though I love Radeon graphics I just snagged a second reference 1080 Ti for like 550€, guess I'm fine for a couple of years.
Posted on Reply
#42
jabbadap
Were there compute benches somewhere? I only see SPECviewperf (D3D11, OGL) and Cinebench R15 (OGL), neither of which benchmarks compute performance.
Posted on Reply
#43
ZoneDymo
punaniTDP is an indicator of consumption. Remember Fermi? Most enthusiast gamers don't want a jet engine in their case, and that's why my first post is still valid.
I'm not saying it's not an indicator.
I'm saying the framing of the comment is just off.
Unless you add a reason for suddenly talking about the power consumption of either card, it's a random new variable that is not mentioned/present in the article.
The article is purely comparing the performance of the two cards in a set of tests.

I mean you might as well start talking about the colors of the card, or how convenient it is to fit one in your case, or store availability.
Unless you also provide context like I did in my example, it has nothing to do with the article and thus is unrelated.
Posted on Reply
#44
qubit
Overclocked quantum bit
ZoneDymoBecause they are not beating the competition right now?
Gotta have revenue to develop better tech, and Nvidia and Intel have a LOT more revenue and Intel is doing nothing with it due to no competition.
So invest in AMD, so they have revenue to develop more to compete with what Intel is finally bringing out as a reaction to Ryzen and Nvidia.
Well, I'll let you do that investing then!

The situation's a bit of a catch 22 really, because they need the money for R&D and we need to look after our interests by buying the best products at the time which may not be AMD. It's up to them to find the money some other way, perhaps with loans, investments etc. The kind of thing that highly paid corporate accountants are good at. ;)

No, they're not beating Intel on IPC unfortunately, which means that games run faster on Intel and that's an important selling feature that will directly affect their sales. There's loads of reviews out there that show this, including a couple on TPU.
Posted on Reply
#45
Xzibit
the54thvoidNo, quite wrong, Gameworks pisses off everyone.

Though to be fair that's due to lazy implementation. And you can turn GW features off in most sensible titles.
Not picking on you, but there is really no reason to be lazy, especially when Nvidia is sponsoring and has gone on record in several tech interviews (print and video) via Tom Petersen (PCPerspective) saying they send a team of GameWorks tech engineers to the game companies to help with GameWorks in their titles.

Not really lazy implementation at that point.
Posted on Reply
#46
HD64G
punanidude.... the article is Vega FE vs Titan, and my post was Vega FE vs Titan. And they most likely do not set the TDP that high for no reason.
www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_Ti/28.html

Here it says that even the 1080 Ti FE consumes ~270 W max.

www.techpowerup.com/reviews/Zotac/GeForce_GTX_1080_Ti_Amp_Extreme/29.html

Here again, it shows that a factory-overclocked 1080 Ti is able to consume up to 320 W.

In conclusion, don't look at the official TDP numbers to make assumptions about the true consumption of a CPU or GPU. My estimation is that full RX Vega will consume close to 280 W. The top version with WC might surpass 300 W. Normal for top-tier gaming cards imho. And the TDP limit allows some OC headroom too, so it MUST be higher than the stock-clock consumption.
Posted on Reply
#47
evernessince
the54thvoidDoom with Vulkan.
Prey (is DX11 anyway)
Sniper Elite DX12

Cherry. That's all.

However, not too long now, only another month and a bit. And a bit more. As mentioned too, AMD have been humping Nvidia in compute for ages now on their gaming-based cards. But to be realistic, I don't use compute to game.
Actually you do. Many newer game titles use compute and the amount of compute being used in games is only increasing.
Posted on Reply
#48
Fluffmeister
Hopefully someone will bench this poncy FE card soon, my God... this is the most epically drawn out launch EVAR.
Posted on Reply
#49
Unregistered
FluffmeisterHopefully someone will bench this poncy FE card soon, my God... this is the most epically drawn out launch EVAR.
Just how Americans like their Disney soaps!
Posted on Reply
#50
Waxinator
That test seems a bit odd...
My old "AMD Radeon R9 290" + "AMD Ryzen 7 1800X" gets 107.08 fps in OpenGL ?!?
Posted on Reply