Monday, June 26th 2017
Vega Frontier Ed Beats TITAN Xp in Compute, Formidable Game Performance: Preview
PC World posted a preview of the AMD Radeon Pro Vega Frontier Edition graphics card, and reported some interesting observations about the card ahead of its review NDA. The tech publication compared the air-cooled Pro Vega Frontier Edition against NVIDIA's fastest consumer graphics card, the TITAN Xp, and revealed performance numbers for the two cards in two compute-heavy tests, SPECviewperf 12.1 and Cinebench R15 (OpenGL test), where the Vega FE significantly outperforms the TITAN Xp. This shouldn't come as a shock, as AMD GPUs tend to have a strong footing in GPU compute performance, particularly with open standards.
It's PC World's comments on the Vega card's gaming performance that might pique your interest. In its report, the publication comments that the Radeon Pro Vega Frontier Edition offers gaming performance that is faster than NVIDIA's GeForce GTX 1080, but slightly slower than its GTX 1080 Ti graphics card. To back its statement, PC World claims to have run the Vega Frontier Edition and TITAN Xp in "Doom" with Vulkan API, "Prey" with DirectX 11, and "Sniper Elite 4" with DirectX 12. You must also take into account that the Radeon Pro Vega Frontier Edition could command a four-figure price, in the league of the TITAN Xp; and that gamers should look forward to the Radeon RX Vega series, bound for a late-July/early-August launch, at price-points more appropriate to their competitive positioning. The RX Vega is also expected to have 8 GB of memory compared to 16 GB on the Frontier Edition. Watch PC World's video presentation in the source link below.
Sources:
PC World, VideoCardz
77 Comments on Vega Frontier Ed Beats TITAN Xp in Compute, Formidable Game Performance: Preview
Though to be fair that's due to lazy implementation. And you can turn GW features off in most sensible titles.
Gotta have revenue to develop better tech, and Nvidia and Intel have a LOT more revenue and Intel is doing nothing with it due to no competition.
So invest in AMD, so they have revenue to develop more to compete with what Intel is finally bringing out as a reaction to Ryzen and Nvidia.
Look at these figures.
This is the Quadro score.
It's rendered at 1900x1060.
uploads.disquscdn.com/images/0359c7ac8787f70354c703109e43f7b822f3d22f5dd8289aca764333820b236c.png
This is the Vega score, rendered at 4K but in a window of 1900x1060.
Vega totally wrecks the NVIDIA GPUs.
I've run it in 12-bit color for nearly 2 years now and have not noticed any difference in fps.
Vega is going to be fast, but there's so much misinformation out there.
So I couldn't tell you how Pascal fares on HDR, and I wouldn't be willing to shell out the cash for a better monitor. Always about the hardware... And the LG is too slow for gaming.
FWIW, HDR is really, really a PR product so far with very little support. Give it a few years...
EDIT: streaming HDR at 4K is also not that great - poor compression even on Netflix and Amazon. An HD Blu-ray still looks better than streamed UHD (where I live, anyway). I also own a UHD Blu-ray player, and the UHD discs are far superior to 4K streamed, but not that much better than HD discs.
Damn I wish I was selling products to people these days... :roll:
22ms is fine for gaming for me
human reaction time is around 300-500ms anyway
HDR = higher bit depth per color.
That's supposed to eliminate color banding.
So anything below Blu-ray quality that's advertised as HDR is not worth it.
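The bit-depth point above is easy to put in numbers: each extra bit per channel doubles the number of representable shades, which shrinks the gap between adjacent levels and makes banding less visible. A minimal sketch (the function names are illustrative, not from any real API):

```python
# Sketch: why higher bit depth per color channel reduces banding.
# A channel with n bits encodes 2**n discrete levels; the step between
# adjacent levels on a normalized 0.0-1.0 scale shrinks as n grows.

def levels(bits: int) -> int:
    """Number of distinct values a single color channel can encode."""
    return 2 ** bits

def step_size(bits: int) -> float:
    """Gap between adjacent levels on a normalized 0..1 brightness scale."""
    return 1.0 / (levels(bits) - 1)

for bits in (8, 10, 12):
    print(f"{bits}-bit: {levels(bits):5d} levels, step = {step_size(bits):.6f}")
```

Going from 8-bit (256 levels per channel) to 10-bit (1024 levels) cuts the step between adjacent shades to roughly a quarter, which is the main thing HDR formats exploit to avoid visible banding in smooth gradients.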
Even though I love Radeon graphics I just snagged a second reference 1080 Ti for like 550€, guess I'm fine for a couple of years.
I'm saying the framing of the comment is just off.
Unless you add a reason for suddenly talking about the power consumption of either card, it's a random new variable that is not mentioned or present in the article.
The article is purely comparing the performance of the two cards in a set of tests.
I mean you might as well start talking about the colors of the card, or how convenient it is to fit one in your case, or store availability.
Unless you also provide context like I did in my example, it has nothing to do with the article and thus is unrelated.
The situation's a bit of a catch 22 really, because they need the money for R&D and we need to look after our interests by buying the best products at the time which may not be AMD. It's up to them to find the money some other way, perhaps with loans, investments etc. The kind of thing that highly paid corporate accountants are good at. ;)
No, they're not beating Intel on IPC unfortunately, which means that games run faster on Intel and that's an important selling feature that will directly affect their sales. There's loads of reviews out there that show this, including a couple on TPU.
Not really lazy implementation at that point.
Here it says that even FE of 1080Ti consumes ~270W max.
www.techpowerup.com/reviews/Zotac/GeForce_GTX_1080_Ti_Amp_Extreme/29.html
Here again, it shows that a factory-overclocked 1080 Ti is able to consume up to 320W.
In conclusion, don't look at the official TDP numbers to make assumptions about the true consumption of a CPU or GPU. My estimate is that the full RX Vega will consume close to 280W. The top version with WC might surpass 300W. Normal for top-tier gaming cards imho. And the TDP limit allows some OC too, so it MUST be higher than the stock-clock consumption.
My old "AMD Radeon R9 290" + "AMD Ryzen 7 1800X" gets 107.08 fps in OpenGL ?!?