Tuesday, May 16th 2017

AMD Announces Radeon Vega Frontier Edition - Not for Gamers

Where is Vega? When is it launching? At AMD's Financial Analyst Day 2017, Raja Koduri addressed the speculation of the past few weeks and brought us an answer: Radeon Vega Frontier Edition is the first iteration of Vega, aimed at data scientists, immersion engineers, and product designers. It will be released in the second half of June for AMD's "pioneers". This wording means AMD still technically delivers Vega within the first half of 2017... it's just not the consumer, gaming version of the chip. This could unfortunately signify an after-June release time-frame for consumer GPUs based on the Vega micro-architecture.

This news comes as a disappointment to all gamers who have been hoping for a gaming Vega, because it is reminiscent of what happened with dual Fiji: a promising design that ended up unsuitable for gaming and was thus marketed to content creators as the Radeon Pro Duo, with little success. There is still hope, but it looks like we really will have to wait for Computex 2017 to see some measure of detail on Vega's gaming prowess.

Vega Frontier Edition is the Vega GPU we've been seeing in leaks over the last few weeks, packing 16 GB of HBM2 memory, which, as we posited, doesn't make much sense for typical gaming workloads. We have to say that if AMD's Vega truly delivers only a 1.5x improvement in FP32 performance (the metric most critical for gaming at the moment), it faces an uphill battle against NVIDIA's Pascal architecture, probably landing somewhere between the GTX 1070 and GTX 1080. If these figures are correct, a dual-GPU Vega may indeed be in the works to allow AMD to reclaim the performance crown from NVIDIA, albeit with a dual-GPU configuration against NVIDIA's current single-chip performance king, the Titan Xp. Also worth noting is that the AMD Radeon Vega Frontier Edition uses two PCI-Express 8-pin power connectors, which suggests a power draw north of 300 Watts.
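The "north of 300 Watts" estimate follows from the PCI-Express power-delivery limits. A minimal sketch of the arithmetic, using the spec ceilings (75 W from the x16 slot, 150 W per 8-pin connector) rather than any AMD-confirmed board power:

```python
# In-spec PCIe power budget: 75 W from the x16 slot, 150 W per 8-pin connector.
SLOT_W = 75
EIGHT_PIN_W = 150

def board_power_budget(eight_pin_connectors: int) -> int:
    """Maximum in-spec board power for a card with the given 8-pin connector count."""
    return SLOT_W + eight_pin_connectors * EIGHT_PIN_W

# Vega Frontier Edition carries two 8-pin connectors:
print(board_power_budget(2))  # 375 (watts)
```

A two-connector card can therefore draw up to 375 W while staying in spec, which is why the connector layout alone points at a figure well above 300 W.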
For now, it seems AMD went all out on the machine learning craze, chasing the higher margins available in the professional market segment rather than on the consumer side of graphics. Let's just hope they didn't do so at the expense of gaming performance leaps.

Raja opened with a retrospective of his time as head of the Radeon Technologies Group, noting the growing number of graphics engineers at AMD and the group's commitment to the basics of graphics computing: power, performance, and software. Better basics in hardware, software, and marketing are, Raja says, responsible for AMD's improved market outlook from both a gamer and a content-creator perspective, and have driven an increase in AMD's graphics market share.
RTG's chapter two of Radeon Rising, going beyond the basics, will allow the company to go after premium market dollars, with an architecture that excels at both gaming and CAD applications. Raja Koduri said he agreed with NVIDIA CEO Jensen Huang that at some point in the future, every single human being will be a gamer.
Vega's final configuration was locked down some two years ago, and AMD's vision for it was a GPU that could plow through 4K resolutions at over 60 frames per second. Vega has achieved that: Sniper Elite 4 at over 60 FPS in 4K. Afterwards, Raja talked about AMD's High Bandwidth Cache Controller, running Rise of the Tomb Raider with the system given only 2 GB of memory; the HBCC-enabled system delivered more than 3x the minimum frame-rates of the non-HBCC-enabled system, something we've seen in the past, though on Deus Ex: Mankind Divided. So now we know that wasn't just a one-shot trick.
Raja Koduri then showed AMD's SSG implementation at work in a fully ray-traced environment, with the SSG-equipped system delivering much smoother transitions. AMD worked with Adobe to integrate SSG capability into Adobe Premiere Pro.
Raja then jumped to machine intelligence, which he believes will be dominated not by the GPU (NVIDIA green) or CPU (Intel blue) paths, but by true heterogeneous computing.
Raja then brought to the stage results on DeepBench, a machine learning benchmark where NVIDIA dominates at the moment, joking about AMD's absence from the benchmark - since they really didn't have a presence in this area. In the benchmark, AMD pitted Vega against NVIDIA's P100 (interestingly, not against NVIDIA's recently announced V100, which brings many improvements specific to these kinds of workloads), delivering an almost 30% performance lead.

91 Comments on AMD Announces Radeon Vega Frontier Edition - Not for Gamers

#51
medi01
AbsolutionWelp, always one step behind AMD :(
Oh, please, stop the BS, Volta cards are not available yet, how on earth would they have compared vs it?
Patriotbut nvidia is training 100k dev
What on earth are you talking about? Not even Microsoft has 100k devs.
Posted on Reply
#53
medi01
NokironNot in-house...
Oh, "going to train 100k developers"
They are also in 50 million cars, aren't they?
"Much faster than 480" lol.
cdawallKeep clinging to that one. I have played that card and played that card the entire time the FX series was their lineup. It isn't going to change how piss-poor the FX design was and how horridly it ruined their name in the market. That was on them, not Intel, not the consumers. They did a bad job and paid for it with market share.
Come on, more expensive, more power hungry, SLOWER Prescott outsold Athlon 64.

God knows what "market share" is shown on your picture, which country it is and if it is number of units or revenue. I'd bet on the former.
Posted on Reply
#54
Nokiron
medi01Oh, "going to train 100k developers"
They are also in 50 million cars, aren't they?
"Much faster than 480" lol.
This statement makes no sense to me. What do you mean by this?
Posted on Reply
#55
medi01
NokironThis statement makes no sense to me. What do you mean by this?
I mean that nvidia's claims are rather far from reality.
Posted on Reply
#56
Assimilator
medi01Come on, more expensive, more power hungry, SLOWER Prescott outsold Athlon 64.
Nobody's disputing that. What we are saying is that after the P4/Athlon 64 era, when AMD should have been able to capitalise, they instead shot themselves in the foot - multiple times - with the uncompetitive Bulldozer architecture.
Posted on Reply
#57
Ebo
Am I the only one who really likes that AMD isn't saying anything about the gaming version of "Vega"?

Why should they follow Nvidia's release cycle and make a bigger and better card than Nvidia's top-tier card? That isn't where the money is - the money is in mainstream gaming, not the enthusiast segment.

Just look at the RX 480/580: they have done well and made a lot of much-needed money for AMD. And believe me when I predict that AMD's next quarterly financial results are going to be a hell of a lot better once the Ryzen sales numbers come out, with Vega to fuel the graphics department after.

I for one don't like early leaks, fake news, call it whatever you want; the public doesn't have a right to know until release date, and I second that 100%.

For the moment, if you want the best of the best, just go green camp (GTX 1080 Ti) - it's overpriced in most countries anyway.
If Vega can compete within a 5% margin at a lower price tag, I would say it's a slam dunk. Just let the top-tier cards all belong to Nvidia: let them spend a lot of money developing a new architecture just for the bragging rights and low profit per unit, then you fill up the mainstream segment with competitive alternatives and make a much better profit per unit in the end. I really like this a lot:clap:, since AMD has been bleeding a lot of money, and they need to come back again.
Posted on Reply
#58
medi01
AssimilatorWhat we are saying is that after the P4/Athlon 64 era, when AMD should have been able to capitalise, they instead shot themselves in the foot - multiple times - with the uncompetitive Bulldozer architecture.
My memory tells me they struggled with fab expenses and had to sell their fabs because they were unable to properly capitalize on superior products.
Similar story with Fermi outselling the fantastic products AMD had (can't blame that one on strong-arming, though; the underdog basically always has to fight an uphill battle).

There is something about "their own fault" narrative I simply cannot grasp.
Posted on Reply
#59
the54thvoid
Super Intoxicated Moderator
medi01They struggled with fab expenses and had to sell it, because they actually were not able to properly capitalize on superior products is what my memory tells me.
Similar story with Fermi outselling fantastic products AMD had. (can't blame this one on strongarming though, basically underdog always has to fight an uphill battle)

There is something about "their own fault" narrative I simply cannot grasp.
Fermi was hot and hungry but Nvidia did something people thought not possible. They dropped a full core Fermi as the 580.
It helped stem the bleeding from a power monster and set them up well.
Posted on Reply
#60
springs113
TheinsanegamerNThe existence of Ryzen seems to disagree with you. Its performance is the definition of "enthusiast". It's a whole 2-5% slower in games (while still maintaining over 100 FPS) and wallops Intel in creative production, with better price/perf to boot and an 8-core chip in the standard socket that doesn't suck.

That simply isn't true. When AMD was competitive, i.e. the 5000 and 7000 series, they sold quite well.

AMD could never get their driver game together when they had good hardware, and that scared away quite a few consumers. By the time they got their drivers mostly straightened out (400 series), their hardware was a generation behind.

Given how complicated GPUs are, it will take years for RTG to straighten out the damage that AMD did to the ATI GPU. With Ryzen selling well and Vega finally out, R&D should finally start catching up to Nvidia.
On the 5000 series AMD did have the better product, but in actuality Nvidia still outsold them. I believe AdoredTV did a video on this.
Posted on Reply
#61
cdawall
where the hell are my stars
medi01Come on, more expensive, more power hungry, SLOWER Prescott outsold Athlon 64.

God knows what "market share" is shown on your picture, which country it is and if it is number of units or revenue. I'd bet on the former.
Market share didn't plummet until Core 2 Duo. AMD should have exceeded Intel's market share during the P4 era, but bad business practices prevented that.

As for the chart, that's the worldwide one that was posted on TPU not even two months ago... so I don't know, ask the news editors?
Posted on Reply
#62
RejZoR
Or how people were still buying TOTALLY inferior GeForce FX cards even though any common sense would tell you to stay away from them like a vampire from garlic. But when the roles are reversed and AMD is just mildly worse at a few specific things, it's OMG OMG AMD SUXORZ (because the only thing slightly worse is power consumption, in the case of Polaris). Sometimes you just can't understand people.
Posted on Reply
#63
cdawall
where the hell are my stars
RejZoROr how people were still buying TOTALLY inferior GeForce FX cards even though any common sense would tell you to stay away from them like vampire from garlic essence. But when roles are reversed and AMD is just mildly worse at few specific things OMG OMG AMD SUXORZ (because only thing slightly worse is power consumption, in case of Polaris). Sometimes you just can't understand people.
Slightly worse? AMD doesn't offer any thing in the HEDT market period.
Posted on Reply
#64
dozenfury
Vega has been hyped as a gamer card for over a year (also keep in mind we're almost a year out from the 1070/1080 releases), and just now, less than 3 weeks before E3, it's suddenly switched up to be a professional/deep-learning card instead of a gaming one?

It's amazing what AMD has been able to accomplish in the David vs. Goliath competition with Intel and Nvidia. But the reality is that there's no way, with their relatively tiny R&D budget, that they can hang with them. Marketing is cheap by comparison, and AMD is great at that. So over and over it's the same story: incredible pre-release claims that get people salivating, then varying degrees of letdown and defending AMD after release. AMD is to be commended for what they've been able to do. But I'd be kicking myself had I been waiting a year for Vega, only to learn it's both delayed again in gaming form and a professional card first.
Posted on Reply
#65
RejZoR
cdawallSlightly worse? AMD doesn't offer any thing in the HEDT market period.
I'm talking Polaris vs Pascal (in its respective segment, i.e. RX 580 vs GTX 1060). The only thing the RX 580 can be faulted for is power consumption, and even that frankly isn't bad at all. They aren't loud, they aren't slow. And yet everyone takes the piss out of them almost more than anyone ever did back in the day with the GeForce FX, which was hot, loud, more power hungry, and had a terminal design flaw in the rendering pipeline.
Posted on Reply
#66
efikkan
Did we actually get any new useful information about Vega yesterday?
Posted on Reply
#67
springs113
What are you all complaining about? This conference was not for consumers ("gamers"); it was basically for shareholders and investors. Our time comes at the end of the month. Vega is a segmented product. When a company holds a press event outlining its strategy for the next 2-5 years, how is it that people hoping to hear about an unreleased product get upset because they didn't hear what they wanted to hear? I'm pretty sure that's what Computex is all about (us gamers). AMD did not hold any of you guys' hands and tell you not to make a purchase.

@efikkan yes and no... nothing concrete, just things to make assumptions on again.
Posted on Reply
#68
cdawall
where the hell are my stars
RejZoRI'm talking Polaris vs Pascal (in its respective segment aka RX 580 vs GTX 1060). Only thing RX 580 can be objected is power consumption and even that one quite frankly isn't bad at all. They aren't loud, they aren't slow. And yet everyone is taking a piss at them almost more than anyone ever has back in the day for GeForce FX which was hot, loud, more power hungry and had a terminal design flaw in the rendering pipeline.
Performance of a 1060 and the power consumption of a 1070/1080. Depending on the card they are loud under load, even worse when you have two of them.
Posted on Reply
#69
RejZoR
That's the exact thing I don't understand about people. It doesn't fucking matter. And it can't be loud, because if it is, then my GTX 980 is also stupendously loud - and I can hardly hear it. We all used to run 300W goddamn graphics cards and no one batted an eye. But now, all of a sudden, power consumption ranks above framerate in priority, when framerate is the basic reason we even bother buying this stuff.
Posted on Reply
#70
cdawall
where the hell are my stars
RejZoRThat's the exact thing I don't understand people. It doesn't fucking matter. And it can't be loud, because if it is, then my GTX 980 is also stupendously loud. And I can hardly hear it. We used to all run 300W god damn graphic cards and no one bat an eye. But now all of a sudden, power consumption goes above framerate in priority which is like the basics of why we even bother buying this shit.
Power consumption has been a topic of discussion pretty much since they added power connectors to cards.

Remember, I ran two of these prior to my 1080 Ti. They were some of the better cards out there. They were also hot, loud, and slower than one Ti.

Now, with a BIOS mod, I have already seen over 380 W of consumption from my Ti, but that's a far cry less than the 520 W I saw from my pair of 480's.
Posted on Reply
#71
RejZoR
The average Joe doesn't buy a pair of cards when most can't afford a single one... And yes, you're all overdramatizing the power consumption. Also, the RX 480 is a 150W card (after the applied driver fix). That's 300W for a pair, not 520W... Do I really have to drag out the fact that two cards were NEVER more efficient than a single-GPU card? It's physically impossible.
Posted on Reply
#72
cdawall
where the hell are my stars
RejZoRAverage Joe doesn't buy a pair of cards when they mostly can't afford a single one... And yes, you're all overdramatizing with the power consumption. Also, RX 480 is a 150W card (after the applied driver fix). That's 300W for a pair, not 520W... Do I really have to also drag out the fact that 2 cards were NEVER more efficient than single GPU card? It's physically impossible.
Couple of things: the 520 W wasn't dramatized, that was actual power draw - enough that I had to go from my 750 W Platinum Seasonic to a 1200 W unit for stability.

Raja himself said that two 480's were more efficient than a 1080.
Posted on Reply
#73
BiggieShady
RejZoRthey place it on a graph next to R9 Fury X.
Let's look at it as a comparison of Fiji chip versus Vega chip
Posted on Reply
#74
cdawall
where the hell are my stars
BiggieShadyLet's look at it as a comparison of Fiji chip versus Vega chip
They can't put it next to the Pro Duo, the last professional card released, because it is slower. Unfortunately, people haven't quite taken that into consideration yet. If it is slower than a pair of Fiji cards, it's slower than the 1080 Ti, let alone a Titan.
Posted on Reply
#75
GC_PaNzerFIN
I don't know about you guys, but to me the blue Vega Frontier Edition is pretty much the ugliest-looking graphics card I have ever seen.
RX Vega probably has half the HBM memory, but something is obviously a problem at the red team if they are unable to deliver Vega in volume on time, resorting to a limited-edition pro card launch. A niche product is all good and well, but AMD needs the next thing for the gaming masses, or NVIDIA can milk yet another holiday season.
Posted on Reply