Tuesday, May 16th 2017

AMD Announces Radeon Vega Frontier Edition - Not for Gamers

Where is Vega? When is it launching? At AMD's Financial Analyst Day 2017, Raja Koduri addressed the speculation of the past few weeks and brought us an answer: Radeon Vega Frontier Edition is the first iteration of Vega, aimed at data scientists, immersion engineers, and product designers. It will be released in the second half of June for AMD's "pioneers". That wording means AMD still technically delivers a Vega product in 1H 2017... it's just not the consumer, gaming version of the chip. Unfortunately, this could signify an after-June release time-frame for consumer GPUs based on the Vega micro-architecture.

This news comes as a disappointment to gamers who have been hoping for a gaming Vega, because it is reminiscent of what happened with the dual-Fiji card: a promising design that ended up unsuitable for gaming and was instead marketed to content creators as the Radeon Pro Duo, with little success. But there is still hope: it just looks like we really will have to wait for Computex 2017 to see some measure of detail on Vega's gaming prowess.

Vega Frontier Edition is the Vega GPU we've been seeing in leaks over the last few weeks, packing 16 GB of HBM2 memory, which, as we posited, didn't really make much sense for typical gaming workloads. We have to say that if Vega truly delivers only a 1.5x improvement in FP32 performance (the metric most critical for gaming at the moment), it will likely be fighting an uphill battle against NVIDIA's Pascal architecture, probably ending up somewhere between the GTX 1070 and GTX 1080. If these figures are correct, a dual-GPU Vega could indeed be in the works to let AMD reclaim the performance crown from NVIDIA, albeit with a dual-GPU configuration pitted against NVIDIA's current single-chip performance king, the Titan Xp. Also worth noting is that the Radeon Vega Frontier Edition uses two PCI-Express 8-pin power connectors, which suggests a power draw north of 300 Watts.
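As a quick sanity check on both the FP32 figure and the power-connector claim, here is a back-of-envelope sketch in Python. It assumes the 1.5x uplift is measured against a Fiji-class part (4096 shaders at roughly 1.05 GHz), which is our reading of the leaks rather than anything AMD has confirmed.

# Back-of-envelope arithmetic for the FP32 and power-connector claims above.
# The shader count and clock are assumptions (Fiji-class baseline from the
# leaks), not confirmed specifications.

def fp32_tflops(shaders, clock_ghz, flops_per_clock=2):
    """Theoretical single-precision throughput: shaders x 2 FLOPs x clock (GHz)."""
    return shaders * flops_per_clock * clock_ghz / 1000.0

fiji = fp32_tflops(4096, 1.05)   # Fury X-class part: ~8.6 TFLOPS
vega = 1.5 * fiji                # the rumored 1.5x uplift: ~12.9 TFLOPS
print(f"Fiji ~{fiji:.1f} TFLOPS, 1.5x uplift ~{vega:.1f} TFLOPS")

# Maximum in-spec board power implied by the connector configuration:
slot_w, eight_pin_w = 75, 150    # PCIe x16 slot plus each 8-pin connector
print(f"Power ceiling: {slot_w + 2 * eight_pin_w} W")   # 375 W

A ~13 TFLOPS result would indeed land Vega in GTX 1070/1080 territory on paper, and the 375 W connector ceiling is consistent with a board power comfortably north of 300 W.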
For now, it seems AMD has gone all out on the machine learning craze, chasing the higher margins available in the professional market segment rather than on the consumer side of graphics. Let's just hope it didn't do so at the expense of gaming performance leaps.

Raja opened with a look back at his time leading the Radeon Technologies Group, mentioning the growing number of graphics engineers at AMD and the group's commitment to the basics of graphics computing: power, performance, and software. Better basics in hardware, software, and marketing are, Raja says, responsible for AMD's current market outlook from both a gamer and a content-creator perspective, and have led to an increase in AMD's graphics market share.
Chapter two of RTG's "Radeon Rising" plan, going beyond the basics, will allow the company to go after premium market dollars with an architecture that excels at both gaming and CAD applications. Raja Koduri said he agreed with NVIDIA CEO Jensen Huang that at some point in the future, every single human being will be a gamer.
Vega's configuration was finalized some two years ago, and AMD's vision was a GPU that could plow through 4K resolutions at over 60 frames per second. Vega has achieved that, running Sniper Elite 4 at over 60 FPS in 4K. Afterwards, Raja talked about AMD's High Bandwidth Cache Controller, showing Rise of the Tomb Raider on a system given only 2 GB of memory, with the HBCC-enabled system delivering more than 3x the minimum frame rates of the non-HBCC-enabled one, something we've seen in the past, though on Deus Ex: Mankind Divided. So now we know that wasn't just a one-shot trick.
Raja Koduri then showed AMD's SSG implementation and how it works in a fully ray-traced environment, with the SSG system delivering much smoother transitions. AMD worked with Adobe on integrating SSG capability into Adobe Premiere Pro.
Raja then jumped to machine intelligence, which he believes will be dominated not by the GPU (NVIDIA green) or CPU (Intel blue) paths, but by true heterogeneous computing.
Raja then brought DeepBench results to the stage, a machine learning benchmark where NVIDIA dominates at the moment, joking about AMD's previous absence from it, since the company really didn't have a presence in this area. In the benchmark, AMD pitted Vega against NVIDIA's P100 (interestingly, not against the recently announced V100, which brings many improvements specific to these kinds of workloads), delivering an almost 30% performance lead.

91 Comments on AMD Announces Radeon Vega Frontier Edition - Not for Gamers

#76
RejZoR
cdawall: Couple of things: 520w was dramatized? That was actual power draw, and it was enough that I had to go from my 750w Platinum Seasonic to a 1200w for stability.

Raja himself said that two 480's were more efficient than a 1080.
How do you get 520W from 2x 150W cards? Either you're shit at math or you're bending the laws of physics. Because of that PCIe power draw scandal, it has been measured over and over and it's in fact 150W consumption. So, where you get those 520W is beyond me...
#77
xkm1948
AMD can totally wait until Q4 to release Vega to compete with the mid-range Volta 1160. It all makes sense now. They probably never intended Vega to be a top-end card. Vega may end up just like Polaris, battling the xx60 of Volta.
#78
Nokiron
RejZoR: How do you get 520W from 2x 150W cards? Either you're shit at math or you're bending the laws of physics. Because of that PCIe power draw scandal, it has been measured over and over and it's in fact 150W consumption. So, where you get those 520W is beyond me...
Or you know, it could be option number three. He was not using reference RX480s. When I used my Nitro+ I was pulling 80W+ more than a stock RX480 without overclocking.
#79
efikkan
xkm1948: AMD can totally wait until Q4 to release Vega to compete with the mid-range Volta 1160. It all makes sense now. They probably never intended Vega to be a top-end card. Vega may end up just like Polaris, battling the xx60 of Volta.
There will be no consumer versions of Volta anytime soon.
Currently, AMD is only competitive in the low end; they really need some decent mid-range chips with high-volume sales.
#80
cdawall
where the hell are my stars
RejZoR: How do you get 520W from 2x 150W cards? Either you're shit at math or you're bending the laws of physics. Because of that PCIe power draw scandal, it has been measured over and over and it's in fact 150W consumption. So, where you get those 520W is beyond me...
efficiency curve, overclocking, additional voltage, etc.
Nokiron: Or you know, it could be option number three. He was not using reference RX480s. When I used my Nitro+ I was pulling 80W+ more than a stock RX480 without overclocking.
Thank you for answering this for me... I don't know how many times I have to state "overclocked"; alas, I guess one more.
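For reference, here is a rough illustration of how two 150 W-rated cards can end up well past 400 W combined once clocks and voltage go up: dynamic power scales roughly with frequency times voltage squared. The stock and overclocked figures below are assumptions for illustration only, not measurements of anyone's cards.

# Rough model only: dynamic power scales roughly as P ~ f * V^2.
# The stock board power, clocks, and voltages are assumed values for
# illustration, not measurements of any specific RX 480.

def scaled_power(stock_w, stock_mhz, stock_v, oc_mhz, oc_v):
    """Estimate board power after an overclock with added voltage."""
    return stock_w * (oc_mhz / stock_mhz) * (oc_v / stock_v) ** 2

stock_w, stock_mhz, stock_v = 165, 1266, 1.06   # assumed board-partner card
one_card = scaled_power(stock_w, stock_mhz, stock_v, oc_mhz=1480, oc_v=1.20)
print(f"One overclocked card: ~{one_card:.0f} W")        # prints ~247 W
print(f"Two cards in CrossFire: ~{2 * one_card:.0f} W")  # prints ~494 W

Under those assumptions, a pair of heavily overclocked, over-volted cards landing near 500 W is plausible, even though each is rated at 150 W stock.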
#81
RejZoR
Graphics cards have never been more efficient for the performance delivered, and everyone is freaking out like there is no tomorrow. That's why. Overclocked or not. We've had cards that delivered half the performance for twice the power consumption, and it was like "raised brow a bit". But here it's "OH MAH GOD IT'S LA TERRIBLÉ". That's why I don't understand any of you whining here.
#82
xkm1948
RejZoR, you should totally get a Vega FE when it is available and bench it yourself to shut up all the "haters".

I'm counting on ya!
efikkan: There will be no consumer versions of Volta anytime soon.
Currently, AMD is only competitive in the low end; they really need some decent mid-range chips with high-volume sales.
Sounds about right, AMD is more for low end ~ ultra low end.
#83
cdawall
where the hell are my stars
RejZoR: Graphics cards have never been more efficient for the performance delivered, and everyone is freaking out like there is no tomorrow. That's why. Overclocked or not. We've had cards that delivered half the performance for twice the power consumption, and it was like "raised brow a bit". But here it's "OH MAH GOD IT'S LA TERRIBLÉ". That's why I don't understand any of you whining here.
I knew the 480's pulled some juice when I purchased them; however, I did not exactly plan for them to pull 260 watts a pop. I have had some power-hungry cards in the past (290, Furys, Fermi, etc.); power consumption doesn't bother me so much as the comparative lack of performance. I was also immensely disappointed in the GTX 1070 after testing.

This is also colored by fighting those cards for almost a year. Issue after issue with drawing too much juice through the board, running too hot (even stock) for CrossFire use, etc. When they worked, they easily equaled a 1080, even with some overclock on it, but that didn't translate to every game.
#84
RejZoR
Only the reference models ran hot. The aftermarket ones were fine in terms of thermals and noise.
#85
cdawall
where the hell are my stars
RejZoR: Only the reference models ran hot. The aftermarket ones were fine in terms of thermals and noise.
Put two of them in a mATX case and tell me how that goes. Again, you haven't got one, let alone two, of these cards. I actually owned them, and tested with the super-low-wattage XFX RX 480 RS models that actually pulled less power than reference, as well as my Nitro+ cards that are on the other end of that. I also tested with clock speeds in the 1480 MHz realm, which is top 5-10% territory for RX 480s on air. They were hot, loud, and power hungry. I quite honestly don't know what you are arguing; you don't even have the cards.
#86
RejZoR
Put two Saturn V rockets in a shed and tell me how that goes. Yes, you're over-dramatizing. Over and over and over. When the only thing you keep bringing up is CrossFireX and some ridiculously tiny case scenarios, you're really grasping at straws here. People who buy these things generally have at least a midi tower, and they pretty much never CrossFire them. C'mon, of all people one would expect you'd be the one to know these things.

I do have a GTX 980, which is essentially the same thing, and when it's overclocked it's scorching hot and also becomes noisy. Performance projections are about the same.
#87
cdawall
where the hell are my stars
RejZoR: Put two Saturn V rockets in a shed and tell me how that goes. Yes, you're over-dramatizing. Over and over and over. When the only thing you keep bringing up is CrossFireX and some ridiculously tiny case scenarios, you're really grasping at straws here. People who buy these things generally have at least a midi tower, and they pretty much never CrossFire them. C'mon, of all people one would expect you'd be the one to know these things.

I do have a GTX 980, which is essentially the same thing, and when it's overclocked it's scorching hot and also becomes noisy. Performance projections are about the same.
520W is 520W. That isn't a question; it actually happens.

They were fine in my larger mATX-ish case. They draw more power than the 980 with both OC'd, and there are plenty of people who run CrossFire. That isn't rare; that isn't new.

The cards weren't quiet when new, or, if set to the silent BIOS, they throttled the whole time. It's the same issue all of them have faced; owners of these cards know this.
#89
medi01
cdawall: Depending on the card they are loud under load,
No, they aren't; in fact, most AIB 580s are quieter than 1060s.
It only consumes that much at stock voltage, and it comes with unique driver features ("Chill") that can cut power consumption by two-thirds.

Not that 40-80W of total power consumption matters in this context.
xkm1948: AMD can totally wait until Q4 to release Vega to compete with the mid-range Volta 1160. It all makes sense now. They probably never intended Vega to be a top-end card. Vega may end up just like Polaris, battling the xx60 of Volta.
Because a 1060 can take on a 980 Ti, right?
Let me remind you of a quote from one of the TPU users: "OC 1080 vs my OC 980Ti is only 11% faster".
/sigh

Wild expectations for Volta, when we got barely better perf/$ from Pascal, which had a MONSTROUS process jump last gen, eh?

I recall Nintendo Switch expectations were even crazier; it was supposed to beat the PS4/Xbone (yay), because of, you know, the reality distortion field many team-green users live in.
Oh, it ended up a tad faster than the Wii U.
#90
cdawall
where the hell are my stars
medi01: No, they aren't; in fact, most AIB 580s are quieter than 1060s.
It only consumes that much at stock voltage, and it comes with unique driver features ("Chill") that can cut power consumption by two-thirds.

Not that 40-80W of total power consumption matters in this context.
The 480 has that same feature. Let me tell you how well it worked :rolleyes:

Also, which AIB card specifically are we talking about here? Quite a few models with overclocks were a lot more than 40-80W, but then again, even 40-80 watts is over half the consumption of a 1060.

Edit:

Looks like over 100W more in games, and louder than the reference 1060 as well, all while just barely edging it out in gaming performance.

www.techpowerup.com/reviews/Sapphire/RX_580_Nitro_Plus/28.html