# AMD Announces Radeon Vega Frontier Edition - Not for Gamers



## Raevenlord (May 16, 2017)

Where is Vega? When is it launching? At AMD's Financial Analyst Day 2017, Raja Koduri addressed the speculation of the past few weeks and brought us an answer: Radeon Vega Frontier Edition is the first iteration of Vega, aimed at data scientists, immersion engineers and product designers. It will be released in the second half of June for AMD's "pioneers". That wording, that Vega Frontier Edition will be released in the second half of June, means AMD still technically delivers Vega in 1H 2017... It's just not the consumer, gaming version of the chip. This could unfortunately signify an after-June release time-frame for consumer GPUs based on the Vega micro-architecture.

This news comes as a disappointment to all gamers who have been hoping for Vega for gaming, because it is reminiscent of what happened with dual Fiji: a promising design that ended up unsuitable for gaming and was instead marketed to content creators as the Radeon Pro Duo, with little success. There is still hope, though: it just looks like we really will have to wait for Computex 2017 to see some measure of detail on Vega's gaming prowess.


Vega Frontier Edition is the Vega GPU we've been seeing in leaks over the last few weeks, packing 16 GB of HBM2 memory which, as we posited, doesn't make much sense for typical gaming workloads. But we have to say that if AMD's Vega truly delivers only a 1.5x improvement in FP32 performance (the metric most critical for gaming at the moment), this paints Vega as fighting an uphill battle against NVIDIA's Pascal architecture (probably ending up somewhere between the GTX 1070 and GTX 1080). If these figures are correct, a dual-GPU Vega could indeed be in the works so as to allow AMD to reclaim the performance crown from NVIDIA, albeit with a dual-GPU configuration against NVIDIA's current single-chip performance king, the Titan Xp. Also worth noting is that the AMD Radeon Vega Frontier Edition uses two PCI-Express 8-pin power connectors, which suggests a power draw north of 300 Watts.
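For context, the connector arithmetic behind that estimate can be sketched as follows; the figures are the PCI-Express specification's per-connector limits, not AMD's published board power:

```python
# Rough board-power ceiling implied by a card's power connectors,
# per PCI-Express spec limits (illustrative, not AMD's actual TDP).
PCIE_SLOT_W = 75                    # power deliverable through the x16 slot
CONN_W = {"6pin": 75, "8pin": 150}  # auxiliary connector limits

def board_power_ceiling(connectors):
    """Sum the slot allowance plus each auxiliary connector's limit."""
    return PCIE_SLOT_W + sum(CONN_W[c] for c in connectors)

# Vega Frontier Edition carries two 8-pin connectors:
print(board_power_ceiling(["8pin", "8pin"]))  # 375
```

With two 8-pin connectors, the in-spec ceiling is 375 W, which is why "north of 300 W" is a plausible reading.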




For now, it seems AMD went all out on the machine learning craze, chasing the higher profits available in the professional market segment compared to the consumer side of graphics. Let's just hope they didn't do so at the expense of gaming performance leaps.

Raja opened with a throwback to his time at the helm of the Radeon Technologies Group, mentioning the growing number of graphics engineers at AMD and their commitment to the basics of graphics computing: power, performance, and software. Better basics in hardware, software, and marketing are, Raja says, responsible for AMD's current market outlook from both a gamer and a content-creator perspective, and led to an increase in AMD's graphics market share.




RTG's chapter two of Radeon Rising, going beyond the basics, will allow the company to go after premium market dollars with an architecture that excels at both gaming and CAD applications. Raja Koduri said he agrees with NVIDIA CEO Jensen Huang that at some point in the future, every single human being will be a gamer.




Vega's final configuration was locked down some two years ago, and AMD's vision for it was a GPU that could plow through 4K resolution at over 60 frames per second. And Vega has achieved it: Sniper Elite 4 at over 60 FPS in 4K. Afterwards, Raja demonstrated AMD's High Bandwidth Cache Controller by running Rise of the Tomb Raider with the system given only 2 GB of memory, the HBCC-enabled system delivering more than 3x the minimum frame rate of the non-HBCC system, something we've seen in the past, though on Deus Ex: Mankind Divided. So now we know that wasn't just a one-shot trick.




Raja Koduri then showed AMD's SSG implementation at work on a fully ray-traced environment, with the SSG system delivering much smoother transitions. AMD worked with Adobe to integrate SSG capability into Adobe Premiere Pro.




Raja then jumped to machine intelligence, which he believes will be dominated neither by the GPU (NVIDIA green) nor the CPU (Intel blue) path, but by true heterogeneous computing.




Raja then took to the stage results on DeepBench, a machine learning benchmark where NVIDIA dominates at the moment, joking about AMD's absence from the benchmark so far - since they really didn't have a presence in this area. In the benchmark, AMD pitted Vega against NVIDIA's P100 (interestingly, not against NVIDIA's recently announced V100, which brings many improvements specific to this kind of workload), delivering an almost 30% performance lead.






*View at TechPowerUp Main Site*


----------



## eidairaman1 (May 16, 2017)

Too bad this isn't the days of past where workstation boards could be bios modded for gaming.


----------



## xkm1948 (May 16, 2017)

Raja shied away from Vega's gaming performance as much as he could. And the first card using Vega will be a Radeon PRO workstation card. Not keeping my hopes high, then. Realistically, anywhere around 1080 performance would be something to celebrate.

Also, that card he demoed at the end has 16GB of HBM2? Four stacks of HBM2, or dual GPUs?

Also, Computex will probably be a paper launch, as availability is in June.


----------



## Dante Uchiha (May 16, 2017)

"Vega has achieved it. Sniper Elite 4 at over 60 FPS on 4K"

vs.


----------



## Absolution (May 16, 2017)

> AMD pitted Vega against NVIDIA's P100 architecture (interestingly, not against NVIDIA's recently announced V100 architecture, which brings many specific improvements to this kind of workloads),



Welp, always one step behind AMD


----------



## efikkan (May 16, 2017)

"Polaris: better hardware"
"Up to 2.8× Performance-per-Watt"
PR usually stretches the truth, but that's far fetched.


----------



## Absolution (May 16, 2017)

Dante Uchiha said:


> "Vega has achieved it. Sniper Elite 4 at over 60 FPS on 4K"
> 
> vs.



No idea on settings though



efikkan said:


> "Polaris: better hardware"
> "Up to 2.8× Performance-per-Watt"
> PR usually stretches the truth, but that's far fetched.



You'd think they would let go of Polaris already


----------



## oxidized (May 16, 2017)

I think it's actually good that Raja has toned it down a bit; it'll make AMD's ads look less ridiculous. Pretty satisfied by the CPU side, actually.


----------



## iO (May 16, 2017)

xkm1948 said:


> ...Also that card he demoed in the end has 16GB of HBM2? Four stacks of HBM2 or Dual GPUs?..



Two 8-Hi stacks.


----------



## Grings (May 16, 2017)

Raevenlord said:


> (interestingly, not against NVIDIA's recently announced V100 architecture, which brings many specific improvements to this kind of workloads)



Recently announced, so how would AMD have one already to use in a live demo?


----------



## xkm1948 (May 16, 2017)

Grings said:


> Recently announced, so how would AMD have one already to use in a live demo?



p100, not v100


----------



## Fluffmeister (May 16, 2017)

I'm just happy there was a graph confirming that 16GB is indeed four times the memory of Fury X.

But joking aside, I'm all for frontier editions, but what about consumer editions?

Also, I'm drunk.


----------



## efikkan (May 16, 2017)

Raevenlord said:


> In a benchmark, AMD pitted Vega against NVIDIA's P100 architecture (interestingly, not against NVIDIA's recently announced V100 architecture, which brings many specific improvements to this kind of workloads), delivering an almost 30% performance lead.


In all fairness, Volta is at least not released yet.

But in the Polaris/Polaris refresh PR materials AMD chose Maxwell as comparison, not the more relevant Pascal.



Absolution said:


> You'd think they would let go of Polaris already


But stay tuned, Vega is bringing 4× performance per watt. (yeah right)


----------



## Grings (May 16, 2017)

xkm1948 said:


> p100, not v100



i was referring to the 'interestingly not against' part, i know they used a p100


----------



## xkm1948 (May 16, 2017)

Fluffmeister said:


> I'm just happy there was a graph confirming that 16GB is indeed four times the memory of Fury X.
> 
> But joking aside, I'm all for frontier editions, but what about consumer editions?
> 
> Also, I'm drunk.




Is it just me or was Raja trying his best to avoid talking about Vega's gaming performance? You would think a high-end gaming GPU is great for profit margins, yet Raja avoided the topic as hard as he could.


----------



## Caring1 (May 16, 2017)

Fluffmeister said:


> I'm just happy there was a graph confirming that 16GB is indeed four times the memory of Fury X.
> 
> But joking aside, I'm all for frontier editions, but what about consuming editions?
> 
> Also, I'm drunk.


With the right sauce I suppose they might be easier to digest.


----------



## Toothless (May 16, 2017)

xkm1948 said:


> Raja was shying away from Vega's gaming performance as much as he can. And the first card using Vega will be a Radeon PRO workstation card. Not keeping my hopes high then.  Realistically anywhere around 1080 performance would be something to celebrate.
> 
> Also that card he demoed in the end has 16GB of HBM2? Four stacks of HBM2 or Dual GPUs?
> 
> Also Computex will probably be a paper launch. As the availability is June.





xkm1948 said:


> Is it just me or was Raja trying his best to avoid talking about Vega gaming performance? You would think that high end gaming GPU is great for profiting margins and Raja seems to avoid it as hard as he could.


You confuse me.


----------



## Fluffmeister (May 16, 2017)

I'll give AMD credit here, and as their own slide acknowledges... compute is where the money is, and like Nv it is where they should focus their efforts.

Technology should trickle down, giving it away for cheap right from day one might gain a few fans... but it doesn't pay the bills


----------



## W1zzard (May 17, 2017)

Fluffmeister said:


> compute is where the money is


Not true I think. Biggest margins yes. But there's big margins also in enthusiast gamer and much bigger volume


----------






## Basard (May 17, 2017)

We're all doomed....


----------



## Patriot (May 17, 2017)

I mean... nice marketing words... but NVIDIA is training 100k devs in deep learning, while AMD is hardly supporting ROCm.


----------



## MrMilli (May 17, 2017)

_But we have to say that if AMD's Vega truly does deliver only a 1.5x improvement in FP32 performance (the one that's most critical for gaming at the moment), this probably paints AMD's Vega as fighting an uphill battle against NVIDIA's Pascal architecture (probably ending up somewhere between GTX 1070 and GTX 1080)._​
Why is that so? Theoretical FLOPS have no direct relation to actual performance when comparing between different architectures, as the current situation between AMD and nVidia has already proven. If Vega can perform closer to its theoretical performance than Fiji does, then real-life performance will increase by more than 50%. From all the slides/info I've seen, everything points to Vega being more efficient than Fiji.
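For reference, theoretical FP32 throughput is just shader count × 2 FLOPs (one fused multiply-add) × clock, which is exactly why it says so little about delivered performance; a quick sketch using Fury X's public specs:

```python
def fp32_tflops(shaders, clock_ghz, ops_per_clock=2):
    # Each shader retires one FMA (counted as 2 FLOPs) per clock,
    # so peak throughput is shaders * 2 * clock. Delivered performance
    # depends on how well the architecture keeps those shaders fed.
    return shaders * ops_per_clock * clock_ghz / 1000

# Illustrative figures: Fiji (Fury X) has 4096 shaders at ~1.05 GHz.
print(round(fp32_tflops(4096, 1.05), 1))  # 8.6 (TFLOPS, peak)
```

The same peak number can correspond to very different real-world frame rates across architectures, which is the point being made above.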


----------



## Fluffmeister (May 17, 2017)

W1zzard said:


> Not true I think. Biggest margins yes. But there's big margins also in enthusiast gamer and much bigger volume



Yeah, I guess that's fair enough. But that in turn begs the question... when is the enthusiast gamer going to get their hands on one?

Is this card coming in late June really the first Vega-based card coming to market?


----------



## xkm1948 (May 17, 2017)

Do the Radeon Pro cards have Crimson gaming driver support? If so, one could easily grab this at launch and run game benchmarks on it.


----------



## yotano211 (May 17, 2017)

If this thing is going to be released in late June, when is the consumer card going to launch, late July?


----------



## TheGuruStud (May 17, 2017)

Why is there not a single video of this presentation online?

And where would they get a Volta to test? From the video card fairy that appears out of the ether? All 20 they made so far were sold LOL


----------



## cotes42 (May 17, 2017)

Can someone please tell me what the 4 screw holes at the end of the card are for? I have a Radeon Pro Duo with the same 4 screw holes, and I've asked everywhere; no one can even make a guess. Thanks!


----------



## FordGT90Concept (May 17, 2017)

Server cases can have mounts on both sides of the card to prevent it from sagging.


----------



## ZoneDymo (May 17, 2017)

xkm1948 said:


> p100, not v100



yeah he knows, he's asking how AMD is supposed to pit their stuff against something that has only been announced... aka not here yet.


----------



## Camm (May 17, 2017)

People really need to get it into their heads that AMD is NOT an enthusiast company anymore. (but some of our interests may intersect).

Why? Well for better or for worse, we didn't buy their products (even when they did have superior designs), so they've moved onto other markets where they will.

So of course Vega is targeted at FP16, which is why we've had a tonne of talk from AMD about working with companies to target shaders for FP16 (games don't really need FP32 for most tasks). Most computational tasks use FP16, so guess where the bulk of Vega's design went!

It's like Ryzen with its CCX setup. Cross-CCX calls hurt gaming tasks, but most batch computational tasks don't care about the increase in latency, and it's beneficial overall as it lets AMD avoid the increasingly large and complex silicon needed to maintain cache coherency in Intel-style designs (allowing more cores, lower power usage, and faster cores overall).


----------



## W1zzard (May 17, 2017)

Camm said:


> is targeted at FP16


If you have FP16 support, then FP16 is twice as fast as FP32 because 16 vs. 32 bits. Not a huge miracle, I'd say.
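The arithmetic behind that point: halving the element width doubles how many values fit in a given register, which is the whole source of the 2x peak (a Python sketch using stdlib type sizes, not GPU code):

```python
import struct

# Packed FP16: a 16-bit float occupies half the storage of a 32-bit
# float, so the same register width holds twice as many values. When
# the ALUs operate on both halves per cycle, peak throughput doubles.
fp32_bytes = struct.calcsize("f")  # 4 bytes, IEEE 754 single precision
fp16_bytes = struct.calcsize("e")  # 2 bytes, IEEE 754 half precision

# Values that fit in a single 128-bit (16-byte) register, per type:
lanes = {"float32": 16 // fp32_bytes, "float16": 16 // fp16_bytes}
print(lanes)  # {'float32': 4, 'float16': 8}
```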


----------



## R-T-B (May 17, 2017)

eidairaman1 said:


> Too bad this isn't the days of past where workstation boards could be bios modded for gaming.



Wouldn't make a difference, as cost is sure to be the limiting factor, not the BIOS.


----------



## xkm1948 (May 17, 2017)

So according to this



Vega will have less memory bandwidth than the Fury X?

If everything is optimized, shouldn't the card utilize more bandwidth? Also, the 16 GB is only 2048-bit, I would assume?

On a side note, that metallic-blue color surely is beautiful. The gold one is not bad either.


----------



## FordGT90Concept (May 17, 2017)

HBM bandwidth is directly tied to the number of stacks.  Vega has less bandwidth than Fiji because it has two stacks versus four, presumably at a slightly lower clock speed too.  Thing is, those two stacks are 8 x 1 GiB layers each, so it is much denser.

Vega should be cheaper to manufacture than Fiji.
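The stack arithmetic works out like this; a sketch where the ~1.9 Gbps HBM2 per-pin rate is an assumption for illustration, not a confirmed spec:

```python
def hbm_bandwidth_gbs(stacks, data_rate_gbps, bus_bits_per_stack=1024):
    # Each HBM stack exposes a 1024-bit interface, so total bandwidth
    # scales with stack count and per-pin data rate (bits -> bytes: /8).
    return stacks * bus_bits_per_stack * data_rate_gbps / 8

# Fiji/Fury X: four stacks at 1.0 Gbps per pin -> 512 GB/s.
print(hbm_bandwidth_gbs(4, 1.0))  # 512.0
# A two-stack HBM2 card at an assumed ~1.9 Gbps -> ~486 GB/s.
print(hbm_bandwidth_gbs(2, 1.9))  # 486.4
```

Two stacks at roughly double the per-pin rate land just under Fiji's four-stack figure, matching the "less bandwidth than Fury X" reading of the slide.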


----------



## xkm1948 (May 17, 2017)

FordGT90Concept said:


> HBM bandwidth is directly tied to how many stacks.  It has less bandwidth than Fiji because it has two stacks versus four and presumably at a slightly lower clock speed too.  Thing is, those two stacks are 8 x 1 GiB layers so it is much denser.
> 
> Vega should be cheaper to manufacturer than Fiji.




I was hoping for 4 stacks of 4 GB at 4096-bit. Oh well.

The rumor of Vega being in limited quantity could be true. If the gaming performance doesn't justify putting Vega in the gaming GPU ranks, I can totally see AMD rebranding it as a deep-learning GPU and selling it at a way better profit than a gaming GPU.


----------



## TheinsanegamerN (May 17, 2017)

Camm said:


> People really need to get it into their heads that AMD is NOT an enthusiast company anymore. (but some of our interests may intersect).


The existence of Ryzen seems to disagree with you. Its performance is the definition of "enthusiast". It's a whole 2-5% slower in games (while still maintaining over 100 FPS) and wallops Intel in creative production, with better price/perf to boot and an 8-core chip in the standard socket that doesn't suck.


> Why? Well for better or for worse, we didn't buy their products (even when they did have superior designs), so they've moved onto other markets where they will.


That simply isn't true. When AMD was competitive, i.e. the 5000 and 7000 series, they sold quite well.

AMD could never get their driver game together when they had good hardware, and that scared away quite a few consumers. By the time they got their drivers mostly straightened out (400 series), their hardware was a generation behind.

Given how complicated GPUs are, it will take years for RTG to straighten out the damage that AMD did to the ATI GPU division. With Ryzen selling well and Vega finally out, R&D should finally start catching up to NVIDIA.


----------



## Camm (May 17, 2017)

TheinsanegamerN said:


> The existence of ryzen seems to disagree with you. It's performance is the definition of "enthusiast". It's a whole 2-5% slower in games (while still maintaining over 100FPS) and wallops intel in creative production, with better price/perf to boot and an 8 core chip in the standard socket that doesnt suck..



Quite the contrary.  Ryzen is an example where our interests align. However, a chip truly aimed at enthusiasts wouldn't have used an LPP process, and would likely have been a single heterogeneous design. It just works out that, because Intel has been bleeding the market from a monopoly position, Ryzen works 'well enough' in gaming that we're more than happy to lap up its other advantages.


----------



## cdawall (May 17, 2017)

And what do you know, consumers are delayed again. I am no longer worried that I made the wrong choice; thank you, AMD, for confirming I shouldn't wait. This is frustrating. I just want to see them succeed, but it's like they are determined to let us down at every single turn.


----------



## Brusfantomet (May 17, 2017)

TheinsanegamerN said:


> Given how complicated GPUs are, it will take years for RTG to straighten out the damage that AMD did to the ATI GPU. With ryzen selling well and vega finally out, R+D should finally start catching up to nvidia.



The driver problems for the Radeon line stem from the ATi era, not the AMD era (the first AMD-designed Radeon was the HD 7970 with GCN; previous designs were based on a VLIW architecture dating from the 9700 series).


----------



## evernessince (May 17, 2017)

Camm said:


> Quite the contrary.  Ryzen is an example where our interests align. However a truly aimed at enthusiasts chip wouldn't have used a LPP process for the chip, and would likely have been a single heterogeneous design. It just works out that because Intel has been bleeding the market from a monopoly perspective that Ryzen works 'well enough' in gaming that we are more than happy to lap up its other advantages.



Yes, because you know what Ryzen needed more than AMD's engineers.  I must have missed that it's a rousing success.

FYI, enthusiast does not equal forgetting efficiency, especially in today's world.


----------



## R00kie (May 17, 2017)

I seriously hope that HBCC isn't something that game developers have to code for, and that it is just driver dependent...


----------



## TheGuruStud (May 17, 2017)

cdawall said:


> And what do you know consumers are delayed again. I am no longer worried that I made the wrong choice. Thank you AMD for confirming I shouldn't wait. This is frustrating I just want to see the succeed, but it is like they are determined to let us down at every single turn.



Give them a few billion and you wouldn't have to wait.

What's that? Intel stole it all and got away with it? No way....


----------



## cdawall (May 17, 2017)

TheGuruStud said:


> Give them a few billion and you wouldn't have to wait.
> 
> What's that? Intel stole it all and got away with it? No way....



Keep clinging to that one. I played that card, and played it the entire time the FX series was their lineup. It isn't going to change how piss-poor the FX design was and how horridly it ruined their name in the market. That was on them, not Intel, not the consumers. They did a bad job and paid for it with market share.






Mind showing me where Intel cheated AMD here? Oh wait, that was the P4 era. 2006 was the release of the Core 2 architecture and AMD never came back; they never released a product that performed better in that entire time, either.


----------



## FordGT90Concept (May 17, 2017)

2006 was the year Core 2 Duo debuted and AMD bought out ATI (so AMD had no capital).  Bulldozer debuted in 2011.


----------



## cdawall (May 17, 2017)

FordGT90Concept said:


> 2006 was the year Core 2 Duo debuted and AMD bought out ATI (so AMD had no capital).  Bulldozer debuted in 2011.



Yup, Core 2 marked the end of AMD having competitive market share. Bulldozer only made it worse. Here's to the hope that Ryzen can start the deep dig out.


----------



## OneCool (May 17, 2017)

Not surprised, really. I'm sure I'm not the only one who's noticed the trend in AMD's releases:
massive workstation/rendering CPUs, upgrades to their workstation Pro line of GPUs... and Threadripper!? Coming too.

AMD invested a lot into console gaming with Sony, Nintendo and Xbox.
Why would they want to improve their PC gaming hardware?


----------



## sweet (May 17, 2017)

xkm1948 said:


> So according to this
> View attachment 88043
> 
> Vega will have less memory bandwidth than the FuryX?
> ...



They introduced new compression with Polaris already. Less bandwidth but more effective now.


----------



## ShurikN (May 17, 2017)

Oh my god, the watercooled golden version is sick. Probably the best looking piece of hardware I've seen since In Win 904.


----------



## RejZoR (May 17, 2017)

People are complaining Vega isn't positioned against the V100. Dudes, all of you: the V100 DOESN'T EXIST YET.

Also, what's up with "it's not a gamer card" when they then place it on a graph next to the R9 Fury X? Ugh?


----------



## RejZoR (May 17, 2017)

Absolution said:


> Welp, always one step behind AMD



Why do people assume it's one step behind just because the release schedules don't line up as they used to and AMD's isn't first? Time-wise, sure, it's "late", but that's how it is. Product releases don't match up with each other's timeframes anymore, and I don't think they'll line up any time soon. That would mean releasing Vega now and releasing Navi sometime in Fall 2017, which is just not going to happen.


----------



## medi01 (May 17, 2017)

Absolution said:


> Welp, always one step behind AMD


Oh, please, stop the BS. Volta cards are not available yet; how on earth would they have compared against them?




Patriot said:


> but nvidia is training 100k dev


What on earth are you talking about? Not even Microsoft has 100k devs.


----------



## Nokiron (May 17, 2017)

medi01 said:


> What on earth are you talking about? Not even Microsoft has 100k devs.


Not in-house...

https://venturebeat.com/2017/05/09/...00000-developers-on-deep-learning-ai-in-2017/


----------



## medi01 (May 17, 2017)

Nokiron said:


> Not in-house...


Oh, "going to train 100k developers"
They are also in 50 million cars, aren't they? 
"Much faster than 480" lol.



cdawall said:


> Keep clinging to that one. I have played that card and played that card the entire time that the FX series was their lineup. It isn't going to change how piss poor the FX design was and how horridly it ruined their name in the market. That was on them, not intel, not the consumers. They did a bad job and payed for it with market share.
> .



Come on: the more expensive, more power-hungry, SLOWER Prescott outsold the Athlon 64.

God knows what "market share" is shown in your picture, which country it covers, and whether it counts units or revenue. I'd bet on the former.


----------



## Nokiron (May 17, 2017)

medi01 said:


> Oh, "going to train 100k developers"
> They are also in 50 million cars, aren't they?
> "Much faster than 480" lol.


This statement makes no sense to me. What do you mean by this?


----------



## medi01 (May 17, 2017)

Nokiron said:


> This statement makes no sense to me. What do you mean by this?


I mean that nvidia's claims are rather far from reality.


----------



## Assimilator (May 17, 2017)

medi01 said:


> Come on, more expensive, more power hungry, SLOWER Prescott outsold Athlon 64.



Nobody's disputing that. What we are saying is that after the P4/Athlon 64 era, when AMD should have been able to capitalise, they instead shot themselves in the foot - multiple times - with the uncompetitive Bulldozer architecture.


----------



## Ebo (May 17, 2017)

Am I the only one that really likes that AMD isn't saying anything about the gaming version of "Vega"?

Why should they follow NVIDIA's release cycle and make a bigger and better card than NVIDIA's top-tier card? That isn't where the money is; it's in mainstream gaming and *not* in the enthusiast segment.

Just look at the RX 480/580: they have done well and made a lot of much-needed money for AMD, and believe me when I predict that AMD's next quarterly financials are going to be a hell of a lot better with Ryzen sales numbers coming in, and now Vega to fuel the GFX department after.

I for one don't like early leaks, fake news, call it whatever you want; the public *doesn't* have a right to know until release date, and I second that 100%.

For the moment, if you want the best of the best, just go green camp (GTX 1080 Ti); it's overpriced in most countries anyway.
*If* Vega can compete within a margin of 5% at a lower price tag, I would say it's a slam dunk. Just let the top-tier cards all belong to NVIDIA: let them spend a lot of money developing a new architecture just for the bragging rights and low profit per unit, then fill up the mainstream section with competitive alternatives and make a much better profit per unit in the end. I really like this a lot, since AMD has been bleeding a lot of money which they need to win back.


----------



## medi01 (May 17, 2017)

Assimilator said:


> What we are saying is that after the P4/Athlon 64 era, when AMD should have been able to capitalise, they instead shot themselves in the foot - multiple times - with the uncompetitive Bulldozer architecture.



They struggled with fab expenses and had to sell the fabs because they were not able to properly capitalize on superior products, is what my memory tells me.
Similar story with Fermi outselling the fantastic products AMD had (can't blame that one on strong-arming, though; the underdog basically always has to fight an uphill battle).

There is something about the "their own fault" narrative I simply cannot grasp.


----------



## the54thvoid (May 17, 2017)

medi01 said:


> They struggled with fab expenses and had to sell it, because they actually were not able to properly capitalize on superior products is what my memory tells me.
> Similar story with Fermi outselling fantastic products AMD had. (can't blame this one on strongarming though, basically underdog always has to fight an uphill battle)
> 
> There is something about "their own fault" narrative I simply cannot grasp.



Fermi was hot and hungry, but NVIDIA did something people thought impossible: they dropped a full-core Fermi as the 580.
It helped stem the bleeding from a power monster and set them up well.


----------



## springs113 (May 17, 2017)

TheinsanegamerN said:


> The existence of ryzen seems to disagree with you. It's performance is the definition of "enthusiast". It's a whole 2-5% slower in games (while still maintaining over 100FPS) and wallops intel in creative production, with better price/perf to boot and an 8 core chip in the standard socket that doesnt suck.
> 
> That simply isnt true. When AMD was competitive, IE the 5000 and 7000 series, They sold quite well.
> 
> ...


With the 5000 series AMD did have the better product, but in actuality NVIDIA still outsold them. I believe AdoredTV did a video on this.


----------



## cdawall (May 17, 2017)

medi01 said:


> Come on, more expensive, more power hungry, SLOWER Prescott outsold Athlon 64.
> 
> God knows what "market share" is shown on your picture, which country it is and if it is number of units or revenue. I'd bet on the former.



Market share didn't plummet until Core 2 Duo. AMD should have exceeded Intel's market share during the P4 era, but bad business practices prevented that.

As for the chart, that's the worldwide one that was posted on TPU not even two months ago... so I don't know, ask the news editors?


----------



## RejZoR (May 17, 2017)

Or how people were still buying TOTALLY inferior GeForce FX cards even though any common sense would tell you to stay away from them like a vampire from garlic essence. But when the roles are reversed and AMD is just mildly worse at a few specific things, it's OMG OMG AMD SUXORZ (because the only thing slightly worse is power consumption, in Polaris' case). Sometimes you just can't understand people.


----------



## cdawall (May 17, 2017)

RejZoR said:


> Or how people were still buying TOTALLY inferior GeForce FX cards even though any common sense would tell you to stay away from them like vampire from garlic essence. But when roles are reversed and AMD is just mildly worse at few specific things OMG OMG AMD SUXORZ (because only thing slightly worse is power consumption, in case of Polaris). Sometimes you just can't understand people.



Slightly worse? AMD doesn't offer anything in the HEDT market, period.


----------



## dozenfury (May 17, 2017)

Vega has been hyped as a gamer card for over a year (also keep in mind we're almost a year out from the 1070/1080 releases), and now, less than 3 weeks before E3, it's suddenly repositioned as a professional/deep-learning card instead of a gaming one?

It's amazing what AMD has been able to accomplish in its David-vs-Goliath competition with Intel and NVIDIA.  But the reality is that with their relatively tiny R&D budget there's no way they can hang with them.  Marketing is cheap by comparison, and AMD is great at that.  So over and over it's the same story: incredible claims pre-release that get people salivating, then varying degrees of letdown and defending of AMD after release.  AMD is to be commended for what they've achieved.  But I'd be kicking myself had I been waiting a year for Vega only to learn it's both delayed again in gaming form and a professional card first.


----------



## RejZoR (May 17, 2017)

cdawall said:


> Slightly worse? AMD doesn't offer any thing in the HEDT market period.



I'm talking Polaris vs. Pascal (in its respective segment, aka RX 580 vs. GTX 1060). The only thing the RX 580 can be faulted for is power consumption, and even that frankly isn't bad at all. They aren't loud, they aren't slow. And yet everyone is taking a piss at them almost more than anyone ever did back in the day at the GeForce FX, which was hot, loud, more power-hungry, and had a terminal design flaw in its rendering pipeline.


----------



## efikkan (May 17, 2017)

Did we actually get any new useful information about Vega yesterday?


----------



## springs113 (May 17, 2017)

What are you all complaining about.   This conference was not for consumers"gamers" it was basically for shareholders and investors.   Our time is at at the end of the month.   Vega is a segmented product.   When a company holds a press release that outlines it's strategy for the next 2-5 years, how is it so that people hoping to hear info on an unreleased get upset because they didn't hear what they wanted to hear.   I'm pretty sure that's what Computex is all about (us gamers).  AMD  did not hold any of you guys' hand and told you all not to make a purchase.

@efikkan yes and no... nothing concrete, just things to make assumptions on again.


----------



## cdawall (May 17, 2017)

RejZoR said:


> I'm talking Polaris vs Pascal (in its respective segment, aka RX 580 vs GTX 1060). The only thing the RX 580 can be faulted for is power consumption, and even that quite frankly isn't bad at all. They aren't loud, they aren't slow. And yet everyone is taking the piss out of them almost more than anyone ever did back in the day at the GeForce FX, which was hot, loud, more power hungry and had a terminal design flaw in the rendering pipeline.



Performance of a 1060 and the power consumption of a 1070/1080. Depending on the card they are loud under load, even worse when you have two of them.


----------



## RejZoR (May 17, 2017)

That's the exact thing I don't understand, people. It doesn't fucking matter. And it can't be loud, because if it is, then my GTX 980 is also stupendously loud. And I can hardly hear it. We all used to run 300W god damn graphics cards and no one batted an eye. But now all of a sudden, power consumption goes above framerate in priority, which is like the basics of why we even bother buying this shit.


----------



## cdawall (May 17, 2017)

RejZoR said:


> That's the exact thing I don't understand, people. It doesn't fucking matter. And it can't be loud, because if it is, then my GTX 980 is also stupendously loud. And I can hardly hear it. We all used to run 300W god damn graphics cards and no one batted an eye. But now all of a sudden, power consumption goes above framerate in priority, which is like the basics of why we even bother buying this shit.



Power consumption has been a topic for discussion pretty much since they added power connectors to cards.

Remember I ran two of these prior to my 1080ti. They were some of the better cards out there. They were also hot, loud and slower than one Ti.

Now with a bios mod I have already seen over 380w worth of consumption from my Ti, but that's a far cry less than the 520w I saw from my pair of 480's.


----------



## RejZoR (May 17, 2017)

Average Joe doesn't buy a pair of cards when most can't afford even a single one... And yes, you're all overdramatizing the power consumption. Also, the RX 480 is a 150W card (after the applied driver fix). That's 300W for a pair, not 520W... Do I really have to also drag out the fact that 2 cards were NEVER more efficient than a single-GPU card? It's physically impossible.


----------



## cdawall (May 17, 2017)

RejZoR said:


> Average Joe doesn't buy a pair of cards when most can't afford even a single one... And yes, you're all overdramatizing the power consumption. Also, the RX 480 is a 150W card (after the applied driver fix). That's 300W for a pair, not 520W... Do I really have to also drag out the fact that 2 cards were NEVER more efficient than a single-GPU card? It's physically impossible.



Couple of things: the 520w wasn't dramatized, that was actual power draw, and it was enough that I had to go from my 750w platinum Seasonic to a 1200w unit for stability. 

Raja himself said that two 480's were more efficient than a 1080.


----------



## BiggieShady (May 17, 2017)

RejZoR said:


> they place it on a graph next to R9 Fury X.


Let's look at it as a comparison of Fiji chip versus Vega chip


----------



## cdawall (May 17, 2017)

BiggieShady said:


> Let's look at it as a comparison of Fiji chip versus Vega chip



They can't put it next to the Pro Duo, which was the last professional card released, because it is slower. Unluckily, people haven't quite thought that through yet. If it is slower than a pair of Fiji cards, it's slower than the 1080ti, let alone a Titan.


----------



## GC_PaNzerFIN (May 17, 2017)

I don't know about you guys, but for me the blue VEGA Frontier Edition is pretty much the ugliest looking graphics card I have ever seen. 
RX VEGA probably has half the HBM memory, but something is obviously a problem at the red team if they're unable to deliver VEGA volume in time, resorting to a limited-edition pro card launch. A niche product is all good and well, but AMD needs the next thing for the gaming masses or NVIDIA can cash cow yet another holiday season.


----------



## RejZoR (May 17, 2017)

cdawall said:


> Couple of things: the 520w wasn't dramatized, that was actual power draw, and it was enough that I had to go from my 750w platinum Seasonic to a 1200w unit for stability.
> 
> Raja himself said that two 480's were more efficient than a 1080.



How do you get 520W from 2x 150W cards? Either you're shit at math or you're bending the laws of physics. Because of that PCIe power draw scandal, it has been measured over and over, and it's in fact 150W consumption. So, where you get those 520W is beyond me...


----------



## xkm1948 (May 17, 2017)

AMD can totally wait until Q4 to release Vega to compete with the mid-range Volta 1160. It all makes sense now. They probably never intended to let Vega be a top end card. Vega may end up just like Polaris, battling the xx60 of Volta.


----------



## Nokiron (May 17, 2017)

RejZoR said:


> How do you get 520W from 2x 150W cards? Either you're shit at math or you're bending the laws of physics. Because of that PCIe power draw scandal, it has been measured over and over, and it's in fact 150W consumption. So, where you get those 520W is beyond me...


Or you know, it could be option number three. He was not using reference RX480s. When I used my Nitro+ I was pulling 80W+ more than a stock RX480 without overclocking.


----------



## efikkan (May 17, 2017)

xkm1948 said:


> AMD can totally wait until Q4 to release Vega to compete with the mid-range Volta 1160. It all makes sense now. They probably never intended to let Vega be a top end card. Vega may end up just like Polaris, battling the xx60 of Volta.


There will be no consumer versions of Volta anytime soon.
Currently, AMD is only competitive in the low-end, they really need to have some decent mid-range chips with high volume sales.


----------



## cdawall (May 17, 2017)

RejZoR said:


> How do you get 520W from 2x 150W cards? Either you're shit at math or you're bending the laws of physics. Because of that PCIe power draw scandal, it has been measured over and over, and it's in fact 150W consumption. So, where you get those 520W is beyond me...



efficiency curve, overclocking, additional voltage, etc.
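For anyone doubting the math: dynamic power scales roughly with voltage squared times frequency, so a voltage bump plus a clock bump compounds quickly and blows past the stock board-power figure. A rough sketch (the +10% voltage / +15% clock ratios here are illustrative assumptions, not measured settings):

```python
# Back-of-the-envelope check on how "150 W" cards can pull far more:
# dynamic power scales roughly as P ~ V^2 * f (capacitance held constant).

def scaled_power(base_watts: float, v_ratio: float, f_ratio: float) -> float:
    """Estimate board power after an overclock using P proportional to V^2 * f."""
    return base_watts * v_ratio**2 * f_ratio

# Reference RX 480: ~150 W board power at stock (post driver fix).
# Assume +10% core voltage and +15% core clock for an aggressive OC.
per_card = scaled_power(150, 1.10, 1.15)
pair = 2 * per_card

print(f"per card: {per_card:.0f} W, pair: {pair:.0f} W")  # ~209 W and ~417 W
```

With a heavier voltage bump the pair lands well past 400 W, which is why an overclocked pair and a stock pair are not comparable numbers.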



Nokiron said:


> Or you know, it could be option number three. He was not using reference RX480s. When I used my Nitro+ I was pulling 80W+ more than a stock RX480 without overclocking.



Thank you for answering this for me... I don't know how many times I have to state "overclocked"; alas, I guess one more.


----------



## RejZoR (May 17, 2017)

Graphics cards have never been more efficient for the performance delivered, and everyone is freaking out like there's no tomorrow. That's why. Overclocked or not. We've had cards that delivered half the performance for twice the power consumption and the reaction was a slightly raised brow. But here it's "OH MAH GOD IT'S LA TERRIBLÉ". That's why I don't understand any of you whining here.


----------



## xkm1948 (May 17, 2017)

RejZoR, you should totally get a Vega FE when it is available and bench it yourself to shut up all the "haters".

I'm counting on ya!




efikkan said:


> There will be no consumer versions of Volta anytime soon.
> Currently, AMD is only competitive in the low-end, they really need to have some decent mid-range chips with high volume sales.



Sounds about right, AMD is more for low end ~ ultra low end.


----------



## cdawall (May 17, 2017)

RejZoR said:


> Graphics cards have never been more efficient for the performance delivered, and everyone is freaking out like there's no tomorrow. That's why. Overclocked or not. We've had cards that delivered half the performance for twice the power consumption and the reaction was a slightly raised brow. But here it's "OH MAH GOD IT'S LA TERRIBLÉ". That's why I don't understand any of you whining here.



I knew the 480's pulled some juice when I purchased them; however, I did not exactly plan for them to pull 260 watts a pop. I have had some power hungry cards in the past (290, Furys, Fermi, etc.); power consumption doesn't bother me so much as the lack of performance in comparison. I was also immensely disappointed in the GTX 1070 after testing.

This is also colored by fighting those cards for almost a year. Issue after issue with drawing too much juice through the board, running too hot (even stock) for crossfire use, etc. When they worked they easily equaled a 1080 even with some overclock on it, but that didn't translate to every game.


----------



## RejZoR (May 17, 2017)

Hot were only reference models. The aftermarket ones were fine in terms of thermals and noise.


----------



## cdawall (May 17, 2017)

RejZoR said:


> Hot were only reference models. The aftermarket ones were fine in terms of thermals and noise.



Put two of them in a mATX case and tell me how that goes. Again, you haven't got one, let alone two, of these cards. I actually owned them, tested with the super low wattage RX480 RS models from XFX that actually pulled _less_ power than reference, as well as my Nitro+ cards that are on the other end of that. I also tested with clockspeeds in the 1480mhz realm, which is within 5-10% of the best RX480's on air. They were hot, loud and power hungry. I quite honestly don't know what you are arguing; you don't even have the cards.


----------



## RejZoR (May 17, 2017)

Put two Saturn V rockets in a shed and tell me how that goes. Yes, you're overdramatizing. Over and over and over. When the only thing you keep bringing up is CrossfireX and some ridiculously tiny case scenarios, you're really grasping at straws here. People who buy these things generally have at least a midi tower and they pretty much never crossfire them. C'mon, of all people one would expect you'd be the one to know these things.

I do have GTX 980 which is essentially the same thing. And when it's overclocked it's scorching hot and also becomes noisy. Performance projections are about the same.


----------



## cdawall (May 17, 2017)

RejZoR said:


> Put two Saturn V rockets in a shed and tell me how that goes. Yes, you're overdramatizing. Over and over and over. When the only thing you keep bringing up is CrossfireX and some ridiculously tiny case scenarios, you're really grasping at straws here. People who buy these things generally have at least a midi tower and they pretty much never crossfire them. C'mon, of all people one would expect you'd be the one to know these things.
> 
> I do have GTX 980 which is essentially the same thing. And when it's overclocked it's scorching hot and also becomes noisy. Performance projections are about the same.



520w is 520w. That isn't in question; it actually happened. 

They were fine in my larger mATX-ish case. They draw more power than the 980, both OC'd, and there are plenty of people who run crossfire. That isn't rare, that isn't new. 

The cards weren't quiet when new, or if set to the silent BIOS they throttled the whole time. The same issue all of them have faced. Owners of them know this.


----------



## eidairaman1 (May 17, 2017)

Link from rtg if not the same.

https://m.facebook.com/story.php?story_fbid=1704484156511581&id=1622923754667622


----------



## medi01 (May 18, 2017)

cdawall said:


> Depending on the card they are loud under load,


No, they aren't; in fact, most AIB 580's are quieter than 1060s.
It only consumes a lot at stock voltage, and it comes with unique driver features ("Chill") that can cut power consumption by two thirds.

Not that 40-80W of total power consumption mattered in this context.



xkm1948 said:


> AMD can totally wait until Q4 to release the Vega to compete with mid-range Volta 1160. It all make sense now. They probably never intended to let Vega be a top end card. Vega may end up just like Polaris, battling the xx60 of volta.


Because 1060 can take on 980Ti, right?
Let me remind you of a quote from one of the TPU users: "OC 1080 vs my OC 980Ti is only 11% faster".
/sigh

Wild expectations from Volta, when we got barely better perf/$ from Pascal, which had a MONSTROUS process jump over the previous gen, eh?

I recall Nintendo Switch expectations were even crazier; it was supposed to beat PS4/Xbone (yey), cause, you know, the reality distortion field many team green users live in.
Oh, it ended up a tad faster than a WiiU.


----------



## cdawall (May 18, 2017)

medi01 said:


> No, they aren't; in fact, most AIB 580's are quieter than 1060s.
> It only consumes a lot at stock voltage, and it comes with unique driver features ("Chill") that can cut power consumption by two thirds.
> 
> Not that 40-80W of total power consumption mattered in this context.



The 480 has that same feature. Let me tell you how well it worked. 

Also, which AIB card specifically are we talking about here? Quite a few models with overclocks were a lot more than 40-80w, but then again even 40-80 watts is over half the consumption of a 1060.

Edit:

Looks like over 100w in games, louder than the reference 1060 as well. All while just barely edging it out in gaming performance. 

https://www.techpowerup.com/reviews/Sapphire/RX_580_Nitro_Plus/28.html


----------



## medi01 (May 19, 2017)

https://www.computerbase.de/2017-04...-bios-test/2/#abschnitt_lautstaerke__kuehlung




cdawall said:


> The 480 has that same feature. Let me tell you how well it worked


Tell me.


----------

