Wednesday, June 6th 2018
AMD "Vega" Outsells "Previous Generation" by Over 10 Times
At its Computex presser, leading up to the unveiling of its 7 nm Radeon Vega series, AMD touched upon the massive proliferation of the Vega graphics architecture, which is found not only in discrete GPUs, but also in APUs and in the semi-custom SoCs of the latest-generation 4K-capable game consoles. One slide that caused quite a stir states that "Vega" shipments are over 10 times greater than those of the "previous generation."
Normally you'd assume the previous generation of "Vega" to be "Polaris," since we're talking about the architecture and not a specific implementation of it (e.g., "Vega 10" or "Raven Ridge"). AMD later clarified, at its post-event round-table, that it was referring to "Fiji," the chip behind the Radeon R9 Fury X, R9 Nano, etc., and comparing its sales with those of products based on the "Vega 10" silicon. Growth in shipments of "Vega"-based graphics cards has been driven by the crypto-mining industry, and for all intents and purposes, AMD considers the "Vega 10" silicon a commercial success.
61 Comments on AMD "Vega" Outsells "Previous Generation" by Over 10 Times
Edit: To add, this is for BOTH Nvidia and AMD.
I'm guessing part of the reason Nvidia has such a lead is its better perf/W. AMD fans will tell you that's inconsequential, but it costs AMD pretty much the whole laptop market.
EDIT: 144 fps where it matters and FreeSync where it doesn't. A stable 142 fps in Far Cry 5 or Rise of the Tomb Raider means little, since those are mainly single-player games; you just want a smooth experience. It does matter in new online shooters (as it affects input lag), though, like Squad, Quake Champions, the next Battlefields, etc.
EDIT2: Just wanted to add that I'm not talking about 2K/4K. That's not for the real 144 Hz/144 fps crowd with current game optimization. I mean, show me a benchmark where a single Nvidia 1080 Ti holds a stable 144 fps even at 2K in the latest Deus Ex (never dropping below that on ultra, which is what it would take). I think it will be 2-4 years before we have cards like that, so without FreeSync/G-Sync you can't go above Full HD yet if you're planning to skip sync tech.
Regardless, consoles have literally zero to do with any of this, and frankly it's an uninteresting subject. You don't choose your parts in there anyway, and you can't even gauge performance from which architecture is used. With your third post on this forum, that's an odd statement to make. Start a blog, I'd say...
EDIT: You also joined only a few years ago, in 2014. I've been here since 2006. I didn't like having blogs, by the way; I closed all three of mine 18 years ago.
There is no way you (and others) could convince me that something so easily tunable matters that much. How easily? Just look at the Polaris-based Radeon Pro 460 with 16 CUs in the MacBook Pro. Apple managed to cut the power in half while losing only a third of the performance (benchmarks exist, look them up). Efficiency can be a minor factor, but please don't blow it out of proportion.
I have no need for further discussion on this topic (I think I've proved my point), so I'm moving on ...
But I won't leave your ramblings unanswered, just in case some noob stumbles upon them.