Wednesday, August 22nd 2018

AMD 7nm "Vega" by December, Not a Die-shrink of "Vega 10"

AMD is reportedly prioritizing its first 7 nanometer silicon fabrication allocations for two chips: "Rome" and "Vega 20." Rome, as you'll recall, is the first CPU die based on the company's "Zen 2" architecture, and will build the company's 2nd generation EPYC enterprise processors. "Vega 20," on the other hand, could be the world's first 7 nm GPU.

"Vega 20" is not a mere die-shrink of the "Vega 10" GPU to the 7 nm node. For starters, it is flanked by four HBM2 memory stacks, confirming that it will feature a wider memory interface and support for up to 32 GB of memory. AMD at its Computex event confirmed that "Vega 20" will build Radeon Instinct and Radeon Pro graphics cards, and that it has no plans to bring it to the client segment. That distinction is reserved for "Navi," which won't debut until 2019, if not later.
Source: VideoCardz

194 Comments on AMD 7nm "Vega" by December, Not a Die-shrink of "Vega 10"

#2
Mussels
Freshwater Moderator
I was planning on getting a 2080Ti, but seeing those prices and now seeing this news...
(yes i know it'll be 2019, but the 20x0 cards will still be over $1.5K here in aus by then)
#3
DeathtoGnomes
MusselsI was planning on getting a 2080Ti, but seeing those prices and now seeing this news...
(yes i know it'll be 2019, but the 20x0 cards will still be over $1.5K here in aus by then)
This news has been in the works for some time; not surprised AMD is trying to steal Nvidia's steam from the 20x0 series. Navi may not be till the 2nd half of 2019.
#4
cucker tarlson
MusselsI was planning on getting a 2080Ti, but seeing those prices and now seeing this news...
(yes i know it'll be 2019, but the 20x0 cards will still be over $1.5K here in aus by then)
AMD at its Computex event confirmed that "Vega 20" will build Radeon Instinct and Radeon Pro graphics cards, and that it has no plans to bring it to the client segment.

enthusiast gamers are at the very end of their scope, forget about them prioritizing or innovating in that segment.
#5
TheinsanegamerN
And AMD, once again, leaves an entire segment to nvidia for a third generation in a row.

AMD should just sell Radeon by this point. They can do really well with GPUs, or CPUs, but not both. Sell Radeon to somebody that can actually produce decent GPUs. Vega was a year late and many watts too high, and is about to get eclipsed by a new generation of GPUs from Nvidia.
#6
THE_EGG
MusselsI was planning on getting a 2080Ti, but seeing those prices and now seeing this news...
(yes i know it'll be 2019, but the 20x0 cards will still be over $1.5K here in aus by then)
Yep, I was so sad when I saw $1899 for the Ti :cry:

Retail pricing for AIB cards isn't looking much better either....
#7
R0H1T
DeathtoGnomesThis news has been in the works for some time; not surprised AMD is trying to steal Nvidia's steam from the 20x0 series. Navi may not be till the 2nd half of 2019.
Is "Navi" the next gaming chip, the end of the line for GCN?
#8
cucker tarlson
It's the next Polaris, to be used in next-gen consoles and mid-range GPUs.
#9
mtcn77
TheinsanegamerNAnd AMD, once again, leaves an entire segment to nvidia for a third generation in a row.

AMD should just sell Radeon by this point. They can do really well with GPUs, or CPUs, but not both. Sell Radeon to somebody that can actually produce decent GPUs. Vega was a year late and many watts too high, and is about to get eclipsed by a new generation of GPUs from Nvidia.
AMD is winning on density. They have features still unsupported by DirectX, which could turn the tables. You are free to doubt that balance, but GPU mining should provide pointers on how they really stack up against one another.
#10
Vayra86
R0H1TIs "Navi" the next gaming chip, end of the line for GCN?
I think they will still use GCN in some revised form, but just glue the chips together like TR/EPYC.
#11
Vya Domus
Vayra86Sadness
MusselsI was planning on getting a 2080Ti, but seeing those prices and now seeing this news...
This is going to be a turning point. If the 20 series is successful and people pay up it will likely mark the end of any effort AMD will make to compete in the high end mainstream PC market. Should they come up with something better and much cheaper, they'll be at a disadvantage because they'll have much lower margins on their products. And if they want to have the same margins then they'll have to ask the same prices, either way the consumer will be screwed.

You reap what you sow, dear consumer. Anyone can ask whatever amount of money they want, but that price becomes commonplace only when it is accepted on a large scale. Nvidia keeps raising the bar, are people going to accept it? That's all it comes down to.
#12
wurschti
TheinsanegamerNAnd AMD, once again, leaves an entire segment to nvidia for a third generation in a row.

AMD should just sell Radeon by this point. They can do really well with GPUs, or CPUs, but not both. Sell Radeon to somebody that can actually produce decent GPUs. Vega was a year late and many watts too high, and is about to get eclipsed by a new generation of GPUs from Nvidia.
*cough* Intel *cough*
#13
DeathtoGnomes
Vayra86I think they will still use GCN in some revised form, but just glue them together like TR/EPYC
"glue" is such an Intel word.
#14
BluesFanUK
C'mon AMD, you've already got Intel running scared, focus on Nvidia now.
#15
Fluffmeister
A 32 GB HBM2 7 nm chip is gonna be expensive; it's no surprise they are focusing on the HPC/Pro market where the money is. Volta already has large chunks of the market sewn up and Turing-based Quadros are going to reign supreme in the pro sector... they need 7 nm to up their competitiveness.
#16
cucker tarlson
Even with 7 nm and 32 GB, they'll have a hard time against the Quadro RTXs: 48 GB, NVLink, RT/DL-specific hardware + software support. They'd have more of a chance to succeed if they ran against GeForce cards in the enthusiast segment, not Quadros and Teslas.
#17
Vayra86
Vya DomusThis is going to be a turning point. If the 20 series is successful and people pay up it will likely mark the end of any effort AMD will make to compete in the mainstream PC market. Should they come up with something better and much cheaper, they'll be at a disadvantage because they'll have much lower margins on their products. And if they want to have the same margins then they'll have to ask the same prices, either way the consumer will be screwed.

You reap what you sow, dear consumer. Anyone can ask whatever amount of money they want, but that price becomes commonplace only when it is accepted on a large scale.
True. But it was never a healthy market to begin with; when ATI fell you could already see this scenario coming. Ironically, most GPU makers have themselves to blame for failing: if you don't score design wins and capitalize on them, you just lose. It's a tough industry, but also one where real progress and innovation get rewarded well.

I don't think this is even up to consumers, really. We get the worst chips on each wafer!
#18
mtcn77
cucker tarlsonEven with 7nm and 32gb, they'll have a hard time against quadro rtx's. 48gb,nvlink,rt/dl specific hardware + software support. They'd have a more chance to succeed if they ran against geforce cards in the enthusiast segment not quadros and teslas.
IF has 2 times more bandwidth than NVLink, afaik.
#20
TEAMRED
Seeing the specs, I suspect the 2080 will have just over a 10% improvement over the 1080 Ti in general. That's probably why they defined a new benchmark and are trying to impress you with that. I hate the fact they didn't go for 7 nm and are asking consumers to pre-pay for their immature technology; wait for the 30 series or Navi.
#21
mtcn77
cucker tarlsonIF is on die.
Still, that is how TR works.
#22
cucker tarlson
TEAMREDSeeing the specs, I suspect the 2080 will have just over a 10% improvement over the 1080 Ti in general. That's probably why they defined a new benchmark and are trying to impress you with that. I hate the fact they didn't go for 7 nm and are asking consumers to pre-pay for their immature technology; wait for the 30 series or Navi.
Really? I really thought you'd be for early adoption of RT cores. My word....
#23
mtcn77
TEAMREDSeeing the specs, I suspect the 2080 will have just over a 10% improvement over the 1080 Ti in general. That's probably why they defined a new benchmark and are trying to impress you with that. I hate the fact they didn't go for 7 nm and are asking consumers to pre-pay for their immature technology; wait for the 30 series or Navi.
The benchmark you mention: 45 milliseconds to render a scene that a competing 4x V100 multi-GPU setup took 55 milliseconds to render.
#24
TheinsanegamerN
mtcn77AMD is winning on density. They have features unsupported by Directx, still, which could turn the tables.
Yes, all those special features that AMD fans were screaming would save AMD: DX12, async, Mantle, Vulkan, TressFX, the list goes on.

Special features don't matter unless you can get most developers to use them, and only Nvidia's GameWorks has seen such success. For 5 years "special features" were going to be AMD's ace up their sleeve, and for 5 years Nvidia has dominated them in sales. AMD needs to focus less on special features they can't support and more on producing fast GPUs.
You are free to doubt that balance, but GPU mining should provide pointers on how they really stack up against one another.
Performance in one application ≠ performance overall. You could just as easily point to Nvidia's gaming performance and CUDA performance in pro applications and say "you can doubt it, but that shows how they really stack up".

Regardless of how good Vega is (which is highly subjective based on application), Vega was over a year late to market, power hungry, with very little OC capability, and was hampered by minuscule availability and HBM production. The result was Nvidia capturing a huge portion of the market using now 2-year-old GPUs because AMD never bothered to show up. You can't just leave an entire generation behind and expect people to continue supporting your brand.

AMD now considering leaving a second generation to Nvidia does two things. First, it creates an even stronger impression that AMD simply can't compete on the high end, reinforcing the "mindshare" that many AMD fans are convinced exists; in reality, it is people being uncertain about investing in a brand when said brand cannot consistently show up to compete. The second is that it gives Nvidia a captive market to milk for $$$, which helps keep them economically ahead of AMD, able to make bigger investments in the development of new tech, and perpetually keeping AMD in a position of catching up.
#25
cucker tarlson
mtcn77Still, that is how TR works.
Yes, but that's multiple chips on one package, not a single big chip like Vega 20. That's what I meant; Nvidia has a more efficient way to connect those.