Friday, July 12th 2019

AMD Retires the Radeon VII Less Than Five Months Into Launch

AMD has reportedly discontinued production of its flagship Radeon VII graphics card. According to a Cowcotland report, AMD no longer finds it viable to produce and sell the Radeon VII at prices competitive with NVIDIA's RTX 2080, especially when its latest Radeon RX 5700 XT performs within 5-12 percent of the Radeon VII at $399, little more than half the Radeon VII's $699 price. AMD probably expects custom-design RX 5700 XT cards to narrow the gap even further. The RX 5700 XT also has a much lower bill of materials (BOM) cost than the Radeon VII, owing to the simplicity of its ASIC, a conventional GDDR6 memory setup, and far lighter electrical requirements.
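To put numbers on that value argument, here is a minimal back-of-the-envelope sketch in Python, assuming the launch MSRPs ($699 for the Radeon VII, $399 for the RX 5700 XT) and the 5-12 percent performance gap quoted above; street prices will of course vary:

```python
# Rough performance-per-dollar comparison, using launch MSRPs and the
# 5-12 percent performance gap cited in the article (assumed inputs).
RADEON_VII_PRICE = 699
RX_5700_XT_PRICE = 399

for gap in (0.05, 0.12):
    rel_perf = 1.0 - gap  # RX 5700 XT performance relative to Radeon VII
    vii_value = 1.0 / RADEON_VII_PRICE      # performance per dollar
    xt_value = rel_perf / RX_5700_XT_PRICE
    print(f"{gap:.0%} slower: RX 5700 XT offers {xt_value / vii_value:.2f}x "
          f"the performance per dollar of the Radeon VII")
```

Even at the pessimistic end of that range, the RX 5700 XT delivers roughly 1.5x the performance per dollar, a gap AMD would have to close with price cuts it reportedly cannot afford.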

In stark contrast to the RX 5700 XT, the Radeon VII is based on a complex MCM (multi-chip module) that packs not just a 7 nm GPU die, but also four 32 Gbit (4 GB) HBM2 stacks and a silicon interposer. It also has much steeper VRM requirements. Making matters worse is the now-dated "Vega" architecture it's based on, which trails "Navi" significantly in performance per Watt. The future of AMD's high-end VGA lineup is uncertain. Given how close "Navi" comes to performance-per-Watt parity with NVIDIA on the RX 5700, AMD may be tempted to design a larger GPU die based on "Navi," with a conventional GDDR6-based memory sub-system, to take another swing at NVIDIA's high-end.
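The irony is that the exotic memory subsystem buys enormous bandwidth that gaming workloads evidently don't need. A minimal sketch using the published launch specs (a 4096-bit HBM2 interface at 2.0 Gbps per pin on the Radeon VII versus a 256-bit GDDR6 interface at 14 Gbps on the RX 5700 XT) quantifies the gap:

```python
# Peak theoretical memory bandwidth: bus width (bits) x per-pin data
# rate (Gbps) / 8 bits per byte. Inputs are the published launch specs.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

radeon_vii = bandwidth_gb_s(4096, 2.0)  # four HBM2 stacks on an interposer
rx_5700_xt = bandwidth_gb_s(256, 14.0)  # eight conventional GDDR6 chips

print(f"Radeon VII : {radeon_vii:7.0f} GB/s")  # ~1024 GB/s
print(f"RX 5700 XT : {rx_5700_xt:7.0f} GB/s")  # ~448 GB/s
```

More than twice the bandwidth, delivered by a far more expensive package, for a card that is only 5-12 percent faster in games.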
Source: Cowcotland

123 Comments on AMD Retires the Radeon VII Less Than Five Months Into Launch

#26
londiste
HBM was not and is not cost-effective. Initially GDDR5X and later GDDR6 rose to the challenge and now provide enough bandwidth (which is what GPUs seem to care about most) on the cheap.
#27
Vya Domus
HBM was never meant to be cost-effective; it's meant to be fast and size/power efficient, and it's certainly not going away for certain types of products.

There's a reason Nvidia hasn't made a V100 successor with Turing and GDDR6.
#28
EarthDog
vega22Nobody. They pushed tech forward, which historically AMD always has. They pioneered HBM whilst the competition used cheaper tech, which gave vastly less of a step up in performance.

Nobody should ever take flack for pushing the envelope, even if it misses the mark.
What good was/is HBM? It was more expensive than GDDR5X and GDDR6. Sure, it offered more bandwidth, but for what? It doesn't matter at 1080p or 2560x1440. Maybe 4K, but no card that had HBM/HBM2 had enough horsepower for 60 FPS 4K Ultra/High... so what? Sew buttons... :p

HBM, in its current implementation, really didn't bring much to the people. So, yay for that innovation? What did it really bring us? They pushed GCN forward several generations... middling performance at more power use (but it's cheaper!!!)... what did I miss?
vega22Nobody should ever take flack for pushing the envelope, even if it misses the mark.
Where was this sentiment when RTX came out with ray tracing?
#29
Fatalfury
When Nvidia retires the RTX 2080 and RTX 2070 for the Super series ($699 vs. $499 for the same performance),
People: I feel cheated, r@ped, hatred towards Nvidia.

When AMD retires the Radeon VII for the Radeon RX 5700 XT series ($699 vs. $399),
People: Good move... it was only a productivity card, not needed for gaming.

....
#30
Divide Overflow
So, what's AMD going to do to target the performance gaming segment now that they've thrown in the towel on Radeon VII?
#31
Slizzo
FatalfuryWhen Nvidia retires the RTX 2080 and RTX 2070 for the Super series ($699 vs. $499 for the same performance),
People: I feel cheated, r@ped, hatred towards Nvidia.

When AMD retires the Radeon VII for the Radeon RX 5700 XT series ($699 vs. $399),
People: Good move... it was only a productivity card, not needed for gaming.

....
Reason being, there's still plenty of room to scale RDNA up. They're only using 40 CUs on the "big" card right now. Fill it up with more CUs and you have a great high-end to enthusiast card on your hands.
#32
Vya Domus
FatalfuryPeople: Good move... it was only a productivity card, not needed for gaming.
It is funny though, because every living soul called this card either useless/obsolete for gaming or indeed a productivity card. Some people out there are very confused.
#33
FordGT90Concept
"I go fast!1!11!1!"
CowCotLand was completely wrong about Navi's launch date, so I don't consider them a credible source; however, it appears that AMD did not change the Radeon VII's MSRP in response to NVIDIA's Super cards, and further, it looks like they're mostly sold out. Put the two together and it is plausible that margins on the Radeon VII are so slim that AMD can't cut its price to be competitive with the RTX 2080's new price, so they're forced to discontinue it. There is logic in CowCotLand's claim.
SlizzoReason being, there's still plenty of room to scale RDNA up. They're only using 40 CUs on the "big" card right now. Fill it up with more CUs and you have a great high-end to enthusiast card on your hands.
Yeah, the RX 5700 XT has 10.3 billion transistors versus the RTX 2080 Ti's 18.6 billion. If AMD wanted to make a similarly sized Navi chip, and Navi can handle it, AMD could take the performance crown... but the price would be more or less the same.
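Purely as illustration, here's a naive linear-scaling sketch from those published transistor counts; the assumption that CU count grows in proportion to the transistor budget is ours, and uncore logic, memory controllers, and I/O don't actually scale that way:

```python
# Naive die-scaling estimate from published transistor counts.
# Assumes CU count grows linearly with transistor budget (a rough
# simplification; uncore and I/O blocks do not scale this way).
RX_5700_XT_TRANSISTORS = 10.3e9   # Navi 10
RTX_2080_TI_TRANSISTORS = 18.6e9  # TU102
RX_5700_XT_CUS = 40

ratio = RTX_2080_TI_TRANSISTORS / RX_5700_XT_TRANSISTORS
print(f"A TU102-sized Navi die has ~{ratio:.1f}x the transistor budget,")
print(f"or very roughly {round(RX_5700_XT_CUS * ratio)} CUs.")
```

That back-of-the-envelope figure of roughly 72 CUs is exactly the kind of "big Navi" this thread is speculating about; whether the power budget allows it is another question.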
#34
Fluffmeister
Well yeah, isn't the RTX 2080 Super coming in at $699 now? As a gaming card the Radeon VII makes even less sense than it did just five months ago.
#35
Athlonite
If only they (AMD) had the foresight to re-purpose the Radeon VII's cooler for the 5700/XT cards.
#36
windwhirl
SlizzoReason being, there's still plenty of room to scale RDNA up. They're only using 40 CUs on the "big" card right now. Fill it up with more CUs and you have a great high-end to enthusiast card on your hands.
Can they, though? I remember someone brought that up with Polaris, and someone else shot it down because it didn't scale up well and power consumption would go up too much.
#37
Midland Dog
Big Navi had better be special, otherwise an RTX 3060 will run rings around it.
#38
Wavetrex
This card was, from the beginning, a "Collector's Item".

No one in their right mind would consider buying it for price/performance.
Anyway, those that did get it have a nice piece of history: the "First consumer GPU on 7 nm". YAY!
#39
Midland Dog
WavetrexThis card was, from the beginning, a "Collector's Item".

No one in their right mind would consider buying it for price/performance.
Anyway, those that did get it have a nice piece of history: the "First consumer GPU on 7 nm". YAY!
AMD should have put the Radeon VII away and never brought it out, because the 5700 would still have been the first consumer 7 nm part anyway.
#40
Basard
ZoneDymoSounds like you were born yesterday; stuff that has been around for decades if not longer is constantly being screwed up by everybody around the world.
Thanks for that compliment.... Being born yesterday and being able to string together coherent messages is a huge feat of genius.
#41
Bones
WavetrexThis card was, from the beginning, a "Collector's Item".

No one in their right mind would consider buying it for price/performance.
Anyway, those that did get it have a nice piece of history: the "First consumer GPU on 7 nm". YAY!
TBH I'm perfectly happy with my collector pieces.
As long as they do what I want them to do it's all good, and I intend to use them until they're done; at least I'll get some use back vs. the investment. One is doing great as the card in my daily; the other I stashed back in the box after a 24-hour bout of folding alongside the one used as my daily.
Since it did well I know it's OK, and it's been in the box ever since...

Waiting for that rainy day I might need it, or just whatever.
Benching?
Possibly. You never know.
#42
ironwolf
Is this going to tank the resale value of the card at all?
#43
Slizzo
windwhirlCan they, though? I remember someone brought that up with Polaris, and someone else shot it down because it didn't scale up well and power consumption would go up too much.
I don't see why not. Remember, this isn't GCN, so it doesn't have the same pitfalls that Polaris and Vega had. It also benefits from the 7 nm process; I believe they have plenty of room to scale.
#44
HenrySomeone
So, they are already saying goodbye to the Crapeon 7, a pathetic attempt to show they can still be competitive in the high-end segment (which obviously failed miserably), leaving them with no offerings past mid-range at all, lol. And to those who delusionally hope for a larger Navi chip: the relatively small 5700 XT is already guzzling as much juice as a 2080, so how much do you think something that sits even a bit above it would draw? 300, 350 W? :D Not to mention something that could compete with a 2080 Ti - it would probably be a 500 W, record-breaking behemoth! :laugh:
#45
Darmok N Jalad
EarthDogWhere was this sentiment when RTX came out with ray tracing?
Honestly, I think the original statement was too bold a claim to begin with. While pushing the envelope is often good, it's not always well thought out or executed. MSFT has been pushing the envelope with Windows 10, much to many users' misery.
WavetrexAnyway, those that did get it have a nice piece of history: the "First consumer GPU on 7 nm". YAY!
Yes, and by doing such a short run, this card just might be a pretty good collector’s item. I wonder how many were even manufactured?
#46
Bones
Darmok N JaladYes, and by doing such a short run, this card just might be a pretty good collector’s item. I wonder how many were even manufactured?
Probably not too many within such a short time period.
Got mine and I'm happy with them, 'nuff said on that. Not looking at them in collector's terms; I am using the one, with the other in reserve if it's ever needed.
#47
WikiFM
Since the new 2070 Super is faster at $499, how could the Radeon VII compete? It won't reduce its price to $499.
#48
srsbsns
This is going to mean warranty support will be a nightmare.
#49
Patriot
Y'all need some learning. When Fury and Vega were designed around HBM, they were among the first products with HBM/HBM2.
AMD helped design the HBM standard, and then all the FPGAs and compute cards started using it.
The memory makers then decided they could charge more and increased the price 4x or more.
So between the Vega demo and the price announcement, the price of its HBM increased 4x, and that wrecked the card's competitiveness.
The VII therefore was never a planned consumer product; it was always a stopgap/defective MI50.
It cannot be decreased in price at all without losing money, and the 5700 XT is nipping at its heels as is.
Big Navi was too power hungry and got bumped to 7 nm+ for a 15% power reduction.
#50
Metroid
I knew this would happen. Vega was always a failed GPU; it was good for AMD to have Vega, because now they know never to make that mistake again, hehe. I also predicted this: in January 2019, when Lisa said they would launch the Radeon VII and not the Ryzen 3xxx series, I knew AMD was up to something, and that was to shift the whole 7 nm production to the Radeon VII for maximum profit, and then EOL it once Navi was released.