Friday, July 12th 2019
AMD Retires the Radeon VII Less Than Five Months Into Launch
AMD has reportedly discontinued production of its flagship Radeon VII graphics card. According to a Cowcotland report, AMD no longer finds it viable to produce and sell the Radeon VII at prices competitive with NVIDIA's RTX 2080, especially when its latest Radeon RX 5700 XT performs within 5-12 percent of the Radeon VII at little more than half its price. AMD probably expects custom-design RX 5700 XT cards to narrow the gap even further. The RX 5700 XT has a much lower BOM (bill of materials) cost than the Radeon VII, owing to the simplicity of its ASIC, a conventional GDDR6 memory setup, and far lighter electrical requirements.
In stark contrast to the RX 5700 XT, the Radeon VII is based on a complex MCM (multi-chip module) that packs not just a 7 nm GPU die, but also four 32 Gbit HBM2 stacks and a silicon interposer. It also has much steeper VRM requirements. Making matters worse, the now-obsolete "Vega" architecture it's based on loses big to "Navi" in performance per Watt. The future of AMD's high-end VGA lineup is uncertain. Given how close "Navi" comes to performance-per-Watt parity with NVIDIA on the RX 5700, AMD may be tempted to design a larger "Navi"-based GPU die with a conventional GDDR6 memory sub-system, to take another swing at NVIDIA's high end.
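For a rough sense of the economics, here is a back-of-the-envelope value comparison, a sketch in Python using the cards' launch MSRPs and rated board power; the 10 percent performance gap is simply the midpoint of the 5-12 percent range cited above, not a benchmark result:

```python
# Illustrative perf-per-dollar and perf-per-Watt comparison.
# Prices are launch MSRPs; board power is the rated TBP of each card;
# rel_perf uses the midpoint of the 5-12% gap quoted in the article.

cards = {
    "Radeon VII": {"price_usd": 699, "board_power_w": 300, "rel_perf": 1.00},
    "RX 5700 XT": {"price_usd": 399, "board_power_w": 225, "rel_perf": 0.90},
}

for name, c in cards.items():
    per_1000_usd = 1000 * c["rel_perf"] / c["price_usd"]   # perf per $1000
    per_100_w = 100 * c["rel_perf"] / c["board_power_w"]   # perf per 100 W
    print(f"{name}: {per_1000_usd:.2f} perf/$1000, {per_100_w:.2f} perf/100 W")
```

On those rough numbers the RX 5700 XT delivers close to 60 percent more performance per dollar and about 20 percent more performance per Watt, a gap custom boards would only widen.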
Source:
Cowcotland
123 Comments on AMD Retires the Radeon VII Less Than Five Months Into Launch
Hopefully they will keep supporting it with driver updates, otherwise I will be screwed. :ohwell:
If HBM is such a flop, why are both AMD and NV still selling their top-tier cards with it? Add-in cards that cost more than most people's entire gaming system, for just one card. Why are chips with it going into self-driving cars?
Sure, it is not needed for gamers, because gaming GPUs are still not powerful enough to require that bandwidth. But they will be soon. Who knows if GDDR will catch up by that point?
How long will it be before HBM is in CPUs with 3D-stacked GPUs, and a whole gaming system is in a single chip?
Who knows what will come of it, but AMD has for sure helped it get where it is.
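For reference, the bandwidth gap being argued about here is easy to put numbers on. A quick sketch using the published bus widths and per-pin data rates of the two cards from the article; peak bandwidth is just bus width times data rate:

```python
# Peak memory bandwidth = bus_width_bits * per_pin_rate_gbps / 8 (GB/s).
# Figures below are the published memory specs of each card.

def bandwidth_gbs(bus_width_bits: int, rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits * rate_gbps / 8

# Radeon VII: four HBM2 stacks, 1024 bits each, 2.0 Gbps per pin
print(f"Radeon VII (HBM2):  {bandwidth_gbs(4 * 1024, 2.0):.0f} GB/s")
# RX 5700 XT: 256-bit GDDR6 at 14 Gbps per pin
print(f"RX 5700 XT (GDDR6): {bandwidth_gbs(256, 14.0):.0f} GB/s")
```

That works out to roughly 1 TB/s for the Radeon VII versus 448 GB/s for the RX 5700 XT, so the headroom is real even if games don't need it yet.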
As for NV and RTX, I'm guessing you're just a salty fan boi for bringing it up, tbh.
I think they did good in pushing RTRT; it is no doubt the future for gaming and CGI. I never gave them stick for it, just for the way they acted like they created it, totally ignoring the fact it has been around since Half-Life 2...
Vega is not a failure; it was designed for multiple use cases and is competitive in a few. Many bought them and are happy with them, as the iMac Pro owners will be with a better version of it.
I am not surprised; to me the Radeon VII was always the stopgap answer AMD's core fans wanted.
And yes, hopefully that Sapphire 5900/5800/5700/5600 rumour has weight; it sure would help nudge me away from the lowly Vega 64 I'm struggling with (sarcasm, that bit).
I really don't appreciate being called salty though, in particular when it's pretty far from the truth and I was just responding to your post. :)
I mean, you know, yet you don't "know" exactly what it will do, how well it will hold up, how it will perform... All of this is, like anything else, growing pains with 7 nm, which is expected no matter who makes it.
The thing now is to see if they can do better with the next release - and they probably will; the question is, how much better?
We'll find out soon enough.
The only logical compute uses for these GCN+HBM cards are crypto mining or whatever super-niche applications where AMD actually bothered to polish its software support enough for normal use.
I owned an HD2900...
my heart goes out.
Literally the worst ATi GPU ever.
Sorry, I didn't mean to offend you. I don't know you well enough to really judge your stance on that. But you must understand my point of view too? By that I mean that bringing team green up in a team red discussion does tend to be more of a fan boi move.
Personally I like to beat on them all when they take the piss, and we all know they all do it, be it rebrands, false specs, or the old bait and switch. That's why I don't put as much weight on release-day reviews as I do on ones where reviewers get retail items themselves.
Anyway, that is off the beaten path for the thread, but wanted to clarify why I said that.
The Radeon VII was just a cry from AMD: "do not forget us, we're still here".
The card never made much sense; I don't blame AMD for trying to get away from it as soon as possible!
They would be torching Nvidia's shiny new headquarters if this was an Ngreedia card.
The spin here is... good news! AMD are lovely to their customers! This must mean something better is coming!
Nvidia can't buy that mindshare for love nor money.
AMD had to show something in response to the Turing launch; Vega was falling further and further behind, so that's what AMD got to show off! And there's no harm in it; that's what they had!
And the fact that they retired the card after 5 months further reinforces the idea that it was a card made only to make a statement, nothing more!
With the EOL of the Radeon VII, AMD opens the door for the 5800, 5800 XT, 5800 XTX, 5900 XT and 5900 XTX. These are the cards that will likely hit the market next, and I say bring it on.
That's an opinion. For the performance offered, its price was competitive with NVidia's offering. NVidia didn't raise the prices on RTX; after release they only went down in price. I was watching very closely. They were behind only in the top-tier market, and the Radeon VII fixed that with a card that hit RTX 2080 levels of performance for less money. In the mid-range and budget markets AMD beat NVidia's offerings handily, and still does.
Agreed, and it was a good showing. It was a card that filled a gap, and maybe you're right, it did show one thing: AMD can compete with the upper range. If they had upped their R&D game and released expanded versions of the Radeon VII, they could have competed with the 2080 Ti and even the RTX Titan, in similar price tiers.
A lot of people want a card like THAT if they are going to drop the Radeon VII. Even 12GB isn't enough for a lot of VIDEO EDITING users; Digital Foundry talks about this. With PCIe v4.0 boosting performance in some applications (not games), that's a card AMD should be selling. Just add another 8GB of VRAM and problem solved. Will AMD do this, or just leave a hole in the market for NVidia to fill?
Wendell's buddy already noted that the RX 5700 XT beat the RTX 2080 Ti in his productivity test, though he admitted he didn't compare it against PCIe v3.0. Either way it beat the RTX 2080 Ti, and since the RTX 2080 Ti doesn't support PCIe v4.0, it can't pull ahead in those tests. A perfect setup for higher-end productivity would then be an R9 3900X (12-core), an RX 5700 XT 16GB card, 64GB of DDR4-3600 CL16, and an X570 motherboard with no goddamn chipset fan (a good heatsink instead).
The Radeon VII is already a +/-300 W card; to gain 30% more performance, what monstrous power consumption would it have?
Not to mention the problem of cooling a card with so much power draw!
I can see that happening with Navi, but with the Vega architecture I just think it would be a waste of time for AMD.
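As a rough way to frame that question, here is a sketch with two naive scaling models; both are illustrative assumptions about how power scales, not real silicon data:

```python
# How much power might a hypothetical +30% Vega part need?
# Two naive models, purely illustrative.

BASE_POWER_W = 300   # Radeon VII's roughly 300 W board power, per the post
PERF_TARGET = 1.30   # +30% performance

# Model 1: widen the GPU (more CUs at the same clock and voltage);
# power then scales roughly linearly with active units.
wider_die_w = BASE_POWER_W * PERF_TARGET

# Model 2: clock the same die higher. Dynamic power ~ C * V^2 * f, and
# voltage has to rise with frequency, so power grows roughly with f^3.
higher_clock_w = BASE_POWER_W * PERF_TARGET ** 3

print(f"Wider die (linear model):    ~{wider_die_w:.0f} W")
print(f"Higher clocks (cubic model): ~{higher_clock_w:.0f} W")
```

Even the friendly linear model lands near 390 W, and clock scaling alone blows well past 600 W, which is why a bigger Vega never looked attractive.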
www.evga.com/products/specs/gpu.aspx?pn=C83BF35F-63BE-4DAA-9B7F-0CB8DAEA1AF6
That was a monstrosity of a card at 370 W of power draw under load. I expected my power bill to shoot up. It did not: from one month to the next, my bill went up by a whole $1, and that was in a summer month with the central air running. Into the winter there was very little impact on the power bill. So this constant argument (it's not just you) about power usage needs to stop. It's worthy of consideration until that consideration is compared to performance; then it needs to be ignored. It's not as big a problem as you'd think. All I'm saying is that they could have done it, and with Navi they still can. It makes more sense. If I were AMD, I'd pull out all the stops and build an RTX Titan killer. They can do it, and likely at a much better price to the end user while still making a solid profit.
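The arithmetic actually backs that anecdote up. A quick sketch; the hours of use, the electricity rate, and the old card's draw are all assumptions on my part, only the 370 W figure comes from the post:

```python
# What a 370 W card adds to a monthly power bill, relative to the card
# it replaced. Everything except NEW_CARD_W is an assumed value.

NEW_CARD_W = 370          # load draw from the post
OLD_CARD_W = 250          # assumed draw of the previous card
HOURS_PER_DAY = 2.0       # assumed daily gaming time
RATE_USD_PER_KWH = 0.13   # assumed typical US residential rate

delta_kwh_month = (NEW_CARD_W - OLD_CARD_W) / 1000 * HOURS_PER_DAY * 30
print(f"Extra energy: {delta_kwh_month:.1f} kWh/month")
print(f"Extra cost:   ${delta_kwh_month * RATE_USD_PER_KWH:.2f}/month")
```

Under those assumptions the increase is about $0.94 a month, right in line with the $1 bump described above.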
What I'm waiting to see is AMD's answer to RTX's RTRT. Come on AMD, get on it!