Monday, May 27th 2019
AMD Announces Radeon RX 5700 Based on Navi: RDNA, 7nm, PCIe Gen4, GDDR6
AMD at its 2019 Computex keynote today unveiled the Radeon RX 5000 family of graphics cards that leverage its new Navi graphics architecture and 7 nm silicon fabrication process. Navi isn't just an incremental upgrade over Vega with a handful of new technologies, but the biggest overhaul to AMD's GPU SIMD design since Graphics Core Next, circa 2011. Called RDNA or Radeon DNA, AMD's new compute unit is a clean-slate SIMD design with a 1.25X IPC uplift over Vega, an overhauled on-chip cache hierarchy, and a more streamlined graphics pipeline.
In addition, the architecture is designed to increase performance-per-Watt by 50 percent over Vega. The first part to leverage Navi is the Radeon RX 5700. AMD ran a side-by-side demo of the RX 5700 versus the GeForce RTX 2070 in Strange Brigade, where NVIDIA's $500 card was beaten. "Strange Brigade" is one game where AMD generally fares well, as it is heavily optimized for asynchronous compute. Navi also ticks two big technology check-boxes: PCI-Express gen 4.0 and GDDR6 memory. AMD has planned July availability for the RX 5700, and did not disclose pricing.
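As a rough illustration (not an official breakdown), the two headline numbers can be related: if performance scales as IPC times clock, the claimed 50 percent performance-per-Watt gain implies some additional clock-per-Watt improvement from the 7 nm process and physical design on top of the 1.25X IPC uplift. A minimal sketch of that arithmetic:

```python
# Back-of-the-envelope decomposition of AMD's headline Navi numbers.
# Illustrative only; AMD does not disclose the exact split between
# architectural IPC gains and process/design (clock-per-watt) gains.

ipc_uplift = 1.25        # claimed RDNA IPC gain over Vega (GCN)
perf_per_watt = 1.50     # claimed performance-per-watt gain over Vega

# If perf/W = (IPC x clock) / power, the residual factor is whatever
# clock-per-watt improvement is needed on top of the IPC gain.
residual_clock_per_watt = perf_per_watt / ipc_uplift
print(f"Implied clock-per-watt factor: {residual_clock_per_watt:.2f}x")  # ~1.20x
```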
202 Comments on AMD Announces Radeon RX 5700 Based on Navi: RDNA, 7nm, PCIe Gen4, GDDR6
Sure, Strange Brigade is a "ringer" and an AMD architecture poster child, but it's also a good all-around projection of the capabilities of Vulkan/DX12 "API overhead", and I'm sure that's why AMD leads with it. They're promoting the obvious: there are game engines out there that can unleash their particular architecture design. Nothing wrong about that...
It is interesting that what has been AMD's second-tier mainstream offering is working over (sure, in the one title) a part NVIDIA has promoted as more-or-less entry enthusiast, while AMD's "70-series" parts have been tasked as "entry mainstream", more akin today to the GTX 1660. If the RX 5700 actually is 20% behind a 2070, that still puts it somewhere between the Vega 56 and 64, and I would consider there's still an RX 5800 (full part) out there.
I figured Computex was just the top-level design and architectural keynote, and honestly I'm not taking the "marketing jargon" or any of this at face value. That said, I think AMD has a good blueprint and executed this Navi release with a fairly clear-cut strategy while holding to the schedule (we'll wait to see how well they can fill the channel). They'll have more information to drop at E3 (June 11-13), but I don't think we'll learn a lot with three weeks until the actual NDA release on July 7th.
This also means your interpretation of the AMD naming scheme is not correct. Since the release of the RX 480 there have been new names, but most of that has been rebrands or very minor improvements. Navi's new naming has no relation whatsoever to performance or place in the stack really; it's just taking a look at NVIDIA and slotting in on the right number. There will be a bigger Navi, but what it will do is a mystery, and AMD no longer has a structure you can rely on with their product stack. Gone are the HD-xx50 / xx70 days.
The key point being, we have no idea what the bigger chip will perform like.
With NVIDIA, prior to Turing (but even now, really), while they do use multiple SKUs, these are almost all straight scaled versions of each other. Sometimes some trickery is applied (asymmetrical VRAM setups, usually found on midrange, not just the GTX 970; Fermi and Kepler had them too), but you won't see a split halfway through the stack using radically different tech. Turing is the exception with its RT components.
It would throw a lot of work out of the window.
I agree the NVIDIA 70-series has always been in a completely different product stack. Sure, we can't say the RX 5700 is more "akin" to what has been the mainstream gelding-size chip/offering (aka RX 570, R7 270, 7850), but what if that's exactly what it is?
I believe we have no idea where the "RX 5700" aligns in AMD's product stack, or if it's supposed to be a part that actually contests NVIDIA's "entry enthusiast" offering. The number means nothing; it's just a placeholder for some version of Navi that scrimmages with a 2070 in Strange Brigade... It means nothing until we know.
There is really no telling. I'm quite sure they can pull a bigger Navi 20 out of this node that performs a good margin above this one - but how big of a margin? 30%? 50%? And even an optimistic scenario would give them only slightly under or over 2080 Ti performance. On the other hand, we haven't seen any AMD GPU surpass GTX 1080 Ti performance, and that card has been out there for quite some time now. So far, even Navi stalls at roughly the same perf level as Vega 56. In reality, all we've really seen thus far is rebadged Vega performance - even Radeon VII is just a Vega shrink. Navi's biggest achievement is the move to GDDR6.
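For what it's worth, the "slightly under or over 2080 Ti" guess follows from simple scaling arithmetic, assuming the RX 5700 lands roughly at RTX 2070 level (per the Strange Brigade demo) and that the RTX 2080 Ti averages somewhere around 40-50% faster than the RTX 2070; both figures are assumptions, not from the article:

```python
# Rough scaling check behind the "slightly under or over 2080 Ti" estimate.
# Assumptions: RX 5700 ~= RTX 2070 (normalised to 1.0), and the RTX 2080 Ti
# is roughly 40-50% faster than the RTX 2070 depending on resolution.

rx5700 = 1.00
rtx2080ti_low, rtx2080ti_high = 1.40, 1.50

for big_navi_gain in (1.30, 1.50):   # the 30% / 50% scenarios from the comment
    big_navi = rx5700 * big_navi_gain
    print(f"+{(big_navi_gain - 1) * 100:.0f}% over RX 5700 -> {big_navi:.2f}x "
          f"(RTX 2080 Ti ~ {rtx2080ti_low:.2f}-{rtx2080ti_high:.2f}x)")
```

Under those assumptions, +30% still trails the 2080 Ti while +50% roughly reaches it, which is where the comment's range comes from.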
This is the pessimistic version of AMD's roadmap, though, and it's based on what we've seen the past few years. Given Zen's success, who knows - things may get better.
For the second part, I guess we'll know in a couple of weeks? I've got my fingers crossed.
The thing is, HDMI 2.1 comes with VRR. And since that's probably incompatible with whatever magic AMD worked to implement their own VRR over HDMI, it could be a problem to implement.
-"Ultra High Speed"--GPU has to be able to produce 48 Gbps signal (Navi doesn't target that market).
-Dynamic HDR--don't think DisplayPort supports this. It will take a lot of R&D to implement.
-Enhanced Audio Return Channel--not sure how difficult this is to implement. Dolby isn't exactly popular on computers: huge preference towards uncompressed PCM which is lossless. It might require paying Dolby too which could mean NVIDIA/AMD/Intel will never be compliant here.
I think VRR, at least for AMD, is an easy one. They can probably make it HDMI 2.1 compliant with a driver patch because GCN apparently has a lot of granularity control over its HDMI protocol. Everything else is theoretically pretty easy (low latency) or already done (DSC).
Remember, GPUs in general are usually quite a ways behind TVs in implementing HDMI standards. HDMI was always designed to put the burden of design on the source, not the destination.
Only one of the four HDMI 2.1 ports on the 2019 OLEDs has eARC support, which should mean this doesn't have to be on the list for GPU manufacturers to support 2.1.
Worried about 48 Gbit/s? Well, that fancy PCIe 4.0 connector can do 256 Gbit/s, so I highly doubt bandwidth is the problem holding this back.
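For reference, here is where those two figures come from; raw link rates, ignoring protocol overhead beyond line encoding:

```python
# Where the bandwidth numbers in the comment come from (raw link rates).

# HDMI 2.1 FRL: 4 lanes at 12 Gbit/s each
hdmi21_frl = 4 * 12
print(f"HDMI 2.1 FRL:  {hdmi21_frl} Gbit/s")                 # 48 Gbit/s

# PCIe 4.0 x16: 16 lanes at 16 GT/s each; 128b/130b encoding
pcie4_raw = 16 * 16
pcie4_payload = pcie4_raw * 128 / 130
print(f"PCIe 4.0 x16:  {pcie4_raw} Gbit/s raw, "
      f"~{pcie4_payload:.0f} Gbit/s after 128b/130b encoding")  # 256 / ~252 Gbit/s
```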
HDR is already supported, and there's no differentiation between different HDR styles in Windows yet. Since this is software-level, it would make sense that it could arrive as a later software update, and the connector itself wouldn't be held back in the meantime.
GPUs, in the past, have been far, far ahead of what the vast majority of hardware on the market was capable of. I still remember my TNT2 Ultra that supported 240 Hz, and that was in the late 90s. It could also do 1920x1200. In the 90s. The first 1080p TVs (that I remember) came out just before the PS3 in 2006. That means GPUs were ahead by at least 7-8 years back then compared to where TVs were.
Again, Arcturus will most likely support HDMI 2.1. Navi will not.
I don't think it can do much over 2K (2048x1080)... pretty sure it can't reach 2560x1440 at 60 Hz?
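A quick pixel-clock estimate shows why that cutoff is plausible, assuming the output in question is limited to a 165 MHz pixel clock (single-link DVI or pre-1.3 HDMI); the ~20% blanking overhead is a rough assumption, real CVT/CVT-RB timings differ:

```python
# Rough pixel-clock requirements for the modes mentioned above.

def pixel_clock_mhz(h_active, v_active, refresh_hz, blanking_overhead=1.20):
    """Very rough estimate; actual CVT/CVT-RB blanking differs."""
    return h_active * v_active * refresh_hz * blanking_overhead / 1e6

for w, h, hz in ((2048, 1080, 60), (2560, 1440, 60)):
    print(f"{w}x{h}@{hz} Hz ~ {pixel_clock_mhz(w, h, hz):.0f} MHz pixel clock")
# ~159 MHz vs ~265 MHz: a 165 MHz single-link DVI / early HDMI output fits the
# first mode but not 2560x1440@60.
```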
CRTs were never digital in the first place so the signal had to be converted to analog at some point (either in the GPU or in the display).
No reason why an 8K CRT couldn't be made today that accepts a DisplayPort or HDMI connector and RAMDACs it into VGA internally. Would look better than sending analog over VGA anyway because less noise.
Anyway, I don't expect it, but those Zen+ APUs could have Navi inside.
If you mean HDMI 2.1, it was announced in January 2017 and released in November the same year.
HDMI 2.1 TVs are on the market right now - LG "9" series.
And as Ford said above, the time from when a new HDMI standard is launched until it reaches PC hardware has always been very long. That seems to be how the HDMI consortium works.

No, they don't. They're already out in laptops. The die is known, the GPU spec is known, and it's Vega 10 with a clock bump. If they were Navi, this would show in drivers (in particular: in needing entirely bespoke drivers). Of course, the fact that they haven't launched the desktop APUs yet makes me slightly hopeful that they'll just hold off until the next generation of MCM APUs are ready some time in (very) late 2019 or early 2020 - once there's a known good Navi die that will fit the package, available in sufficient quantities that it won't gimp GPU sales. Frankly, I'd prefer that over a clock-bumped 3200G/3400G. Maybe they could even bring the model names and CPU architectures in line by doing this?
Amd/comments/but81o
www.sweclockers.com/nyhet/27618-amd-radeon-rx-5000-ar-hybrid-med-inslag-av-gcn-renodlad-rdna-forst-ar-2020
Navi die:
nl.hardware.info/nieuws/65723/computex-rx-5000-serie-van-amd-wordt-hybride-van-gcn-en-rdna-pure-rdna-komt-met-navi-20
Worth clarifying for the non-Swedophones(?) out there: according to this, Navi 20 ("big Navi") is supposed to be "pure" RDNA, and launch in early 2020. In other words, this is not a "half now, half next generation" situation as the title might make it seem. Still odd to make a hybrid like this, but I guess the architectures are modular enough to plug-and-play the relevant blocks on a driver level as well. This also clarifies the kinda-weird mismatch between RDNA being the architecture for "gaming in the next decade" while there being a "next-gen" arch on the roadmaps for 2020.
I wonder what implications this might have for performance and driver support. One might assume that these first cards will lose driver support earlier, but then again considering how prevalent GCN is I can't see that being for another 5 years or so anyway, by which time they'll be entirely obsolete. Performance enhancements and driver tuning might taper off more quickly, though, unless the relevant parts are RDNA and not GCN.