Monday, June 3rd 2019
Rumor: AMD Navi a Stopgap, Hybrid Design of RDNA and GraphicsCoreNext
The Wheel of Rumors turns, and assumptions come and pass, sometimes leaving unfulfilled hopes and dreams. In this case, the rumor mill - in what seems like a push from Sweclockers - places Navi not as a "built from the ground up" architecture, but rather as a highly customized iteration of GCN; iterated, to be exact, in the parts where it actually implements AMD's RDNA architecture. And this makes sense for a number of reasons - it's certainly nothing to cry wolf about.
For one, AMD's GCN has been a mainstay in the graphics computing world since it was first introduced back in 2012, succeeding the company's TeraScale architecture. Game engines and assorted software have already been well optimized to take advantage of AMD's design - even across its two ISAs and assorted improvements over the years. One of the most important arguments derives from this optimization effort: AMD's custom designs for the console market employ GCN-based architectures, and thus any new architecture used by both Microsoft and Sony for their next-generation consoles would have to be strictly backwards compatible.

Of course, as the Xbox One X shows, you can brute-force compatibility with previous-gen games via a software emulator (one that sometimes presents better results than the originals). However, that wouldn't cut it for the next generation of consoles - developing software emulation able to render some current-gen games at their full level of detail (think some of the 4K-rendered titles on Xbox One X, for example) would likely be an insurmountable task. Backwards compatibility at the hardware level would always be a prerequisite, and I bet you anything that AMD made sure to use Microsoft and Sony's funding to develop the RDNA design they're partly employing with Navi - the GPU that will power the next generation of consoles.
This means that AMD's Navi keeps its GCN bloodline in a way that allows it - I'd wager - almost completely automatic backwards compatibility with current-gen titles. At the same time, the fact that Navi already employs parts of AMD's RDNA means developers will need to learn to code for the new architecture - easing their way into it, rather than facing a completely different GPU design all at once. Everybody wins, really: Sony and Microsoft get to keep their backwards compatibility; AMD uses their funding to develop its custom hardware, including part of the RDNA architecture; and developers keep compatibility with AMD's design whilst having new architectural toys to play with.

Another argument is that AMD could be looking to completely separate its graphics and compute architectures - the company does say that RDNA was developed with gaming workloads in mind, which suggests it won't be making the same strides it has in past years when it comes to compute workloads. And we all know that's where the real money lies - should AMD be able to break into that market with enough force, of course.
Speculation, certainly. But it seems we may have to wait until 2020 to see AMD's RDNA architecture in its full-glory implementation - and it'll come to PC first. Until then, nothing is stopping the hybrid Navi from being exactly what gaming ordered.
Source:
Sweclockers
30 Comments on Rumor: AMD Navi a Stopgap, Hybrid Design of RDNA and GraphicsCoreNext
I am not going to say that it is definite, but the fact that console games have been made to run on AMD hardware for the last however many years means we will begin to see better optimization for multi-core, especially now that the PS5 will basically be a Ryzen-based Navi part. Shouldn't that mean that, now more than ever, we should start to see more Strange Brigades and AOTSes to really show what AMD is capable of? Microsoft is the key, though. I would like to think that Navi will support 4K 60 Hz, as that is what most TVs in living rooms with PS5s will have - at least 60 Hz, but perhaps 120 Hz or 240 Hz refresh rates too.
I see RDNA as just a new group of basic "nucleotides" (building blocks) that can be architecturally arranged as needed. I think what this first Navi uses is not everything (all the blocks) they have, but the parts and pieces that are functionally able to coexist (memory, cache hierarchy, latency improvements), alongside a probably less-than-all-new arrangement of Compute Unit blocks built specifically for gaming loads (more optimized GCN), with other blocks reserved for professional, calculation, HPC, and AI work.
I don't think we will fully see what RDNA comprises until after a full Navi comes for gaming. It was more that Rory turned up production to satisfy the first mining boom, and then just as quickly came the bust (bye, Rory). Incoming Lisa Su had to clear them off the books, so they discounted them and people bought them. They didn't hold back Hawaii, which was a hard sell when it came, and there was little presence of low-cost 1440p panels that most thought would support it, so it was an expensive outlay. Many held to 1080p, and for that Tahiti was more than sufficient.
Ah: the article referenced U.S. patent application 20180357064.
AMD sticking to compute helped them during the mining boom era, so they made back some of the lost cash. But that was just a fluke.
/techindustrysummary
Brought to you by "AMD never undercut competitors like that".
The ironic part is, "they" actually "knew".
It doesn't matter what comes out, as the outcome is predefined, it's only the excuses to come to that conclusion that need to be clarified.
Mainboards from 2017 do not support PCIe4? Bad, baaaad, AMD, how dare you?
But the "it's still GCN" is the loveliest and the strongest of them all.
Even stronger than "buh mah drivars", as someone can call out BS on that.
Microarch, on the other hand - how the f*ck does anyone prove or disprove it? Or even clarify why it's bad to begin with?
The 570 wipes the floor with the 1050, 1050 Ti, and 1650, and is cheaper? Ah, but it's "still GCN". Oh, and my grandma's friend's husband's neighbor has a 15-year-old Dell which can feed a GPU only via PCIe.