Monday, June 3rd 2019

Rumor: AMD Navi a Stopgap, Hybrid Design of RDNA and GraphicsCoreNext

The Wheel of Rumors turns, and assumptions come and pass, sometimes leaving unfulfilled hopes and dreams. In this case the rumor mill, in what seems like a push from SweClockers, places Navi not as a "built from the ground up" architecture, but rather as a highly customized iteration of GCN - one that implements AMD's RDNA architecture only in parts, to be exact. And this makes sense for a number of reasons - it's certainly nothing to cry foul about.

For one, AMD's GCN has been a mainstay of the graphics computing world since it was first introduced back in 2012, succeeding the company's TeraScale architecture. Game engines and assorted software have already been well optimized to take advantage of AMD's design - even across its various ISA revisions and assorted improvements over the years. One of the most important arguments derives from this optimization effort: AMD's custom designs for the console market employ GCN-based architectures, and thus any new architecture used by both Microsoft and Sony for their next-generation consoles would have to be strictly backwards compatible.
Of course, with the Xbox One X, you can brute-force compatibility with previous-gen games via a software emulator (one that sometimes delivers better results than the originals). That wouldn't cut it for the next generation of consoles, however - developing a software emulation layer able to render some current-gen games at their level of detail (think some of the 4K-rendered titles on Xbox One X, for example) would likely be an insurmountable task. Backwards compatibility at the hardware level was always going to be a prerequisite, and I'd bet you anything that AMD made sure to use Microsoft and Sony's funding to develop the RDNA design it is partly employing with Navi - the GPU that will power the next generation of consoles.

This means that AMD's Navi keeps its GCN bloodline in a way that - I'd wager - gives it almost completely automatic backwards compatibility with current-gen titles. However, the fact that Navi already employs AMD's RDNA means that developers will also need to learn to code for the new architecture - easing their way into it, rather than facing a completely different GPU design all at once. Everybody wins, really: Sony and Microsoft keep their backwards compatibility; AMD uses their funding to develop its custom hardware, including part of the RDNA architecture; and developers keep compatibility with AMD's design whilst getting new architectural toys to play with.
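To make that idea a bit more concrete, here is a toy model - purely illustrative, not AMD code, and with the wave widths and instruction names invented for the sketch - of how the same instruction stream can keep producing identical results while the machine underneath changes how it batches work:

# Toy model: one fixed "ISA" (the instruction list) and two interchangeable
# implementations that differ only in how wide their thread batches are.
# Everything here is invented for illustration; it is not AMD's GCN or RDNA design.

INSTRUCTIONS = [("add", 1.0), ("mul", 2.0), ("add", 0.5)]  # the "ISA contract"

def execute(values, wave_width):
    """Run the same instruction stream, batching 'threads' wave_width at a time."""
    results = list(values)
    for start in range(0, len(results), wave_width):
        wave = range(start, min(start + wave_width, len(results)))
        for op, operand in INSTRUCTIONS:  # the identical ISA on every implementation
            for i in wave:
                if op == "add":
                    results[i] += operand
                elif op == "mul":
                    results[i] *= operand
    return results

data = [float(i) for i in range(8)]
old_machine = execute(data, wave_width=4)  # "old" implementation: wider batches
new_machine = execute(data, wave_width=2)  # "new" implementation: narrower batches
assert old_machine == new_machine          # same ISA, same results, different machine underneath

The analogy only goes so far, of course, but it captures why existing titles could keep running unchanged while developers still have new performance behavior to learn and tune for.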
Another argument is that AMD could be looking to completely separate their graphics and compute architectures - they do say that RDNA was developed with gaming workloads in mind, which means that they won't be making the same strides they've been in the past years when it comes to compute workloads. And we all know that's where the real money lies - should AMD be able to break into that market with enough force, of course.

Speculation, certainly. But it seems we may have to wait until 2020 to see AMD's RDNA architecture implemented in all its glory - and it will come to PC first. Until then, nothing is stopping the hybrid Navi from being exactly what the gaming market ordered.
Source: SweClockers

30 Comments on Rumor: AMD Navi a Stopgap, Hybrid Design of RDNA and GraphicsCoreNext

#1
Aldain
Speculation by the Swedes. :D
#2
Lightofhonor
Probably should put RUMOR in the headline.
#3
Raevenlord
News Editor
Lightofhonor: Probably should put RUMOR in the headline.
It's literally, literally, the fourth word of the news piece, but I understand the problem with people only reading the title before clicking. 'Tis not clickbait; people just jump to conclusions.
#4
medi01
#whateverthatmeans
Raevenlord: However, the fact that Navi already employs AMD's RDNA means that developers will also need to learn to code for the new architecture...
Because GCN as an instruction set is no longer supported... and why would AMD do that?
#5
iO
AMD's bridgman: "You could call it a hybrid but not in that sense... we used to talk about GCN as an ISA, but it seems that most people outside AMD think of GCN as a micro-architecture instead (ie an implementation of the ISA). RDNA is GCN ISA but not what you think of as GCN architecture."
#6
Assimilator
How is this news? We've known Navi is gonna be trash for months...
#7
bug
And this makes sense for a number of reasons - it's certainly nothing to cry foul about.

For one, AMD's GCN has been a mainstay of the graphics computing world since it was first introduced back in 2012
I wonder what you would have written if, instead of Core, Intel had given us a refined mainstay NetBurst.
#8
kapone32
It is amazing that at one time there was little traffic about AMD, but man, are they in the spotlight now!!
#9
bug
kapone32: It is amazing that at one time there was little traffic about AMD, but man, are they in the spotlight now!!
Well, there was a lot of traffic about AMD back in the Athlon days (in its various incarnations). But then AMD went Bulldozer & friends...
#10
kapone32
bug: Well, there was a lot of traffic about AMD back in the Athlon days (in its various incarnations). But then AMD went Bulldozer & friends...
I know but that was just CPUs.
#12
bug
kapone32: I know but that was just CPUs.
There was a lot of traffic about ATI, too, back in the Radeon 9000 series days. Anyone remember which 9500 was guaranteed to unlock into a 9700? ;)
#13
kapone32
kapone32: I know but that was just CPUs.
Tahiti was a definite hit. It was so good that many AMD fanboys did not see the need to change cards, which hurt AMD's bottom line.
Assimilator: How is this news? We've known Navi is gonna be trash for months...
How can you even substantiate a statement like that?

I am not going to say that it is a definite, but the fact that console games have been made to run on AMD hardware for however many years now means we will begin to see better optimization for multi-core, especially now that the upcoming PS5 will basically be a Ryzen-based Navi part. Should that not mean that, now more than ever, we should start to see more Strange Brigade- and AotS-style titles that really show what AMD is capable of? Microsoft is the key, though. I would like to think that Navi will support 4K 60 Hz, as that is what most TVs in living rooms with a PS5 will have - at least 60 Hz, but perhaps 120 Hz or 240 Hz refresh rates too.
#14
Casecutter
iO: AMD's bridgman: "You could call it a hybrid but not in that sense... we used to talk about GCN as an ISA, but it seems that most people outside AMD think of GCN as a micro-architecture instead (ie an implementation of the ISA). RDNA is GCN ISA but not what you think of as GCN architecture."
There are naysayers... there always will be... but it's still up in the air whether AMD/RTG has found a new way of looking at a "micro-architecture" while still making use of the ISA.

I see RDNA as just a new group of basic "nucleotides" (building blocks) that can be architecturally arranged as needed. I think what this first Navi uses is not everything (every block) they have, but the parts and pieces that are functionally able to coexist (memory, cache hierarchy, latency improvements), alongside a probably less-than-all-new arrangement of Compute Unit blocks built specifically for gaming loads (more optimized GCN), with other blocks reserved for professional, compute, HPC, and AI work.

I don't think we will fully see what RDNA comprises until a full Navi arrives for gaming.
kapone32Tahiti was a definite hit. It was so good many AMD fanboys did not see the need to change cards which hurt AMD's bottom line.
It was more that Rory turned up production to satisfy the first mining boom, and then just as quickly came the bust (and bye, Rory). Incoming Lisa Su had to clear them off the books, so they were discounted and people bought them. They didn't hold Hawaii back, which was a hard sell when it came out, as there was little presence of low-cost 1440p panels that most thought would be needed to support it, so it was an expensive outlay. Many held to 1080p, and for that Tahiti was more than sufficient.
#15
quadibloc
I don't find that rumor too far-fetched. A while back, I remember reading an article on a tech news site about an AMD patent that would make AMD's GPUs comparable in efficiency to those from NVIDIA. Right now, AMD has to include expensive fancy hardware, like HBM, on its video cards - without raising the price to compensate - in order to match NVIDIA's performance, as the CEO of NVIDIA was impolite enough to point out a while back. So once it gets a design based on that tech ready, I think AMD won't hesitate to switch to it to lower production costs.

Ah: the article referenced U.S. patent application 20180357064.
#16
R-T-B
Casecutter: There are naysayers... there always will be...
That's not a naysayer... it's an AMD employee. And his post really clarifies things in a positive way.
medi01: #whateverthatmeans
Pretty ironic as that's basically every post from you in a nutshell.
#17
bug
R-T-B: That's not a naysayer... it's an AMD employee. And his post really clarifies things in a positive way.
IIRC Bridgman comes from ATI, so he's not a noob either.
#18
kapone32
Casecutter: There are naysayers... there always will be... but it's still up in the air whether AMD/RTG has found a new way of looking at a "micro-architecture" while still making use of the ISA.

I see RDNA as just a new group of basic "nucleotides" (building blocks) that can be architecturally arranged as needed. I think what this first Navi uses is not everything (every block) they have, but the parts and pieces that are functionally able to coexist (memory, cache hierarchy, latency improvements), alongside a probably less-than-all-new arrangement of Compute Unit blocks built specifically for gaming loads (more optimized GCN), with other blocks reserved for professional, compute, HPC, and AI work.

I don't think we will fully see what RDNA comprises until a full Navi arrives for gaming.

It was more that Rory turned up production to satisfy the first mining boom, and then just as quickly came the bust (and bye, Rory). Incoming Lisa Su had to clear them off the books, so they were discounted and people bought them. They didn't hold Hawaii back, which was a hard sell when it came out, as there was little presence of low-cost 1440p panels that most thought would be needed to support it, so it was an expensive outlay. Many held to 1080p, and for that Tahiti was more than sufficient.
The mining boom was more the realm of Polaris and used Tahiti and Hawaii cards. Those cards were released in 2012 as the 7970 and 7950, in the era of $199 GPUs and Korean 1440p screens. AMD cards have been suffering since the 780 Ti launched, but the 970 and 1080 were what did them in.
#19
bug
kapone32: The mining boom was more the realm of Polaris and used Tahiti and Hawaii cards. Those cards were released in 2012 as the 7970 and 7950, in the era of $199 GPUs and Korean 1440p screens. AMD cards have been suffering since the 780 Ti launched, but the 970 and 1080 were what did them in.
Yup. Back when TSMC's 20nm node failed and everybody was stuck on 28nm, AMD kept their compute resources and tried to parallelize until they couldn't feed their hardware, while Nvidia (wisely, imho) decided to cut back on compute hardware that does little for gaming. Nvidia figuring out TBR was just the nail in GCN's coffin.
AMD sticking to compute helped them during the mining boom era, so they made back some of the lost cash. But that was just a fluke.
#20
Unregistered
This year's tech is a stopgap for next year's tech.

/techindustrysummary
#21
dicktracy
Probably the result of combining Raja's work with their own :roll:
#22
Fluffmeister
So it's still GCN, but with cheese on top (hence the name).
#23
Minus Infinity
It was always known Navi+ is the game changer for AMD. I'm not ruling out a Navi card if price and power consumption are low and it can compete at least at 2070 levels, but we'll see.
#24
quadibloc
bug: Nvidia figuring out TBR was just the nail in GCN's coffin.
Did a search. Found out that Nvidia didn't disclose that tile-based rendering was what they used - outside hackers figured that out themselves. And TBR is a standard technique, long used for low-power GPUs like those in mobile phones. So, since that means there's no patent stopping them, why on Earth hasn't AMD gotten around to using TBR on their GPUs?
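For readers unfamiliar with the term: tile-based rasterization roughly means binning triangles into small screen tiles and then shading one tile at a time, so the working set stays in fast on-chip memory. Below is a minimal toy sketch of the binning step - it uses axis-aligned bounding boxes as a stand-in for a real triangle/tile overlap test and does not reflect any vendor's actual implementation:

# Toy tile binner: assigns triangles (represented by their 2D bounding boxes,
# for simplicity) to the screen tiles they touch. Real GPUs test actual triangle
# edges and do this in fixed-function hardware; this is only an illustration.

TILE = 32  # tile size in pixels (arbitrary choice for the sketch)

def bin_triangles(triangles, width, height):
    """triangles: list of (min_x, min_y, max_x, max_y) pixel bounding boxes."""
    tiles_x = (width + TILE - 1) // TILE
    tiles_y = (height + TILE - 1) // TILE
    bins = {(tx, ty): [] for ty in range(tiles_y) for tx in range(tiles_x)}
    for tri_id, (min_x, min_y, max_x, max_y) in enumerate(triangles):
        for ty in range(max(0, min_y // TILE), min(tiles_y, max_y // TILE + 1)):
            for tx in range(max(0, min_x // TILE), min(tiles_x, max_x // TILE + 1)):
                bins[(tx, ty)].append(tri_id)  # this triangle touches this tile
    return bins

# Shading then walks one tile at a time, touching only that tile's triangle list,
# which is what keeps the working set small enough for on-chip memory.
bins = bin_triangles([(0, 0, 40, 40), (100, 100, 130, 140)], width=256, height=256)
print(bins[(0, 0)], bins[(3, 3)])  # -> [0] [1]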
#25
medi01
iO: AMD's bridgman: "You could call it a hybrid but not in that sense... we used to talk about GCN as an ISA, but it seems that most people outside AMD think of GCN as a micro-architecture instead (ie an implementation of the ISA). RDNA is GCN ISA but not what you think of as GCN architecture."
Shocking, eh?
Assimilator: We've known Navi is gonna be trash for months...
This week in stupid.
Brought to you by "AMD never undercut competitors like that".

The ironic part is, "they" actually "knew".
It doesn't matter what comes out, as the outcome is predefined; it's only the excuses used to reach that conclusion that need to be worked out.
Mainboards from 2017 do not support PCIe4? Bad, baaaad, AMD, how dare you?

But the "it's still GCN" is the loveliest and the strongest of them all.
Even stronger than "buh mah drivars", as someone can call out BS on that.
Microarch, on the other hand, who the f*ck does anyone prove or disprove it? Or even clarify why it is bad to begin with.

The 570 wipes the floor with the 1050, 1050 Ti, and 1650, and is cheaper? Ah, but it's "still GCN". Oh, and my grandma's friend's husband's neighbor has a 15-year-old Dell which can feed a GPU only via the PCIe slot.