
Rumor: AMD Navi a Stopgap, Hybrid Design of RDNA and GraphicsCoreNext

Raevenlord

News Editor
Joined Aug 12, 2016
Messages 3,755 (1.16/day)
Location Portugal
System Name The Ryzening
Processor AMD Ryzen 9 5900X
Motherboard MSI X570 MAG TOMAHAWK
Cooling Lian Li Galahad 360mm AIO
Memory 32 GB G.Skill Trident Z F4-3733 (4x 8 GB)
Video Card(s) Gigabyte RTX 3070 Ti
Storage Boot: Transcend MTE220S 2TB, Kingston A2000 1TB, Seagate IronWolf Pro 14 TB
Display(s) Acer Nitro VG270UP (1440p 144 Hz IPS)
Case Lian Li O11DX Dynamic White
Audio Device(s) iFi Audio Zen DAC
Power Supply Seasonic Focus+ 750 W
Mouse Cooler Master Masterkeys Lite L
Keyboard Cooler Master Masterkeys Lite L
Software Windows 10 x64
The Wheel of Rumors turns, and assumptions come and pass, sometimes leaving unfulfilled hopes and dreams. In this case, the rumor mill, in what seems like a push from SweClockers, places Navi not as a built-from-the-ground-up architecture, but rather as a highly customized iteration of GCN - iterated upon, to be exact, in the parts where it actually implements AMD's RDNA architecture. And this makes sense for a number of reasons - it's certainly not anything to cry wolf about.

For one, AMD's GCN has been a mainstay in the graphics computing world since it was first introduced back in 2012, succeeding the company's TeraScale architecture. Game engines and assorted software have already been well optimized to take advantage of AMD's design - even across its ISA revisions and assorted improvements over the years. One of the most important arguments derives from this optimization effort: AMD's custom designs for the console market employ GCN-based architectures, and thus any new architecture used by both Microsoft and Sony for their next-generation consoles would have to be strictly backwards compatible.





Of course, with the Xbox One X, you can brute-force compatibility with previous-gen games via a software emulator (one that sometimes presents better results than the originals). However, that wouldn't cut it for the next generation of consoles - developing a software emulator able to render some current-gen games at their level of detail (think some of the 4K-rendered titles on Xbox One X, for example) would likely be an insurmountable task. Backwards compatibility at the hardware level would always be a prerequisite, and I'd bet you anything that AMD made sure to use Microsoft and Sony's funding to develop the RDNA design it is partly employing with Navi - the GPU that will power the next generation of consoles.
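As a rough illustration of why software emulation is so much costlier than hardware-level compatibility, here is a toy sketch of an interpreter's inner loop: every guest instruction pays for a fetch, decode, and dispatch on the host before any useful work happens. The opcodes and register file below are entirely made up for illustration:

```python
# Toy interpreter illustrating per-instruction emulation overhead.
# The three opcodes and the register file are hypothetical, purely to show
# the fetch/decode/dispatch cost that hardware-level compatibility avoids.

def run(program, regs):
    pc = 0
    while pc < len(program):
        op, a, b, dst = program[pc]          # fetch + decode: pure overhead
        if op == "mov":
            regs[dst] = regs[a]
        elif op == "add":
            regs[dst] = regs[a] + regs[b]    # the only "real" work
        elif op == "mul":
            regs[dst] = regs[a] * regs[b]
        else:
            raise ValueError(f"unknown opcode: {op}")
        pc += 1                              # loop back: more overhead
    return regs

# r1 = r0; r2 = r0 + r1; r3 = r2 * r2
print(run([("mov", 0, None, 1), ("add", 0, 1, 2), ("mul", 2, 2, 3)],
          {0: 7, 1: 0, 2: 0, 3: 0}))        # {0: 7, 1: 7, 2: 14, 3: 196}
```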

This means that AMD's Navi keeps its GCN bloodline in a way that allows it to have - I'd wager - almost completely automatic backwards compatibility with current-gen titles. However, the fact that Navi already employs AMD's RDNA means that developers will also need to learn to code for the new architecture - thus easing their way into it, rather than facing a completely different new GPU design all at once. Everybody wins, really - Sony and Microsoft get to keep their backwards compatibility; AMD uses their funding to develop its custom hardware, including part of the RDNA architecture; and developers get to keep compatibility with AMD's design, whilst having new architectural toys to play with.



Another argument is that AMD could be looking to completely separate their graphics and compute architectures - they do say that RDNA was developed with gaming workloads in mind, which means they won't be making the same strides they've been making in past years when it comes to compute workloads. And we all know that's where the real money lies - should AMD be able to break into that market with enough force, of course.

Speculation, certainly. But it seems we may have to wait until 2020 to see AMD's RDNA architecture in its full glory - and it'll come to PC first. Until then, nothing is stopping the hybrid Navi from being exactly what gaming ordered.

 
Probably should put RUMOR in the headline.
 
Probably should put RUMOR in the headline.

It's literally, literally, the fourth word of the news piece, but I understand the problem with people only reading the title before clicking. 'Tis not clickbait; people just jump to conclusions.
 
#whateverthatmeans

However, the fact that Navi already employs AMD's RDNA means that developers will also need to learn to code for the new architecture...

Because GCN as an instruction set is no longer supported... and why would AMD do that?
 
AMD's bridgman: "You could call it a hybrid but not in that sense... we used to talk about GCN as an ISA, but it seems that most people outside AMD think of GCN as a micro-architecture instead (ie an implementation of the ISA). RDNA is GCN ISA but not what you think of as GCN architecture."
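One way to picture the distinction bridgman is drawing: the ISA fixes what an instruction means, while the micro-architecture decides how it executes. A toy sketch, leaning on the publicly described SIMD widths (GCN runs a 64-thread wave over a 16-lane SIMD in four passes; RDNA runs a 32-thread wave in one) - everything else below is made up for illustration:

```python
# Same ISA-level operation (an elementwise "v_add" across a wave), two
# different micro-architectural executions. The SIMD widths mirror public
# descriptions of GCN (wave64 over a 16-lane SIMD, four passes) and RDNA
# (wave32 over a 32-lane SIMD, one pass); the rest is made up.

def v_add_gcn(a, b):
    """Wave64 on a 16-lane SIMD: four passes to retire one instruction."""
    out = [0] * 64
    for pass_ in range(4):
        base = pass_ * 16
        for lane in range(base, base + 16):   # one 16-wide SIMD pass
            out[lane] = a[lane] + b[lane]
    return out

def v_add_rdna(a, b):
    """Wave32 on a 32-lane SIMD: one pass per instruction."""
    return [x + y for x, y in zip(a[:32], b[:32])]

a, b = list(range(64)), [1] * 64
# Identical results at the ISA level; only the execution schedule differs.
assert v_add_gcn(a, b)[:32] == v_add_rdna(a, b)
```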
 
How is this news? We've known Navi is gonna be trash for months...
 
And this makes sense for a number of reasons - it's certainly not anything to cry wolf about.

For one, AMD's GCN has been a mainstay in the graphics computing world since it was first introduced back in 2012

I wonder what you would have written if, instead of Core, Intel had given us a refined mainstay NetBurst.
 
It is amazing that at one time there was little traffic about AMD, but man, are they in the spotlight now!!
 
It is amazing that at one time there was little traffic about AMD, but man, are they in the spotlight now!!
Well, there was a lot of traffic about AMD back in Athlon (in its various incarnations) days. But then AMD went Bulldozer&friends...
 
Well, there was a lot of traffic about AMD back in Athlon (in its various incarnations) days. But then AMD went Bulldozer&friends...

I know, but that was just CPUs.
 
I know, but that was just CPUs.
There was a lot of traffic about ATI, too, back in the Radeon 9000 series days. Anyone remember which 9500 was guaranteed to unlock into a 9700? ;)
 
I know, but that was just CPUs.
Tahiti was a definite hit. It was so good that many AMD fanboys did not see the need to change cards, which hurt AMD's bottom line.
How is this news? We've known Navi is gonna be trash for months...

How can you even substantiate a statement like that?

I am not going to say that it is a definite, but given that games on consoles have been made to run on AMD hardware for however many years now, we will begin to see better optimization for multiple cores, especially now that the upcoming PS5 will basically be a Ryzen-based part with Navi graphics. Should that not mean that, now more than ever, we will start to see more Strange Brigades and AotSes really showing what AMD is capable of? Microsoft is the key, though. I would like to think that Navi will support 4K at 60 Hz, as that is what most TVs in living rooms with PS5s will have - at least 60 Hz, but perhaps 120 Hz or even 240 Hz refresh rates too.
 
AMD's bridgman: "You could call it a hybrid but not in that sense... we used to talk about GCN as an ISA, but it seems that most people outside AMD think of GCN as a micro-architecture instead (ie an implementation of the ISA). RDNA is GCN ISA but not what you think of as GCN architecture."
There are naysayers... there always will be... but it's still up in the air whether AMD/RTG has found a new way of looking at a "micro-architecture" while still making use of the ISA.

I see RDNA as just a new group of basic "nucleotides" (building blocks) that can be architecturally arranged as needed. I think this first Navi uses not everything (every block) they have, but the parts and pieces that are functionally able to coexist (memory, cache hierarchy, latency improvements), alongside a probably less-than-all-new arrangement of Compute Unit blocks built specifically for gaming loads (more optimized GCN), with other blocks reserved for professional, compute, HPC, and AI work.

I don't think we will fully see what RDNA comprises until a full Navi comes for gaming.

Tahiti was a definite hit. It was so good many AMD fanboys did not see the need to change cards which hurt AMD's bottom line.
It was more that Rory turned up production to satisfy the first mining boom, then just as quickly came the bust (and bye, Rory). Incoming Lisa Su had to clear them off the books, so they were discounted and people bought them. They didn't hold back Hawaii, which was a hard sell when it arrived, since there was little presence of the low-cost 1440p panels most thought would support it, so it was an expensive outlay. Many stuck to 1080p, and for that Tahiti was more than sufficient.
 
I don't find that rumor too far-fetched. A while back, I remember reading an article on a tech news site about an AMD patent that would make AMD's GPUs comparable in efficiency to those from NVIDIA. Right now, to match NVIDIA's performance, AMD has to include expensive, fancy hardware like HBM on its video cards without raising the price to compensate - as the CEO of NVIDIA was impolite enough to point out a while back. So once it gets a design ready based on that tech, I think AMD won't hesitate to switch to it to lower production costs.

Ah: the article referenced U.S. patent application 20180357064.
 
That's not a naysayer... it's an AMD employee. And his post really clarifies things in a positive way.
IIRC, Bridgman comes from ATI, so he's not a noob either.
 
It was more that Rory turned up production to satisfy the first mining boom, then just as quickly came the bust (and bye, Rory). Incoming Lisa Su had to clear them off the books, so they were discounted and people bought them. They didn't hold back Hawaii, which was a hard sell when it arrived, since there was little presence of the low-cost 1440p panels most thought would support it, so it was an expensive outlay. Many stuck to 1080p, and for that Tahiti was more than sufficient.
The mining boom was more the realm of Polaris and of used Tahiti and Hawaii cards. Those cards were released in 2012 as the 7970 and 7950, the era of $199 GPUs and Korean 1440p screens. AMD cards have been suffering since the 780 Ti launched, but the 970 and 1080 were what did them in.
 
The mining boom was more the realm of Polaris and of used Tahiti and Hawaii cards. Those cards were released in 2012 as the 7970 and 7950, the era of $199 GPUs and Korean 1440p screens. AMD cards have been suffering since the 780 Ti launched, but the 970 and 1080 were what did them in.
Yup. Back when TSMC's 20 nm node failed and everybody was stuck on 28 nm, AMD kept their compute resources and tried to parallelize until they couldn't feed their hardware, while Nvidia (wisely, imho) decided to cut back on compute hardware that does little for gaming. Nvidia figuring out TBR was just the nail in GCN's coffin.
AMD sticking to compute helped them during the mining boom era, so they made back some of the lost cash. But that was just a fluke.
 
This year's tech is a stopgap for next year's tech.

/techindustrysummary
 
So it's still GCN, but with cheese on top (hence the name).
 
It was always known that Navi+ is the game changer for AMD. I'm not ruling out a Navi card if price and power consumption are low and it can compete at least at 2070 levels, but we'll see.
 
Nvidia figuring out TBR was just the nail in GCN's coffin.
Did a search. Found out that Nvidia didn't disclose that tile-based rendering was what they used - outside hackers figured it out themselves. And TBR is a standard technique, long used in low-power GPUs, like those in mobile phones. So, since that means there's no patent stopping them, why on Earth hasn't AMD gotten around to using TBR on their GPUs?
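For reference, the core idea behind TBR is simple enough to sketch: bin primitives by the screen tiles they touch, then shade one tile at a time so the working set stays in on-chip memory. A toy binning pass in Python - the tile size and data layout here are arbitrary choices for illustration, not anything from a real driver:

```python
# Toy tile binning: the first half of tile-based rasterization. Triangles
# are bucketed by the screen tiles their bounding boxes overlap, so each
# tile can later be shaded with its whole working set in on-chip memory.

from collections import defaultdict

TILE = 32  # tile edge in pixels; real hardware sizes this to fit its cache

def bin_triangles(triangles, width, height):
    """triangles: list of ((x0, y0), (x1, y1), (x2, y2)) in pixel coords."""
    bins = defaultdict(list)
    last_tx, last_ty = (width - 1) // TILE, (height - 1) // TILE
    for tri in triangles:
        xs, ys = [p[0] for p in tri], [p[1] for p in tri]
        # Conservative: every tile the triangle's bounding box touches.
        tx0, tx1 = max(min(xs) // TILE, 0), min(max(xs) // TILE, last_tx)
        ty0, ty1 = max(min(ys) // TILE, 0), min(max(ys) // TILE, last_ty)
        for ty in range(ty0, ty1 + 1):
            for tx in range(tx0, tx1 + 1):
                bins[(tx, ty)].append(tri)
    return bins  # a renderer would now shade one bin (tile) at a time

tiles = bin_triangles([((5, 5), (40, 10), (20, 60))], 128, 128)
print(sorted(tiles))  # [(0, 0), (0, 1), (1, 0), (1, 1)]
```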
 