Saturday, January 23rd 2021
AMD is Allegedly Preparing Navi 31 GPU with Dual 80 CU Chiplet Design
AMD is about to enter the world of chiplets with its upcoming GPUs, just like it has done with its Zen generation of processors. Having launched the Radeon RX 6000 series lineup based on Navi 21 and Navi 22, the company is seemingly not stopping there. To remain competitive, it needs to be in a constant process of innovation and development, which is reportedly true once again. According to current rumors, AMD is working on an RDNA 3 GPU design based on chiplets. The chiplet design is supposed to feature two 80 Compute Unit (CU) dies, just like the one found inside the Radeon RX 6900 XT graphics card.
Having two 80 CU dies would bring the total core count to exactly 10240 cores (two times the 5120 cores on a Navi 21 die). Combined with the RDNA 3 architecture, which brings better performance-per-watt compared to the last-generation uArch, the Navi 31 GPU is going to be a compute monster. It isn't exactly clear when we are supposed to get this graphics card; however, it may arrive at the end of this year or the beginning of 2022.
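The core-count arithmetic above can be sketched as a quick back-of-the-envelope check, assuming RDNA's usual 64 stream processors per Compute Unit (the names below are illustrative, not from any AMD source):

```python
# Back-of-the-envelope shader count for the rumored Navi 31 chiplet design.
# Assumption: 64 stream processors per Compute Unit, as on RDNA/RDNA 2.
STREAM_PROCESSORS_PER_CU = 64

cus_per_die = 80   # each chiplet rumored to match Navi 21's full 80 CUs
dies = 2           # dual-chiplet design per the rumor

cores_per_die = cus_per_die * STREAM_PROCESSORS_PER_CU  # matches the RX 6900 XT
total_cores = dies * cores_per_die

print(cores_per_die)  # 5120
print(total_cores)    # 10240
```

This is only the rumored configuration; actual RDNA 3 CUs could organize their ALUs differently.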
Source:
VideoCardz
141 Comments on AMD is Allegedly Preparing Navi 31 GPU with Dual 80 CU Chiplet Design
"paper launch q1 2023!" :rolleyes:
Now, yeah, pricing is way above MSRP, but so was 2080Ti's price up until Ampere came.
It's understandable, too, because Turing was a complete rip-off for its first year on the market, and then when Navi arrived on the scene a year late to the party it was underwhelming AF; the 5700 wasn't even 15% faster than the Vega56, and even at MSRP it actually offered lower performance/$. Street pricing on the Vega56 by that point was way lower than the 5700-series ever reached.
The only "good" thing that happened that entire generation is that AMD rejoining the GPU market after a year of total nothingness brought TU104 down to under $500 in the 2070 Super. That didn't make it good value, but at least it restored price/performance to somewhere that wasn't obscene. Unfortunately Nvidia stopped manufacturing Turing around 9 months after the Super series launched, so if you didn't grab one during that short window you're SOL right now.
CP2077's DLSS is the first time I thought the 2070S I also own showed any noteworthy advantage over the 5700XT. Again, DLSS had been a complete non-feature for over two years, because until CP2077 showed up, none of its implementations were worth writing home about.
Unfortunately the miners have ruined it all again; I'm seeing 5700-series cards at almost a 50% premium over MSRP, which was never that competitive in the first place...
Crazy how computer building has gone.
He's not big on following hardware, so I explained to him that his card would normally only bring around $200 or so at this point on the second-hand market, and that GPUs were hard to find. He started looking around and noticed no one has any new GPUs in stock and that scalper prices on eBay are astronomical. Keep old hardware. I've got two spare PSUs, spare RAM (well, DDR3, so it wouldn't help if my new build needed RAM to test with) and two GPUs. When I upgrade, I try to keep old working hardware as spare parts for testing, just to have a backup so I can figure out any problems that come up in the future.
Great I'm sure miners will love it.
I was never in the market for 1000 dollar gpus and will never be either so does not change much for me if they are available or not.