Thursday, May 21st 2020

AMD RDNA2 Based Radeon RX Graphics Cards Launching This September

AMD's next-generation RDNA2 architecture-based Radeon RX series client-segment graphics cards will launch in September 2020, according to a DigiTimes report citing industry sources. This would make September a mighty busy month for hardware launches, as the company is also expected to debut its 4th generation Ryzen "Vermeer" (and possibly "Renoir") desktop processors in the AM4 package. NVIDIA is expected to debut its GeForce "Ampere" client-segment graphics cards around the same time. Although not in the same computing segment, Intel could also debut its 11th generation Core "Tiger Lake" mobile processors.

RDNA2 is an important launch for AMD as it's the company's first graphics architecture that meets the DirectX 12 Ultimate logo requirements, which include real-time ray-tracing capability leveraging DXR, variable rate shading, mesh shaders, and sampler feedback. AMD and NVIDIA will be debuting their graphics cards close to the release of CD Projekt's "Cyberpunk 2077," which is emerging as the year's most hotly anticipated game.
Sources: DigiTimes, via VideoCardz

70 Comments on AMD RDNA2 Based Radeon RX Graphics Cards Launching This September

#26
Papacheeks
ppnIt's the same 7 nm unless they managed to get the density up to 62 MTr/mm², where it should be, instead of the current 41 MTr/mm², which is more 10nm-like. Then 5 nm (+60%) and 3 nm (+60%) are just around the corner, so 7 nm is pretty much obsolete. So if you are in the market for video cards, $300 is the maximum reasonable price for something like a 5700 +60%. Yeah, that $349 is too much for PS5 console-like graphics; even $200 is too much for 2304 stream processors.
From what I know, these new RDNA 2 / Ryzen 4000 chips use an EUV process, so the density should be different. IPC and performance-per-watt should both improve by a lot.
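As a rough sanity check on those density figures (a minimal sketch only: Navi 10's roughly 10.3 billion transistors on a ~251 mm² die are published specs; the 62 MTr/mm² target and the +60% scale-up are just the numbers quoted above, applied linearly):

# Back-of-envelope density check (Python)
navi10_transistors = 10.3e9            # ~10.3 billion transistors (published Navi 10 spec)
navi10_area_mm2 = 251.0                # ~251 mm2 die (published Navi 10 spec)

density = navi10_transistors / navi10_area_mm2 / 1e6
print(f"Navi 10 density: {density:.1f} MTr/mm2")        # ~41.0 MTr/mm2, matching the quoted figure

# Hypothetical "5700 +60%" part (60% more transistors) built at the quoted 62 MTr/mm2
scaled_transistors = navi10_transistors * 1.6
area_at_62 = scaled_transistors / 62e6
print(f"Die area at 62 MTr/mm2: {area_at_62:.0f} mm2")  # ~266 mm2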
Posted on Reply
#27
ARF
ppnIt's the same 7 nm unless they managed to get the density up to 62 MTr/mm², where it should be, instead of the current 41 MTr/mm², which is more 10nm-like. Then 5 nm (+60%) and 3 nm (+60%) are just around the corner, so 7 nm is pretty much obsolete. So if you are in the market for video cards, $300 is the maximum reasonable price for something like a 5700 +60%. Yeah, that $349 is too much for PS5 console-like graphics; even $200 is too much for 2304 stream processors.
"5700 + 60%" should be Navi 23 with its 250 sq. mm die size.
Navi 22 with its 350 sq. mm and Navi 21 with 505 sq. mm have to be faster.
Posted on Reply
#28
dyonoctis
CasecutterThese are "client-segment" RX series graphics cards to launch September 2020. That usually does not mean "consumer/gaming-segment cards", or at least not yet. I'd say the first "enthusiast-oriented consumer/gaming card" will come three months after, while a replacement for the RX 5700 XT (Navi 10) would be Q1 2021 at best. That would give Navi 10, which released July 7th, 2019, something like a 19-month tour of duty, which is about normal.
If it's named "RX", then it's for gaming. Server gpu are called "Instinct", and worksation are simply called "Radeon pro".
AMD line up doesn't follow the same order as nvidia were it's : Big server gpu first, and then cut down gaming gpu. RDNA was first used on a gaming gpu, then on workstation. The server side is still vega on 7 nm.
Posted on Reply
#29
ARF
dyonoctisIf it's named "RX", then it's for gaming. Server GPUs are called "Instinct", and workstation ones are simply called "Radeon Pro".
AMD's lineup doesn't follow the same order as Nvidia's, where it's: big server GPU first, then cut-down gaming GPUs. RDNA was first used on a gaming GPU, then on a workstation card. The server side is still Vega on 7 nm.
And not only that, but compute will be Arcturus, while the topic here is Navi 2X / RDNA2.
Arcturus doesn't come with RDNA2.

And the RX 5700 XT is nothing but a small pipe-cleaner type of card for the N7-class process.
Posted on Reply
#30
pat-roner
ChomiqBoth Nvidia and AMD releasing new GPUs in September? This is going to be fun. Here's hoping they'll duke it out in regards to pricing and performance.
Finally a good reason to upgrade my 980 Ti.
Posted on Reply
#31
yeeeeman
Too bad working drivers don't launch at the same time.
Posted on Reply
#32
IceShroom
ppnIt's the same 7 nm unless they managed to get the density up to 62 MTr/mm², where it should be, instead of the current 41 MTr/mm², which is more 10nm-like. Then 5 nm (+60%) and 3 nm (+60%) are just around the corner, so 7 nm is pretty much obsolete. So if you are in the market for video cards, $300 is the maximum reasonable price for something like a 5700 +60%. Yeah, that $349 is too much for PS5 console-like graphics; even $200 is too much for 2304 stream processors.
AMD already uses ~62 MTr/mm² density in its high-clocked Renoir APUs, and it has about 4x the clock speed of Ampere.
Why not ask Nvidia to do that? Like with Turing, they launched the new-generation GPUs first. If Nvidia prices its GPUs reasonably, AMD will have to do the same.
champsilva5700XT 1 year lifespan

That was fast.
Better than buying a $2,000+ laptop with a Turing GPU now, when Ampere is just around the corner.
Posted on Reply
#33
SL2
IceShroomBetter than buying a $2,000+ laptop with a Turing GPU now, when Ampere is just around the corner.
The mobile GPUs usually show up a few months later, though. Four months later the last time.
Posted on Reply
#34
IceShroom
MatsThe mobile GPUs usually show up a few months later, though. Four months later the last time.
Well, you can change a desktop GPU easily, but can you do the same in those $2,000+ laptops?
Posted on Reply
#35
SL2
IceShroomWell, you can change a desktop GPU easily, but can you do the same in those $2,000+ laptops?
Nooo, I'm just saying that "around the corner" doesn't really apply equally to desktop and mobile. The new laptops might not be quite as "around the corner" as the new 3000-series graphics cards.

I'm not up to date on gaming laptop upgradeability, but I assume MXM cards aren't really as common as they should be anymore.
Posted on Reply
#36
Jinxed
Vayra86For consoles AMD has a volume advantage and they also have nearly guaranteed sales for it. Margins can be very low and still profitable, which I reckon is also part of the reason AMD is still supplying these chips. Cost effectiveness, which is what consoles still need even if specs are better.
That's not true and it never was. Yet some people keep recycling "the consoles" argument over and over again.

There were 106 million PS4 and 46.9 million Xbox One consoles sold in total since their release in 2013. That's 7 years and counting, roughly 21 million per year. NVIDIA alone sold more than 15 million of its RTX GPUs since their release less than 2 years ago. And that's a higher-end product (and more expensive than previous generations). Combined with the lower-end cards and those of AMD, the PC hardware market seems just as big as, if not bigger than, consoles. It is certainly more profitable (Nvidia's 65% margin speaks for itself).

Consoles are not an advantage for AMD. They help AMD survive, barely. They also suck up a big portion of AMD's wafer allocation, which could otherwise have been sold in the more profitable PC or server markets (if AMD were competitive there, which it is not except for CPUs, unless RDNA2 changes that somehow).
Posted on Reply
#37
Vayra86
JinxedThat's not true and it never was. Yet some people keep recycling "the consoles" argument over and over again.

There were 106 million PS4 and 46.9 million Xbox One consoles sold in total since their release in 2013. That's 7 years and counting, roughly 21 million per year. NVIDIA alone sold more than 15 million of its RTX GPUs since their release less than 2 years ago. And that's a higher-end product (and more expensive than previous generations). Combined with the lower-end cards and those of AMD, the PC hardware market seems just as big as, if not bigger than, consoles. It is certainly more profitable (Nvidia's 65% margin speaks for itself).

Consoles are not an advantage for AMD. They help AMD survive, barely. They also suck up a big portion of AMD's wafer allocation, which could otherwise have been sold in the more profitable PC or server markets (if AMD were competitive there, which it is not except for CPUs, unless RDNA2 changes that somehow).
Missing the point completely...

- AMD has produced 150+ million of the exact same GPUs. That is volume. Turing comprises a whole range of cards: even two radically different designs, the 20 and 16 series, with fundamentally different blocks in them. And yet, Nvidia still only reaches about 1/5th of AMD's console chip sales.

- AMD has no limit on wafer allocation... they buy in, and the price is factored into the product cost. But consoles do offer a long-term contract that allows AMD to project demand way ahead of everyone else. Win-win: the fab gets work, AMD gets a lower price. And the design is always the same, too.
Posted on Reply
#38
InVasMani
Anyone curious to see what the new Radeon Pro cards are like in CrossFire? Despite being Vega-based (I believe), the Infinity Fabric link is certainly intriguing. If they scale well for gaming, they'd actually be rather neat dual-purpose work-and-play cards. Expensive, though, but not so much relative to the current lineup of Quadros.
Posted on Reply
#39
Vayra86
ARFFury X with its 4GB... no way it's going to work properly on newer titles anyways.
Vega is a compute-oriented architecture, with many compromises for the normal user.

RDNA2 will be a true new architecture; RDNA is just a hybrid, halfway between GCN and RDNA2.
LMAO, yes, that is exactly what we've heard half a dozen times before, right?
Posted on Reply
#40
Legacy-ZA
The question is: will they be readily available worldwide with this virus doing the rounds?
Posted on Reply
#41
Elysium
The 5700 [XT] will be made obsolete by RDNA 2, simply because of ray tracing. Will it still perform well in most games? Yeah, of course. But if it's lacking a feature that even the consoles are going to have, it's a bad decision to buy one from here on out at anything more than $200. So champsilva's point actually holds true.
Posted on Reply
#42
kapone32
champsilva5700XT 1 year lifespan

That was fast.
You mean its MSRP, right?
Posted on Reply
#45
InVasMani
cyberlonerbeat nvidia?
Looks like it for now, at least in that particular benchmark. By the time it's actually released, though, that might not be the case. Still, if it gets close to a 2080 Ti, it'll at least lower the barrier a bit for the lower performance tiers of cards. Some competition is better than none, at least. I also have to wonder about the power efficiency all the same; that comes into play if for no other reason than the power supply constraints people may have in their systems.
Posted on Reply
#46
ARF
cyberlonerbeat nvidia?
Imagine if it is the 250 sq. mm Navi 23.
InVasManiLooks like it for now, at least in that particular benchmark. By the time it's actually released, though, that might not be the case. Still, if it gets close to a 2080 Ti, it'll at least lower the barrier a bit for the lower performance tiers of cards. Some competition is better than none, at least. I also have to wonder about the power efficiency all the same; that comes into play if for no other reason than the power supply constraints people may have in their systems.
How is it faster in the benchmark now but would become slower over time?
Posted on Reply
#47
1d10t
theoneandonlymrkProbably not; it will likely be the lower-end parts next time round, so it will retain active support. RDNA2 might only be in the big new Navi, with everything below being made up of Navi 1 chips in the 6### lineup.
efikkanComplaining about new products being better is probably one of the most pointless types of complaints. Progress is good; if you can't deal with it, don't buy anything. :cool:
It's not like back in the late 90s, when buying a $2,500 high-end PC would barely run (if at all) the coolest games released the following year, and technologies and APIs were deprecated left and right.
I didn't write anything besides my short rant, did I? :D
I meant a short lifespan in my PEG slot, and that's it. You guys are watching too many conspiracy theories :D
Posted on Reply
#48
TheoneandonlyMrK
1d10tI didn't write anything besides my short rant, did I? :D
I meant a short lifespan in my PEG slot, and that's it. You guys are watching too many conspiracy theories :D
Nah, no way, mate.
If you can't control the itch, that's on you; stop moaning :D.
And chips are directly linkable to aliens; we didn't have chips before Bob Lazar's mates' reverse-engineering efforts on alien ships ;) :p :D
Posted on Reply
#49
kapone32
theoneandonlymrkNah, no way, mate.
If you can't control the itch, that's on you; stop moaning :D.
And chips are directly linkable to aliens; we didn't have chips before Bob Lazar's mates' reverse-engineering efforts on alien ships ;) :p :D
:laugh: Aliens or not, I still marvel at how far we have come from the 1980s, with 3 MB hard drives, to today's 7 nm transistors and 8 TB NVMe drives.
Posted on Reply
#50
Jinxed
Vayra86- AMD has produced 150+ million of the exact same GPUs. That is volume. Turing comprises a whole range of cards: even two radically different designs, the 20 and 16 series, with fundamentally different blocks in them. And yet, Nvidia still only reaches about 1/5th of AMD's console chip sales.
Pretending to be dumb, huh?

AMD manufactured 150 million console chips OVER 7 YEARS. That is 21 million per year on average. Nvidia's statement about 15 million of their RTX GPUs (RTX 2060 and above, not including the GTX 1660, 1650, etc.) came out in March this year (and was widely quoted on the net), as you can read here:
blogs.nvidia.com/blog/2020/03/23/dlss-2-ai-gaming-directx-12-ultimate/

RTX GPUs were released from 20th of September 2018 (RTX 2080) through 17th of October 2018 (RTX 2070) to 15th of January 2019 (RTX 2060), as you can read here:
en.wikipedia.org/wiki/GeForce_20_series

There was also a shortage of these cards initially, lasting until roughly January 2019. With the highest-volume card (the RTX 2060) released in mid-January 2019, the initial shortages, and the "15 million RTX GPUs sold" figure appearing in March 2020, it's easy to see that Nvidia sold those 15 million RTX GPUs roughly within a year. Add the lower-end cards like the GTX 1660, 1660 Ti, 1650, etc., and you can easily get to 21 million GPUs sold, if not more.

In retrospect, over those seven years since the consoles were released, Nvidia alone might have sold more chips than all the consoles together, especially if you factor in the sales generated by the cryptocurrency mining waves.
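For what it's worth, here is that per-year math spelled out (all inputs are figures quoted in this thread, not audited numbers, and the time windows are rough assumptions on my part):

# Rough sales-rate comparison (Python)
console_units = 106e6 + 46.9e6    # PS4 + Xbox One lifetime sales, as quoted above
console_years = 7.0               # late 2013 through 2020
rtx_units = 15e6                  # Nvidia's "15 million RTX GPUs" claim from March 2020

print(f"Consoles: ~{console_units / console_years / 1e6:.1f} M chips per year")      # ~21.8 M/year
print(f"RTX, counted from Sept 2018: ~{rtx_units / 1.5 / 1e6:.1f} M GPUs per year")  # ~10.0 M/year
print(f"RTX, counted from Jan 2019: ~{rtx_units / 1.2 / 1e6:.1f} M GPUs per year")   # ~12.5 M/year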
Vayra86- AMD has no limit on wafer allocation... they buy in, and the price is factored into the product cost. But consoles do offer a long-term contract that allows AMD to project demand way ahead of everyone else. Win-win: the fab gets work, AMD gets a lower price. And the design is always the same, too.
Of course they have wafer allocation limits. TSMC's capacity is not infinite, you know. And the 7 nm capacity has to be shared among big players like Apple, Nvidia, AMD and others. And it is fully booked. Is that selective blindness on your part, or what?

"TSMC 7nm capacity fully booked till 2H 2020": wccftech.com/amd-7nm-wafer-production-set-to-double-in-2h-2020-7nm-capacity-at-tsmc-currently-fully-booked/
Posted on Reply