Friday, March 6th 2020

AMD RDNA2 Graphics Architecture Detailed, Offers +50% Perf-per-Watt over RDNA

With its 7 nm RDNA architecture that debuted in July 2019, AMD achieved a nearly 50% gain in performance/Watt over the previous "Vega" architecture. At its 2020 Financial Analyst Day event, AMD made a big disclosure: that its upcoming RDNA2 architecture will offer a similar 50% performance/Watt jump over RDNA. The new RDNA2 graphics architecture is expected to leverage 7 nm+ (7 nm EUV), which offers up to 18% transistor-density increase over 7 nm DUV, among other process-level improvements. AMD could tap into this to increase price-performance by serving up more compute units at existing price-points, running at higher clock speeds.
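Since generational efficiency gains compound multiplicatively, the two claims can be combined in a quick back-of-the-envelope sketch (the 1.5× factors below are AMD's marketing claims, not measured values):

```python
# Back-of-the-envelope: how successive perf-per-Watt claims compound.
# The 1.5x figures are AMD's stated targets, not benchmark results.

def compound(gains):
    """Multiply successive generational improvement factors together."""
    total = 1.0
    for g in gains:
        total *= g
    return total

vega_to_rdna = 1.5    # claimed ~50% perf/Watt gain ("Vega" -> RDNA)
rdna_to_rdna2 = 1.5   # claimed ~50% gain again (RDNA -> RDNA2)

print(compound([vega_to_rdna, rdna_to_rdna2]))  # 2.25: ~2.25x vs. "Vega", if both claims hold
```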

AMD has two key design goals with RDNA2 that help it close the feature-set gap with NVIDIA: real-time ray tracing and variable-rate shading, both of which have been standardized by Microsoft under the DirectX 12 DXR and VRS APIs. AMD announced that RDNA2 will feature dedicated ray-tracing hardware on die. On the software side, this hardware will leverage the industry-standard DXR 1.1 API. The company is supplying RDNA2 to next-generation game-console manufacturers such as Sony and Microsoft, so it's highly likely that AMD's approach to standardized ray tracing will have more takers than NVIDIA's RTX ecosystem, which tops up DXR with NVIDIA's own proprietary feature set.
Variable-rate shading is another key feature that has been missing on AMD GPUs. It allows a graphics application to apply different rates of shading detail to different areas of the 3D scene being rendered, conserving system resources. NVIDIA and Intel already implement VRS tier 1 as standardized by Microsoft, and NVIDIA's "Turing" goes a step further by also supporting VRS tier 2. AMD didn't detail its VRS tier support.
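For readers unfamiliar with the concept, the resource saving VRS offers can be illustrated with a toy model. The function and tile layout below are invented purely for illustration; the real mechanism lives in the GPU and driver behind the D3D12 VRS API:

```python
# Toy illustration of variable-rate shading (VRS): instead of shading every
# pixel, low-detail screen tiles are shaded once per 2x2 (or 4x4) block and
# the result reused, cutting pixel-shader invocations.

def shader_invocations(width, height, tile_rates):
    """Count shading work for a frame split into equal-sized tiles.

    tile_rates: per-tile coarse shading rate, e.g. 1 (full rate, 1x1),
    2 (one shade per 2x2 block), or 4 (one per 4x4 block).
    """
    tile_pixels = (width * height) // len(tile_rates)
    return sum(tile_pixels // (rate * rate) for rate in tile_rates)

# A 1920x1080 frame in 4 tiles: full detail in two tiles (e.g. the screen
# center), 2x2 coarse shading in the other two (e.g. the periphery).
full = shader_invocations(1920, 1080, [1, 1, 1, 1])
vrs = shader_invocations(1920, 1080, [1, 1, 2, 2])
print(full, vrs)  # 2073600 1296000: 37.5% fewer invocations
```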

AMD hopes to deploy RDNA2 on everything from desktop discrete client graphics, to professional graphics for creators, to mobile (notebook/tablet) graphics, and lastly cloud graphics (for cloud-based gaming platforms such as Stadia). Its biggest takers, however, will be the next-generation Xbox and PlayStation game consoles, which will also shepherd game developers toward standardized ray-tracing and VRS implementations.

AMD also briefly touched upon the next-generation RDNA3 graphics architecture without revealing any features. All we know about RDNA3 for now is that it will leverage a process node more advanced than 7 nm (likely 6 nm or 5 nm; AMD won't say), and that it will come out some time between 2021 and 2022. RDNA2 will extensively power AMD client graphics products over the next 5-6 calendar quarters, at least.

306 Comments on AMD RDNA2 Graphics Architecture Detailed, Offers +50% Perf-per-Watt over RDNA

#251
ARF
Rumour/leak has it:



#252
R0H1T
400W TDP, yeah BS o_O
#253
ARF
R0H1T said: "400W TDP"
A true halo, enthusiast part. Even the Threadripper 3990X, a single CPU on its own, draws 280 watts.

This will have GDDR6, not HBM, so the higher power draw is to be expected, and that's a fair trade-off for us.
#254
Space Lynx
Astronaut
im ok with 400w as long as it beats a 2080 ti by 15-20% and costs around $799
#255
Super XP
lynx29 said: "im ok with 400w as long as it beats a 2080 ti by 15-20% and costs around $799"
400W? No thanks, regardless of who releases such a thing.
#256
ARF
Super XP said: "400W? No thanks, regardless of who releases such a thing."
For you there will be Navi 23 at 150 W and still as fast as an RTX 2080 Ti :D
#257
Unregistered
I have a graphics card that reaches 300 W, and I have a hard time keeping it below 80 °C.
I would not want a 400 W card. I just wouldn't.
#258
Super XP
ARF said: "For you there will be Navi 23 at 150 W and still as fast as an RTX 2080 Ti :D"
:peace::laugh:
jmcslob said: "I have a graphics card that reaches 300 W, and I have a hard time keeping it below 80 °C.
I would not want a 400 W card. I just wouldn't."
Me neither.
#259
Space Lynx
Astronaut
jmcslob said: "I have a graphics card that reaches 300 W, and I have a hard time keeping it below 80 °C.
I would not want a 400 W card. I just wouldn't."
I mean, the only way I would want it is if they invested heavily in a stock cooler that keeps it at 70 °C in most situations, but yeah, I agree with you otherwise.

This leaked two-fan vapor-chamber design for the RTX 3080 looks very interesting, for example; I bet it could cool 400 watts. All those fins :D
#260
Super XP
lynx29 said: "I mean, the only way I would want it is if they invested heavily in a stock cooler that keeps it at 70 °C in most situations, but yeah, I agree with you otherwise.

This leaked two-fan vapor-chamber design for the RTX 3080 looks very interesting, for example; I bet it could cool 400 watts. All those fins :D"
For new, modern technology that's too much wattage, which tells me that if this is true, Nvidia has nothing new and is taking a slightly revamped Turing, pumping up the juice, and renaming it 3080.
#261
Valantar
400W can't be cooled effectively in a PCIe form factor in >99% of cases on the market. It simply isn't feasible. Water cooling with a 240mm radiator as stock might work, but that still limits compatibility to a) cases that can fit one, and b) users who don't already have that space taken by their CPU cooling. If a 360mm rad is needed, it is DOA.

So no, this is BS. 400W isn't happening.
#263
R0H1T
Those were dual GPUs, you should know better than to spread such egregious rumors. Slow rumor day for you?
#264
ARF
R0H1T said: "Those were dual GPUs, you should know better than to spread such egregious rumors. Slow rumor day for you?"
And so what? The power-delivery circuitry obviously doesn't care how many chips you've got on the PCB :laugh:
#265
moproblems99
lynx29 said: "im ok with 400w as long as it beats a 2080 ti by 15-20% and costs around $799"
It had better double the performance, for nearly double the watts and two years later.
#266
R0H1T
Double as compared to what exactly?
#267
moproblems99
R0H1T said: "Those were dual GPUs, you should know better than to spread such egregious rumors. Slow rumor day for you?"
Frankly, likely-bullshit GPU rumors are still better than real news and reality this year.
R0H1T said: "Double as compared to what exactly?"
See the quoted post.
#268
ARF
moproblems99 said: "See the quoted post."
Really? It would be great if it doubled the performance of the RX 5700 XT at 1080p.
#269
moproblems99
ARF said: "Really? It would be great if it doubled the performance of the RX 5700 XT at 1080p."
How would that be impressive? It would basically be two 5700 XTs, complete with double the power draw. That is not impressive. 400 W is a disappointment by any metric unless it doubles the performance of the 2080 Ti.
#271
Master Tom
I have the Radeon Vega 64 Liquid Cooling. It does not even reach 60°C.
Super XP said: "400W? No thanks, regardless of who releases such a thing."
How much does 1 kWh cost in Greece?
#272
moproblems99
ARF said: "Are you serious?
www.techpowerup.com/review/amd-radeon-rx-5700-xt/28.html"
Yes.

If you give me double the wattage, then at a minimum I am expecting double the performance. To be impressed, you have to give me double the performance at significantly less than double the wattage, or double the wattage and significantly more than double the performance. Besides, if they (AMD) achieve double the performance of a 5700 XT, then they will be at roughly the expected performance of Ampere... but at significantly more power usage. I, for one, don't want to dump 400 watts of heat into my office, which is already at 28 °C because it is summer. I don't care about the power draw itself; I care about working/gaming in a sauna.
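The arithmetic behind this expectation can be sketched as a perf-per-Watt comparison. The performance figures below are placeholders; the only number taken from reviews is the RX 5700 XT's ~225 W typical board power:

```python
# Sketch of the efficiency argument above: doubling performance at 400 W
# barely improves perf/Watt over a ~225 W RX 5700 XT. All performance
# values are arbitrary index numbers, not benchmarks.

def perf_per_watt(performance, watts):
    return performance / watts

baseline = perf_per_watt(100, 225)  # RX 5700 XT: perf index 100 at ~225 W
rumored = perf_per_watt(200, 400)   # rumored card: double the perf at 400 W

print(rumored / baseline)  # ~1.125: only ~12.5% better perf/Watt despite 2x perf
```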
#273
ARF
moproblems99 said: "Yes.

If you give me double the wattage, then at a minimum I am expecting double the performance. To be impressed, you have to give me double the performance at significantly less than double the wattage, or double the wattage and significantly more than double the performance. Besides, if they (AMD) achieve double the performance of a 5700 XT, then they will be at roughly the expected performance of Ampere... but at significantly more power usage. I, for one, don't want to dump 400 watts of heat into my office, which is already at 28 °C because it is summer. I don't care about the power draw itself; I care about working/gaming in a sauna."
The RTX 2080 Ti is a 300-watt part, so at 400 W you'd expect 33% higher performance, not double.
#274
Master Tom
moproblems99 said: "Besides, if they (AMD) achieve double the performance of a 5700 XT, then they will be at roughly the expected performance of Ampere... but at significantly more power usage. I, for one, don't want to dump 400 watts of heat into my office, which is already at 28 °C because it is summer. I don't care about the power draw itself; I care about working/gaming in a sauna."
In Germany it gets so hot in summer that you need air conditioning anyway. In winter, the i9-9900K and the Radeon Vega 64 Liquid Cooling help heat the room :)
In which country do you live?
#275
Caring1
lynx29 said: "im ok with 400w as long as it beats a 2080 ti by 15-20% and costs around $799"
I'd be ok with 400 W if it beat the 2080 Ti by 100% and idle power consumption was less than 10 W.