Friday, December 13th 2019

Ray Tracing and Variable-Rate Shading Design Goals for AMD RDNA2

Hardware-accelerated ray tracing and variable-rate shading will be the design focal points of AMD's next-generation RDNA2 graphics architecture. Microsoft's reveal of its Xbox Series X console attributed both features to AMD's "next generation RDNA" architecture (which is logically RDNA2). The Xbox Series X uses a semi-custom SoC that combines CPU cores based on the "Zen 2" microarchitecture with a GPU based on RDNA2. The SoC is likely fabricated on TSMC's 7 nm EUV node, for which the RDNA2 graphics architecture is optimized; this would mean an optical shrink of "Zen 2" to 7 nm EUV. Besides the SoC powering the Xbox Series X, AMD is expected to leverage 7 nm EUV in 2020 for its RDNA2 discrete GPUs and for CPU chiplets based on its "Zen 3" microarchitecture.

Variable-rate shading (VRS) is an API-level feature that lets GPUs conserve resources by shading certain areas of a scene at a lower rate than others, without a perceptible difference to the viewer. Microsoft developed two tiers of VRS for its DirectX 12 API: tier 1 is currently supported by NVIDIA's "Turing" and Intel's Gen11 architectures, while tier 2 is supported only by "Turing." The current RDNA architecture supports neither tier. Hardware-accelerated ray tracing is the cornerstone of NVIDIA's "Turing" RTX 20-series graphics cards, and AMD is catching up to it; Microsoft has already standardized it on the software side with the DXR (DirectX Raytracing) API. A combination of VRS and dynamic render resolution will be crucial for next-gen consoles to achieve playability at 4K, and even to boast of being 8K-capable.
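For a sense of what the API surface looks like, here is a minimal sketch of how a DirectX 12 renderer might query the supported VRS tier and apply a coarser base shading rate. The device and command-list setup are assumed to exist elsewhere, the function name is invented, and the rate choice is purely illustrative:

// Minimal sketch (assumed names): query the VRS tier on a D3D12 device and
// set a coarser base shading rate. Error handling trimmed for brevity.
#include <d3d12.h>

void ApplyCoarseShadingIfSupported(ID3D12Device* device,
                                   ID3D12GraphicsCommandList5* cmdList)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                           &opts, sizeof(opts))))
        return; // feature query unavailable; keep full-rate shading

    if (opts.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_1)
    {
        // Tier 1: one base rate for the whole draw. 2x2 shades one result
        // per 2x2 pixel block, roughly quartering pixel-shader work.
        cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);
    }

    // ... record draws that tolerate coarse shading here ...

    // Restore full rate for detail-critical passes (UI, text, etc.).
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr);
}

On tier-2 hardware the base rate can additionally be combined with per-primitive and screen-space rates via D3D12_SHADING_RATE_COMBINER values passed in place of the nullptr above.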

119 Comments on Ray Tracing and Variable-Rate Shading Design Goals for AMD RDNA2

#1
ratirt
Wonder how this VRS works in practice. Hope it is nothing like NV DLSS blurriness cause that would be a disaster.
#2
londiste
ratirt: "Wonder how this VRS works in practice. Hope it is nothing like NV DLSS blurriness cause that would be a disaster."
Parts of the screen with some pretty fine granularity are shaded at a lower resolution.
Wolfenstein II was the first game to implement it, with a minor but measurable performance boost.
UL has VRS feature test out as part of 3DMark: www.techpowerup.com/261825/ul-benchmarks-outs-3dmark-feature-test-for-variable-rate-shading-tier-2

There are some documents and videos that have pretty good explanations of how this works:
software.intel.com/en-us/videos/use-variable-rate-shading-vrs-to-improve-the-user-experience-in-real-time-game-engines
developer.nvidia.com/vrworks/graphics/variablerateshading
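For reference, the "fine granularity" in tier 2 (as in the 3DMark feature test linked above) comes from a screen-space shading-rate image: a small R8_UINT texture in which each texel selects a D3D12_SHADING_RATE for one tile of pixels. A rough sketch, assuming tier-2 support was already verified; the top-half heuristic is purely illustrative:

#include <d3d12.h>
#include <vector>

// Build CPU-side data for a shading-rate image: top half of the screen
// (say, a sky) at 4x4, everything else at full rate. The tile size comes
// from D3D12_FEATURE_DATA_D3D12_OPTIONS6::ShadingRateImageTileSize.
std::vector<UINT8> BuildRateImage(UINT width, UINT height, UINT tile)
{
    const UINT cols = (width + tile - 1) / tile;
    const UINT rows = (height + tile - 1) / tile;
    std::vector<UINT8> rates(cols * rows,
                             static_cast<UINT8>(D3D12_SHADING_RATE_1X1));

    for (UINT y = 0; y < rows / 2; ++y)          // top half of the screen
        for (UINT x = 0; x < cols; ++x)
            rates[y * cols + x] = static_cast<UINT8>(D3D12_SHADING_RATE_4X4);

    // Upload into an R8_UINT texture, transition it to the
    // D3D12_RESOURCE_STATE_SHADING_RATE_SOURCE state, then bind it with:
    //   cmdList->RSSetShadingRateImage(rateImageResource);
    return rates;
}

A real engine would derive the rates from content (motion, contrast, foveation) rather than screen position alone.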
#3
eidairaman1
The Exiled Airman
ratirt: "Wonder how this VRS works in practice. Hope it is nothing like NV DLSS blurriness cause that would be a disaster."
Radeon Rays
#4
ratirt
So does this mean the RX 5800 (or basically the upcoming Navi cards in 2020) will have this?
#5
DeathtoGnomes
eidairaman1: "Radeon Rays"
Magik is happening again!
#6
eidairaman1
The Exiled Airman
ratirt: "So does this mean the RX 5800 (or basically the upcoming Navi cards in 2020) will have this?"
Plausible.
#7
xkm1948
But, but Real time ray tracing are gimmicks!

—-certain fanbois
#8
Vya Domus
xkm1948: "But, but Real time ray tracing are gimmicks! —-certain fanbois"
If more than one manufacturer does something it means it's not a gimmick!

-braindead logic
#9
ratirt
I don't need RR that much; as long as I can get 60 FPS at 4K I'm OK. I won't get mad if I have to wait longer for the RR+VRS feature. What I'm concerned about is that if RR is happening with the 5800-model Navi and up, it could mean release delays due to new feature implementation and whatnot, and I really don't want that to happen. I need a new graphics card.
#10
londiste
RX 5800 is not really confirmed, is it? AMD would be in a bit of trouble trying to fit big Navi into 250 W, especially considering the RX 5700 XT is a 225 W card.
RDNA2 is more likely a next-generation thing, the RX 6000 series or whatever its eventual name will be.
#11
spectatorx
ratirt: "So does this mean the RX 5800 (or basically the upcoming Navi cards in 2020) will have this?"
From the news I'm reading about Navi, I'm confused and leaning more and more towards the conclusion that there will be no 5800/5900 at all. There is absolutely no info about these cards; it seems Navi 2/RDNA2 is what comes next, and that to me is sad, as I expected a high-end GPU from AMD to show up at the end of this year or the beginning of 2020. This only means I will sit on my temporary upgrade, an RX 580, for much longer than I expected.

Variable-rate shading and ray tracing are things I do not care about at all. The first one reduces image quality; the second one, done properly rather than faked as it is with, for example, the RTX library (yes, RTX is a simplified and in many aspects faked form of "ray tracing", and it still kills performance too much), requires drastic changes to graphics rendering overall, and still tons of performance that we will not reach for many decades.
#13
efikkan
londiste: "RX 5800 is not really confirmed, is it? AMD would be in a bit of trouble trying to fit big Navi into 250 W, especially considering the RX 5700 XT is a 225 W card. RDNA2 is more likely a next-generation thing, the RX 6000 series or whatever its eventual name will be."
I would be surprised if they have a larger Navi ready, and if they did, the TDP would probably be in the 300-350 W range.
Also don't forget that the RX 5700 was renamed at the last minute; some official photos even displayed "RX 690".
#14
BakerMan1971
Well, I welcome ray tracing from the Red Team. It's great that we all get faster and faster cards, but if the visuals don't change and developers just continue to bloat existing engines in a vain attempt to offer us "new" experiences, we need these new technologies to take the next step.

Hope that makes sense; I am all tired and dizzy :)
#15
cucker tarlson
ratirt: "Wonder how this VRS works in practice. Hope it is nothing like NV DLSS blurriness cause that would be a disaster."
well, nvidia already does it in the wolfenstein games
and why would that be blurry? it's not an image reconstruction technique.
#16
Maelwyse
I continue to hope that we get a frame-rate improvement, rather than feature expansion.*
I'm running an (admittedly top-of-the-line) card from a generation ago (1080 Ti); it's two years old. I would upgrade if I were to get a reliable, significant increase in framerates. A new feature that drops framerates? No.

Now, if you were to tell me I could have both, I'd consider it, but I'm not going to buy a card that offers a 10-25% framerate improvement over a two-year-old card at about 2x the price.

*I'm almost convinced that VRS is the key piece I'm hungry for. I've taken a hard look at a couple of different samples, and if it works like what I've seen, I'd certainly accept the image-quality "drop" for part of the screen in exchange for the framerate improvements they've been touting.

And if that came with RR, meh. I wouldn't complain, but it wouldn't be the reason I buy a new card. Besides, HOW many games support hardware ray tracing so far? Seven? Eight? Of which I'd actually think about paying for two or so?
#17
Steevo
xkm1948: "But, but Real time ray tracing are gimmicks! —-certain fanbois"
There is more than one way to skin a cat.


Hardware support with compressed vector tables to reduce the computational overhead of real-time ray tracing is one. Another is to let a CPU core or two work out basic angle-dependent setup info, then hand that off, the same way we got angle-independent anisotropic filtering.

How many games support ray tracing again? Is PhysX hardware-accelerated fluff still in the news? The overburden of tessellation? HairWorks?

Nvidia deserves the flak for what they do, just like AMD deserves so much shit it would take a bulldozer to move it, except it overheated with its "real men" cores.
#18
Chomiq
Wider adoption of RT can only help the consumer, while VRS can help deliver better performance to the entire ecosystem, be it PC or consoles.
#19
TheGuruStud
Microsoft is still out of touch with reality.

"Hey, want to one up ourselves with the dumbest naming since Xbox One and Xbox X?"
"Sure, let me hear it, Brain Dead Idiot Employee # 2!"
"Xbox Series X!"
"You did it. You crazy son of a bitch, you did it."

You know how they came up with the hardware specs? They just copied Sony leaks and rumors.
#20
SIGSEGV
xkm1948: "But, but Real time ray tracing are gimmicks! —-certain fanbois"
It heavily taxes the GPU (a performance penalty).
Those features don't even change the way you game.
GIMMICKS! :nutkick:

:roll:
#21
windwhirl
xkm1948: "But, but Real time ray tracing are gimmicks! —-certain fanbois"
Eh, the thing is that Nvidia wanted to sell RTRT as if it was the Holy Grail of graphics. Because it wasn't all that much of an improvement and severely reduces FPS, it had mixed reception.
Recus: "... and 5 years too early... won't be a mainstream thing until it's offered on 'all ranges [of GPUs] from low-end to high-end'... RT cores wasting die space."

:roll:
Yeah, it's too early. But at some point it had to arrive to consumer gaming space, either by Nvidia's hand or AMD's, or even Intel's. If anything, Nvidia now has more consumer feedback which will help enhance their RTRT implementation.
#22
NC37
No Radeon card has much of any value till they get this done.
#23
Manoa
what good is "8k capable" and "ray traced" when you are blurring it with VRS ?! what a tards
#24
InVasMani
Manoa: "what good is '8k capable' and 'ray traced' when you are blurring it with VRS ?! what a tards"
VRS is great; it's one of the more exciting new GPU features, in reality. It's a simple, easily tangible performance boost from diluting the parts of the scene you care less about in the grand scheme; why anyone would view that as a bad trade-off is beyond me. It's about utilizing resources where they can be put to best use, plain and simple. Horsepower means very little when you have no traction, which is exactly why drag cars do burnouts before they race: to warm those f*ckers up a little so they grip the road when they goose it, aka 3, 2, 1, punch it.
windwhirl: "Eh, the thing is that Nvidia wanted to sell RTRT as if it was the Holy Grail of graphics. Because it wasn't all that much of an improvement and severely reduces FPS, it had mixed reception. [...] Yeah, it's too early. But at some point it had to arrive to consumer gaming space, either by Nvidia's hand or AMD's, or even Intel's. If anything, Nvidia now has more consumer feedback which will help enhance their RTRT implementation."
To be perfectly fair, it kind of is the holy grail of more realistic graphics; it's just widely considered premature by probably half a generation. They jumped the gun, but the software won't be available without the hardware shipping at the same time, so this will help ease things in that direction. But no, RTRT today isn't the holy grail of graphics; it's just the primer coat before the real paint is applied, which then gets a few coats of clear coat over the top. RTRT hardware is just the first stage of many additional coats; I mean, rasterization is still getting new paint jobs.
#25
efikkan
InVasMani: "VRS is great; it's one of the more exciting new GPU features, in reality. [...]"
VRS is a technology I've wanted for 10 years, though not as a way to reduce detail in parts of the scene, but to improve select parts. I think this technology has great potential, but like many other advanced techniques, it needs to be utilized right, otherwise the end result is bad.

Let's say you have a scene with a nice landscape in the lower half of the screen and a sky (just a skydome or skybox) in the upper half. You might think that rendering the upper half with far fewer samples would be a good way to optimize away wasteful samples. But the truth is that low-detail areas like skies are very cheap to render in the first place, so you will probably end up with a very blurry area and only marginal performance savings.

To make matters worse, this will probably only increase frame-rate variance (if not applied very carefully). In a first-person game where you walk across a landscape, looking straight up or down will give very high frame rates, while looking straight ahead into an open landscape will give low performance. Even if you don't use any particularly fancy LoD algorithms, the GPU is already pretty good at culling off-screen geometry, and I know from experience that trying to optimize away every "unnecessary" detail can actually increase this frame-rate variance even more.
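The careful application described here is usually done content-adaptively: derive each tile's rate from how much detail the tile actually contains, so the coarsening (and the savings) track the scene instead of the camera angle. A hypothetical sketch; the function name, inputs, and thresholds are invented for illustration, with per-tile luminance bounds assumed to come from a reduction pass over the previous frame:

#include <cstdint>

// Hypothetical content-adaptive rate picker: coarsen only tiles that are
// nearly flat (sky, fog, plain walls) and keep full rate where contrast is
// high. Thresholds are illustrative, not tuned values.
enum class ShadeRate : uint8_t { Full1x1, Half2x2, Quarter4x4 };

ShadeRate PickTileRate(float lumaMin, float lumaMax)
{
    const float contrast = lumaMax - lumaMin; // 0 means a perfectly flat tile

    if (contrast < 0.01f) return ShadeRate::Quarter4x4; // flat: safe to coarsen
    if (contrast < 0.05f) return ShadeRate::Half2x2;    // gentle gradients
    return ShadeRate::Full1x1;                          // edges, text, detail
}

Because the rate follows detail rather than view direction, cheap views stay cheap without detailed views becoming disproportionately expensive, which also dampens the frame-time swings described above.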