
Ray Tracing and Variable-Rate Shading Design Goals for AMD RDNA2

btarunr

Editor & Senior Moderator
Hardware-accelerated ray tracing and variable-rate shading will be the design focal points of AMD's next-generation RDNA2 graphics architecture. Microsoft's reveal of its Xbox Series X console attributed both features to AMD's "next generation RDNA" architecture (which logically is RDNA2). The Xbox Series X uses a semi-custom SoC that combines CPU cores based on the "Zen 2" microarchitecture with a GPU based on RDNA2. The SoC is very likely fabricated on TSMC's 7 nm EUV node, for which the RDNA2 graphics architecture is optimized; that would entail an optical shrink of "Zen 2" to 7 nm EUV. Besides the SoC powering the Xbox Series X, AMD is expected to leverage 7 nm EUV in 2020 for its RDNA2 discrete GPUs and for CPU chiplets based on its "Zen 3" microarchitecture.

Variable-rate shading (VRS) is an API-level feature that lets GPUs conserve resources by shading certain areas of a scene at a lower rate than others, without perceptible difference to the viewer. Microsoft developed two tiers of VRS for its DirectX 12 API: tier 1 is currently supported by the NVIDIA "Turing" and Intel Gen11 architectures, while tier 2 is supported only by "Turing." The current RDNA architecture supports neither tier. Hardware-accelerated ray tracing is the cornerstone of NVIDIA's "Turing" RTX 20-series graphics cards, and AMD is catching up to it. Microsoft has already standardized it on the software side with the DXR (DirectX Raytracing) API. A combination of VRS and dynamic render resolution will be crucial for next-gen consoles to achieve playability at 4K, and to even boast of being 8K-capable.
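To illustrate the resource savings VRS targets, here is a toy sketch (plain Python, not real GPU code; the function name and numbers are illustrative): DirectX 12 VRS supports coarse shading rates such as 2x1, 1x2, and 2x2, where one pixel-shader invocation covers a block of pixels instead of a single pixel.

```python
def shader_invocations(width, height, rate_x, rate_y):
    """Pixel-shader invocations needed to cover a width x height
    region when one invocation shades a rate_x x rate_y pixel block."""
    cols = -(-width // rate_x)   # ceiling division
    rows = -(-height // rate_y)
    return cols * rows

full = shader_invocations(3840, 2160, 1, 1)   # native 4K: 8,294,400
coarse = shader_invocations(3840, 2160, 2, 2) # 2x2 rate: 2,073,600
print(full // coarse)                         # prints 4
```

A uniform 2x2 rate cuts shading work fourfold; in practice only selected screen regions are coarsened, so the real-world saving is smaller but comes with far less visible quality loss.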



View at TechPowerUp Main Site
 
Wonder how this VRS works in practice. Hope it is nothing like NV DLSS blurriness cause that would be a disaster.
 
Wonder how this VRS works in practice. Hope it is nothing like NV DLSS blurriness cause that would be a disaster.
Parts of the screen are shaded at a lower resolution, with some pretty fine granularity.
Wolfenstein II was the first game to implement it, with a minor but measurable performance boost.
UL has VRS feature test out as part of 3DMark: https://www.techpowerup.com/261825/...feature-test-for-variable-rate-shading-tier-2

There are some documents and videos that have pretty good explanation of how this works:
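To make the per-region idea concrete, here is a toy sketch of a tier-2 style "shading-rate image" (all names and the tile layout are illustrative; real hardware uses small fixed-size pixel tiles, e.g. 16x16): border tiles get a coarse 2x2 rate while the screen center keeps the full 1x1 rate.

```python
def build_rate_image(tiles_x, tiles_y, border):
    """Return a 2D list of (rate_x, rate_y) per screen tile:
    coarse 2x2 shading on the border tiles, full 1x1 in the interior."""
    rates = []
    for ty in range(tiles_y):
        row = []
        for tx in range(tiles_x):
            on_border = (tx < border or ty < border or
                         tx >= tiles_x - border or ty >= tiles_y - border)
            row.append((2, 2) if on_border else (1, 1))
        rates.append(row)
    return rates

rates = build_rate_image(8, 4, border=1)
coarse_tiles = sum(r == (2, 2) for row in rates for r in row)
print(coarse_tiles)  # 20 of the 32 tiles are shaded coarsely
```

Engines typically pick the per-tile rate from scene content (motion, depth of field, luminance) rather than a fixed border, which is what makes the quality loss hard to notice.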
 
Wonder how this VRS works in practice. Hope it is nothing like NV DLSS blurriness cause that would be a disaster.

Radeon Rays
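For context, the core operation that ray-tracing hardware and intersection libraries such as Radeon Rays accelerate is the ray-primitive hit test, performed billions of times per frame. A minimal, self-contained ray-sphere intersection sketch (illustrative only, not Radeon Rays API code):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the nearest positive hit distance t along the ray
    origin + t*direction, or None on a miss. `direction` is assumed
    to be normalized, so the quadratic's a-coefficient is 1."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2.0 * (direction[0] * ox + direction[1] * oy + direction[2] * oz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None          # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

Dedicated RT hardware does this (plus BVH traversal to avoid testing every primitive) in fixed-function units, which is why it is so much faster than running the same math in shaders.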
 
So does this mean the rx 5800 (or basically the upcoming NAVI cards in 2020) will have this?
 
Last edited:
But, but Real time ray tracing are gimmicks!

—-certain fanbois

If more than one manufacturer does something it means it's not a gimmick!

-braindead logic
 
I don't need RR that much; as long as I can get 60 FPS at 4K I'm OK. I won't get mad if I have to wait longer for the RR+VRS features. What I'm concerned about is, if this RR is happening with the 5800 model Navi and up, that can mean delays in the release due to new feature implementation and whatever, and I really don't want that to happen. I need a new graphics card.
 
RX 5800 is not really confirmed, is it? AMD would be in a bit of trouble trying to fit big Navi into 250 W, especially considering the RX 5700 XT is a 225 W card.
RDNA2 is more likely to be a next generation thing, RX 6000-series or whatever its eventual name will be.
 
So does this mean the rx 5800 (or basically the upcoming NAVI cards in 2020) will have this?
From the news I'm reading about Navi, I'm confused and leaning more and more towards the conclusion that there will be no 5800/5900 at all. There is absolutely no info about these cards; it seems Navi2/RDNA2 is what comes next, and that to me is sad, as I expected a high-end GPU from AMD to show up at the end of this year or the beginning of 2020. This only means I will sit on my temporary upgrade, a 580, for much longer than I expected.

Variable-rate shading and ray tracing are things I don't care about at all. The first reduces image quality; the second, done properly and not faked, as it is with for example the RTX library (yes, RTX is a simplified and in many aspects faked form of "ray tracing", and still kills performance), requires drastic changes to graphics rendering overall and tons of performance that we won't reach for many decades.
 
RX 5800 is not really confirmed, is it? AMD should be in a bit of a trouble trying to fit into 250W with big Navi especially considering RX 5700XT is a 225W card.
RDNA2 is more likely to be a next generation thing, RX 6000-series or whatever its eventual name will be.
I would be surprised if they have a larger Navi ready, and if they did, the TDP would probably be in the 300-350W range.
Also don't forget that RX 5700 was renamed "last minute", even some official photos displayed "RX 690".
 
Well, I welcome ray tracing from the Red Team. It's great that we all get faster and faster cards, but if the visuals don't change and developers just continue to bloat existing engines in a vain attempt to offer us 'new' experiences, we need those new technologies to take the next step.

Hope it makes sense I am all tired and dizzy :)
 
Wonder how this VRS works in practice. Hope it is nothing like NV DLSS blurriness cause that would be a disaster.
Well, NVIDIA already does it in the Wolfenstein games.
And why would that be blurry? It's not an image-reconstruction technique.
 
I continue to hope that we get a frame-rate improvement, rather than feature expansion.*
I'm running on an (admittedly top-of-the-line) card from a generation ago (1080 Ti); it's 2 years old. I would upgrade if I were to get a reliable, significant increase in framerates. A new feature that drops framerates? No.

Now, if you were to tell me I could have both, I'd consider it, but I'm not going to buy a card which is a 10-25% framerate improvement, for about 2x the price, of a 2 year old card.

*I'm almost convinced that VRS is the key piece I am hungry for. I've taken a hard look at a couple different samples, and if it works like I've seen, I'd certainly accept the image quality 'drop' for part of the screen, for the framerate improvements they have been touting.

And if that came with RR, meh. I wouldn't complain, but it wouldn't be the reason I buy a new card. Besides. HOW many games support hardware RayTracing so far? 7? 8? of which I'd actually think about paying for 2 or so of?
 
But, but Real time ray tracing are gimmicks!

—-certain fanbois


There is more than one way to skin a cat.


Hardware support with compressed vector tables to reduce the computational overhead of real time is one. Allow a CPU core or two to work out basic angle-dependent setup info, then hand that off, the same way we got angle-independent anisotropic filtering.

How many games support Ray tracing again? Physx hardware accelerated fluff still in the news? Overburden of Tesselation? Hair works?

Nvidia deserves the flak for what they do, just like AMD deserves so much shit it would take a bulldozer to move it, except it overheated with its "real men" cores.
 
Wider adoption of RT can only help the consumer, while VRS can help deliver better performance to the entire ecosystem, be it PC or consoles.
 
Microsoft is still out of touch with reality.

"Hey, want to one up ourselves with the dumbest naming since Xbox One and Xbox X?"
"Sure, let me hear it, Brain Dead Idiot Employee # 2!"
"Xbox Series X!"
"You did it. You crazy son of a bitch, you did it."

You know how they came up with the hardware specs? They just copied Sony leaks and rumors.
 
But, but Real time ray tracing are gimmicks!

—-certain fanbois

It heavily taxes the GPU (performance penalty).
Those features don't even change the way you game.
GIMMICKS! :nutkick:

:roll:
 
But, but Real time ray tracing are gimmicks!

—-certain fanbois

Eh, the thing is that Nvidia wanted to sell RTRT as if it was the Holy Grail of graphics. Because it wasn't all that much of an improvement and severely reduces FPS, it had mixed reception.


Yeah, it's too early. But at some point it had to arrive to consumer gaming space, either by Nvidia's hand or AMD's, or even Intel's. If anything, Nvidia now has more consumer feedback which will help enhance their RTRT implementation.
 
No Radeon card has much of any value till they get this done.
 
What good is "8K capable" and "ray traced" when you are blurring it with VRS?! What tards.
 
What good is "8K capable" and "ray traced" when you are blurring it with VRS?! What tards.
VRS is great; it's one of the more exciting new GPU features, in reality. It's a simple and tangible performance boost from diluting the parts of the scene you care less about in the grand scheme; why anyone would view that as a bad trade-off is beyond me. It's about utilizing resources where they can be put to best use. Plain horsepower means very little when you have no traction, which is exactly why drag cars do burnouts before they race: to warm those f*ckers up a little so they grip the road when they goose it, aka 3, 2, 1, punch it.

Eh, the thing is that Nvidia wanted to sell RTRT as if it was the Holy Grail of graphics. Because it wasn't all that much of an improvement and severely reduces FPS, it had mixed reception.



Yeah, it's too early. But at some point it had to arrive to consumer gaming space, either by Nvidia's hand or AMD's, or even Intel's. If anything, Nvidia now has more consumer feedback which will help enhance their RTRT implementation.
To be perfectly fair, it kind of is the holy grail of more realistic graphics; it's just widely considered premature by probably half a generation. They jumped the gun, but the software won't be available without the hardware being out at the same time, so this will help ease things in that direction. But no, RTRT today isn't the holy grail of graphics; it's just the primer coat before the real paint is applied, and then a few coats of clear coat go over the top of it. RTRT hardware is just the first stage of many additional coats; I mean, rasterization is still getting new paint jobs.
 
Last edited: