Friday, December 13th 2019

Ray Tracing and Variable-Rate Shading Design Goals for AMD RDNA2
Hardware-accelerated ray tracing and variable-rate shading will be the design focal points of AMD's next-generation RDNA2 graphics architecture. Microsoft's reveal of its Xbox Series X console attributed both features to AMD's "next generation RDNA" architecture (which logically happens to be RDNA2). The Xbox Series X uses a semi-custom SoC that features CPU cores based on the "Zen 2" microarchitecture and a GPU based on RDNA2. It's highly likely that the SoC will be fabricated on TSMC's 7 nm EUV node, for which the RDNA2 graphics architecture is optimized; this would mean an optical shrink of "Zen 2" to 7 nm EUV. Besides the SoC that powers the Xbox Series X, AMD is expected to leverage 7 nm EUV in 2020 for its RDNA2 discrete GPUs and for CPU chiplets based on its "Zen 3" microarchitecture.
Variable-rate shading (VRS) is an API-level feature that lets GPUs conserve resources by shading certain areas of a scene at a lower rate than others, without perceptible difference to the viewer. Microsoft developed two tiers of VRS for its DirectX 12 API: tier-1 is currently supported by NVIDIA's "Turing" and Intel's Gen11 architectures, while tier-2 is supported only by "Turing." The current RDNA architecture doesn't support either tier. Hardware-accelerated ray tracing is the cornerstone of NVIDIA's "Turing" RTX 20-series graphics cards, and AMD is catching up to it; Microsoft has already standardized it on the software side with the DXR (DirectX Raytracing) API. A combination of VRS and dynamic render resolution will be crucial for next-gen consoles to achieve playability at 4K, and to even boast of being 8K-capable.
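For context on how these two features surface to game developers, here is a minimal sketch (an assumption of typical D3D12 usage, not something taken from AMD's or Microsoft's RDNA2 material) of how an engine built against the Windows 10 SDK can query a device for its DXR ray-tracing tier and its VRS tier:

```cpp
// Sketch: querying a D3D12 device for DXR and VRS support.
// Assumes an already-created ID3D12Device5 in `device`; error handling trimmed.
#include <d3d12.h>

void ReportRayTracingAndVrsSupport(ID3D12Device5* device)
{
    // Hardware-accelerated ray tracing is exposed through D3D12_OPTIONS5.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5))) &&
        opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
    {
        // DXR is available; ray-tracing pipeline state objects can be created.
    }

    // Variable-rate shading tiers are exposed through D3D12_OPTIONS6.
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts6 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                              &opts6, sizeof(opts6))))
    {
        // TIER_1: per-draw shading rate. TIER_2: adds per-primitive and
        // screen-space (image-based) shading rates.
        D3D12_VARIABLE_SHADING_RATE_TIER tier = opts6.VariableShadingRateTier;
        (void)tier;
    }
}
```

Tier-1 lets the shading rate be set per draw call, while tier-2 adds per-primitive and screen-space shading-rate images; that is the distinction behind the article's note that Gen11 stops at tier-1 while "Turing" supports tier-2.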
119 Comments on Ray Tracing and Variable-Rate Shading Design Goals for AMD RDNA2
www.cryengine.com/news/view/crytek-releases-neon-noir-a-real-time-ray-tracing-demonstration-for-cryengine#comments/page/4
I'm more worried about the implementation of this, so that it won't end up like NV's DLSS. Nvidia does it? You mean Nvidia's using it.
Obviously.
Because once we get into full throttle RT "non-RT cards are not supported" some time in 2025, today's card will absolutely be adequate to run it.
Apparently. Low ray-count ray tracing, producing a hell of a noisy image that gets heavily "denoised" to produce a handful of effects in otherwise traditionally rasterized scenes... is not a gimmick?
Because, let me guess, it has "RT" and "real time" in it?
Clearly, only fanbois would disagree with it!
Exciting times!
isn't it good to have a choice...
About that 1%... look how many triple-A games out now or announced for 2020 have RTX support.
Blurry and noisy? Depends.
This is RTX+DLSS in Control.
Do tell, what has been your user experience/utilisation of RTRT over the course of owning your RTX card? Do you believe you will fully take advantage of its RTRT capabilities before replacing it with a next-gen Nvidia/AMD card that can actually handle such?
P.S. If it's a gimmick at present there is nothing wrong with doing so.
Oh wait:
VRS, on the other hand, is a potential framerate improvement, leaving us with a "better" state, provided it doesn't significantly impact image quality. Why would anyone NOT want this, IF it works as advertised?
Until Crytek implements DXR there is no direct comparison. So far, Neon Noir running on Vega 56/64 is about on par with it running on a GTX 1080. Anything DXR cards can do (mainly a considerably larger number of rays) is on top of that. If you want a comparison, check the differences between GTX and RTX cards in Battlefield V with DXR - the RTX 2060 is generally on par with the GTX 1080, so it's a direct enough comparison, and it employs DXR for the same effect that Neon Noir achieves with its RT shaders. VRS is in DX12, and both Nvidia and Intel have this capability deployed. I believe Nvidia also has OpenGL and Vulkan extensions available for it; not sure about Intel.
VRS is not an image reconstruction technique. It does reduce image quality in parts of the image but option of using VRS is purely and entirely up to developer. When used well - in parts of screen that do not benefit from more details and quality lowered to acceptable degree - it provides a small but measurable performance boost for minimal image quality penalty.
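As an illustration of that kind of developer control, here is a hedged sketch (assuming a VRS tier-1 capable device and an ID3D12GraphicsCommandList5; the pass names are hypothetical) of lowering the shading rate only for work that doesn't benefit from full detail:

```cpp
// Sketch of per-draw (tier-1) VRS in D3D12; assumes `cmdList` is recorded on
// VRS-capable hardware. Which passes can tolerate a coarse rate is entirely
// the developer's call, as noted above.
#include <d3d12.h>

void DrawWithCoarseBackground(ID3D12GraphicsCommandList5* cmdList)
{
    // Shade low-detail work (e.g. a blurred background or particle pass)
    // once per 2x2 pixel block: roughly 4x fewer pixel-shader invocations.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);
    // ... draw calls for the low-detail pass ...

    // Restore full-rate shading for geometry the viewer actually focuses on.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr);
    // ... draw calls for the main pass ...
}
```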
Back in the day I owned an Asus Mars II, which was the Bitchin'fast! 3D2000 personified. It was a 365 W TDP, triple-slot, 3x 8-pin monstrosity, and yes, it had over 20,000 BungholioMarks.
I had no problem with heat, and it lasted over 3 years. Good times.
Turing packs all this tech already, and can only improve once Nvidia goes down to 7 nm too. RTRT may realistically remain years off, but things like VRS coming to next-gen consoles can certainly offer some nice benefits as detail and resolution go up. AMD is playing catch-up, but there's no need to get too butt-hurt, people.
Here's an example: the GPU recognizes the frame rate is below 60 FPS, or say below 30 FPS, which is even worse (input lag gets really crappy really quickly below that point). AA stays set for 75% of the screen resolution at whatever high-quality setting you determine, and the other 25% gets set lower when the trigger point kicks in, until the frame rate normalizes. Frame-rate variance improves for a bit of temporary image-quality reduction, which in the grand scheme is a good trade-off given the scenario described. That could be applied to more than AA - shading, lighting, and geometry, as well as other stuff. It boils down to how it gets used and applied, but VRS has the premise of improving quality and performance, both in variable ways; it just depends on how it gets injected into the render pipeline. Plenty of video streams do similar variable-rate adjustments based on download speed and traffic congestion. I honestly wouldn't mind a bit of selective DLSS smearing temporarily if my FPS dipped below a frame-rate threshold I determined - it beats choppy frame rates and sloppy input lag.

That scene looks like it's been post-processed with a Charmin Ultrasoft soap-opera effect. I cannot in good faith say I'm fond of the look. My vision isn't even 20/20 - I'm blind as a bat without glasses - but that would drive me nuts personally, so I'd hate to think what people with good vision make of that dull mess. It practically looks like an mClassic upscaling a 720p console game; in terms of texture detail it's horrible, quite frankly. The quality simply isn't there. Bottom line, you can't make Blu-ray-quality video out of a DVD-ROM, which is also true of all the sacrifices made for RTRT - lowering the number of light passes and denoising - only to run it poorly at an unsteady frame rate. Let's not kid ourselves: the next generation of Nvidia/AMD cards won't be tricking anyone's eyes into thinking RTRT Crysis is real life either.

Speak of the devil, or close enough: that demo was actually one of the better examples of RTRT-type effects, aside from that staged Star Wars demo Nvidia did that never materialized into actual playable games - go figure, who would've guessed it. Crytek did a pretty decent job, though definitely not a perfect one, simply because single-GPU hardware won't get us to GoPro Hero realism at this point in time; we've got a ways to go before we reach that point. You make a bit of a good point that RTRT could be viewed as a preemptive **** block attempt by Nvidia on ray tracing with developers, which will ultimately slow the progression of ray tracing. No one wants another HairWorks or PhysX scenario down the road for ray tracing, but that could be right where things are headed. Luckily AMD is in the next-gen consoles, so we might avoid that scenario - a good chess-move follow-up by Lisa Su. I'm sure RTRT will improve and heat up further in the coming years, but at this stage it's safe to call it a bit of a gimmick given how it both looks and performs; neither is optimal, and both need tons more polish before people consider them high quality and highly desirable. I don't think too many people bought RTX cards for RTRT alone, but rather for both performance/efficiency and RTX features that include RTRT among other tech like mesh shading and DLSS.
Pretty much agreed. Nvidia moving to 7 nm will certainly bring about further advancements, but so will AMD moving to 7 nm EUV and increasing its GPU division's R&D budget over time as it continues to pay down debt from the ATI merger of years past. Intel's CPU stumbling will only benefit AMD, especially given AMD's higher focus on the CPU side at present. AMD is definitely in a good position to shift gears and focus, or switch from 2WD to 4WD, at any point between CPU and GPU, so that's a good thing; its worst days appear to be behind it. AMD has its work cut out for it, especially on the GPU side of things, but I think it'll inch forward and regain market share from Nvidia over the coming years. I do believe a stronger R&D budget and less debt will make a big difference in its overall competitiveness, plus Intel's stumbles should help, and those security issues could hurt Intel a lot - they won't just be forgotten, given their scale keeps getting deeper.
Well, thanks for the elaborate description.
I don't know why it looks like that to you, maybe you do need glasses after all. :roll:
350 W is pushing it in terms of cooling without terrible noise levels.

I think you missed the point. The hardware is of course the same; the variance is in the workload. It is certainly possible to build an algorithm that uses performance metrics from previous frames and dynamically adjusts LoD on the fly - I have even looked into implementing something like that once. The issue is that you have to rely on the performance metrics of the last ~10 frames, so any adjustment happens after the performance has already changed and for this reason will not reduce stutter. The best approach is to reduce the variance preemptively.
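As a purely conceptual sketch of what such a feedback loop could look like (all names and thresholds here are hypothetical, not from any shipping engine), a rolling average of recent frame times can pick a coarser shading rate when the last ~10 frames ran long:

```cpp
// Hypothetical sketch: choose a coarser VRS rate when the rolling average of
// recent frame times exceeds the target. Thresholds are illustrative only.
#include <d3d12.h>
#include <deque>
#include <numeric>

class AdaptiveShadingRate
{
public:
    explicit AdaptiveShadingRate(float targetMs) : targetMs_(targetMs) {}

    // Call once per frame with the measured frame time in milliseconds.
    D3D12_SHADING_RATE Update(float frameTimeMs)
    {
        history_.push_back(frameTimeMs);
        if (history_.size() > 10) history_.pop_front();   // keep ~10 frames

        const float avg = std::accumulate(history_.begin(), history_.end(), 0.0f)
                          / static_cast<float>(history_.size());

        // The rate only changes after the average drifts past the target,
        // which is exactly the lag described above: the correction always
        // trails the load that caused it.
        if (avg > targetMs_ * 1.25f) return D3D12_SHADING_RATE_2X2;
        if (avg > targetMs_)         return D3D12_SHADING_RATE_2X1;
        return D3D12_SHADING_RATE_1X1;
    }

private:
    float targetMs_;
    std::deque<float> history_;
};
```

The returned rate would then be fed into RSSetShadingRate for the next frame, so any stutter has already happened by the time the coarser rate takes effect.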
I stand by my claim that it can be used poorly, resulting in blurry scenes and in worst case flickering or artifacts.
Ironic.
Let me elaborate: not only is RTX a gimmick at this point (a pathetic number of rays, capable of producing noisy shadow/reflection-like effects with heavy de-noising), it is absolutely not clear how this area will develop. Whatever it will be, as with FreeSync, it won't be NV alone deciding it; heck, AMD alone could, as, wait for it:
1) AMD commands 35% of the GPU market (and is poised to grab more), but also
2) 100% of the console market (the Switch is in no way capable of RT-ing anyhow), which is expected to roll out next-gen consoles with GPUs at 2070/2080-ish levels
Last, but not least, the screenshot you have shared makes me smile. Looks like generic adventure to me.
Yeah, devs were able to gimmick reflections/shadows way before RT-ing, it's just about effort to implement it (must be much easier with RT). It will depend on the price.
Greediness... greatness in a less subtle way?
Would you mind linking it? I have an itch to read it verbatim.
you're taking longer and longer to catch on.
but www.awn.com/news/sgi-demos-integration-art-vps-ray-tracing-hardware
You will find SGI had it ages ago, for CAD, not really for gaming.
Pretty sure Sun had real-time ray tracing as well...
First-gen hardware/software tends to suck until it gets momentum.
An open standard would be nice.
Since consoles will be using AMD, and consoles drive PC gaming, ray tracing is just a gimmick at the moment unless some killer app has it.
Anyone else remember standalone physics cards? Wank factor 99%. There were some titles you could run that were cool, but other than that it was a waste of money.
VRS is a rational thing to do; RT is more buzzwordy hype, like VR was. It might take off in the next year or two, or much later, we'll see.
Of the goals AMD has next year, RT is certainly not a high priority.