VRS is a technology I've wanted for 10 years, though not as a way to reduce detail in parts of the scene, only as a way to improve select parts. I think this technology has great potential, but like many other advanced techniques, it needs to be used right, otherwise the end result is bad.
Let's say you have a scene with a nice landscape in the lower half of the screen, and a sky (just a skydome or skybox) in the upper half. You might think that rendering the upper half with far fewer samples would be a good way to optimize away wasteful samples. But the truth is that low-detail areas like skies are very simple to render in the first place, so you will probably end up with a very blurry area and marginal performance savings.
To make matters worse, this will probably only increase the frame rate variance (if not applied very carefully). If you have a first-person game where you walk across a landscape, looking straight up or down will result in very high frame rates, while looking straight ahead into an open landscape will give low performance. Even if you don't do any particularly fancy LoD algorithms, the GPU is already pretty good at culling off-screen geometry, and I know from experience that trying to optimize away any "unnecessary" detail can actually increase this frame rate variance even more.
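To put the scenario in concrete terms, here is roughly what that per-pass setup would look like with D3D12's per-draw (Tier 1) VRS. This is a minimal sketch, not any engine's actual code; `DrawSky` and `DrawLandscape` are hypothetical stand-ins for real render passes:

```cpp
// Minimal sketch of the sky/landscape scenario, assuming D3D12 Tier 1
// (per-draw) VRS on an ID3D12GraphicsCommandList5. DrawSky/DrawLandscape
// are hypothetical stand-ins for real render passes.
#include <d3d12.h>

void DrawSky(ID3D12GraphicsCommandList5*);       // hypothetical sky pass
void DrawLandscape(ID3D12GraphicsCommandList5*); // hypothetical terrain pass

void DrawScene(ID3D12GraphicsCommandList5* cmdList)
{
    // Sky: shade once per 2x2 pixel block. Since a skydome is cheap to
    // shade anyway, this is exactly where the savings tend to be marginal.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);
    DrawSky(cmdList);

    // Landscape: back to full-rate (1x1) shading.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr);
    DrawLandscape(cmdList);
}
```

On Tier 2 hardware you'd instead feed the rasterizer a per-tile shading-rate image via RSSetShadingRateImage, which is what arbitrary screen regions (rather than whole draw calls) would actually need.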
I think you used a poor example, because it's unlikely that scenario would be applied, or only sparingly. As far as frame time variance is concerned, the hardware remains the same in either case; it's simply prioritizing render tasks a bit differently within the GPU. If anything, VRS could be used to improve frame time variance in worst-case scenarios by selectively switching a few things to lower quality when frame rates dip below certain FPS trigger thresholds, until they normalize. Sure, it could get used poorly, but it could get used very well at the same time, and having it as an option doesn't hurt.
Here's an example: the GPU recognizes the frame rate is below 60 FPS, or say below 30 FPS, which is even worse since input lag gets really bad really quickly below that point. When the trigger point kicks in, AA stays at whatever high-quality setting you chose for 75% of the screen, and the other 25% gets set lower until the frame rate normalizes. Frame rate variance improves at the cost of a temporary image quality reduction, but in the grand scheme that's perhaps a good trade-off given the scenario described. The same idea could be applied to more than AA, like shading, lighting, and geometry, as well as other things. It boils down to how it gets used and applied, but VRS has the promise of improving both quality and performance in variable ways. It just depends on how it gets injected into the render pipeline.
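Sketching that trigger logic out, using the shading rate itself as the quality knob since that's what VRS controls directly. The thresholds, the hysteresis band, and the center/periphery split below are all illustrative assumptions, not a real driver feature:

```cpp
// Illustrative sketch of an FPS-triggered VRS fallback, assuming D3D12
// Tier 1 (per-draw) VRS. All names and thresholds here are hypothetical.
#include <d3d12.h>

void DrawCenterRegion(ID3D12GraphicsCommandList5*);    // hypothetical: ~75% of the screen
void DrawPeripheryRegion(ID3D12GraphicsCommandList5*); // hypothetical: the other ~25%

constexpr double kTriggerMs = 1000.0 / 60.0; // coarsen when slower than 60 FPS
constexpr double kRestoreMs = 1000.0 / 72.0; // restore only when comfortably faster

// Pick the shading rate for the periphery, with a hysteresis band so the
// quality doesn't flicker when frame times hover around the threshold.
D3D12_SHADING_RATE PickPeripheryRate(double frameMs, D3D12_SHADING_RATE current)
{
    if (frameMs > kTriggerMs) return D3D12_SHADING_RATE_2X2; // over budget
    if (frameMs < kRestoreMs) return D3D12_SHADING_RATE_1X1; // normalized
    return current;                                          // in between: hold
}

void RenderFrame(ID3D12GraphicsCommandList5* cmdList, double lastFrameMs)
{
    static D3D12_SHADING_RATE peripheryRate = D3D12_SHADING_RATE_1X1;
    peripheryRate = PickPeripheryRate(lastFrameMs, peripheryRate);

    // The central region always shades at full rate.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr);
    DrawCenterRegion(cmdList);

    // The periphery is the part that temporarily drops quality under load.
    cmdList->RSSetShadingRate(peripheryRate, nullptr);
    DrawPeripheryRegion(cmdList);
}
```

The gap between the trigger and restore thresholds matters: without that hysteresis band, the periphery would visibly pop between rates every time the frame time hovered around 16.7 ms.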
In the VRS video linked earlier, it does get blurry when the resolution is turned way down. So the image reconstruction technique you mention here is simply pointless.
I'm more worried about the implementation of this, so that it won't end up like NV's DLSS. Nvidia does it? You mean Nvidia is using it.
Plenty of video streams do variable rate adjustments like that based on download speed when there's traffic congestion. I honestly wouldn't mind a bit of selective DLSS smearing temporarily if my FPS dipped below a threshold I determined; it beats choppy frame rates and sloppy input lag.
Well, better go back to buying cards that can't do any of it.
Isn't it good to have a choice...
About that 1%... look how many triple-A games out now or announced for 2020 have RTX support.
Blurry and noisy? Depends.
This is RTX + DLSS in Control:
[Attachment 139363: screenshot of RTX + DLSS in Control]
That scene looks like it's been post-processed with a Charmin Ultra Soft soap opera effect. I cannot in good faith say I'm fond of the look. My vision isn't even 20/20, I'm blind as a bat without glasses, but that would drive me nuts personally, so I'd hate to think what people with good vision make of that dull mess. It practically looks like an mClassic upscaling a 720p console game; in terms of texture detail it's horrible, quite frankly. The quality simply isn't there. Bottom line, you can't make a Blu-ray quality video out of a DVD-ROM, which is also true of all the sacrifices made for RTRT: cutting down the number of light passes and denoising the result just to run ray tracing poorly at an unsteady frame rate.
[Attachment 139366: screenshot]
Do tell, what has your user experience/utilisation of RTRT been over the course of owning your RTX card? Do you believe you will fully take advantage of its RTRT capabilities before replacing it with a next-gen Nvidia/AMD card that can actually handle it?
P.S. If it's a gimmick at present, there's nothing wrong with replacing it early.
Let's not kid ourselves here, next-generation Nvidia/AMD cards won't be tricking anyone's eyes into thinking RTRT Crysis is real life either.
Hell yeah, how dare I!?!?!?!
Oh wait:
I guess it is too much to expect users on a tech forum to have a basic understanding of the underlying technology...
Speak of the devil, or close enough. Actually, that demo was one of the better examples of RTRT-type effects, aside from that staged Star Wars demo Nvidia did that never materialized into actual playable games, go figure, who would've guessed it. Crytek did a pretty decent job, though definitely not perfect, simply because single-GPU hardware won't get us to GoPro Hero realism at this point in time. We've still got a ways to go before we reach that point.
This goes back to the comment I quoted. I'm probably one of those certain people claimed to have said "RT is a gimmick", but that is out of context. How Nvidia is implementing it is a gimmick; no matter how you cut it, with the current batch of cards it's not practical or useful, therefore a gimmick. RT itself was never thought or said to be superfluous by me personally. With that out of the way, RT will probably get there eventually, but I feel that had Nvidia not snatched it up, we would have seen acceptable RT sooner, because the collaborative element has been removed: to work on RT from that point, you either develop your own from scratch or go through Nvidia and use their hardware.
You make a bit of a good point: RTRT could be viewed as a preemptive **** block attempt by Nvidia on ray tracing with developers, which will ultimately slow the progression of ray tracing. No one wants another HairWorks or PhysX scenario down the road for ray tracing, but that could be right where things are headed. Luckily AMD is in the next-gen consoles, so we might avoid that scenario; a good chess move follow-up by Lisa Su. I'm sure RTRT will improve in the coming years and heat up further, but at this stage it's safe to call it a bit of a gimmick given how it both looks and performs; neither is optimal, and both need tons more polish before people consider them high quality and highly desirable. I don't think too many people bought RTX cards for RTRT alone, but rather for both performance/efficiency and the RTX features that include RTRT among other tech, like mesh shading and DLSS.
Turing packs all this tech already, and it can only improve once Nvidia goes down to 7nm too. RTRT may realistically remain years off, but things like VRS coming to next-gen consoles can certainly offer some nice benefits as detail and resolution go up.
Pretty much agreed. Nvidia moving to 7nm will certainly bring about further advancements, though so too will AMD moving to 7nm EUV and increasing its GPU division's R&D budget over time as it continues to pay down debt from the ATI merger of years past. Intel's CPU stumbles will only benefit AMD, especially given AMD's higher focus on the CPU side at present. AMD is definitely in a good position to shift gears and focus, or switch from 2WD to 4WD, at any point between CPU and GPU, so that's a good thing; its worst days appear to be behind it. AMD has its work cut out for it ahead, especially on the GPU side of things, but I think they'll inch their way forward and regain market share from Nvidia over the coming years. I do believe a stronger R&D budget and less debt will make a big difference in their overall competitiveness. Plus, Intel's stumbles should help, and those security stumbles could hurt Intel a lot; they won't just be forgotten given their scale, which keeps getting deeper.