Monday, March 18th 2019

Microsoft Paves the Way for Industry-Wide Adoption of Variable Rate Shading

Microsoft today announced via a devblog its push to make Variable Rate Shading an industry-wide standard, in search of performance gains that can support the pixel counts and pixel quality of future games. The post opens with what is likely the foremost question on the mind of any discerning user who hears of a performance-improving technique: "does it degrade image quality?". And the answer, it seems, is no: there are no discernible image quality differences between the Variable Rate Shading portion of the image and the fully rendered one. We'll let you judge for yourself, though: analyze the image below and cast your vote in the poll.

As resolution increases, so does the amount of work any given GPU has to do to generate a single frame - consider the jump from rendering a 1080p game at 30 FPS to a 4K game at 60 FPS. It stands to reason that ways of squeezing the most performance out of a given process are at a premium, particularly in the console space, where cost concerns dictate more mainstream-equivalent hardware, which in turn requires creative ways of bridging the gap between the desired image quality and the rendering time available for each frame.
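The scale of that jump is easy to quantify. A back-of-the-envelope sketch (counting only pixels shaded per second, and ignoring everything else a GPU does per frame):

```python
def pixels_per_second(width: int, height: int, fps: int) -> int:
    """Raw pixel-shading throughput a given resolution/frame-rate target demands."""
    return width * height * fps

base = pixels_per_second(1920, 1080, 30)    # 1080p at 30 FPS
target = pixels_per_second(3840, 2160, 60)  # 4K at 60 FPS

print(target / base)  # 8.0 -> the 4K/60 target demands 8x the pixel throughput
```

Eight times the pixel throughput for the same frame, before any per-pixel quality improvements are even considered - hence the appetite for techniques that shade fewer pixels without visibly degrading the result.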
We've already covered Variable Rate Shading at length, both in its NVIDIA Turing debut and in the AMD patent application that aims to implement a similar feature. This time it's Microsoft, bringing Variable Rate Shading to DX12 so that developers can easily take advantage of the feature in their games. According to Microsoft, integrating VRS through DX12 should take developers no more than a few days of work, while freeing up some 14% of performance that can be spent on other rendering efforts, such as higher resolution, higher target frame rates, or other, more relevant image quality improvements.
Microsoft details three ways for developers to integrate the technology into their rendering engines: per draw; within a draw, using a screen-space image; or within a draw, per primitive. This lets developers mix and match the implementation that best suits their engine. At the time of the announcement, Microsoft said that Playground Games and Turn 10 (Forza), Ubisoft, Massive Entertainment (The Division, Avatar), 343 Industries (Halo), Stardock, IO Interactive, Activision, Epic Games and Unity had all signaled their intention of adding VRS to their game engines and upcoming games. That most of these are affiliated with Microsoft's push isn't surprising: remember what we said earlier, that the console development space (and VR) is where these technologies are needed most.
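The source of the savings is straightforward to model. VRS lets a region of the screen be shaded at coarse rates such as 2x1 (one pixel-shader invocation per two pixels) or 2x2 (one per four), while detail-critical regions stay at full 1x1 rate. A toy cost model, with made-up screen fractions for illustration (these are not Microsoft's numbers):

```python
# Pixel-shader invocations per pixel for each VRS shading rate:
# 1x1 = full rate, 2x1 = one invocation per 2 pixels, 2x2 = one per 4.
RATE_COST = {"1x1": 1.0, "2x1": 0.5, "2x2": 0.25}

def relative_shading_cost(coverage: dict) -> float:
    """Shader invocations relative to full-rate rendering, given the
    fraction of the screen shaded at each rate (fractions sum to 1)."""
    return sum(frac * RATE_COST[rate] for rate, frac in coverage.items())

# Hypothetical screen-space-image split: the detail-heavy half at full
# rate, 30% at half rate, 20% (sky, motion-blurred edges) at quarter rate.
cost = relative_shading_cost({"1x1": 0.5, "2x1": 0.3, "2x2": 0.2})
print(round(cost, 2))  # 0.7 -> 30% fewer pixel-shader invocations
```

Since pixel shading is only one part of each frame's work, the whole-frame gain (such as the 14% Microsoft quotes) is smaller than the raw invocation savings.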

PS: The left part of the image is the fully rendered one; the right part is the VRS-powered one, rendered with a 14% performance increase.
Source: Microsoft

23 Comments on Microsoft Paves the Way for Industry-Wide Adoption of Variable Rate Shading

#1
kastriot
This is just a temporary solution until next-gen GPUs arrive, but it will help a little, I guess.
#2
bug
kastriotThis is just a temporary solution until next-gen GPUs arrive, but it will help a little, I guess.
Not really. If it eases the load on the GPU by 20-30%, you can use that to enable better shadows, AA, AF or whatever, no matter how fast your card becomes. Already we're seeing rage at any new monitor launch that's not a 144Hz panel, I suspect when we'll all be playing at 4k@240Hz, people will still want more. Next gen GPUs will do little to mitigate that.
#3
Unregistered
As generational increases in hardware performance shrink, software optimization grows in importance. This seems like the next step after primitive discard for increasing performance.

Reads like something the Stardock guys would develop.
#4
PanicLake
kastriotThis is just a temporary solution until next-gen GPUs arrive, but it will help a little, I guess.
I hope no one with your logic is in charge of developing games...
#5
Fluffmeister
Nice tech, Turing already supports it and it's clear AMD are going to play follow the leader.

Win win.
#6
bug
FluffmeisterNice tech, Turing already supports it and it's clear AMD are going to play follow the leader.

Win win.
Oh no, you didn't :laugh:
#9
INSTG8R
Vanguard Beta Tester
FluffmeisterWell yeah so what?
Nobody’s leading/following anyone here. And with this announcement, it seems everybody’s on board.
#10
bug
FluffmeisterWell yeah so what?
I believe he was trying to say AMD is already in this game ;)
#13
Fluffmeister
INSTG8RNah just don’t care for your cheerleading.
I don't care for your constant tears, but here we are.
#14
INSTG8R
Vanguard Beta Tester
FluffmeisterI don't care for your constant tears, but here we are.
But isn’t that what your intended troll was about? Sorry to spoil your party. :rolleyes:
#15
kastriot
GinoLatinoI hope no one with your logic is in charge of developing games...
Truth is majority of game devs make optimisations all the time so your logic is flawed.
#16
Fluffmeister
INSTG8RBut isn’t that what intended troll was about? Sorry to spoil your party. :rolleyes:
Like I said, damn, you're easily upset; at least your girl Vya has your back.
#17
danbert2000
Doing this sort of selective shading makes sense in VR, where you can't even see the periphery very well, but I'm worried that we are going down a bad path with checkerboard rendering, TAA, DLSS, and variable rate shading. If you roll all of this tech together just to say it's 4k60fps, but it doesn't even look as good as upscaled 1440p, what's the point?

I'll have to look at some more comparisons on my gaming TV, because I can't tell crap from a compressed jpeg.
#18
rtwjunkie
PC Gaming Enthusiast
danbert2000Doing this sort of selective shading makes sense in VR, where you can't even see the periphery very well, but I'm worried that we are going down a bad path with checkerboard rendering, TAA, DLSS, and variable rate shading. If you roll all of this tech together just to say it's 4k60fps, but it doesn't even look as good as upscaled 1440p, what's the point?

I'll have to look at some more comparisons on my gaming TV, because I can't tell crap from a compressed jpeg.
I’m with you. I’m hesitant for anything that reduces my IQ, so I’m going to have to see much more of this in detail before I start jumping up and down and spitting nickels.
#19
DeathtoGnomes
If this tech is something that can take a crap monitor and make it look like a million bucks, I'm all for it. My biggest gripe is that game developers will do what they always do: take shortcuts and ruin it for the folks who can't afford big 4K gaming TVs.
#20
Rockarola
FluffmeisterLike I said, damn, you're easily upset; at least your girl Vya has your back.
Hey Fluffy, keep your trolling to the bedroom, mkay?
You are boring, repetitive and unimaginative...that crap belongs in your bedroom, not here!
#21
Fluffmeister
RockarolaHey Fluffy, keep your trolling to the bedroom, mkay?
You are boring, repetitive and unimaginative...that crap belongs in your bedroom, not here!
You stink of Vaseline too.
#22
Kursah
Might be time for at least a couple of you to find something else to do that isn't trashing the forums. Keep the personal jabs and BS to yourselves. Keep it on topic and within our guidelines or move along.
#23
PanicLake
- A guy says something that isn't optimization oriented.
- I say that I hope people with that logic won't be in charge of developing a game
- and you say:
kastriotTruth is majority of game devs make optimisations all the time so your logic is flawed.
- so my response is... eh?