Monday, January 31st 2011
NVIDIA Developing Subpixel Reconstruction Antialiasing to Compete with MLAA
NVIDIA is developing its own antialiasing (AA) technology to rival morphological antialiasing (MLAA). Subpixel reconstruction antialiasing (SRAA), which NVIDIA is currently researching, aims to provide better image quality with a minimal performance penalty. It "combines single-pixel (1x) shading with subpixel visibility to create antialiased images without increasing the shading cost," as NVIDIA puts it in its research abstract. SRAA is suited to rendering engines that don't use multisample antialiasing (MSAA) because they rely on deferred shading. For such renderers, SRAA works as a post-processing layer, just like MLAA.
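To make the idea concrete, below is a minimal CPU-side sketch of this kind of subpixel reconstruction, written in Python/NumPy. It is not NVIDIA's implementation: it uses only per-sample depth as the geometric signal (the research also considers other geometric data such as normals), a fixed 3x3 neighborhood, and buffer names (color, depth_center, depth_sub) invented for this example.

```python
# A simplified sketch of subpixel reconstruction: blend 1x-shaded pixel-center
# colors into each subpixel visibility sample, weighted by geometric similarity
# (here, depth only), then box-filter the subpixel colors back to one pixel.
import numpy as np

def sraa_like_filter(color, depth_center, depth_sub, sigma=0.05):
    """Reconstruct an antialiased image from 1x shading plus subpixel depth.

    color        : (H, W, 3) float array of shaded colors, one sample per pixel
    depth_center : (H, W)    depth at each pixel center
    depth_sub    : (H, W, S) depth at S subpixel visibility samples per pixel
    sigma        : falloff controlling how strongly a depth difference
                   down-weights a neighboring shaded sample
    """
    H, W, S = depth_sub.shape
    out = np.zeros_like(color)

    for y in range(H):
        for x in range(W):
            pixel_accum = np.zeros(3)
            for s in range(S):
                d_s = depth_sub[y, x, s]
                weight_sum = 0.0
                color_accum = np.zeros(3)
                # Blend the 3x3 neighborhood of shaded (pixel-center) colors,
                # weighted by how geometrically similar each one is to this
                # subpixel sample; depth discontinuities keep colors from
                # bleeding across geometric boundaries.
                for dy in (-1, 0, 1):
                    for dx in (-1, 0, 1):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < H and 0 <= nx < W:
                            w = np.exp(-((d_s - depth_center[ny, nx]) / sigma) ** 2)
                            color_accum += w * color[ny, nx]
                            weight_sum += w
                pixel_accum += color_accum / max(weight_sum, 1e-8)
            # Average the reconstructed subpixel colors down to one pixel.
            out[y, x] = pixel_accum / S
    return out
```

A production version would run as a screen-space GPU pass, consuming the subpixel depth/normal buffers a deferred renderer can already produce, which is what makes this kind of filter a drop-in post-process like MLAA.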
Where SRAA differs from MLAA is that the new algorithm can better respect geometric boundaries and has a fixed runtime independent of scene and image complexity. SRAA benefits shading-bound applications. "For example, our implementation evaluates SRAA in 1.8 ms (1280x720) to yield antialiasing quality comparable to 4-16x shading. Thus SRAA would produce a net speedup over supersampling for applications that spend 1 ms or more on shading; for comparison, most modern games spend 5-10 ms shading. We also describe simplifications that increase performance by reducing quality," claims NVIDIA.
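As a rough illustration of the break-even claim in that quote, here is the implied back-of-envelope arithmetic, using the quoted 1.8 ms SRAA cost, the quoted 5 ms shading budget, and the low end (4x) of the quoted 4-16x range; the exact figures are illustrative only.

```python
# Back-of-envelope cost comparison implied by the quoted numbers.
shading_ms = 5.0         # typical per-frame shading cost quoted for modern games
sraa_ms = 1.8            # quoted SRAA post-process cost at 1280x720
supersample_factor = 4   # low end of the quoted 4-16x shading range

with_sraa = shading_ms + sraa_ms                       # 6.8 ms of shading + AA
with_supersampling = shading_ms * supersample_factor   # 20.0 ms of shading alone
print(f"SRAA: {with_sraa} ms vs supersampling: {with_supersampling} ms")
```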
Source:
NVIDIA Research
33 Comments on NVIDIA Developing Subpixel Reconstruction Antialiasing to Compete with MLAA
Excuse me?
This sort of tech needs to be on the 8800 series and up, not exclusive to just the latest cards.
The current driver (11.1) has removed all those oddities for me :)
It felt like they were opportunistically alluding to the AMD default texture quality blunder. While that wasn't cool on AMD's part, I feel they've been called out on it enough (and have since fixed it).
It's kind of like: "And one more thing...*kicks opponent in the balls when they aren't looking.*"
How tactful from a company whose hardware engineering is realistically slightly behind on average, even if its software and marketing teams are ahead. I still remember the 3DMark03 tricks, before they started (insert PC phrase) companies into developing toward their hardware differences/strengths, à la Heaven/HAWX2 this go-round and PhysX/AA the last. Let's not forget running Vantage physics scores on the GPU (when the test isn't using the GPU for anything else) despite being intended for a CPU, after conveniently buying the company that owned the API. There are many more hardware/software examples of unrepresentative performance/quality and underhanded tactics.
Not trying to be a troll or start a combative thread; nVIDIA has a good lineup with great features at admirable cost/performance ratios, and the 600 series will likely be the same. Just putting it out there that they don't need to lower quality settings for performance when their settings/features are almost always the optimized default, and even then there are tricks. Just because the tactic is different doesn't make it any less bullshit.
The Radeon 9600 Pro was also able to use Adaptive AA, but it was never officially supported.
The GeForce 6600 GT was also able to use Transparency AA, but it was never officially supported.
So even if the GeForce 2 could do it, they won't enable it for existing users, just like AMD doesn't officially support MLAA for HD5000 users even though the cards can easily do it.
So yes, GTX 600 series...
This is a screenie of my HD5850 :)
For all intents and purposes, this idea of nVidia's is not a bad one.
Their software/driver department is huge; I think they can easily pull it off (including making the 8 series capable of Subpixel Reconstruction Antialiasing... heck, even the 6 series, since it's capable of programmable shaders lol).
Meaning they do not depend on 3rd-party developers, and as such don't need the attention of the entire user base. It would be nice to have SRAA at least on the GeForce 400 series and above, but I don't think it will happen.
Or should I....?? (5870 user)
Is MSAA still the best antialiasing in terms of picture quality? I couldn't care less about the performance impact.
It just sucks that it comes with the biggest performance hit :(
Perhaps wait for this month's release instead?