Tuesday, September 21st 2021
NVIDIA Prepares to Deliver Deep Learning Anti-Aliasing Technology for Improved Visuals, Coming first to The Elder Scrolls Online
Some time ago, NVIDIA launched its Deep Learning Super Sampling (DLSS) technology to deliver AI-enhanced upscaled images in your favorite AAA titles. It uses proprietary algorithms developed by NVIDIA and relies on the computational power of the Tensor cores found in GeForce graphics cards. In the early days of DLSS, NVIDIA talked about an additional technology called DLSS 2X, which was supposed to be based on the same underlying techniques as DLSS but perform only image enhancement, without any upscaling. That technology got its official name today: Deep Learning Anti-Aliasing, or DLAA for short.
DLAA uses technology similar to DLSS and aims to bring NVIDIA's image-enhancement tech to video games. It uses the Tensor cores found in GeForce graphics cards to provide much better visual quality without sacrificing performance, as it runs on dedicated cores. The technology will reportedly be offered alongside DLSS and other anti-aliasing options in in-game settings. The first game to support it is The Elder Scrolls Online, which now has it on the public beta test server; it will be available to the general public later on.
Source: via Tom's Hardware
50 Comments on NVIDIA Prepares to Deliver Deep Learning Anti-Aliasing Technology for Improved Visuals, Coming first to The Elder Scrolls Online
Why do the same thing twice?
Or is it a cut-down version of DLSS that focuses on edges only, so it is much easier to implement?
I'm not really a fan of upscaling, as you're basically downgrading a ton of settings that scale with internal resolution. While that isn't a problem at 4K, where the internal resolution is high enough even when upscaling, it's certainly noticeable at lower resolutions, especially with RT effects. No upscaling technique can upscale effects, so you're always stuck with low-res effects if you use it. If DLAA works like they initially announced it, it's just DLSS with a render scale of 100% or higher.
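The "DLSS with a render scale of 100%" point can be sketched roughly. This is an illustration of the commenter's interpretation, not NVIDIA's actual pipeline; the function name and the 0.5 "Quality" scale are assumptions for the example.

```python
def internal_resolution(mode, native=(3840, 2160), scale=0.5):
    """Return the resolution the game renders at internally before the
    deep-learning pass. Hypothetical sketch: DLSS renders below native
    and upscales; DLAA renders at full native and only anti-aliases,
    so it costs performance instead of saving it."""
    if mode == "DLSS":
        # e.g. "Quality" mode rendering at half resolution per axis
        return (int(native[0] * scale), int(native[1] * scale))
    if mode == "DLAA":
        # render scale of 100%: full-resolution input, no upscaling win
        return native
    raise ValueError(f"unknown mode: {mode}")
```

In both modes the output is native resolution; the difference is only what the network receives as input, which is why DLAA improves image quality rather than frame rate.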
@nguyen can you explain? It's getting confusing as **** LOL
Good steps, zero strides. It's still going to be interesting to see how this pans out beyond the initial wave of support. This. We've been here a dozen times before. Nothing proprietary lasts longer than a few generations beyond its introduction. Even G-Sync, which doesn't even require dev support, didn't last and got effectively replaced by something universal.
The real thing is going to happen when all three GPU families have 'cores' to do the work and can somehow share in the implementation/support.
With TAA being so dominant in today's AAA games, DLAA can easily replace it, because it's better and reduces the annoying flicker you can see in games like the new Deathloop.
I recently wanted to replace my Asus MG278Q 144 Hz G-Sync Compatible monitor (listed as compatible on NVIDIA's website) with an iiyama G-Master Red Eagle GB2770QSU-B1 27" WQHD IPS FreeSync Premium Pro; however, that monitor is not listed on NVIDIA's site as G-Sync Compatible, and in my, albeit limited, research I did not manage to find anyone who had got it working with G-Sync. Additionally, you cannot use a G-Sync Compatible display with HDR; you need a fully certified screen for that.
It should be an interesting piece of tech, even if its main purpose turns out to be poking the competition into doing something similar.
Even if we think of DLSS and G-Sync as proprietary/locked-down technologies, I doubt AMD FreeSync/FSR would exist if the NVIDIA options did not (certainly their time to market would have been considerably longer). Whether you use it or not, where would hardware-accelerated ray tracing be without NVIDIA?
As an observation, NVIDIA always seems to set the standard and then others follow. Maybe this scenario will improve with Intel entering the game?
FXAA, MLAA and other postprocessing AA methods are usually notably worse than better methods like MSAA that have fallen out of favor due to excessive performance hit in today's rendering methods. If DLAA can slot in between in terms of performance hit with quality that is similar or better than MSAA, it will do just fine.
I have no interest in HDR, for now anyway.
Back in the day, graphics were simple, with almost no foliage or complex geometry. You could get away with not running a temporal AA method, but nowadays that's just not really an option. You're left with either TAA, downsampling, or a supersampling-based AA method like SGSSAA.
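A minimal sketch of what the downsampling option above means: render at twice the target resolution, then average each 2×2 block down to one output pixel. This is a pure-Python grayscale illustration for clarity, not how a GPU actually implements supersampling.

```python
def downsample_2x(image):
    """image: list of rows of grayscale values, with both dimensions
    divisible by 2. Each output pixel is the average of a 2x2 block,
    which is what smooths (anti-aliases) hard edges."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            block = (image[y][x] + image[y][x + 1] +
                     image[y + 1][x] + image[y + 1][x + 1])
            row.append(block / 4)
        out.append(row)
    return out
```

A hard black-to-white edge that lands inside a 2×2 block comes out as an intermediate gray, which is exactly the edge-softening effect being discussed.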
boooooooo
DLSS, DLAA - nonfunctional and unsupported until a title has been 'compiled' or whatever it is they do back at Nvidia by throwing it at their AI deep-learning supercomputer to train it on that game.
At launch, when a game is most popular and has the lowest performance, is exactly when DLSS and DLAA are most needed. The very nature of requiring time for the game to get popular enough for Nvidia to take notice, feed it into Deep Thought or whatever, and then have it spit out a profile that sits and waits for the next bi-monthly game-ready driver is TOO DAMNED SLOW. People have likely moved on to a newer title by then.
You'll probably have launch-day DLSS in games that are Nvidia-branded and not in the ones that are AMD-branded. The bi-monthly thing you mention is for older titles.
Whatever suits your narrative, man. I hope those 6900XTs earn you money too.
But of course we are talking about gaming here, where Nvidia techs are much much more interesting than some Lanczos derivative
Wide support/adoption is a quality in itself but visually, there is a difference.
Here is how Computerbase.de described FSR in Deathloop. So yeah, just free crap no one uses.
And then you will be looking back at those silly DLSS games from 10 years ago which no current gpu actually supports anymore.