
NVIDIA Prepares to Deliver Deep Learning Anti-Aliasing Technology for Improved Visuals, Coming first to The Elder Scrolls Online

Joined
Jul 31, 2017
Messages
6 (0.00/day)
How come they don't implement this in a more graphics-heavy game? ESO uses a heavily modified Morrowind engine, so of course this new AA tech won't completely destroy your FPS. Sneaky tactics as usual from Nvidia. "sigh"
 
Joined
Aug 31, 2016
Messages
104 (0.03/day)
Will be interesting to see how it compares to DSR+DLSS. DSR has a bit of its own softening and not all games work well with it, so maybe plain DLAA on top of native res is going to be better. DLSS should have had a resolution scale slider integrated from day one though; it is a bit silly that we need to use DSR for this. Guess they decided that anything more than the typical Low-Mid-High is going to be too complex for the average user :p
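Those preset modes map to fixed internal render resolutions. A rough sketch of what each mode renders at, using the commonly reported per-axis scale factors (approximations, not an official NVIDIA spec), with DLAA included as the native-res case:

[CODE=python]
# Rough render-resolution math for DLSS quality modes at a given output.
# Scale factors are the widely reported per-axis values, not official spec.
MODES = {
    "DLAA (native)":     1.0,
    "Quality":           2 / 3,
    "Balanced":          0.58,
    "Performance":       0.5,
    "Ultra Performance": 1 / 3,
}

def render_res(out_w: int, out_h: int) -> None:
    for name, s in MODES.items():
        print(f"{name:18s} -> {round(out_w * s)}x{round(out_h * s)}")

render_res(3840, 2160)  # Quality mode comes out to ~2560x1440
[/CODE]

A free-form scale slider would just expose that factor directly instead of snapping it to these presets.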
 
Joined
Mar 21, 2016
Messages
2,508 (0.78/day)
NVIDIA fans right now be like sharpening effect...
 
Joined
May 24, 2007
Messages
5,433 (0.85/day)
Location
Tennessee
System Name AM5
Processor AMD Ryzen R9 7950X
Motherboard Asrock X670E Taichi
Cooling EK AIO Basic 360
Memory Corsair Vengeance DDR5 5600 64 GB - XMP1 Profile
Video Card(s) AMD Reference 7900 XTX 24 GB
Storage Crucial Gen 5 1 TB, Samsung Gen 4 980 1 TB / Samsung 8TB SSD
Display(s) Samsung 34" 240hz 4K
Case Fractal Define R7
Power Supply Seasonic PRIME PX-1300, 1300W 80+ Platinum, Full Modular
FSR is markedly inferior. It might get a 2.0, but improvements there are quite likely to follow what DLSS and XeSS are already doing.
Wide support/adoption is a quality in itself, but visually there is a difference.
Sure, it's extremely noticeable if I stop my gameplay, take a picture of the current scene, and then zoom in on plant petals so I can come on this forum and say, "Yup, just like I thought, markedly inferior to DLSS!"

As for adoption, assume a 20-25% AMD share on desktops and laptops based on Steam stats, plus all modern NVIDIA cards, and last but not least all PS5 and Xbox Series S/X consoles.
 
Joined
Oct 4, 2017
Messages
706 (0.27/day)
Location
France
Processor RYZEN 7 5800X3D
Motherboard Aorus B-550I Pro AX
Cooling HEATKILLER IV PRO , EKWB Vector FTW3 3080/3090 , Barrow res + Xylem DDC 4.2, SE 240 + Dabel 20b 240
Memory Viper Steel 4000 PVS416G400C6K
Video Card(s) EVGA 3080Ti FTW3
Storage XPG SX8200 Pro 512 GB NVMe + Samsung 980 1TB
Display(s) Dell S2721DGF
Case NR 200
Power Supply CORSAIR SF750
Mouse Logitech G PRO
Keyboard Meletrix Zoom 75 GT Silver
Software Windows 11 22H2
Doesn't DLSS already have AA effects built in?
Why do the same thing twice?
Or is it a cut-down version of DLSS that focuses on edges only, so it is way easier to implement?

It doesn't do the same thing twice. As the name suggests, DLSS is about Super Sampling; DLAA, on the other hand, gets rid of the SS component and uses deep learning only for anti-aliasing.

This is not about difficulty of implementation, it's about having native resolution with the best anti-aliasing possible. Since the game is not upscaled, DLAA is going to be more taxing on the hardware than DLSS.
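To make the distinction concrete, here is a toy sketch, in no way NVIDIA's implementation: `dl_pass` below is a hypothetical stand-in for the tensor-core network (reduced to a resize placeholder), and the point is only that DLSS and DLAA differ in the resolution the game renders at, not in the pass itself:

[CODE=python]
import numpy as np

# dl_pass is a placeholder for the deep-learning reconstruction pass.
# The real network, its weights and its extra inputs (motion vectors,
# jitter, depth) are NVIDIA's and are not modelled here.
def dl_pass(frame, out_hw):
    ys = np.linspace(0, frame.shape[0] - 1, out_hw[0]).astype(int)
    xs = np.linspace(0, frame.shape[1] - 1, out_hw[1]).astype(int)
    return frame[np.ix_(ys, xs)]  # nearest-neighbour resize placeholder

def dlss(render, out_hw, scale=2 / 3):
    # Render below output res, then reconstruct up: upscaling + AA.
    low = render((int(out_hw[0] * scale), int(out_hw[1] * scale)))
    return dl_pass(low, out_hw)

def dlaa(render, out_hw):
    # Render at native res, then run the same pass: AA only, full cost.
    return dl_pass(render(out_hw), out_hw)

render = lambda hw: np.random.rand(*hw)  # stand-in for the game renderer
print(dlss(render, (2160, 3840)).shape, dlaa(render, (2160, 3840)).shape)
[/CODE]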
 
Joined
Jun 22, 2006
Messages
1,097 (0.16/day)
System Name Beaver's Build
Processor AMD Ryzen 9800X3D
Motherboard Asus TUF Gaming X670E Plus WiFi
Cooling Corsair H115i RGB PLATINUM 97 CFM Liquid
Memory G.SKILL Trident Z5 Neo DDR5-6000 CL30 RAM 32GB (2x16GB)
Video Card(s) NVIDIA GeForce RTX 4090 Founders Edition
Storage WD_BLACK 8TB SN850X NVMe
Display(s) Alienware AW3225QF 32" 4K 240 Hz OLED
Case Fractal Design Define R6 USB-C
Audio Device(s) Focusrite 2i4 USB Audio Interface
Power Supply SuperFlower LEADEX TITANIUM 1600W
Mouse Razer DeathAdder V2
Keyboard Corsair K70 RGB Pro
Software Microsoft Windows 11 Pro
Benchmark Scores 3dmark = https://www.3dmark.com/spy/51229598
w/ 4K+ resolutions (reducing the need for AA) and 144Hz+ refresh rates (reducing the need for V-Sync), what's the point?

4K144+ is a very smooth experience. I just recently migrated from 60Hz and can attest to the basic elimination of screen tearing without V-Sync enabled, etc.
 
Joined
Aug 31, 2016
Messages
104 (0.03/day)
Sure, it's extremely noticeable if I stop my gameplay, take a picture of the current scene, and then zoom in on plant petals so I can come on this forum and say, "Yup, just like I thought, markedly inferior to DLSS!"

It is actually quite the opposite. It is the still image that needs more careful inspection, especially since it is viewed in a web browser and without the focus, attention and immersion that you have when actually playing. In an actual game, once you are focused and viewing actual scenes in motion, it becomes abundantly clear how much less aliasing and shimmering DLSS has. Especially if you look at notoriously problematic things like hair, thin objects or distant landscape: all of that looks like twice the resolution with DLSS despite being reconstructed from a lower resolution, while FSR simply inherits all of TAA's issues and then makes them worse through the basic upscaling and sharpening it uses. The only place where FSR may be comparable is big close-up objects with low complexity, because those are very easy to reconstruct and to remove aliasing from. For everything else, DLSS is always going to be miles ahead, as it should be given the light-years of technological difference between the two solutions.

My intention is not to hate on FSR, but how are developers supposed to take players seriously if those players cannot see the difference between a cheap upscaler/sharpener and proper reconstruction? What you are essentially telling them is "We cannot see a damn thing anyway, so why give us any actual technology?"
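For anyone wondering what mechanically separates a "cheap upscaler/sharpener" from reconstruction, here is a bare-bones numpy sketch of the two families. Both functions are toy approximations for illustration, not the shipping FSR or DLSS algorithms:

[CODE=python]
import numpy as np

def spatial_upscale_sharpen(low, out_hw, amount=0.5):
    # Single-frame path: resize, then unsharp-mask. All the "detail" it
    # adds is contrast; it cannot recover information the low-res frame
    # never had, which is where the shimmering complaints come from.
    ys = np.linspace(0, low.shape[0] - 1, out_hw[0]).astype(int)
    xs = np.linspace(0, low.shape[1] - 1, out_hw[1]).astype(int)
    up = low[np.ix_(ys, xs)]
    blur = (np.roll(up, 1, 0) + np.roll(up, -1, 0) +
            np.roll(up, 1, 1) + np.roll(up, -1, 1)) / 4
    return np.clip(up + amount * (up - blur), 0, 1)

def temporal_accumulate(current, history, alpha=0.1):
    # Reconstruction-style path: blend new samples into a running
    # history, so detail accumulates across frames. Real methods also
    # reproject the history with motion vectors, omitted here.
    return alpha * current + (1 - alpha) * history
[/CODE]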
 
Joined
May 24, 2007
Messages
5,433 (0.85/day)
Location
Tennessee
System Name AM5
Processor AMD Ryzen R9 7950X
Motherboard Asrock X670E Taichi
Cooling EK AIO Basic 360
Memory Corsair Vengeance DDR5 5600 64 GB - XMP1 Profile
Video Card(s) AMD Reference 7900 XTX 24 GB
Storage Crucial Gen 5 1 TB, Samsung Gen 4 980 1 TB / Samsung 8TB SSD
Display(s) Samsung 34" 240hz 4K
Case Fractal Define R7
Power Supply Seasonic PRIME PX-1300, 1300W 80+ Platinum, Full Modular
It is actually quite the opposite. It is the still image that needs more careful inspection, especially since it is viewed in a web browser and without the focus, attention and immersion that you have when actually playing. In an actual game, once you are focused and viewing actual scenes in motion, it becomes abundantly clear how much less aliasing and shimmering DLSS has. Especially if you look at notoriously problematic things like hair, thin objects or distant landscape: all of that looks like twice the resolution with DLSS despite being reconstructed from a lower resolution, while FSR simply inherits all of TAA's issues and then makes them worse through the basic upscaling and sharpening it uses. The only place where FSR may be comparable is big close-up objects with low complexity, because those are very easy to reconstruct and to remove aliasing from. For everything else, DLSS is always going to be miles ahead, as it should be given the light-years of technological difference between the two solutions.

My intention is not to hate on FSR, but how are developers supposed to take players seriously if those players cannot see the difference between a cheap upscaler/sharpener and proper reconstruction? What you are essentially telling them is "We cannot see a damn thing anyway, so why give us any actual technology?"


Differences are minimal at a normal macro view, 4K FSR Ultra Quality versus 4K DLSS Quality, unless you zoom in very far. In gameplay the player will not know the difference.
 
Joined
Aug 31, 2016
Messages
104 (0.03/day)

Differences are minimal, 4K FSR Ultra Quality versus 4K DLSS Quality, unless you zoom in very far. In gameplay the player will not know the difference.

You have picked just about the least favorable comparison for yourself out of all of them in the whole world. That net is TAA's and FSR's worst nightmare, while it is exactly what DLSS excels at: reconstructing fine detail like that so well that it actually looks way better than native-res TAA, let alone FSR, which is just one big oversharpening artifact on top of the already numerous native-res TAA issues. I don't know if you are viewing it on a phone or a 15" laptop or what, but if you cannot see the difference then I don't know what to tell you. That scene is so tailor-made for DLSS, and the difference so massive, that you could almost accuse the author of cherry-picking the scene to favor DLSS, and you say there is no difference...
 
Joined
May 24, 2007
Messages
5,433 (0.85/day)
Location
Tennessee
System Name AM5
Processor AMD Ryzen R9 7950X
Motherboard Asrock X670E Taichi
Cooling EK AIO Basic 360
Memory Corsair Vengeance DDR5 5600 64 GB - XMP1 Profile
Video Card(s) AMD Reference 7900 XTX 24 GB
Storage Crucial Gen 5 1 TB, Samsung Gen 4 980 1 TB / Samsung 8TB SSD
Display(s) Samsung 34" 240hz 4K
Case Fractal Define R7
Power Supply Seasonic PRIME PX-1300, 1300W 80+ Platinum, Full Modular
You have picked just about the least favorable comparison for yourself out of all of them in the whole world. That net is TAA's and FSR's worst nightmare, while it is exactly what DLSS excels at: reconstructing fine detail like that so well that it actually looks way better than native-res TAA, let alone FSR, which is just one big oversharpening artifact on top of the already numerous native-res TAA issues. I don't know if you are viewing it on a phone or a 15" laptop or what, but if you cannot see the difference then I don't know what to tell you. That scene is so tailor-made for DLSS, and the difference so massive, that you could almost accuse the author of cherry-picking the scene to favor DLSS, and you say there is no difference...

My monitor?

LG UHD Monitor 27" 4K LED Nano IPS ... https://www.amazon.com/dp/B08BCRYS6...imm_MTS3SH8VBDD8Q3E9G6G4?_encoding=UTF8&psc=1

The differences are negligible. Neither is a nightmare, and both look similar when rendering the game on my desktop, tested on both my 3090 and 6900 XT.
 
Joined
Jun 22, 2006
Messages
1,097 (0.16/day)
System Name Beaver's Build
Processor AMD Ryzen 9800X3D
Motherboard Asus TUF Gaming X670E Plus WiFi
Cooling Corsair H115i RGB PLATINUM 97 CFM Liquid
Memory G.SKILL Trident Z5 Neo DDR5-6000 CL30 RAM 32GB (2x16GB)
Video Card(s) NVIDIA GeForce RTX 4090 Founders Edition
Storage WD_BLACK 8TB SN850X NVMe
Display(s) Alienware AW3225QF 32" 4K 240 Hz OLED
Case Fractal Design Define R6 USB-C
Audio Device(s) Focusrite 2i4 USB Audio Interface
Power Supply SuperFlower LEADEX TITANIUM 1600W
Mouse Razer DeathAdder V2
Keyboard Corsair K70 RGB Pro
Software Microsoft Windows 11 Pro
Benchmark Scores 3dmark = https://www.3dmark.com/spy/51229598
My monitor?

LG UHD Monitor 27" 4K LED Nano IPS ... https://www.amazon.com/dp/B08BCRYS6...imm_MTS3SH8VBDD8Q3E9G6G4?_encoding=UTF8&psc=1

The differences are negligible. Neither is a nightmare, and both look similar when rendering the game on my desktop, tested on both my 3090 and 6900 XT.
I just got that same monitor a few days ago!

Going from 60 to 144 Hz made a huge difference for me...

4K reduces the need for AA, and 144 Hz reduces the need for V-Sync, imho.
 
Joined
May 3, 2018
Messages
2,881 (1.19/day)
Adobe has an AI-based super-resolution enhancement in Photoshop. It resizes the image by 2x, but if you then downsize that image back to the original size using bicubic interpolation, the resulting image is far more detailed in many cases. The difference can be huge, and I use it for many of my images. I have a Sony 42MP camera and I'm amazed at how much more detail this can extract from an already very high-quality image. It's similar to the pixel-shift technology many camera companies now use to increase resolution.

Nvidia is not doing anything original here IMO, but it is welcome in games.
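The round trip described above, sketched with Pillow. `sr_upscale_2x` is a stand-in for Photoshop's Super Resolution step; it is plain bicubic here, so this snippet demonstrates only the workflow, not the detail gain the AI model provides:

[CODE=python]
from PIL import Image

def sr_upscale_2x(img):
    # Placeholder for the AI super-resolution step (bicubic here).
    return img.resize((img.width * 2, img.height * 2), Image.BICUBIC)

def enhance(path_in, path_out):
    img = Image.open(path_in)
    big = sr_upscale_2x(img)                   # 2x enhance
    out = big.resize(img.size, Image.BICUBIC)  # back to original size
    out.save(path_out)

# enhance("photo.png", "photo_enhanced.png")
[/CODE]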
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
42,649 (6.68/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
The problem with DLSS is... like so many projects from big N, it's just temporary; something will come along that works on everything, some global tech that will simply replace it.
And then you will be looking back at those silly DLSS games from 10 years ago which no current GPU actually supports anymore.

Yup, PhysX and SLI come to mind, shovelware.

So I presume the AI is from the internet?
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,246 (1.28/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Arctic P12's
Benchmark Scores 4k120 OLED Gsync bliss
The problem with DLSS is... like so many projects from big N, it's just temporary; something will come along that works on everything, some global tech that will simply replace it.
And then you will be looking back at those silly DLSS games from 10 years ago which no current GPU actually supports anymore.
Why would it need to last forever? It's meant for the here and now. Upscaling seems 100% here to stay and is only going to increase, and Nvidia clearly has a big hand in paving the way forward. In 10 years' time a then-current midrange GPU will obliterate any 2021 game, and DLSS wouldn't be needed anyway.
 
Joined
Feb 3, 2017
Messages
3,823 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
Sure, it's extremely noticeable if I stop my gameplay, take a picture of the current scene, and then zoom in on plant petals so I can come on this forum and say, "Yup, just like I thought, markedly inferior to DLSS!"
Actually, it is mostly the other way around: in the few FSR titles I have played, FSR introduces quite noticeable shimmering. Plus it has clear sharpening artefacts; whether that is something you like or something that bothers you seems to be very subjective.

Adobe has an AI-based super-resolution enhancement in Photoshop. It resizes the image by 2x, but if you then downsize that image back to the original size using bicubic interpolation, the resulting image is far more detailed in many cases. The difference can be huge, and I use it for many of my images. I have a Sony 42MP camera and I'm amazed at how much more detail this can extract from an already very high-quality image. It's similar to the pixel-shift technology many camera companies now use to increase resolution.

Nvidia is not doing anything original here IMO, but it is welcome in games.
The trick with DLSS, as well as XeSS and FSR, is to do this in a very limited timeframe and in a way that is temporally stable. Nvidia has said they target DLSS at doing what it does at 2160p in 1.5 ms, less at lower resolutions. XeSS and FSR no doubt have similar targets.

Research into different upscaling methods has been going on for decades, and while the methods DLSS/XeSS/FSR use are from a while back (if we look at the landscape of general upscaling), the limited timeframe is the key factor in games.
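Some back-of-the-envelope math on why that per-frame budget is the hard part, using the 1.5 ms figure quoted above:

[CODE=python]
# Cost of a fixed 1.5 ms reconstruction pass against common frame budgets.
PASS_MS = 1.5

for fps in (60, 120, 144):
    budget = 1000 / fps  # ms available per frame
    print(f"{fps:3d} fps: {budget:5.2f} ms budget, "
          f"pass takes {100 * PASS_MS / budget:4.1f}% of the frame")
# ~9% of a 60 fps frame, but ~21.6% of a 144 fps frame.
[/CODE]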
 
Joined
Feb 20, 2019
Messages
8,341 (3.91/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
The devs can implement DLSS, they don't need Nvidia to do it. Also, 2.0 changed the way it works; there's no longer such a dependency on the AI supercomputers at Nvidia headquarters. That's what I understood, anyway.

You'll probably have launch games with DLSS on the titles that are Nvidia-branded and not on the ones that are AMD-branded. The bi-monthly thing you mention is for older titles.
If the devs implement DLSS and Nvidia doesn't feed it into their DL supercomputer for training, then it's not really DLSS, is it? It just uses the same generic upsampling and filtering techniques that FSR and prior title-specific post-processing filters use.

DLAA without DL is just regular AA; we already have about two decades of development covering a myriad of clever sampling/dynamic-resolution/temporal/sharpening/edge-detecting combinations. FXAA or TXAA seem to be good enough for most people at modern HD resolutions and 60 FPS framerates, so adding some proprietary garbage that only works properly once Nvidia has tuned the algorithm and bundled it into drivers is, IMO, not useful enough and too late to the party in almost every instance.

I think my beef with DLAA is the naming, not the actual technology. Using the otherwise-wasted tensor cores to do something useful is good. Just call it TensorAA, ffs. It has NOTHING to do with deep learning if it's just making use of spare silicon to do a regular AA job.
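As one example of those decades of "clever sampling" work, here is a minimal grayscale TAA resolve with neighbourhood clamping, a toy version of the non-DL baseline; real engines also reproject the history buffer with motion vectors first:

[CODE=python]
import numpy as np

def taa_resolve(current, history, alpha=0.1):
    # Clamp the history to the min/max of the current 3x3 neighbourhood
    # so stale samples cannot ghost, then blend it with the new frame.
    shifts = [np.roll(np.roll(current, dy, 0), dx, 1)
              for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
    nb = np.stack(shifts)
    clamped = np.clip(history, nb.min(axis=0), nb.max(axis=0))
    return alpha * current + (1 - alpha) * clamped
[/CODE]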
 
Joined
Feb 3, 2017
Messages
3,823 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
If the devs implement DLSS and Nvidia doesn't feed it into their DL supercomputer for training, then it's not really DLSS, is it? It just uses the same generic upsampling and filtering techniques that FSR and prior title-specific post-processing filters use.
Based on what Nvidia has disclosed and what we know, DLSS uses an algorithm derived from machine learning. Nvidia has said DLSS 2.0 is no longer trained per game, which does not mean it is not machine learning; the resulting algorithm is simply more generic and applies well enough to games in general. The use of Tensor cores to run this thing also points at it being ML-derived.
 
Joined
Nov 11, 2016
Messages
3,459 (1.17/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razer DeathAdder V3
Keyboard Razer Huntsman V3 Pro TKL
Software win11
If the devs implement DLSS and Nvidia doesn't feed it into their DL supercomputer for training, then it's not really DLSS, is it? It just uses the same generic upsampling and filtering techniques that FSR and prior title-specific post-processing filters use.

DLAA without DL is just regular AA; we already have about two decades of development covering a myriad of clever sampling/dynamic-resolution/temporal/sharpening/edge-detecting combinations. FXAA or TXAA seem to be good enough for most people at modern HD resolutions and 60 FPS framerates, so adding some proprietary garbage that only works properly once Nvidia has tuned the algorithm and bundled it into drivers is, IMO, not useful enough and too late to the party in almost every instance.

I think my beef with DLAA is the naming, not the actual technology. Using the otherwise-wasted tensor cores to do something useful is good. Just call it TensorAA, ffs. It has NOTHING to do with deep learning if it's just making use of spare silicon to do a regular AA job.

"FXAA and TXAA seem to be good enough" - said no gamer ever :roll:

Nvidia has probably trained its neural network well enough to create Skynet; now they are telling the tensor cores to destroy gamers in games first before carrying out real-world domination.
 
Joined
Mar 21, 2016
Messages
2,508 (0.78/day)
FXAA with ReShade is fine and closely comparable to SMAA imo, with better performance. I even prefer its image quality myself, but as I said, it's close. Text is clearer too, for what it's worth.

SMAA vs FXAA
 
Joined
Nov 11, 2016
Messages
3,459 (1.17/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razer DeathAdder V3
Keyboard Razer Huntsman V3 Pro TKL
Software win11
FXAA with ReShade is fine and closely comparable to SMAA imo, with better performance. I even prefer its image quality myself, but as I said, it's close. Text is clearer too, for what it's worth.

SMAA vs FXAA

I tried FXAA once in GTA V, and I would rather lower some settings and use 2xMSAA.
 
Joined
Mar 21, 2016
Messages
2,508 (0.78/day)
Are you using in-game FXAA or injection? I get what you're saying, though, about adjusting game render resolution and AA quality. It's always a delicate balance between performance and image quality.
 
Joined
Feb 20, 2019
Messages
8,341 (3.91/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
"FXAA and TXAA seem to be good enough" - said no gamer ever :roll:

Nvidia has probably trained its neural network well enough to create Skynet; now they are telling the tensor cores to destroy gamers in games first before carrying out real-world domination.
Good enough to compensate for higher resolutions? No.
Good enough compared to the best alternatives at any given resolution? Yes.

  • FXAA can't compensate for a lack of resolution, but it hides jaggies for free. It's better than no AA unless it's implemented so poorly that it also applies to the HUD/text.

  • TXAA does a much better job of maintaining detail but requires reasonably high framerates, and whilst cheap, it's not free. In 99% of cases it should be used if it's an option, because the image quality improvements are significant for very little cost.
In an ideal world, an AI would be smart enough in realtime to know which parts of the image require VRS, MSAA, and TXAA. Nvidia's DLSS/DLAA isn't that AI, so until we get an actually intelligent frame-analysis AI that can apply enhancements on a per-frame basis, we're still living in the "generic postprocessing filter" era. Call it whatever name you want; it's not going to be a solution without compromise. The only good thing about DLAA is that tensor cores are horrendously under-utilised on consumer RTX cards, so giving them anything to do is a bonus.
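For reference, the cheap core idea behind FXAA, in a bare-bones grayscale numpy sketch (nothing like the tuned production shader): detect edges from local luma contrast and blur only there, which is also why a bad implementation that runs over the HUD smears text:

[CODE=python]
import numpy as np

def fxaa_like(luma, threshold=0.05):
    # Compare each pixel against its 4 neighbours to build an edge mask,
    # then blur only where the local luma contrast exceeds the threshold.
    n, s = np.roll(luma, 1, 0), np.roll(luma, -1, 0)
    w, e = np.roll(luma, 1, 1), np.roll(luma, -1, 1)
    contrast = (np.maximum.reduce([luma, n, s, w, e]) -
                np.minimum.reduce([luma, n, s, w, e]))
    edge = contrast > threshold
    blurred = (luma + n + s + w + e) / 5
    return np.where(edge, blurred, luma)  # flat areas pass through
[/CODE]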
 
Joined
Mar 10, 2010
Messages
11,878 (2.20/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
Does this mean DLSS 2.0 will make more sense for 1080p gamers who want to render higher? Currently DLSS 2.0 only makes sense for 1440p and 4K gamers, from what I understand. So will this tech make it worthwhile for 1080p users wanting to scale up to higher resolutions?

@nguyen can you explain? It's getting confusing as **** LOL
Your question is confusing.

Are there many people wanting to scale up to 1080p anyway?!
 
Joined
Nov 11, 2016
Messages
3,459 (1.17/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razer DeathAdder V3
Keyboard Razer Huntsman V3 Pro TKL
Software win11
Good enough to compensate for higher resolutions? No.
Good enough compared to the best alternatives at any given resolution? Yes.

  • FXAA can't compensate for a lack of resolution, but it hides jaggies for free. It's better than no AA unless it's implemented so poorly that it also applies to the HUD/text.

  • TXAA does a much better job of maintaining detail but requires reasonably high framerates, and whilst cheap, it's not free. In 99% of cases it should be used if it's an option, because the image quality improvements are significant for very little cost.
In an ideal world, an AI would be smart enough in realtime to know which parts of the image require VRS, MSAA, and TXAA. Nvidia's DLSS/DLAA isn't that AI, so until we get an actually intelligent frame-analysis AI that can apply enhancements on a per-frame basis, we're still living in the "generic postprocessing filter" era. Call it whatever name you want; it's not going to be a solution without compromise. The only good thing about DLAA is that tensor cores are horrendously under-utilised on consumer RTX cards, so giving them anything to do is a bonus.

Yeah, if I liked The Shimmering then I would use FXAA.


What I like best about DLSS is that it reduces texture flickering, which is quite distracting; in Deathloop the flickering just made me not want to look at the game anymore.
 