
What's your preferred DLSS config?


  • Native (DLSS disabled)

    Votes: 8,245 53.3%
  • Native + Frame Generation

    Votes: 735 4.7%
  • DLAA

    Votes: 1,223 7.9%
  • DLAA + Frame Generation

    Votes: 1,023 6.6%
  • Upscaling

    Votes: 2,014 13.0%
  • Upscaling + Frame Generation

    Votes: 1,351 8.7%
  • FSR (on NVIDIA)

    Votes: 887 5.7%

  • Total voters
    15,478
  • Poll closed.
Joined
Nov 22, 2023
Messages
321 (0.74/day)
Considering the Nvidia card running in my TV box right now is a 980 Ti, FSR has been a real godsend for getting stuff like Hogwarts Legacy and Jedi: Fallen Order running acceptably.

Kinda the ideal use case scenario for upscaling tech: keeping an old piece of hardware out of the landfill for a few more years.
 
Joined
Oct 8, 2022
Messages
289 (0.34/day)
Processor Ryzen 5 7600X
Motherboard Asus Proart B650
Cooling Noctua U12S
Memory Corsair Vengeance DDR5 32GB (2x16GB) 5600MHz C36 AMD Expo
Video Card(s) Sapphire RX 7800 XT Nitro+
Storage Samsung 990 Pro 1Tb
Case Fractal Design Pop Silent
Audio Device(s) Edifier r1900tII
Power Supply Seasonic Prime Platinum 650W
I used FSR on performance settings to play Baldur's Gate 3 on an iGPU (it was entirely playable at 10-20fps I guess).
But with a GPU I would (and do) always choose no upscaling whatsoever if I can get a game running at 30+ fps because for me graphical quality is more important than very high fps.

But I'm someone who often plays with no antialiasing because of the minute blurring it can sometimes add, and I even have these settings tweaked for no compromise on quality in Radeon settings:
(attached screenshot: Radeon settings)

And this is probably what less than 1% of gamers do since the defaults are 'standard' and 'enabled'.
 
Joined
Jun 1, 2010
Messages
452 (0.08/day)
System Name Very old, but all I've got ®
Processor So old, you don't wanna know... Really!
Sorry, my take!

There's no need for any tricks that substitute for native rendering. If one needs a better framerate, culling is one of the best things invented for that purpose.
Upscaling should be a feature for old, weak cards, not new and powerful ones.

I personally can't say that games from the last five to seven or so years show a significant jump in progress and eye candy. So what's the point of buying a 4080-4090 and then using DLSS, considering the game was developed for a console RX 6700-ish class GPU? I wouldn't want to hamper the image any further. The only acceptable addition to the rendering is some AA technique that actually improves the image quality.

BTW, nobody would need DLSS/FSR, let alone frame generation, if the devs were actually optimizing their games. The GPU vendors are advertising around the game devs' incompetence. No AI can substitute for the buggy mess and filthy code noodles left unfixed for ages.

P.S.: All DLSS apostles praise it. That might be good. But this is a feature for games that are a couple of years old at most. Does it work with old games (5-10 years old) that don't support the said feature? Same for FSR. No offence. Just curious.
 
Last edited:
Joined
Dec 12, 2012
Messages
794 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
I used FSR on performance settings to play Baldur's Gate 3 on an iGPU (it was entirely playable at 10-20fps I guess).

Using FSR on iGPUs is actually a bad idea. You get almost no performance increase, because the computational cost of FSR is too high.

Most games will actually run better at native 720p compared to 1080p with FSR Performance (so 540p).
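For scale, the pixel math behind that comparison can be sketched in a few lines. The per-axis ratios below are the published FSR 2 quality-mode scale factors; the function name is just for illustration. Note that 540p renders fewer pixels than native 720p, so the claim rests on the FSR pass itself costing more on an iGPU than the extra pixels would:

```python
# Per-axis render-scale ratios published for FSR 2 quality modes.
FSR_SCALE = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0, "Ultra Performance": 3.0}

def internal_res(out_w, out_h, mode):
    """Internal render resolution for a given output resolution and FSR mode."""
    s = FSR_SCALE[mode]
    return round(out_w / s), round(out_h / s)

w, h = internal_res(1920, 1080, "Performance")
print(w, h)               # 960 540
print(w * h, 1280 * 720)  # 518400 921600 -> 540p is ~56% of native 720p's pixels
```

So 1080p + FSR Performance shades fewer pixels than native 720p, but it still has to pay for the upscale pass on top, which is the overhead the post is describing.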

BTW, nobody would need DLSS/FSR, let alone frame generation, if the devs were actually optimizing their games. The GPU vendors are advertising around the game devs' incompetence. No AI can substitute for the buggy mess and filthy code noodles left unfixed for ages.

I have one use case from my personal experience where you "need" upscaling - playing on a 4K TV. There are no 1440p TVs, everything is 4K (or 1080p in the low end). So if you don't want to buy a 4090, you have to upscale. And DLSS is fantastic in that scenario.

But I absolutely agree about the lack of optimization. It's a plague now. Consoles are running most "next-gen" games at ~720p60, which is laughable. And it translates to the PC versions, which usually have lots of other problems.
 
Joined
Jan 14, 2019
Messages
13,791 (6.26/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
I used FSR on performance settings to play Baldur's Gate 3 on an iGPU (it was entirely playable at 10-20fps I guess).
But with a GPU I would (and do) always choose no upscaling whatsoever if I can get a game running at 30+ fps because for me graphical quality is more important than very high fps.

But I'm someone who often plays with no antialiasing because of the minute blurring it can sometimes add, and I even have these settings tweaked for no compromise on quality in Radeon settings:
(attached screenshot: Radeon settings)
And this is probably what less than 1% of gamers do since the defaults are 'standard' and 'enabled'.
Texture filtering quality on high is a must, but honestly, I haven't seen surface format optimisation do much, if anything at all, so I just leave it on. Same with tessellation on AMD optimised vs default.

Other than that, I agree. As long as I have 30+ FPS, I'll always choose visual quality over more performance.
 
Joined
Nov 22, 2023
Messages
321 (0.74/day)
A major issue with "native" nowadays is that it's not actually native; it usually has a mandatory schmear of TAA applied over the top. In some games the TAA implementation performs well, while in others the native TAA is just garbage.

Unfortunately, older AA approaches like MSAA and such generally aren't an option anymore, and neither is outright turning TAA off.

In such a world, something like DLSS or even FSR can be considered superior to a native that never was.
 
Joined
Jan 14, 2019
Messages
13,791 (6.26/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
A major issue with "native" nowadays is that it's not actually native; it usually has a mandatory schmear of TAA applied over the top. In some games the TAA implementation performs well, while in others the native TAA is just garbage.

Unfortunately, older AA approaches like MSAA and such generally aren't an option anymore, and neither is outright turning TAA off.

In such a world, something like DLSS or even FSR can be considered superior to a native that never was.
Yeah, what happened with MSAA and SSAA? Why don't we see them in games anymore? :(
 
Joined
Jun 29, 2023
Messages
611 (1.06/day)
Location
Spain
System Name Gungnir
Processor Ryzen 5 7600X
Motherboard ASUS TUF B650M-PLUS WIFI
Cooling Thermalright Peerless Assasin 120 SE Black
Memory 2x16GB DDR5 CL36 5600MHz
Video Card(s) XFX RX 6800XT Merc 319
Storage 1TB WD SN770 | 2TB WD Blue SATA III SSD
Display(s) 1440p 165Hz VA
Case Lian Li Lancool 215
Audio Device(s) Beyerdynamic DT 770 PRO 80Ohm
Power Supply EVGA SuperNOVA 750W 80 Plus Gold
Mouse Logitech G Pro Wireless
Keyboard Keychron V6
VR HMD The bane of my existence (Oculus Quest 2)
Yeah, what happened with MSAA and SSAA? Why don't we see them in games anymore? :(
MSAA (if improperly implemented) can be rather demanding; look at Red Dead Redemption 2, it's a mess in that game. Something that may explain it better than I ever could is this document from Nvidia, but in short, it's a bit of a pain to properly detect edges with deferred rendering compared to forward rendering.
And SSAA is just a resolution slider nowadays, and when that doesn't work you can usually find a way in the drivers, be it DLDSR or VSR.

But I absolutely agree about the lack of optimization. It's a plague now. Consoles are running most "next-gen" games at ~720p60, which is laughable. And it translates to the PC versions, which usually have lots of other problems.
I keep thinking about that roundtable discussion Digital Foundry held about DLSS, where a CDPR dev said that DLSS is actually a great optimization tool and that they see it as a good way to keep pushing visual fidelity. They literally see it as a crutch for optimization; we are going backwards. I can excuse Alan Wake 2, because it actually has a lot of new tech and they are doing a lot to change the engine, so I expect growing pains, and it also has the most detail I've ever seen in a game; it actually feels next-gen. But everyone else? Considering how little the visual fidelity has improved, I'd honestly just say they are lazy, and things like what that CDPR dev said only confirm my belief.
 
Last edited:
Joined
Dec 12, 2012
Messages
794 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
And SSAA is just a resolution slider nowadays, and when that doesn't work you can usually find a way in the drivers, be it DLDSR or VSR.

Downsampling is not quite the same as SSAA. I'm no expert, so I'm not gonna pretend to know how it works, but there's a substantial visual difference.

I especially remember early Euro Truck Simulator 2 days. It was a DX9 game back then, so you could force all kinds of AA in NV Inspector, but the game also had internal scaling. I was playing at 1080p, and using SGSSAAx2 looked sooo much cleaner than downscaling even from 4K. SGSSAAx4 looked absolutely perfect, and that's not even the full on SSAA.
There were also different compatibility flags that would change the visual output. Some were blurrier than others.

Another game with horrible aliasing is Destiny 2. Even if you play in 4K with 200% scaling (which means 8K), there are still many jaggies everywhere, it can never look perfect. I expect an actual SSAA implementation would do a much better job, but I'd rather just see DLAA for performance reasons.

Alien Isolation also has some crazy aliasing (specular or whatever), which you can never eliminate using downsampling. But I think there are some TAA mods for it, which clean it up nicely.

That's exactly why TAA was invented, and why all these new upscalers are based on it. It has drawbacks, but purely from an "eliminating jaggies" perspective it's the superior method, and it has an extremely low performance cost. MSAA and SSAA stopped being viable a long time ago, as they both have a high cost and weak results in modern games.
 
Joined
Aug 10, 2023
Messages
341 (0.64/day)
Yeah, what happened with MSAA and SSAA? Why don't we see them in games anymore? :(
Both essentially need too much power, and MSAA also has a huge downside: it doesn't fix all problems on screen (in that regard, TAA is better). SSAA especially is rare, but you can always use VSR or DSR to upscale your resolution to 2x+ if you really want "SSAA".
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,399 (1.30/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte B650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
It's interesting that in the OP, W1zzard specifically asks Nvidia users:
NVIDIA users: we're curious... if a game fully supports all the available options, which ones would you choose?
Yet anyone and everyone can vote, so I'd say these results aren't truly indicative of how only Nvidia users would vote.

From what I see online from bona fide Nvidia users, Native would have a significantly smaller representation, while DLAA / upscaling / upscaling + FG would have more.

I can appreciate there's no practical way to achieve this, and what I myself see through my relatively narrow lens is likely skewed, just thought it was an interesting point to make.
 
Joined
Nov 21, 2020
Messages
2 (0.00/day)
I generally prefer native even though it comes with some asterisks. DLSS is great tech, but I've still run into a few games where it looks quite poor compared with native. And in those games where it does look better, it's often because the native TAA implementation itself is poor.

I'm still in the MSAA camp overall and especially miss SGSSAA. The new Forza Motorsport looks genuinely dreadful after the switch from MSAA to whatever TAA they're using, and DLSS is even worse in this game. TAA can look fantastic, but many recent implementations haven't been that great, in my opinion.
 
Joined
Jul 20, 2020
Messages
1,172 (0.71/day)
System Name Gamey #1 / #3
Processor Ryzen 7 5800X3D / Ryzen 7 5700X3D
Motherboard Asrock B450M P4 / MSi B450 ProVDH M
Cooling IDCool SE-226-XT / IDCool SE-224-XTS
Memory 32GB 3200 CL16 / 16GB 3200 CL16
Video Card(s) PColor 6800 XT / GByte RTX 3070
Storage 4TB Team MP34 / 2TB WD SN570
Display(s) LG 32GK650F 1440p 144Hz VA
Case Corsair 4000Air / TT Versa H18
Power Supply EVGA 650 G3 / EVGA BQ 500
OK I finally got a 2060 Super just so I could take the poll.

Kinda mostly maybe a little bit. Since I was playing Hogwarts Legacy on it at 1440p and enjoying myself, I'll vote upscaling/DLSS, though DLAA is very nice too.
 
Joined
Oct 1, 2023
Messages
25 (0.05/day)
Processor Intel i9 9900K @5GHz all 8 cores.
Motherboard Asus Prime H370M Plus
Cooling Arctic Cooler Freezer 12
Memory 64GB Micron dual-channel @ 2666MHz
Video Card(s) Asus TUF RTX 4090 OC
Storage 5TB SSD + 6TB HDD
Display(s) LG C3 42'' 4K OLED Display
Case AeroCool
Audio Device(s) Realtek + Edifier R1900T II + Sony Subwoofer
Power Supply MSi MPG A850G 850W
Mouse Multilaser MO277
Keyboard T-Dagger Destroyer (mechanical)
Software Windows 11 Pro Beta Ring
Sharpening filters are subjective and introduce sharpening artifacts. They've also been around for decades and not something specific to DLSS. You can inject sharpening pretty easily w/o DLSS
Not DLAA. I have no artifacts here, on every game I've tested.
 
Joined
May 11, 2006
Messages
144 (0.02/day)
System Name HTPC
Processor Ryzen 3600
Motherboard ASRock X570M Pro4
Cooling Aquacomputer Cuplex Kryros Delrin + Aquaduct 720XT
Memory Mushkin Ridgeback DDR4-3200
Video Card(s) Gigabyte Waterforce WB RTX 2080Ti
Storage Crucial M1 500GB
Display(s) Philips 55POS9002/12 OLED TV
Case Silverstone Sugo SG11
Audio Device(s) FiiO D03k -> Pioneer A676 -> KEF iQ7
Power Supply Seasonic G550 PCGH Edition
Mouse MX Master
Keyboard Corsair K63 wireless Lapboard
Depending on the performance: DLDSR -> DLAA -> DLSS Q -> DLSS B -> reduce other details.
Since there is no FG for Ampere, I cannot comment on it.
 
Joined
Jun 6, 2022
Messages
622 (0.64/day)
These are all welcome tools, useful especially when your old GPU shows signs of fatigue in the fight with new games. But not only then.
For example, in the game below, when joining the two video recordings, I had to run a third recording because I wasn't sure which one was DLSS-off and which was DLSS-on. The color of the hero's helmet is the biggest difference; otherwise it is difficult to distinguish, and only by direct comparison.

Where DLSS does its job well, its activation brings some benefits:
1. An FPS increase, if you need it.
2. A consistent decrease in GPU usage (even VRAM usage), which reduces power consumption and temperatures when playing with VSync on.

Since activation is not required, I don't understand what the problem is for those who criticize these tools. If you don't like it, you don't use it.

 
Joined
Dec 12, 2016
Messages
2,120 (0.71/day)
So after all the testing, press, hardcore brand loyalist debates, code optimizations, etc., over half of Nvidia users don’t enable or even want super sampling/scaling/etc.

Native rendering FTW!
 
Joined
Nov 13, 2023
Messages
25 (0.06/day)
Processor Intel i7-10700
Cooling Noctua
Video Card(s) Nvidia Gigabyte 4070 Ti Windforce OC
Display(s) Msi 170 Hz 1440p
Case Phantek A400
Power Supply Seasonic 850w Platinum
I love DLAA and Frame Gen; I use them whenever possible. I also love DLDSR when DLAA isn't an option. In older games, I play at 5120x2880 using DSR with no blur.

I can't stand TAA in most games today; it's just a blurry mess, and a lot of aliasing is still present.
 
Joined
Jun 29, 2023
Messages
611 (1.06/day)
Location
Spain
System Name Gungnir
Processor Ryzen 5 7600X
Motherboard ASUS TUF B650M-PLUS WIFI
Cooling Thermalright Peerless Assasin 120 SE Black
Memory 2x16GB DDR5 CL36 5600MHz
Video Card(s) XFX RX 6800XT Merc 319
Storage 1TB WD SN770 | 2TB WD Blue SATA III SSD
Display(s) 1440p 165Hz VA
Case Lian Li Lancool 215
Audio Device(s) Beyerdynamic DT 770 PRO 80Ohm
Power Supply EVGA SuperNOVA 750W 80 Plus Gold
Mouse Logitech G Pro Wireless
Keyboard Keychron V6
VR HMD The bane of my existence (Oculus Quest 2)
Since activation is not required, I don't understand what the problem is for those who criticize these tools. If you don't like it, you don't use it.
I agree, though activation is actually becoming a requirement in some games: even at native resolution you're still running the algorithm, which can be expensive, Alan Wake 2 being an example.

Other times games are made with "upscalers in mind", as is the case with Remnant II and seemingly every UE5 game so far. Criticism is becoming more prevalent because developers are treating upscaling as "optimization" (CDPR, for example, said as much in the roundtable discussion with DF): instead of optimizing for native rendering, they use upscalers to reach the target performance, which people don't like, as it's not what we users thought it should be used for.
 

Winssy

New Member
Joined
Mar 31, 2023
Messages
21 (0.03/day)
I am the owner of a 4080 and a 1440p 165Hz monitor, and I prefer to play with DLAA + Frame Generation. This significantly improves the picture quality compared to native, and the gameplay smoothness is much higher. However, before enabling Frame Generation, I adjust the graphics settings to ensure that the base framerate with DLAA is at least 70 FPS. This way, enabling generation doesn't introduce any negative effects.
 
Last edited:

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,399 (1.30/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte B650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
So after all the testing, press, hardcore brand loyalist debates, code optimizations, etc., over half of Nvidia users don’t enable or even want super sampling/scaling/etc.
The poll is flawed: it asks only Nvidia users to vote, but anyone can (and has) voted, so no, this doesn't accurately represent Nvidia users in any meaningful way.
 
Joined
Feb 24, 2023
Messages
3,451 (4.92/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
And this is probably what less than 1% of gamers do since the defaults are 'standard' and 'enabled'.
I am in this <1%. Yet I hate when it's lower than 55 FPS. I had too much of sub-30 FPS gaming when I was a kid. I am now a big boy and big boys need big framerate numbers. xD

My preferred DLSS config would be a resolution 4 times higher than native (i.e. 5120x2880 on a 2560x1440 display) + DLSS Quality, if the framerate is right (55 or higher).
If that's too slow, I'd resort to plain DLSS Quality. Why "would"? I don't own an Nvidia GPU. That's how I play with FSR, and I don't think the DLSS experience would be much different, apart from being less artifacted and generally more stable.
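The combo described above can be put into numbers. Assuming the standard Quality-mode per-axis ratio of 1.5x (the published value for both DLSS and FSR 2 Quality) and a 2x-per-axis downsampling factor, the internal render resolution still ends up well above the panel's native resolution, which is why this setup behaves like supersampling with the upscaler's AA on top. The function name is illustrative only:

```python
# Sketch: 4x-pixel-count downsampling (2x per axis via DSR/VSR/DLDSR)
# combined with Quality-mode upscaling (1.5x per-axis ratio).
def combo_internal_res(panel_w, panel_h, dsr_axis_factor=2.0, quality_ratio=1.5):
    """Internal render resolution when upscaling to a 4x downsampled output."""
    out_w, out_h = panel_w * dsr_axis_factor, panel_h * dsr_axis_factor
    return round(out_w / quality_ratio), round(out_h / quality_ratio)

print(combo_internal_res(2560, 1440))  # (3413, 1920): still above native 1440p
```

So on a 1440p panel, the game actually renders around 3413x1920, more pixels than native 2560x1440, before being downsampled back to the display.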
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,399 (1.30/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte B650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
I think you'll get more AMD users that'll vote native which skews the results.
bang on.
I like playing games the way they were imagined by their creator.
Not sure I follow this logic. The creator ships games with standard resolution options, generally from 720p-ish or even lower all the way to 4K and beyond, sometimes (hopefully) with ultrawide options and VR too. So which one of those is intended by the creator? Is playing 720p native more the way the creator imagined it than 1440p? What about 1440p upscaled to 4K? Or is 4K native what's intended, and anything lower isn't? I'm interested to hear from you how the end user's arbitrary resolution choice relates to the creator's vision.

Bonus question: if the developer lists upscaling in the system specs and target resolutions and framerates, as we've seen lately, is playing with upscaling the way it's imagined by the creator?

-----------------------------------------------------------------------------------------------------------------------------------------------------------------------

I'm continually puzzled by the antiquated notion that native rendering resolution will always be best, the goal, some pinnacle of image quality... the native resolution of any given person's personal monitor?

If rendering at native panel resolution is to be touted as the best, then why does super sampling exist?

Now, I don't doubt for a second that each panel has an absolute quality limit, imposed by its physical resolution. However, when talking about rendering resolution, it's plainly evident that rendering at a given panel's native resolution isn't where IQ stops improving.
 
Last edited: