
AMD's Elusive FidelityFX Super Resolution Coming This June?

Joined
Sep 2, 2020
Messages
1,491 (0.92/day)
System Name Chip
Processor AMD 5600X
Motherboard MSI B450M Mortar Max
Cooling Hyper 212
Memory 2x 16GB DDR4 3200MHz
Video Card(s) RX 6700
Storage 5.5 TB HDD + 220 GB SSD
Display(s) Normal monitor
Case something cheap
VR HMD Vive

las

Joined
Nov 14, 2012
Messages
1,693 (0.38/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
Like I said, low fps can be masked with motion blur, which is probably why you like it.

Consoles use motion blur a lot to make 20-30 fps seem smooth, especially last gen with its weak CPUs. The current gen should be able to hit 60, but some games will still run at 30 and have motion blur enabled by default. Pretty much no console game that runs at 60 fps has motion blur enabled by default, because it isn't really needed; 60 fps is smooth enough.

Motion interpolation is pretty garbage for gaming: your reaction time goes up, and some people even get headaches from it, which is why many games have an ON/OFF toggle.
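As a rough, back-of-the-envelope illustration of the latency point (my own figures, not measurements from this thread): an interpolator has to buffer the next real frame before it can generate the in-between one, so the image you react to is at least one real frame older than without interpolation.

```python
# Illustrative only: the minimum extra latency added by frame interpolation,
# assuming the interpolator must hold the next real frame before blending.
def added_interpolation_latency_ms(real_fps: float) -> float:
    # One real frame time is the floor; real implementations often buffer more.
    return 1000.0 / real_fps

print(added_interpolation_latency_ms(30))  # ~33.3 ms extra at a 30 fps base
print(added_interpolation_latency_ms(60))  # ~16.7 ms extra at a 60 fps base
```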
 
Joined
Sep 2, 2020
Messages
1,491 (0.92/day)
Like I said, low fps can be masked with motion blur, which is probably why you like it.

Consoles use motion blur a lot to make 20-30 fps seem smooth, especially last gen with its weak CPUs. The current gen should be able to hit 60, but some games will still run at 30 and have motion blur enabled by default. Pretty much no console game that runs at 60 fps has motion blur enabled by default, because it isn't really needed; 60 fps is smooth enough.

Motion interpolation is pretty garbage for gaming: your reaction time goes up, and some people even get headaches from it, which is why many games have an ON/OFF toggle.
Um, are you okay? I can get 60+ fps in all my games.
 
Joined
May 20, 2020
Messages
1,395 (0.81/day)
So DLSS is (simply put) some sort of bicubic upscale filter: clever, yet just a filter. No replacement for displacement, i.e. an image rendered at its set resolution. Smells like a marketing gimmick.
Perhaps AMD can make a better filter. :)
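For reference, a plain bicubic upscale of the kind described above really is just a resampling filter with no knowledge of the scene. A minimal sketch, assuming Pillow is installed; the file names are hypothetical:

```python
# Minimal sketch of a plain bicubic upscale, the "just a filter" baseline this
# post compares DLSS against. Assumes Pillow; file names are hypothetical.
from PIL import Image

def bicubic_upscale(src_path: str, dst_path: str, scale: float = 2.0) -> None:
    img = Image.open(src_path)
    new_size = (int(img.width * scale), int(img.height * scale))
    # Each output pixel is interpolated from a 4x4 neighbourhood of input pixels;
    # no temporal data, motion vectors, or trained model is involved.
    img.resize(new_size, resample=Image.BICUBIC).save(dst_path)

bicubic_upscale("frame_1080p.png", "frame_4k.png", scale=2.0)
```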
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,429 (1.30/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte B650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RGB Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Arctic P12's
Benchmark Scores 4k120 OLED Gsync bliss
So DLSS is (simply put) some sort of bicubic upscale filter
Not quite

DLSS 2.0 works as follows:

The neural network is trained by Nvidia on supercomputers, using "ideal" ultra-high-resolution images of video games alongside low-resolution images of the same games. The result is stored in the video card driver. Nvidia is said to use DGX-1 servers to perform the training of the network.

The neural network stored in the driver compares the actual low-resolution image with the reference and produces a full high-resolution result. The inputs to the trained network are the low-resolution, aliased images rendered by the game engine and the low-resolution motion vectors for those images, also generated by the game engine. The motion vectors tell the network which direction objects in the scene are moving from frame to frame, so it can estimate what the next frame will look like.
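As a rough, hypothetical sketch of that per-frame inference step (the real DLSS 2.0 model and interface are proprietary; the function name, shapes, and stand-in network below are illustrative only):

```python
# Hypothetical sketch of the per-frame inputs/outputs described above.
# The real DLSS 2.0 network and API are proprietary; names and shapes
# here are illustrative only.
import numpy as np

def dlss2_style_upscale(lowres_aliased: np.ndarray,   # (h, w, 3) raw, unantialiased render
                        motion_vectors: np.ndarray,   # (h, w, 2) per-pixel motion from the engine
                        prev_highres: np.ndarray,     # (H, W, 3) previous upscaled frame
                        network) -> np.ndarray:
    """Feed the inputs listed above into a trained network and return the
    reconstructed (H, W, 3) high-resolution frame."""
    return network(lowres_aliased, motion_vectors, prev_highres)

# Stand-in "network" for demonstration only: a nearest-neighbour 2x upscale
# that ignores the temporal inputs a real model would exploit.
dummy_network = lambda low, mv, prev: np.repeat(np.repeat(low, 2, axis=0), 2, axis=1)

low = np.zeros((540, 960, 3), dtype=np.float32)
mv = np.zeros((540, 960, 2), dtype=np.float32)
prev = np.zeros((1080, 1920, 3), dtype=np.float32)
frame = dlss2_style_upscale(low, mv, prev, dummy_network)  # shape (1080, 1920, 3)
```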
 
Joined
Feb 13, 2016
Messages
3,318 (1.01/day)
Location
Buenos Aires
System Name Ryzen Monster
Processor Ryzen 7 5700X3D
Motherboard Asus ROG Crosshair Hero VII WiFi
Cooling Corsair H100i RGB Platinum
Memory Corsair Vengeance RGB Pro 32GB (4x8GB) 3200MHz CMW16GX4M2C3200C16
Video Card(s) Asus ROG Strix RX5700XT OC 8GB
Storage WD Black 500GB NVMe, 250GB Samsung SSD, 500GB OCZ SSD, 500GB WD M.2, plus three spinners up to 1.5TB
Display(s) LG 32GK650F-B 32" UltraGear™ QHD
Case Cooler Master Storm Trooper
Audio Device(s) Supreme FX on board
Power Supply Corsair RM850X full modular
Mouse Corsair Ironclaw wireless
Keyboard Logitech G213
VR HMD Headphones Logitech G533 wireless
Software Windows 11 Start 11
Benchmark Scores 3DMark Time Spy 4532 (9258 March 2021, 9399 July 2021)
Will this be RDNA2 only?
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,429 (1.30/day)
Joined
Feb 8, 2017
Messages
270 (0.09/day)
As a photographer I use a program called Topaz Gigapixel AI to upscale images. I regularly upscale my bird photos by around 30%, sometimes 50%, and as long as my source material is good, I cannot tell the difference; often the upscaled image is better. The AI does a spectacular job of improving detail. In one comparison it was impossible to tell the difference between a native 24MP photo taken by one camera and the same photo taken with a 61MP camera once the smaller image was upscaled. This is the beauty of the new AI training compared to old, braindead bicubic upscaling. It's not just about sharpening low-res upscaled data; the AI can help create improved detail because it knows what the texture should look like.

The results I've seen for DLSS when upscaling from, say, 1440p-1800p to 4K are more than good enough to pass muster, and given that in a game you are not usually standing still looking for tiny flaws, I'll take image quality that looks almost as good but with 50% higher frame rates any day. I would never try to upscale 1080p to 4K; 1440p minimum.
That is a still image, and again, the higher quality the source, the better. If you feed it a single 1 GB raw image and upscale it by 50%, it will probably look 99% the same. Use a 640x320 image, upscale it by 30%, and see how garbage it looks.

Nvidia is not really using AI to upscale the image; it's recreating a similar picture from a database, usually built from the same game, but with DLSS 2.0 from other games as well. So it sees an image and replaces it with a different one. Unfortunately, because the point is to add performance, it recreates a worse image with washed-out colors and fewer pixels.

There is no way to have far fewer pixels in an image and have it look better. Again, if we are talking about a 40 MP to 80 MP raw 1-2 GB image that already has tens of millions of pixels, even a loss of, say, 10 million pixels would barely be noticeable, especially if it's on the outskirts of the image, and with a bit of sharpening it can always appear to look better.

With DLSS 2.0, Nvidia is creating images of worse quality in order to obtain performance.

The issue with DLSS is that I can do that on my own just by reducing settings. Take Red Dead Redemption 2, for example: a very taxing game that can tax even an RTX 3080, but manually optimizing the settings can give you up to 50% better performance for barely any visual loss. You can do this with a lot of games. Just selecting one preset lower than max will often improve performance by over 25%, and if you actually tune the individual settings you can easily get 30-40% more performance for minimal loss of quality; in fact, better quality than DLSS and just as fast, if not faster.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,429 (1.30/day)
With DLSS 2.0, Nvidia is creating images of worse quality in order to obtain performance.
The issue with DLSS is that I can do that on my own just by reducing settings. Take Red Dead Redemption 2, for example: a very taxing game that can tax even an RTX 3080, but manually optimizing the settings can give you up to 50% better performance for barely any visual loss. You can do this with a lot of games. Just selecting one preset lower than max will often improve performance by over 25%, and if you actually tune the individual settings you can easily get 30-40% more performance for minimal loss of quality; in fact, better quality than DLSS and just as fast, if not faster.
These things are not at all the same, and they can even be stacked should you want additional performance (not in RDR2, of course, but in games that support DLSS 2.0). Do you own an RTX series card? Go try Control or Metro Exodus EE in DLSS quality mode, then come back and tell me how much worse you think the quality is. Believe me, you need to see it running natively with your own eyes; watching a YouTube video or looking at stills side by side does not effectively demonstrate what it looks like in person. There are some potential artefacts and drawbacks, sure, it's not perfect, but if you have looked at what it says it does on paper and concluded it must not be able to deliver, well, it does, and the majority of people who actually use it agree.
 
Joined
Oct 12, 2005
Messages
727 (0.10/day)
The higher the resolution, the better image upscaling tech works. If we move beyond 4K in the future, or if 4K becomes more widely available, upscaling tech will just become more and more useful.

But at 1080p, there just isn't enough detail to upscale properly from a lower resolution. Same thing with video: when 1080p was released, upscaled versions looked like crap, but now almost all 4K movies are just upscaled 1080p.

In the end, upscaling techniques will become standard. The gaming industry has always been about making things run as fast as possible and cheating to get the best results; upscaling is just another tool to achieve this. It's here to stay. I just hope that the winning tech isn't a proprietary solution but an open one that all GPU vendors can implement...

Not sure if people were around at the beginning of 3D, but some games were only usable with a 3dfx card; you had to fall back to the software renderer with any other card. That was bad, and I hope we never go back to something like that.
 
Joined
Feb 8, 2017
Messages
270 (0.09/day)
These things are not at all the same, and they can even be stacked should you want additional performance (not in RDR2, of course, but in games that support DLSS 2.0). Do you own an RTX series card? Go try Control or Metro Exodus EE in DLSS quality mode, then come back and tell me how much worse you think the quality is. Believe me, you need to see it running natively with your own eyes; watching a YouTube video or looking at stills side by side does not effectively demonstrate what it looks like in person. There are some potential artefacts and drawbacks, sure, it's not perfect, but if you have looked at what it says it does on paper and concluded it must not be able to deliver, well, it does, and the majority of people who actually use it agree.
It's the same result! It's not exactly the same thing technically, but in practice it's exactly the same thing!
Do you have Red Dead Redemption 2? Go to Hardware Unboxed's YouTube video on optimizing it (you can even tune it further yourself), apply those settings, and compare the game. You get 50% more performance for barely any loss of quality.
You cannot get that with DLSS 2.0, but in essence that's what it does: it lowers image quality for faster performance.

Control is a unique game in that DLSS 2.0 does look better than native, but only if you are using its absolute garbage TAA setting. Disable TAA in the game and force FXAA through the Nvidia Control Panel or AMD's settings, and see what a difference it makes. I think they intentionally made their TAA implementation bad in order to make DLSS 2.0 appear better. Plus, that is the only game where DLSS looks better than native, and again it's because they are cheating: their TAA implementation is garbage! Play the game without it and see how much better the image quality is. Compare max-settings Control without TAA, just running with no AA at all, against DLSS max quality, and you'll see that even in Control the game actually looks much better without DLSS.

Again, Nvidia has to rely on cheats to make its "feature" compelling. When I buy a high-end card, I buy it because I don't want to lower visual quality. I don't want to use cheats, I don't want to use upscaling; I want great 1440p performance at 60+ fps, and I actually prefer 100+ fps on my 144 Hz monitor.

I don't even have a 4K monitor, because if I'm running 60 fps on average, that means it dips below that. I prefer the best visual quality at 1440p with 100+ fps for the best possible gameplay, and boy does 100+ fps make a difference!

You get better perceived image quality playing at 100+ fps on a high-refresh-rate monitor. In fact, if you go back to a 60 Hz monitor, the image looks unbearable; you have to get used to it again over several days.

And if I'm targeting ray tracing for "better" visuals, I certainly don't want to negate that by using Nvidia's DLSS, which defeats the purpose of ray tracing. If ray tracing provides better image quality (and it doesn't always do that), then I don't want to undo that by running DLSS.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,429 (1.30/day)
It's the same result! It's not exactly the same thing technically, but in practice it's exactly the same thing!
Not the same thing at all, and not the same result. You are altering the image in very different ways; it's different.
Do you have Red Dead Redemption 2? Go to Hardware Unboxed's YouTube video on optimizing it (you can even tune it further yourself), apply those settings, and compare the game. You get 50% more performance for barely any loss of quality.
Yes I do, and yes, I've followed their optimization guide. This is, again, not the same thing as any upscaling method, let alone DLSS.

One approach alters the quality of lighting, assets, textures, reflections, particles, volumetrics, motion blur, etc. on a granular, per-setting basis; the other alters the resolution and overall presentation of the entire image, completely independently of whatever settings were chosen.
Control is a unique game in that DLSS 2.0 does look better than native
So it does look better than native, glad we're agreed. Metro Exodus Enhanced Edition falls into that bucket too now, by the way, with several others being comparable. That kind of negates your earlier assertion that it is creating images of worse quality in order to obtain performance, when you agree there's an example that looks better. So the tech can and does work.
And if I'm targeting ray tracing for "better" visuals, I certainly don't want to negate that by using Nvidia's DLSS, which defeats the purpose of ray tracing. If ray tracing provides better image quality (and it doesn't always do that), then I don't want to undo that by running DLSS.
They're not necessarily a married couple; you can use one, the other, or both in most cases. But in, say, Control or Metro, where you get the best of both worlds, why wouldn't you use it if it's an available option?

Again, this sounds like an on-paper analysis. Do you own/game on a DLSS-capable card? If the answer is yes, I'd encourage you to come discuss image quality here and give your opinion on how it can be improved or enhanced. If the answer is no, well, forgive me for putting little stock in your take.
 
Joined
Feb 8, 2017
Messages
270 (0.09/day)
Not the same thing at all, and not the same result. You are altering the image in very different ways; it's different.

Yes I do, and yes, I've followed their optimization guide. This is, again, not the same thing as any upscaling method, let alone DLSS.

One approach alters the quality of lighting, assets, textures, reflections, particles, volumetrics, motion blur, etc. on a granular, per-setting basis; the other alters the resolution and overall presentation of the entire image, completely independently of whatever settings were chosen.

So it does look better than native, glad we're agreed. Metro Exodus Enhanced Edition falls into that bucket too now, by the way, with several others being comparable. That kind of negates your earlier assertion that it is creating images of worse quality in order to obtain performance, when you agree there's an example that looks better. So the tech can and does work.

They're not necessarily a married couple; you can use one, the other, or both in most cases. But in, say, Control or Metro, where you get the best of both worlds, why wouldn't you use it if it's an available option?

Again, this sounds like an on-paper analysis. Do you own/game on a DLSS-capable card? If the answer is yes, I'd encourage you to come discuss image quality here and give your opinion on how it can be improved or enhanced. If the answer is no, well, forgive me for putting little stock in your take.

Dude, if you need to omit my full sentences to make your argument, your argument is not very good!

"Control is a unique game in that DLSS 2.0 does look better than native, but only if you are using its absolute garbage TAA setting. Disable TAA in the game and force FXAA through the Nvidia Control Panel or AMD's settings, and see what a difference it makes."

That is what I wrote. I used to have an RTX 2070, sold it when mining took off, and bought an RX 5700 XT. I've tried pretty much all the DLSS games up until a year ago; it is ultimately an automated settings adjuster.

Yeah, I've not tried it with Cyberpunk 2077, since the game came out right around then, but I've watched the videos and seen the images; it's exactly like all the other games: a clearly visible difference in image quality.

I think it's overblown; to me, something like VRS plus plain resolution upscaling works much better.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,429 (1.30/day)
Dude, if you need to omit my full sentences to make your argument, your argument is not very good!

"Control is a unique game in that DLSS 2.0 does look better than native, but only if you are using its absolute garbage TAA setting. Disable TAA in the game and force FXAA through the Nvidia Control Panel or AMD's settings, and see what a difference it makes."

That is what I wrote. I used to have an RTX 2070, sold it when mining took off, and bought an RX 5700 XT. I've tried pretty much all the DLSS games up until a year ago; it is ultimately an automated settings adjuster.

Yeah, I've not tried it with Cyberpunk 2077, since the game came out right around then, but I've watched the videos and seen the images; it's exactly like all the other games: a clearly visible difference in image quality.

I think it's overblown; to me, something like VRS plus plain resolution upscaling works much better.
Sure then, I'll quote you in full. I'm not modding Control, not when it works perfectly and looks amazing without modding it, especially as I might only stand to gain a negligible difference in clarity while losing the +75% fps I get from DLSS in quality mode; that's a bad trade. You don't like the TAA? Suit yourself. I happen to like and use TAA even when it has nothing to do with DLSS. Agree to disagree?

This will be the last time I say it, but you are outright wrong about how it works and about it being an "automated settings adjuster" in the context you've laid out in earlier posts. It doesn't work like that, and it doesn't take their place: all the separate in-game settings still work regardless of any DLSS (or, coming soon, FSR) setting used. If that's how you choose to view it because of the end result you perceive, knock yourself out. Again, agree to disagree?

VRS and plain resolution upscaling have been shown repeatedly to give worse IQ than a good DLSS 2.0 implementation, but I won't discount how handy they can be, especially given that a lot of games have a render-scale option baked in. A 70% or higher render scale with a tuned sharpen filter on top can be very effective at what it sets out to achieve: negligible or acceptable IQ loss in exchange for performance.
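For illustration, a minimal sketch of that "lower render scale plus sharpen" idea, assuming Pillow. The 70% scale is the only figure taken from the paragraph above; the sharpen parameters, the single-image workflow, and the file names are hypothetical:

```python
# Minimal sketch of "render at ~70% scale, upscale, then sharpen", using Pillow.
# Downscaling a native frame here only simulates the quality effect of a lower
# render scale; it obviously does not recreate the performance gain.
from PIL import Image, ImageFilter

def upscale_and_sharpen(frame: Image.Image, render_scale: float = 0.7) -> Image.Image:
    target = frame.size  # treat the input size as the display resolution
    low = frame.resize((int(target[0] * render_scale), int(target[1] * render_scale)),
                       resample=Image.BILINEAR)       # stand-in for rendering at 70% scale
    up = low.resize(target, resample=Image.BICUBIC)   # naive spatial upscale to display res
    # A tuned unsharp mask recovers some perceived detail lost in the upscale.
    return up.filter(ImageFilter.UnsharpMask(radius=2, percent=120, threshold=2))

upscale_and_sharpen(Image.open("native_frame.png")).save("upscaled_sharpened.png")
```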

This is where FSR has a chance to shine: if it can retain better image quality than a simple lowered render scale or checkerboarding, it's already a successful feature. It doesn't have to be "as good as native" (and notice, it seems AMD isn't claiming that either); it just needs to look better than the game otherwise would at the equivalent performance level gained by simply lowering the render resolution, or people will just do that.
 
Joined
Sep 2, 2020
Messages
1,491 (0.92/day)
This is where FSR has a chance to shine: if it can retain better image quality
If it's even half decent, it will blow DLSS out of the water.
It's on the consoles, so almost all AAA games will have it in a few years.
It supports MORE hardware, so devs will support it.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,429 (1.30/day)
If it's even half decent, it will blow DLSS out of the water.
It's on the consoles, so almost all AAA games will have it in a few years.
It supports MORE hardware, so devs will support it.
It certainly has a lot of potential; perhaps its top strength lies in ease of implementation. We're in for at least a few years of both competing at the same time, and NVIDIA RTX users get the best of both worlds: DLSS in NVIDIA-sponsored titles, FSR in AMD-sponsored titles.

It's definitely good news for everyone. I love being spoiled for choice, and any tech that can help weaker or older GPUs keep their heads above water is clearly welcome.
 
Joined
Sep 2, 2020
Messages
1,491 (0.92/day)
It certainly has a lot of potential; perhaps its top strength lies in ease of implementation. We're in for at least a few years of both competing at the same time, and NVIDIA RTX users get the best of both worlds: DLSS in NVIDIA-sponsored titles, FSR in AMD-sponsored titles.

It's definitely good news for everyone. I love being spoiled for choice, and any tech that can help weaker or older GPUs keep their heads above water is clearly welcome.
Honestly, I think DLSS will be left on the side of the road
and fall behind FSR.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,429 (1.30/day)
Honestly, I think DLSS will be left on the side of the road
and fall behind FSR.
I think that if that happens, it's going to take a good while, years even. NVIDIA is full steam ahead on DLSS right now, trying to get the groundwork done in multiple engines. Plus, the tech is still very young; they'd be foolish to just roll over and willingly let FSR 'leave it on the side of the road'. I'd wager they will continue to develop it and improve adoption, IQ, and performance. The next few years are going to be very interesting!
 
Joined
Sep 2, 2020
Messages
1,491 (0.92/day)
I think that if that happens, it's going to take a good while, years even. NVIDIA is full steam ahead on DLSS right now, trying to get the groundwork done in multiple engines. Plus, the tech is still very young; they'd be foolish to just roll over and willingly let FSR 'leave it on the side of the road'. I'd wager they will continue to develop it and improve adoption, IQ, and performance. The next few years are going to be very interesting!
It doesn't matter that they are full steam ahead when only 1 percent of the market can use it.
Basically, all they can do to avoid being smashed is make it work on more cards.
If FSR is good, DLSS will be left for dead, mark my words.
With the consoles running it and more PCs able to run it,
I expect more devs to get behind it.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,429 (1.30/day)
It doesn't matter that they are full steam ahead when only 1 percent of the market can use it.
Basically, all they can do to avoid being smashed is make it work on more cards.
If FSR is good, DLSS will be left for dead, mark my words.
Looking at the PC market, DLSS-capable cards represent more like 15% of current users (Steam April survey), and that number grows daily. Also bear in mind that FSR is open source, so there is nothing to stop Nvidia from working on it or implementing it themselves too; we could end up with effectively two tiers of DLSS (or they could call the spatial one whatever they want): one that works universally and one that requires tensor cores. Who knows.

Consider your words marked. I think that in the PC space DLSS is still picking up steam right now, whereas FSR still has a lot of question marks. By the end of the year I'd like to revisit this landscape and see where both are at.
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.38/day)
So DLSS is (simply put) some sort of bicubic upscale filter: clever, yet just a filter. No replacement for displacement, i.e. an image rendered at its set resolution. Smells like a marketing gimmick.
Perhaps AMD can make a better filter. :)

No it's not.


Sadly, FSR is worse than DLSS 1.0 and will never be on par with or better than DLSS 2.0.

Read that article if in doubt; it's the best one on the subject.
 
Joined
Sep 2, 2020
Messages
1,491 (0.92/day)
No it's not.


Sadly, FSR is worse than DLSS 1.0 and will never be on par with or better than DLSS 2.0.

Read that article if in doubt; it's the best one on the subject.
So the tool that is not even out yet is terrible, and you have tried it yourself? Oh wait.
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.38/day)
So the tool that is not even out yet is terrible, and you have tried it yourself? Oh wait.

Maybe you should read the article or watch the YouTube videos about it; pretty much everyone says it's worse than DLSS 1.0, and the image used is from AMD's own video.
Did people expect otherwise? AMD was forced to bring out this feature because their RT performance is so bad; this will be the feature that will, on paper, make it useful.
DLSS 2.0 support is spreading like wildfire while AMD is beta-testing this spatial upscaler (which is nothing new).
I can't wait to see the final result, but my expectations have dropped. It looks like FSR is going to be a blurry mess, even worse than DLSS 1.0, and DLSS 1.0 sucked big time.
 
Joined
Sep 2, 2020
Messages
1,491 (0.92/day)
Maybe you should read the article or watch the YouTube videos about it; pretty much everyone says it's worse than DLSS 1.0, and the image used is from AMD's own video.
Did people expect otherwise? AMD was forced to bring out this feature because their RT performance is so bad; this will be the feature that will, on paper, make it useful.
DLSS 2.0 support is spreading like wildfire while AMD is beta-testing this spatial upscaler (which is nothing new).
I can't wait to see the final result, but my expectations have dropped. It looks like FSR is going to be a blurry mess, even worse than DLSS 1.0, and DLSS 1.0 sucked big time.
It will get better.
Give AMD time.
They clearly are not doing this just because of ray tracing, otherwise it would not be open.
Also, DLSS support will be tiny compared to FSR once it takes off.
Consoles support it,
and that will be a BIG help for AMD.
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.38/day)
It will get better.
Give AMD time.
They clearly are not doing this just because of ray tracing, otherwise it would not be open.
Also, DLSS support will be tiny compared to FSR once it takes off.
Consoles support it,
and that will be a BIG help for AMD.

If FSR is going to be blurry, no one will care. Did people care about DLSS 1.0? No. DLSS 2.0 is what made all the difference.
Consoles might use it anyway though, because they are used to 30 fps and forced motion blur, haha.

So far FSR looks like a regular upscaler, nothing impressive or technical about it. These have existed for many years; the problem with them has always been blur and/or visual artifacts.

Let's see if DLSS support is going to be tiny. Tons of AAA games have it or will get it, and UE4 and UE5 have native support, Unity too: two of the most widely used game engines.

But yeah, I am looking forward to seeing FSR tested by third parties. If people can already spot blur in AMD's own presentation, hopes are not high... DLSS 1.0 looked decent in Nvidia's own presentation too, and third parties then found it to be a blur-fest. I expect FSR to be the same.

And this is probably the reason why AMD allows Nvidia GPUs to use it... because no one will want to use it :laugh:
 