
Alan Wake 2: FSR 2.2 vs. DLSS 3.5 Comparison

Joined
Sep 21, 2023
Messages
30 (0.07/day)
Is a comparison of FSR on RTX and RX cards such a bad idea that it needs this much argumentation about why it doesn't need to happen?
Anyone with both cards can compare how FSR looks on RTX and RX cards. I just tried to explain to you why no one bothers to do it. It just doesn't make sense.
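For what it's worth, a minimal sketch of how such a check could be automated, assuming lossless screenshots taken at the same spot with identical settings on an RTX and an RX card (the file names below are hypothetical):

```python
# Hypothetical file names; any lossless captures (PNG/BMP) of the same scene will do.
import numpy as np
from PIL import Image

rtx = np.asarray(Image.open("alanwake2_fsr_rtx.png").convert("RGB"), dtype=np.int16)
rx  = np.asarray(Image.open("alanwake2_fsr_rx.png").convert("RGB"), dtype=np.int16)

assert rtx.shape == rx.shape, "captures must match in resolution"

diff = np.abs(rtx - rx)                       # per-channel absolute difference
changed = np.count_nonzero(diff.max(axis=2))  # pixels that differ in any channel
total = diff.shape[0] * diff.shape[1]
print(f"differing pixels: {changed}/{total}")
print(f"max delta: {diff.max()}, mean delta: {diff.mean():.4f}")

# Amplified difference map, so even small mismatches are easy to eyeball.
Image.fromarray(np.clip(diff * 16, 0, 255).astype(np.uint8)).save("fsr_diff_map.png")
```

Because FSR 2 is a temporal upscaler that accumulates data across frames, even two captures from the same card are not guaranteed to be bit-identical, so the useful question is whether cross-vendor differences are any larger than ordinary frame-to-frame noise.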
 
Joined
Sep 6, 2013
Messages
3,328 (0.81/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later I got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
Anyone with both cards can compare how FSR looks on RTX and RX cards. I just tried to explain to you why no one bothers to do it. It just doesn't make sense.
Even out of curiosity, this could be a nice review. And it does make sense. Years ago there was a debate about ATI/AMD versus Nvidia color quality. It made sense to say "all cards produce the same standard output", but there were many rumors about differences in color quality between the two brands. In the end, people traced the difference to a default setting in the Nvidia driver.

Still, about FSR: as I said, why so many arguments against doing something you describe as so simple to check?

(Linked forum threads from 2018 and 2015, including "Nvidia vs Amd color quality - Graphics Cards - Linus Tech Tips".)

There were discussions about it even before 2010.
 
Joined
Sep 10, 2018
Messages
6,907 (3.05/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
Even out of curiosity, this could be a nice review. And it does make sense. Years ago there was a debate about ATI/AMD versus Nvidia color quality. It made sense to say "all cards produce the same standard output", but there were many rumors about differences in color quality between the two brands. In the end, people traced the difference to a default setting in the Nvidia driver.

Still, about FSR: as I said, why so many arguments against doing something you describe as so simple to check?

(Linked forum threads from 2018 and 2015, including "Nvidia vs Amd color quality - Graphics Cards - Linus Tech Tips".)

There were discussions about it even before 2010.

For sure, it would be ideal to use an AMD card to test FSR when comparing image quality, but I can also see why an Nvidia card is used: it makes testing much less work because all three technologies work on it.

I believe W1zzard is sending the reviewer a 7900 XT and an A750, so that will be nice, but you will still see the usual suspects saying something is wrong with the testing, etc.

On my end, a 6700 XT produces the same results as my 3080 Ti/4090, so I'm not expecting anything to really change. The only real difference is how much sharpening an end user can stomach: it doesn't bother some people and it does bother others. At least games let you adjust the sharpness anyway.

At the end of the day, a lot of this is subjective. I personally can't stand shimmering or disocclusion artifacts, two things FSR does poorly to different degrees, versus the things DLSS isn't good at. Even TAA in some games has major issues that both FSR and DLSS clean up, so it really does depend on the game.
 
Joined
Sep 6, 2013
Messages
3,328 (0.81/day)
Location
Athens, Greece
For sure, it would be ideal to use an AMD card to test FSR when comparing image quality, but I can also see why an Nvidia card is used: it makes testing much less work because all three technologies work on it.

I believe W1zzard is sending the reviewer a 7900 XT and an A750, so that will be nice, but you will still see the usual suspects saying something is wrong with the testing, etc.
Obviously it makes testing easier: no card swapping. Also, many who post reviews are individuals with one PC they use for testing and probably only one card in their possession. They are not test labs. And the probable result is that both (or all three) cards offer identical image quality. Still, using one vendor's tech on the other's products doesn't necessarily exclude the possibility of finding differences, for obvious reasons: different architectures, different drivers. Isn't ARC, for example, FULLY compatible with DirectX 12? So why was Starfield unplayable on ARC GPUs when it came out? Why were AMD cards not displaying the stars? Why were Nvidia cards underperforming? Don't they all have the same support for DirectX 12? We expect something to work across the board, only to end up with bugs and differences. We accept it as normal for a game to have 100 bugs and drivers to have 100 bugs, and suddenly the idea of FSR not producing a perfectly identical picture on Nvidia or Intel cards compared to what it produces on AMD cards is absurd? Why?

The "usual suspects" can insist that the images used for an AMD vs intel vs Nvidia FSR side by side comparison, are coming in fact from the same graphics card and the reviewer is trying to fool the viewers. In my case, my question is sincere and I think logical. When choosing to use words like "broken", at least verify it first. Verify that the tech is working as it should in any hardware before using such words. In the past Hairworks and TressFX where tested in both competitors' cards to spot differences or performance issues. Why not test FSR?

On my end, a 6700 XT produces the same results as my 3080 Ti/4090, so I'm not expecting anything to really change. The only real difference is how much sharpening an end user can stomach: it doesn't bother some people and it does bother others. At least games let you adjust the sharpness anyway.
This is the expected result, but before calling something "broken", shouldn't the hardware used be checked as a parameter?

At the end of the day, a lot of this is subjective. I personally can't stand shimmering or disocclusion artifacts, two things FSR does poorly to different degrees, versus the things DLSS isn't good at. Even TAA in some games has major issues that both FSR and DLSS clean up, so it really does depend on the game.
No one can "stand shimmering artifacts or disocclusion artifacts". Imagine choosing 4K and everything at ultra/high, only to see a picture dancing in front of your eyes and probably reaching for an aspirin half an hour later because of the headache.

"Subjectivity". One more reason for that hardware dependency check.
 
Joined
Sep 21, 2023
Messages
30 (0.07/day)
Even out of curiosity, this could be a nice review. And it does make sense. Years ago there was a debate about ATI/AMD versus Nvidia color quality. It made sense to say "all cards produce the same standard output", but there were many rumors about differences in color quality between the two brands. In the end, people traced the difference to a default setting in the Nvidia driver.

Still, about FSR: as I said, why so many arguments against doing something you describe as so simple to check?

(Linked forum threads from 2018 and 2015, including "Nvidia vs Amd color quality - Graphics Cards - Linus Tech Tips".)

There were discussions about it even before 2010.
No, it still doesn't make sense. The color difference you posted is due to a lower-level technology; FSR is a much higher-level technology. FSR is a higher-level algorithm, and that algorithm is absolutely the same on AMD and on Nvidia, since the code is the same. A difference can only come from the lower-level technology that FSR uses. But a difference in lower-level tech would not be exclusive to FSR; it would affect everything that uses it. In that case it makes sense to analyze that lower-level tech, not FSR.

I tried to explain this to you before, but you accused me of "excess argumentation". It just doesn't make sense to compare every high-level algorithm that executes the same commands on all cards, as then each review could be 100 pages. TPU did test AMD GPUs with FSR in the performance review. Don't you think that if they had noticed any quality differences, they would have reported them?
 
Joined
Sep 6, 2013
Messages
3,328 (0.81/day)
Location
Athens, Greece
No, it still doesn't make sense. The color difference you posted is due to a lower-level technology; FSR is a much higher-level technology. FSR is a higher-level algorithm, and that algorithm is absolutely the same on AMD and on Nvidia, since the code is the same. A difference can only come from the lower-level technology that FSR uses. But a difference in lower-level tech would not be exclusive to FSR; it would affect everything that uses it. In that case it makes sense to analyze that lower-level tech, not FSR.

I tried to explain this to you before, but you accused me of "excess argumentation". It just doesn't make sense to compare every high-level algorithm that executes the same commands on all cards, as then each review could be 100 pages. TPU did test AMD GPUs with FSR in the performance review. Don't you think that if they had noticed any quality differences, they would have reported them?
See my reply above. No reason to repeat it and point at the obvious. A simple article that, as you say, will produce zero evidence of differences is not a bad idea, in fact. And considering that today many sites would love to have content that others don't, that is one more reason for an article titled "FSR on AMD, Nvidia and Intel. Did we spot any differences?".
 
Joined
Sep 10, 2018
Messages
6,907 (3.05/day)
Location
California
This is the expected result, but before calling something "broken", shouldn't the hardware used be checked as a parameter?


No one can "stand shimmering artifacts or disocclusion artifacts". Imagine choosing 4K and everything at ultra/high, only to see a picture dancing in front of your eyes and probably reaching for an aspirin half an hour later because of the headache.

"Subjectivity". One more reason for that hardware dependency check.

Honestly, I would love for FSR to be outright superior to DLSS, because that would mean better image quality with similar benefits. I'm of the mind that both companies need to improve and neither should be given a pass, and to a certain extent both have improved. I just wish FSR 2 were a bit more refined at this point, and I think people who purchase AMD GPUs deserve that much. Upscaling should never be required, but it is nice to have, especially as our hardware ages.
 
Joined
Sep 21, 2023
Messages
30 (0.07/day)
See my reply above. No reason to repeat it and point at the obvious. A simple article that, as you say, will produce zero evidence of differences is not a bad idea, in fact. And considering that today many sites would love to have content that others don't, that is one more reason for an article titled "FSR on AMD, Nvidia and Intel. Did we spot any differences?".
You already have that. Any generic "Image quality on AMD, Nvidia and Intel. Did we spot any differences?" will produce that.

Let's say FSR uses commands A, B, C, D and E. These commands are also used for other things in games, e.g. A & B for shader X, C & D for shader Y, and E for post-processing effect Z. So, if any of these produce a difference, it will be visible in a generic image quality comparison.

This is like: if we know that A+B+C=X, then we also know that B+C+A, C+A+B, B+A+C and C+B+A are also X. But you keep arguing that we can't be sure B+A+C is really X if no one actually calculates it.
 
Joined
Sep 6, 2013
Messages
3,328 (0.81/day)
Location
Athens, Greece
Honestly, I would love for FSR to be outright superior to DLSS, because that would mean better image quality with similar benefits. I'm of the mind that both companies need to improve and neither should be given a pass, and to a certain extent both have improved. I just wish FSR 2 were a bit more refined at this point, and I think people who purchase AMD GPUs deserve that much. Upscaling should never be required, but it is nice to have, especially as our hardware ages.
FSR will never reach DLSS levels, because it lacks dedicated hardware and AMD also lacks the experience and expertise of Nvidia's programmers in AI. AMD is also repeating one of its common errors of the past: it brings a technology to a certain level and then abandons it. We haven't seen any upgrades, fixes, or improvements to FSR in the last many months. We even see new games that use FSR 2.0 or 2.1 instead of the latest 2.2 version. There is of course FSR 3.0, but that was mostly a necessity, because Nvidia is doubling its scores on benchmark charts thanks to Frame Generation. So AMD either had to produce hardware twice as fast as Nvidia's, create its own Frame Generation equivalent, or drop out of the mid/high end.
But going from "inferior", for example, to "broken" as a description of FSR is a huge leap, and someone should base that on evidence other than a subjective opinion about how it runs on a competing solution.

Alan Wake 2 has shown us that upscaling and even Frame Generation are becoming a necessity, because visual quality is being pushed harder than the hardware can follow.

You already have that. Any generic "Image quality on AMD, Nvidia and Intel. Did we spot any differences?" will produce that.

Let's say FSR uses commands A, B, C, D and E. These commands are also used for other things in games, e.g. A & B for shader X, C & D for shader Y, and E for post-processing effect Z. So, if any of these produce a difference, it will be visible in a generic image quality comparison.

This is like: if we know that A+B+C=X, then we also know that B+C+A, C+A+B, B+A+C and C+B+A are also X. But you keep arguing that we can't be sure B+A+C is really X if no one actually calculates it.
Oh come on. I didn't ask for more theory. Please do something other than replying to me with more theory.
 
Joined
Sep 10, 2018
Messages
6,907 (3.05/day)
Location
California
FSR will never reach DLSS levels, because it lacks dedicated hardware and AMD also lacks the experience and expertise of Nvidia's programmers in AI. AMD is also repeating one of its common errors of the past: it brings a technology to a certain level and then abandons it. We haven't seen any upgrades, fixes, or improvements to FSR in the last many months. We even see new games that use FSR 2.0 or 2.1 instead of the latest 2.2 version. There is of course FSR 3.0, but that was mostly a necessity, because Nvidia is doubling its scores on benchmark charts thanks to Frame Generation. So AMD either had to produce hardware twice as fast as Nvidia's, create its own Frame Generation equivalent, or drop out of the mid/high end.
But going from "inferior", for example, to "broken" as a description of FSR is a huge leap, and someone should base that on evidence other than a subjective opinion about how it runs on a competing solution.

Alan Wake 2 has shown us that upscaling and even Frame Generation are becoming a necessity, because visual quality is being pushed harder than the hardware can follow.


Oh come on. I didn't ask for more theory. Please do something other than replying to me with more theory.

It does seem like Microsoft and likely Sony want an AI version of it in their next consoles, so maybe RDNA4 will get some sort of hardware implementation.
 
Joined
Sep 21, 2023
Messages
30 (0.07/day)
Oh come on. I didn't ask for more theory. Please do something other than replying to me with more theory.
Theory explains why it doesn't make sense to do the comparison you would like to see. I'll wait for you to find someone to do the test in practice and give you the same answer.
 
Joined
Sep 6, 2013
Messages
3,328 (0.81/day)
Location
Athens, Greece
It does seem like Microsoft and likely Sony want an AI version of it in their next consoles, so maybe RDNA4 will get some sort of hardware implementation.
It wouldn't surprise me if one of the next consoles from MS or Sony ends up using Nvidia hardware.

Theory explains why it doesn't make sense to do the comparison you would like to see. I'll wait for you to find someone to do the test in practice and give you the same answer.
 
Joined
Jun 12, 2023
Messages
58 (0.11/day)
It's a fairly new project; we're just messing around with it and not promoting it too much. Our main site is the big money maker; YouTube just costs money at this stage. We're all willing to learn and improve, though.
I don't see myself going in front of the camera, though. Why stop doing what I love and am good at to be miserable instead? I still enjoy being involved with data collection, prep, ideas for presentation style, etc. I also produce B-roll for some videos, and I'm terrible at it but improving.
Well, thank you for your work! And I totally agree with your mentality and work ethic there. Have a good one.
 
Joined
Jun 12, 2015
Messages
2 (0.00/day)
Location
Taipei, Taiwan
Am I the only one to notice there's no system config, especially the CPU, listed for the benchmarking? This is only half the information; we don't know if this is on a Ryzen 7950X or an Intel 8100. I'm aware that the CPU doesn't play nearly as large a factor as the GPU in this testing, but it still factors in. These results are incomplete, and neither reproducible nor comparable, without the full system specs. TechPowerUp, please do better.
 