
Alan Wake 2: FSR 2.2 vs. DLSS 3.5 Comparison

Is a comparison of FSR on RTX and RX cards such a bad idea that it needs this much argumentation about why it shouldn't happen?
Anyone with both cards can compare how FSR looks on RTX and RX cards. I just tried to explain to you why no one bothers to do it. It just doesn't make sense.
 
Even out of curiosity this could be a nice review. And it does make sense. Years ago there was a debate about color quality on ATI/AMD cards versus Nvidia cards. It made sense to say "all cards produce the same standard output", but there were many rumors about differences in color quality between the two brands. In the end, people pointed at a default setting in the Nvidia driver as the reason for the difference.

Still, about FSR: as I said, why so many arguments against doing something that you describe as so simple to check?

(Links to 2018 and 2015 forum threads: Nvidia vs Amd color quality - Graphics Cards - Linus Tech Tips)

There were discussions about it even before 2010.
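
For context, the driver default people usually pointed at in those old threads was the HDMI dynamic range setting: the card outputting limited-range RGB (16-235) to a display expecting full range (0-255), which reads as washed-out colors. Whether that explains every report back then is my assumption, but the arithmetic of the remap is simple enough to sketch:

Code:
# Sketch of the "limited vs full RGB range" remap commonly blamed for the
# washed-out look; illustration only, not taken from any driver source.

def full_to_limited(value: int) -> int:
    """Remap a full-range 8-bit value (0-255) into limited range (16-235)."""
    return round(16 + value * (235 - 16) / 255)

for v in (0, 64, 128, 192, 255):
    print(f"full {v:3d} -> limited {full_to_limited(v):3d}")

# Prints 0->16, 64->71, 128->126, 192->181, 255->235. If the display then
# interprets that signal as full range, black is lifted and white is dimmed,
# i.e. exactly the kind of vendor "color quality" difference people reported.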
 

For sure it would be ideal, when comparing image quality, to test FSR on an AMD card, but I can also see why an Nvidia card is used: it makes testing much less work because all three technologies work on it.

I believe W1zzard is sending the reviewer a 7900 XT and an A750, so that will be nice, but you will still see the usual suspects saying something is wrong with the testing, etc.

On my end a 6700 XT produces the same results as my 3080 Ti/4090, so I'm not expecting anything to really change. The only real difference is how much sharpening an end user can stomach; it doesn't bother some people, it does bother others, and at least games let you adjust the sharpness anyway.

At the end of the day a lot of this is subjective. I personally can't stand shimmering artifacts or disocclusion artifacts, two things FSR does poorly to different degrees compared with what DLSS isn't good at. Even TAA in some games has major issues that both FSR and DLSS clean up, so it really does depend on the game.
 
Obviously it makes testing easier. No card swapping. Also, many who post reviews are just individuals with one PC that they use for testing, and probably only one card in their possession. They are not test labs. And the probable result is that all three cards are expected to offer identical image quality. Still, using one vendor's tech on another vendor's products doesn't necessarily exclude the possibility of finding differences, for obvious reasons: different architectures, different drivers. Isn't Arc, for example, FULLY compatible with DirectX 12? So why was Starfield unplayable on Arc GPUs when it came out? Why were AMD cards not displaying the stars? Why were Nvidia cards underperforming? Don't they all have the same support for DirectX 12? We expect something to work across the board only to end up with bugs and differences. We accept it as normal for a game to have 100 bugs and for drivers to have 100 bugs, and suddenly the idea of FSR not producing a perfectly identical picture on Nvidia or Intel cards compared with what it produces on AMD cards is absurd? Why?

The "usual suspects" can insist that the images used for an AMD vs intel vs Nvidia FSR side by side comparison, are coming in fact from the same graphics card and the reviewer is trying to fool the viewers. In my case, my question is sincere and I think logical. When choosing to use words like "broken", at least verify it first. Verify that the tech is working as it should in any hardware before using such words. In the past Hairworks and TressFX where tested in both competitors' cards to spot differences or performance issues. Why not test FSR?

Identical results are indeed the expected outcome, but before calling something "broken", shouldn't the hardware it was tested on be checked as a parameter first?

No one can "stand shimmering artifacts or disocclusion artifacts". Imagine choosing 4K and everything at ultra/high, only to see a picture that is dancing in frond of your eyes and probably sending you to get an aspirin half an hour latter for the headache.

"Subjectivity". One more reason for that hardware dependency check.
 
No, it still doesn't make sense. The color difference you posted is due to a lower-level technology. FSR is a much higher-level technology: it's an algorithm, and that algorithm is exactly the same on AMD and on Nvidia, since the code is the same. A difference can only come from the lower-level technology that FSR uses. But a difference in lower-level tech would not be exclusive to FSR; it would affect anything that uses it. In that case it makes sense to analyze that lower-level tech, not FSR.

I tried to explain this to you before, but you accused me of "excess argumentation". It just doesn't make sense to compare every high-level algorithm that executes the same commands on all cards, as then each review could be 100 pages. TPU did test AMD GPUs with FSR in the performance review. Don't you think that, if they had noticed any quality differences, they would have reported them?
 
See my reply above. No reason to repeat it and point at the obvious. A simple article that, as you say, will produce zero evidence of differences is in fact not a bad idea. And considering that many sites today would love to have content that others don't, that is one more reason for an article titled "FSR on AMD, Nvidia and Intel. Did we spot any differences?".
 

Honestly, I would love FSR to be outright superior to DLSS, because that would mean better image quality with similar benefits. I'm of the mind that both companies need to improve and neither should be given a pass, and to a certain extent both have improved. I just wish FSR 2 were a bit more refined at this point, and I think people who purchase AMD GPUs deserve that much. Upscaling should never be required, though it is nice to have, especially as our hardware ages.
 
You already have that. Any generic "Image quality on AMD, Nvidia and Intel. Did we spot any differences?" will produce that.

Let's say FSR uses commands A, B, C, D and E. These commands are also used for other things in games, e.g. A & B for shader X, C & D for shader Y, E for post-processing effect Z. So, if any of these produces a difference, it will be visible in a generic image quality comparison.

This is like, if we know that A+B+C=X, then we also know B+C+A, C+A+B, B+A+C and C+B+A are also X. But you keep arguing that we can't be sure B+A+C is really X if no one actually calculates it.
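
To put the same argument in toy form (an illustration of the reasoning only, not real FSR code): if the shared primitives behave identically on two vendors, everything composed from them matches too, and a primitive-level difference would leak into other effects as well, not just FSR.

Code:
# Toy model of the argument above: features are compositions of shared
# primitive ops A..E. A low-level difference ("bugged" vendor) shows up in
# every feature built on the affected primitive, not only in the FSR-like one.

def make_primitives(bugged: bool):
    a = lambda x: x + 1
    b = lambda x: x * 2
    c = (lambda x: x - 0.4) if bugged else (lambda x: x - 0.5)  # the "difference"
    d = lambda x: x / 3
    e = lambda x: x ** 2
    return a, b, c, d, e

def fsr_like(prims, x):       # "FSR" built from the primitives
    a, b, c, d, e = prims
    return e(d(c(b(a(x)))))

def shader_like(prims, x):    # some other effect sharing primitives A, B, C
    a, b, c, _, _ = prims
    return c(b(a(x)))

ok, bug = make_primitives(False), make_primitives(True)
print(fsr_like(ok, 1.0) == fsr_like(bug, 1.0))        # False: visible in "FSR"
print(shader_like(ok, 1.0) == shader_like(bug, 1.0))  # False: and everywhere else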
 
FSR will never reach DLSS levels, because it lacks dedicated hardware, and AMD also lacks the experience and expertise of Nvidia's AI programmers. AMD is also repeating one of its common mistakes of the past: they bring a technology to a certain level and then abandon it. We haven't seen any upgrades, fixes or improvements to FSR for many months now. We even see new games shipping with FSR 2.0 or 2.1 instead of the latest 2.2 version. There is of course FSR 3.0, but that was mostly a necessity, because Nvidia is doubling its scores on benchmark charts thanks to Frame Generation. So AMD either had to produce hardware twice as fast as Nvidia's, create its own Frame Generation equivalent, or drop out of the mid/high end.
But going from "inferior" to "broken" as a description of FSR is a huge leap, and someone should base it on evidence other than a subjective opinion about how it runs on a competing vendor's hardware.

Alan Wake 2 has shown us that upscaling and even frame generation are becoming a necessity, because visual quality is being pushed harder than the hardware can follow.

Oh come on. I didn't ask for more theory. Please do something other than reply to me with more theory.
 

It does seem like Microsoft, and likely Sony, want an AI version of it in their next consoles, so maybe RDNA4 will get some sort of hardware implementation.
 
Theory explains why it doesn't make sense to do the comparison you would like to see. I'll wait for you to find someone to do the test in practice and tell you the same answer.
 
It wouldn't surprise me if one of the next consoles from MS or Sony ends up using Nvidia hardware.

 
It’s a fairly new project; we’re just messing around with it and not promoting it too much. Our main site is the big money maker, YouTube just costs money at this stage, but we’re all willing to learn and improve.
I don’t see myself going in front of the camera though. Why stop doing what I love and am good at, just to be miserable instead? I still enjoy being involved with data collection, prep, ideas for presentation style, etc. I also produce B-roll for some videos and I’m terrible at it, but improving.
Well thank you for your work! And I totally agree with your mentality and work ethic there. Have a good one
 
Am I the only one to notice there's no system configuration, especially the CPU, listed for the benchmarking? This is only half the information; we don't know if this is on a Ryzen 7950X or an Intel 8100. I'm aware that the CPU doesn't play nearly as large a factor as the GPU in this testing, but it still factors in. These results are incomplete and neither reproducible nor comparable without the full system specs. TechPowerUp, please do better.
 