I don't think you guys understand how AMD played this one. There was no pre-release info, briefing, or support. The tech suddenly appeared publicly in two games last week.
On Friday afternoon, right before everybody heads off for the weekend, their PR people emailed me: "new tech is live now, do you want keys?"
I'm in Paris right now for some family time between launches.
If you think you can do a better job, just let me know. I'll even pay you.
I think most people's issue is with testing an AMD feature solely on Nvidia hardware, when other key features that work exclusively on RDNA3 hardware aren't there to support FG. The same goes for the FSR reviews, tbh: just because AMD allows certain software features to be hardware-agnostic doesn't mean testing with one brand of hardware will paint the entire, or proper, picture. It absolutely comes across as a bad way to review software features.
Why didn't you ask Nvidia PR to send you the written article? I'm sure they would have come to the same conclusion, but delivered a faster and more comprehensive piece, testing FSR 3 on an actual AMD GPU to seem less biased.
For those who have been wondering if it's possible to combine DLSS Super Resolution with AMD's FSR 3 Frame Generation technology, the answer is simple—you can't enable NVIDIA DLSS and AMD's Frame Generation at the same time in this game. And that's somewhat disappointing, because with DLSS enabled the overall quality of generated frames would have been significantly improved.
It is Nvidia's job to make DLSS compatible with AMD's FSR 3 FG, not the other way around, since the latter is open-source. Disappointing indeed that AMD didn't reverse-engineer DLSS to make FSR FG compatible with it.
I'm not "complaining" about either nor even mentioned any brand. You said the 'fake frame' tech in general is "good for simulators" and I simply asked a polite question of whether it (in general) still messes up the HUD as was seen in the video.
It is an open technology that does not require machine learning (ML) hardware, allowing it to be supported on a broad range of products and platforms, including consoles.
When using DLSS Frame Generation you can just toggle the feature on and you are ready to go, with no issues under the hood and no extra steps needed. If you try to do the same with AMD's Frame Generation solution, you'll get unstable frame pacing with constant stuttering even at high framerates, which results in a very sluggish and stuttery experience. To avoid these issues, it is highly recommended to turn on Vsync and set a custom framerate limit a few frames below your monitor's refresh rate, for example 58 FPS for 60 Hz or 142 FPS for 144 Hz. It is important not to hit your monitor's refresh rate limit, to avoid the additional ~30% input latency increase that Vsync causes at the cap. Also, FSR 3 Frame Generation in its current state does not work properly with Variable Refresh Rate (VRR) technologies such as G-Sync and FreeSync.
tl;dr: turn on Vsync with a framerate cap ~2% below your monitor's refresh rate - also AMD bad, Nvidia good
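The cap rule from that quote is easy to script. A minimal sketch, assuming a flat two-frame margin (the safe_fps_cap helper is made up, not from the article or any AMD/NVIDIA tool; the example numbers are the article's):

```python
# Minimal sketch of the quoted advice: cap the framerate a couple of
# frames below the monitor's refresh rate so Vsync never saturates.
# Hypothetical helper, not from the article or any vendor tool.

def safe_fps_cap(refresh_hz: int, margin_frames: int = 2) -> int:
    """Return a framerate cap a few frames under the refresh rate."""
    return refresh_hz - margin_frames

for hz in (60, 144):
    print(f"{hz} Hz monitor -> cap at {safe_fps_cap(hz)} FPS")
# 60 Hz -> 58 FPS, 144 Hz -> 142 FPS, matching the article's examples
```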
For comparison, with AMD's Frame Generation solution it is recommended to have at least 70 FPS as the base framerate, while with DLSS Frame Generation it is recommended to have a minimum of 60 FPS
AMD bad, Nvidia good. Also, the official AMD website advises 60 FPS, so I guess some invention is necessary to make AMD more bad.
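For what it's worth, here are the competing guidelines side by side as a trivial sketch (the thresholds are the article's and AMD's published figures; the constant names and the helper are mine):

```python
# The competing minimum-base-framerate guidelines mentioned above.
FSR3_FG_MIN_BASE_ARTICLE = 70  # what the article recommends for FSR 3 FG
FSR3_FG_MIN_BASE_AMD = 60      # what AMD's own website advises
DLSS_FG_MIN_BASE = 60          # the article's figure for DLSS FG

def fg_worth_enabling(base_fps: float, floor: int) -> bool:
    """Frame generation only feels good above a base-framerate floor."""
    return base_fps >= floor

print(fg_worth_enabling(65, FSR3_FG_MIN_BASE_ARTICLE))  # False
print(fg_worth_enabling(65, FSR3_FG_MIN_BASE_AMD))      # True
```

A 65 FPS base lands exactly in the disputed gap between the two figures.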
you'll get the input latency of around 35 FPS with the smoothness of 60 FPS, which felt very slow and sluggish, especially when rapidly moving the camera, and in the second scenario you'll get the input latency of around 80 FPS with the smoothness of 140 FPS, which was a much better experience in terms of overall responsiveness, but still didn't really feel like playing at native 140 FPS.
AMD bad
Let's not mention that Nvidia is just as bad, since that's simply how frame generation technology works. Nvidia good.
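Setting the snark aside for a second, the quoted "latency of around X FPS" phrasing is easiest to read as per-frame time. A quick conversion (my own arithmetic, not a measurement from the article):

```python
# Convert a "latency-equivalent framerate" into per-frame milliseconds,
# which is closer to what you actually feel on the mouse.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for label, fps in [("latency-equivalent 35 FPS", 35),
                   ("latency-equivalent 80 FPS", 80),
                   ("native 140 FPS", 140)]:
    print(f"{label}: {frame_time_ms(fps):.1f} ms")
# ~28.6 ms vs ~12.5 ms vs ~7.1 ms: why 140 FPS via frame generation
# still doesn't feel like native 140 FPS
```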
It is important to note that AMD has its own latency-reduction feature called "Anti-Lag+," which is only available for RX 7000 series GPUs and thus wasn't applied in our testing, as we're using an NVIDIA RTX 3080 GPU.
FSR 2 bad, AMD SR bad
Nvidia's DLSS 3 also changed nothing about DLSS 2's rendering; let's not mention that and just focus on AMD bad.
Speaking of image quality, it is important to note that in order to use AMD's Frame Generation solution, the Super Resolution upscaling component is required. Unfortunately, the image quality of the Super Resolution component hasn't improved with the third version of FSR, and the game is still essentially using FSR 2 for upscaling, which, as we have tested numerous times, has major instability in motion, especially at lower resolutions. Forspoken in particular is a fast-paced action game with a lot of small particle effects on screen during combat, and the FSR upscaling solution simply fails to render these details, producing a very blurry, pixelated and aliased image in motion, especially at 1080p and 1440p. The FSR upscaling also shows very noticeable disocclusion artifacts around the main character. Because the Super Resolution upscaling component is required for Frame Generation to work, all of these image quality issues carry over into the generated frames and become even more noticeable when Frame Generation is enabled, creating an even more unstable image in motion.

However, there is a "Native AA" mode available in the FSR 3 quality settings, which runs the Super Resolution technology without its upscaling component, similarly to NVIDIA's DLAA, but at a higher performance cost compared to the native TAA solution. With native FSR enabled, the overall image is sharper, but it still has shimmering issues, disocclusion artifacts and pixelated particle effects; they are just a bit less visible.
On the good side, AMD's Frame Generation solution does not have any issues with the in-game on-screen UI, the area where DLSS Frame Generation often has issues.
AMD not work with Nvidia's closed proprietary tech, AMD bad, Nvidia work with AMD open tech, Nvidia good
Speaking of performance, with FSR upscaling in Quality mode and FSR 3 Frame Generation enabled, you can expect doubled performance across all resolutions compared to native rendering.
So the entire article was 80% "AMD bad, Nvidia not bad", and at the end we get a one-liner of "the thing doubles your framerate, BTW, and outside of complaining about FSR 2 or generally not liking AMD, I have nothing to say about that".
This article is like me reviewing a movie and spending 90% of the review complaining that the other director has a better camera, the cinema's popcorn was terrible, the seats were uncomfortable, the sound system was poor, and the director's wife was better when she was my wife, and then ending with "I guess the movie was a masterpiece".
Even if AMD showed their usual PR competence by shitting the bed before asking reviewers to lie in it, an article where 10% is dedicated to the actual tech and 90% to complaining about AMD is pretty ridiculous. It was hilarious to read and re-read, though.
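To be fair to the one performance line the article did include: the doubling follows directly from how interpolation-based frame generation works, on both vendors. A rough conceptual sketch, not AMD's or Nvidia's actual implementation:

```python
# Conceptual sketch of interpolation-based frame generation.
# Purely illustrative; real implementations use motion vectors /
# optical flow rather than a naive average.

def interpolate(prev_frame, next_frame):
    return [(a + b) / 2 for a, b in zip(prev_frame, next_frame)]

def present_with_frame_generation(rendered):
    shown = []
    for prev, nxt in zip(rendered, rendered[1:]):
        shown.append(prev)                    # real frame...
        shown.append(interpolate(prev, nxt))  # ...then a generated one
    shown.append(rendered[-1])
    return shown  # roughly 2x the frames actually rendered

frames = [[0.0], [1.0], [2.0], [3.0]]  # four rendered "frames"
print(len(present_with_frame_generation(frames)))  # 7, roughly doubled
```

Note the catch baked into the loop: a real frame can only be shown once the next one has been rendered, which is where the added input latency comes from, regardless of vendor.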
I'm not "complaining" about either nor even mentioned any brand. You said the 'fake frame' tech in general is "good for simulators" and I simply asked a polite question of whether it (in general) still messes up the HUD as was seen in the video.
Polite or not, the question is ridiculous. And you didn't "mention" any brands; you just pulled up a video of an Nvidia tech to call into question the value of an AMD tech.
It's as if Dell made you a terrible PC, so now you question every brand of prebuilt PC. AMD's tech is not Nvidia's tech. And the UI problems that Nvidia has do not seem to exist here, incidentally.
"Has the previously shown problem with frame-generation been fixed yet or not" with a genre you claim "it works well for" isn't ridiculous to most normal people. Jesus dude if you're too angry to respond normally then don't bother. I'll ask someone less fanboyish on another FSR / DLSS thread...
"Has the previously shown problem with frame-generation been fixed yet or not" with a genre you claim "it works well for" isn't ridiculous to most normal people. Jesus dude if you're too angry to respond normally then don't bother. I'll ask someone less fanboyish on another FSR / DLSS thread...
Oh wait, you were asking if DLSS 3 fixed its UI problems in an FSR 3 article, so of course I misunderstood your question. I thought you were talking about FSR 3 in an FSR 3 article.
To answer your question: I don't know since I didn't buy Nvidia and have no access to this tech.
I hope this answer suits you well. It was the best answer you were going to get from a guy with an AMD graphics card in an article about an AMD graphical technology, I'm afraid.
I was actually asking about both (whether the "VHS jitter" style HUD corruption issue previously seen in MS Flight Sim is still present today on either DLSS 3 or FSR 3), and I responded to you in this thread because you said (in this thread) "frame generation is good for simulators, etc.", so I assumed you had experience with it and just asked offhand. But if all you have is a wall of passive-aggressive responses, then as I said, don't worry, I'll ask someone else...
In both Forspoken and Immortals, I didn't notice any artifacting on GUI elements, even when turning on FG while at 40 FPS. They must be doing something differently from Nvidia for GUI rendering.
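My guess at how, sketched below; this is pure speculation on my part, not either vendor's actual code: interpolate only the HUD-less scene, then composite the real UI over every output frame, so HUD pixels are never synthesized.

```python
# Speculative sketch of HUD-safe frame generation: interpolate the
# scene without the UI, then draw the real HUD over every output
# frame. My own illustration, not AMD's (or Nvidia's) implementation.

def composite_hud(scene, hud):
    # Stand-in for alpha blending: a HUD pixel wins where it exists.
    return [h if h is not None else s for s, h in zip(scene, hud)]

def output_frames(scene_prev, scene_next, hud):
    mid = [(a + b) / 2 for a, b in zip(scene_prev, scene_next)]  # naive interp
    # The HUD is applied after interpolation, so the UI stays
    # pixel-exact even in the generated frame:
    return [composite_hud(scene_prev, hud), composite_hud(mid, hud)]

print(output_frames([0.0, 0.0], [1.0, 1.0], [None, 9.0]))
# [[0.0, 9.0], [0.5, 9.0]] -> the HUD pixel (9.0) is identical in both
```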
While I don't disagree about not testing on AMD hardware, it's far more likely the person doing the review didn't have an AMD card on hand, which is still a huge oversight. It doesn't necessarily make someone a fanboy.
You might just be used to it after ~20 years and it seems normal to you. If you'd look at a CRT right now, or an LCD with BFI/ULMB, you'd be astonished.
I usually don't notice any TAA ghosting until people point it out. Sometimes not even then. I think some people do actually confuse TAA ghosting with LCD smearing, because they say they see something in a video, while I don't see it on an OLED, whether in motion or paused.
I notice TAA smearing in some games sometimes, but I don't in others. Not that it bothers me much, I just find it weird.
If I got used to all LCD features and caveats in ~20 years, then I think it's a non-issue. What you don't notice is not a problem, and doesn't need to be pointed out by a reviewer. Why do we focus on the negative so much instead of enjoying the game?
FSR 3 (just like FSR 2) is hardware-agnostic, so all the fanboys moaning about testing on a GeForce RTX are purely hilarious. Yes, GeForce RTX can't use Anti-Lag+, but he didn't measure latency anyway. Also, the bad experience with a ~30 FPS base framerate before frame gen applies to NVIDIA as well, so no difference there.
It's bad practice. The article in general, though, doesn't meet TPU's typical standards. We understand why this happened... but honestly, it lacks proper testing with current-gen, or vendor-appropriate, hardware, and comes to a conclusion that's kinda off base because of that.
@W1zzard, you guys can do better than this. If you can't hit a proper standard, don't release. LTT just got shredded for stuff like this.
*late edit*
To be fair, I think it would be an awesome forum post. But as an article... it feels off. I've pulled back a bit because, re-reading it, I think I was way too harsh. Sorry about that.
Nvidia just has a much bigger and better PR team that usually doesn't dunk on itself. When DLSS 3 FG was announced and then released, you could read articles and watch videos about it everywhere, and it arrived in more and better games.
Here we have AMD releasing it pretty quietly (yeah, they had their announcement at Gamescom, but it's not the same) in only two games, each being played on Steam by literally fewer than 100 people.
Releasing the preview driver was honestly better than adding FSR3 to those two games, in my opinion; at least in my case it impressed me more, being able to turn a feature on in the driver and suddenly get 100 FPS with RT in Witcher 3.
If the tech itself were not impressive, I could understand it, but FSR3 and AFMF really look great. Whether you like FG or not is a different topic, but given that it works, doesn't look worse than DLSS FG, and isn't locked to only the newest cards from one vendor, they should get it into more high-profile games (Starfield, for example), give the updated versions to the press earlier, hand out the preview driver, and let the hype do its job.
If I'm not mistaken, you can already apply FG at the driver level to DX11 & DX12 titles (per some users in the 7900 XTX owners' thread at OCN). Whether or not it works properly is another question.
Yeah, you can, with the preview driver I mentioned. Tried it, and it works to some degree. I tried Witcher 3 and Cyberpunk, and as long as you don't move your mouse too quickly, it works great for a driver-level feature. Hope they improve it before the official release next year. But they also need a lot more games on board with FSR3. And since they've got the feature released now, it would be good if they improved the upscaling part of FSR3.
Forspoken is kinda weird. Maybe because it is still a demo? Though this also seems typical these days: there are so many different settings and ways to play with different tech, it must be hard to keep things straight. Just looking at what you have here, why the heck is FSR 4K Quality so much blurrier than even 1080p FSR Quality?
Pop-in is really bad as well. Walk around and watch the rocks: rocks really far away pop in, and rocks right under you pop in at the same time. When you enable FSR, distant objects (really easy to see on the cliff face) get SO much more detail. Without FSR enabled, the rock face is just a blurry mess. All these little quirks are annoying.
I just didn't understand your question and don't have a response. Sorry.
And I didn't say "for simulators" alone; I just illustrated the types of games it can work for and the ones it can't. I don't think sims are games that require any kind of high reactivity.
I've had multiple people say that Forspoken is just kind of not great on its own, and worse with FSR 2, but Aveum is actually a total success in both its FSR implementation and native rendering.
Of course Aveum doesn't have a demo, so you'd have to buy the game to verify...
Let's not forget that this is one game tested with a preview driver. Even tested on an AMD card wouldn't give conclusive results about the technology as a whole, imo.
So far, the frame generation is the same quality as DLSS 3.0; all the problems can be attributed to FSR 2 simply being inferior to DLSS 2.
Immortals of Aveum is perfectly fine but Forspoken has some jittery problems when rotating the camera, so that needs work.
"If you try to do the same with AMD's Frame Generation solution, you'll get an unstable frame pacing with constant stuttering even at high framerates, which will result in a very sluggish and stuttery experience."
Is this a joke? We know there is a bug that doesn't show frame pacing correctly.