Thursday, February 14th 2019

AMD Doesn't Believe in NVIDIA's DLSS, Stands for Open SMAA and TAA Solutions

A report via PCGamesN paints AMD's stance on NVIDIA's DLSS as a rather decided one: the company stands for further development of SMAA (Enhanced Subpixel Morphological Antialiasing) and TAA (Temporal Antialiasing) solutions on current, open frameworks, which, according to AMD's director of marketing, Sasa Marinkovic, "(...) are going to be widely implemented in today's games, and that run exceptionally well on Radeon VII", rather than investing in yet another proprietary solution. While AMD pointed out that DLSS' market penetration is low, that isn't the main point of contention. In fact, AMD goes head-on against NVIDIA's own technical presentations, which compare DLSS' image quality and performance benefits against a native-resolution, TAA-enhanced image: AMD says that SMAA and TAA can work equally well without "the image artefacts caused by the upscaling and harsh sharpening of DLSS."
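
For context, TAA gets its antialiasing by reusing information from previous frames rather than rendering extra samples per pixel: each new frame is blended into a reprojected history buffer. Below is a minimal sketch of that accumulation step, omitting the motion-vector reprojection and neighborhood clamping a real implementation needs, and with an arbitrarily chosen blend factor:

```python
import numpy as np

def taa_resolve(current: np.ndarray, history: np.ndarray,
                alpha: float = 0.1) -> np.ndarray:
    """Blend the current frame into the accumulated history buffer.

    current, history: float32 arrays of shape (H, W, 3) in [0, 1].
    alpha: weight of the new frame; smaller values favor temporal
    stability, larger values favor responsiveness. A real TAA pass
    also reprojects `history` along per-pixel motion vectors and
    clamps it against the current frame's neighborhood to limit
    ghosting; both steps are omitted in this sketch.
    """
    return alpha * current + (1.0 - alpha) * history

# Usage: run every rendered frame through the resolve, feeding the
# output back in as the next frame's history.
history = np.zeros((1080, 1920, 3), dtype=np.float32)
frame = np.random.rand(1080, 1920, 3).astype(np.float32)  # stand-in render
history = taa_resolve(frame, history)
```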

Of course, AMD may only be speaking from the point of view of a competitor that has no competing solution. However, company representatives said that they could, in theory, develop something along the lines of DLSS via a GPGPU framework - a task for which AMD's architectures are usually extremely well-suited. And AMD isn't taking its eyes off DLSS-like approaches entirely: Nish Neelalojanan, an exec in AMD's gaming division, points to potential DLSS-like implementations across "some of the other broader available frameworks, like WindowsML and DirectML", saying that these are "something we [AMD] are actively looking at optimizing… At some of the previous shows we've shown some of the upscaling, some of the filters available with WindowsML, running really well with some of our Radeon cards." So whether this is a genuine image-quality philosophy, or just a matter of a competing technology's time to market (TTM), only AMD knows.
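
Neither AMD nor the report goes into implementation detail, but as a rough illustration of what running an upscaling network through one of those broader frameworks could look like, here is a hedged sketch using ONNX Runtime's DirectML execution provider; the model file and tensor names are hypothetical placeholders, not anything AMD has shown:

```python
import numpy as np
import onnxruntime as ort

# Hypothetical super-resolution network in ONNX format. The file name,
# the "input" tensor name, and the NCHW layout are assumptions made
# for illustration; AMD has not published such a model.
session = ort.InferenceSession(
    "upscaler.onnx",
    providers=["DmlExecutionProvider", "CPUExecutionProvider"],
)

# A single 1080p frame, to be upscaled toward 4K by the network.
frame = np.random.rand(1, 3, 1080, 1920).astype(np.float32)

# DirectML dispatches the inference to any DX12-capable GPU, which is
# exactly the vendor-neutral angle AMD is pointing at here.
outputs = session.run(None, {"input": frame})
print(outputs[0].shape)  # e.g. (1, 3, 2160, 3840) for a 2x model
```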
Source: PCGamesN

170 Comments on AMD Doesn't Believe in NVIDIA's DLSS, Stands for Open SMAA and TAA Solutions

#151
ratirt
satrianiboys said:
RTX & DLSS for sure will meet their glory later on
Just like how PhysX & HairWorks are still being used today, and how everybody has a G-Sync monitor

Every Nvidia innovation is guaranteed to be a success
What a hoot :) I literally fell off my chair :) Just because it's in the game doesn't mean it's being used :)
Posted on Reply
#152
95Viper
Stay on topic.
Don't insult other members.
Don't bicker back and forth with each other with off-topic banter.

Thanks.
Posted on Reply
#153
moproblems99
RichF said:
If anything, humanity has become less emotional
Maybe so, but humanity hasn't become any more logical.
Posted on Reply
#154
mtcn77
RichF said:
If you think DLSS is an improvement over competing technologies then you can count it as innovation. If not, then it's not innovation. It's merely change. The notion that change is always a good thing is the fallacy of liberalism. The opposing fallacy is the belief that tradition is superior to change (the fallacy of conservatism).

Coming up with clever new ways to trick people into spending their money (marketing innovation) counts as innovation if you're a stockholder but it's not in the interest of the typical product buyer.
You are forgetting this is a businesswise decision to shove more transistors to take-in contemporaneous solutions.
Posted on Reply
#155
RichF
mtcn77 said:
You are forgetting this is a businesswise decision to shove more transistors to take-in contemporaneous solutions.
What? I have no idea what you're trying to say.
Posted on Reply
#156
mtcn77
RichF said:
What? I have no idea what you're trying to say.
Posted on Reply
#157
Super XP
Disable DLSS for the best possible Picture Quality.
That's it.
satrianiboys said:
RTX & DLSS for sure will meet their glory later on
Just like how PhysX & HairWorks are still being used today, and how everybody has a G-Sync monitor

Every Nvidia innovation is guaranteed to be a success
I see the sarcasm in your post. As for HairWorks, it doesn't look as good as some might think; perhaps it's a matter of opinion whether one likes it or not.
As for DLSS, that's complete garbage.
Raevenlord said:
Of course, AMD may only be speaking from the point of view of a competitor that has no competing solution.
AMD has no reason to compete with something that's already proven to be a complete failure.
Posted on Reply
#158
mandelore
I have changed my mind on DLSS: having played Metro Exodus on a 2080 Ti since the updates, I can say that the blur introduced is very minimal now, especially when upscaling the resolution. It actually provides a nice trade-off for high-resolution performance when paired with ray tracing. The first examples looked terrible; now, when switching DLSS on and off, there is such a small degree of quality loss that it can be compensated for by rendering at higher resolutions if your rig can manage it.

My updated Two Cents
Posted on Reply
#159
FordGT90Concept
"I go fast!1!11!1!"
But remember that DLSS is only a thing to use those tensor cores that would otherwise sit idle. If those tensor core transistors were instead put to work on rendering, you could get the higher resolution/higher framerate without the blurriness. In a graphics product, the tradeoff is stupid. Put tensor cores on compute products. Hell, make discrete tensor cards for systems that need them and divorce them entirely from the existing product stack.
Posted on Reply
#160
Vayra86
FordGT90Concept said:
But remember that DLSS is only a thing to use those tensor cores that would otherwise sit idle. If those tensor core transistors were instead put to work on rendering, you could get the higher resolution/higher framerate without the blurriness. In a graphics product, the tradeoff is stupid. Put tensor cores on compute products. Hell, make discrete tensor cards for systems that need them and divorce them entirely from the existing product stack.
The reason Nvidia sections off a part of the die is so they have new growth capacity. It is very much in line with the quotes from Jensen earlier in this thread: Nvidia needs a new market now that GPUs can pretty much rip through the common resolutions. They need a new incentive for people to upgrade (4K is not it, mind you; it is still niche and only has a horizon of 1-2 generations). This underlines that in Nvidia's mind it's certainly going to be a long-term strategy item as well as a USP, as it is today. I can see the business case, and the competition is far from catching up to it.

That also underlines the primary motive for Nvidia. The motive is not 'the tech/hardware is capable now'. They just repurposed technology to 'make it work', and that is the starting point from which they'll probably iterate.

Nevertheless, I do agree, given the RT implementations we've seen at this point and the die sizes required to get there. On the other hand, if you look at the way Nvidia built up Turing, it's hard to imagine getting the desired latencies etc. for all that data transport to a discrete RT/tensor-only card.
Posted on Reply
#161
Ubersonic
mandelore said:
The first examples looked terrible; now, when switching DLSS on and off, there is such a small degree of quality loss that it can be compensated for by rendering at higher resolutions if your rig can manage it.
I agree that what you say is factually correct: with DSR on, DLSS does look as good as MSAA. However, DSR lowers FPS more than MSAA does, so DLSS requiring DSR to be visually passable completely defeats the entire purpose of DLSS. What's the point of using a feature that cuts out the performance overhead of AA if you have to add an even bigger performance overhead to stop it looking like ****?
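
To put rough numbers on that trade-off, here is the pixel-budget math; the 1440p internal resolution assumed for 4K DLSS is an illustration of how the feature was described at the time, not a published figure:

```python
# Rough pixel budgets behind the trade-off described above. The 1440p
# internal render resolution for 4K DLSS is an assumption for
# illustration, not a published NVIDIA figure.
def pixels(width: int, height: int) -> int:
    return width * height

native_4k = pixels(3840, 2160)         # the target output
dlss_internal = pixels(2560, 1440)     # assumed DLSS render resolution
dsr_4x = pixels(3840 * 2, 2160 * 2)    # 4x DSR supersamples the output

print(f"DLSS renders {dlss_internal / native_4k:.0%} of native 4K pixels")
print(f"4x DSR renders {dsr_4x / native_4k:.0%} of native 4K pixels")
# DLSS renders 44% of native 4K pixels
# 4x DSR renders 400% of native 4K pixels
```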
Posted on Reply
#162
mtcn77
Ubersonic said:
I agree that what you say is factually correct: with DSR on, DLSS does look as good as MSAA. However, DSR lowers FPS more than MSAA does, so DLSS requiring DSR to be visually passable completely defeats the entire purpose of DLSS. What's the point of using a feature that cuts out the performance overhead of AA if you have to add an even bigger performance overhead to stop it looking like ****?
DLSS uses FP16, which is half precision, yet precisely equal to the highest color channel bit depth. The gain comes from that, since you don't need any higher color precision; even INT8 is sufficient for desktop. You can save a lot in the render backends, which are Nvidia's biggest weakness compared to AMD.

Your float is overflow-safe so long as the average divides a sum that was accumulated at a higher bit depth than the individual pixels.
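
A minimal sketch of that point, assuming 8-bit pixel data: the sum must be accumulated in a wider type than the inputs, or the average silently wraps around:

```python
import numpy as np

# Averaging four 8-bit pixel values. Summing in uint8 wraps around
# modulo 256, so the accumulator must be widened before dividing,
# which is the point being made above. The 8-bit input is an assumed
# example format.
samples = np.array([200, 220, 240, 250], dtype=np.uint8)

wrong = samples.sum(dtype=np.uint8) // len(samples)   # (910 % 256) // 4 = 35
right = samples.sum(dtype=np.uint32) // len(samples)  # 910 // 4 = 227

print(wrong, right)  # 35 227
```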

Figure 1. Tensor cores signficantly accelerate FP16 and INT8 matrix calculations
*Signficantly* Really, Nvidia?
Posted on Reply
#163
mandelore
Ubersonic said:
I agree that what you say is factually correct: with DSR on, DLSS does look as good as MSAA. However, DSR lowers FPS more than MSAA does, so DLSS requiring DSR to be visually passable completely defeats the entire purpose of DLSS. What's the point of using a feature that cuts out the performance overhead of AA if you have to add an even bigger performance overhead to stop it looking like ****?
That was my original argument too, but actually using it, I have found it to work out pretty well. Not perfect, but the realism afforded by ray tracing requires the performance boost from DLSS to make it work, and it does work. I am really, like REALLY, finicky about smooth gameplay; I can barely tolerate sub-80 fps, as I always seem to see frame jitter and the like. The FPS boost and lighting quality work well enough that the game feels smooth and detailed.
Obviously this is just my experience, on my hardware, with just one game implementation, but it has notably improved my enjoyment of this game.
Posted on Reply