
AMD Doesn't Believe in NVIDIA's DLSS, Stands for Open SMAA and TAA Solutions

RTX & DLSS for sure will meet their glory later on
Just like how PhysX & HairWorks are still being used today, and how everybody has a G-Sync monitor

Every Nvidia innovation is guaranteed to be a success

What a hoot :) I literally fell off my chair :) Just because it's in the game doesn't mean it's being used :)
 
Low quality post by satrianiboys
Sigh..
Who is the donkey now for taking my words literally..

Guess I should become a donkey too by putting /s next time
Yes, only a donkey (read: plank) always needs an /s to understand the context of some words
 
Low quality post by INSTG8R
Sigh..
Who is the donkey now for taking my words literally..

Guess I should become a donkey too by putting /s next time
Yes, only a donkey (read: plank) always needs an /s to understand the context of some words
Your backpedaling is just as pathetic as your original statement...
 
Stay on topic.
Don't insult other members.
Don't bicker back and forth with off-topic banter.

Thanks.
 
Low quality post by medi01
RTX & DLSS for sure will meet their glory later on
Just like how PhysX & HairWorks are still being used today, and how everybody has a G-Sync monitor

Every Nvidia innovation is guaranteed to be a success
Thanks for the chuckles.
 
Low quality post by RichF
I mean, look at the average human; logic is mostly gone.
That's the classic Star Trek TOS fallacy. Emotions are an expression of logic.

Girlfriend sells your gaming PC to buy shoes = anger.

As for the implication that humanity has become more emotional and less logical, consider how many people are on emotional depressants (antidepressants). If anything, humanity has become less emotional due to the desire to medicate away the higher highs and lower lows. There is also the War on Drugs. Back when people were free to use whatever drugs they wanted, the resulting emotions were likely more intense and less rational.

Various cultures have tried a lot of things to reduce emotion. Some Buddhists aren't allowed to eat things like onions and garlic because it is believed that their flavors are too intense. The Kellogg cereal company got its start by peddling products (cereals like Corn Flakes) that were supposed to taste so bland they would prevent sexual arousal.

The focus here is on the fact that it's something new.
Newness does not equal innovation.

innovation = striking improvement
evolution = incremental improvement
iteration = possibly no improvement

So, if company X releases GPU A, then GPU B, and GPU B has marginal differences, it is an iterative product. What constitutes evolution and iteration is subjective. However, innovation clearly implies a very significant improvement.

If you think DLSS is an improvement over competing technologies then you can count it as innovation. If not, then it's not innovation. It's merely change. The notion that change is always a good thing is the fallacy of liberalism. The opposing fallacy is the belief that tradition is superior to change (the fallacy of conservatism).

Coming up with clever new ways to trick people into spending their money (marketing innovation) counts as innovation if you're a stockholder, but it's not in the interest of the typical product buyer.
 
If you think DLSS is an improvement over competing technologies then you can count it as innovation. If not, then it's not innovation. It's merely change. The notion that change is always a good thing is the fallacy of liberalism. The opposing fallacy is the belief that tradition is superior to change (the fallacy of conservatism).

Coming up with clever new ways to trick people into spending their money (marketing innovation) counts as innovation if you're a stockholder, but it's not in the interest of the typical product buyer.
You are forgetting this is a business-wise decision to shove in more transistors to take in contemporaneous solutions.
 
What? I have no idea what you're trying to say.
 
Disable DLSS for the best possible Picture Quality.
That's it.

RTX & DLSS for sure will meet their glory later on
Just like how PhysX & HairWorks are still being used today, and how everybody has a G-Sync monitor

Every Nvidia innovation is guaranteed to be a success
I see the sarcasm in your post. As for HairWorks, it doesn't look as good as some might think. Perhaps it's a matter of opinion whether one likes it or not.
As for DLSS, that's complete garbage.

Raevenlord said:
Of course, AMD may only be speaking from the point of view of a competitor that has no competing solution.

AMD has no reason to compete with something that's already proven to be a complete failure.
 
Low quality post by dont whant to set it"'
Oddly, why has Nvidia not come up with a name of their own to give to their next "generation whatevers", yet they went and trashed history instead?
PS: too drunk to grammar check.
Later edit: I have come up with at least one name, just ask Entropy. :p
2nd later edit: fixed the grammar on the first later edit.
 
Low quality post by dont whant to set it"'
Does it matter that Nvidia used the word "tensor" without fathoming it at all? They have been broke on naming ideas for about a dozen years now (as in: we at Nvidia are so desperate that we go trash history for cash, because we cannot come up with names for our own concepts; what we call our own concepts aren't actually our own, we just trashed history for a profit, all the while delivering, pardon my French, shit).
PS: just my two cents, posted while randomly drunk.
 
Low quality post by mtcn77
Does it matter that Nvidia used the word "tensor" without fathoming it at all? They have been broke on naming ideas for about a dozen years now (as in: we at Nvidia are so desperate that we go trash history for cash, because we cannot come up with names for our own concepts; what we call our own concepts aren't actually our own, we just trashed history for a profit, all the while delivering, pardon my French, shit).
PS: just my two cents, posted while randomly drunk.
I like random trolls intermittently.
 
I have changed my mind on DLSS. Having played Metro Exodus with a 2080 Ti since the updates, I can say that the blur introduced is very minimal now, especially when upscaling the resolution. It actually provides a nice trade-off for high-resolution performance when paired with ray tracing. The first examples looked terrible; now, when switching DLSS on and off, there is such a small degree of quality loss that it can be compensated for by rendering at higher resolutions if your rig can manage it.

My updated two cents
 
But remember that DLSS is only a thing to use those tensor cores that would otherwise be idle. If those tensor core transistors were instead put to work on rendering, you could get the higher resolution/higher framerate without the blurriness. In a graphics product, the trade-off is stupid. Put tensor cores on compute products. Hell, make discrete tensor cards for systems that need them and divorce them entirely from the existing product stack.
 
But remember that DLSS is only a thing to use those tensor cores that would otherwise be idle. If those tensor core transistors were instead put to work on rendering, you could get the higher resolution/higher framerate without the blurriness. In a graphics product, the trade-off is stupid. Put tensor cores on compute products. Hell, make discrete tensor cards for systems that need them and divorce them entirely from the existing product stack.

The reason Nvidia sections off a part of the die is so they have new growth capacity. It is very much like the quotes from Jensen above: Nvidia needs a new market now that GPUs can pretty much rip through the common resolutions. They need a new incentive for people to upgrade (4K is not it, mind you; it is still niche and only has a horizon of 1-2 generations). This underlines that in Nvidia's mind it's certainly going to be a long-term strategy item as well as a USP, as it is today. I can see the business case, and the competition is far from catching up to it.

That also underlines the primary motive for Nvidia. The motive is not 'the tech/hardware is capable now'. They just repurposed technology to 'make it work', and that is the starting point from which they'll probably iterate.

Nevertheless, I do agree, given the RT implementations we've seen at this point and the die sizes required to get there. On the other hand, if you look at the way Nvidia built up Turing, it's hard to imagine getting the desired latencies etc. for all that data transport to a discrete RT/tensor-only card.
 
The first examples looked terrible; now, when switching DLSS on and off, there is such a small degree of quality loss that it can be compensated for by rendering at higher resolutions if your rig can manage it.
I agree that what you say is factually correct: with DSR on, DLSS does look as good as MSAA. However, DSR lowers FPS more than MSAA does, so DLSS requiring the use of DSR to make it visually passable completely defeats the entire purpose of DLSS. What's the point of using a feature that cuts out the performance overhead of AA if you have to add in an even bigger performance overhead to stop it looking like ****?
 
I agree that what you say is factually correct: with DSR on, DLSS does look as good as MSAA. However, DSR lowers FPS more than MSAA does, so DLSS requiring the use of DSR to make it visually passable completely defeats the entire purpose of DLSS. What's the point of using a feature that cuts out the performance overhead of AA if you have to add in an even bigger performance overhead to stop it looking like ****?
DLSS uses FP16, which is quarter precision so to speak, but precisely equal to the highest color channel bit depth. The gain comes from that, since you don't need any higher color precision. Even INT8 is sufficient for desktop. You can save a lot in the render back ends, which are Nvidia's biggest weakness compared to AMD.

Your float is overflow-safe so long as the division uses a higher bit step than the addition when taking the average of pixels.

Figure 1. Tensor cores signficantly accelerate FP16 and INT8 matrix calculations
*Signficantly* Really, Nvidia?
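To show what I mean by the division using a higher bit step than the addition, here is a quick sketch of my own in plain C++ (a toy example I made up, not anything out of Nvidia's SDK): it's the same idea as a tensor core multiplying FP16 values but accumulating into FP32, only applied to a plain average of 8-bit pixels, where the running sum lives in a wider type and the division happens last.

Code:
#include <cstdint>
#include <cstdio>
#include <vector>

// Accumulate 8-bit pixel values in a 32-bit sum before dividing,
// so the intermediate total can never overflow the narrow pixel type.
std::uint8_t average_pixels(const std::vector<std::uint8_t>& pixels) {
    if (pixels.empty()) return 0;
    std::uint32_t sum = 0;                   // wider "bit step" for the addition
    for (std::uint8_t p : pixels) sum += p;
    return static_cast<std::uint8_t>(sum / pixels.size());  // divide last
}

int main() {
    // Summing these directly in a uint8_t would wrap around (1006 > 255).
    std::vector<std::uint8_t> tile = {250, 252, 255, 249};
    std::printf("average = %u\n", static_cast<unsigned>(average_pixels(tile)));
    return 0;
}

The four near-white pixels average to 251; add them up in the 8-bit type itself and the sum wraps around long before you ever get to the divide.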
 
I agree that what you say is factually correct: with DSR on, DLSS does look as good as MSAA. However, DSR lowers FPS more than MSAA does, so DLSS requiring the use of DSR to make it visually passable completely defeats the entire purpose of DLSS. What's the point of using a feature that cuts out the performance overhead of AA if you have to add in an even bigger performance overhead to stop it looking like ****?

That was my original argument, but actually using it, I have found it to work out pretty well. Not perfect, but the realism afforded by ray tracing requires the performance boost from DLSS to make it work. It does work. I am really, like REALLY, finicky about smooth gameplay. I can barely tolerate sub-80 fps because my eyes always seem to catch frame jitter etc. The fps boost and lighting quality work well enough that the game feels smooth and detailed.
Obviously this is just my experience on my hardware with just one game implementation, but it has notably improved my enjoyment of this game.
 