Thursday, February 14th 2019

AMD Doesn't Believe in NVIDIA's DLSS, Stands for Open SMAA and TAA Solutions

A report via PCGamesN pegs AMD's stance on NVIDIA's DLSS as a rather decided one: the company stands for the further development of SMAA (Enhanced Subpixel Morphological Anti-Aliasing) and TAA (Temporal Anti-Aliasing) solutions on current, open frameworks, which, according to AMD's director of marketing, Sasa Marinkovic, "(...) are going to be widely implemented in today's games, and that run exceptionally well on Radeon VII", instead of investing in yet another proprietary solution. While AMD pointed out that DLSS' market penetration is low, that isn't its main point of contention. In fact, AMD goes head-on against NVIDIA's own technical presentations, which compare DLSS' image quality and performance benefits against a native-resolution, TAA-enhanced image: AMD says that SMAA and TAA can work equally well without "the image artefacts caused by the upscaling and harsh sharpening of DLSS."

Of course, AMD may only be speaking from the point of view of a competitor that has no competing solution. However, company representatives said that they could, in theory, develop something along the lines of DLSS via a GPGPU framework - a task for which AMD's architectures are usually extremely well-suited. And AMD isn't taking its eyes off the ball with its DLSS-defusing moves, as AMD's Nish Neelalojanan, a Gaming division exec, talks about potential DLSS-like implementations across "some of the other broader available frameworks, like WindowsML and DirectML", adding that these are "something we [AMD] are actively looking at optimizing… At some of the previous shows we've shown some of the upscaling, some of the filters available with WindowsML, running really well with some of our Radeon cards." So whether it's an actual image-quality philosophy, or just a competing technology's TTM (time to market) one, only AMD knows.
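The "upscaling and harsh sharpening" artifacts AMD refers to are easy to demonstrate in miniature. The sketch below is purely illustrative (it is not AMD's or NVIDIA's actual pipeline): it nearest-neighbour-upscales a hard edge and applies a simple unsharp mask, which overshoots past the original value range and produces the halo/ringing effect the quote describes.

```python
import numpy as np

def upscale_nn(x, factor):
    """Nearest-neighbour upscale of a 1-D signal (stand-in for a low-res render)."""
    return np.repeat(x, factor)

def unsharp_mask(x, amount=1.5):
    """Simple 3-tap unsharp mask: boost the difference from a local average."""
    blurred = np.convolve(x, np.ones(3) / 3, mode="same")
    return x + amount * (x - blurred)

# A hard edge rendered at low resolution, then upscaled and sharpened.
low_res = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
result = unsharp_mask(upscale_nn(low_res, 2))

# The sharpening step overshoots past the original [0, 1] range,
# producing halo/ringing artifacts around the edge.
print(result.min() < 0.0, result.max() > 1.0)  # True True
```

A native-resolution render of the same edge would never leave the [0, 1] range, which is the crux of comparing upscale-and-sharpen pipelines against native TAA output.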
Source: PCGamesN

170 Comments on AMD Doesn't Believe in NVIDIA's DLSS, Stands for Open SMAA and TAA Solutions

#76
moproblems99
cucker tarlson: bit too shiny and fake to me
Sounds familiar... like I saw it in screenshots of a Battlefield game or something... what was that feature called...
#77
Litzner
londiste: Depends. It is unclear if there is any machine learning-ish work taking place during rendering. Most of that work is supposed to happen in Nvidia's render farm, with the data delivered either via game or driver updates. I have seen a few suspicions - but no details or analysis - that something might take place locally, but that does not necessarily mean delayed frames; more likely comparisons with the previous (few) frames.
Which is why I am very curious to see some input latency tests done for DLSS :)
#78
Super XP
hutt132: I think you have the labeling wrong on the comparison screenshots.
DLSS doesn't look great. Agreed.
#79
cucker tarlson
moproblems99: Sounds familiar... like I saw it in screenshots of a Battlefield game or something... what was that feature called...
lol, any reflections in any pc game are designed to look that way. I absolutely adore how the amd fanbase still complains that those rtx reflections are too shiny, as if they didn't know it's still a computer game's rendering of a reflection; we are not at the stage of photo realism yet, sweetie :)
said that before, grapes are getting even more sour with amd leaving rt features out for nvidia to claim for themselves.
#80
moproblems99
cucker tarlson: lol, any reflections in any pc game are designed to look that way. I absolutely adore how the amd fanbase still complains that those reflections look too shiny, as if they didn't know it's still a computer game; we are decades from photorealism.
said that before, grapes are getting even more sour with amd leaving rt features out for nvidia to claim for themselves.
Wait, didn't you just say the same thing about TressFX? Pot, meet kettle.

Edit: To add to that, if AMD had RTRT features, that crap would be off. It looks terrible. I might give it a go on Metro (if I had it) but as long as it looks like BFV, that shit is off.
#81
cucker tarlson
moproblems99: Wait, didn't you just say the same thing about TressFX? Pot, meet kettle.
what did I say about TressFX?
the fact that it's shiny? yes, it is, which stands out when it's applied to hair.
still beats the default hair, and I like how it has no performance impact. I liked HairWorks in Witcher 3 more, but that came with a big performance penalty. not everyone has enough gpu resources to spare for that.
#82
ratirt
cucker tarlson: but you said you stick with amd for implementing new image quality improvements. therefore I'm asking you, cause I can't think of any in the recent few years.
read again.
#83
moproblems99
cucker tarlson: what did I say about TressFX?
the fact that it's shiny? yes, it is, which stands out when it's applied to hair.
What did I say about RTX? The exact same thing. Reflective surfaces are way too shiny and look fake. How do our comments differ? Enlighten me.
#84
cucker tarlson
moproblems99: What did I say about RTX? The exact same thing. Reflective surfaces are way too shiny and look fake. How do our comments differ? Enlighten me.
cause you were complaining how the reflections are shiny?

ratirt: read again.
okay. you go:
ratirt: I stick with AMD and open techniques for improving image quality and implement new stuff.
I ask:
cucker tarlson: what new stuff is amd implementing?
you go:
ratirt: Ask AMD, not me.
#85
ratirt
cucker tarlson: cause you were complaining how the reflections are shiny?
Listen, I'm not giving you English lessons.
#86
moproblems99
cucker tarlson: cause you were complaining how the reflections are shiny?
Sorry, but can you show me where I said the reflections were too shiny? Let me quote myself below.

Edit: I'll cut you a break because I didn't explicitly say reflective surfaces. However, I figured someone with your intellect would have put it together.
moproblems99: Sounds familiar... like I saw it in screenshots of a Battlefield game or something... what was that feature called...
#87
jabbadap
ratirt: at first I'd say that a blurry picture is nothing innovative, if you're speaking about DLSS. Besides, too much sharpness or too much blurriness isn't good either way. DLSS brings nothing new except that the FPS is higher? (correct me if I'm wrong) If that's the case, then it's because the image quality is down, and that's how I see it from the TPU review of DLSS.
To summarize: Nvidia didn't invent anything new with DLSS as you perceive it. Blurriness has been with us a long time. Calling something DLSS and telling people it's a new technology isn't ok. Maybe it brings something new to the table, but, I hope, this is yet to be seen. NV's focus, as we all know, is on money, so this DLSS is more marketing than actual innovation for me (innovation with poor image quality). Especially when they come up with some new tech, ditching all their 1-2 year old innovations, to bring something new to the table and charge more for it.
I stick with AMD and open techniques for improving image quality and implementing new stuff. This is just another way for NV (that's just my opinion) to offer "something new that only they have" (but in fact it's been on the market, already developed) to convince customers and game developers that's the way to go and, of course, charge more.
Just like it is with G-Sync and just like it will be with G-Sync Compatible (which in fact is FreeSync). that's just lame. :)
Well, it's a new way of doing AA; there hasn't been an AA method using machine learning before DLSS. So saying it's nothing new is a false statement. And as the article says, AMD is exploring similar WinML upscaling filters too. And the filter only works as well as it could be trained. Right now DLSS looks like crap; maybe it could get better over time with more ML runs on the supercomputer to get a better training model for inference. But if it can't achieve that, it's just one more bad AA method out there, which will eventually die away.
#88
cucker tarlson
moproblems99: Sorry, but can you show me where I said the reflections were too shiny? Let me quote myself below.

Edit: I'll cut you a break because I didn't explicitly say reflective surfaces. However, I figured someone with your intellect would have put it together.
I said hair looks "too fake and shiny"
you said "like the feature I saw in BF5"

BF5 has raytraced reflections only. you're literally complaining about how nvidia's reflections are shiny.
don't worry tho, I'm enjoying this, it's hilarious :laugh:
#89
moproblems99
cucker tarlson: BF5 has raytraced reflections only.
Right you are. But it is the reflective surfaces that look too fake and shiny. The reflections probably look great. I just can't get over the surfaces looking terrible.
#90
cucker tarlson
moproblems99: Right you are. But it is the reflective surfaces that look too fake and shiny. The reflections probably look great. I just can't get over the surfaces looking terrible.
it probably has to happen at this point, or it would be too subtle for anyone's eye to really catch. ssr has been doing the exact same thing for years, but with poor reproduction of the reflected object, and I've never seen you complain.
puddles, cars, windows - everything in a pc game that uses ssr is overexposed to show off the effect.
#91
moproblems99
cucker tarlson: probably has to happen at this point, or it would be too subtle for anyone's eye to really catch.
Oh, so now we have to make the reflective surfaces overly shiny, fake, and not realistic looking so that people MIGHT actually be able to see this revolutionary tech for realism improvements.

Gotcha.

Nice Ninja.
#92
ArbitraryAffection
Fluffmeister: It would just be nice if AMD could conclusively beat a 2 year old EOL card with 7nm tech, but the spin from their director of marketing is... expected.
Would be nice if NVIDIA could compete with a 2 year old not-quite-EOL card with their 16 and 12nm tech :P

£150~ RX 570 8GB mustard rice ~

I think TAA looks good. It's in Fallout 4 and Warframe. I'd take it over a dumb computer program guessing what a super-sampled image should look like. Especially when you consider how much die space is taken up by those Tensor cores. Honestly, those tensors are for AI training in servers and pro cards, and NVIDIA is trying to sell them to gamers so they're OK paying more for less performance because the dies are bigger. ~
#93
cucker tarlson
moproblems99: Oh, so now we have to make the reflective surfaces overly shiny, fake, and not realistic looking so that people MIGHT actually be able to see this revolutionary tech for realism improvements.

Gotcha.

Nice Ninja.
they've been like that since before rtx, to show off ssr. please just don't tell me you only now noticed.
#94
moproblems99
ArbitraryAffection: I'd take it over a dumb computer program guessing what a super-sampled image should look like. Especially when you consider how much die space is taken up by those Tensor cores. Honestly, those tensors are for AI training in servers and pro cards, and NVIDIA is trying to sell them to gamers so they're OK paying more for less performance because the dies are bigger.
It certainly isn't dumb, it just needs to be refined. If they can work out a decent algorithm, then DLSS could be very useful. Much like RTX in BFV, according to cucker, you likely won't notice it.
cucker tarlson: they've been like that since before rtx, to show off ssr. please just don't tell me you only now noticed.
Nice try. You perfectly described the use case of RTX right now. We will see what happens next gen. If they can up the jigga rays and devs can learn the nuances, it may pan out. That is a lot of ifs. I think DLSS stands a better chance to succeed if they can refine it some more.
#95
ratirt
jabbadap: Well, it's a new way of doing AA; there hasn't been an AA method using machine learning before DLSS. So saying it's nothing new is a false statement. And as the article says, AMD is exploring similar WinML upscaling filters too. And the filter only works as well as it could be trained. Right now DLSS looks like crap; maybe it could get better over time with more ML runs on the supercomputer to get a better training model for inference. But if it can't achieve that, it's just one more bad AA method out there, which will eventually die away.
Maybe I understand innovation a bit differently here.

For me it's a way of achieving at least the same result, or better, in a cost-efficient way (there's more to it, but this is a good example). The RTX price is huge, and this innovation is nothing like what we already have with open techniques. It is worse.
I disagree with your statement. This is what I think about DLSS. It's my opinion. If you say it's worth something, that's OK with me.
I also disagree that having a dozen techniques to achieve one goal is a great way to go. You can't focus on so-called "innovations" and start from scratch every time, when you have all that's needed already there and open to everybody. Think about game developers and gamers. Do you realize what would have happened if we had these "innovations" once a year? They won't implement every single one.
So for me, at this point, DLSS is a downgrade. :)
#96
jabbadap
Vya Domus: No it isn't, actually. Aliasing happens when you sample different signals in such a way that they appear identical once put back together; that is the textbook definition. Here the signals represent graphical elements before they are rasterized (sampled), not the whole image, as Nvidia would like to imply. The way you avoid aliasing is to apply a filter to those particular elements before they are sampled, or in this case before the image is rendered; MSAA, for example, works like that (well, not exactly, but the point is that this is done on a per-component basis). That is also the textbook definition of how you would go about using an anti-aliasing method.

Whatever DLSS does is applied to the whole image after it is rendered; it is an upscaling solution, nothing more. That's the equivalent of interpolating a signal, not filtering it. Post-process AA like FXAA isn't a true method of AA by definition either, because there you try to alter the signal after it was sampled, not while it is constructed.

You don't, you use those cores only to make it faster.
The acronym DLSS comes from Deep Learning Super-Sampling, so by definition it's some sort of super-sampling anti-aliasing method, albeit quite a crappy one. But the end result is that it removes aliasing, which is pretty much what anti-aliasing is.
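The textbook sense of aliasing being argued over here can be shown in a few lines: once a signal is sampled below the Nyquist rate, two genuinely different inputs become literally identical, and no post-processing can tell them apart. A minimal 1-D sketch (spatial aliasing in rendering is the 2-D analogue of this):

```python
import math

# A 9 Hz sine sampled at 8 Hz produces exactly the same samples as a
# 1 Hz sine, because 9 mod 8 == 1: the two signals "alias" onto each
# other. This is why anti-aliasing must filter *before* sampling.
fs = 8.0  # sampling rate in Hz
samples_9hz = [math.sin(2 * math.pi * 9 * n / fs) for n in range(8)]
samples_1hz = [math.sin(2 * math.pi * 1 * n / fs) for n in range(8)]

print(all(abs(a - b) < 1e-9 for a, b in zip(samples_9hz, samples_1hz)))  # True
```

Because the sampled data is identical, a purely post-process technique (upscaling, FXAA-style filtering) cannot recover which signal was originally there, which is the core of the argument above.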
ratirt: Maybe I understand innovation a bit differently here.

For me it's a way of achieving at least the same result, or better, in a cost-efficient way (there's more to it, but this is a good example). The RTX price is huge, and this innovation is nothing like what we already have with open techniques. It is worse.
I disagree with your statement. This is what I think about DLSS. It's my opinion. If you say it's worth something, that's OK with me.
I also disagree that having a dozen techniques to achieve one goal is a great way to go. You can't focus on so-called "innovations" and start from scratch every time, when you have all that's needed already there and open to everybody. Think about game developers and gamers. Do you realize what would have happened if we had these "innovations" once a year? They won't implement every single one.
So for me, at this point, DLSS is a downgrade. :)
Well, yeah, if it worked like it's marketed, it could be quite a game changer: better quality than 4K with TAA, at much higher framerates. I would be all over it, but in its current state we aren't even close to that. So, a pretty big meh until proven otherwise.
#97
ratirt
jabbadap: The acronym DLSS comes from Deep Learning Super-Sampling, so by definition it's some sort of super-sampling anti-aliasing method, albeit quite a crappy one. But the end result is that it removes aliasing, which is pretty much what anti-aliasing is.

Well, yeah, if it worked like it's marketed, it could be quite a game changer: better quality than 4K with TAA, at much higher framerates. I would be all over it, but in its current state we aren't even close to that. So, a pretty big meh until proven otherwise.
Just to add something to your premise: this DLSS innovation is, let's say, in "test mode". You don't release it to the consumer market since it's not ready, and I think we can all agree on that. If it worked like it's marketed? The question is, will it ever work like it's marketed. I have serious doubts about that. What I've stated earlier, and please let me repeat: the FPS is higher because the image quality is down. I think it's that simple.
#98
ArbitraryAffection
moproblems99: It certainly isn't dumb, it just needs to be refined. If they can work out a decent algorithm, then DLSS could be very useful. Much like RTX in BFV, according to cucker, you likely won't notice it.
I honestly disagree. AI technology as it is right now is dumb. It's little more than basic logic engines; there is no real intelligence, and there won't be for many years. It's in its infancy, but of course this is a good step and it's gotta start somewhere.
#99
moproblems99
ratirt: The FPS is higher because the image quality is down. I think it's that simple.
Let's be honest, it's pretty hard to get better performance with an increase in image quality. I have not played a title with DLSS, so I am not sure if I would notice the quality loss or not. It really depends on the pace of the game. Metro might be slow-paced enough that it would be noticeable. Something like BFV multiplayer, maybe not so much.
ArbitraryAffection: I honestly disagree. AI technology as it is right now is dumb. It's little more than basic logic engines; there is no real intelligence, and there won't be for many years. It's in its infancy, but of course this is a good step and it's gotta start somewhere.
I mean, look at the average human: logic is mostly gone. We are, in general, purely emotion-driven now, and you are right that it will take a long time for AI to pick up emotions. However, AI is well suited for things of this nature and can be extremely good at them. It all comes down to the algorithm used. If they can improve it, it could be nice. But again, with consoles having AMD, how much use will it really get? I guess that will come down to how many investments with game studios NV is willing to make.
#100
ArbitraryAffection
moproblems99: I mean, look at the average human: logic is mostly gone. We are, in general, purely emotion-driven
I know this all too well lol.

Interesting topic: When does the AI program become so intelligent that it doesn't want to upscale your video games? :)