Thursday, February 14th 2019

AMD Doesn't Believe in NVIDIA's DLSS, Stands for Open SMAA and TAA Solutions

A report via PCGamesN frames AMD's stance on NVIDIA's DLSS as a decided one: the company stands for further development of SMAA (Enhanced Subpixel Morphological Antialiasing) and TAA (Temporal Antialiasing) solutions on current, open frameworks, which, according to AMD's director of marketing, Sasa Marinkovic, "(...) are going to be widely implemented in today's games, and that run exceptionally well on Radeon VII", instead of investing in yet another proprietary solution. While AMD pointed out that DLSS' market penetration is low, that isn't the main point of contention. In fact, AMD goes head-on against NVIDIA's own technical presentations, which compare DLSS' image quality and performance benefits against a native-resolution, TAA-enhanced image: the company says that SMAA and TAA can work just as well without "the image artefacts caused by the upscaling and harsh sharpening of DLSS."
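For readers less familiar with the techniques being compared: TAA's core trick is temporal accumulation, blending each new frame into a history buffer rather than upscaling from a lower resolution the way DLSS does. Below is a minimal, purely illustrative Python sketch of that accumulation step; it deliberately omits the motion-vector reprojection and history clamping that real TAA implementations need to avoid ghosting.

```python
import numpy as np

def taa_blend(history, current, alpha=0.1):
    # One TAA accumulation step: exponentially blend the new frame
    # into the history buffer. alpha controls how fast old samples fade.
    return (1.0 - alpha) * history + alpha * current

# Feed in noisy "frames" of the same scene: the accumulated image
# converges toward the noise-free signal, which is how TAA smooths
# jittered subpixel samples into an antialiased result over time.
rng = np.random.default_rng(0)
signal = np.full((4, 4), 0.5)                  # the "true" pixel values
history = signal + rng.normal(0, 0.1, (4, 4))  # first noisy frame
for _ in range(60):                            # ~one second at 60 FPS
    frame = signal + rng.normal(0, 0.1, (4, 4))
    history = taa_blend(history, frame)
print(abs(history - signal).mean())            # small: noise averaged away
```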

Of course, AMD may only be speaking from the point of view of a competitor that has no competing solution. However, company representatives said that they could, in theory, develop something along the lines of DLSS via a GPGPU framework - a task for which AMD's architectures are usually extremely well-suited. And AMD isn't taking its eyes off the ball here, either: AMD's Nish Neelalojanan, a Gaming division exec, spoke of potential DLSS-like implementations across "Some of the other broader available frameworks, like WindowsML and DirectML", adding that these are "something we [AMD] are actively looking at optimizing… At some of the previous shows we've shown some of the upscaling, some of the filters available with WindowsML, running really well with some of our Radeon cards." So whether this is an actual image-quality philosophy or just a matter of a competing technology's TTM (time to market), only AMD knows.
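As a rough idea of what a WindowsML/DirectML-style upscaler could look like from the developer's side, here is a hedged Python sketch that runs a generic super-resolution network through ONNX Runtime's DirectML execution provider. The model file "super_res.onnx" and its 2x scale factor are hypothetical placeholders; this is only one plausible shape for such a pipeline, not anything AMD has demonstrated.

```python
import numpy as np
import onnxruntime as ort  # pip install onnxruntime-directml

# Hypothetical pre-trained super-resolution model in ONNX format.
session = ort.InferenceSession(
    "super_res.onnx",
    providers=["DmlExecutionProvider", "CPUExecutionProvider"],
)

# A stand-in 1080p RGB frame in NCHW float32 layout, values in [0, 1].
frame = np.random.rand(1, 3, 1080, 1920).astype(np.float32)

# Run inference on the DirectML device; the output would be the
# upscaled frame, e.g. (1, 3, 2160, 3840) for a 2x model.
input_name = session.get_inputs()[0].name
(upscaled,) = session.run(None, {input_name: frame})
print(upscaled.shape)
```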
Source: PCGamesN

170 Comments on AMD Doesn't Believe in NVIDIA's DLSS, Stands for Open SMAA and TAA Solutions

#101
moproblems99
ArbitraryAffection: I know this all too well lol.

Interesting topic: When does the AI program become so intelligent that it doesn't want to upscale your video games? :)
I would assume when it realizes that you could turn settings down to achieve the same goal without having to do all this extra processing?
Posted on Reply
#102
ArbitraryAffection
moproblems99: I would assume when it realizes that you could turn settings down to achieve the same goal without having to do all this extra processing?
Ohhhhhh~


Edit: sorry, lol. Couldn't resist.
Posted on Reply
#103
ratirt
ArbitraryAffection: I disagree honestly; AI technology as it is right now is dumb. It's little more than basic logic engines. There is no real intelligence, and there won't be for many years. It's in its infancy, but of course this is a good step, and it's gotta start somewhere.
I wouldn't say that's totally true.
Some time ago I attended the Warsaw Security Summit conference (hope Cucker will be thrilled, 'cause it was held in Poland), on innovation with AI and deep learning for cameras; since I was doing video surveillance systems, I attended. Using AI and/or deep learning techniques, the camera was able to recognize human emotions. It could tell if somebody was sad or stressed, etc. Basically, it would tell you if a dude was up to something, since his behavior and face would give it away, like somebody who's about to commit a crime. That's innovation, and it was shown and then analyzed with in-depth information and code. What do we have with DLSS here? People hear Deep Learning Super Sampling, and it must be great since "deep learning" is mentioned there. What a bull crap on a barn floor those marketing shenanigans are.
moproblems99: Let's be honest, it's pretty hard to get better performance with an increase in image quality. I have not played a title with DLSS so I am not sure if I would notice the quality or not. Really depends on the pace of the game. Metro might be slow enough of a pace that it would be noticeable. Something like BFV multiplayer, maybe not so much.
You can take a look at the TPU review; there's a comparison there. It's not only about the algorithm, but also about its purpose, whether Deep Learning is even the right tool for this particular job, and of course whether it's worth it.
I don't know your game preferences, but imagine the motion blur effect. Did you like it in games where you wanted to excel? I didn't. I always wanted a nice, smooth, and crisp image. With DLSS, it seems like we are going backwards under the Deep Learning flag, with the promise and motto "The way it's meant to be played" from now on :p. It's just sad.
Posted on Reply
#104
moproblems99
ratirt: You can take a look at the TPU review; there's a comparison there.
I have a comparison of a still image that can be scrutinized till the end of the world. What I don't have is live action comparison. I don't know if I would notice it. There are times when I am so caught up playing the game that I don't notice my giant, obvious radar telling me there is someone about to kill me right before I die.

TL;DR:

Still frames are much easier to compare and scrutinize than live action. Until I see it in person while I am playing, I will withhold final judgement. That said, I likely will never see it as I likely won't be buying a 2000 series card.
Posted on Reply
#105
Casecutter
Two years from now it will be like G-Sync, another proprietary technology that went nowhere.
Posted on Reply
#106
ratirt
Casecutter: Two years from now it will be like G-Sync, another proprietary technology that went nowhere.
And what a waste of time, technology, and money (well, maybe not money in particular, since a lot of people will still go with it :P).
Posted on Reply
#107
XXL_AI
AMD is a spoiled little kid with some skills to show, but they are jealous of every achievement of other companies. They mostly suck at leading innovation, but heck, they can make some decent hardware at a low cost. But they contribute to global warming more than any other computer-part company.
Posted on Reply
#108
John Naylor
Well, regarding the thread title... have to ask: what can we expect any competitor in any market for any product to say in response to the announcement of a new feature, with advantages (real or imagined), that the competitor doesn't have access to?
Posted on Reply
#109
ratirt
This is kinda interesting for those who wanted to know something about ray tracing from AMD.
Posted on Reply
#110
WholesomeGuy69
I honestly don't know enough to form an opinion as to which of the competing technologies is superior, buttttt if those images are labeled correctly and they were produced and promoted by AMD as part of their argument in favor of TAA over DLSS... well, then that's just dumb (and I'm an AMD stockholder, lol, fml).
Posted on Reply
#111
ratirt
This is interesting. Wonder if you guys have seen it. It's about ray tracing and rasterization.
Posted on Reply
#112
efikkan
ratirt: This is interesting. Wonder if you guys have seen it. It's about ray tracing and rasterization.

It's definitely an interesting subject, and it might be fine as an introduction, but it is glaringly obvious to me that the author's understanding of rendering is only skin deep, and he gets the deeper technical details wrong.

22:58
But hybrid rendering is a stopgap; Nvidia needs to take the hybrid approach due to a legacy of thousands of rasterized games. We see why that is; with Turing already being poorly received due to not being fast enough at rasterization, can you imagine what would have happened had they doubled or even quadrupled their RTX gigarays while actually lowering rasterization performance?
This was the only way Nvidia could do it.
AMD on the other hand, they're the type of company that would just throw it all out and start from scratch. And I believe we will see them go down a true raytracing or pathtracing route with the game consoles one day.
Anyone who knows how GPUs and raytracing work understands that the new RT cores are only doing a part of the rendering. GPUs are in fact a collection of various task-specific hardware (geometry processing, tessellation, TMUs, ROPs, video encoding/decoding, etc.) and clusters of ALUs/FPUs for generic math. Even in a fully raytraced scene, 98% of this hardware will still be used. The RT cores are just another type of specialized accelerator for one specific task. And it's not like everything can or will be raytraced; UI elements in a game, a page in your web browser, or your Windows desktop are rasterized because it's efficient, and that will not change.
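To make concrete what that "one specific task" is: RT cores essentially accelerate BVH traversal and ray/triangle intersection tests. The classic Möller-Trumbore intersection test below is a toy Python illustration of the arithmetic such fixed-function units grind through billions of times per second; it is not how any driver or GPU actually implements it.

```python
import numpy as np

def ray_triangle_intersect(origin, direction, v0, v1, v2, eps=1e-8):
    # Moller-Trumbore: returns distance t along the ray to the hit
    # point, or None if the ray misses the triangle.
    edge1, edge2 = v1 - v0, v2 - v0
    pvec = np.cross(direction, edge2)
    det = np.dot(edge1, pvec)
    if abs(det) < eps:                 # ray parallel to triangle plane
        return None
    inv_det = 1.0 / det
    tvec = origin - v0
    u = np.dot(tvec, pvec) * inv_det   # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    qvec = np.cross(tvec, edge1)
    v = np.dot(direction, qvec) * inv_det  # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = np.dot(edge2, qvec) * inv_det
    return t if t > eps else None

# A ray shot down the -Z axis at a triangle lying in the Z=0 plane.
hit = ray_triangle_intersect(
    origin=np.array([0.25, 0.25, 1.0]),
    direction=np.array([0.0, 0.0, -1.0]),
    v0=np.array([0.0, 0.0, 0.0]),
    v1=np.array([1.0, 0.0, 0.0]),
    v2=np.array([0.0, 1.0, 0.0]),
)
print(hit)  # 1.0: the triangle is hit one unit along the ray
```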
Posted on Reply
#113
Camm
FF, Metro, and BFV DLSS looks like muddy garbage at this point, which is further compounded by the fact that there isn't enough performance to push over 90 FPS with DLSS on.

I'm waiting to see what the next few updates do; depending on the game I might turn DLSS on, but at this point, RT is not worth the smearing that DLSS brings.
Posted on Reply
#114
mtcn77
It is okay for Nvidia to keep experimenting with gauges when true to form, but higher engagement with lower results isn't their forte. It has to be a winner solution to keep the milk flowing. For the longest time, green spammers have had a hunch for texture fidelity, and this is an optimization towards that. They keep budgeting higher transistor counts; I suppose games are forever going to look like textureless, bland blobs.
Posted on Reply
#115
Fluffmeister
Casecutter: Two years from now it will be like G-Sync, another proprietary technology that went nowhere.

Heh, and yet G-Sync spawned FreeSync™.
Posted on Reply
#116
mtcn77
Fluffmeister: Heh, and yet G-Sync spawned FreeSync™.
I'm sure Nvidia wouldn't leave it at that, unless it was the other way around...
Posted on Reply
#117
Fluffmeister
mtcn77: I'm sure Nvidia wouldn't leave it at that, unless it was the other way around...
Well, it wasn't the other way round. Maybe the question should be: would we have FreeSync if Nvidia hadn't...
Posted on Reply
#118
mtcn77
Fluffmeister: Well, it wasn't the other way round. Maybe the question should be: would we have FreeSync if Nvidia hadn't...
I'm pretty sure we would still have GPUs... I'm sure you get the pun.
Posted on Reply
#119
Super XP
Let me go two years into the future. One second, please... Wow, Navi surprised everybody with its amazing performance, power efficiency, and cost, and AMD's version of this DLSS actually makes the PQ look fantastic, all while increasing FPS. :p
Posted on Reply
#120
efikkan
AMD catching up to Nvidia in one jump? We'll just have to see about that. Navi will compete with the successor of Turing for most of its life, so it would have to offer over twice the efficiency of Vega, which will be no small feat.

BTW, many thought Vega was going to be a Pascal killer too…
Posted on Reply
#121
ratirt
efikkan: It's definitely an interesting subject, and it might be fine as an introduction, but it is glaringly obvious to me that the author's understanding of rendering is only skin deep, and he gets the deeper technical details wrong.

22:58

Anyone who knows how GPUs and raytracing work understands that the new RT cores are only doing a part of the rendering. GPUs are in fact a collection of various task-specific hardware (geometry processing, tessellation, TMUs, ROPs, video encoding/decoding, etc.) and clusters of ALUs/FPUs for generic math. Even in a fully raytraced scene, 98% of this hardware will still be used. The RT cores are just another type of specialized accelerator for one specific task. And it's not like everything can or will be raytraced; UI elements in a game, a page in your web browser, or your Windows desktop are rasterized because it's efficient, and that will not change.
So what do you think is wrong with what that dude said? 'Cause I'm sure it's right; I can't see your point here. We are talking about games, not web browsers. The UI isn't ray traced, but that's stating the obvious. The ray tracing is for illumination and shadows (and reflections as well, like the fire on the gun or in the water in BFV) to get more realism in the graphics. It is tied to the light sources in the game, so depending on the scene it may not ray trace everything, but a huge chunk of the image will be ray traced.
It shows how ray tracing eats the resources, and we still have a limitation with the hardware currently available on the market. I think this video shows what is needed and gives examples of other ways to achieve a ray-traced scene in games, now or in the future. We will have to see where and how it will be done and how it will work.
Posted on Reply
#122
efikkan
ratirt: So what do you think is wrong with what that dude said? 'Cause I'm sure it's right; I can't see your point here.
His mistake is thinking the hardware resources used in rasterized rendering are not used during raytracing, when everything except a tiny part is. Adding raytracing capabilities doesn't lower rasterization capabilities. Please read the parts I bolded again and you'll see.
Posted on Reply
#123
ratirt
efikkan: His mistake is thinking the hardware resources used in rasterized rendering are not used during raytracing, when everything except a tiny part is. Adding raytracing capabilities doesn't lower rasterization capabilities. Please read the parts I bolded again and you'll see.
If the RT cores are doing the ray tracing, and that's how it has been shown, then there are fewer cores doing the rasterization. So, in fact, he got it right. That's what I understood from his video, and I think that was the main idea of it. Adding to my premise: after rasterization is complete, the ray tracing is processed. That also creates a lag (more time needed to complete the frame), since these can't work at the same time. You need the objects already there before you can ray trace them. We can see that in BFV when you switch on ray tracing; it barely caps at 60 FPS.
Posted on Reply
#124
londiste
ratirt: If the RT cores are doing the ray tracing, and that's how it has been shown, then there are fewer cores doing the rasterization. So, in fact, he got it right. That's what I understood from his video, and I think that was the main idea of it. Adding to my premise: after rasterization is complete, the ray tracing is processed. That also creates a lag (more time needed to complete the frame), since these can't work at the same time. You need the objects already there before you can ray trace them. We can see that in BFV when you switch on ray tracing; it barely caps at 60 FPS.
What do you mean there are fewer cores doing the rasterization? That without the RT cores they could fit more shaders onto the chip, or something else?
RT is processed concurrently with rasterization work. There are some prerequisites, generally the G-Buffer, but largely it happens at the same time.
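A back-of-the-envelope Python sketch of why that overlap matters for frame times; the millisecond costs are made-up illustrative numbers, not measurements from any real game.

```python
# Hypothetical per-frame costs in milliseconds (illustrative only).
gbuffer_ms = 3.0  # G-Buffer pass: the prerequisite for the RT pass
raster_ms = 9.0   # remaining rasterization/shading work
rt_ms = 8.0       # ray tracing work on the RT cores

# Fully serial pipeline: the RT pass waits for all rasterization.
serial = gbuffer_ms + raster_ms + rt_ms

# Overlapped pipeline: once the G-Buffer is ready, the RT pass runs
# concurrently with the remaining rasterization work.
overlapped = gbuffer_ms + max(raster_ms, rt_ms)

for name, ms in (("serial", serial), ("overlapped", overlapped)):
    print(f"{name}: {ms:.1f} ms/frame -> {1000.0 / ms:.0f} FPS")
# serial: 20.0 ms/frame -> 50 FPS
# overlapped: 12.0 ms/frame -> 83 FPS
```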
Posted on Reply
#125
ratirt
londiste: What do you mean there are fewer cores doing the rasterization? That without the RT cores they could fit more shaders onto the chip, or something else?
RT is processed concurrently with rasterization work. There are some prerequisites, generally the G-Buffer, but largely it happens at the same time.
Are you sure it happens at the same time? You need the objects to be there in order to do the ray tracing. Maybe it works alongside, but there must be a gap between the rasterization and the ray tracing. How can you ray trace an object and add reflections, shadows, etc., when the object isn't there?
As for your question: I think yes, without RT cores there would be more room for rasterization hardware (unless the RT cores do the rasterization as well? I don't think so). I'm not sure now, since my statement surprised you, although it makes sense: RT cores take up die space, so without them the 2080 Ti could have been faster at rasterization.
Posted on Reply