Thursday, February 14th 2019

AMD Doesn't Believe in NVIDIA's DLSS, Stands for Open SMAA and TAA Solutions

A report via PCGamesN places AMD's stance on NVIDIA's DLSS as a rather decided one: the company stands for further development of SMAA (Enhanced Subpixel Morphological Antialiasing) and TAA (Temporal Antialiasing) solutions on current, open frameworks, which, according to AMD's director of marketing, Sasa Marinkovic, "(...) are going to be widely implemented in today's games, and that run exceptionally well on Radeon VII", instead of investing in yet another proprietary solution. While AMD pointed out DLSS' low market penetration, that isn't its main point of contention. In fact, AMD goes head-on against NVIDIA's own technical presentations, which compare DLSS' image quality and performance benefits against a native-resolution, TAA-enhanced image - the company says that SMAA and TAA can work equally well without "the image artefacts caused by the upscaling and harsh sharpening of DLSS."

Of course, AMD may simply be speaking as a competitor with no competing solution. Company representatives did say that they could, in theory, develop something along the lines of DLSS via a GPGPU framework - a task for which AMD's architectures are usually extremely well-suited. And AMD isn't taking its eyes off DLSS-like approaches entirely: Nish Neelalojanan, a Gaming division exec, points to potential implementations across "Some of the other broader available frameworks, like WindowsML and DirectML", adding that these are "something we [AMD] are actively looking at optimizing… At some of the previous shows we've shown some of the upscaling, some of the filters available with WindowsML, running really well with some of our Radeon cards." So whether this stance reflects a genuine image-quality philosophy or merely a competing technology's TTM (time to market), only AMD knows.
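For readers unfamiliar with the recipe being criticized, here is a minimal toy sketch of the generic render-low/upscale/sharpen pipeline: plain bilinear upscaling followed by an unsharp mask. This is only a schematic stand-in - DLSS replaces the naive upscale with a trained network - and every name in it is illustrative:

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

// Toy single-channel image. Illustrative only - not DLSS, which swaps
// the naive bilinear upscale below for a trained neural network.
struct Image {
    int w, h;
    std::vector<float> px; // row-major, values in [0, 1]
    float at(int x, int y) const { // clamped fetch
        x = std::clamp(x, 0, w - 1);
        y = std::clamp(y, 0, h - 1);
        return px[y * w + x];
    }
};

// Step 1: bilinear upscale from a low render resolution to "native".
Image upscale(const Image& src, int dw, int dh) {
    Image dst{dw, dh, std::vector<float>(dw * dh)};
    for (int y = 0; y < dh; ++y)
        for (int x = 0; x < dw; ++x) {
            float sx = std::max(0.f, (x + 0.5f) * src.w / dw - 0.5f);
            float sy = std::max(0.f, (y + 0.5f) * src.h / dh - 0.5f);
            int x0 = (int)sx, y0 = (int)sy;
            float fx = sx - x0, fy = sy - y0;
            float top = src.at(x0, y0) * (1 - fx) + src.at(x0 + 1, y0) * fx;
            float bot = src.at(x0, y0 + 1) * (1 - fx) + src.at(x0 + 1, y0 + 1) * fx;
            dst.px[y * dw + x] = top * (1 - fy) + bot * fy;
        }
    return dst;
}

// Step 2: unsharp mask, out = in + amount * (in - blurred). A large
// "amount" is exactly what causes harsh halo/ringing artifacts.
Image sharpen(const Image& src, float amount) {
    Image dst = src;
    for (int y = 0; y < src.h; ++y)
        for (int x = 0; x < src.w; ++x) {
            float blur = 0.f; // 3x3 box blur as the low-pass reference
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx)
                    blur += src.at(x + dx, y + dy);
            blur /= 9.f;
            float v = src.at(x, y) + amount * (src.at(x, y) - blur);
            dst.px[y * src.w + x] = std::clamp(v, 0.f, 1.f);
        }
    return dst;
}

int main() {
    // Stand-in for a frame rendered at half resolution: a 4x4 gradient.
    Image low{4, 4, std::vector<float>(16)};
    for (int i = 0; i < 16; ++i) low.px[i] = i / 15.f;
    Image out = sharpen(upscale(low, 8, 8), 1.5f); // upscale, then sharpen
    std::printf("output %dx%d, first pixel %.3f\n", out.w, out.h, out.px[0]);
}
```

Cranking the amount parameter in sharpen() is what produces the halo/ringing artifacts AMD's quote refers to; a trained upscaler tries to avoid them, while native-resolution TAA/SMAA sidesteps the upscale entirely.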
Source: PCGamesN

170 Comments on AMD Doesn't Believe in NVIDIA's DLSS, Stands for Open SMAA and TAA Solutions

#126
londiste
The guy in the video gets some things wrong.
- DXR is not limited to reflections, shadows and AO. Nvidia simply provides a more-or-less complete solution for these three; you can do any other RT work you want with DXR. While not DX12 - and not DXR - the ray-traced Q2 on Vulkan clearly shows that full-on raytracing can be done somewhat easily.
- While RTX/DXR are named raytracing, they do also accelerate pathtracing.
- OctaneRender is awesome but does not perform miracles. Turing does speed up the non-realtime path-/raytracing software where it is implemented by at least 2x, often 3-5x, compared to Pascal. And these are early implementations.
ratirtAre you sure it happens at the same time? You need the objects to be there for the ray tracing process. Maybe it works alongside, but there must be a gap between the rasterization and the ray tracing. How can you ray trace an object and add reflections, shadows etc. when the object isn't there?
Nvidia, as well as various developers, have said RT work can start right after the G-buffer is generated. Optimizing that start point was one of the bigger boosts in the BFV patch.
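Schematically, the ordering looks something like this (a structural sketch only; the pass functions are made-up stubs, not any real graphics API):

```cpp
#include <cstdio>

// Hypothetical pass stubs for illustration - not a real API. The point
// is the ordering: ray dispatch can begin as soon as the G-buffer
// exists, overlapping the remaining raster work on the GPU.
void renderGBuffer()     { std::puts("raster: depth/normals/albedo"); }
void dispatchRays()      { std::puts("RT cores: trace reflection/shadow rays"); }
void shadeAndComposite() { std::puts("shaders: light, denoise, combine"); }

int main() {
    renderGBuffer();     // rasterization fills the G-buffer first
    dispatchRays();      // RT work starts here, not after the whole frame
    shadeAndComposite(); // results merged into the final image
}
```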
Posted on Reply
#127
efikkan
ratirtIf the RT cores are doing the ray tracing, and that's how it has been shown, then there are fewer cores doing the rasterization.
The RT cores are accelerating the calculation of ray intersections, but the RT cores themselves are not rendering a complete image.
Rendering is a pipelined process; you have the various stages of geometry processing (vertex shaders, geometry shaders and tessellation shaders), then fragment processing ("pixel shaders" in Direct3D terms) and finally post-processing. In rasterized rendering, the rasterization itself is technically only the transition between geometry and fragment shading; it converts vertices from 3D space to 2D space and performs depth sorting and culling, before the fragment shader starts putting in textures etc.
In a fully raytraced rendering, all the geometry will still be the same, but the rasterization step between geometry and fragments is gone, and the fragment processing will have to be rewritten to interface with the RT cores. All the existing hardware is still needed, except for the tiny part which does the rasterization. So all the "cores"/shader processors, TMUs etc. are still used during raytracing.
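As a toy sketch of that point (illustrative stubs, not a real rendering API): both paths share the geometry and shading stages, and only the visibility step in the middle differs.

```cpp
#include <cstdio>

// Hypothetical stage stubs for illustration only.
void processGeometry() { std::puts("vertex/geometry/tessellation shaders"); }
void shadeFragments()  { std::puts("textures, lighting, post-processing"); }

void renderFrame(bool raytraced) {
    processGeometry(); // needed either way, runs on the shader cores
    if (raytraced)
        std::puts("RT cores: ray/triangle intersection tests");
    else
        std::puts("rasterizer: project triangles, depth sort and cull");
    shadeFragments();  // also needed either way, runs on the shader cores
}

int main() {
    renderFrame(false); // classic rasterized frame
    renderFrame(true);  // raytraced frame: same stages, new middle step
}
```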

So in conclusion, he is 100% wrong in claiming a "true" raytraced GPU wouldn't be able to do rasterization. He thinks the RT cores do the entire rendering, which of course is completely wrong. The next generations of GPUs will continue to increase the number of "shader processors"/cores, as they are still very much needed for both rasterization and raytracing; it's not a legacy thing like he claims.
Posted on Reply
#128
ratirt
efikkanThe RT cores are accelerating the calculation of ray intersections, but the RT cores themselves are not rendering a complete image.
Rendering is a pipelined process; you have the various stages of geometry processing (vertex shaders, geometry shaders and tessellation shaders), then fragment processing ("pixel shaders" in Direct3D terms) and finally post-processing. In rasterized rendering, the rasterization itself is technically only the transition between geometry and fragment shading; it converts vertices from 3D space to 2D space and performs depth sorting and culling, before the fragment shader starts putting in textures etc.
In a fully raytraced rendering, all the geometry will still be the same, but the rasterization step between geometry and fragments is gone, and the fragment processing will have to be rewritten to interface with the RT cores. All the existing hardware is still needed, except for the tiny part which does the rasterization. So all the "cores"/shader processors, TMUs etc. are still used during raytracing.

So in conclusion, he is 100% wrong in claiming a "true" raytraced GPU wouldn't be able to do rasterization. He thinks the RT cores do the entire rendering, which of course is completely wrong. The next generations of GPUs will continue to increase the number of "shader processors"/cores, as they are still very much needed for both rasterization and raytracing; it's not a legacy thing like he claims.
I understand how the rasterization process works, and since it works in a pipeline, one thing is done after another. If you add ray tracing to the pipeline, it takes longer, so the speed drops. That's my point.
A "true" raytraced GPU? You mean one full of RT cores only?
Posted on Reply
#129
efikkan
ratirtI understand how the rasterization process works, and since it works in a pipeline, one thing is done after another. If you add ray tracing to the pipeline, it takes longer, so the speed drops. That's my point.

A "true" raytraced GPU? You mean one full of RT cores only?
Once again, I refer you back to the quote from the video. He was talking as if the RT cores are the only units used during raytracing and the rest is only used during rasterized rendering, which is not true at all.

Think of it more like this: the RT cores accelerate one specific task, kind of like the AVX extensions to x86. Code using AVX will not be AVX-only; while AVX will be doing a lot of the "heavy lifting", the majority of the code will still be "normal" x86 code.
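To make the analogy concrete, here's a minimal sketch (assumes an AVX-capable CPU and compiling with -mavx): the hot loop uses AVX intrinsics, while everything around it stays ordinary scalar x86, just like the RT cores accelerate one step of an otherwise conventional pipeline.

```cpp
#include <immintrin.h>
#include <cstdio>

// The hot loop uses AVX to process 8 floats at a time, but the setup,
// the tail loop and main() are all plain scalar x86 - AVX accelerates
// one step, it doesn't replace the rest of the code.
void addArrays(const float* a, const float* b, float* out, int n) {
    int i = 0;
    for (; i + 8 <= n; i += 8) {             // AVX: 8 floats per iteration
        __m256 va = _mm256_loadu_ps(a + i);
        __m256 vb = _mm256_loadu_ps(b + i);
        _mm256_storeu_ps(out + i, _mm256_add_ps(va, vb));
    }
    for (; i < n; ++i) out[i] = a[i] + b[i]; // scalar tail for leftovers
}

int main() {
    float a[10], b[10], out[10];
    for (int i = 0; i < 10; ++i) { a[i] = i; b[i] = 2.f * i; }
    addArrays(a, b, out, 10);
    std::printf("out[9] = %.1f\n", out[9]); // prints 27.0
}
```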
Posted on Reply
#130
londiste
ratirtI understand how the rasterization process works, and since it works in a pipeline, one thing is done after another. If you add ray tracing to the pipeline, it takes longer, so the speed drops. That's my point.
RT Cores work concurrently.
Wasn't this slide already posted in one of the threads?
Posted on Reply
#131
medi01
Why does TPU news often sound as if it was written by NV's CEO?
MrAMDDLSS actually looks sharper
An algo that sharpens looks sharper.
Sounds logical.
Posted on Reply
#132
ratirt
Here's some more DLSS testing.

It doesn't look great. I really can't see the improvement you guys are talking about.
Posted on Reply
#133
Casecutter
FluffmeisterHeh, and yet G-Sync spawned FreeSync™.
I always understood you to be more "in tune" with technologies than to purport such a false statement.

Variable refresh rate - the VESA Adaptive-Sync standard, part of DisplayPort 1.2a - was always a working open standard. It was just that Nvidia wanted to push their proprietary (added hardware) implementation to jump out in front before VESA (an open coalition of monitor and GPU representatives) finalized the agreed-upon standard, which requires no huge additional hardware or cost, just enabling features already in DisplayPort 1.2a.

AMD and the industry had been working to get the "open standard" going; it was just that Nvidia saw it not progressing to their liking and threw their weight behind licensing their "add-in" proprietary work-around to monitor manufacturers earlier than the VESA coalition was considering the roll-out. Now that those early monitor manufacturers are finding sales of G-Sync monitors not as lucrative as they once were with the early-adopter community, Nvidia sees itself on the losing end and has decided to slip back into the fold - and unfortunately there are no repercussions for the lack of support they showed the VESA coalition.

Much like the Donald with the "birther" crap - it was "OK" to do it and spout the lies until one day it finally couldn't pass muster and was no longer expedient to his campaign, and he declared he would not talk about it again! I think that's how Jen-Hsun Huang hopes G-Sync will just pass...
Posted on Reply
#134
londiste
When G-Sync was shown, VESA Adaptive-Sync in DP 1.2a did not exist. It did exist in eDP, which AMD quickly discovered when they wanted a VRR solution in response; FreeSync was first demoed on laptop screens. VESA Adaptive-Sync took a while to take off - a couple of years, actually, until there was a good selection of monitors available. The industry had not considered VRR important until then.

G-Sync was announced in October 2013. G-Sync monitors started to sell early 2014.
Freesync was announced in January 2014. VESA Adaptive Sync was added to DP 1.2a in mid-2014. Freesync monitors started to sell early 2015.
Posted on Reply
#135
Xzibit
ratirtHere's some more DLSS testing.

It doesn't look great. I really can't see the improvement you guys are talking about.
OUCH!!!
HardwareUnboxedDLSS is complete garbage
Posted on Reply
#136
InVasMani
the54thvoidYeah, how long before we get gfx cards that can run 8k at 60+FPS?
RTX on or RTX off lol
XzibitOUCH!!!
The sad part is that it's still easily noticeable in the comparisons, despite the 1080p YouTube quality.
Posted on Reply
#137
ratirt
XzibitOUCH!!!
I know it is, but it would seem that some people think it's really great. I just can't understand that. We have moved from a smooth, sharp image to a blurry one, and people call that an improvement and say it looks better.
InVasManiRTX on or RTX off lol

The sad part is that it's still easily noticeable in the comparisons, despite the 1080p YouTube quality.
I looked over a few videos on YouTube (Hardware Unboxed) where they have shown RTX on and off. It does improve the lighting, reflections etc., especially in Metro. It looks nice and more realistic, though it eats a lot of resources. The RTX effect shows up most in outdoor scenery, but also indoors when a lot of objects are moving and there are different light sources. In other scenes there's no change except the FPS drop. Anyway, it's a nice feature, but when will we be playing with RTX on at 60+ FPS in 4K? I'd say there's a long way to go.
Posted on Reply
#138
satrianiboys
RTX & DLSS will surely meet their glory later on.
Just like how PhysX & HairWorks are still being used today, and how everybody has a G-Sync monitor.

Every Nvidia innovation is guaranteed to be a success.
Posted on Reply
#139
mtcn77
Funnily enough, I assumed Nvidia would improve the driver AF levels now that more texture read requests can be issued in newer hardware iterations.
Posted on Reply
#140
INSTG8R
Vanguard Beta Tester
satrianiboysRTX & DLSS will surely meet their glory later on.
Just like how PhysX & HairWorks are still being used today, and how everybody has a G-Sync monitor.

Every Nvidia innovation is guaranteed to be a success.
:roll: Oh my sides!! Thanks for the laugh. All of those things have faded away....
Posted on Reply
#141
Fluffmeister
Heh yeah, all those TressFX games certainly taught them a lesson.
Posted on Reply
#142
INSTG8R
Vanguard Beta Tester
FluffmeisterHeh yeah, all those TressFX games certainly taught them a lesson.
Exactly - all just flash-in-the-pan "shiny things"; both sides are equal in their attempts... My new one I'm "enjoying" is FreeSync 2 HDR in FC5, FC New Dawn and RE2. TBH it's not that impressive or noticeable.
Posted on Reply
#143
Fluffmeister
INSTG8RExactly - all just flash-in-the-pan "shiny things"; both sides are equal in their attempts... My new one I'm "enjoying" is FreeSync 2 HDR in FC5, FC New Dawn and RE2. TBH it's not that impressive or noticeable.
Yeah, kinda confirms open doesn't mean good or popular either; TressFX has gone nowhere. Regardless, Nvidia users get to enjoy both, plus get a choice of G-Sync and FreeSync displays today. Swings and roundabouts, eh?
Posted on Reply
#144
INSTG8R
Vanguard Beta Tester
FluffmeisterYeah, kinda confirms open doesn't mean good or popular either; TressFX has gone nowhere. Regardless, Nvidia users get to enjoy both, plus get a choice of G-Sync and FreeSync displays today. Swings and roundabouts, eh?
Well that just proves G-Sync was unnecessary and the open standard works for everyone.
Posted on Reply
#145
Fluffmeister
INSTG8RWell that just proves G-Sync was unnecessary and the open standard works for everyone.
Not really. Looking at their G-Sync Compatible list, the FreeSync working ranges are more often than not awful, and the standard varied from monitor to monitor... in short, there was no standard. G-Sync displays are expensive, but at least they have to achieve minimum requirements. FreeSync took the machine-gun approach.
Posted on Reply
#146
INSTG8R
Vanguard Beta Tester
FluffmeisterNot really. Looking at their G-Sync Compatible list, the FreeSync working ranges are more often than not awful, and the standard varied from monitor to monitor... in short, there was no standard. G-Sync displays are expensive, but at least they have to achieve minimum requirements. FreeSync took the machine-gun approach.
Yeah, I get that, but those monitors are the "cheap and easy" 75 Hz FreeSync-range stuff that's literally "free" FreeSync - I don't really put any stock in those.
Posted on Reply
#147
mtcn77
Just what are you expecting out of a 60Hz LCD? 9FPS VRR? The upselling sickens me.
Posted on Reply
#148
ratirt
mtcn77Just what are you expecting out of a 60Hz LCD? 9FPS VRR? The upselling sickens me.
I've got a 60 Hz 4K FreeSync screen and couldn't be happier :) It's just great. Buying something like that with G-Sync would cost a lot more.
Now NV is joining the club with FreeSync (this "compatible" stuff is so damn funny) since they have seen they are going nowhere with the price. :)
Posted on Reply
#149
satrianiboys
Oops, looks like I hurt someone by telling the truth...
Posted on Reply
#150
INSTG8R
Vanguard Beta Tester
satrianiboysOops, looks like I hurt someone by telling the truth...
Nothing truthful about your “success” fantasies. It was funny though.
Posted on Reply