# AMD Doesn't Believe in NVIDIA's DLSS, Stands for Open SMAA and TAA Solutions



## Raevenlord (Feb 14, 2019)

A report via PCGamesN places AMD's stance on NVIDIA's DLSS as a rather decided one: instead of investing in yet another proprietary solution, the company stands for further development of SMAA (Enhanced Subpixel Morphological Antialiasing) and TAA (Temporal Antialiasing) solutions on current, open frameworks, which, according to AMD's director of marketing, Sasa Marinkovic, "(...) are going to be widely implemented in today's games, and that run exceptionally well on Radeon VII". While AMD pointed out that DLSS' market penetration is low, that's not the main point of contention. In fact, AMD goes head-on against NVIDIA's own technical presentations, which compare DLSS' image quality and performance benefits against a native-resolution, TAA-enhanced image: AMD says that SMAA and TAA can work equally well without "the image artefacts caused by the upscaling and harsh sharpening of DLSS."

Of course, AMD may only be speaking from the point of view of a competitor that has no competing solution. However, company representatives said that they could, in theory, develop something along the lines of DLSS via a GPGPU framework, a task for which AMD's architectures are usually extremely well-suited. AMD isn't taking its eyes off such DLSS-defusing moves, either: AMD's Nish Neelalojanan, a Gaming division exec, talks about potential DLSS-like implementations across "some of the other broader available frameworks, like WindowsML and DirectML", and says that these are "something we [AMD] are actively looking at optimizing… At some of the previous shows we've shown some of the upscaling, some of the filters available with WindowsML, running really well with some of our Radeon cards." So whether this is an actual image-quality philosophy or just a jab at a competing technology's TTM (time to market), only AMD knows.
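To illustrate what that criticism refers to, here is a naive NumPy sketch of generic "upscale then sharpen" processing. This is not NVIDIA's actual DLSS pipeline (DLSS runs a trained neural network on tensor cores); it is just a stand-in for the kind of operation AMD's quote alludes to: a 2x nearest-neighbour upscale followed by an unsharp mask.

```python
import numpy as np

def upscale_and_sharpen(img, amount=1.0):
    """2x nearest-neighbour upscale followed by an unsharp mask."""
    up = img.repeat(2, axis=0).repeat(2, axis=1)        # naive 2x upscale
    pad = np.pad(up, 1, mode="edge")                    # pad for a 3x3 box blur
    blur = sum(pad[i:i + up.shape[0], j:j + up.shape[1]]
               for i in range(3) for j in range(3)) / 9.0
    # unsharp mask: add back an amplified copy of the detail the blur removed
    return np.clip(up + amount * (up - blur), 0.0, 1.0)

low_res = np.random.default_rng(0).random((4, 4))       # stand-in 4x4 "frame"
print(upscale_and_sharpen(low_res).shape)  # (8, 8)
```

The haloing and ringing that an aggressive `amount` introduces around edges is exactly the kind of "harsh sharpening" artifact the quote describes.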



*(DLSS vs. TAA comparison screenshots)*





----------



## MrAMD (Feb 14, 2019)

DLSS actually looks sharper to me vs TAA on 4K. Although I would rather have no AA at all for 4K..


----------



## hutt132 (Feb 14, 2019)

I think you have the labeling wrong on the comparison screenshots.


----------



## Ferrum Master (Feb 14, 2019)

hutt132 said:


> I think you have the labeling wrong on the comparison screenshots.



IMHO yes... they are vice versa.


----------



## c2DDragon (Feb 14, 2019)

Who took those screenshots ?
No way TAA does FXAA-like effects...I can't believe this.


----------



## Fouquin (Feb 14, 2019)

Raevenlord said:


> Of course, *MAD* may only be speaking from the point of view of a competitor that has no competing solution.



I dunno, I don't think they're very mad about it.


----------



## neatfeatguy (Feb 14, 2019)

hutt132 said:


> I think you have the labeling wrong on the comparison screenshots.



Nah. Those are screenshots from the one set benchmark/program that Nvidia could make look really good with DLSS. Actual results from BFV, the new Metro game, and FFXV prove that (as you said, the images look labeled wrong): DLSS looks more like the TAA shot, and TAA looks more like the DLSS shot, in the provided screenshots.


----------



## SystemMechanic (Feb 14, 2019)

I still don't understand how TAA in Port Royal is basically BFV's DLSS?!? For people who can't see it: the left images are TAA (blurry and jagged edges, details lost in some places, like the white arc) and DLSS is on the right, much cleaner, details preserved and no blur. It's almost as if Nvidia faked these...



Ferrum Master said:


> IMHO yes... they are vice versa.



They aren't, lol. That's what confuses me: why does BFV's DLSS look like the TAA in these screenshots?

Run the benchmark yourself; DLSS is a lot better in Port Royal than its TAA counterpart.


----------



## rtwjunkie (Feb 14, 2019)

SystemMechanic said:


> Run the benchmark yourself, DLSS is a lot better in Port royal than its TAA counterpart.


Because every frame of a benchmark can be optimized for.  Out in the wild, you are more likely to see those pictures actually reversed as far as image quality, based on what I have read so far.


----------



## Prince Valiant (Feb 14, 2019)

I don't think AMD needs a competitor to DLSS if DLSS can't be drastically improved for games.

"exceptionally well on Radeon VI"
You mean Radeon VII, right?


----------



## Deleted member 158293 (Feb 14, 2019)

If AMD just had PCs to consider they would probably have a very different outlook on their market, but AMD also has the entire gaming ecosystem including consoles, multi-platform game engines and studios to consider, not just PC versions. 
The gaming heavyweights Sony & Microsoft would probably not look kindly on AMD creating proprietary solutions (like their PC competitor) which could potentially cause them issues later on.  Creating open source multi-partner solutions makes more sense in their case, albeit probably with longer development times.


----------



## spectatorx (Feb 14, 2019)

DLSS is not an AA method; how could it be, when it's just an upscaling method?


----------



## mandelore (Feb 14, 2019)

DLSS is, to be blunt, garbage.

Look at the Metro Exodus and Battlefield V screenshots recently posted on TechPowerUp.

I'd take no AA over DLSS and just can't comprehend how anyone would be okay with blurry graphics; it completely defeats the point of using a high-resolution gaming screen if you're just going to upsample from a lower resolution and live with blur.


----------



## megamanxtreme (Feb 14, 2019)

Where are the SMAA options? Multisampling? If not, then is it M.L.A.A.? Sub-pixel Morphological Anti-aliasing.



mandelore said:


> I'd take no AA over DLSS and just can't comprehend how anyone would be okay with blurry graphics; it completely defeats the point of using a high-resolution gaming screen if you're just going to upsample from a lower resolution and live with blur.


They could just enable them for older games to make them look less bad in higher resolutions, but I'll stick with higher resolutions without AA in modern games.


----------



## jabbadap (Feb 14, 2019)

spectatorx said:


> DLSS is not an AA method; how could it be, when it's just an upscaling method?



It is an AA method by definition. And no, it's not just upscaling; you would not need tensor cores to do just that.


----------



## Fluffmeister (Feb 14, 2019)

It would just be nice if AMD could conclusively beat a 2-year-old EOL card with 7 nm tech, but the spin from their director of marketing is... expected.


----------



## kastriot (Feb 14, 2019)

Well, one day we will have <100 micron pixels even on 32-inch monitors (which is, btw, 8K rez); then AA, DLSS and all the other anti-aliasing tech will be obsolete, but until then we are stuck with blurriness.
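For what it's worth, the arithmetic roughly checks out. A quick sanity check, assuming a flat 16:9 panel with square pixels:

```python
import math

def pixel_pitch_mm(diagonal_in, h_res, aspect=(16, 9)):
    """Pixel pitch in mm for a panel with square pixels."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # panel width from diagonal
    return width_in * 25.4 / h_res                  # inches -> mm, per pixel

# 32-inch 8K (7680x4320): pitch in microns
print(round(pixel_pitch_mm(32, 7680) * 1000))  # → 92, i.e. just under 100 µm
```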


----------



## the54thvoid (Feb 14, 2019)

kastriot said:


> Well, one day we will have <100 micron pixels even on 32-inch monitors (which is, btw, 8K rez); then AA, DLSS and all the other anti-aliasing tech will be obsolete, but until then we are stuck with blurriness.



Yeah, how long before we get gfx cards that can run 8k at 60+FPS?


----------



## efikkan (Feb 14, 2019)

All kinds of post-processing AA are ultimately flawed, no matter what new gimmick they attach to them. You can't generate information that doesn't exist.



rtwjunkie said:


> Because every frame of a benchmark can be optimized for.  Out in the wild, you are more likely to see those pictures actually reversed as far as image quality, based on what I have read so far.


You mean in the driver? No, that's not possible at all.

When Nvidia and AMD "optimize" a game or benchmark, they have a limited set of tools to work with. The most basic one is general rendering options: things like texture filtering, mipmap thresholds, etc., the type of options you see in the driver control panel, plus a few extra that are available through their native API. The second type of "optimization" is tweaked shaders, though "optimized" in this case generally doesn't mean a feature-complete optimization, but rather a simplification without too-obvious degradation in image quality. AAA titles and popular benchmarks are "victims" of this, and both vendors have been caught causing serious artifacts in 3DMark in the past. 3DMark is basically "broken" as a _real_ benchmark at this point.

Then, finally, there are per-application workarounds in the driver core. These are theoretically possible but exceedingly rare (except in that mess called Mesa), and they are usually used to work around serious problems (bugs or bottlenecks) rather than for actual optimizations, since any such workaround adds bloat to the driver.


----------



## Vya Domus (Feb 14, 2019)

jabbadap said:


> It is AA method by definition.



No it isn't, actually. Aliasing happens when you sample different signals in such a way that they appear identical once put back together; that is the textbook definition. Here the signals represent graphical elements before they are rasterized (sampled), *not the whole image* as Nvidia would like to imply. The way you avoid aliasing is to apply a filter to those particular elements before they are sampled, or in this case before the image is rendered; MSAA, for example, works like that (well, not exactly, but the point is that it operates on a per-component basis). That is also the textbook definition of how you would go about applying an anti-aliasing method.

Whatever DLSS does is applied to the whole image after it is rendered; it is an upscaling solution, nothing more. That's the equivalent of interpolating a signal, not filtering it. Post-process AA like FXAA isn't a true AA method by definition either, because there you alter the signal after it was sampled as well, not while it is constructed.
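The textbook aliasing described above is easy to demonstrate: a sine wave above the Nyquist rate produces exactly the same samples as its low-frequency alias, so no amount of post-processing can tell the two apart. A minimal NumPy sketch:

```python
import numpy as np

fs = 8                       # sampling rate, Hz
t = np.arange(16) / fs       # two seconds of sample instants
f_hi, f_alias = 9, 1         # 9 Hz is above Nyquist (fs/2 = 4 Hz); its alias is 9 - 8 = 1 Hz

samples_hi = np.sin(2 * np.pi * f_hi * t)
samples_alias = np.sin(2 * np.pi * f_alias * t)

# The sampled values are numerically identical: after sampling, the
# information needed to distinguish the two signals is simply gone.
print(np.allclose(samples_hi, samples_alias))  # True
```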



jabbadap said:


> you would not need tensor cores to do just that.



You don't, you use those cores only to make it faster.


----------



## M2B (Feb 14, 2019)

All the money and time (even if it isn't much) that Nvidia is spending on DLSS is a waste.
They should focus entirely on the ray-tracing implementation instead. Metro is a nice showcase for them; after I saw the ray-traced global illumination in Metro Exodus, it really made me think that ray tracing has the potential to improve things in games. Cyberpunk 2077 is a good opportunity for Nvidia, a game that most enthusiast PC gamers actually care about; they have a lot of time left to properly implement ray tracing in it and maybe make the game a true showcase for their next-gen GPUs.


----------



## Aquinus (Feb 14, 2019)

Raevenlord said:


> artefacts


Typo?

Isn't the purpose of DLSS to use all of that extra crap nVidia added to the RTX series cards? It's more of a "Hey look, all that extra stuff you got actually isn't a waste!" This is like AMD saying, "Actually, it was. We already have _open_ solutions that are perfectly adequate and widely used."


----------



## Basard (Feb 14, 2019)

M2B said:


> All the money and time (even if it isn't much) that Nvidia is spending on DLSS is a waste.
> They should focus entirely on the ray-tracing implementation instead. Metro is a nice showcase for them; after I saw the ray-traced global illumination in Metro Exodus, it really made me think that ray tracing has the potential to improve things in games. Cyberpunk 2077 is a good opportunity for Nvidia, a game that most enthusiast PC gamers actually care about; they have a lot of time left to properly implement ray tracing in it and maybe make the game a true showcase for their next-gen GPUs.



Yeah.... they shoulda ditched tensor cores and just focused on more RTX cores or something.


----------



## moproblems99 (Feb 14, 2019)

M2B said:


> Cyberpunk 2077 is a good opportunity for nvidia



Please don't encourage them.


----------



## Metroid (Feb 14, 2019)

TAA is very blurred; I prefer DLSS in those images above, night and day in comparison. What the hell is AMD smoking, saying TAA is better than DLSS?


----------



## Ubersonic (Feb 14, 2019)

MrAMD said:


> DLSS actually looks sharper to me vs TAA


In fairness, that's like saying that horse poo tastes better than dog poo: it's correct (herbivore vs. omnivore), but not much to shout about, as I'm sure most people would prefer to eat neither.


----------



## Aerpoweron (Feb 14, 2019)

> All the money and time (even if it isn't much) that Nvidia is spending on DLSS is a waste.
> They should focus entirely on the ray-tracing implementation instead. Metro is a nice showcase for them; after I saw the ray-traced global illumination in Metro Exodus, it really made me think that ray tracing has the potential to improve things in games. Cyberpunk 2077 is a good opportunity for Nvidia, a game that most enthusiast PC gamers actually care about; they have a lot of time left to properly implement ray tracing in it and maybe make the game a true showcase for their next-gen GPUs.



The current problem with ray tracing is that you can only use it as a cherry on top. The game still has to work without it. There is still no game whose lighting is done completely through ray tracing; I think you would need a completely different game engine to do that, and then it would no longer run on normal GPUs at sufficient performance. Even Nvidia has to think about that, since they just released their GTX 1660 Ti.

And I think the pictures are labeled wrong as well. But we need more information: what resolution they were taken at, what resolution DLSS scales up from (and fills in some details with), and what the exact TAA and SMAA settings are.


----------



## noel_fs (Feb 14, 2019)

I think the overall image output of DLSS has too much noise.


----------



## Imsochobo (Feb 14, 2019)

Metroid said:


> TAA is very blurred; I prefer DLSS in those images above, night and day in comparison. What the hell is AMD smoking, saying TAA is better than DLSS?



Look at DLSS elsewhere; then it's the opposite.


----------



## Batou1986 (Feb 15, 2019)

> AMD Doesn't Believe in NVIDIA's DLSS


AMD isn't the only one


----------



## trog100 (Feb 15, 2019)

Running the 3DMark comparison, both runs look fine to me; the only difference I see is that one runs at 35 fps and the other at 50 fps, and that is quite a difference..

Looking at the comparison images in this thread, the ones labeled DLSS look far better.. okay, the DLSS knockers think they are labeled wrong.. maybe they are, who knows.. he he

trog


----------



## Xex360 (Feb 15, 2019)

This gimmick is just useless, for one simple reason: you can't use it freely on every game. As an example, right now I only play one game, BF1, and it doesn't support DLSS, so even with a 2080 Ti I'd be stuck using normal AA solutions.
I think nVidia should have spent their silicon budget on many more CUDA cores, and found a solution like the Xbox 360 promised: basically free 2x MSAA or more.


----------



## FordGT90Concept (Feb 15, 2019)

I expected AMD to take the open source path and that sounds like it is the case (if they pursue it at all).  Judging by this thread, DLSS is on the path to being yet another HairWorks: a few games will use it because NVIDIA paid them to.  After that, no one supports it.


----------



## renz496 (Feb 15, 2019)

Proprietary solutions? Nvidia makes those all the time, for example TXAA and MFAA, which only work on their GPUs. AMD never made a single noise about those before; why do they suddenly care when Nvidia tries to push DLSS? The issue is probably less about being proprietary or open. The way I see it, some people like it and some don't, more or less the same as it was with FXAA, where some people didn't like the blurring side effect and others would rather have FXAA than no AA at all. AMD most likely cares because DLSS can uplift performance quite significantly regardless of image quality. If DLSS can give Nvidia a performance advantage (when AMD can only use TAA), they probably want the public to know that the comparison is not really a fair one because of image quality.

Personally, I think AMD probably also wants to offer something similar to DLSS (hence the earlier talk about using DirectML), but something like DLSS is not simply "injecting" AA into the game; DLSS needs the image to be trained using ML first. For Nvidia, this training cost is something they are willing to shoulder themselves instead of passing it to game developers. Will AMD be willing to do the same? Some people say this latest effort from Nvidia is just wasted money, but that's simply how they roll: they try to push something, and when it doesn't work for them, they just move on.



Xex360 said:


> This gimmick is just useless, for one simple reason: you can't use it freely on every game. As an example, right now I only play one game, BF1, and it doesn't support DLSS, so even with a 2080 Ti I'd be stuck using normal AA solutions.
> I think nVidia should have spent their silicon budget on many more CUDA cores, and found a solution like the Xbox 360 promised: basically free 2x MSAA or more.



They can add more CUDA cores, but in doing so they will face another problem; a glimpse of that problem is already here with the RTX 2080 Ti.


----------



## megamanxtreme (Feb 15, 2019)

renz496 said:


> They can add more CUDA cores, but in doing so they will face another problem; a glimpse of that problem is already here with the RTX 2080 Ti.


I'm not up to date on that, what's going on?


----------



## GlacierXD (Feb 15, 2019)

I don't believe it either.


----------



## SystemMechanic (Feb 15, 2019)

rtwjunkie said:


> Because every frame of a benchmark can be optimized for.  Out in the wild, you are more likely to see those pictures actually reversed as far as image quality, based on what I have read so far.



Then that would be Nvidia falsely advertising and manipulating the customer. If you watched all their DLSS presentations, you'll know. Also, the point of Nvidia comparing Port Royal's blurry TAA with crisp DLSS is to show that DLSS is better and doesn't cause blurriness. So Nvidia is screwing around with us in this case.



jabbadap said:


> It is an AA method by definition. And no, it's not just upscaling; you would not need tensor cores to do just that.



Exactly; people are forgetting here that tensor cores actually exist, physically, on the chip. We have plenty of upscaling methods that would be way better than current DLSS, if that were all it was.


----------



## rtwjunkie (Feb 15, 2019)

SystemMechanic said:


> So Nvidia is screwing around with us in this case.


Of course they are! They are stretching the truth to sell a product. AMD has done that before; so has just about any marketing campaign for anything. If you accept that, then life doesn't spark as much outrage. Everyone is trying to get us to spend our discretionary income on their product.


----------



## SystemMechanic (Feb 15, 2019)

rtwjunkie said:


> Of course they are! They are stretching the truth to sell a product. AMD has done that before; so has just about any marketing campaign for anything. If you accept that, then life doesn't spark as much outrage. Everyone is trying to get us to spend our discretionary income on their product.



I hope they get sued for this, then. In theory, AI should be able to do what DLSS promises, but while it was still a theory, Nvidia advertised it as if they already had it and all we needed to do was buy RTX cards. At least ray tracing is actually good at this point. I was actually more keen on DLSS, so I could play at higher frame rates at 4K. But behold, you can't even turn on DLSS without DXR.

The lies about DLSS:

1. It improves performance (we now know this is actually a vague claim, because it comes with a lot of limitations).
2. It maintains or sometimes even increases image quality (this is also a lie, because it doesn't; it actually looks worse than current scaler methods because of the obnoxious blur effect).

Jensen also went on to say that in some cases DLSS can give you an image that looks much, much better than the resolution it is upscaling to. I think this was DLSS 2X? So he basically said, in one of the presentations, that a 1440p image upscaled to 4K will look better than the native 4K image itself..


----------



## Prima.Vera (Feb 15, 2019)

Jeezus! Those TAA screens don't just look like CRAP, they look like an utter garbage blur fest!! Even FXAA or MLAA looks much sharper than that junk! Definitely looks like fabricated screens for marketing purposes, imo.
Where are the SMAA or TSSAA 8x comparison shots?? I think AMD should invest heavily in those two, since they come at almost no performance cost.


----------



## Metroid (Feb 15, 2019)

Prima.Vera said:


> Jeezus! Those TAA screens don't just look like CRAP, they look like an utter garbage blur fest!! Even FXAA or MLAA looks much sharper than that junk! Definitely looks like fabricated screens for marketing purposes, imo.
> Where are the SMAA or TSSAA 8x comparison shots?? I think AMD should invest heavily in those two, since they come at almost no performance cost.



Agreed; I don't understand why AMD is betting on that TAA crap.


----------



## cucker tarlson (Feb 15, 2019)

TAA is crap, full of motion artifacts and image smearing.


----------



## Naito (Feb 15, 2019)

Not voting as none of the points really apply to me; I'm keen to see how it works firsthand, but I'm not jumping on the Turing bandwagon anytime soon...


----------



## Markosz (Feb 15, 2019)

Metroid said:


> TAA is very blurred; I prefer DLSS in those images above, night and day in comparison. What the hell is AMD smoking, saying TAA is better than DLSS?



Yes, in those images the difference is very obvious. But check out the real game samples, the BF5 and Metro examples here on TPU.
It's as if the labels were mixed up here; DLSS is heavily blurred in reality.


----------



## SystemMechanic (Feb 15, 2019)

Metro screenshots from Reddit (the user claims he was using the latest patch):



http://imgur.com/MKdsJDW




http://imgur.com/ubnK8a5


I don't even have to label these, because it's very obvious: one of the images is very blurry, as if there were some sort of depth-of-field effect applied to the scene...


----------



## Recus (Feb 15, 2019)

Looks like AMD is scared. They were caught off guard.


----------



## SetsunaFZero (Feb 15, 2019)

IMO DLSS was only made to make RTX games playable; just look at the new Metro 1440p and 4K benchmarks. DLSS will be present in future GPU generations, but it won't be mandatory.


----------



## cucker tarlson (Feb 15, 2019)

This early version of DLSS works best for 4K owners; that's about it. It needs a high-res ground image, at least 1440p. The returns are there when it comes to the fps you gain, but I think that scenario is about the only one I'd use DLSS for. It's far from ideal, but the results are there; it needs time. What was an experiment in the first RTX series will be bettered in the second one, both in hardware and in software. So yes, AMD were completely caught out here; they've completely stopped innovating themselves. RVII has zero new features compared to Vega.


----------



## londiste (Feb 15, 2019)

AMD Doesn't Believe in NVIDIA's DLSS, Stands for Open SMAA and TAA Solutions
In other news - water is wet.
I would be worried about their mental health if they said Nvidia's proprietary solution was cool or something.



> In fact, AMD decides to go head-on against NVIDIA's own technical presentations, comparing DLSS' image quality and performance benefits against a native-resolution, TAA-enhanced image - they say that SMAA and TAA can work equally as well without "the image artefacts caused by the upscaling and harsh sharpening of DLSS."


Nvidia left itself wide open for this one. Nice jab, though in line with the usual marketing bullshit on the topic. A native-resolution, TAA-enhanced image will always be better than the DLSS image, because DLSS is not native resolution. At the same time, DLSS will be in the range of 40% faster.

Taking the example of "4K" images and comparisons, tests so far tend to show that a DLSS image upscaled from 1440p is roughly on par with 1800p + TAA in both image quality and performance. Native 4K + TAA will be considerably slower; 1440p + TAA will be faster but uglier.
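For reference, the raw pixel counts behind those comparisons, under the simplifying assumption that shading cost scales roughly with the number of pixels rendered:

```python
# Pixels per frame at each resolution, and how much less shading work
# each needs relative to native 4K.
resolutions = {
    "1440p": (2560, 1440),
    "1800p": (3200, 1800),
    "4K":    (3840, 2160),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, p in pixels.items():
    print(f"{name}: {p / 1e6:.2f} MP, {pixels['4K'] / p:.2f}x vs 4K")
# 1440p: 3.69 MP, 2.25x vs 4K
# 1800p: 5.76 MP, 1.44x vs 4K
# 4K: 8.29 MP, 1.00x vs 4K
```

The 2.25x pixel deficit of the 1440p input is where most of DLSS's headline speedup comes from; the network's job is to claw back the image quality that rendering fewer pixels gives up.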



> Of course, AMD may only be speaking from the point of view of a competitor that has no competing solution. However, company representatives said that they could, in theory, develop something along the lines of DLSS via a GPGPU framework, a task for which AMD's architectures are usually extremely well-suited.


Correct. But they neglect to mention that this approach has a clear downside: it uses the same compute resources that are otherwise used directly for rendering. It is not clear whether, and how much, DLSS uses the CUDA cores in addition to the tensor cores, but the impression so far is that DLSS is leaner on compute than it would be without tensor cores.


----------



## Vayra86 (Feb 15, 2019)

Pointless statement from AMD; it's clear as day that their focus is elsewhere, and this is a VERY cheap answer meant to create the impression they're still in the game. They aren't doing jack shit with AA at this time, or with DLSS, and this is a fancy way of saying it.

That said, they aren't wrong... DLSS is dead in the water.



SystemMechanic said:


> Metro SS's From Reddit: - user claims he was using the latest patch.
> 
> 
> 
> ...



LOL. Where is the blur? It's too dark to see...


----------



## Vya Domus (Feb 15, 2019)

Recus said:


> Looks like AMD is scared. They were caught off guard.



Indeed, they are scared of an upscaler found in two games, one of which objectively looks like utter garbage. I bet they are shitting themselves.


----------



## Nxodus (Feb 15, 2019)

I wish AMD had some fancy new tech that would make Nvidia say stuff like "we don't believe in AMD's..."

Thing is, AMD has zero innovation (besides "more VRAM, guys, let's add some more VRAM"). What terrible competition AMD is. Come on, Intel, show AMD how competing with Nvidia is done.


----------



## Vya Domus (Feb 15, 2019)

It always cracks me up how people have faith in a company that has twice failed miserably to enter the dedicated GPU business with competing products.


----------



## Nxodus (Feb 15, 2019)

Vya Domus said:


> It always cracks me up how people have faith in a company that has twice failed miserably to enter the dedicated GPU business with competing products.



It always cracks me up that you can't properly quote the people you're responding to; it's as if you're having a monologue with yourself.


----------



## moproblems99 (Feb 15, 2019)

Vya Domus said:


> It always cracks me up how people have faith in a company that has twice failed miserably to enter the dedicated GPU business with competing products.



Not to mention that they hired the person responsible for the GPUs they complain about so much...


----------



## Kamgusta (Feb 15, 2019)

It's not that DLSS is sharper; it's TAA that is muddy.


----------



## ratirt (Feb 15, 2019)

Nxodus said:


> I wish AMD had some fancy new tech that would make Nvidia say stuff like "we don't believe in AMD's..."
> 
> Thing is, AMD has zero innovation (besides "more VRAM, guys, let's add some more VRAM"). What terrible competition AMD is. Come on, Intel, show AMD how competing with Nvidia is done.


At first, I'd say that a blurry picture is nothing innovative, if you're speaking about DLSS. Besides, too much sharpness or too much blurriness is not good either way. DLSS brings nothing new except higher FPS? (Correct me if I'm wrong.) If that's the case, then it's because the image quality is down; that's how I see it from the TPU review of DLSS.
To summarize: Nvidia didn't invent anything new with DLSS as you perceive it. Blurriness has been with us a long time. Calling something DLSS and telling people it is a new technology isn't OK. Maybe it brings something new to the table, but, I hope, that is yet to be seen. NV's focus, as we all know, is on money, so DLSS is more of a marketing move than actual innovation for me (innovation with poor image quality), especially when they come up with some new tech, ditching their 1-2-year-old innovations, to charge more for it.
I stick with AMD and open techniques for improving image quality and implementing new stuff. This is just another way for NV (that's just my opinion) to offer "something new that only they have" (but which in fact was already developed in the market), to convince customers and game developers that's the way to go, and of course to charge more.
Just like it is with G-Sync, and just like it will be with G-Sync Compatible (which is in fact FreeSync). That's just lame.


----------



## cucker tarlson (Feb 15, 2019)

ratirt said:


> I stick with AMD and open techniques for improving image quality and implement new stuff.


huh ? what did I miss ?


----------



## Nxodus (Feb 15, 2019)

ratirt said:


> At first, I'd say that a blurry picture is nothing innovative, if you're speaking about DLSS. Besides, too much sharpness or too much blurriness is not good either way. DLSS brings nothing new except higher FPS? (Correct me if I'm wrong.) If that's the case, then it's because the image quality is down; that's how I see it from the TPU review of DLSS.
> To summarize: Nvidia didn't invent anything new with DLSS as you perceive it. Blurriness has been with us a long time. Calling something DLSS and telling people it is a new technology isn't OK. Maybe it brings something new to the table, but, I hope, that is yet to be seen. NV, as we all know, is focused on money, so DLSS is more of a marketing move than actual innovation for me, especially when they come up with some new tech, ditching their 1-2-year-old innovations, to charge more for it.
> I stick with AMD and open techniques for improving image quality and implementing new stuff. This is just another way for NV (that's just my opinion) to offer "something new that only they have" (but which in fact was already developed in the market), to convince customers and game developers that's the way to go, and of course to charge more.
> Just like it is with G-Sync, and just like it will be with G-Sync Compatible (which is in fact FreeSync). That's just lame.



Who said innovation must always be good? There's bad innovation too. The focus here is on the fact that it's something new.
AMD is forced to market itself with open tech; it's actually a smart business move. Who knows how open AMD would be if it were in the no. 1 position. AMD is just picking up the crumbs that Nvidia left under the table. Which is sad; I really want AMD to be equal to Nvidia.


----------



## ratirt (Feb 15, 2019)

cucker tarlson said:


> huh ? what did I miss ?


What got you tipped over there?
I'm referring to SMAA and TAA, since that's what this thread is about.



Nxodus said:


> Who said innovation must always be good? There's bad innovation too. The focus here is on the fact that it's something new.
> AMD is forced to market itself with open tech; it's actually a smart business move. Who knows how open AMD would be if it were in the no. 1 position. AMD is just picking up the crumbs that Nvidia left under the table. Which is sad; I really want AMD to be equal to Nvidia.


Really? So innovation in your dictionary is something new, but not necessarily made to improve things; it can make them worse?
What crumbs? NV feeds you with "innovations" that give you nothing but a drop in image quality. We look for improvements, not deterioration that is still called innovation.


----------



## cucker tarlson (Feb 15, 2019)

ratirt said:


> What got you tipped over there?
> I'm referring to SMAA and TAA, since that's what this thread is about.


what new stuff is amd implementing ?



ratirt said:


> What got you tipped over there?
> NV feeds you its "innovations", which give *nothing but a drop in image quality*. We look for improvements, not deterioration that's still called innovation.


really ? you can't even think of one ?


----------



## ratirt (Feb 15, 2019)

cucker tarlson said:


> what new stuff is amd implementing ?


Ask AMD not me.


----------



## cucker tarlson (Feb 15, 2019)

ratirt said:


> Ask AMD not me.


but you said you stick with amd for implementing new image quality improvements. therefore I'm asking you, cause I can't think of any from the last few years.


----------



## Nxodus (Feb 15, 2019)

ratirt said:


> What got you tipped over there?
> I'm referring to SMAA and TAA, since that's what this thread is about.
> 
> 
> ...



Really. 
Also, RT is the next step towards realism, I really like what I'm seeing. Innovation in the VGA sector has been pretty bland in the last couple of years, and behold the only meaningful innovation (RT) is delivered by Nvidia. Where is AMD? I want AMD to innovate and make Nvidia crawl back in shame. But it's not happening.


----------



## londiste (Feb 15, 2019)

AMD does create and provide some stuff, today mostly under their GPUOpen initiative:
https://gpuopen.com/games-cgi/#effects


----------



## Nxodus (Feb 15, 2019)

londiste said:


> AMD does create and provide some stuff, today mostly under their GPUOpen initiative:
> https://gpuopen.com/games-cgi/#effects



All of them are AMD radeon exclusive. That's very open


----------



## Vya Domus (Feb 15, 2019)

Nxodus said:


> It always cracks me up that you can't properly quote people you're responding to, as if, you're having a monologue to yourself.



Feeling neglected ? I quote whoever I want whenever I want.


----------



## londiste (Feb 15, 2019)

Nxodus said:


> All of them are AMD radeon exclusive. That's very open


They are open. There are GitHub links to source.
Of course, they are optimized for GCN, but that is expected.


----------



## Litzner (Feb 15, 2019)

The biggest elephant in the DLSS room for me is input latency. I haven't seen any direct tests of this (and I would love to), but I am assuming this type of post-processing has to add some amount of input latency? Just reading about how it works, I am guessing the input latency hit may be substantial. I don't spend all that time and money getting my setup's input latency as low as possible just to add a bunch back with post-processing.


----------



## londiste (Feb 15, 2019)

Litzner said:


> The biggest elephant in the DLSS room for me is input latency. I haven't seen any direct tests of this (and I would love to), but I am assuming this type of post-processing has to add some amount of input latency? Just reading about how it works, I am guessing the input latency hit may be substantial. I don't spend all that time and money getting my setup's input latency as low as possible just to add a bunch back with post-processing.


Why do you think it increases input latency more than any other part of rendering a frame? Basically, the effect on latency should just be a slightly longer frame render time.
I am not saying it does not increase input latency, but is there a reason to think it does?
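To illustrate the point (toy Python with purely illustrative, made-up numbers, not measurements): a fixed per-frame upscaling pass adds to frame time like any other render stage, so its latency cost is just that extra per-frame work, not extra queued frames.

```python
# Toy arithmetic, not measurements: a fixed per-frame upscaling pass adds to
# frame time like any other render stage, so the latency cost is just the
# extra per-frame work -- it does not queue up additional frames.

def frame_time_ms(render_ms, upscale_ms=0.0):
    """Total frame time: base render cost plus any fixed post-pass cost."""
    return render_ms + upscale_ms

native = frame_time_ms(25.0)         # hypothetical native-resolution render time
upscaled = frame_time_ms(14.0, 1.5)  # hypothetical lower-res render + fixed upscale pass

# The upscaled path still comes out ahead on latency whenever the render-time
# saving exceeds the cost of the extra pass.
```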


----------



## cucker tarlson (Feb 15, 2019)

londiste said:


> AMD does create and provide some stuff, today mostly under their GPUOpen initiative:
> https://gpuopen.com/games-cgi/#effects


what games are they implemented in ?


----------



## londiste (Feb 15, 2019)

Off the top of my head? TressFX was in Tomb Raider 2013 

Edit: with a bit of searching, TressFX is also in Rise of the Tomb Raider (version modified by Square Enix), Deus Ex: Mankind Divided (TressFX 3.0).
These are not branded and marketed like Nvidia's GameWorks. Being open, pretty much anyone is free to modify and include them in their software. Bigger engines probably have most of this stuff included.


----------



## Litzner (Feb 15, 2019)

londiste said:


> Why do you think it increases input latency more than any other part of rendering a frame? Basically, the effect on latency should just be a slightly longer frame render time.
> I am not saying it does not increase input latency, but is there a reason to think it does?



"DLSS leverages a deep neural network to extract multidimensional features of the rendered scene and intelligently combine details from multiple frames to construct a high-quality final image."

If they are comparing multiple frames and combining them to construct the image you see, then you will always be behind by as many frames as they compare.


----------



## R0H1T (Feb 15, 2019)

Nxodus said:


> It always cracks me up that you can't properly quote people you're responding to, as if, you're having a monologue to yourself.


There's an entire industry making billions off monologue, more than Nvidia or possibly even Intel. Who says monologues are bad 


cucker tarlson said:


> I can't think of any from the last few years.


TressFX, if that counts?


----------



## cucker tarlson (Feb 15, 2019)

londiste said:


> Off the top of my head? TressFX was in Tomb Raider 2013
> 
> Edit: with a bit of searching, TressFX is also in Rise of the Tomb Raider (version modified by Square Enix), Deus Ex: Mankind Divided (TressFX 3.0).
> These are not branded and marketed like Nvidia's GameWorks. Being open, pretty much anyone is free to modify and include them in their software. Bigger engines probably have most of this stuff included.


yeah, I knew those two. that's a pretty sad list tho. I saw amd hair in rotr and deus ex, looked good, though a bit too shiny and fake to me.



R0H1T said:


> There's an entire industry making billions off monologue, more than Nvidia or possibly even Intel. Who says monologues are bad
> TressFX, if that counts?



that's it ?


----------



## londiste (Feb 15, 2019)

Litzner said:


> "DLSS leverages a deep neural network to extract multidimensional features of the rendered scene and intelligently combine details from multiple frames to construct a high-quality final image."
> If they are comparing multiple frames and combining them to construct the image you see, then you will always be behind by as many frames as they compare.


Depends. It is unclear whether any machine-learning work actually takes place during rendering. Most of that work is supposed to happen on Nvidia's render farm, with the resulting data delivered via game or driver updates. I have seen a few suspicions - but no details or analysis - that something might run locally, but even that does not necessarily mean delayed frames; more likely comparisons with the previous (few) frames.
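To sketch why reusing previous frames needn't delay anything (toy Python, a generic TAA-style accumulation, not DLSS's actual internals): if only *past* frames feed into the blend, each output frame can be emitted as soon as its input arrives.

```python
# Toy sketch of TAA-style temporal accumulation that only reuses *past*
# frames: the current frame is blended into a running history, so every
# output is available the moment its input frame is rendered -- history
# reuse by itself adds no whole-frame delay.

def accumulate(history, current, alpha=0.1):
    """Blend the current frame into a running exponential average."""
    if history is None:
        return list(current)
    return [(1 - alpha) * h + alpha * c for h, c in zip(history, current)]

history = None
for frame in ([0.0, 1.0], [1.0, 0.0], [0.0, 1.0]):  # toy 2-pixel "frames"
    history = accumulate(history, frame)  # output ready immediately per frame
```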


----------



## moproblems99 (Feb 15, 2019)

cucker tarlson said:


> bit too shiny and fake to me



Sounds familiar...like I saw it in Screen shots of a Battlefield game or something....what was that feature called...


----------



## Litzner (Feb 15, 2019)

londiste said:


> Depends. It is unclear if there is any machine learning-ish taking place during rendering. Most of that work is supposed to happen in Nvidia's render farm and data from that delivered either via game or driver updates. I have seen a few suspicions - but no details or analysis - that something might take place locally but that does not necessarily mean delayed frames but more likely comparisons with previous (few) frames.



Which is why I am very curious to see some input latency tests done for DLSS


----------



## Super XP (Feb 15, 2019)

hutt132 said:


> I think you have the labeling wrong on the comparison screenshots.


DLSS doesn't look great. Agreed.


----------



## cucker tarlson (Feb 15, 2019)

moproblems99 said:


> Sounds familiar...like I saw it in Screen shots of a Battlefield game or something....what was that feature called...


lol any reflections in any pc game are designed to look that way. I absolutely adore how the amd fanbase still complains that those rtx reflections are too shiny, as if they didn't know it's still a computer game rendering of a reflection. we are not at the stage of photorealism yet, sweetie
said that before, the grapes are getting even more sour with amd leaving rt features for nvidia to claim for themselves.


----------



## moproblems99 (Feb 15, 2019)

cucker tarlson said:


> lol any reflections in any pc game are designed to look that way. I absolutely adore how the amd fanbase still complains that those reflections look too shiny, as if they didn't know it's still a computer game, we are decades from photorealism.
> said that before, the grapes are getting even more sour with amd leaving rt features for nvidia to claim for themselves.



Wait, didn't you just say the same thing about TressFX?  Pot, meet kettle.

Edit:  To add to that, if AMD had RTRT features, that crap would be off.  It looks terrible.  I might give it a go on Metro (if I had it) but as long as it looks like BFV, that shit is off.


----------



## cucker tarlson (Feb 15, 2019)

moproblems99 said:


> Wait, didn't you just say the same thing about TressFX?  Pot, meet kettle.


what did I say about tress fx ?
the fact that it's shiny? yes, it is, which stands out when it's applied to hair.
still beats the default hair, and I like how it has no performance impact. I liked hairworks in witcher 3 more, but that came with a big performance penalty. not everyone has enough gpu resources to spare for that.


----------



## ratirt (Feb 15, 2019)

cucker tarlson said:


> but you said you stick with amd for implementing new image quality improvements. therefore I'm asking you, cause I can't think of any from the last few years.


read again.


----------



## moproblems99 (Feb 15, 2019)

cucker tarlson said:


> what did I say about tress fx ?
> the fact that it's shiny? yes, it is, which stands out when it's applied to hair.



What did I say about RTX?  The exact same thing.  Reflective surfaces are way too shiny and look fake.  How do our comments differ?  Enlighten me.


----------



## cucker tarlson (Feb 15, 2019)

moproblems99 said:


> What did I say about RTX?  The exact same thing.  Reflective surfaces are way too shiny and look fake.  How do our comments differ?  Enlighten me.


cause you were complaining how *reflections are shiny* ?











ratirt said:


> read again.


okay.you go



ratirt said:


> I stick with AMD and open techniques for improving image quality and implement new stuff.



I ask


cucker tarlson said:


> what new stuff is amd implementing ?



you go



ratirt said:


> Ask AMD not me.


----------



## ratirt (Feb 15, 2019)

cucker tarlson said:


> cause you were complaining how *reflections are shiny* ?
> 
> 
> 
> ...



Listen, I'm not giving you English lessons.


----------



## moproblems99 (Feb 15, 2019)

cucker tarlson said:


> cause you were complaining how *reflections are shiny* ?



Sorry but can you show me where I said the reflections were too shiny?  Let me quote myself below.

Edit:  I'll cut you a break because I didn't explicitly say reflective surfaces.  However, I figured someone with your intellect would have put it together.



moproblems99 said:


> Sounds familiar...like I saw it in Screen shots of a Battlefield game or something....what was that feature called...


----------



## jabbadap (Feb 15, 2019)

ratirt said:


> at first I'd say that a blurry picture is nothing innovative, if you're speaking about DLSS. Besides, too much sharpness or too much blurriness isn't good either way. DLSS brings nothing new except that the FPS is higher? (correct me if I'm wrong) If that's the case, then it's because the image quality is down, and that's how I see it from the TPU review of DLSS.
> To summarize: Nvidia didn't invent anything new with DLSS as you perceive it. Blurriness has been with us a long time. Calling something DLSS and telling people it's a new technology isn't OK. Maybe it brings something new to the table but, I hope, this is yet to be seen. NV's focus, as we all know, is on money, so this DLSS is more marketing than actual innovation to me (innovation with poor image quality). Especially when they come up with some new tech, ditching all the 1-2 year old "innovations", bringing something new to the table to charge more for it.
> I stick with AMD and open techniques for improving image quality and implementing new stuff. This is just another way for NV (that's just my opinion) to offer "something new that only they have" (when in fact it has already been developed and on the market) to convince customers and game developers that it's the way to go, and of course to charge more.
> Just like it is with G-Sync, and just like it will be with G-Sync Compatible (which is in fact FreeSync). That's just lame.



Well, it's a new way of doing AA; there wasn't an AA method using machine learning before DLSS, so saying it's nothing new is a false statement. And, as the article says, AMD is exploring similar WinML upscaling filters too. The filter only works as well as it could be taught. Right now DLSS looks like crap; maybe it could get better over time, with more ML runs on the supercomputer producing a better training model for inference. But if it can't achieve that, it's just one more bad AA method out there, which will eventually die away.


----------



## cucker tarlson (Feb 15, 2019)

moproblems99 said:


> Sorry but can you show me where I said the reflections were too shiny?  Let me quote myself below.
> 
> Edit:  I'll cut you a break because I didn't explicitly say reflective surfaces.  However, I figured someone with your intellect would have put it together.


I said hair looks "too fake and shiny"
you said "like the feature I saw in BF5"

BF5 has raytraced reflections only. you're literally complaining that nvidia's reflections are shiny.
don't worry tho, I'm enjoying this, it's hilarious


----------



## moproblems99 (Feb 15, 2019)

cucker tarlson said:


> BF5 has raytraced reflections only.



Right you are.  But it is the reflective surfaces that look too fake and shiny.  The reflections probably look great.  I just can't get over the surfaces looking terrible.


----------



## cucker tarlson (Feb 15, 2019)

moproblems99 said:


> Right you are.  But it is the reflective surfaces that look too fake and shiny.  The reflections probably look great.  I just can't get over the surfaces looking terrible.


probably has to happen at this point, or it would be too subtle for anyone's eye to really catch. ssr has been doing the exact same thing for years, but with poor reproduction of the reflected object, and I've never seen you complain.
puddles, cars, windows - everything in a pc game that uses ssr is overexposed to show off the effect.


----------



## moproblems99 (Feb 15, 2019)

cucker tarlson said:


> probably has to happen at this point, or it would be too subtle for anyone's eye to really catch.



Oh, so now we have to make the reflective surfaces overly shiny, fake, and not realistic looking so that people MIGHT actually be able to see this revolutionary tech for realism improvements.

Gotcha.

Nice Ninja.


----------



## ArbitraryAffection (Feb 15, 2019)

Fluffmeister said:


> It would just be nice if AMD could conclusively beat a 2 year old EOL card with 7nm tech, but the spin from their director of marketing is.... expected.


Would be nice if NVIDIA could compete with a 2 year old not-quite-EOL card with their 16 and 12nm tech

_*£150~ RX 570 8GB mustard rice ~*_

I think TAA looks good. It's in Fallout 4 and Warframe. I'd take it over a dumb computer program guessing what a super-sampled image should look like. Especially when you consider how much die space is taken up by those Tensor cores. Honestly, those tensors are for AI training in servers and pro cards and NVIDIA is trying to sell them to gamers so they are OK paying more for less performance because the dies are bigger. ~


----------



## cucker tarlson (Feb 15, 2019)

moproblems99 said:


> Oh, so now we have to make the reflective surfaces overly shiny, fake, and not realistic looking so that people MIGHT actually be able to see this revolutionary tech for realism improvements.
> 
> Gotcha.
> 
> Nice Ninja.


they've been like that since before rtx, to show off ssr. please just don't tell me you've only now noticed.


----------



## moproblems99 (Feb 15, 2019)

ArbitraryAffection said:


> I'd take it over a dumb computer program guessing what a super-sampled image should look like. Especially when you consider how much die space is taken up by those Tensor cores. Honestly, those tensors are for AI training in servers and pro cards and NVIDIA is trying to sell them to gamers so they are OK paying more for less performance because the dies are bigger.



It certainly isn't dumb, just needs to be refined.  If they can find out a decent algorithm then DLSS could be very useful.  Much like RTX in BFV according to cucker, you likely won't notice it.



cucker tarlson said:


> they've been like that since before rtx, to show off ssr. please just don't tell me you've only now noticed.



Nice try.  You perfectly described the use case of RTX right now.  We will see what happens next gen.  If they can up the jigga rays and devs can learn the nuances, it may pan out.  That is a lot of ifs.  I think DLSS stands a better chance to succeed if they can refine it some more.


----------



## ratirt (Feb 15, 2019)

jabbadap said:


> Well, it's a new way of doing AA; there wasn't an AA method using machine learning before DLSS, so saying it's nothing new is a false statement. And, as the article says, AMD is exploring similar WinML upscaling filters too. The filter only works as well as it could be taught. Right now DLSS looks like crap; maybe it could get better over time, with more ML runs on the supercomputer producing a better training model for inference. But if it can't achieve that, it's just one more bad AA method out there, which will eventually die away.


Maybe I understand innovation a bit differently here.

For me it's a way of achieving at least the same result, or better, in a cost-efficient way (there's more to it, but this is a good example). The RTX price is huge, and this innovation is nothing like what we already have with open techniques. It is worse.
I disagree with your statement. This is what I think about DLSS; it's my opinion. If you say it's worth something, that's OK with me.
I also disagree that having a dozen techniques to achieve one goal is a great way forward. You can't focus on so-called "innovations" and start from scratch every time when everything that's needed is already there, open to everybody. Think about game developers and gamers. Do you realize what would happen if we had these "innovations" once a year? They won't implement every single one.
So for me, at this point, DLSS is a downgrade.


----------



## jabbadap (Feb 15, 2019)

Vya Domus said:


> No it isn't, actually. Aliasing happens when you sample different signals in such a way that they appear identical once put back together, that is the textbook definition. Here the signals represent graphical elements before they are rasterized (sampled) *not the whole image* as Nvidia would like to imply. The way you avoid aliasing is to apply a filter to those particular elements before they are sampled , or in this case before the image is rendered, MSAA for example works like that (well, not exactly, but the point is that this is done on a per component basis). That is also the textbook definition of how you would go about using an anti-aliasing method.
> 
> Whatever DLSS does is applied to the whole image after it is rendered; it is an upscaling solution, nothing more. That's the equivalent of interpolating a signal, not filtering it. Post-process AA like FXAA isn't a true method of AA by definition either, because there you alter the signal after it was sampled as well, not while it is constructed.
> 
> You don't, you use those cores only to make it faster.



The acronym DLSS comes from Deep Learning Super-Sampling, so by definition it's some sort of super-sampling anti-aliasing method, albeit quite a crappy one. But the end result is that it removes aliasing, which is what anti-aliasing is about.
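For reference, the classic super-sampling idea the name alludes to is simple (toy 1-D Python sketch, not DLSS's actual pipeline): sample at a higher rate, then average down, which softens hard edges - i.e. removes aliasing.

```python
# Toy illustration of classic super-sampling AA: render at a higher sample
# rate, then average adjacent samples down to the target resolution. Hard
# edges become intermediate values, i.e. aliasing is smoothed out.

def downsample_2x(samples):
    """Average adjacent sample pairs of a 1-D 'scanline' rendered at 2x."""
    return [(samples[i] + samples[i + 1]) / 2 for i in range(0, len(samples), 2)]

edge_2x = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]  # a hard edge rendered at 2x
print(downsample_2x(edge_2x))             # -> [0.0, 0.5, 1.0], a softened edge
```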



ratirt said:


> Maybe I understand innovation a bit differently here.
> 
> For me it's a way of achieving at least the same result, or better, in a cost-efficient way (there's more to it, but this is a good example). The RTX price is huge, and this innovation is nothing like what we already have with open techniques. It is worse.
> I disagree with your statement. This is what I think about DLSS; it's my opinion. If you say it's worth something, that's OK with me.
> ...



Well yeah, if it worked the way it's marketed, it could be quite a game changer: better quality than 4K with TAA, at much higher framerates. I would be all over it, but in its current state we aren't even close to that. So, a pretty big meh until proven otherwise.


----------



## ratirt (Feb 15, 2019)

jabbadap said:


> The acronym DLSS comes from Deep Learning Super-Sampling, so by definition it's some sort of super-sampling anti-aliasing method, albeit quite a crappy one. But the end result is that it removes aliasing, which is what anti-aliasing is about.
> 
> 
> 
> Well yeah, if it worked the way it's marketed, it could be quite a game changer: better quality than 4K with TAA, at much higher framerates. I would be all over it, but in its current state we aren't even close to that. So, a pretty big meh until proven otherwise.



Just to add something to your premise: this DLSS innovation is, let's say, in a "test mode". You don't release something to the consumer market when it's not ready, and I think we can all agree on that. If it worked the way it's marketed? The question is whether it will ever work the way it's marketed, and I have serious doubts about that. Let me repeat what I've stated earlier: the FPS is higher because the image quality is down. I think it's that simple.


----------



## ArbitraryAffection (Feb 15, 2019)

moproblems99 said:


> It certainly isn't dumb, just needs to be refined.  If they can find out a decent algorithm then DLSS could be very useful.  Much like RTX in BFV according to cucker, you likely won't notice it.



I disagree, honestly; AI technology as it is right now is dumb. It's little more than basic logic engines. There is no real intelligence, and there won't be for many years. It's in its infancy, but of course this is a good step, and it's gotta start somewhere.


----------



## moproblems99 (Feb 15, 2019)

ratirt said:


> The FPS is higher because the image quality is down. I think it's that simple.



Let's be honest, it's pretty hard to get better performance with an increase in image quality.  I have not played a title with DLSS so I am not sure if I would notice the quality or not.  Really depends on the pace of the game.  Metro might be slow enough of a pace that it would be noticeable.  Something like BFV multiplayer, maybe not so much.



ArbitraryAffection said:


> I disagree, honestly; AI technology as it is right now is dumb. It's little more than basic logic engines. There is no real intelligence, and there won't be for many years. It's in its infancy, but of course this is a good step, and it's gotta start somewhere.



I mean, look at the average human: logic is mostly gone. We are, in general, purely emotion-driven now, and you are right that it will take a long time for AI to pick up emotions. However, AI is well suited to things of this nature and can be extremely good at them. It all comes down to the algorithm used. If they can improve it, it could be nice. But again, with consoles running AMD, how much use will it really get? I guess that will come down to how many investments in game studios NV is willing to make.


----------



## ArbitraryAffection (Feb 15, 2019)

moproblems99 said:


> I mean look at the average human, logic is mostly gone.  We are, in general, purely emotion driven


I know this _all too well_ lol.

Interesting topic: When does the AI program become so intelligent that it _doesn't want_ to upscale your video games?


----------



## moproblems99 (Feb 15, 2019)

ArbitraryAffection said:


> I know this _all too well_ lol.
> 
> Interesting topic: When does the AI program become so intelligent that it _doesn't want_ to upscale your video games?



I would assume when it realizes that you could turn settings down to achieve the same goal without having to do all this extra processing?


----------



## ArbitraryAffection (Feb 15, 2019)

moproblems99 said:


> I would assume when it realizes that you could turn settings down to achieve the same goal without having to do all this extra processing?


Ohhhhhh~





Edit: sorry, lol. Couldn't resist.


----------



## ratirt (Feb 15, 2019)

ArbitraryAffection said:


> I disagree, honestly; AI technology as it is right now is dumb. It's little more than basic logic engines. There is no real intelligence, and there won't be for many years. It's in its infancy, but of course this is a good step, and it's gotta start somewhere.


I wouldn't say that entirely.
Some time ago I attended the Warsaw Security Summit conference (hope Cucker will be thrilled, since it was held in Poland) about innovation with AI and deep learning for cameras; since I was doing video surveillance systems, I attended.
Using AI and/or deep learning techniques, the camera was able to recognize human emotions. It could tell if somebody was sad or stressed, etc. Basically, it would tell you if a dude was up to something, since his behavior and face would give it away - like somebody who's about to commit a crime. That's innovation, and it was demonstrated and analyzed later with in-depth information and code. What do we have with DLSS here? People hear "Deep Learning Super-Sampling" and assume it must be great, since "deep learning" is mentioned. What a load of bull crap on a barn floor these marketing shenanigans are.




moproblems99 said:


> Let's be honest, it's pretty hard to get better performance with an increase in image quality.  I have not played a title with DLSS so I am not sure if I would notice the quality or not.  Really depends on the pace of the game.  Metro might be slow enough of a pace that it would be noticeable.  Something like BFV multiplayer, maybe not so much.


You can take a look at the TPU review; you have a comparison there. It's not only about the algorithm, but also its purpose, whether it's the right tool for this particular job (should deep learning be used for this?), and of course whether it's worth it.
I don't know your game preferences, but imagine the motion blur effect. Did you like it when you wanted to excel at a particular game? I didn't. I always wanted a nice, smooth, crisp image. With DLSS it seems like we are going backwards under the deep learning flag, with a promise and the motto "The way it's meant to be played" from now on. It's just sad.


----------



## moproblems99 (Feb 15, 2019)

ratirt said:


> You can take a look in TPU review. You have a comparison. Take a look.



I have a comparison of still images that can be scrutinized till the end of the world. What I don't have is a live-action comparison. I don't know if I would notice it. There are times when I am so caught up playing the game that I don't notice my giant, obvious radar telling me there is someone about to kill me right before I die.

TL;DR:

Still frames are much easier to compare and scrutinize than live action.  Until I see it in person while I am playing, I will withhold final judgement.  That said, I likely will never see it as I likely won't be buying a 2000 series card.


----------



## Casecutter (Feb 15, 2019)

Two years from now it will be like G-Sync: another proprietary technology that went nowhere.


----------



## ratirt (Feb 15, 2019)

Casecutter said:


> Two years from now it will be like G-Sync: another proprietary technology that went nowhere.


And what a waste of time, technology and money (well, maybe not money in particular, since a lot of people will still go for it)


----------



## XXL_AI (Feb 15, 2019)

amd is a spoiled little kid with some skills to show, but they are jealous of every achievement of other companies. they mostly suck at leading innovation, but heck, they can make some decent hardware at a low cost. they also contribute to global warming more than any other computer-part company.


----------



## John Naylor (Feb 16, 2019)

Well, regarding the thread title... I have to ask: what would we expect any competitor, in any market, for any product, to say in response to the announcement of a new feature, with advantages (real or imagined), that the competitor doesn't have access to?


----------



## ratirt (Feb 16, 2019)

This is kinda interesting for those who wanted to know something about Ray Tracing from AMD


----------



## ComedicHistorian (Feb 17, 2019)

I honestly don't know enough to form an opinion as to which of the competing technologies is superior, butttttt if those images are labeled correctly and they were produced and promoted by AMD as part of their argument in favor of TAA over DLSS... well, then that's just dumb (and I'm an AMD stockholder lol fml)


----------



## ratirt (Feb 17, 2019)

This is interesting. Wonder if you guys have seen it. It's about ray tracing and rasterization.


----------



## efikkan (Feb 17, 2019)

ratirt said:


> This is interesting. Wonder if you guys have seen it. It's about ray tracing and rasterization.


It's definitely an interesting subject, and it might be fine as an introduction, but it is glaringly obvious to me that the author's understanding of rendering is only skin deep, and he gets the deeper technical details wrong.

22:58


> But hybrid rendering is a stopgap; Nvidia needs to take the *hybrid approach due to a legacy of thousands of rasterized games*. We see why that is; with Turing already being poorly received due to not being fast enough at rasterization, *can you imagine what would have happened had they doubled or even quadrupled their RTX gigarays while actually lowering rasterization performance?*
> This was the only way Nvidia could do it.
> AMD on the other hand, they're the type of company that would just throw it all out and start from scratch. And I believe we will see them *go down a true raytracing or pathtracing route with the game consoles one day*


Anyone who knows how GPUs and raytracing work understands that the new RT cores do only a part of the rendering. GPUs are in fact a collection of various task-specific accelerated hardware (geometry processing, tessellation, TMUs, ROPs, video encoding/decoding, etc.) plus clusters of ALUs/FPUs for generic math. Even in a fully raytraced scene, 98% of this hardware will still be used. The RT cores are just another type of specialized accelerator for one specific task. And it's not like everything can or will be raytraced: UI elements in a game, a page in your web browser, or your Windows desktop are rasterized because it's efficient, and that will not change.


----------



## Camm (Feb 17, 2019)

FF, Metro and BFV DLSS look like muddy garbage at this point, which is further compounded by the fact that there isn't enough performance to push over 90 fps with DLSS on.

I'm waiting to see what the next few updates do; depending on the game I might turn DLSS on, but at this point RT is not worth the smearing that DLSS is.


----------



## mtcn77 (Feb 17, 2019)

It is okay for Nvidia to keep experimenting with gauges when true to form, but higher engagement with lower results isn't their forte. It has to be a winner of a solution to keep the milk flowing. For the longest time green spammers have had a hunch for texture fidelity, and this is an optimisation towards that. They keep budgeting higher transistor counts, yet I suppose games are forever going to look like textureless bland blobs.


----------



## Fluffmeister (Feb 17, 2019)

Casecutter said:


> Two years from now it will be like G-Sync: another proprietary technology that went nowhere.



Heh, and yet G-Sync spawned FreeSync™.


----------



## mtcn77 (Feb 17, 2019)

Fluffmeister said:


> Heh, and yet G-Sync spawned FreeSync™.


I'm sure Nvidia wouldn't leave it at that, unless it was the other way around...


----------



## Fluffmeister (Feb 17, 2019)

mtcn77 said:


> I'm sure Nvidia wouldn't leave it at that, unless it was the other way around...



Well it wasn't the other way round, maybe the question should be; Would we have FreeSync if Nvidia hadn't...


----------



## mtcn77 (Feb 17, 2019)

Fluffmeister said:


> Well it wasn't the other way round, maybe the question should be; Would we have FreeSync if Nvidia hadn't...


I'm pretty sure we would still have gpus... I'm sure you get the pun.


----------



## Super XP (Feb 17, 2019)

Let me go 2 years into the future. One second please.....  Wow, Navi surprised everybody with its amazing performance, power efficiency and cost, and AMD's version of DLSS actually makes the PQ look fantastic, all while increasing FPS.


----------



## efikkan (Feb 17, 2019)

AMD catching up to Nvidia in one jump? We'll just have to see about that. Navi will compete with the successor of Turing for most of its life, so it would have to offer over twice the efficiency of Vega, which would be no small feat.

BTW; many thought Vega was going to be a Pascal killer too…


----------



## ratirt (Feb 18, 2019)

efikkan said:


> It's definitely an interesting subject, and it might be fine as an introduction, but it is glaringly obvious to me that the author's understanding of rendering is only skin deep, and he gets the deeper technical details wrong.
> 
> 22:58
> 
> Anyone who knows how GPUs and raytracing work understands that the new RT cores are only doing a part of the rendering. GPUs are in fact a collection of various specialized hardware blocks (geometry processing, tessellation, TMUs, ROPs, video encoding/decoding, etc.) and clusters of ALUs/FPUs for generic math. Even in a fully raytraced scene, 98% of this hardware will still be used. The RT cores are just another type of specialized accelerator for one specific task. And it's not like everything can/will be raytraced; UI elements in a game, a page in your web browser or your Windows desktop are rasterized because it's efficient, and that will not change.


So what do you think is wrong with what that dude said? Because I'm sure it is right; I can't see your point here. We are talking about games, not web browsers. The UI isn't ray traced, but that's stating the obvious. The ray tracing is for illumination and shadows (and reflections, like fire in BFV on the gun or in water) to get more realism in the graphics. It is tied to the light sources in the game, so depending on the scene it may not ray trace everything, but a huge chunk of the image will be ray traced.
It shows how ray tracing eats resources and that we still have a limitation with the hardware currently available on the market. I think this video shows what is needed and gives an example of other ways to achieve a ray-traced scene in games now or in the future. We will have to see where and how it will be done and how it will work.


----------



## efikkan (Feb 18, 2019)

ratirt said:


> So what do you think is wrong with what that dude said? Because I'm sure it is right; I can't see your point here.


His mistake is thinking the hardware resources used in rasterized rendering are not used during raytracing, when everything except a tiny part is. Adding raytracing capabilities doesn't lower rasterization capabilities. Please read the parts I bolded again and you'll see.


----------



## ratirt (Feb 18, 2019)

efikkan said:


> His mistake is thinking the hardware resources used in rasterized rendering are not used during raytracing, when everything except a tiny part is. Adding raytracing capabilities doesn't lower rasterization capabilities. Please read the parts I bolded again and you'll see.


If the RT cores are doing the ray tracing, and that's how it has been shown, then there are fewer cores doing the rasterization. So in fact he got it right; that's what I understood from his video, and I think that was its main idea. Adding to my premise: after rasterization is complete, the ray tracing is processed. That also creates a lag (more time needed to complete the frame), since these can't work at the same time. You need the objects already there before you can ray trace them. We can see that in BFV when you switch on ray tracing; it barely caps at 60 FPS.


----------



## londiste (Feb 18, 2019)

ratirt said:


> If the RT cores are doing the ray tracing, and that's how it has been shown, then there are fewer cores doing the rasterization. So in fact he got it right; that's what I understood from his video, and I think that was its main idea. Adding to my premise: after rasterization is complete, the ray tracing is processed. That also creates a lag (more time needed to complete the frame), since these can't work at the same time. You need the objects already there before you can ray trace them. We can see that in BFV when you switch on ray tracing; it barely caps at 60 FPS.


What do you mean there are fewer cores doing the rasterization - that without RT cores they could fit some more shaders onto the chip, or something else?
RT is processed concurrently with rasterization work. There are some prerequisites - generally the G-Buffer - but largely it happens at the same time.


----------



## ratirt (Feb 18, 2019)

londiste said:


> What do you mean there are fewer cores doing the rasterization - that without RT cores they could fit some more shaders onto the chip, or something else?
> RT is processed concurrently with rasterization work. There are some prerequisites - generally the G-Buffer - but largely it happens at the same time.


Are you sure it happens at the same time? You need the objects to be there in order to do the ray tracing. Maybe it works alongside, but there must be a gap between the rasterization and the ray tracing. How can you ray trace an object and add reflections, shadows etc. when the object isn't there?
I think yes: without RT cores there would be more room for rasterization hardware (unless RT cores do the rasterization as well? I don't think so). I'm not sure now, since you were surprised by my statement, although it makes sense. RT cores take up die space, so without them the 2080 Ti would have been faster in rasterization.


----------



## londiste (Feb 18, 2019)

The guy in the video gets some things wrong.
- DXR is not limited to reflections, shadows and AO. Nvidia simply provides more-or-less complete solutions for those three; you can do any other RT work you want with DXR. While not DX12 - and not DXR - the ray-traced Q2 on Vulkan clearly shows that full-on raytracing can be done somewhat easily.
- While RTX/DXR are named after raytracing, they also accelerate pathtracing.
- OctaneRender is awesome but does not perform miracles. Turing does speed up the non-realtime path-/raytracing software where it is implemented, by at least 2x and often 3-5x compared to Pascal. And these are early implementations.



ratirt said:


> Are you sure it happens at the same time? You need the objects to be there in order to do the ray tracing. Maybe it works alongside, but there must be a gap between the rasterization and the ray tracing. How can you ray trace an object and add reflections, shadows etc. when the object isn't there?


Nvidia, as well as various developers, have said RT work can start after the G-Buffer has been generated. Optimizing that start point was one of the bigger boosts in the BFV patch.


----------



## efikkan (Feb 18, 2019)

ratirt said:


> If the rt cores are doing the ray tracing and that's how it has been shown then there's less cores doing the rasterization.


The RT cores are accelerating the calculation and intersection of rays, but the RT cores themselves are not rendering a complete image.
Rendering is a pipelined process: you have the various stages of geometry processing (vertex shaders, geometry shaders and tessellation shaders), then fragment processing ("pixel shaders" in Direct3D terms), and finally post-processing. In rasterized rendering, the rasterization itself is technically only the transition between geometry and fragment shading; it converts vertices from 3D space to 2D space and performs depth sorting and culling before the fragment shader starts putting in textures etc.
In fully raytraced rendering, all the geometry processing will still be the same, but the rasterization step between geometry and fragments is gone, and the fragment processing has to be rewritten to interface with the RT cores. All the existing hardware is still needed, except for the tiny part which does the rasterization. So all the "cores"/shader processors, TMUs etc. are still used during raytracing.

So in conclusion, he is 100% wrong in claiming a "true" raytraced GPU wouldn't be able to do rasterization. He thinks the RT cores do the entire rendering, which of course is completely wrong. The next generations of GPUs will continue to increase the number of "shader processors"/cores, as they are still very much needed for both rasterization and raytracing; it's not a legacy thing like he claims.
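To make this concrete, here is a toy frame-time model (all stage timings below are invented, purely illustrative): in both modes the same shader hardware does the geometry and shading work; the hybrid mode merely drops the small rasterization step and overlaps RT-core work with shading.

```python
# Toy frame-time model in milliseconds (made-up illustrative numbers).
# In both modes the geometry and shading stages run on the same shader
# cores; raytracing only replaces the small rasterization step and adds
# RT-core work that overlaps with shading on dedicated hardware.

RASTER_STAGES = {"geometry": 3.0, "rasterize": 0.5, "shading": 8.0, "post": 1.5}
HYBRID_STAGES = {"geometry": 3.0, "shading": 8.0, "post": 1.5}
RT_CORE_WORK = 6.0  # runs concurrently with shading on the RT cores

def frame_time(stages, concurrent_rt=0.0):
    serial = sum(stages.values())
    # RT-core work hides behind the shading stage; only any excess
    # beyond the shading time would lengthen the frame.
    overlap_excess = max(0.0, concurrent_rt - stages.get("shading", 0.0))
    return serial + overlap_excess

print(frame_time(RASTER_STAGES))                              # 13.0
print(frame_time(HYBRID_STAGES, concurrent_rt=RT_CORE_WORK))  # 12.5
```

The point of the sketch is only that the shader cores stay busy in both modes; the numbers are not measurements of any real GPU.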


----------



## ratirt (Feb 18, 2019)

efikkan said:


> The RT cores are accelerating the calculation and intersection of rays, but the RT cores themselves are not rendering a complete image.
> Rendering is a pipelined process: you have the various stages of geometry processing (vertex shaders, geometry shaders and tessellation shaders), then fragment processing ("pixel shaders" in Direct3D terms), and finally post-processing. In rasterized rendering, the rasterization itself is technically only the transition between geometry and fragment shading; it converts vertices from 3D space to 2D space and performs depth sorting and culling before the fragment shader starts putting in textures etc.
> In fully raytraced rendering, all the geometry processing will still be the same, but the rasterization step between geometry and fragments is gone, and the fragment processing has to be rewritten to interface with the RT cores. All the existing hardware is still needed, except for the tiny part which does the rasterization. So all the "cores"/shader processors, TMUs etc. are still used during raytracing.
> 
> So in conclusion, he is 100% wrong in claiming a "true" raytraced GPU wouldn't be able to do rasterization. He thinks the RT cores do the entire rendering, which of course is completely wrong. The next generations of GPUs will continue to increase the number of "shader processors"/cores, as they are still very much needed for both rasterization and raytracing; it's not a legacy thing like he claims.



I understand how the rasterization process works, and since it works in a pipeline, one thing is done after another. If you add ray tracing to the pipeline, it takes longer, so the speed drops. That's my point.
A "true" raytraced GPU? You mean one full of RT cores only?


----------



## efikkan (Feb 18, 2019)

ratirt said:


> I understand how the rasterization process works, and since it works in a pipeline, one thing is done after another. If you add ray tracing to the pipeline, it takes longer, so the speed drops. That's my point.
> 
> A "true" raytraced GPU? You mean one full of RT cores only?


Once again, I refer you back to the quote from the video. He was talking as if the RT cores are the only thing used during raytracing and the rest is only used during rasterized rendering, which is not true at all.

Think of it more like this: the RT cores accelerate one specific task, kind of like the AVX extensions to x86. Code using AVX will not be AVX-only, and while AVX will do a lot of the "heavy lifting", the majority of the code will still be "normal" x86 code.
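The analogy can be sketched in code (a toy example; the function names and the "accelerated" path are invented for illustration):

```python
# Sketch of the AVX analogy: a hot inner kernel is offloaded to an
# "accelerated" path, but the surrounding program logic stays ordinary
# code - analogous to shader cores still doing most of the work.

def accelerated_dot(a, b):
    # Stand-in for a SIMD/RT-core style fixed-function path:
    # fast, but it only does this one narrow job.
    return sum(x * y for x, y in zip(a, b))

def workload(vector_pairs):
    # "Normal" code: setup, control flow, validation, bookkeeping -
    # the majority of the program, not accelerated at all.
    results = []
    for a, b in vector_pairs:
        if len(a) != len(b):
            continue  # ordinary validation logic
        results.append(accelerated_dot(a, b))
    return results

print(workload([([1, 2], [3, 4]), ([1], [1, 2])]))  # [11]
```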


----------



## londiste (Feb 18, 2019)

ratirt said:


> I understand how the rasterization process works, and since it works in a pipeline, one thing is done after another. If you add ray tracing to the pipeline, it takes longer, so the speed drops. That's my point.


RT Cores work concurrently.
Wasn't this slide already posted in one of the threads?


----------



## medi01 (Feb 18, 2019)

Why do TPU news posts often sound as if written by NV's CEO?



MrAMD said:


> DLSS actually looks sharper


An algorithm that sharpens looks sharper.
Sounds logical.


----------



## ratirt (Feb 18, 2019)

Here's some more DLSS testing. 









It doesn't look great. I really can't see the improvement like you guys say.


----------



## Casecutter (Feb 18, 2019)

Fluffmeister said:


> Heh, and yet G-Sync spawned FreeSync™.


I always understood you to be more "in tune" with technologies than to purport such a false statement.

Variable-refresh-rate monitors - the VESA Adaptive-Sync standard, part of DisplayPort 1.2a - were always a working open standard. It was just that Nvidia wanted to push their proprietary (added-hardware) implementation to jump out in front before VESA (an open coalition of monitor and GPU representatives) finalized the agreed-upon standard, which requires no huge additional hardware/cost, just enabling features already developed in DisplayPort 1.2a.

AMD and the industry had been working to get the open standard going; Nvidia just saw that not progressing to their liking and threw their weight behind licensing that "add-in" proprietary workaround to monitor manufacturers earlier than the VESA coalition was considering a roll-out. Now that those early monitor manufacturers are seeing sales of G-Sync monitors become less lucrative than they once were with the early-adopter community, Nvidia sees itself on the losing end and has decided to slip back into the fold, and unfortunately there are no repercussions for the lack of support they showed the VESA coalition.

Much like the Donald with the "birther" crap: it was "ok" to spout the lies until one day it could no longer pass muster and wasn't expedient to his campaign, and he declared he would not talk about it again. I think that's how Jen-Hsun Huang hopes G-Sync will just pass...


----------



## londiste (Feb 18, 2019)

When G-Sync was shown, VESA Adaptive-Sync in DP 1.2a did not exist. It did exist in eDP, which AMD quickly discovered when they wanted a VRR solution in response; FreeSync was first demoed on laptop screens. VESA Adaptive-Sync took a while to take off - a couple of years, actually, until there was a good selection of monitors available. The industry had not considered VRR important until then.

G-Sync was announced in October 2013. G-Sync monitors started to sell early 2014.
Freesync was announced in January 2014. VESA Adaptive Sync was added to DP 1.2a in mid-2014. Freesync monitors started to sell early 2015.


----------



## Xzibit (Feb 18, 2019)

ratirt said:


> Here's some more DLSS testing.
> 
> 
> 
> ...



OUCH!!!



			
HardwareUnboxed said:

> DLSS is complete garbage


----------



## InVasMani (Feb 19, 2019)

the54thvoid said:


> Yeah, how long before we get gfx cards that can run 8k at 60+FPS?


 RTX on or RTX off lol



Xzibit said:


> OUCH!!!


 Sad part is that it's still easily noticeable enough in the comparisons despite the 1080p YouTube quality.


----------



## ratirt (Feb 19, 2019)

Xzibit said:


> OUCH!!!


I know it is, but it would seem that some people think it is really great. I just can't understand that. We have moved from a smooth, sharp image to a blurry one, which people call an improvement and say looks better.


InVasMani said:


> RTX on or RTX off lol
> 
> Sad part is that it's still easily noticeable enough in the comparisons despite the 1080p YouTube quality.


I looked over a few videos on YouTube (Hardware Unboxed) where they show RTX on and off. It does improve the lighting, reflections etc., especially in Metro. It looks nice and more realistic, though it eats a lot of resources. The RTX effect shows up greatly in outside scenery, but also when a lot of objects are moving inside and there are different light sources; in some other scenes there is no change except the FPS drop. Anyway, it's a nice feature, but when will we be playing with RTX on at 60+ FPS at 4K? I'd say there's a long way to go.


----------



## satrianiboys (Feb 19, 2019)

RTX & DLSS will surely meet their glory later on
Just like how PhysX & HairWorks are still being used today, and how everybody has a G-Sync monitor

Every Nvidia innovation is guaranteed to be a success


----------



## mtcn77 (Feb 19, 2019)

Funny enough, I assumed Nvidia would improve the driver AF levels now that more texture read requests could be issued in newer hardware iterations.


----------



## INSTG8R (Feb 19, 2019)

satrianiboys said:


> RTX & DLSS will surely meet their glory later on
> Just like how PhysX & HairWorks are still being used today, and how everybody has a G-Sync monitor
> 
> Every Nvidia innovation is guaranteed to be a success


 Oh my sides!! Thanks for the laugh. All of those things have faded away....


----------



## Fluffmeister (Feb 19, 2019)

Heh yeah, all those TressFX games certainly taught them a lesson.


----------



## INSTG8R (Feb 19, 2019)

Fluffmeister said:


> Heh yeah, all those TressFX games certainly taught them a lesson.


Exactly, all just flash-in-the-pan "shiny things"; both sides are equal in their attempts... My new one I'm "enjoying" is FreeSync 2 HDR in FC5, FC New Dawn and RE2. TBH it's not that impressive or noticeable


----------



## Fluffmeister (Feb 19, 2019)

INSTG8R said:


> Exactly all just flash in the pan “shiny things” both sides are equal with their attempts...My New one I’m “enjoying” is Freesync 2 HDR in FC5, FC New Dawn and RE2. TBH it’s not that impressive or noticeable



Yeah, kinda confirms open doesn't mean good or popular either; TressFX has gone nowhere. Regardless, Nvidia users get to enjoy both, plus get a choice of G-Sync and FreeSync displays today. Swings and roundabouts, eh.


----------



## INSTG8R (Feb 19, 2019)

Fluffmeister said:


> Yeah kinda confirms open doesn't mean good or popular either, TressFX has gone nowhere. Regardless Nvidia users get to enjoy both plus get a choice of G-Sync and FreeSync displays today. Swings and roundabouts eh.


Well that just proves G-Sync was unnecessary and the open standard works for everyone.


----------



## Fluffmeister (Feb 19, 2019)

INSTG8R said:


> Well that just proves G-Sync was unnecessary and the open standard works for everyone.



Not really; looking at their G-Sync compatible list, the FreeSync working ranges are more often than not awful, and the standards varied from monitor to monitor... in short, there was no standard. G-Sync displays are expensive, but at least they have to achieve minimum requirements. FreeSync took the machine-gun approach.


----------



## INSTG8R (Feb 19, 2019)

Fluffmeister said:


> Not really; looking at their G-Sync compatible list, the FreeSync working ranges are more often than not awful, and the standards varied from monitor to monitor... in short, there was no standard. G-Sync displays are expensive, but at least they have to achieve minimum requirements. FreeSync took the machine-gun approach.


Yeah, I get that, but those monitors are the "cheap and easy" 75 Hz FreeSync-range stuff that's literally "free" FreeSync. I don't really put any stock in those.


----------



## mtcn77 (Feb 19, 2019)

Just what are you expecting out of a 60Hz LCD? 9FPS VRR? The upselling sickens me.


----------



## ratirt (Feb 20, 2019)

mtcn77 said:


> Just what are you expecting out of a 60Hz LCD? 9FPS VRR? The upselling sickens me.


I got a 60 Hz 4K FreeSync screen and couldn't be happier - it's just great. Buying something like that with G-Sync would cost a lot more.
Now NV is joining the club with FreeSync (this "compatible" stuff is so damn funny) since they have seen they are going nowhere with the price.


----------



## satrianiboys (Feb 20, 2019)

Oops, looks like I hurt someone by telling the truth..


----------



## INSTG8R (Feb 20, 2019)

satrianiboys said:


> oops, looks like i hurt someone for telling the truth..


Nothing truthful about your “success” fantasies. It was funny though.


----------



## ratirt (Feb 20, 2019)

satrianiboys said:


> RTX & DLSS will surely meet their glory later on
> Just like how PhysX & HairWorks are still being used today, and how everybody has a G-Sync monitor
> 
> Every Nvidia innovation is guaranteed to be a success



What a hoot, I literally fell off my chair  Just because it's in the game doesn't mean it's being used


----------



## satrianiboys (Feb 20, 2019)

Sigh..
Who is the donkey now for taking my words literally..

Guess I should become a donkey too by putting /s next time
Yes, only a donkey (read: plank) always needs an /s to understand the context of some words


----------



## INSTG8R (Feb 20, 2019)

satrianiboys said:


> Sigh..
> Who is the donkey now for taking my words literally..
> 
> Guess I should become a donkey too by putting /s next time
> Yes, only a donkey (read: plank) always needs an /s to understand the context of some words


Your backpedaling is just as pathetic as your original statement...


----------



## 95Viper (Feb 20, 2019)

Stay on topic.
Don't be insulting other members.
Don't be bickering back and forth with each other with off topic banter.

Thanks.


----------



## medi01 (Feb 21, 2019)

satrianiboys said:


> RTX & DLSS will surely meet their glory later on
> Just like how PhysX & HairWorks are still being used today, and how everybody has a G-Sync monitor
> 
> Every Nvidia innovation is guaranteed to be a success


Thanks for chuckles.


----------



## RichF (Feb 22, 2019)

moproblems99 said:


> I mean look at the average human, logic is mostly gone.


That's the classic Star Trek TOS fallacy. Emotions are an expression of logic.

Girlfriend sells your gaming PC to buy shoes = anger.

As for the implication that humanity has become more emotional and less logical, consider how many people are on emotional depressants (anti-depressants). If anything, humanity has become less emotional due to the desire to medicate away the higher highs and lower lows. There is also the War on Drugs: back when people were free to use whatever drugs they wanted to, the resulting emotions were likely to be more intense and less rational.

Various cultures have tried a lot of things to reduce emotion. Some Buddhists aren't allowed to eat things like onions and garlic because it is believed that their flavors are too intense. The Kellogg cereal company got its start by peddling products (cereals like Corn Flakes) that were supposed to taste so bland they would prevent sexual arousal.



Nxodus said:


> Focus here is on the fact that it's something new.


Newness does not equal innovation.

innovation = striking improvement
evolution = incremental improvement
iteration = possibly no improvement

So, if company X releases GPU A, then GPU B, and GPU B has marginal differences, it is an iterative product. What constitutes evolution and iteration is subjective. However, innovation clearly implies a very significant improvement.

If you think DLSS is an improvement over competing technologies then you can count it as innovation. If not, then it's not innovation. It's merely change. The notion that change is always a good thing is the fallacy of liberalism. The opposing fallacy is the belief that tradition is superior to change (the fallacy of conservatism).

Coming up with clever new ways to trick people into spending their money (marketing innovation) counts as innovation if you're a stockholder but it's not in the interest of the typical product buyer.


----------



## moproblems99 (Feb 22, 2019)

RichF said:


> If anything, humanity has become less emotional



Maybe so but humanity hasn't become any more logical.


----------



## mtcn77 (Feb 22, 2019)

RichF said:


> If you think DLSS is an improvement over competing technologies then you can count it as innovation. If not, then it's not innovation. It's merely change. The notion that change is always a good thing is the fallacy of liberalism. The opposing fallacy is the belief that tradition is superior to change (the fallacy of conservatism).
> 
> Coming up with clever new ways to trick people into spending their money (marketing innovation) counts as innovation if you're a stockholder but it's not in the interest of the typical product buyer.


You are forgetting this is a businesswise decision to shove more transistors to take-in contemporaneous solutions.


----------



## RichF (Feb 24, 2019)

mtcn77 said:


> You are forgetting this is a businesswise decision to shove more transistors to take-in contemporaneous solutions.


What? I have no idea what you're trying to say.


----------



## mtcn77 (Feb 24, 2019)

RichF said:


> What? I have no idea what you're trying to say.


----------



## Super XP (Mar 5, 2019)

Disable DLSS for the best possible Picture Quality. 
That's it.



satrianiboys said:


> RTX & DLSS for sure will meet it's glory later on
> Just like how phsyx & hairworks is still being used today, and how everybody have a g-sync monitor
> 
> Every Nvidia innovations is guaranteed to be a success


I see the sarcasm in your post. As for HairWorks, it doesn't look as good as some might think; perhaps it's a matter of opinion whether one likes it or not.
As for DLSS, that's complete garbage.



> Raevenlord said:
> Of course, *MAD* may only be speaking from the point of view of a competitor that has no competing solution.



AMD has no reason to compete with something that's already proven to be a complete failure.


----------



## dont whant to set it"' (Mar 6, 2019)

odly why nVidia has not came up for a name by theyr own to be given to their next "generation whatevers" by themselves, yet they trashed  hystory
ps: too drunk to grammar check.
Le: I have came up with at least a name, just ask Entropy. 
2nd Le: grammar on first "Lateredit"


----------



## dont whant to set it"' (Mar 8, 2019)

does it matter that Nvidia used the "tensor" word, without fathomisingsly it at all. they are so broke on naming ideeeas like since a douzine years ago(we , us as Nvidia are so desperate we go trash history for cash because we cannot  come up with names for our own concepts because whatwe call as being our own conncepts are not actualy our onw we just trashed history for a profit allthewhile delivering pardon my French : shit . )
ps: some of my 2 cents being randomly drunk


----------



## mtcn77 (Mar 8, 2019)

dont whant to set it"' said:


> does it matter that Nvidia used the "tensor" word, without fathomisingsly it at all. they are so broke on naming ideeeas like since a douzine years ago(we , us as Nvidia are so desperate we go trash history for cash because we cannot  come up with names for our own concepts because whatwe call as being our own conncepts are not actualy our onw we just trashed history for a profit allthewhile delivering pardon my French : shit . )
> ps: some of my 2 cents being randomly drunk


I like random trolls intermittently.


----------



## mandelore (Apr 12, 2019)

I have changed my mind on DLSS: having played Metro Exodus on a 2080 Ti since the updates, I can say the blur introduced is very minimal now, especially when upscaling the resolution. It actually provides a nice trade-off for high-resolution performance when paired with ray tracing. The first examples looked terrible; now, when switching DLSS on and off, there is such a small degree of quality loss that it can be compensated for by rendering at higher resolutions if your rig can manage it.

My updated Two Cents


----------



## FordGT90Concept (Apr 12, 2019)

But remember that DLSS is only a thing to use those tensor cores that would otherwise sit idle. If those tensor-core transistors were instead put to work on rendering, you could get the higher resolution/higher framerate without the blurriness. In a graphics product, the tradeoff is stupid. Put tensor cores on compute products. Hell, make discrete tensor cards for systems that need them and divorce them entirely from the existing product stack.


----------



## Vayra86 (Apr 13, 2019)

FordGT90Concept said:


> But remember that DLSS is only a thing to use those tensor cores that would otherwise sit idle. If those tensor-core transistors were instead put to work on rendering, you could get the higher resolution/higher framerate without the blurriness. In a graphics product, the tradeoff is stupid. Put tensor cores on compute products. Hell, make discrete tensor cards for systems that need them and divorce them entirely from the existing product stack.



The reason Nvidia sections off a part of the die is so they have new growth capacity. It is very much like the quotes from Jensen above: Nvidia needs a new market now that GPUs can pretty much rip through the common resolutions. They need a new incentive for people to upgrade (4K is not it, mind you; it is still niche and only has a horizon of 1-2 generations). This underlines that in Nvidia's mind it's certainly going to be a long-term strategy item as well as a USP, as it is today. I can see the business case, and the competition is far from catching up to it.

That also underlines Nvidia's primary motive. The motive is not 'the tech/hardware is capable now'; they just repurposed technology to 'make it work', and that is the starting point from which they'll probably iterate.

Nevertheless, I do agree, given the RT implementations we've seen at this point and the die sizes required to get there. On the other hand, if you look at the way Nvidia built up Turing, it's hard to imagine getting the desired latencies etc. for all that data transport to a discrete RT/tensor-only card.


----------



## Ubersonic (Apr 13, 2019)

mandelore said:


> First examples looked terrible, now when switching DLSS on and off there is such a small degree of quality loss that can be compensated by rendering at higher resolutions if your rig can manage it.


I agree that what you say is factually correct: with DSR on, DLSS does look as good as MSAA. However, DSR lowers FPS more than MSAA does, so DLSS requiring the use of DSR to make it visually passable completely defeats the entire purpose of DLSS. What's the point of using a feature that cuts out the performance overhead of AA if you have to add an even bigger performance overhead to stop it looking like ****?


----------



## mtcn77 (Apr 13, 2019)

Ubersonic said:


> I agree that what you say is factually correct: with DSR on, DLSS does look as good as MSAA. However, DSR lowers FPS more than MSAA does, so DLSS requiring the use of DSR to make it visually passable completely defeats the entire purpose of DLSS. What's the point of using a feature that cuts out the performance overhead of AA if you have to add an even bigger performance overhead to stop it looking like ****?


DLSS uses FP16, which is quarter precision, so to speak, yet precisely equal to the highest colour-channel bit depth. That is where the gain comes from: you don't need any higher colour precision, and even INT8 is sufficient for the desktop. You can save a lot in the render backends, which are Nvidia's biggest weakness compared to AMD.




Your float is overflow-safe so long as the division uses a higher bit depth than the addition when taking the average of pixels.
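The overflow point can be illustrated with a minimal sketch (using 8-bit integers rather than FP16, purely for simplicity; the function names are made up): averaging two pixel values is only safe if the intermediate sum doesn't exceed the narrow type's range, so you either halve before adding or accumulate in a wider type.

```python
# Averaging two 8-bit pixel values. Summing first can exceed the 8-bit
# range; halving before adding (or accumulating in a wider type) is safe.

def avg_sum_first_u8(a, b):
    # Sum wrapped to 8 bits before dividing - wrong for bright pixels.
    return ((a + b) & 0xFF) // 2

def avg_safe_u8(a, b):
    # Halve first, then restore the lost low bit: never exceeds 8 bits.
    return (a >> 1) + (b >> 1) + (a & b & 1)

a, b = 200, 220
print(avg_sum_first_u8(a, b))  # 82  (420 wrapped to 164, then halved)
print(avg_safe_u8(a, b))       # 210 (the correct average)
```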




Figure 1. Tensor cores *signficantly* accelerate FP16 and INT8 matrix calculations
*Signficantly* Really, Nvidia?


----------



## mandelore (Apr 13, 2019)

Ubersonic said:


> I agree that what you say is factually correct: with DSR on, DLSS does look as good as MSAA. However, DSR lowers FPS more than MSAA does, so DLSS requiring the use of DSR to make it visually passable completely defeats the entire purpose of DLSS. What's the point of using a feature that cuts out the performance overhead of AA if you have to add an even bigger performance overhead to stop it looking like ****?



That was my original argument, but actually using it, I have found it works out pretty well. Not perfect, but the realism afforded by ray tracing requires the performance boost from DLSS to make it work. And it does work. I am really, like REALLY, finicky about smooth gameplay; I can barely tolerate sub-80 FPS, as I always seem to see frame jitter etc. The FPS boost and lighting quality work well enough that the game feels smooth and detailed.
Obviously this is just my experience on my hardware with just one game implementation, but it has notably improved my enjoyment of this game.


----------

