# Ray Tracing and Variable-Rate Shading Design Goals for AMD RDNA2



## btarunr (Dec 13, 2019)

Hardware-accelerated ray tracing and variable-rate shading will be the design focal points of AMD's next-generation RDNA2 graphics architecture. Microsoft's reveal of its Xbox Series X console attributed both features to AMD's "next generation RDNA" architecture (which logically happens to be RDNA2). The Xbox Series X uses a semi-custom SoC that combines CPU cores based on the "Zen 2" microarchitecture with a GPU based on RDNA2. The SoC is highly likely to be fabricated on TSMC's 7 nm EUV node, as the RDNA2 graphics architecture is optimized for it; this would mean an optical shrink of "Zen 2" to 7 nm EUV. Besides the SoC powering Xbox Series X, AMD is expected to leverage 7 nm EUV in 2020 for its RDNA2 discrete GPUs and for CPU chiplets based on its "Zen 3" microarchitecture.

Variable-rate shading (VRS) is an API-level feature that lets GPUs conserve resources by shading certain areas of a scene at a lower rate than others, without perceptible difference to the viewer. Microsoft developed two tiers of VRS for its DirectX 12 API: tier 1 is currently supported by the NVIDIA "Turing" and Intel Gen11 architectures, while tier 2 is supported only by "Turing." The current RDNA architecture supports neither tier. Hardware-accelerated ray tracing is the cornerstone of NVIDIA's "Turing" RTX 20-series graphics cards, and AMD is catching up to it; Microsoft has already standardized it on the software side with the DXR (DirectX Raytracing) API. A combination of VRS and dynamic render resolution will be crucial for next-gen consoles to achieve playability at 4K, and to even boast of being 8K-capable.
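As a rough back-of-the-envelope illustration of why coarse shading rates save work (a toy model, not AMD's or Microsoft's actual implementation; the function name and frame split are made up):

```python
# Hypothetical model of variable-rate shading (VRS): a coarse shading
# rate of WxH shades one sample per WxH pixel block, so a 2x2 rate cuts
# pixel-shader invocations in that region by roughly 4x.

def shader_invocations(width, height, rate=(1, 1)):
    """Pixel-shader invocations for a region shaded at a coarse rate.

    rate=(2, 2) means one shading sample covers a 2x2 pixel block.
    Partial blocks at the edges still need one invocation each,
    hence the ceiling division.
    """
    rw, rh = rate
    return -(-width // rw) * -(-height // rh)  # ceil-div without math.ceil

# A 3840x2160 frame where the top half (e.g. sky) is shaded at 2x2
# and the bottom half at full 1x1 rate:
full = shader_invocations(3840, 2160)                      # baseline
mixed = (shader_invocations(3840, 1080, (2, 2))            # coarse top half
         + shader_invocations(3840, 1080))                 # full-rate bottom

savings = 1 - mixed / full
print(f"invocations saved: {savings:.1%}")  # 37.5% in this toy split
```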





*View at TechPowerUp Main Site*


----------



## ratirt (Dec 13, 2019)

Wonder how this VRS works in practice. Hope it is nothing like NV DLSS blurriness cause that would be a disaster.


----------



## londiste (Dec 13, 2019)

ratirt said:


> Wonder how this VRS works in practice. Hope it is nothing like NV DLSS blurriness cause that would be a disaster.


Parts of the screen, at some pretty fine granularity, are shaded at a lower resolution.
Wolfenstein II was the first game to implement it, with a minor but measurable performance boost.
UL has a VRS feature test out as part of 3DMark: https://www.techpowerup.com/261825/...feature-test-for-variable-rate-shading-tier-2

There are some documents and videos that have pretty good explanation of how this works:








Intel Developer Zone (software.intel.com)

VRWorks - Variable Rate Shading (VRS) (developer.nvidia.com): Variable Rate Shading is a Turing feature that increases rendering performance and quality by varying the shading rate for different regions of the frame. VRS Wrapper makes it easier for developers to integrate gaze tracking capabilities of their HMDs for foveated rendering.

----------



## eidairaman1 (Dec 13, 2019)

ratirt said:


> Wonder how this VRS works in practice. Hope it is nothing like NV DLSS blurriness cause that would be a disaster.



Radeon Rays


----------



## ratirt (Dec 13, 2019)

So does this mean the rx 5800 (or basically the upcoming NAVI cards in 2020) will have this?


----------



## DeathtoGnomes (Dec 13, 2019)

eidairaman1 said:


> Radeon Rays


Magik is happening again!


----------



## eidairaman1 (Dec 13, 2019)

ratirt said:


> So does this mean the rx 5800 (or basically the upcoming NAVI cards in 2020) will have this?



Plausible.


----------



## xkm1948 (Dec 13, 2019)

But, but Real time ray tracing are gimmicks!

—-certain fanbois


----------



## Vya Domus (Dec 13, 2019)

xkm1948 said:


> But, but Real time ray tracing are gimmicks!
> 
> —-certain fanbois



If more than one manufacturer does something it means it's not a gimmick!

-braindead logic


----------



## ratirt (Dec 13, 2019)

I don't need RR that much; as long as I can get 60 FPS in 4K I'm OK. I won't get mad if I have to wait longer for the RR+VRS feature. What I'm concerned about is that if this RR is happening with the 5800-model Navi and up, it could mean release delays due to new feature implementation and whatever, and I really don't want that to happen. I need a new graphics card.


----------



## londiste (Dec 13, 2019)

RX 5800 is not really confirmed, is it? AMD would be in a bit of trouble trying to fit big Navi into 250 W, especially considering the RX 5700 XT is a 225 W card.
RDNA2 is more likely to be a next-generation thing, an RX 6000 series or whatever its eventual name will be.


----------



## spectatorx (Dec 13, 2019)

ratirt said:


> So does this mean the rx 5800 (or basically the upcoming NAVI cards in 2020) will have this?


From the news I'm reading about Navi, I'm confused and leaning more and more towards the conclusion that there will be no 5800/5900 at all. There is absolutely no info about these cards; it seems like Navi 2/RDNA2 is what comes next, and that is sad to me, as I expected a high-end GPU from AMD to show up at the end of this year or the beginning of 2020. It only means I will sit on my temporary-upgrade 580 for much longer than I expected.

Variable-rate shading and ray tracing are things I do not care about at all. The first one reduces image quality. The second one, done properly and not faked as it is with, for example, the RTX library (yes, RTX is a simplified and in many aspects faked form of "raytracing," and it still kills performance too much), requires drastic changes to graphics rendering overall, and still tons of performance that we will not reach for many decades.


----------



## Recus (Dec 13, 2019)

xkm1948 said:


> But, but Real time ray tracing are gimmicks!
> 
> —-certain fanbois



... and 5 years too early...
...won't be a mainstream thing until it's offered on "all ranges [of GPUs] from low-end to high-end..
...RT cores wasting die space.


----------



## efikkan (Dec 13, 2019)

londiste said:


> RX 5800 is not really confirmed, is it? AMD should be in a bit of a trouble trying to fit into 250W with big Navi especially considering RX 5700XT is a 225W card.
> RDNA2 is more likely to be a next generation thing, RX 6000-series or whatever its eventual name will be.


I would be surprised if they have a larger Navi ready, and if they did, the TDP would probably be in the 300-350W range.
Also don't forget that RX 5700 was renamed "last minute", even some official photos displayed "RX 690".


----------



## BakerMan1971 (Dec 13, 2019)

Well, I welcome ray tracing from the Red Team. It's great that we all get faster and faster cards, but if the visuals don't change and developers just continue to bloat existing engines in a vain attempt to offer us 'new' experiences, we need those new technologies to take the next step.

Hope it makes sense, I am all tired and dizzy.


----------



## cucker tarlson (Dec 13, 2019)

ratirt said:


> Wonder how this VRS works in practice. Hope it is nothing like NV DLSS blurriness cause that would be a disaster.


well it's not like nvidia already does it in wolfenstein games
and why would that be blurry? it's not an image reconstruction technique.


----------



## Maelwyse (Dec 13, 2019)

I continue to hope that we get a frame-rate improvement, rather than feature expansion.*
I'm running an (admittedly top-of-the-line) card from a generation ago (a 1080 Ti); it's 2 years old. I would upgrade if I were to get a reliable, significant increase in framerates. A new feature that drops framerates? No.

Now, if you were to tell me I could have both, I'd consider it, but I'm not going to buy a card that offers a 10-25% framerate improvement over a 2-year-old card for about 2x the price.

*I'm almost convinced that VRS is the key piece I am hungry for. I've taken a hard look at a couple different samples, and if it works like I've seen, I'd certainly accept the image quality 'drop' for part of the screen, for the framerate improvements they have been touting.

And if that came with RR, meh. I wouldn't complain, but it wouldn't be the reason I buy a new card. Besides, HOW many games support hardware ray tracing so far? 7? 8? Of which I'd actually think about paying for 2 or so?


----------



## Steevo (Dec 13, 2019)

xkm1948 said:


> But, but Real time ray tracing are gimmicks!
> 
> —-certain fanbois




There is more than one way to skin a cat.


Hardware support with compressed vector tables to reduce the computational overhead of real time is one. Allow a CPU core or two to work out basic angle-dependent setup info, then hand that off the same way we got angle-independent anisotropic filtering.

How many games support ray tracing again? PhysX hardware-accelerated fluff still in the news? Overburden of tessellation? HairWorks?

Nvidia deserves the flak for what they do, just like AMD deserves so much shit it would take a bulldozer to move it, except it overheated with its "real men" cores.


----------



## Chomiq (Dec 13, 2019)

Wider adoption of RT can only help the consumer, while VRS can help deliver better performance to the entire ecosystem, be it PC or consoles.


----------



## TheGuruStud (Dec 13, 2019)

Microsoft is still out of touch with reality.

"Hey, want to one up ourselves with the dumbest naming since Xbox One and Xbox X?"
"Sure, let me hear it, Brain Dead Idiot Employee # 2!"
"Xbox Series X!"
"You did it. You crazy son of a bitch, you did it."

You know how they came up with the hardware specs? They just copied Sony leaks and rumors.


----------



## SIGSEGV (Dec 13, 2019)

xkm1948 said:


> But, but Real time ray tracing are gimmicks!
> 
> —-certain fanbois



It heavily taxes the GPU (performance penalty).
Those features don't even change the way you game.
GIMMICKS!


----------



## windwhirl (Dec 13, 2019)

xkm1948 said:


> But, but Real time ray tracing are gimmicks!
> 
> —-certain fanbois



Eh, the thing is that Nvidia wanted to sell RTRT as if it was the Holy Grail of graphics. Because it wasn't all that much of an improvement and severely reduces FPS, it had mixed reception.



Recus said:


> ... and 5 years too early...
> ...won't be a mainstream thing until it's offered on "all ranges [of GPUs] from low-end to high-end..
> ...RT cores wasting die space.



Yeah, it's too early. But at some point it had to arrive to consumer gaming space, either by Nvidia's hand or AMD's, or even Intel's. If anything, Nvidia now has more consumer feedback which will help enhance their RTRT implementation.


----------



## NC37 (Dec 14, 2019)

No Radeon card has much of any value till they get this done.


----------



## Manoa (Dec 14, 2019)

what good is "8k capable" and "ray traced" when you are blurring it with VRS ?! what a tards


----------



## InVasMani (Dec 14, 2019)

Manoa said:


> what good is "8k capable" and "ray traced" when you are blurring it with VRS ?! what a tards


VRS is great; it's one of the more exciting new GPU features, in reality. A simple, tangible performance boost from diluting parts of the scene you care less about in the grand scheme. Why anyone would view that as a bad trade-off is beyond me. It's about utilizing resources where they can be put to best use; plain horsepower means very little when you have no traction, which is exactly why drag cars do burnouts before they race: to warm those f*ckers up a little so they grip the road when they goose it, aka 3, 2, 1, punch it.



windwhirl said:


> Eh, the thing is that Nvidia wanted to sell RTRT as if it was the Holy Grail of graphics. Because it wasn't all that much of an improvement and severely reduces FPS, it had mixed reception.
> 
> 
> 
> Yeah, it's too early. But at some point it had to arrive to consumer gaming space, either by Nvidia's hand or AMD's, or even Intel's. If anything, Nvidia now has more consumer feedback which will help enhance their RTRT implementation.


To be perfectly fair, it kind of is the holy grail of more realistic graphics; it's just widely considered premature by probably half a generation. They jumped the gun, but the software won't show up without the hardware arriving at the same time, so it'll help ease things in that direction. No, RTRT today isn't the holy grail of graphics; it's just the primer coat before the real paint is applied, which then gets a few coats of clear coat over the top. RTRT hardware is just the first stage of many additional coats; I mean, rasterization is still getting new paint jobs.


----------



## efikkan (Dec 14, 2019)

InVasMani said:


> VRS is great that's one of the more exciting new GPU features in reality. A simple and easy tangible performance boost by diluting parts of the scene you care less about in the grand scheme why would anyone view that as a bad trade off is beyond me. Utilizing resources where they can be put to best usage plain and horsepower means very little when you have no traction which is exactly why drag cars do burnouts before they race to warm those f*ckers up a little to grip the road when they goose it aka 3, 2, 1 punch it.


VRS is a technology that I've wanted for 10 years, but not as a way to reduce details in parts of the scene, only to improve select parts. I think this technology has great potential, but like with many other advanced techniques, it needs to be utilized _right_, otherwise the end result is bad.

Let's say you have a scene with a nice landscape in the lower half of the screen, and a sky (just a skydome or skybox) in the upper half. You might think that rendering the upper half with far fewer samples would be a good way to optimize away wasteful work. But the truth is that low-detail areas like skies are very cheap to render in the first place, so you will probably end up with a very blurry area and only marginal performance savings.

To make matters worse, this will probably only increase the frame rate variance (if not applied very carefully). If you have a first-person game walking a landscape, looking straight up or down will result in very high frame rates while looking straight forward into an open landscape will give low performance. Even if you don't do any particular fancy LoD algorithms, the GPU is already pretty good at culling off-screen geometry, and I know from experience that trying to optimize away any "unnecessary" detail can actually increase this frame rate variance even more.
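That trade-off can be put in numbers with a toy cost model (all figures invented for illustration; `frame_cost` is a hypothetical helper, not anything from a real API):

```python
# Toy frame-cost model: shading a cheap region (sky) at a coarse rate
# saves little overall, because the expensive region dominates the
# frame time. All per-sample costs here are made up.

def frame_cost(regions):
    """regions: list of (pixel_count, cost_per_sample, rate_divisor)."""
    return sum(pixels / rate * cost for pixels, cost, rate in regions)

PIXELS = 3840 * 1080  # each half of a 4K frame

# Sky is ~10x cheaper per sample than the landscape in this toy model.
baseline = frame_cost([(PIXELS, 1.0, 1),    # sky, full rate
                       (PIXELS, 10.0, 1)])  # landscape, full rate
coarse_sky = frame_cost([(PIXELS, 1.0, 4),    # sky at 2x2 (4x fewer samples)
                         (PIXELS, 10.0, 1)])  # landscape unchanged

# Quartering the sky's sample count shaves only ~7% off the frame.
print(f"frame-time saved: {1 - coarse_sky / baseline:.1%}")
```

The point of the numbers: the 4x reduction applies only to the term that was already small, so the visible blur buys almost nothing.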


----------



## Zach_01 (Dec 14, 2019)

This is very interesting to me... software-based ray tracing. Pretty good results, IMO.








CRYENGINE | Crytek Releases Neon Noir, A Real-Time Ray Tracing Demonstration For CRYENGINE (www.cryengine.com): The new feature used to create a futuristic drone scene is currently in development and will come to CRYENGINE this year.

----------



## ratirt (Dec 14, 2019)

cucker tarlson said:


> well it's not like nvidia already does it in wolfenstein games
> and why would that be blurry ? it's not an image reconstruction tehcnique.


In the VRS video linked earlier, it does get blurry when the resolution is turned really far down, so pointing out that it's not an image reconstruction technique doesn't address that.
I'm more worried about the implementation, so that it won't end up like NV's DLSS. Nvidia does it? You mean Nvidia's using it.


----------



## medi01 (Dec 14, 2019)

NC37 said:


> No Radeon card has much of any value till they get this done.


In times when less than 1% of games support it, and even those that do make a 2080 Ti sweat, yeah, I mean, a must-have/no-feature-no-buy feature.
Obviously.

Because once we get into full-throttle RT, "non-RT cards are not supported," some time in 2025, today's cards will absolutely be adequate to run it.
Apparently.



xkm1948 said:


> But, but Real time ray tracing are gimmicks!
> 
> —-certain fanbois



Low-ray-count ray tracing, producing hell of a noisy image that gets heavily "denoised" to produce a handful of effects in some otherwise traditionally rasterized scenes... is not a gimmick?

Because, let me guess, it has "RT" and "real time" in it? 

Clearly, only fanbois would disagree with it!

Exciting times!


----------



## cucker tarlson (Dec 14, 2019)

medi01 said:


> In times when less than 1% of games support it, and even those who support make 2080Ti sweat, yeah, I mean, a must have/no feature-no buy feature.
> Obviously.
> 
> Because once we get into full throttle RT "non-RT cards are not supported" some time in 2025, today's card will absolutely be adequate to run it.
> ...


well better go back to buying cards that can't do none of it.
isn't it good to have a choice...
about that 1%..... look how many triple-A games out now or announced for 2020 have rtx support.
blurry and noisy? depends
this is rtx+dlss in Control


----------



## Totally (Dec 14, 2019)

xkm1948 said:


> But, but Real time ray tracing are gimmicks!
> 
> —-certain fanbois







Do tell, what has been your user experience/utilisation of RTRT over the course of the ownership of your RTX card? Do you believe that you will fully take advantage of its RTRT capabilities before replacing it with a next-gen nvidia/amd card that can actually handle such?

P.S. If it's a gimmick at present, there is nothing wrong with saying so.


----------



## medi01 (Dec 14, 2019)

cucker tarlson said:


> well better go back to buying cards that can't do none of it.


Hell yeah, how dare I!?!?!?!
Oh wait:












cucker tarlson said:


> blurry and noisey ?


I guess it is too much to expect users on a techy forum to have a basic understanding of the underlying technology...


----------



## Maelwyse (Dec 14, 2019)

Totally said:


> View attachment 139366
> 
> Do tell, what has been you user experience/utilisation of RTRT over the course of the ownership of your RTX card? Do you believe that you will fully take advantage it's RTRT capabilities before replacing it with a next gen nvidia/amd card that can actually handle such?
> 
> P.S. If it's a gimmick at present there is nothing wrong with doing so.



So, err, dumb question. RT was originally developed by a software company, not Nvidia. Nvidia took it and developed a hardware method to (attempt to) make it have an "acceptable impact" and release it to the market earlier than said software company's solution. I feel that was a brilliant marketing move, but a poor end-user solution. RTRT would be great if it had little-to-no real-world impact on framerates and widespread adoption in the game-development world. If there is a software solution, and hardware that can "add this in" without dropping framerates below targets such as 60 FPS for 60 Hz gaming, 144 FPS for 144 Hz gaming, or 240 for 240 Hz gaming, why would we care whether the card has RT cores or not? Now, are we going to get there in the current or coming generation? Almost certainly not. But since development currently relies on those RT cores for programming models, aren't we actually delaying wider adoption in order to get it 'now', IF we do get there?

VRS, on the other hand, is a potential framerate improvement, leaving us in a "better" state, provided it doesn't significantly impact image quality. Why would anyone NOT want this, IF it works as advertised?


----------



## londiste (Dec 14, 2019)

Zach_01 said:


> This is very interesting to me... the software based RayTracing. Pretty good results IMO
> 
> 
> 
> ...


There is no good reason to be very keen on software-based raytracing (running on generic shaders, as in this case).

Until Crytek implements DXR there is no direct comparison. So far, Neon Noir running on a Vega 56/64 is about on par with it running on a GTX 1080. Anything DXR cards can do (mainly a considerably larger number of rays) is on top of that. If you want a comparison, check the differences between GTX and RTX cards in Battlefield V DXR - the RTX 2060 should be generally on par with the GTX 1080, so it is a direct enough comparison - since BFV employs DXR for the same effect that Neon Noir uses its RT shaders for.



ratirt said:


> On the VRS video linked earlier it does get blurry when the res is really down. So your image reconstruction technique mention here is simply pointless.
> I'm more worried about the implementation of this so that it wont end up like NV's DLSS. Nvidia does it? You mean NVidia's using it.


VRS is in DX12, and both Nvidia and Intel have this capability deployed. I believe Nvidia also has OpenGL and Vulkan extensions available for it; not sure about Intel.
VRS is not an image reconstruction technique. It does reduce image quality in parts of the image, but whether to use VRS is purely and entirely up to the developer. When used well - in parts of the screen that do not benefit from more detail, with quality lowered to an acceptable degree - it provides a small but measurable performance boost for a minimal image-quality penalty.
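For what it's worth, the tier-2 "per-region" choice can be sketched as building a per-tile shading-rate map; this is a hypothetical illustration (the contrast metric, names, and thresholds are all made up - the real D3D12 mechanism is a shading-rate image resource plus per-draw and per-primitive rates):

```python
# Sketch of a tier-2-style shading-rate image: one rate per screen
# tile, chosen from how much detail (luminance contrast) the tile
# holds. Tile size and thresholds are invented for illustration.

RATE_1X1 = "1x1"  # full rate: high-detail tiles
RATE_2X2 = "2x2"  # quarter rate: low-detail tiles
RATE_4X4 = "4x4"  # sixteenth rate: nearly flat tiles

def tile_contrast(tile):
    """Cheap detail metric: max minus min luminance within the tile."""
    flat = [px for row in tile for px in row]
    return max(flat) - min(flat)

def shading_rate_for(tile, hi=0.20, lo=0.05):
    """Pick a coarse rate: full detail only where contrast is high."""
    c = tile_contrast(tile)
    if c > hi:
        return RATE_1X1
    return RATE_2X2 if c > lo else RATE_4X4

# Three 2x2 luminance tiles: detailed edge, soft gradient, flat sky.
edge = [[0.1, 0.9], [0.8, 0.2]]
soft = [[0.40, 0.50], [0.45, 0.48]]
sky  = [[0.70, 0.71], [0.70, 0.72]]

print([shading_rate_for(t) for t in (edge, soft, sky)])
# edge keeps full rate; the flat sky drops to the coarsest rate
```

The developer controls the thresholds, which is exactly why the quality outcome is "purely up to the developer."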


----------



## cucker tarlson (Dec 14, 2019)

medi01 said:


> Hell yeah, how dare I!?!?!?!
> Oh wait:
> 
> 
> ...


I told you, you can pay the same and be happy to get less if you wish; you have a choice.


----------



## Totally (Dec 14, 2019)

Maelwyse said:


> So, err. dumb question. RT was originally developed by a software company, not Nvidia. Nvidia took it and developed a hardware method to (attempt to) make it have an "acceptable impact" and release it to the market earlier than said software company's solution. I feel that was a brilliant marketing move, but a poor end-user solution. RTRT would be great it if had little-to-no-real-world impact on framerates, and widespread adoption to the game development world. If there is a software solution, and hardware that can "add this in", without dropping framerates below "targets" such as 60 FPS for 60hz gaming, or 144 FPS for 144HZ gaming, or 240 for 240hz gaming, why would we care that the card has RT cores or not? Now, are we going to get there in the current or coming generation, almost certainly not. But since development is currently relying on those RT cores for programming models, we're actually delaying development adoption to get it 'now' IF we do get there?
> 
> VRS, on the other hand, is a potential framerate improvement, leaving us with "better' state, provided it doesn't significantly impact image quality. Why would anyone NOT want this, IF it works as advertised?



This goes back to the comment I quoted. I'm probably one of those "certain people" claimed to have said "RT is a gimmick," but that is out of context. How Nvidia is implementing it is a gimmick; no matter how you cut it, with the current batch of cards it's not practical nor useful, therefore a gimmick. RT itself was never thought or said to be superfluous by me personally. That out of the way: it will probably get there eventually, but I feel that had Nvidia not snatched it up, we would have seen acceptable RT sooner, because the collaborative element has been removed - to work on RT from that point, you either develop your own from scratch or go through Nvidia and use their hardware.


----------



## havox (Dec 15, 2019)

efikkan said:


> I would be surprised if they have a larger Navi ready, and if they did, the TDP would probably be in the 300-350W range.
> Also don't forget that RX 5700 was renamed "last minute", even some official photos displayed "RX 690".


I'd take a 350 W card that can pull 4K 60 FPS in Ubisoft's poorly optimized yearly AAAs. From either company.
Back in the day I owned an Asus Mars II, which was Bitchin'fast!3D2000 personified. It was a 365 W TDP, triple-slot, 3x 8-pin monstrosity, and yes, it had over 20000 BungholioMarks.


Spoiler: Wow!











I had no problem with heat, and it lasted over 3 years. Good times.


----------



## Fluffmeister (Dec 15, 2019)

I do love these pissing contests. Fermi famously got no love from the ATi/AMD crowd for being hot and power hungry, but it was clearly the faster, more forward-looking tech. And in the grand scheme of things, AMD cards in recent years have made Fermi look kind to the environment!

Turing packs all this tech already, and can only improve once Nvidia goes down to 7 nm too. RTRT may realistically remain years off, but things like VRS coming to next-gen consoles can certainly offer some nice benefits as detail and res go up. AMD is playing catch-up, but there is no need to get too butt-hurt, people.


----------



## InVasMani (Dec 15, 2019)

efikkan said:


> VRS is a technology that I've wanted for 10 years, but not as a way to reduce details in parts of the scene, only to improve select parts. I think this technology has great potential, but like with many other advanced techniques, it needs to be utilized _right_, otherwise the end result is bad.
> 
> Let's say you have a scene with a nice landscape in the lower half of the screen, and a sky (just a skydome or skybox) in the upper half. You might think that rendering the upper half in much fewer samples might be a good way to optimize away wasteful samples. But the truth is that low detail areas like skies are very simple to render in the first place, so you will probably end up with a very blurry area and marginal performance savings.
> 
> To make matters worse, this will probably only increase the frame rate variance (if not applied very carefully). If you have a first-person game walking a landscape, looking straight up or down will result in very high frame rates while looking straight forward into an open landscape will give low performance. Even if you don't do any particular fancy LoD algorithms, the GPU is already pretty good at culling off-screen geometry, and I know from experience that trying to optimize away any "unnecessary" detail can actually increase this frame rate variance even more.


I think you used a poor example, because it's unlikely that scenario would be applied, or only sparingly. As far as frame-time variance is concerned, the hardware remains the same in either case; it's simply prioritizing render tasks a bit differently within the GPU. If anything, VRS could be used to improve frame-time variance in the worst-case scenarios by selectively switching a few things to lower quality when frame rates dip below certain FPS trigger thresholds, till they normalize. Sure, it could get used poorly, but it could get used very well at the same time, and having it as an option doesn't hurt.

Here's an example: the GPU recognizes the frame rate is below 60 FPS, or say below 30 FPS, which is even worse (input lag gets really crappy really quickly below that point). AA stays at whatever high-quality setting you determine for 75% of the screen, and the other 25% gets set lower when the trigger point kicks in, until the frame rate normalizes. Frame-rate variance improves for a temporary bit of image-quality reduction; in the grand scheme, perhaps a good trade-off given the scenario described. That could be applied to more than AA - shading, lighting, and geometry, as well as other stuff. It boils down to how it gets used and applied, but VRS has the promise of improving both quality and performance in variable ways. It just depends how it gets injected into the render pipeline.
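That trigger-threshold idea amounts to a small feedback controller; a hypothetical sketch (rates and thresholds invented, with a hysteresis band added so the quality level doesn't flicker right at the boundary):

```python
# Sketch of FPS-triggered VRS: drop the shading rate for a low-priority
# slice of the screen when frame rate dips below a threshold, and
# restore it only once the frame rate clearly recovers.

def adaptive_rate(fps, current_rate, low=30.0, high=36.0):
    """Return the coarse-shading divisor for the low-priority region.

    1 = full quality; 4 = one sample per 2x2 pixel block.
    `low`/`high` form a hysteresis band: degrade below `low`,
    recover only above `high`, hold steady in between.
    """
    if fps < low:
        return 4
    if fps > high:
        return 1
    return current_rate  # inside the band: keep whatever we had

# Simulate a dip and recovery across several frames.
rate = 1
trace = []
for fps in [60, 45, 28, 31, 33, 40, 58]:
    rate = adaptive_rate(fps, rate)
    trace.append(rate)
print(trace)  # quality drops during the dip, recovers afterwards
```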



ratirt said:


> On the VRS video linked earlier it does get blurry when the res is really down. So your image reconstruction technique mention here is simply pointless.
> I'm more worried about the implementation of this so that it wont end up like NV's DLSS. Nvidia does it? You mean NVidia's using it.


Plenty of video streams do variable-rate adjustments similar to that based on download speed, because of traffic congestion. I honestly wouldn't mind a bit of temporary, selective DLSS smearing if my FPS dipped below a frame-rate threshold I determined; it beats choppy frame rates and sloppy input lag.



cucker tarlson said:


> well better go back to buying cards that can't do none of it.
> isn't it good to have a choice...
> about that 1%.....look how many triple A games out now or announced for 2020 have rtx support.
> blurry and noisey ? depnds
> ...


That scene looks like it's been post-processed with a Charmin Ultrasoft soap-opera effect. I cannot in good faith say I'm fond of the look. My vision isn't even 20/20 - I'm blind as a bat w/o glasses - but that would drive me nuts personally, so I'd hate to think what people with good vision make of that dull mess. It practically looks like an mClassic upscaling a 720p console game; in terms of texture detail it's horrible, quite frankly. The quality simply isn't there, bottom line: you can't make a Blu-ray-quality video out of a DVD-ROM, which is also true of all the sacrifices made - fewer light passes, heavy denoising - to run RTRT poorly at an unsteady frame rate.



Totally said:


> View attachment 139366
> 
> Do tell, what has been you user experience/utilisation of RTRT over the course of the ownership of your RTX card? Do you believe that you will fully take advantage it's RTRT capabilities before replacing it with a next gen nvidia/amd card that can actually handle such?
> 
> P.S. If it's a gimmick at present there is nothing wrong with doing so.


Let's not kid ourselves here: the next generation of nvidia/amd cards won't be tricking anyone's eyes into thinking RTRT Crysis is real life either.



medi01 said:


> Hell yeah, how dare I!?!?!?!
> Oh wait:
> 
> 
> ...


Speak of the devil, or close enough. Actually, that demo was one of the better examples of RTRT-type effects, aside from that staged Star Wars demo Nvidia did that didn't materialize into actual playable games - go figure, who would've guessed it. Crytek did a pretty decent job, though definitely not a perfect one, simply because single-GPU hardware won't get us to GoPro Hero realism at this point in time; we've got a ways to go still before we reach that point.



Totally said:


> This goes back to the comment I quoted, I'm probably one of those certain people claimed to say "RT is a gimmick", that is out of context. How Nvidia is implementing it is a gimmick, no matter how you cut it with the current batch of cards it's not practical nor useful therefore a gimmick. RT itself was never thought or said to be superfluous by myself personally. That out of way will probably get there eventually but feel like had nvidia not snatched it up it would we would have seen acceptable RT sooner because the collaborative element has been removed, working on RT from that point either develop your own from scratch or go through Nvidia and use their hardware.


You make a bit of a good point: RTRT could be viewed as a bit of a preemptive **** block attempt by Nvidia on ray tracing with developers, one that will ultimately slow the progression of ray tracing. No one wants another HairWorks or PhysX scenario down the road for ray tracing, but that could be right where things are headed. Luckily AMD is in the next-gen consoles, so we might avoid that scenario - a good chess-move follow-up by Lisa Su. I'm sure RTRT will improve in the coming years and heat up further, but at this stage it's safe to call it a bit of a gimmick, given how it both looks and performs; neither is optimal, and both need tons more polish before people consider them high quality and highly desirable. I don't think too many people bought RTX cards for RTRT alone, but rather for both performance/efficiency and the RTX feature set that includes RTRT among other tech, like mesh shading and DLSS.



Fluffmeister said:


> Turing packs all this tech already, and can only improve once Nvidia go down to 7nm too, RTRT may realistically remain years off, but things like VRS coming to next gen consoles can certainly offer some nice benefits as details and res go up.


Pretty much agreed. Nvidia moving to 7 nm will certainly bring about further advancements, but so too will AMD moving to 7 nm EUV and increasing its GPU division's R&D budget over time, as it continues to pay down debt from that ATI merger of years past. Intel's CPU stumbles will only benefit AMD, especially given AMD's higher focus on the CPU side at present. AMD is definitely in a good position to shift gears and focus, or switch from 2WD to 4WD, at any point between CPU and GPU, so that's a good thing; its worst days appear to be behind it. AMD has its work cut out ahead, especially on the GPU side of things, but I think they'll inch their way forward and regain market share from Nvidia over the coming years. I do believe a stronger R&D budget and less debt will make a big difference in their overall competitiveness; plus, Intel's stumbles should help, and those security issues could hurt Intel a lot - they won't just be forgotten, given their scale, which keeps getting deeper.


----------



## cucker tarlson (Dec 15, 2019)

InVasMani said:


> That scene looks like it's been post-processed with a Charmin Ultrasoft soap-opera effect. I cannot in good faith say I'm fond of the look. My vision isn't even 20/20 *I'm blind as a bat w/o glasses*, but that would drive me nuts personally, so I'd hate to think what people with good vision make of that dull mess. It practically looks like an mClassic upscaling a 720p console game; in terms of *texture detail it's horrible*, quite frankly the quality simply isn't there. Bottom line: you can't make a Blu-ray-quality video out of a DVD-ROM, which is also true of all the sacrifices made for RTRT: lowering the number of light passes and denoising just to run RTRT poorly at an unsteady frame rate.



well, thanks for the elaborate description.
I don't know why it looks like that to you; maybe you do need glasses after all.




InVasMani said:


> You make a bit of a good point RTRT could be viewed as a bit of a* preemptive **** block attempt by Nvidia* to ray tracing with developers that will ultimately slow the progression of ray tracing.
> Luckily AMD is in the next gen console's so we might avoid that scenario so *good chess move follow up by Lisa Su*.


----------



## efikkan (Dec 15, 2019)

havox said:


> I'd take a 350W card that can pull 4K 60 FPS in Ubisoft's poorly optimized yearly AAAs. From either company.


Would you take a 350W card when you can get a 250W card with the same performance?
350W is pushing it in terms of cooling it without terrible noise levels.



InVasMani said:


> I think you used a poor example because it's unlikely that scenario would be applied, or only sparingly. As far as frame time variance is concerned, the hardware remains the same in either case; it's simply prioritizing render tasks a bit differently within the GPU.


I think you missed the point. The hardware is of course the same, the variance is in the workload.



InVasMani said:


> If anything, it could be used to improve frame time variance in worst-case scenarios by using VRS to selectively switch a few things to lower quality when frame rates dip below certain FPS trigger thresholds, until they normalize. Sure, it could get used poorly, but it could get used very well at the same time, and having it as an option doesn't hurt.


It is certainly possible to build an algorithm that uses performance metrics from previous frames and dynamically adjusts LoD on the fly, I have even looked into implementing something like that once. The issue is that you have to rely on the performance metrics of the last ~10 frames, so any adjustment will happen after the performance has changed, and will for this reason not reduce stutter. The best approach is to reduce the variance preemptively.

I stand by my claim that it can be used poorly, resulting in blurry scenes and in worst case flickering or artifacts.
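The lag efikkan describes can be illustrated with a tiny controller sketch (hypothetical names and thresholds, not code from any engine): because the decision uses the average of the last N frame times, the render scale only drops *after* a load spike has already produced slow frames.

```python
from collections import deque

class ReactiveScaler:
    """Adjust a render-resolution scale from recent frame times.

    This reacts only after frames have already been slow, which is
    exactly why it cannot remove the initial stutter of a load spike.
    """

    def __init__(self, target_ms=16.7, window=10):
        self.target_ms = target_ms
        self.history = deque(maxlen=window)  # last N frame times (ms)
        self.scale = 1.0                     # current resolution scale

    def end_frame(self, frame_ms):
        self.history.append(frame_ms)
        avg = sum(self.history) / len(self.history)
        if avg > self.target_ms * 1.05:      # running too slow: shrink
            self.scale = max(0.5, self.scale - 0.05)
        elif avg < self.target_ms * 0.90:    # comfortably fast: grow back
            self.scale = min(1.0, self.scale + 0.05)
        return self.scale
```

Feed it ten 16 ms frames and the scale stays at 1.0; the first 30 ms frame still changes nothing because the window average has not yet crossed the threshold, and only subsequent slow frames pull the scale down. By then the stutter has already happened, which is the argument for reducing variance preemptively instead.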


----------



## medi01 (Dec 15, 2019)

cucker tarlson said:


> I told you,you can pay the same and be happy to get less if you wish,you have a choice.


People who insist RTX "is a must" (I won't repeat "less than 1% of games, yadayada") talk about having a choice.
Ironic.

Let me elaborate: not only is RTX a gimmick at this point (a pathetic number of rays, capable of producing noisy shadow/reflection-like effects only with heavy de-noising), it is absolutely not clear how this area will develop. Whatever it will be, as with FreeSync, it won't be NV alone deciding it; heck, AMD alone could, as, wait for it:
1) AMD commands 35% of the GPU market (and is poised to grab more), but also
2) 100% of the console market (Switch is in no way capable of RT-ing anyhow) which is expected to roll out next gen consoles with GPU at 2070-2080-ish levels

Last, but not least, the screenshot you have shared makes me smile. Looks like a generic adventure to me.
Yeah, devs were able to fake reflections/shadows way before RT; it's just a matter of the effort to implement (which must be much easier with RT).



efikkan said:


> Would you take a 350W card when you can get a 250W card with the same performance?


It will depend on the price.


----------



## cucker tarlson (Dec 15, 2019)

medi01 said:


> People who insist RTX "is a must" (I won't repeat "less than 1% of games, yadayada") talk about having a choice.
> Ironic.
> 
> Let me elaborate, not only is RTX a gimmick at this point (pathetic number of rays that are capable to produce noisy shadow/reflection like effects, with heavy de-noising) it is absolutely not clear how this area will develop. Whatever it will be, as with FreeSync, it won't be NV alone deciding it, heck, but AMD alone could, as, wait for it:
> ...


funny how you talk like that all the time while amd reveals their goal for 2020 is to match nvidia's 2018


----------



## medi01 (Dec 15, 2019)

cucker tarlson said:


> funny how you talk like that all the time while *amd reveals their goal for 2020 is to match nvidia's 2018*



Surely AMD would express admiration for all the unappreciated green greediness greatness in a less subtle way?

Would you mind linking it? I have an itch to read it verbatim.


----------



## cucker tarlson (Dec 15, 2019)

medi01 said:


> Surely AMD would express admiration far all the unappreciated green greediness greatness in a less subtle way?
> 
> Would you mind linking it, I have an itch to read it verbatim.



it's literally the title of the thread  

you're taking longer and longer to catch on.


----------



## InVasMani (Dec 16, 2019)

efikkan said:


> I think you missed the point. The hardware is of course the same, the variance is in the workload.
> 
> 
> It is certainly possible to build an algorithm that uses performance metrics from previous frames and dynamically adjusts LoD on the fly, I have even looked into implementing something like that once. The issue is that you have to rely on the performance metrics of the last ~10 frames, so any adjustment will happen after the performance has changed, and will for this reason not reduce stutter. The best approach is to reduce the variance preemptively.
> ...


 So long as VRS has gears it can shift through, ideally ones that subdivide evenly against standard 24 FPS animation frame rates, it should be a great option with little downside. It could be used poorly, but so can RTRT and plenty of other things, so that's nothing new.


----------



## Platinum certified Husky (Dec 16, 2019)

Fixing their driver is probably more important idk


----------



## ratirt (Dec 16, 2019)

InVasMani said:


> Plenty of video streams do variable-rate adjustments similar to that, based on download speed, because of traffic congestion. I honestly wouldn't mind a bit of selective DLSS smearing temporarily if my FPS dipped below a frame-rate threshold I determined; it beats choppy frame rates and sloppy input lag.


I'd rather skip RT and go for higher FPS than use DLSS to speed things up because RT is eating all the performance (I can see the difference in image quality in games with this thing on).


----------



## cyneater (Dec 16, 2019)

Maelwyse said:


> So, err, dumb question. RT was originally developed by a software company, not Nvidia. Nvidia took it and developed a hardware method to (attempt to) make it have an "acceptable impact" and release it to the market earlier than said software company's solution. I feel that was a brilliant marketing move, but a poor end-user solution. RTRT would be great if it had little-to-no real-world impact on framerates and widespread adoption in the game development world. If there is a software solution, and hardware that can "add this in" without dropping framerates below "targets" such as 60 FPS for 60 Hz gaming, 144 FPS for 144 Hz gaming, or 240 for 240 Hz gaming, why would we care whether the card has RT cores or not? Now, are we going to get there in the current or coming generation? Almost certainly not. But since development is currently relying on those RT cores for programming models, aren't we actually delaying developer adoption to get it 'now', IF we do get there?
> 
> VRS, on the other hand, is a potential framerate improvement, leaving us with "better' state, provided it doesn't significantly impact image quality. Why would anyone NOT want this, IF it works as advertised?



really CBF digging around in the past

but https://www.awn.com/news/sgi-demos-integration-art-vps-ray-tracing-hardware

You will find SGI had it ages ago, for CAD, not really for gaming.
Pretty sure Sun had real-time ray tracing as well...

First-gen hardware/software tends to suck until it gets momentum. 

An open standard would be nice.

Since consoles will be using it, and consoles drive PC gaming, ray tracing is just a gimmick at the moment unless some killer app uses it.

Anyone else remember standalone physics cards? Wank factor 99%; there were some titles you could run that were cool, but other than that they were a waste of money.


----------



## Turmania (Dec 16, 2019)

Unlike many irrational fanboys here, I have been very critical of AMD in both the CPU and GPU products they released and of the undeserved hype they got with the 7 nm process. But I have to admit and admire the RX 5700: best card for the money and, interestingly, for power draw as well. You can fine-tune it to consume around 140 W and get slightly better performance than stock, and it naturally beats both the 2060 Super and the 2060. Probably their best product of the year, and after that comes the Ryzen 5 3600.


----------



## medi01 (Dec 16, 2019)

cucker tarlson said:


> its literally the title of the thread



The title of this thread deduces some of the goals from what is part of the XSX console.
VRS is a rational thing to do; RT is more of a buzzwordy hype, like VR was. It might take off in the next year or two, or much later; we'll see.

Of the goals AMD has next year, RT is certainly not a high priority.


----------



## cucker tarlson (Dec 16, 2019)

medi01 said:


> Of goals AMD has next year, RT is certainly not high priority.


well certainly you're the one to know that.
I mean yeah, just throw in some RT with whatever they've got spare.


----------



## medi01 (Dec 16, 2019)

cucker tarlson said:


> well certainly you're the one to know that.



Rich, coming from someone who assessed that AMD is going out of its way to bring hardware-accelerated RT gimmicks to the table.

They have the 7 nm EUV process to embrace and 350-ish mm² chips to roll out in the GPU business, at the very least. That is way more important than the RT gimmick, obviously, given that even 2080 owners turn it off.


----------



## cucker tarlson (Dec 16, 2019)

medi01 said:


> Rich, coming from soneone who assessed AMD is getting out of its way to bring hardware accelerated RT gimmicks to the table.


of course they are, you think they were planning on RT for consoles?
What 350 mm² chips are they rolling out? Ones that would be almost instantly obliterated by consoles on feature set? IMO that's the reason why we're not seeing anything at $600 from AMD, like a 5800 XT/5900 XT. Comparisons with the 2080 Super just couldn't be avoided, while people can still buy the 5700 XT over the 2060 Super because it's just an entry-level RTX card.
Nvidia is a smart devil.


----------



## medi01 (Dec 16, 2019)

cucker tarlson said:


> of course they are,you think they were planning on rt for consoles ?


The audio RT bit has been there since 2013.
For UI, I don't think they had it planned.

They didn't plan for VR either. Sony made a big deal out of it, but I don't see it taking off.



cucker tarlson said:


> what 350mm2 chips are they rolling out ? ones that would be almost instantly obliterated by consoles on feature set ?


Whatever makes it into consoles will be included in RDNA2 GPUs as well. 

And thanks for "obliterated"; remind me, how does the 2070 "obliterate" the 5700 XT in sales, lol.



cucker tarlson said:


> why we're not seeing anything at $600 from amd, like 5800xt/5900xt.Comparisons with 2080 super just couldn't be avoided


Huge chip, big problems. 
The 5700, 5500, and all but guaranteed 5600 are 7 nm DUV.
With 7 nm EUV around the corner, it makes no sense to invest in a big chip based on an already outdated process, even for AMD, which so eagerly embraces new process nodes.



cucker tarlson said:


> people can still buy 5700xt over 2060 super


People buy the 5700 XT over the 2070 Super, chuckle; not sure what you are on about. RT as a feature is some sort of scam at this point: current-gen cards suck even with the handful of current-gen games that support it. There is next to no use case for it besides tech enthusiasm of some weird sort.


----------



## cucker tarlson (Dec 16, 2019)

medi01 said:


> People buy 5700XT over 2070 super, chuckle, not sure what you are on about.


well yes,for saving money,not for performance.



medi01 said:


> And thanks for "obliterated", remind me, how 2070 "obliterates" 5700Xt in sales, lol.








> people can still buy 5700xt over 2060 super cause it's just an entry level rtx card



I don't suppose Nvidia would be selling Turing against a 5800 XT/5900 XT when Ampere comes out in H1 2020 (probably). Whatever $400-600 Ampere cards pack in terms of RT performance will probably be close to the 2080 Ti and higher.
The 2070 Super can do 1080p/60 RT pretty easily, and 1440p RT when limited to shadows or with more options at lower fidelity, which still blows current SSR out of the water.


----------



## medi01 (Dec 16, 2019)

cucker tarlson said:


> whatever $400-600 ampere cards will pack in terms of RT performance will probably be close to 2080Ti and higher.


That would be a funny game to play for AMD, I guess.
Given that it's below 10% of the die size on Turing, it's still not quite negligible when your competitor has options ranging from not giving a f*ck at all to coming up with a hybrid solution that requires a much smaller die area.



cucker tarlson said:


> 2070 super can do 1080p/60 RT pretty easily


It can do select RT gimmicks of certain kind in certain games at certain resolution... oh wait...


----------



## cucker tarlson (Dec 16, 2019)

medi01 said:


> That would be a funny game to play for AMD, I guess.
> Given, it's below 10% of the die size on Turings, it's still not quite negligible when your competitor has an option from not giving a f*ck at all, to coming up with hybrid solution that requires much smaller die size.
> 
> 
> It can do select RT gimmicks of certain kind in certain games at certain resolution... oh wait...


like I said before, there's nothing stopping you from getting a non-RT card at the same price an RT card sells for. Given that the 5700 XT is 5% faster than the 2060S out of the box and has less OC headroom, it's an equally good option if you want to wait for RT to catch on; dunno about resale value though.


----------



## medi01 (Dec 16, 2019)

cucker tarlson said:


> it's an equally good option



It is indeed, just go with an AIB card.


----------



## cucker tarlson (Dec 16, 2019)

medi01 said:


> It is indeed, just go with AIB:


I'm glad we agree on something


----------



## ratirt (Dec 17, 2019)

cucker tarlson said:


> like I said before,there's nothing stopping you from getting a non-RT card at the same price that a RT card sells for,given that 5700xt is 5% faster than 2060S out of  the box and has less OC headroom it's an equally good option if you want to wait for RT to catch on,dunno about resale value tho.


I do know about resale value. It will still be very good. Do you know why? Because, as has been said before, RT is not taking the market by storm. If by some miracle it does, and RT flourishes, and every game starts using it with awesome implementations we've never seen before, realism and all, do you think your 2060 Super will be able to keep up with those games? It struggles now with basically any game that uses RT effects (not to mention the RT effects are cut down due to the processing power they demand). AMD can cut prices; can NV do that too, with its fancy, expensive RT cores? I really doubt it.


----------



## cucker tarlson (Dec 17, 2019)

ratirt said:


> I do know about resale value. It will still be very good. Do you know why? Because, as has been said before, RT is not taking the market by storm. If by some miracle it does, and RT flourishes, and every game starts using it with awesome implementations we've never seen before, realism and all, do you think your 2060 Super will be able to keep up with those games? It struggles now with basically any game that uses RT effects (not to mention the RT effects are cut down due to the processing power they demand). AMD can cut prices; can NV do that too, with its fancy, expensive RT cores? I really doubt it.


hard to predict anything really.


----------



## ratirt (Dec 17, 2019)

cucker tarlson said:


> hard to predict anything really.


Is it really? For you this is hard to predict even though all the odds are against it? No matter how this starts, whether RT flourishes or goes down in the dumpster, the RTX cards will not hold up, with their RT cores and their price, in either scenario.

And yet, this prediction of yours:


cucker tarlson said:


> I don't suppose Nvidia would be selling Turing against a 5800 XT/5900 XT when Ampere comes out in H1 2020 (probably). Whatever $400-600 Ampere cards pack in terms of RT performance will probably be close to the 2080 Ti and higher.
> The 2070 Super can do 1080p/60 RT pretty easily, and 1440p RT when limited to shadows or with more options at lower fidelity, which still blows current SSR out of the water.


You didn't even blink an eye when predicting this, did you?
You're mixing reality with future unknowns. Like: the 2070 Super can do 1080p RT easily (maybe now, but not for long), and Ampere will come out (there will be no "Ampere" btw, the name was scrapped) with 2080 Ti performance or higher. If RT moves forward (I think you want this, and it drives you), then the 2070 Super won't be enough. Basically it will end up the same as the 2060 Super, just a bit less disappointing.


----------



## cucker tarlson (Dec 17, 2019)

ratirt said:


> Is it really? For you this is hard to predict even though all odds are against it and no matter how this will start. RT flourish or goes down in the dumpster, the RTX cards will not make it with the RT cores and their price in both scenarios.
> 
> and yet this prediction of yours.
> 
> ...


I mean it's hard to predict the 5700 XT's resale value 2-3 years down the road. Jesus, quit being a prick.

The PS5 is gonna have RT hardware, yet your wise-ass prediction is



> RT flourish or goes down in the dumpster




Consoles will have RT while the $450 5700 XT from AMD will not. I'm not debating, I'm stating the obvious. I just threw in this resale-value bit and you're friggin' debating me on it. Are you personally hurt or something?


----------



## ratirt (Dec 17, 2019)

cucker tarlson said:


> I mean it's hard to predict 5700xt resale value 2-3 years down the down the road,Jesus,quit being a prick.


Why am I being a prick? Because I have a different opinion about this and have some arguments to support it, pointing out that you are mistaken and don't keep your facts straight? I can tell you now that the resale value of the 5700 XT will be better than the 2060S or 2070S for sure, whether RT floods the games or is forgotten completely. That is why the 5700 series from AMD will resell really well after two years compared to the 2060S and 2070S.


----------



## cucker tarlson (Dec 17, 2019)

ratirt said:


> Why I'm being a prick?


cause you're picking on something you shouldn't.



ratirt said:


> I can tell you now that the resale value for the 5700 XT will be better than 2060S* or 2070S for sure*


  and Merry Christmas to you


----------



## ratirt (Dec 17, 2019)

cucker tarlson said:


> cause you're picking on something you shouldn't.
> 
> 
> and Merry Christmas to you


Picking on something I shouldn't? You have serious attitude issues, bro 
Merry Christmas anyway  Hope you get whatever you need this year, because the next one might be a disappointment for you.


----------



## cucker tarlson (Dec 17, 2019)

ratirt said:


> the next one might be a disappointment for you.


why ?


----------



## efikkan (Dec 17, 2019)

InVasMani said:


> So long as VRS has gears it can shift through, ideally ones that subdivide evenly against standard 24 FPS animation frame rates, it should be a great option with little downside. It could be used poorly, but so can RTRT and plenty of other things, so that's nothing new.


That's not even how games work. Game simulation runs at what we call a _tick rate_, a fixed rate set to e.g. 30 Hz, 60 Hz, 100 Hz etc. Even when the frame rate fluctuates, the tick rate stays constant (otherwise the game would get out of sync). Frame rendering is usually not synced up with the tick rate.

The big problem with advanced techniques like VRS is that they need to be integrated very well into the game engine. Unfortunately, most top games today use a third-party game engine, so the development team doesn't even touch the low-level engine code.
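The tick-rate/frame-rate separation described above is usually implemented as a fixed-timestep loop with an accumulator. A minimal sketch (hypothetical function names, no specific engine assumed):

```python
TICK_RATE = 60            # simulation ticks per second
TICK_DT = 1.0 / TICK_RATE

def simulate(frame_times):
    """frame_times: seconds each rendered frame took.
    Returns the total number of fixed simulation ticks executed."""
    accumulator = 0.0
    ticks = 0
    for dt in frame_times:
        accumulator += dt
        # Advance the simulation in fixed steps until we've caught up;
        # a slow frame simply runs more ticks, keeping the game in sync.
        while accumulator >= TICK_DT:
            ticks += 1                # one fixed-size game-state update
            accumulator -= TICK_DT
        # rendering would happen here, interpolating by accumulator/TICK_DT
    return ticks
```

Sixty frames at 60 FPS and thirty frames at 30 FPS both execute the same 60 ticks of simulation, which is why a fluctuating frame rate doesn't desynchronise the game state.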



UltraThicc said:


> Fixing their driver is probably more important idk


It always is, and AMD definitely have some potential there.



ratirt said:


> I'd rather skip RT and go with higher FPS than use DLSS (I see the difference in-games with image quality with this thing on) to speed things up because RT is eating all the performance.


Any post-processing AA will ultimately not improve picture quality no matter how fancy it is. DLSS _is_ a gimmick, while RT is not.

RT can already be useful if used skillfully for diffuse lighting rather than much more expensive specular lighting.


----------



## ratirt (Dec 17, 2019)

cucker tarlson said:


> why ?


Why not?
It is like you said.


cucker tarlson said:


> hard to predict anything really.


You said we don't know what the future holds. Better expect less than you get.


efikkan said:


> Any post-processing AA will ultimately not improve picture quality no matter how fancy it is. DLSS _is_ a gimmick, while RT is not.
> 
> RT can already be useful if used skillfully for diffuse lighting rather than much more expensive specular lighting.


I never said it would improve image quality, now did I? I just want the image quality not to be worsened, and that is a sincere concern on my part. 
You got that one right. It can be useful, but it is not just about skillful use of it, I'm afraid. RT quality is inversely proportional to performance: better quality, or more RT, requires more horsepower. So skillfulness, yes, would help to get the implementation right (no glitches, no mistakes with the lighting, etc.), but the horsepower for proper performance has to be there anyway. That horsepower is not here yet, and that is why it is still a gimmick to make a fuss around, used as a marketing tool. You have the right to think differently. Just remember that this feature, ray tracing (for games), is a huge business matter at this point, and I think we all know what business means, right?


----------



## cucker tarlson (Dec 17, 2019)

ratirt said:


> You said we don't know what the future holds. Better expect less than you get.


yes, that's why we're hyping up a CPU physics implementation we've never seen outside a few seconds of gameplay at the end of a video, running at what looks like sub-30 fps on a CPU they did not disclose     


sorry, I had a brain freeze, wrong thread   

but I don't think we should expect little of AMD, Intel and Nvidia. I think they'll all have really good launches next year.


----------



## ratirt (Dec 17, 2019)

cucker tarlson said:


> yes,that's why we're hyping up a cpu physics implementation we've never seen outside a few second long gameplay at the end of the video running at what looks like sub 30 fps on a cpu they did not disclose
> 
> 
> sorry,I had a brain freeze,wrong thread
> ...


I sure hope so. Expect as much as you can but prepare for the worst.


----------



## cucker tarlson (Dec 17, 2019)

ratirt said:


> I sure hope so. Expect as much as you can but predict for the worst.


ampere on 7nm euv
rdna 2 with rt support on 7nm+
intel with HT on i5s (this is the one I'm waiting for since I wanna go back to buying $200 gaming-oriented CPUs instead of $350+ workstation ones)
ryzen 4000 later on 

this is gonna be a good year


----------



## ratirt (Dec 17, 2019)

cucker tarlson said:


> ampere on 7nm euv
> rdna 2 with rt support on 7nm+
> intel with HT on i5s (this is the one I'm waiting for since I wanna go back to buying $200 gaming oriented cpus instead of +$350 workstation ones)
> ryzen 4000 later on
> ...


These are some of the releases I'm waiting for. 
I'm waiting for Ryzen 4000; I wanted to go with the 3000 series but decided to wait for the new one.


cucker tarlson said:


> rdna 2 with rt support on 7nm+


You make it sound like it needs some special architecture (like NV's RT cores, supposedly required to make ray tracing possible at all, which is a ruse BTW) to make RT work on a graphics card. As a matter of fact, RT is a technique that requires driver support and enough of the graphics card's processing power to make real-time ray tracing run fast enough (especially for gaming purposes).


----------



## cucker tarlson (Dec 17, 2019)

ratirt said:


> You make it sound like it needs some special architecture (like NV's RT cores, supposedly required to make ray tracing possible at all, which is a ruse BTW) to make RT work on a graphics card. As a matter of fact, RT is a technique that requires driver support and enough of the graphics card's processing power to make real-time ray tracing run fast enough (especially for gaming purposes).


there's require and require.
look what an 11 TFLOPS GPU with no dedicated RT hardware (the 1080 Ti) does against a 6.5 TFLOPS RTX 2060 with an RT ASIC:









Test wydajności Quake II | PurePC.pl

(Translation: "Quake II performance test: path tracing on the Vulkan API shakes things up (page 7). Will the revolutionary lighting technique change old Quake 2 beyond recognition, and what hardware does it take?")

www.purepc.pl




you'd need 2.5× a 1080 Ti's power to match an RTX 2060


imo RT will stay an optional technology for at least a decade, but I can't see any GPU manufacturer not supporting it, starting from mid-range cards.

the alternative to buying a $400 card and enjoying ray-traced shadows/reflections in a few games is paying the same for the same performance and not having it. really.


----------



## ratirt (Dec 17, 2019)

cucker tarlson said:


> there's require and require.
> look what an 11 TFLOPS GPU with no dedicated RT hardware (the 1080 Ti) does against a 6.5 TFLOPS RTX 2060 with an RT ASIC:
> 
> 
> ...


You do realize that Q2 RTX doesn't even use the RT cores for ray tracing that NV is so fond of? What you are quoting is what NV has planned all along: nice marketing for RTX cards, stating that you need RT cores. Since NV is asking so much for RTX, it would have been stupid if Quake 2 worked well on the 1080 Ti, now wouldn't it? The price needs to be justified, and you fall for it.
Do you know why I know this?
Because https://www.cryengine.com/news/view...-time-ray-tracing-demonstration-for-cryengine
doesn't need RT cores to get this done, runs with sufficient performance, and it is ray tracing just as in Quake 2 (actually it looks even better than in Quake). They added RTX support to make it believable that the RT cores are actually necessary, and as great marketing for Nvidia cards with ray tracing. Cripple the driver for the 1080 Ti so that it doesn't work properly, and there you have your great "evidence".
Did you expect NV to release RTX cards with RT cores without giving any rational reason and justification for the price, even if NV has to forge that reason, which is Quake 2 RTX?


----------



## kings (Dec 17, 2019)

ratirt said:


> AMD can cut price can NV do that too? With it's fancy, expensive RT cores? I really doubt it.



Nvidia's margins are much higher than AMD's. They have already cut prices somewhat with the launch of the Super cards, by offering better chips at the same price.

Nvidia doesn't cut prices further because it doesn't have to, really. They currently have about 73% market share in dedicated GPUs. It is AMD that needs to gain market share, and so has to accept earning less.


----------



## ratirt (Dec 17, 2019)

kings said:


> Nvidia's margins are much higher than AMD's. They have already cut prices somewhat at the launch of the Super cards, by offering better chips for the same value.
> 
> Nvidia doesn't cut prices even more, because it doesn't have to, really. They currently have about 73% market share in dedicated GPUs. It is AMD that needs to gain market share and so has to subject itself to earn less.


Sure, but I don't see AMD actually being scared of NV's price cuts. Actually, the 5700 series is selling pretty well. 
Huh, cutting prices to offer better chips for the same value? I thought that was the natural way of graphics card evolution and releases, but I guess it has now been attributed to the goodness of Nvidia. 
You are missing the rest of the conversation when you say NV doesn't have to cut prices. It will have to, and you will soon see why. AMD, my dear foreign friend, doesn't need to do anything right now, and that is clearly evident today. It is NV that is running around town like a boogeyman, trying to scare people off for not having RT cores.


----------



## londiste (Dec 17, 2019)

ratirt said:


> You do realize that Q2 RTX doesn't even use the RT cores for Ray tracing that NV is so fond of?


Yes, it does. The RT cores are why Turing is that much faster in Q2 RTX than GTX cards. The problem with Q2 RTX is different: it is not a good representation of hybrid RTRT solutions because it is not one. Q2 RTX is completely path-traced.



ratirt said:


> Do you know why I know this ?
> Because https://www.cryengine.com/news/view...-time-ray-tracing-demonstration-for-cryengine
> doesn't need RT cores to get this one done and works with sufficient performance and it is ray tracing just as in Quake 2.


First, it is not ray tracing just as in Q2 RTX. Neon Noir is a hybrid RTRT solution with only reflections being ray-traced; of the RTX games, Battlefield V is its closest analogue.
Second, Neon Noir runs at 1080p 30 FPS on a Vega 56 and about the same on a GTX 1080. For comparison, Battlefield V with DXR on (High, not Ultra) can run on a GTX 1080 at a very similar 1080p 30 FPS.


----------



## cucker tarlson (Dec 17, 2019)

ratirt said:


> *You do realize that Q2 RTX doesn't even use the RT cores for ray tracing that NV is so fond of?* What you are quoting is what NV has planned all along: nice marketing for RTX cards, stating that you need RT cores. Since NV is asking so much for RTX, it would have been stupid if Quake 2 worked well on the 1080 Ti, now wouldn't it? The price needs to be justified, and you fall for it.


what? of course it does.
Jesus, the red-base fans and their theories 



ratirt said:


> Because https://www.cryengine.com/news/view...-time-ray-tracing-demonstration-for-cryengine
> doesn't need RT cores to get this one done and works with sufficient performance and it is ray tracing just as in Quake 2. (Actually it looks even better than in quake) They have added RTX to make it believable that the RT core are actually necessary and also a great marketing for NVidia cards with Ray Tracing. *Cripple the driver for 1080 TI* so that it doesn't work properly and here you have a great evidence.
> Did you expect that NV will release RTX cards with RT cores without giving any rational reason and justification for the price even if NV has to forge that reason which is Quake2 RTX?


Neon Noir only does reflections, at 1 ray per 4 pixels, and it already suffers immensely. That is worse than RTX low, which does 1 ray per 2 pixels in the worst case, and it's a synthetic benchmark, not a game.
lol, you're in a *big* bubble, sir.
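For scale, the two ray budgets being compared work out roughly as follows (a back-of-envelope sketch; the per-pixel ratios come from the post above, everything else is plain arithmetic):

```python
def rays_per_frame(width, height, rays_per_pixel):
    """Primary rays cast per frame, ignoring bounces and denoising cost."""
    return int(width * height * rays_per_pixel)

# Neon Noir-style reflections: 1 ray per 4 pixels at 1080p
noir = rays_per_frame(1920, 1080, 1 / 4)      # 518,400 rays per frame
# "RTX low" worst case: 1 ray per 2 pixels at 1080p
rtx_low = rays_per_frame(1920, 1080, 1 / 2)   # 1,036,800 rays per frame
```

At 60 FPS those budgets become roughly 31 million and 62 million rays per second respectively, which is why even reflection-only tracing leans hard on either dedicated hardware or a lot of shader throughput.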



ratirt said:


> Huuh. Cut price to offer better chips for the same value? I though it was the natural way of new graphics cards evolution and  releases but I guess it has been brought down to the goodness of NVidia now.


lel, just like the 5500 XT


----------



## ratirt (Dec 18, 2019)

londiste said:


> Yes, it does. RT cores are why Turing is that much faster in Q2 RTX over GTX cards. The problem with Q2 RTX is different - it is not a good representation for hybrid RTRT solutions because it is not one. Q2 RTX is completely pathtraced.
> 
> First, it is not raytracing just as in Q2 RTX. Neon Noir is hybrid RTRT solution with only reflections being raytraced. Out of RTX games, Battlefield V is its closest analogue.
> Second, Neon Noir runs at 1080p 30FPS on Vega 56 and about the same on GTX 1080. For comparison, Battlefield V with DXR on (High, not Ultra) can be run on GTX 1080 at very similar 1080p 30FPS.


There is no real distinction between ray tracing and path tracing. Path tracing is supposed to be a faster form of ray tracing, and that is basically it.


cucker tarlson said:


> what?of course it does.
> Jesus the red base fans their theories


No it doesn't. That is the funny part. You think that RT cores speed up ray tracing, and that is not the case.
Besides, I'm not a red-based fan, so quit that. Who's being a prick now?



cucker tarlson said:


> NeonNoir only has reflections at 1 ray per 4 pixels and it already suffers immensely.This is worse than RTX low doing 1 ray per 2 pixels in worst case scenario,and it's a synthetic benchmark not a game.
> lol,you're in a* big* bubble sir.


We will see who is in a big bubble (whatever that means) in time. The new engine will be available in full soon, and there will definitely be games using it; that will be a good indication of what is actually needed. Those rays per pixel can be increased, you know. It is a demo showcase to show what it can do, like a CPU sample; it is not the released product, so be patient. You just don't see it yet, and if I'm supposed to be a red-based fan with theories, then you are a blind green fan without any theories or reasoning, for that matter.


----------



## cucker tarlson (Dec 18, 2019)

ratirt said:


> No it doesn't  That is the funny part  You think that RT core speed up ray tracing and that is not the case.


Absolutely. They built 750 mm² dies just to cripple the 1080 Ti in the end.


----------



## ratirt (Dec 18, 2019)

cucker tarlson said:


> absolutely.they built 750mm2 dies just to cripple 1080Ti in the end.


They built it because Nvidia is a graphics company and "leather jacket" must have something to brag about, and this time around it was RT cores. Let's see what he comes up with next year.
The difference in performance between the 2080 and 1080 is more or less the same in ray-traced and non-ray-traced scenarios. So how are the RT cores supposed to speed things up for ray tracing?
This means that the 2080S is simply a faster graphics card.

I will follow up on this a bit more to evaluate whether it is actually true. I suggest you do the same.


----------



## londiste (Dec 18, 2019)

ratirt said:


> You think that RT core speed up ray tracing and that is not the case.


Why would you claim this? Do you have any reference or proof?


ratirt said:


> The difference in performance between the 2080 and 1080 is more or less the same in ray-traced and non-ray-traced scenarios. So how are the RT cores supposed to speed things up for ray tracing?
> This means that the 2080S is simply a faster graphics card.


The RTX 2080 is about on par with the GTX 1080 Ti, if not a little above it. The Super variant is a few more percent ahead. There are improvements other than RT cores that allow Turing cards to get a performance lead over Pascal when used properly (and Neon Noir seems to be a good example of that). When RT Cores are used in games that have DXR effects or their Vulkan counterparts - Quake 2 RTX, BF V, Metro Exodus, SoTR - RTX cards blow GTX cards out of the water.


----------



## cucker tarlson (Dec 18, 2019)

londiste said:


> Why would you claim this? Do you have any *reference or proof*?


good one,haha.


I don't think he gets rasterized vs ray traced.

2080 Ti with 13.5 TFLOPS and RT + tensor cores: 40 FPS
Titan V with 15 TFLOPS and no RT cores: 28 FPS
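Normalizing those two figures by raw shader throughput makes the point more directly. A rough sketch, where the FPS and TFLOPS numbers are the ones quoted above and TFLOPS is only a crude proxy for non-RT performance:

```python
# FPS per TFLOP for the two cards cited above (figures from the post, not measured).
cards = {
    "RTX 2080 Ti (RT + tensor cores)": (40, 13.5),  # (fps, tflops)
    "Titan V (no RT cores)": (28, 15.0),
}
for name, (fps, tflops) in cards.items():
    print(f"{name}: {fps / tflops:.2f} FPS per TFLOP")

# Advantage per unit of raw compute when RT cores are present:
ratio = (40 / 13.5) / (28 / 15.0)
print(f"Per-TFLOP advantage with RT cores: {ratio:.2f}x")  # ~1.59x
```

On these numbers the 2080 Ti delivers roughly 59% more FPS per TFLOP in this workload, which is consistent with fixed-function hardware doing part of the ray-tracing work.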



londiste said:


> Yes, it does. RT cores are why Turing is that much faster in Q2 RTX over GTX cards. The problem with Q2 RTX is different - it is not a good representation for hybrid RTRT solutions because it is not one. Q2 RTX is completely pathtraced.
> 
> First, it is not raytracing just as in Q2 RTX. Neon Noir is hybrid RTRT solution with only reflections being raytraced. Out of RTX games, Battlefield V is its closest analogue.
> Second, Neon Noir runs at 1080p 30FPS on Vega 56 and about the same on GTX 1080. For comparison, Battlefield V with DXR on (High, not Ultra) can be run on GTX 1080 at very similar 1080p 30FPS.


When using simpler forms of RT, like shadows only, the 1080 Ti is closer to the 2060, but still loses by 40%.









Call of Duty: Modern Warfare 2019 - Ray tracing performance test (www.purepc.pl)
Graphics card performance test in Call of Duty: Modern Warfare 2019 with ray tracing enabled. What are the requirements of this technology?
				




Interestingly, the performance penalty is over 100% on the 1080 Ti and 80% on the 1660 Ti (tensor), but only 19% on the RTX 2060.
The 1080 Ti tends to produce a noisier image too.


----------



## efikkan (Dec 18, 2019)

ratirt said:


> Sure but I don't see AMD actually being scared about NV's price cuts. Actually the 5700 series is selling pretty well.


I guess that depends on your definition of _well_.
As can be seen in the Steam Hardware Survey, it has done little to impact AMD's market share and is still outsold by Nvidia's comparable products;

AMD Radeon RX 5700 XT 0.22% (+0.07%)

NVIDIA GeForce RTX 2060 1.95% (+0.41%)
NVIDIA GeForce RTX 2070 1.60% (+0.19%)
NVIDIA GeForce RTX 2070 SUPER 0.42% (+0.17%)
NVIDIA GeForce RTX 2060 SUPER 0.25% (+0.10%)

As you can see, in this segment Nvidia is outselling them ~10:1.
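The ratios implied by those survey numbers can be checked directly. A small sketch, with shares and month-over-month gains in percentage points copied from the figures above:

```python
# Ratios implied by the Steam survey figures quoted above.
amd_share, amd_gain = 0.22, 0.07       # RX 5700 XT: (share, monthly gain)
nvidia = {
    "RTX 2060": (1.95, 0.41),
    "RTX 2070": (1.60, 0.19),
    "RTX 2070 SUPER": (0.42, 0.17),
    "RTX 2060 SUPER": (0.25, 0.10),
}
nv_share = sum(share for share, _ in nvidia.values())  # 4.22
nv_gain = sum(gain for _, gain in nvidia.values())     # 0.87

print(f"Installed-base ratio: {nv_share / amd_share:.1f}:1")  # ~19:1
print(f"Monthly-gain ratio:   {nv_gain / amd_gain:.1f}:1")    # ~12:1
```

Note the monthly-gain ratio (~12:1) is the one closest to the ~10:1 figure in the post; the installed-base ratio is higher because the RTX 2060/2070 had a year's head start.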



ratirt said:


> They have built it because it is a graphics company and "leather jacket" must have something to brag about and this time around it was RT cores. Let's see what he will come up with next year.


You're not even trying to be serious. Grow up or go play somewhere else!

Anyone with a basic understanding of 3D graphics knows ray tracing to be necessary to get good lighting.


----------



## ratirt (Dec 18, 2019)

efikkan said:


> I guess that depends on your definition of _well_.
> As can be seen in the Steam Hardware Survey, it has done little to impact AMD's market share and is still outsold by Nvidia's comparable products;
> 
> AMD Radeon RX 5700 XT 0.22% (+0.07%)
> ...


Market share is different from sales, since we are not talking in general but about one segment. Which market are you talking about here? I remember you claimed that MindFactory.de is not relevant, and yet Steam is? Anyway, NV is making a lot of noise around RT, is it not? I don't see that from AMD's side, and yet, as you said, AMD is the one that should be trying harder.

I am serious the same way I see you being serious.


----------



## londiste (Dec 18, 2019)

@efikkan a better comparison is probably the Super cards, as both the RTX 2060 and RTX 2070 have been on the market for about a year longer than the Navi cards, while the RTX 2060 Super/RTX 2070 Super were released right before the RX 5700/RX 5700 XT. The RX 5700 does not seem to be listed separately in the Steam HW survey, meaning it is either rolled into the 5700 XT or, more likely, is <0.15%. There is still a twofold difference, but Navi is doing quite well.


----------



## cucker tarlson (Dec 18, 2019)

lol, show us the "data" you and your colleague managed to obtain while you were doing your "research", wink wink


----------



## efikkan (Dec 18, 2019)

ratirt said:


> Market share is different from sales since we are not talking in general but one segment? Which market you are talking about here? I remember you claimed that MindFactory.de is not relevant and yet steam is?


I'm talking of market share in the gaming market, which is a subset of the entire PC market.
The fact is that AMD's market share among gamers has stayed stagnant at 15%, and that includes AMD's APUs. For the past three years AMD has not been present in the high-end and has stayed at ~10% or less of the mid-range, while many have been touting Polaris, Vega and now Navi as "great successes". In general sales AMD has about 20-25% of discrete GPUs, but most people keep forgetting that a lot of this comes from OEM sales of low-end GPUs that are not used for gaming. Steam is the most dominant platform among PC gamers and is very much representative of the PC gaming market; anyone who understands representative samples would understand this. There is nothing more representative than the Steam statistics at this point.



ratirt said:


> Anyway NV is making a lot of noise around RT is it not? I don't see that from AMD side and yet as you said AMD is the one should be trying harder.


Over the past five years AMD has been making way more noise about "their stuff" than anyone else, including Mantle, the myth of "better" Direct3D 12 performance, FreeSync being "free", etc.

While RT may not be super useful yet, it will be at some point. All hardware support has to start somewhere, and hardware support has to come before software support.



londiste said:


> @efikkan a better comparison is probably Super cards as both RTX2060 and RTX2070 have been on the market for about a year more than Navi cards while RTX2060 Super/RTX2070 Super were released right before RX5700/RX5700XT.


Just in the past month, nearly twice as many RTX 2060s were added as there are RX 5700 XTs in total. If you add up the percentage-point gains for these Nvidia cards, it is 0.87% compared to the RX 5700 XT's 0.07% gain.


----------



## kings (Dec 18, 2019)

ratirt said:


> They have built it because it is a graphics company and "leather jacket" must have something to brag about and this time around it was RT cores. Let's see what he will come up with next year.



Yeah, Nvidia thought, "let's release cards with bigger and more expensive dies, with RT cores that do nothing, just to brag about it".

You must think that the people who work at Nvidia are all stupid and make business decisions that involve millions and millions of dollars, just for the bragging rights.


----------



## Vayra86 (Dec 18, 2019)

londiste said:


> First, it is not raytracing just as in Q2 RTX. Neon Noir is hybrid RTRT solution with only reflections being raytraced. Out of RTX games, Battlefield V is its closest analogue.
> Second, Neon Noir runs at 1080p 30FPS on Vega 56 and about the same on GTX 1080. For comparison, Battlefield V with DXR on (High, not Ultra) can be run on GTX 1080 at very similar 1080p 30FPS.



Eh what? I run that bench at 60-80 FPS on my 1080. Did you try it yet? Add your score  








Crytek's Neon Noir Raytracing Benchmark Results (www.techpowerup.com)
https://www.cryengine.com/marketplace/product/neon-noir# Enjoy - Pretty rough ride. FPS is pretty great but stability... Settings for fair results in this topic are: (and if enough people chime in, I'll make & keep scores for it here) Other setups are welcome but will not be listed. Subject to...
				




The problem Neon Noir has is accuracy, but RT isn't all that accurate yet either; it just resolves the lack of detail differently. I'll take the software box of tricks in Neon Noir over BFV's RT implementation any day of the week.

Really, the debate is still ongoing on what the best solution is. Some hardware for it, sure. Large sections of a die? Not so sure; this will probably get integrated in some way, and Turing is just an early PoC.


----------



## INSTG8R (Dec 18, 2019)

Vayra86 said:


> Eh what? I run that bench at 60-80 FPS on my 1080. Did you try it yet?
> 
> The problem Neon Noir has is accuracy, but RT isn't all that accurate yet either, it just resolves the lack of detail differently. Ill take the software box of tricks in Neon Noir over BFV's RT implementation any day of the week.
> 
> Really the debate is still ongoing on what is the best solution. Some hardware for it, sure. Large sections of a die? Not so sure, this will probably get integrated in a way and Turing is just an early PoC.


Exactly. Have we seen an example of DXR yet? The agnostic solution, where the field is "level"?


----------



## Vayra86 (Dec 18, 2019)

kings said:


> Yeah, Nvidia thought, "let's release cards with bigger and more expensive dies, with RT cores that do nothing, just to brag about it".
> 
> You must think that the people who work at Nvidia are all stupid and make business decisions that involve millions and millions of dollars, just for the bragging rights.



It's very clear what Nvidia is looking at: the 4K adoption rate is not really going places, and those who do have 4K tend to lower their res anyway. But Nvidia also has trouble making cards much faster than the 1080 Ti; I mean, the 2080s are baby steps, and the 2080 Ti is way too large to be economical, hence its price. At the same time, there is good growth in demand for high refresh rates, but even that is very feasible on the current crop of cards for most games, especially competitive ones.

Essentially, Nvidia was looking for a new buyer's/upgrade incentive and found it in RT. Marketing then made us believe the world is ready for it. That is how these things go.

So really, lacking the content, Nvidia surely released Turing cards with the idea to brag about it. It is what Jensen has been doing since day one. It just works, right? We were going to buy more to save more because dev work was going to become so easy; if you winked at the GPU it'd do the work for you, or something vague like that. And then there is reality: a handful of titles with so-so implementations at a massive FPS hit.

This also explains why AMD cares a lot less and is only now starting to push it to consoles. Their target market doesn't really care, and represents the midrange. AMD has no urge to push this forward other than telling the world they still play along.


----------



## londiste (Dec 18, 2019)

Vayra86 said:


> Eh what? I run that bench at 60-80 FPS on my 1080. Did you try it yet? Add your score
> 
> The problem Neon Noir has is accuracy, but RT isn't all that accurate yet either, it just resolves the lack of detail differently. Ill take the software box of tricks in Neon Noir over BFV's RT implementation any day of the week.
> 
> Really the debate is still ongoing on what is the best solution. Some hardware for it, sure. Large sections of a die? Not so sure, this will probably get integrated in a way and Turing is just an early PoC.


Sorry, my bad. 1080p@30FPS was the initial claim from CryTek. This seems to be the 99% low result for Vega56 and it actually runs faster. GTX1080 is in the same ballpark. Comparison to Battlefield is off, you are right about that.

Neon Noir has cool optimizations that benefit performance, like only doing RT at short range and falling back to voxels when that is beneficial.
By the way, CryTek should (and plans to) use assistance from DXR or Vulkan's RT extensions in their engine.
In comparison to Neon Noir, what exactly makes you dislike BFV's RT implementation?

The best solution is relative. RT cores are not an RT solution; they are a hardware assist for casting rays. The exact algorithm and optimizations are up to the developer.
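To make the "hardware assist for casting rays" point concrete: the fixed-function part is BVH traversal and ray-primitive intersection testing, while everything around it (which rays to cast, what to do with a hit) stays programmable. A minimal software sketch of one such intersection test; purely illustrative, not anyone's actual implementation:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """One ray-primitive intersection test: the kind of fixed-function work
    RT cores accelerate (alongside BVH traversal). Shading whatever the ray
    hits remains ordinary programmable shader code."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # direction assumed normalized, so a == 1
    if disc < 0:
        return None         # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# A ray cast down -z from the origin hits a unit sphere centered at (0, 0, -3):
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -3), 1.0))  # -> 2.0
```

A GPU runs millions of these tests per frame against a tree of bounding boxes and triangles, which is exactly the loop that benefits from a dedicated unit.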



INSTG8R said:


> Exactly have we seen an example of DXR yet? the agnostic solution where the field is ”level”


On Nvidia side of things, any DXR game will give an idea what RT performance differences are between Pascal, Turing and Turing with RT cores.
If we want to compare AMD vs Nvidia, we cannot. AMD cards/drivers have no DXR implementation.


----------



## Vayra86 (Dec 18, 2019)

londiste said:


> Sorry, my bad. 1080p@30FPS was the initial claim from CryTek. This seems to be the 99% low result for Vega56 and it actually runs faster. GTX1080 is in the same ballpark. Comparison to Battlefield is off, you are right about that.
> 
> Neon Noir has cool optimizations that benefit performance. Things like only doing RT for short range and falling back to Voxels when it is beneficial.
> By the way, CryTek should (and plans to) use assistance from DXR or Vulkan's RT extensions in their engine.
> ...



Not so much dislike; I just fancy the hybrid solution more because it will help adoption better. BFV is a proof of concept; CryEngine makes it marketable for a mainstream audience.

Also, I don't believe games need the high accuracy at all. Especially in motion, the cost of that detail just isn't worth it. On top of that, games are an artistic product, even those that say they want to 'look real'. It's still a scene and it still has its limitations, and therefore still needs tweaking, because pure RT lighting makes lots of stuff unplayable.


----------



## londiste (Dec 18, 2019)

Vayra86 said:


> Not so much dislike, I just fancy the hybrid solution better because it will help adoption better. BFV is proof of concept, CryEngine makes it marketable for mainstream audience.


All of these are hybrid solutions. It is just a question of LOD, falloff distances and what it falls back to. Neon Noir is not a good representation of a game. It is a fixed techdemo meaning it is no doubt very well optimized.

RT effects, including DXR support, are there or coming to large engines. Unreal has those, Unity has those (not sure if still in preview or production build), CryEngine has RT but no DXR support yet. Others will not be far behind.


----------



## efikkan (Dec 18, 2019)

Vayra86 said:


> Its very clear what Nvidia is looking at: 4K adoption rate is not really going places and those who do have it, tend to lower their res anyway.…
> Essentially, Nvidia was looking for a new buyer's incentive/upgrade incentive and found it in RT…
> So really, lacking the content, Nvidia surely released Turing cards with the idea to brag about it.


I'm seriously concerned if you believe your own words, because all of that is a truckload worth of ox manure.

Ray tracing has been requested by graphics developers for over a decade. Every new GPU generation has given us more performance and memory, easily allowing developers to throw in larger meshes, finer-grained animations and higher-detailed textures, which is easy since most assets are modeled in higher detail anyway. But lighting and shadows have been a continuous problem. Simple stencil shadows and pre-rendered shadow maps are not cutting it any more as the other details of games keep increasing. Pretty much every lighting effect you see in games is just a cheap, clever trick to simulate the real thing, and these often only "work well" under certain conditions and may result in unwanted side effects. Programming all these effects is also quite challenging, and they may have to be adapted to all the various scenes of a game.

Simply put: developers want RT more than Nvidia does. But we are only in the infant stages of RT thus far; it's still too slow to be used to the extent developers want, so for now it has to be used in a limited fashion.


----------



## Vayra86 (Dec 18, 2019)

efikkan said:


> I'm seriously concerned if you believe your own words, because all of that is a truckload worth of ox manure.
> 
> Ray tracing has been requested by graphics developers for over a decade. Every new GPU generation has given us more performance and memory, easily allowing developers to throw in larger meshes, finer grained animations and higher detailed textures, which is easy since most assets are modeled in higher detail anyway. But lighting and shadows have been a continuous problem. Simple stencil shadows and pre-rendered shadow maps is not cutting it any more as the other details of the games keeps increasing. Pretty much every lighting effect you see in games are just cheap clever tricks to simulate the real thing, and quite often only "work well" under conditions and may result in unwanted side-effects. Programming all these effects is also quite challenging, and may have to be adapted to all the various scenes of a game.
> 
> Simply put; developers want RT more than Nvidia. But we are only in the infant stages of RT this far, it's still too slow to be used to the extent developers want. So for now, it has to be used in a limited fashion.



Source, please. And not an Nvidia-branded or affiliated one, if you wouldn't mind.

Even if just for a sanity check... because when I hear 'developers have wanted this for a decade', all I really hear is 'we've been working on this for 10 years, and finally, here it is' (Huang himself @ SIGGRAPH). I've seen too much spin in my life to take this at face value. There is always an agenda, and _it's always about money._


----------



## INSTG8R (Dec 18, 2019)

londiste said:


> On Nvidia side of things, any DXR game will give an idea what RT performance differences are between Pascal, Turing and Turing with RT cores.
> If we want to compare AMD vs Nvidia, we cannot. AMD cards/drivers have no DXR implementation.


Yet, but DXR is where "the rubber meets the road": both sides using the same API, and where we see how devs will focus the new tech. Vulkan RT would also apply. We know AMD has their answer ready with the new Xbox announcement; we're just waiting on their dGPU answer.


----------



## londiste (Dec 18, 2019)

@efikkan, @Vayra86, I think neither of you are wrong. 4K adoption is not going all too well and with the generally expected rate of 30% more performance per generation there are a couple generations to go until 4K 60FPS is easy. GPU vendors do need a new incentive of some sort to push the envelope.

Raytracing has been kind of coming for a while. Theory is there, research is there but performance simply has not been for anything real-time. Now Nvidia pushed the issue to the brink of being usable. Someone has to push new things for these to be implemented and widespread enough especially when it comes to hardware features.

@Vayra86 just look at how lighting and shadowing methods have evolved: shadow maps, dynamic stencil shadows, soft/hard shadows, and ever more complex methods for these. Similarly and closely related: GI methods. The latest wave of GI methods, SVOGI (which CryEngine uses as the fallback in Neon Noir, and which their RT solution evolved from) and Nvidia's VXGI, are both voxel-based and come with a very noticeable performance hit. In principle both get closer and closer to raytracing. Also keep in mind that rasterization will apply several different methods on top of each other, complicating things.

If you think Nvidia is doing this out of the blue - they are definitely not. A decade of experience in OptiX gives them a good idea about where the performance issues are. Of course, same applies to AMD and Intel.


----------



## cucker tarlson (Dec 18, 2019)

Pushing 4K cards would be way more profitable for Nvidia than RTX, dunno what you're talking about Vayra


----------



## Vayra86 (Dec 18, 2019)

londiste said:


> @efikkan, @Vayra86, I think neither of you are wrong. 4K adoption is not going all too well and with the generally expected rate of 30% more performance per generation there are a couple generations to go until 4K 60FPS is easy. GPU vendors do need a new incentive of some sort to push the envelope.
> 
> Raytracing has been kind of coming for a while. Theory is there, research is there but performance simply has not been for anything real-time. Now Nvidia pushed the issue to the brink of being usable. Someone has to push new things for these to be implemented and widespread enough especially when it comes to hardware features.
> 
> @Vayra86 just look at how lighting and shadowing methods have evolved. Shadow maps, dynamic stencil shadows, soft/hard shadows and the ever more complex methods for these. Similarly and closely related - GI methods. Latest wave of GI methods were SVOGI (that CryEngine uses for fallback in Neon Noir and their RT solution is evolved from) and Nvidia's VGXI are both Voxel-based and with a very noticeable performance hit. In principle both get more and more closer to raytracing. Also keep in mind that rasterization will apply several different methods on top of each other, complicating things.



Thank you once again for your nuance and informative posts


----------



## INSTG8R (Dec 18, 2019)

cucker tarlson said:


> Pushing 4k cards would be way more profitable for nvidia than RTX,dunno what you're taking about Vayra


Yet ASUS just released a 27" 1080p 265 Hz IPS. I don't disagree that 4K is the more "reasonable" goal to shoot for, but adoption is still low. Eventually we'll be at the point where 4K and RT are going to have to be a thing for the "bleeding edge" crowd. I mean, I'm part of the HDR minority; another tech that's still hit and miss.


----------



## cucker tarlson (Dec 18, 2019)

INSTG8R said:


> Yet ASUS just released a 27” 1080p 265hz IPS. I don‘t disagree 4K is more “reasonable” goal to shoot for but adoption is still low, but eventually we’ll be at the point where 4K and RT are gonna have to be a thing for the “bleeding edge crowd” I mean I’m part of the HDR minority another tech that’s still hit and miss.


4K is waaaaaay more popular. You've got 50 4K options for every one 240 Hz option.


----------



## INSTG8R (Dec 18, 2019)

cucker tarlson said:


> 4k is waaaaaay more popluar.youve got 50 4k options for one 240hz


I'm not saying there are no options for 4K, but the entry barrier is still high when you factor in the GPU needed for the 4K 60 Hz experience. 1080p is still the "norm" when you consider the market overall, with the 1060 as the "median" of the GPU user base.


----------



## efikkan (Dec 18, 2019)

cucker tarlson said:


> Pushing 4k cards would be way more profitable for nvidia than RTX


Imagine if Nvidia used all* the die space of the RT cores for just more SMs; it would obviously be more profitable for Nvidia, and people would of course get more excited, boosting demand probably higher than supply. Instead Nvidia opted for a more "balanced" approach (in their opinion), with adding some more performance in general while also adding RT acceleration to bring the advancement of technology forward. In the short term not a financial "smart move", but it certainly is in the long term.

*) They would have to scale it a little back to avoid too high power consumption.


----------



## londiste (Dec 18, 2019)

efikkan said:


> Imagine if Nvidia used all* the die space of the RT cores for just more SMs; it would obviously be more profitable for Nvidia, and people would of course get more excited, boosting demand probably higher than supply. Instead Nvidia opted for a more "balanced" approach (in their opinion), with adding some more performance in general while also adding RT acceleration to bring the advancement of technology forward. In the short term not a financial "smart move", but it certainly is in the long term.


RT Cores and Tensor cores are ~10% of the die space; RT cores are the smaller part of that, in the range of 3%. That would not be enough for a noticeable general performance boost. There is a case to be made about Tensor cores, but a large part of their area seems to be double-used for FP16 and possibly the concurrent FP+INT. Power concerns come on top of that.

AMD's RDNA2 seems to follow the same general idea as Nvidia did: a small specialized ASIC added to the existing pipeline, in RDNA2 reportedly next to or in the TMUs.
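Taking those area fractions at face value, the opportunity cost is easy to sketch. The SM count below is a TU102 reference point, and the linear area-to-SM conversion is a simplifying assumption:

```python
# What the RT-core die area would buy as plain SMs, per the estimates above.
sm_count = 72          # SMs on a full TU102 (approximate reference point)
rt_core_area = 0.03    # fraction of die spent on RT cores, per the post

extra_sms = sm_count * rt_core_area
print(f"RT-core area is worth roughly {extra_sms:.1f} extra SMs "
      f"(~{100 * rt_core_area:.0f}% more shading hardware)")
```

Roughly two SMs' worth of area, i.e. a ~3% shader uplift at best, which supports the point that dropping the RT cores would not have bought a noticeable rasterization gain.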


----------



## cucker tarlson (Dec 18, 2019)

I hope they improve the RT/tensor counts. Either go big or go home, Nvidia.
Control was beautiful but still played like a demo with DLSS at 55-62 FPS. I want a steady 70 FPS at native with a $600 price tag max.


----------



## INSTG8R (Dec 18, 2019)

cucker tarlson said:


> I hope they improve RT/tensor count.Either go big or go home nvidia.
> Control was beautiful but still played like a demo with DLSS at 55-62 fps.I want steady 70 fps at native with a $600 prie tag max.


Of course it will get better and more viable as newer generations of cards move forward, like all new tech: SM3.0, PhysX, the evolution of DX, tessellation; all things that needed a generation to become usable and another to become features we now take for granted, standard settings we don't even think about anymore affecting our performance.


----------



## cucker tarlson (Dec 18, 2019)

INSTG8R said:


> Of course it will get better and more viable as newer generations of cards move forward, like all new tech SM3.0
> , Physx, the evolution of DX, Tessellation, all things that needed a generation to become useable and another one to become features we now take for granted and are just standard settings we don’t even think about anymore effecting our performance.


yup.
Remember how expensive the 8800 GTX was?
It was a breakthrough card nevertheless.
Now we've got the RTX 2060 running circles around the 1080 Ti in ray tracing.


----------



## efikkan (Dec 19, 2019)

cucker tarlson said:


> yup.
> remember how expensive gtx 8800 was ?
> it was a breakthrough card nevertheless.
> now we've got rtx 2060 running circles around 1080Ti in ray tracing.


Ray tracing will improve a lot in a couple of generations.
GTX 1080 Ti received a lot of harsh criticism when it launched, then Turing came along and all of a sudden GTX 1080 Ti was excellent and Turing bad.
Also, AMD criticized ray tracing when Turing launched, and now they may be launching something similar next year (if rumors are to be believed).
As always, the perspective changes to fit the ever-changing narrative.


----------



## ratirt (Dec 19, 2019)

efikkan said:


> Ray tracing will improve a lot in a couple of generations.
> GTX 1080 Ti received a lot of harsh criticism when it launched, then Turing came along and all of a sudden GTX 1080 Ti was excellent and Turing bad.
> Also, AMD criticized ray tracing when Turing launched, and now they may be launching something similar next year (if rumors are to be believed).
> As always, the perspective changes to fit the ever-changing narrative.


I don't recall AMD as the company criticizing Ray Tracing. Maybe they have criticized Turing graphics for the RT performance although I can't recall that either.


----------



## londiste (Dec 19, 2019)

efikkan said:


> Also, AMD criticized ray tracing when Turing launched, and now they may be launching something similar next year (if rumors are to be believed).


AMD didn't really criticize RT. They were pretty quiet about it, and the most prominent (or maybe only) real statement they made is that they will add RT capability in hardware when it makes sense across the entire range.

From rumors and leaks, RDNA2 places an RT ASIC into the TMU. This could lead to an interesting situation where AMD has more RT units, assuming their unit has capabilities similar to Nvidia's RT Cores (which, based on AMD's patent, is very likely). The RX 5700 XT has 160 TMUs, while the RTX 2080 Super has 48 SMs/RT Cores (Turing has one RT Core per SM).
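If the rumor holds and per-unit capability really were comparable (a big "if", as the next post points out), the raw unit counts would compare like this; purely hypothetical arithmetic on the figures above:

```python
# Hypothetical unit-count comparison from the rumored RDNA2 layout above.
rdna2_rt_units = 160    # RX 5700 XT TMU count (one rumored RT unit per TMU)
turing_rt_cores = 48    # RTX 2080 Super: one RT Core per SM

print(f"Unit-count ratio: {rdna2_rt_units / turing_rt_cores:.1f}x")  # ~3.3x
```

A 3.3x unit-count lead says nothing about throughput per unit, clock behavior, or how the units are fed, so it is an upper bound on optimism rather than a prediction.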


----------



## efikkan (Dec 19, 2019)

londiste said:


> From rumors and leaks, RDNA2 places RT ASIC into TMU. This should lead to an interesting situation where AMD has more RT capabilities assuming their unit has similar capabilities to Nvidia's RT Cores (which, based on AMD's patent is very likely). RX 5700XT has 160 TMUs while RTX 2080 Super has 48 SMs/RT Cores (Turing has one RT Core per SM).


That will be very hard to assess accurately with anything but speculation. AMD's solution here will be more "improvised"; it may end up successful, but it could easily go the other way as well. So I don't dare claim that it will be more capable than Turing at this point.

Also keep in mind that "RDNA2" will mostly compete with the next generation from Nvidia.


----------



## delshay (Dec 31, 2019)

In the meantime, another game has gone RTX beta: "AMID EVIL". Follow the instructions on Steam for how to access and download it. You need to own the base game to access the RTX features.


----------



## INSTG8R (Dec 31, 2019)

delshay said:


> In the mean time another game has gone beta RTX "AMID EVIL". Follow the instructions on the steam how to access & download. You need to have the base game to access RTX features.


Elaborate?


----------



## londiste (Dec 31, 2019)

Save 66% on AMID EVIL on Steam (store.steampowered.com)
A retro FPS for the ages! Once branded a HERETIC. Now YOU have been chosen as our champion! Reclaim our sacred weapons. Take back our ancient lands. If you can stand... AMID EVIL.

Steam :: AMID EVIL :: RTX Beta Available Now! (steamcommunity.com)
Champions! The rumors are true... The AMID EVIL RTX beta is here!

----------



## INSTG8R (Dec 31, 2019)

londiste said:


> Save 66% on AMID EVIL on Steam
> 
> 
> A retro FPS for the ages! Once branded a HERETIC. Now YOU have been chosen as our champion! Reclaim our sacred weapons. Take back our ancient lands. If you can stand... AMID EVIL.
> ...


Thank you. I read it like a typo “AMD EVIL”


----------



## delshay (Dec 31, 2019)

INSTG8R said:


> Thank you. I read it like a typo “AMD EVIL”



When I first saw this game, that is how I read it too, so you're not alone. This game is awesome and I have completed it.


----------

