Friday, December 13th 2019

Ray Tracing and Variable-Rate Shading Design Goals for AMD RDNA2

Hardware-accelerated ray tracing and variable-rate shading will be the design focal points of AMD's next-generation RDNA2 graphics architecture. Microsoft's reveal of its Xbox Series X console attributed both features to AMD's "next generation RDNA" architecture (which logically is RDNA2). The Xbox Series X uses a semi-custom SoC that combines CPU cores based on the "Zen 2" microarchitecture with a GPU based on RDNA2. The SoC is likely fabricated on TSMC's 7 nm EUV node, for which the RDNA2 graphics architecture is optimized; this would entail an optical shrink of "Zen 2" to 7 nm EUV. Besides the SoC powering the Xbox Series X, AMD is expected to leverage 7 nm EUV in 2020 for its RDNA2 discrete GPUs and for CPU chiplets based on its "Zen 3" microarchitecture.

Variable-rate shading (VRS) is an API-level feature that lets GPUs conserve resources by shading certain areas of a scene at a lower rate than others, without a perceptible difference to the viewer. Microsoft developed two tiers of VRS for its DirectX 12 API: tier-1 is currently supported by NVIDIA's "Turing" and Intel's Gen11 architectures, while tier-2 is supported by "Turing" alone. The current RDNA architecture supports neither tier. Hardware-accelerated ray tracing is the cornerstone of NVIDIA's "Turing" RTX 20-series graphics cards, and AMD is catching up to it; Microsoft has already standardized it on the software side with the DXR (DirectX Raytracing) API. A combination of VRS and dynamic render resolution will be crucial for next-gen consoles to achieve playability at 4K, and to even boast of being 8K-capable.
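To make the resource savings concrete, here is a toy model (not the actual Direct3D 12 API, which exposes shading rates through command-list state) of how many pixel-shader invocations a 2×2 coarse shading rate saves per frame:

```python
def shader_invocations(width, height, rate=(1, 1)):
    """Pixel-shader invocations for one frame at a coarse shading rate of
    rate[0] x rate[1] pixels per invocation. Toy model: partial tiles at
    the edges are rounded up to a full invocation."""
    rx, ry = rate
    cols = -(-width // rx)   # ceiling division
    rows = -(-height // ry)
    return cols * rows

full = shader_invocations(3840, 2160)            # 1x1: shade every pixel
coarse = shader_invocations(3840, 2160, (2, 2))  # 2x2 coarse shading
print(full, coarse, full / coarse)               # 8294400 2073600 4.0
```

At 4K, a 2×2 rate cuts shading work roughly fourfold in the regions where it is applied, which is why VRS pairs naturally with dynamic render resolution on consoles.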

119 Comments on Ray Tracing and Variable-Rate Shading Design Goals for AMD RDNA2

#76
kings
ratirtAMD can cut prices; can NV do that too? With its fancy, expensive RT cores? I really doubt it.
Nvidia's margins are much higher than AMD's. They have already cut prices somewhat at the launch of the Super cards, by offering better chips for the same value.

Nvidia doesn't cut prices even more, because it doesn't have to, really. They currently have about 73% market share in dedicated GPUs. It is AMD that needs to gain market share and so has to subject itself to earn less.
#77
ratirt
kingsNvidia's margins are much higher than AMD's. They have already cut prices somewhat at the launch of the Super cards, by offering better chips for the same value.

Nvidia doesn't cut prices even more, because it doesn't have to, really. They currently have about 73% market share in dedicated GPUs. It is AMD that needs to gain market share and so has to subject itself to earn less.
Sure, but I don't see AMD actually being scared of NV's price cuts. Actually, the 5700 series is selling pretty well.
Huh. Cut prices to offer better chips for the same value? I thought that was the natural course of new graphics card releases, but I guess it has been put down to the goodness of NVidia now.
You are missing the rest of the conversation if you say NV doesn't have to cut prices. It will have to cut prices, and you will see soon why. AMD, my dear foreign friend, doesn't need to do anything right now, and that is evident today. It is NV that is running around town like a boogeyman trying to scare people off with not having RT cores.
#78
londiste
ratirtYou do realize that Q2 RTX doesn't even use the RT cores for Ray tracing that NV is so fond of?
Yes, it does. RT cores are why Turing is that much faster in Q2 RTX over GTX cards. The problem with Q2 RTX is different - it is not a good representation for hybrid RTRT solutions because it is not one. Q2 RTX is completely pathtraced.
ratirtDo you know why I know this ?
Because www.cryengine.com/news/view/crytek-releases-neon-noir-a-real-time-ray-tracing-demonstration-for-cryengine
doesn't need RT cores to get this one done and works with sufficient performance and it is ray tracing just as in Quake 2.
First, it is not raytracing just as in Q2 RTX. Neon Noir is hybrid RTRT solution with only reflections being raytraced. Out of RTX games, Battlefield V is its closest analogue.
Second, Neon Noir runs at 1080p 30FPS on Vega 56 and about the same on GTX 1080. For comparison, Battlefield V with DXR on (High, not Ultra) can be run on GTX 1080 at very similar 1080p 30FPS.
#79
cucker tarlson
ratirtYou do realize that Q2 RTX doesn't even use the RT cores for Ray tracing that NV is so fond of? What you are doing is quoting what NV has planned all along. Nice marketing for RTX cards stating that you need RT cores. Since NV is asking so much for RTX it would have been stupid if quake 2 worked well on 1080 Ti now would it? The price needs to be justified and you fall for it.
What? Of course it does.
Jesus, the red-base fans and their theories :rolleyes:
ratirtBecause www.cryengine.com/news/view/crytek-releases-neon-noir-a-real-time-ray-tracing-demonstration-for-cryengine
doesn't need RT cores to get this one done and works with sufficient performance and it is ray tracing just as in Quake 2. (Actually it looks even better than in quake) They have added RTX to make it believable that the RT core are actually necessary and also a great marketing for NVidia cards with Ray Tracing. Cripple the driver for 1080 TI so that it doesn't work properly and here you have a great evidence.
Did you expect that NV will release RTX cards with RT cores without giving any rational reason and justification for the price even if NV has to forge that reason which is Quake2 RTX?
Neon Noir only has reflections at 1 ray per 4 pixels and it already suffers immensely. This is worse than RTX low doing 1 ray per 2 pixels in the worst-case scenario, and it's a synthetic benchmark, not a game.
lol, you're in a big bubble, sir.
ratirtHuh. Cut prices to offer better chips for the same value? I thought that was the natural course of new graphics card releases, but I guess it has been put down to the goodness of NVidia now.
lel,just like 5500xt
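For scale, the rays-per-pixel rates being argued about translate into absolute ray counts like this (a back-of-the-envelope sketch at 1080p; real engines vary the rate per effect and per frame):

```python
def rays_per_frame(width, height, rays_per_pixel):
    # Reflection rays cast per frame at a given rays-per-pixel budget.
    return int(width * height * rays_per_pixel)

w, h = 1920, 1080
print(rays_per_frame(w, h, 1 / 4))  # 1 ray per 4 pixels -> 518400 rays/frame
print(rays_per_frame(w, h, 1 / 2))  # 1 ray per 2 pixels -> 1036800 rays/frame
```

Doubling the per-pixel rate doubles the ray budget, before counting secondary bounces or denoising cost, which is why these fractional rates matter so much for performance.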
#80
ratirt
londisteYes, it does. RT cores are why Turing is that much faster in Q2 RTX over GTX cards. The problem with Q2 RTX is different - it is not a good representation for hybrid RTRT solutions because it is not one. Q2 RTX is completely pathtraced.

First, it is not raytracing just as in Q2 RTX. Neon Noir is hybrid RTRT solution with only reflections being raytraced. Out of RTX games, Battlefield V is its closest analogue.
Second, Neon Noir runs at 1080p 30FPS on Vega 56 and about the same on GTX 1080. For comparison, Battlefield V with DXR on (High, not Ultra) can be run on GTX 1080 at very similar 1080p 30FPS.
There is no distinct difference between ray tracing and path tracing. Path tracing is supposed to be a faster form of ray tracing, and that is basically it.
cucker tarlsonwhat?of course it does.
Jesus the red base fans their theories :rolleyes:
No, it doesn't :) That is the funny part :D You think that RT cores speed up ray tracing, and that is not the case.
Besides, I'm not a red-based fan, so quit that. Who's being a prick now? :p
cucker tarlsonNeonNoir only has reflections at 1 ray per 4 pixels and it already suffers immensely.This is worse than RTX low doing 1 ray per 2 pixels in worst case scenario,and it's a synthetic benchmark not a game.
lol,you're in a big bubble sir.
We will see who is in a big bubble (whatever that means) in time. The new engine will be available in full soon, and there will definitely be games using it. This will be a good indication of what is actually needed. Those rays per pixel can be increased, you know. It is a demo showcase meant to show what the engine can do, like a CPU sample; it is not the released product, so be patient. You just don't see it yet, and if I'm supposed to be a red-based fan with theories, then you are a blind green fan without any theories or reasoning for that matter. :)
#81
cucker tarlson
ratirtNo it doesn't :) That is the funny part :D You think that RT core speed up ray tracing and that is not the case.
Absolutely. They built 750 mm² dies just to cripple the 1080 Ti in the end.

#82
ratirt
cucker tarlsonabsolutely.they built 750mm2 dies just to cripple 1080Ti in the end.
They have built it because they are a graphics company and "leather jacket" must have something to brag about, and this time around it was RT cores. Let's see what he will come up with next year.
The difference in performance between the 2080 and 1080 is more or less the same in ray-traced and non-ray-traced scenarios. So how are the RT cores supposed to speed things up for ray tracing?
This means that the 2080S is simply a faster graphics card.

I will follow this up a bit more to evaluate whether this is true for sure. I suggest you do the same.
#83
londiste
ratirtYou think that RT core speed up ray tracing and that is not the case.
Why would you claim this? Do you have any reference or proof?
ratirtThe difference in performance between the 2080 and 1080 is more or less the same in ray-traced and non-ray-traced scenarios. So how are the RT cores supposed to speed things up for ray tracing?
This means that the 2080S is simply a faster graphics card.
RTX2080 is about on par with GTX1080Ti, if a little bit above it. Super variant is a few more percent ahead. There are improvements other than RT cores that allow Turing cards to get a performance lead over Pascal if used properly (and Neon Noir seems to be a good example of that). When RT Cores are used in games that have DXR effects or the Vulkan counterparts - Quake 2 RTX, BF V, Metro Exodus, SoTR - RTX cards blow GTX cards out of the water.
#84
cucker tarlson
londisteWhy would you claim this? Do you have any reference or proof?
Good one, haha.

I don't think he gets rasterized vs ray traced.

2080 Ti with 13.5 TFLOPS and RT + tensor cores: 40 FPS
Titan V with 15 TFLOPS and no RT cores: 28 FPS
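Taking those two figures at face value, normalizing FPS by raw FP32 throughput is one crude way to see how much of the gap shader compute alone cannot explain (a sketch; TFLOPS is a rough proxy and the FPS numbers are as quoted above):

```python
# (FPS in Q2 RTX, FP32 TFLOPS) as quoted above
cards = {
    "RTX 2080 Ti (RT cores)": (40, 13.5),
    "Titan V (no RT cores)": (28, 15.0),
}
eff = {name: fps / tflops for name, (fps, tflops) in cards.items()}
for name, e in eff.items():
    print(f"{name}: {e:.2f} FPS per TFLOP")
ratio = eff["RTX 2080 Ti (RT cores)"] / eff["Titan V (no RT cores)"]
print(f"2080 Ti extracts {ratio:.2f}x the FPS per TFLOP")  # ~1.59x
```

On these numbers the 2080 Ti gets roughly 59% more frames out of each TFLOP despite having fewer TFLOPS, which is the argument for fixed-function BVH traversal hardware doing real work.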
londisteYes, it does. RT cores are why Turing is that much faster in Q2 RTX over GTX cards. The problem with Q2 RTX is different - it is not a good representation for hybrid RTRT solutions because it is not one. Q2 RTX is completely pathtraced.

First, it is not raytracing just as in Q2 RTX. Neon Noir is hybrid RTRT solution with only reflections being raytraced. Out of RTX games, Battlefield V is its closest analogue.
Second, Neon Noir runs at 1080p 30FPS on Vega 56 and about the same on GTX 1080. For comparison, Battlefield V with DXR on (High, not Ultra) can be run on GTX 1080 at very similar 1080p 30FPS.
When using simpler forms of RT, like shadows only, the 1080 Ti is closer to the 2060, but still loses by 40%.

www.purepc.pl/karty_graficzne/call_of_duty_modern_warfare_2019_test_wydajnosci_raytracingu?page=0,5

Interestingly, the performance penalty is over 100% on the 1080 Ti, 80% on the 1660 Ti (tensor), but only 19% on the RTX 2060.
The 1080 Ti tends to produce a noisier image, too.
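One way to read those penalty percentages (assuming the site computes penalty as FPS_off / FPS_on − 1, which is an assumption on my part):

```python
def fps_with_rt(fps_off, penalty_pct):
    # A penalty of p% means FPS drops from fps_off to fps_off / (1 + p/100).
    return fps_off / (1 + penalty_pct / 100)

for card, penalty in [("GTX 1080 Ti", 100), ("GTX 1660 Ti", 80), ("RTX 2060", 19)]:
    print(f"{card}: 100 FPS -> {fps_with_rt(100, penalty):.1f} FPS with RT shadows on")
```

Under that reading, a 100% penalty halves the frame rate, while a 19% penalty costs only about one frame in six.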
#85
efikkan
ratirtSure but I don't see AMD actually being scared about NV's price cuts. Actually the 5700 series is selling pretty well.
I guess that depends on your definition of well.
As can be seen in the Steam Hardware Survey, it has done little to impact AMD's market share and is still outsold by Nvidia's comparable products;

AMD Radeon RX 5700 XT 0.22% (+0.07%)

NVIDIA GeForce RTX 2060 1.95% (+0.41%)
NVIDIA GeForce RTX 2070 1.60% (+0.19%)
NVIDIA GeForce RTX 2070 SUPER 0.42% (+0.17%)
NVIDIA GeForce RTX 2060 SUPER 0.25% (+0.10%)

As you can see, in this segment Nvidia is outselling them ~10:1.
ratirtThey have built it because it is a graphics company and "leather jacket" must have something to brag about and this time around it was RT cores. Let's see what he will come up with next year.
You're not even trying to be serious. Grow up or go play somewhere else!

Anyone with a basic understanding of 3D graphics knows ray tracing to be necessary to get good lighting.
#86
ratirt
efikkanI guess that depends on your definition of well.
As can be seen in the Steam Hardware Survey, it has done little to impact AMD's market share and is still outsold by Nvidia's comparable products;

AMD Radeon RX 5700 XT 0.22% (+0.07%)

NVIDIA GeForce RTX 2060 1.95% (+0.41%)
NVIDIA GeForce RTX 2070 1.60% (+0.19%)
NVIDIA GeForce RTX 2070 SUPER 0.42% (+0.17%)
NVIDIA GeForce RTX 2060 SUPER 0.25% (+0.10%)

As you can see, in this segment Nvidia is outselling them ~10:1.


You're not even trying to be serious. Grow up or go play somewhere else!

Anyone with a basic understanding of 3D graphics knows ray tracing to be necessary to get good lighting.
Market share is different from sales, since we are not talking in general but about one segment. Which market are you talking about here? I remember you claimed that MindFactory.de is not relevant, and yet Steam is? Anyway, NV is making a lot of noise around RT, is it not? I don't see that from AMD's side, and yet, as you said, AMD is the one that should be trying harder.

I am serious in the same way I see you being serious.
#87
londiste
@efikkan a better comparison is probably Super cards as both RTX2060 and RTX2070 have been on the market for about a year more than Navi cards while RTX2060 Super/RTX2070 Super were released right before RX5700/RX5700XT. RX5700 does not seem to be listed separately in the Steam HW survey, meaning it is either rolled into 5700XT or more likely is <0.15%. There is still a twofold difference but Navi is doing quite well.
#88
cucker tarlson
lol, show us the "data" you and your colleague managed to obtain while you were doing your "research", wink wink ;);)
#89
efikkan
ratirtMarket share is different from sales since we are not talking in general but one segment? Which market you are talking about here? I remember you claimed that MindFactory.de is not relevant and yet steam is?
I'm talking of market share in the gaming market, which is a subset of the entire PC market.
The fact is that AMD's market share among gamers has stayed stagnant at 15%, which also includes APUs from AMD. For the past three years AMD have not been present in the high-end and have stayed at ~10% or less of the mid-range, while many have been touting Polaris, Vega and now Navi as "great successes". In general sales AMD have about 20-25% of discrete GPUs, but most people forget that a lot of this comes from OEM sales of low-end GPUs that are not used for gaming. Steam is the most dominant platform among PC gamers and is very much representative of the PC gaming market; anyone who understands representative samples would understand this. There is nothing more representative than the Steam statistics at this point.
ratirtAnyway NV is making a lot of noise around RT is it not? I don't see that from AMD side and yet as you said AMD is the one should be trying harder.
Over the past five years AMD have been making way more noise over "their stuff" than anyone else, including Mantle, the myth of "better" Direct3D 12 performance, FreeSync being "free", etc.

While RT may not be super useful yet, it will be at some point. All hardware support has to start somewhere, and hardware support has to come before software support.
londiste@efikkan a better comparison is probably Super cards as both RTX2060 and RTX2070 have been on the market for about a year more than Navi cards while RTX2060 Super/RTX2070 Super were released right before RX5700/RX5700XT.
Just in the past month there has been added nearly twice as many RTX 2060s as there are RX 5700 XTs in total. If you add up the percentage-points of gain for these Nvidia cards it is 0.87% compared to RX 5700 XTs 0.07% gain.
#90
kings
ratirtThey have built it because it is a graphics company and "leather jacket" must have something to brag about and this time around it was RT cores. Let's see what he will come up with next year.
Yeah, Nvidia thought, "let's release cards with bigger and more expensive dies, with RT cores that do nothing, just to brag about it".

You must think that the people who work at Nvidia are all stupid and make business decisions that involve millions and millions of dollars, just for the bragging rights.
#91
Vayra86
londisteFirst, it is not raytracing just as in Q2 RTX. Neon Noir is hybrid RTRT solution with only reflections being raytraced. Out of RTX games, Battlefield V is its closest analogue.
Second, Neon Noir runs at 1080p 30FPS on Vega 56 and about the same on GTX 1080. For comparison, Battlefield V with DXR on (High, not Ultra) can be run on GTX 1080 at very similar 1080p 30FPS.
Eh what? I run that bench at 60-80 FPS on my 1080. Did you try it yet? Add your score :)
www.techpowerup.com/forums/threads/cryteks-neon-noir-raytracing-benchmark-results.261155/

The problem Neon Noir has is accuracy, but RT isn't all that accurate yet either; it just resolves the lack of detail differently. I'll take the software box of tricks in Neon Noir over BFV's RT implementation any day of the week.

Really the debate is still ongoing on what is the best solution. Some hardware for it, sure. Large sections of a die? Not so sure, this will probably get integrated in a way and Turing is just an early PoC.
#92
INSTG8R
Vanguard Beta Tester
Vayra86Eh what? I run that bench at 60-80 FPS on my 1080. Did you try it yet?

The problem Neon Noir has is accuracy, but RT isn't all that accurate yet either, it just resolves the lack of detail differently. Ill take the software box of tricks in Neon Noir over BFV's RT implementation any day of the week.

Really the debate is still ongoing on what is the best solution. Some hardware for it, sure. Large sections of a die? Not so sure, this will probably get integrated in a way and Turing is just an early PoC.
Exactly. Have we seen an example of DXR yet? The agnostic solution, where the field is "level"?
#93
Vayra86
kingsYeah, Nvidia thought, "let's release cards with bigger and more expensive dies, with RT cores that do nothing, just to brag about it".

You must think that the people who work at Nvidia are all stupid and make business decisions that involve millions and millions of dollars, just for the bragging rights.
Its very clear what Nvidia is looking at: 4K adoption rate is not really going places and those who do have it, tend to lower their res anyway. But Nvidia also has trouble making cards much faster than 1080ti, I mean the 2080s are baby steps and the 2080ti is way too large to be economical hence its price. At the same time, there is good growth in demand for high refresh rates, but even that is very feasible on the current crop of cards for most games, especially competitive ones.

Essentially, Nvidia was looking for a new buyer's incentive/upgrade incentive and found it in RT. Marketing then made us believe the world is ready for it. That is how these things go :)

So really, lacking the content, Nvidia surely released Turing cards with the idea to brag about it. It is what Jensen has been doing since day one. It just works, right? We were going to buy more to save more because dev work was going to become so easy, if you winked at the GPU it'd do the work for you. Or something vague like that. And then there is reality: a handful of titles with so-so implementations at a massive FPS hit ;)

This also explains why AMD cares a lot less, and just now starts to push it to console. Their target market doesn't really care, and represents the midrange. AMD has no urge to push this forward other than telling the world they still play along.
#94
londiste
Vayra86Eh what? I run that bench at 60-80 FPS on my 1080. Did you try it yet? Add your score :)

The problem Neon Noir has is accuracy, but RT isn't all that accurate yet either, it just resolves the lack of detail differently. Ill take the software box of tricks in Neon Noir over BFV's RT implementation any day of the week.

Really the debate is still ongoing on what is the best solution. Some hardware for it, sure. Large sections of a die? Not so sure, this will probably get integrated in a way and Turing is just an early PoC.
Sorry, my bad. 1080p@30FPS was the initial claim from CryTek. This seems to be the 99% low result for Vega56 and it actually runs faster. GTX1080 is in the same ballpark. Comparison to Battlefield is off, you are right about that.

Neon Noir has cool optimizations that benefit performance. Things like only doing RT for short range and falling back to Voxels when it is beneficial.
By the way, CryTek should (and plans to) use assistance from DXR or Vulkan's RT extensions in their engine.
In comparison to Neon Noir, what exactly makes you dislike BFV's RT implementation?

The best solution is relative. RT cores are not an RT solution; they are a hardware assist for casting rays. The exact algorithm and optimizations are up to the developer.
INSTG8RExactly have we seen an example of DXR yet? the agnostic solution where the field is ”level”
On Nvidia side of things, any DXR game will give an idea what RT performance differences are between Pascal, Turing and Turing with RT cores.
If we want to compare AMD vs Nvidia, we cannot. AMD cards/drivers have no DXR implementation.
#95
Vayra86
londisteSorry, my bad. 1080p@30FPS was the initial claim from CryTek. This seems to be the 99% low result for Vega56 and it actually runs faster. GTX1080 is in the same ballpark. Comparison to Battlefield is off, you are right about that.

Neon Noir has cool optimizations that benefit performance. Things like only doing RT for short range and falling back to Voxels when it is beneficial.
By the way, CryTek should (and plans to) use assistance from DXR or Vulkan's RT extensions in their engine.
In comparison to Neon Noir, what exactly makes you dislike BFV's RT implementation?

Best solution is relative. RT cores is not an RT solution. It is a hardware assist to casting rays. The exact algorithm and optimizations are up to developer.
Not so much dislike; I just fancy the hybrid solution more because it will help adoption better. BFV is a proof of concept; CryEngine makes it marketable for a mainstream audience.

Also, I don't believe games need the high accuracy at all. Especially in motion, the cost of that detail just isn't worth it. On top of that, games are an artistic product, even those that say they want to 'look real'. It's still a scene, it still has its limitations, and it therefore still needs tweaking, because RT lighting alone makes lots of stuff unplayable.
#96
londiste
Vayra86Not so much dislike, I just fancy the hybrid solution better because it will help adoption better. BFV is proof of concept, CryEngine makes it marketable for mainstream audience.
All of these are hybrid solutions. It is just a question of LOD, falloff distances and what it falls back to. Neon Noir is not a good representation of a game. It is a fixed techdemo meaning it is no doubt very well optimized.

RT effects, including DXR support, are there or coming to large engines. Unreal has those, Unity has those (not sure if still in preview or production build), CryEngine has RT but no DXR support yet. Others will not be far behind.
#97
efikkan
Vayra86Its very clear what Nvidia is looking at: 4K adoption rate is not really going places and those who do have it, tend to lower their res anyway.…
Essentially, Nvidia was looking for a new buyer's incentive/upgrade incentive and found it in RT…
So really, lacking the content, Nvidia surely released Turing cards with the idea to brag about it.
I'm seriously concerned if you believe your own words, because all of that is a truckload worth of ox manure.

Ray tracing has been requested by graphics developers for over a decade. Every new GPU generation has given us more performance and memory, easily allowing developers to throw in larger meshes, finer-grained animations and more detailed textures, which is easy since most assets are modeled in higher detail anyway. But lighting and shadows have been a continuous problem. Simple stencil shadows and pre-rendered shadow maps are not cutting it any more as the other details of games keep increasing. Pretty much every lighting effect you see in games is just a cheap, clever trick to simulate the real thing, and quite often these only "work well" under certain conditions and may produce unwanted side-effects. Programming all these effects is also quite challenging, and they may have to be adapted to all the various scenes of a game.

Simply put; developers want RT more than Nvidia. But we are only in the infant stages of RT this far, it's still too slow to be used to the extent developers want. So for now, it has to be used in a limited fashion.
#98
Vayra86
efikkanI'm seriously concerned if you believe your own words, because all of that is a truckload worth of ox manure.

Ray tracing has been requested by graphics developers for over a decade. Every new GPU generation has given us more performance and memory, easily allowing developers to throw in larger meshes, finer grained animations and higher detailed textures, which is easy since most assets are modeled in higher detail anyway. But lighting and shadows have been a continuous problem. Simple stencil shadows and pre-rendered shadow maps is not cutting it any more as the other details of the games keeps increasing. Pretty much every lighting effect you see in games are just cheap clever tricks to simulate the real thing, and quite often only "work well" under conditions and may result in unwanted side-effects. Programming all these effects is also quite challenging, and may have to be adapted to all the various scenes of a game.

Simply put; developers want RT more than Nvidia. But we are only in the infant stages of RT this far, it's still too slow to be used to the extent developers want. So for now, it has to be used in a limited fashion.
Source, please. And not an Nvidia branded or affiliated one if you wouldn't mind.

Even if just for sanity check purposes... because when I hear 'developers want for a decade' all I really hear is 'we've been working on this for 10 years, and finally, here it is' (Huang himself @ SIGGRAPH). I've seen too much spin in my life to take this at face value. There is always an agenda and its always about money.
#99
INSTG8R
Vanguard Beta Tester
londisteOn Nvidia side of things, any DXR game will give an idea what RT performance differences are between Pascal, Turing and Turing with RT cores.
If we want to compare AMD vs Nvidia, we cannot. AMD cards/drivers have no DXR implementation.
Yet. But DXR is where "the rubber meets the road": both sides use the same API, and we will see how devs focus the new tech. Vulkan RT would also apply. We know AMD has their answer ready with the new Xbox announcement; we are just waiting on their dGPU answer.
#100
londiste
@efikkan, @Vayra86, I think neither of you is wrong. 4K adoption is not going all too well, and at the generally expected rate of 30% more performance per generation, there are a couple of generations to go until 4K 60 FPS is easy. GPU vendors do need a new incentive of some sort to push the envelope.

Raytracing has been coming for a while. The theory is there, the research is there, but performance simply has not been there for anything real-time. Now Nvidia has pushed the issue to the brink of usability. Someone has to push new things for them to become implemented and widespread enough, especially when it comes to hardware features.

@Vayra86 just look at how lighting and shadowing methods have evolved. Shadow maps, dynamic stencil shadows, soft/hard shadows and ever more complex methods for these. Similarly and closely related: GI methods. The latest wave of GI methods, SVOGI (which CryEngine uses as the fallback in Neon Noir and from which their RT solution evolved) and Nvidia's VXGI, are both voxel-based and carry a very noticeable performance hit. In principle, both get closer and closer to raytracing. Also keep in mind that rasterization will apply several different methods on top of each other, complicating things.

If you think Nvidia is doing this out of the blue - they are definitely not. A decade of experience in OptiX gives them a good idea about where the performance issues are. Of course, same applies to AMD and Intel.