
Performance Slide of RTX 3090 Ampere Leaks, 100% RTX Performance Gain Over Turing

That entirely depends on what you're looking for. Rasterization can't do lighting or reflections the way RTRT can. GI can be faked to a certain extent, but it's (at least up until now) extremely expensive. Screen-space reflections are great, until you're looking at something reflective that ought to be reflecting off-screen objects or scenery, yet suddenly isn't. Did you notice how there is nothing reflective in the UE5 demo? That isn't an accident, and I would imagine one of the first add-ons (or most-used optional features) for UE5 will be RTRT reflections, as that would complete the otherwise excellent-looking package.

You don't necessarily notice these things during play, but you certainly might. Control is an excellent example of the setting feeling more realistic once RTRT is added. That obviously doesn't mean there aren't/won't be bad RTRT implementations that don't add anything of real value - there definitely will be! - but the technology nonetheless has the potential to surpass anything possible with rasterization in certain respects. Full-scene RTRT is unnecessary and likely of little real benefit, but selective use of the tech can provide good IQ improvements, even relative to the performance cost, when used in the right way in the right games. On the other hand, some games don't benefit from it whatsoever, and should really skip it. As is the case with all kinds of graphical features.
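To make the screen-space reflection limitation concrete, here's a minimal toy sketch (not a real renderer - the 1-D "depth buffer" and function name are illustrative assumptions): SSR marches the reflected ray through data that is already on screen, so the moment the ray leaves the screen there is simply nothing to sample, and the reflection silently disappears.

```python
# Toy model of screen-space reflections in 1-D. The "screen" is a list where
# an entry is the depth of visible geometry at that pixel, or None if empty.
# SSR can only return hits that exist inside this buffer.

def ssr_sample(depth_buffer, start_x, step):
    """March a reflected ray across the screen; return the hit pixel or None."""
    x = start_x
    while 0 <= x < len(depth_buffer):
        if depth_buffer[x] is not None:   # ray intersects on-screen geometry
            return x
        x += step
    return None                           # ray left the screen: no data to reflect

screen = [None, None, 7, None, None]      # geometry only at pixel 2
print(ssr_sample(screen, 0, 1))           # 2    -> reflection found on screen
print(ssr_sample(screen, 3, 1))           # None -> reflector points off-screen
```

A ray-traced reflection, by contrast, intersects the actual scene geometry, so it keeps working when the reflected object is off-screen or occluded.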

You can do good reflections à la Gears 5. We need better-looking hair, trees, textures, and so much more than lighting, IMO. I'd rather have a 32GB VRAM card than an RT card. There is so much wasted silicon in the latest cards.
 
You can do good reflections à la Gears 5. We need better-looking hair, trees, textures, and so much more than lighting, IMO. I'd rather have a 32GB VRAM card than an RT card. There is so much wasted silicon in the latest cards.
"Good reflections" doesn't contradict what I said. As I said in the very post you quoted: screen-space reflections are great until you're looking at something reflective that ought to be reflecting off-screen objects or scenery. That is exactly the kind of thing that can have a dramatic effect on immersion when you notice it, as it makes the game world feel that much more real and less artificial.

Besides, scaling rasterization much higher would put massive pressure on the memory subsystem, which would be problematic in and of itself. There are plenty of rasterization improvements in these GPUs already; asking for more is just asking for a memory bottleneck.
 
You can do good reflections à la Gears 5. We need better-looking hair, trees, textures, and so much more than lighting, IMO. I'd rather have a 32GB VRAM card than an RT card. There is so much wasted silicon in the latest cards.
And that's what RT will eventually enable: once you get reflections and ambient occlusion for free, you'll have more horsepower at your disposal to throw at geometry. But not just yet, while games are still mostly rasterization with a little RT on top.

I'm not even sure why we're discussing this; it's not as if RT is a gimmick Nvidia pulled out of its hat yesterday. RT has been the technique of choice for professional rendering for decades, because it produces superior results.
 
Free? You mean allocating silicon space and resources to integrating it in place of other hardware. Look, I installed a supercharger on the engine and am charging you more for its "free performance" - enjoy, comrade.
 
And that's what RT will eventually enable: once you get reflections and ambient occlusion for free, you'll have more horsepower at your disposal to throw at geometry. But not just yet, while games are still mostly rasterization with a little RT on top.

I'm not even sure why we're discussing this; it's not as if RT is a gimmick Nvidia pulled out of its hat yesterday. RT has been the technique of choice for professional rendering for decades, because it produces superior results.
It produces inferior results - we are playing in real time, not rendering offline... ugh... anyway, the RT believers are not worth arguing with.
 
It produces inferior results - we are playing in real time, not rendering offline... ugh... anyway, the RT believers are not worth arguing with.
No, it doesn't. Read up on it; there are many free papers on the subject.
 
Based on the TPU review of Control, turning RTX on cuts 2080 Ti performance in half at 1080p.


Without DLSS, the 2080 Ti gets about 100 FPS average at 1080p with RTX off, and about 50 FPS average at 1080p with RTX on.

So the RTX 3090 would get about 100 FPS at 1080p with RTX on (>2x, per this chart).

That means the RTX 3090 needs at least 200 FPS at regular 1080p, i.e. without RTX - assuming no RT efficiency improvement. If instead there is a major RT efficiency improvement, so that the penalty is 0.7x rather than 0.5x, then the RTX 3090 would be running around 133-150 FPS without RTX. That's a 30% to 50% performance uplift in non-RTX games. We also go from 4352 to 5248 CUDA cores, so the core-count increase by itself should give at least 20% in non-RTX workloads.

That is just some quick napkin math. I'm leaning more towards a 30-35% performance increase, but there's a good chance I'm wrong.
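The napkin math above can be written out as a quick sanity check. All numbers are the post's own assumptions (the TPU Control figures and the leaked ">2x with RT" claim), not measurements:

```python
# Back-of-the-envelope check of the post's reasoning.
base_fps_2080ti = 100     # 2080 Ti, 1080p, RT off (TPU Control review)
rt_fps_3090 = 100         # claimed: >2x the 2080 Ti's ~50 FPS with RT on

# If RT still halves performance (0.5x) vs. an improved 0.7x penalty:
for rt_penalty in (0.5, 0.7):
    raster_fps_3090 = rt_fps_3090 / rt_penalty          # implied RT-off FPS
    uplift = raster_fps_3090 / base_fps_2080ti - 1      # vs. 2080 Ti RT-off
    print(f"penalty {rt_penalty}: {raster_fps_3090:.0f} FPS raster, "
          f"{uplift:.0%} uplift over 2080 Ti")
```

With a 0.5x penalty this gives 200 FPS (100% uplift); with a 0.7x penalty it gives about 143 FPS (roughly 43% uplift), which is where the 30-50% range in the post comes from.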
Sorry, you lost me at 1080p. Damn that eternal resolution - it'll outlive NTSC.
 