Wednesday, October 17th 2018

Remedy Shows the Preliminary Performance Cost of NVIDIA RTX Ray Tracing Effects

Real-time ray tracing won't be cheap. NVIDIA GeForce RTX 20 Series graphics cards are quite expensive, but even with those resources, the cost of taking advantage of this rendering technique will be high. We didn't know for sure what this cost would be, but the developers at Remedy have shown some preliminary results on that front. The studio is working on Control, one of the first games with RTX support, and although it has not provided framerate numbers, what we do know is that enabling ray tracing imposes a clear performance impact.

It does, at least, in these preliminary tests with the studio's Northlight engine. In an experimental scene with a wet marble floor and a lot of detailed furniture, Remedy was able to evaluate the cost of enabling RTX. There is a 9.2 ms performance overhead per frame in total: 2.3 ms to compute shadows, 4.4 ms to compute reflections, and 2.5 ms to denoise the global illumination. This is not good news for those who enjoy games at 1080p60.

Remedy may be able to reduce that impact in the final version of its engine and in the game, but those 9.2 ms will clearly influence the framerate we can achieve. Playing at 30 fps allows 33 ms per frame, and playing at 60 fps allows just 17 ms. Enabling NVIDIA's RTX effects would translate to a framerate of about 40 fps at 1920x1080 on a GeForce RTX 2080 Ti. Visually, the result is excellent: crisper shadows and reflections that are independent of the camera position and angle give the game a photorealistic finish, but the cost is high. Too high, maybe?
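For context, here is that arithmetic as a quick back-of-the-envelope sketch in Python (our own illustration, assuming Remedy's 9.2 ms overhead simply adds to the base frame time, which the final engine may not bear out):

# Illustrative frame-time math, assuming the 9.2 ms RT cost adds serially.
RT_OVERHEAD_MS = 2.3 + 4.4 + 2.5  # shadows + reflections + denoising = 9.2 ms

def fps_with_rt(base_fps):
    """Framerate after adding the RT overhead to every frame."""
    return 1000.0 / (1000.0 / base_fps + RT_OVERHEAD_MS)

print(round(fps_with_rt(60)))   # a 60 fps baseline drops to about 39 fps
print(round(fps_with_rt(144)))  # a 144 fps baseline drops to about 62 fps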
Source: Golem

85 Comments on Remedy Shows the Preliminary Performance Cost of NVIDIA RTX Ray Tracing Effects

#1
Fleurious
I'd sacrifice image quality for refresh rate (up to the 120-144 Hz range), not the other way around. I'd rather not sacrifice either, though.
#2
pky
So, purely theoretically, to achieve 60 FPS with RT on, you would need to render a frame (without RT) in no more than 7.4 ms, which means you would have to hit ~135 FPS with ray tracing off to achieve 60 FPS with it enabled. Of course, that's just dry maths with the data provided; I can't guarantee the final product will act this way. But if it does, that's quite a big performance hit.
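Under the same serial-cost assumption, that math checks out; a minimal sketch (illustrative only, not from the article):

# Raster budget left at 60 fps once a 9.2 ms RT cost is carved out.
budget_ms = 1000.0 / 60           # 16.7 ms per frame at 60 fps
raster_ms = budget_ms - 9.2       # ~7.5 ms left for non-RT rendering
print(round(raster_ms, 1))        # 7.5
print(round(1000.0 / raster_ms))  # ~134, roughly the ~135 FPS figure cited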
#3
Ed_1
The RT cores run in parallel, so AFAIK their cost shouldn't simply be added to the render time.
Some amount of the RTX workload should be "free".
It's not clear here, and this is just a beta; I am sure it will get optimized, and options will probably be given for user-level settings in the end.
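If part of the RT work really does overlap with rasterization, the effective cost shrinks; a toy model of that idea (the overlap fractions here are invented for illustration, not measured):

# Toy model: only the non-overlapped share of the 9.2 ms RT work extends the frame.
def fps(base_ms, rt_ms=9.2, overlap=0.0):
    """overlap 0.0 = fully serial; overlap 1.0 = RT cost entirely hidden."""
    return 1000.0 / (base_ms + rt_ms * (1.0 - overlap))

for overlap in (0.0, 0.25, 0.5):
    print(overlap, round(fps(16.7, overlap=overlap)))  # 39, 42 and 47 fps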
#4
TheLostSwede
News Editor
As I posted elsewhere, RTX is a technology no gamer asked for, yet Nvidia spent millions (billions?) on making it a thing. It might very well be great in 2-3 generations, but right now it's something that offers little to no benefit, for a lot of extra cost. Thanks Nvidia.
#5
Sasqui
TheLostSwede: As I posted elsewhere, RTX is a technology no gamer asked for, yet Nvidia spent millions (billions?) on making it a thing. It might very well be great in 2-3 generations, but right now it's something that offers little to no benefit, for a lot of extra cost. Thanks Nvidia.
If you enjoy static images, they're great! At least they're heading in the right direction; maybe with optimization or some breakthrough graphics engine, playing RT games will become a reality.
#6
atomicus
It seems quite obvious to me now that buying an RTX card is more akin to putting money into some sort of Kickstarter campaign for the future of ray tracing... only you don't get anything for it as regards ray tracing other than crappy frame rates and some tech demos on the card you've bought. So yeah, if you're OK with that and are happy to support Nvidia in their future endeavours, then by all means purchase one of these GPUs, but please don't do so expecting fully fledged ray-traced gaming experiences at playable frame rates... ESPECIALLY if you're at 1440p or above!

At 4K in AAA titles, the 2080 Ti delivers great performance, so again, if you want maxed-out 4K gaming, go nuts... but that's the only area where the 2080 Ti will ever excel, and even there it's not going to hold out forever, especially given how close it already is to 60 FPS in some games. A year from now, it's going to struggle unless you start turning down settings.
#7
R0H1T
pky: So, purely theoretically, to achieve 60 FPS with RT on, you would need to render a frame (without RT) in no more than 7.4 ms, which means you would have to hit ~135 FPS with ray tracing off to achieve 60 FPS with it enabled. Of course, that's just dry maths with the data provided; I can't guarantee the final product will act this way. But if it does, that's quite a big performance hit.
It's hybrid ray tracing anyway, & it's not like Nvidia reinvented the wheel; the computational costs for RT are still way OTT :twitch:
#8
dozenfury
The selling point with the Pascal gen was finally being able to game at ~60 fps (often closer to 45-50 fps, but close enough) at 4K. That was a pretty major step-function in capability over past gens, where 4K was a nice novelty but not really usable for gaming. RT in Turing is a nice minor feature, but nowhere near the customer impact that Pascal's 4K capability had. So that's a huge point that will affect Turing sales.

More importantly, even though they've come down some, video card prices are generally still artificially high from the mining gold-rush craze. And on top of that, there is an absolutely massive glut of 1070/1080 (and similar AMD) cards out there in store inventory and in customers' hands from the mining rush. You never know how things will turn out, but every sign has to be pointing to this gen returning to more normal pre-mining sales levels - if not being something of a flop. It's actually kind of a "perfect storm" in a negative way for them.
#9
HTC
Question: is there a middle-ground setting for ray tracing? Something along the lines of different AA levels, but for RT.

So far, we know it only as "on and off": a middle option would greatly boost performance while still showing serious visual improvement over "more traditional" AA, no?

As it stands, with what we know so far, I have serious doubts nVidia will manage 60 FPS @ 1080p with full RT on, with the possible exception of the 2080 Ti: forget about higher resolutions with the 2000 series, IMO.

We shall see ...
#10
W1zzard
HTC: Question: is there a middle-ground setting for ray tracing? Something along the lines of different AA levels, but for RT.

So far, we know it only as "on and off": a middle option would greatly boost performance while still showing serious visual improvement over "more traditional" AA, no?

As it stands, with what we know so far, I have serious doubts nVidia will manage 60 FPS @ 1080p with full RT on, with the possible exception of the 2080 Ti: forget about higher resolutions with the 2000 series, IMO.

We shall see ...
I asked NVIDIA about that, and they say there will be "RTX off, low, med, high" settings for most games, not just "RTX on/off".
#11
HTC
W1zzard: I asked NVIDIA about that, and they say there will be "RTX off, low, med, high" settings for most games, not just "RTX on/off".
This changes things immensely.
#12
birdie
TheLostSwede: As I posted elsewhere, RTX is a technology no gamer asked for, yet Nvidia spent millions (billions?) on making it a thing. It might very well be great in 2-3 generations, but right now it's something that offers little to no benefit, for a lot of extra cost. Thanks Nvidia.
You don't need to repeat the idiocy. NVIDIA had to start with RTX somewhere, and had they not done it now, it would have been postponed for several more years. AMD doesn't even have plans for RT at all - how long would we have to wait for them to realize that RT could be implemented right here, right now?

It surely looks like >95% of people commenting on RTX are either young and/or stupid, but programmable shaders were also a new "unneeded", "slow", "superficial", "do gamers really need it?" feature when they were introduced over ten years ago. Strangely, there are next to zero games nowadays which don't use programmable shaders.

I for one commend NVIDIA for their massive effort of bringing photorealistic lighting, shading and reflections to the masses.

Meanwhile, I'm not going to buy any RTXs just yet because they are way over my budget - the cheapest one costs more than my monthly salary - but I will most likely buy the RTX 3060 in 2019/2020.

One other thing most illiterate idiots fail to realize is that RTX makes game development a lot easier, quite a bit cheaper, and significantly faster, and the end result is just jaw-dropping.

P.S. Sorry for being a little bit harsh.
#13
TheLostSwede
News Editor
birdie: You don't need to repeat the idiocy. NVIDIA had to start with RTX somewhere, and had they not done it now, it would have been postponed for several more years. AMD doesn't even have plans for RT at all - how long would we have to wait for them to realize that RT could be implemented right here, right now?

It surely looks like >95% of people commenting on RTX are either young and/or stupid, but programmable shaders were also a new "unneeded", "slow", "superficial", "do gamers really need it?" feature when they were introduced over ten years ago. Strangely, there are next to zero games nowadays which don't use programmable shaders.

I for one commend NVIDIA for their massive effort of bringing photorealistic lighting, shading and reflections to the masses.

Meanwhile, I'm not going to buy any RTXs just yet because they are way over my budget - the cheapest one costs more than my monthly salary - but I will most likely buy the RTX 3060 in 2019/2020.
So you're saying that the RTX series of graphics cards offers a technology that all gamers have been eagerly awaiting? And it's something all gamers are willing to pay an extra $100-300 for, depending on the level of card they're getting?

You clearly know me really well, young and stupid... Right...

I have no problem with technological advances, it just seems like this is a product that brings something that's barely usable to the table for a very, very high extra cost. Hence my comment that it'll most likely be good in 2-3 generations, once they've managed to improve the efficiency. However, launching it in a "beta" state and using its customers as beta testers is getting tiring, especially when they expect everyone to pay a large premium for a feature that brings little to no tangible benefit. Yes, a handful of games will look pretty, but it sounds like they'll play like a pig on anything apart from the 2080Ti, which most of us either can't afford, or are unwilling to invest in.

But please, Mr I Know Better Than You, go ahead, spend your hard-earned cash on a beta card; it's your money and you may spend it on whatever you like. Me? I would've preferred something a bit less expensive. Then again, it seems like you're not willing to do so either, yet you're talking smack when others are criticising a half-baked product. If it's so great, why aren't you getting one?
#14
R0H1T
birdie: You don't need to repeat the idiocy. NVIDIA had to start with RTX somewhere, and had they not done it now, it would have been postponed for several more years. AMD doesn't even have plans for RT at all - how long would we have to wait for them to realize that RT could be implemented right here, right now?

It surely looks like >95% of people commenting on RTX are either young and/or stupid, but programmable shaders were also a new "unneeded", "slow", "superficial", "do gamers really need it?" feature when they were introduced over ten years ago. Strangely, there are next to zero games nowadays which don't use programmable shaders.

I for one commend NVIDIA for their massive effort of bringing photorealistic lighting, shading and reflections to the masses.

Meanwhile, I'm not going to buy any RTXs just yet because they are way over my budget - the cheapest one costs more than my monthly salary - but I will most likely buy the RTX 3060 in 2019/2020.

One other thing most illiterate idiots fail to realize is that RTX makes game development a lot easier, quite a bit cheaper, and significantly faster, and the end result is just jaw-dropping.

P.S. Sorry for being a little bit harsh.
I think you might be surprised to learn how far ahead AMD is in their plans to reach parity, or even leapfrog Nvidia :cool:
#15
Crustybeaver
R0H1T: I think you might be surprised to learn how far ahead AMD is in their plans to reach parity, or even leapfrog Nvidia :cool:
AMD ahead :roll:
#16
dgianstefani
TPU Proofreader
Hurr Durr... Cost too high... whine whine...

Don't forget in your doomsaying that RTX isn't the only difference between 2xxx and 1xxx.

The 2080 Ti is the first single GPU that can push 4K at 120 fps.
DLSS is a major advancement in AA.
Each 20-series card is typically faster than the 10-series card one tier up, i.e. the 2070 is better than the 1080 and the 2080 is better than the 1080 Ti (except for the rare occasion when a game uses more than 8 GB of VRAM).
A major advancement in SLI, breathing new life into multi-GPU with NVLink.
RTX is only for those that aren't poor, i.e. those that buy the 2070 and up, at which point you have the kind of cash available to demand the best. The 2060 and below will just be faster versions of last-gen cards.

Nvidia is now 2x faster and 2x more power-efficient than AMD, their only competitor. Are you surprised they charge a hefty premium?

Cry more, but you live in a capitalist society where the objective of a business is to make money. If you don't want the product, don't buy it, but quit whinging.
#17
swirl09
W1zzard: I asked NVIDIA about that, and they say there will be "RTX off, low, med, high" settings for most games, not just "RTX on/off".
This is nice to know, because honestly there is *zero* temptation on my part to drop to 1080p in hopes of getting 60fps to play games with RT on.
#18
coolernoob
So now we know... even the RTX 2080 Ti won't add any value as a ray-tracing GPU in gaming - not now, nor in the future - and forget about the lower-tier GPUs (like the 2080 and 2070)! Want ray tracing in the future? Me too - let's wait for the RTX 3080 Ti, but for now let's value the RTX 2xxx as the poorest-value newly released GPUs ever. Enough of this "...well, maybe you will need it in upcoming ray-tracing titles".
#19
deu
TheLostSwede: As I posted elsewhere, RTX is a technology no gamer asked for, yet Nvidia spent millions (billions?) on making it a thing. It might very well be great in 2-3 generations, but right now it's something that offers little to no benefit, for a lot of extra cost. Thanks Nvidia.
RTX cards are developer tech released to the HEDT crowd (for now). If they wanted to add performance, they would have spent the die space on traditional CUDA cores and on sizing up existing tech. To be honest, the reason this happens now is the lack of competition; NVIDIA is ahead and has headroom for rolling out and defining the dominant design and standards, laying the groundwork for future dominance. If AMD REALLY wanted, they could release a Navi GPU beating the 2080 Ti in raw performance on 7 nm. Whether that GPU would have space and tech for RT too is doubtful, but it is probably too late for AMD to react to this "weak" performance boost. So in my opinion NVIDIA played their cards somewhat right (even though people hate them); in other words, right now the RTX cards' only competition is made by NVIDIA themselves, hence the price bump for a first-mover tech. RTX will be the content creator's poor man's Quadro and the newb-with-too-much-money card. Meanwhile, the 1080 Ti is NOT to be found on the used market (at least not in Denmark), due to the sudden relative performance/value bump. :)
#20
R0H1T
deu: RTX cards are developer tech released to the HEDT crowd (for now). If they wanted to add performance, they would have spent the die space on traditional CUDA cores and on sizing up existing tech. To be honest, the reason this happens now is the lack of competition; NVIDIA is ahead and has headroom for rolling out and defining the dominant design and standards, laying the groundwork for future dominance. If AMD REALLY wanted, they could release a Navi GPU beating the 2080 Ti in raw performance on 7 nm. Whether that GPU would have space and tech for RT too is doubtful, but it is probably too late for AMD to react to this "weak" performance boost. So in my opinion NVIDIA played their cards somewhat right (even though people hate them); in other words, right now the RTX cards' only competition is made by NVIDIA themselves, hence the price bump for a first-mover tech. RTX will be the content creator's poor man's Quadro and the newb-with-too-much-money card. Meanwhile, the 1080 Ti is NOT to be found on the used market (at least not in Denmark), due to the sudden relative performance/value bump. :)
While I agree with the part about Nvidia being ahead (in gaming), that's not the case in the context of Turing, especially wrt RT. The only reason Nvidia is charging so much for Turing is RT, & that's a half-baked solution anyway; you can choose to ignore the pricing & buy it, or wait for better VFM options to come along later.
#21
TheLostSwede
News Editor
deu: RTX cards are developer tech released to the HEDT crowd (for now). If they wanted to add performance, they would have spent the die space on traditional CUDA cores and on sizing up existing tech. To be honest, the reason this happens now is the lack of competition; NVIDIA is ahead and has headroom for rolling out and defining the dominant design and standards, laying the groundwork for future dominance. If AMD REALLY wanted, they could release a Navi GPU beating the 2080 Ti in raw performance on 7 nm. Whether that GPU would have space and tech for RT too is doubtful, but it is probably too late for AMD to react to this "weak" performance boost. So in my opinion NVIDIA played their cards somewhat right (even though people hate them); in other words, right now the RTX cards' only competition is made by NVIDIA themselves, hence the price bump for a first-mover tech. RTX will be the content creator's poor man's Quadro and the newb-with-too-much-money card. Meanwhile, the 1080 Ti is NOT to be found on the used market (at least not in Denmark), due to the sudden relative performance/value bump. :)
Very much so, and I'm not saying it's wrong that they're working on new technology. My issue is that they've pushed this out before it's really ready, by the looks of things. Only the top-end card appears to have sufficient processing power to do the technology justice, and only barely at that. Your reasoning makes a lot of sense though, as with no competition, they can do what they want and consumers just have to like the situation. I'm not even concerned about the so-so performance increase; the issue is that we're going from what was possibly the biggest single jump in performance we've seen in a very long time - for GPUs that were, at least initially, offered at a very attractive price point for the performance improvement on offer - to big, hot and very expensive GPUs that feel like a step backwards compared to the previous generation. Unfortunately the crypto craze ruined pricing altogether, which makes this higher price point from Nvidia even more disappointing.

I wonder if AMD really could compete, though; it seems like something went wrong with their architecture this time around, maybe because they relied too much on the abilities of their Chinese engineering team? I have a feeling they'll have to go back to the drawing board and come up with a vastly improved GPU design to be able to properly compete at the high end again.

It really sucks to be a consumer right now if you're looking at buying a high-end graphics card.
#22
rtwjunkie
PC Gaming Enthusiast
I think it's great technology, which will one day be usable at the same frame rates we expect today without RT (60 to 60++). But it's first-generation and not ready for that yet. All those who say non-purchasers of RTX will be left behind are wrong. It will be several generations of cards before this is a huge thing. By then, those who did not upgrade with this first gen will likely have upgraded once already, so the argument is moot.

Those who don't adopt the RTX 20xx series because of either cost or immature technology are perfectly OK in not doing so. Likewise, those who want to, by all means do so. RT will see its day in the affordable mainstream because it is great, just not at this time.
#23
Xzibit
W1zzard: I asked NVIDIA about that, and they say there will be "RTX off, low, med, high" settings for most games, not just "RTX on/off".
Did they go into detail about what that would be?

The Low, Med, High settings might just be exclusions of features.

Example: GI only being enabled at High, or shadows on Med & High only. Another way they can go is what the BF5 devs alluded to: lower LOD for RT effects (makes sense, but defeats the purpose of RT since you're back to "faking it"). We could end up with combinations of all these things.
#24
stimpy88
I have said this all along... You're a sucker if you buy anything less than the 2080 Ti, and you're still a sucker if you spend $1200 on a video card, especially if it can only run RTX games at 1080p 30 FPS!

nGreedia has you by the balls, and you like it!
#25
Vya Domus
HTC: This changes things immensely.
Not really. The big issue with RTX isn't just that it is computationally expensive and the RT cores aren't enough; it's the fact that this incurs a massive stall in the graphics pipeline. No matter what you do, as long as those RT cores are in use, the performance is going to be abominable. And I don't know why anyone would opt for the lowest RTX quality level: even at full force this is still a grossly approximated effect, and toning it down further would prove so detrimental you might as well turn it off altogether.