Thursday, April 13th 2023

NVIDIA Reveals Some RT and DLSS Statistics

Following the launch of the newest GeForce RTX 40-series graphics card, the GeForce RTX 4070, NVIDIA has revealed some numbers regarding the usage of ray tracing (RT) and Deep Learning Super Sampling (DLSS). Bear in mind that these numbers come only from users willing to share their data through GeForce Experience, so they do not show the complete picture; they do, however, show a rise in adoption rates. Of course, the number of games supporting RT and DLSS has also risen over the last few years.

According to NVIDIA, 83 percent of users running RTX 40-series graphics cards enabled RT, and 79 percent enabled DLSS. On the RTX 30 series, 56 percent of users enabled ray tracing and 71 percent enabled DLSS. The numbers were much lower for users running the RTX 20 series back in 2018, when 37 percent of users enabled RT and 26 percent enabled DLSS.
Source: NVIDIA

28 Comments on NVIDIA Reveals Some RT and DLSS Statistics

#1
Verpal
DLSS 1 is... well, I still have PTSD from that thing; no wonder 20-series people didn't turn it on back in 2018.

DLSS 2 has been great so far; version 2.5.1 and later made DLSS usable at more resolutions, and nowadays even 1080p can use DLSS to a reasonable degree.

Thanks to competition from FSR and XeSS, NVIDIA seems to be working hard on improving DLSS. It would be great if they made it open source; at least that would finally settle the question of how, and whether, DLSS utilizes the tensor cores.
Posted on Reply
#2
nguyen
Verpal: DLSS 1 is... well, I still have PTSD from that thing; no wonder 20-series people didn't turn it on back in 2018.

DLSS 2 has been great so far; version 2.5.1 and later made DLSS usable at more resolutions, and nowadays even 1080p can use DLSS to a reasonable degree.

Thanks to competition from FSR and XeSS, NVIDIA seems to be working hard on improving DLSS. It would be great if they made it open source; at least that would finally settle the question of how, and whether, DLSS utilizes the tensor cores.
Streamline is open source :D, but AMD-sponsored titles seem rather against implementing Streamline, for some reason.
Posted on Reply
#3
Vya Domus
More like users don't have a choice but to use DLSS if they want playable framerates with RT on. Most new games aren't playable with RT at all on the 2000 series, at least not without major compromises to resolution, and DLSS 1 sucked badly, so this isn't exactly an achievement.
Posted on Reply
#4
Sandbo
Vya Domus: More like users don't have a choice but to use DLSS if they want playable framerates with RT on. Most new games aren't playable with RT at all on the 2000 series, at least not without major compromises to resolution, and DLSS 1 sucked badly, so this isn't exactly an achievement.
Came to say the same: they basically tune games to a state where DLSS/FSR is an essential component for the game to be playable at the advertised quality. Not sure I'm happy to see that, especially when they lock DLSS 3 to the 4000 series; it means 2000/3000-series owners can expect crap quality in upcoming titles.
Posted on Reply
#5
Sithaer
Verpal: DLSS 1 is... well, I still have PTSD from that thing; no wonder 20-series people didn't turn it on back in 2018.

DLSS 2 has been great so far; version 2.5.1 and later made DLSS usable at more resolutions, and nowadays even 1080p can use DLSS to a reasonable degree.

Thanks to competition from FSR and XeSS, NVIDIA seems to be working hard on improving DLSS. It would be great if they made it open source; at least that would finally settle the question of how, and whether, DLSS utilizes the tensor cores.
I never had that experience with DLSS 1, since my first RT card is a 3060 Ti. DLSS was a selling point for me when I was deciding between cards, because I try to keep my cards for 2-3 years at least, so having one more upscaler option sounded better to me than FSR only.
So far I like DLSS 2 a lot, on the Quality setting at least, and I use it in almost every game that has it, even without RT turned on; to my eyes it often looks better than native TAA on a 2560x1080 monitor/res.

It also fixes some weird flickering issues I've noticed in some games, and I like the more detailed distant objects like wires and whatnot. (This I notice a lot more easily than the upscaling resolution difference.)
Vya Domus: More like users don't have a choice but to use DLSS if they want playable framerates with RT on.
That's also pretty much true in most cases. (Guardians of the Galaxy I could run maxed out + RT on high with no DLSS, but that's an older game by now.)
Tbh I like RT as a tech, and it does look good imo in some games like Cyberpunk/Control, but it's not a deal breaker for me if I can't use it; I mainly bought into it out of curiosity. (Those old retro games with RT reworks are kinda cool, though.)
Posted on Reply
#6
dgianstefani
TPU Proofreader
DLAA is the champion here for IQ. Literally the only downside is that many games don't support it yet. It can be patched into single-player games, but it's risky to do so in multiplayer games due to anti-cheat.
Posted on Reply
#7
Hyderz
Wait, what? Where do these numbers come from?
Posted on Reply
#8
Testsubject01
Hyderz: Wait, what? Where do these numbers come from?
NVIDIA driver software phoning home would be my guess.
Posted on Reply
#9
brutlern
And this is how you make misleading statistics. They're not lying; it's probably 100% true, but it is misleading. You can easily make a statistic show absolutely whatever you want as long as you put it in the right context. A more to-the-point statistic would be how many people with 4090s use RT vs. how many people with 2060s use RT. If you have the horsepower, of course you'll turn on RT, and if you don't have the horsepower, you won't. Another proper statistic would be how many users in total, across all graphics cards (old, new, AMD, Intel, integrated GPUs, etc.), actually turn on RT, or how many of all the games in existence actually have RT. Context is everything. For example, 99% of cars use wheels. Well, no s**t.
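The point about context can be shown with a quick toy calculation (all numbers below are hypothetical, not NVIDIA's): an aggregate adoption figure is just per-tier rates weighted by who owns which card, so the headline number mostly reflects the ownership mix rather than any single group's behavior.

```python
# Toy illustration with made-up numbers: the same aggregate headline can
# come from very different per-tier behavior, because the aggregate is a
# weighted mix of who owns which card.
tiers = {
    # tier: (sampled users, fraction who enabled RT) -- hypothetical
    "4090": (1_000, 0.95),
    "4080": (2_000, 0.90),
    "4070": (7_000, 0.79),
}

total_users = sum(n for n, _ in tiers.values())
rt_users = sum(n * frac for n, frac in tiers.values())
aggregate = rt_users / total_users

print(f"aggregate RT adoption: {aggregate:.0%}")  # prints: aggregate RT adoption: 83%
for tier, (n, frac) in tiers.items():
    print(f"  {tier}: {frac:.0%} of {n:,} users enabled RT")
```

Shift the same per-tier rates toward a 4070-heavy mix and the aggregate drops, with no one changing their behavior, which is exactly why the per-tier breakdown is the more honest statistic.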
Posted on Reply
#10
rv8000
Hyderz: Wait, what? Where do these numbers come from?
GeForce Experience data collection.

Correct me if I'm wrong, but don't the optimized settings for some cards, based on Nvidia's testing, auto-enable DLSS as part of the recommended settings? These numbers are BS on top of BS if a user chooses to let Experience decide graphics settings.
Posted on Reply
#11
sepheronx
My major gripe with DLSS and FSR is the ghosting from the TAA. It's horrendous, and on my 1440p LG monitor it's even worse: shimmering and ghosting, smearing of edges. It's most noticeable in Sons of the Forest and Red Dead 2, where skinning an animal or moving the character around a body of water gives that horrendous effect.

I feel the tech is necessary for lower-end cards, to give them enough oomph to play at higher detail and higher resolution. But the fact that you need it on $2,000 cards just to use some effects at a semi-playable state is troubling. Although I'm more inclined to blame game developers' piss-poor optimization skills than the hardware makers for that.
Posted on Reply
#12
Upgrayedd
I feel like everyone here just wants to bitch at nvidia and isn't seeing the larger picture.

They're using the user metrics to do marketing for them. "More people turn on RT the newer/faster their card is. If you want RT but have been waiting for consensus on when it will actually be viable for the mainstream, it's finally here" is what this says.

As for all the RTX technologies: y'all realize AMD isn't exactly out there innovating these things. They're like the Bollywood of graphics: "Oh, that's a good idea, we should come along about 10 months late with something similar."
Posted on Reply
#13
rv8000
Upgrayedd: I feel like everyone here just wants to bitch at nvidia and isn't seeing the larger picture.

They're using the user metrics to do marketing for them. "More people turn on RT the newer/faster their card is. If you want RT but have been waiting for consensus on when it will actually be viable for the mainstream, it's finally here" is what this says.

As for all the RTX technologies: y'all realize AMD isn't exactly out there innovating these things. They're like the Bollywood of graphics: "Oh, that's a good idea, we should come along about 10 months late with something similar."
This couldn’t be further from the truth. We have no idea what percentage of users actually install Experience, and Experience deploys recommended settings based on hardware and internal testing, which can mean many users aren’t actively choosing to enable these features, afaik.

Current RT implementations are typically shadows, AO, and reflections, far from a true ray-traced experience; no fully ray/path-traced game can run on existing hardware, or really exists.

DLSS, FSR, and XeSS are far from solutions. They have certainly gotten better and can serve as a crutch for certain hardware, but ultimately you’re degrading the visual quality of your game in the end. Whether it’s ghosting, increased latency, artifacting, or improperly rendered UI/inserted frames, it will never be as good as a non-upscaled frame.

Also, why do you feel the need to defend Nvidia? How is that different from wanting to complain about something?
Posted on Reply
#14
dyonoctis
rv8000: GeForce Experience data collection.

Correct me if I'm wrong, but don't the optimized settings for some cards, based on Nvidia's testing, auto-enable DLSS as part of the recommended settings? These numbers are BS on top of BS if a user chooses to let Experience decide graphics settings.
Yes, but the user can preview what the optimized settings are and choose to apply them. GFE doesn't auto-optimize all games upon installation. If the user doesn't change it, it means they're not bothered by it. GFE doesn't point a gun at the user's head if they want to see how the game looks with DLSS vs. native.
Posted on Reply
#15
rv8000
dyonoctis: Yes, but the user can preview what the optimized settings are and choose to apply them. GFE doesn't auto-optimize all games upon installation. If the user doesn't change it, it means they're not bothered by it. GFE doesn't point a gun at the user's head if they want to see how the game looks with DLSS vs. native.
Again, not the point. If the user just installs Experience and gives it free rein, it pollutes the data. I'm not saying Nvidia is holding a gun to the end user's head, but without knowing what the average user decides, DLSS could be enabled and the majority of less tech-savvy people would never know.
Posted on Reply
#16
FierceRed
Verpal: DLSS 1 is... well, I still have PTSD from that thing; no wonder 20-series people didn't turn it on back in 2018.

DLSS 2 has been great so far; version 2.5.1 and later made DLSS usable at more resolutions, and nowadays even 1080p can use DLSS to a reasonable degree.

Thanks to competition from FSR and XeSS, NVIDIA seems to be working hard on improving DLSS. It would be great if they made it open source; at least that would finally settle the question of how, and whether, DLSS utilizes the tensor cores.
I've always been a little confused about DLSS 2 and whether it's the "correct" version.

I get that I can go into a game and select the setting that turns it on, but is there a way...
  • To know what version the game is running?
  • To know what the latest version is?
  • Is upgrading to the latest merely replacing a DLL?
  • Why isn't this just continually deployed with driver packages, like the PhysX driver is?
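For what it's worth, for DLSS 2.x the per-game component is typically a single library, nvngx_dlss.dll, shipped in the game's install folder, and community DLL-swap tools do little more than copy a newer build over it. A minimal sketch of that idea (the helper name and paths are hypothetical, and games with file-integrity checks may reject or restore the old DLL):

```python
import shutil
from pathlib import Path

def swap_dlss_dll(game_dir: Path, new_dll: Path) -> Path:
    """Replace a game's nvngx_dlss.dll with a newer copy, keeping a backup.

    Hypothetical helper for illustration only: install paths vary per
    game/store, and anti-cheat or integrity checks can undo the swap.
    """
    target = game_dir / "nvngx_dlss.dll"
    backup = target.with_name(target.name + ".bak")
    shutil.copy2(target, backup)   # keep the original in case of regressions
    shutil.copy2(new_dll, target)  # drop in the newer DLSS library
    return backup
```

Restoring is just the reverse copy from the `.bak` file, which is why the swap is low-risk in single-player titles.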
Posted on Reply
#17
apoklyps3
They should not be allowed to include DLSS performance in the benchmark numbers in their promotional materials.
NVIDIA should improve real performance and lower prices.
I'm not interested in fake frames that create on-screen glitches.
Posted on Reply
#18
dyonoctis
rv8000: Again, not the point. If the user just installs Experience and gives it free rein, it pollutes the data. I'm not saying Nvidia is holding a gun to the end user's head, but without knowing what the average user decides, DLSS could be enabled and the majority of less tech-savvy people would never know.
Well, that's an issue common to a lot of data collected by telemetry: the default setting of any product will be the most common occurrence vs. something that's been manually tweaked (unless something in the default is really hampering the experience).
I don't have an issue with the fact that you're among the people who would rather have upscaling disappear, but if, for you, turning DLSS off by default is the honest way of doing things, then for the corporation that would send the message "we don't actually believe that DLSS provides an increased framerate with good image quality," and at that point they might as well give up on the tech. (They also know that people who are not tech-savvy would, well, not even know what DLSS is or that it exists, and therefore never try it out. That's always an issue when you implement a new feature. Some software has pop-ups telling you to try the new features, which people find annoying, but so many people don't bother to read update notes and just stick to what they already know.)
It's an eternal debate that I'm seeing everywhere: one side thinks that pure rasterization without upscaling should be the only way until we can somehow use DXR without any performance hit on the first try, at the same level as an offline renderer, and the other side is interested in playing around with these new techs.
Nvidia and pure gamers have just had a massive conflict of interest since Turing. :D Someone at Nvidia strongly believes that making a GPU only good at raster games would not be a good business decision, and that person will not give up.
Posted on Reply
#19
Upgrayedd
rv8000: This couldn’t be further from the truth. We have no idea what percentage of users actually install Experience, and Experience deploys recommended settings based on hardware and internal testing, which can mean many users aren’t actively choosing to enable these features, afaik.

Current RT implementations are typically shadows, AO, and reflections, far from a true ray-traced experience; no fully ray/path-traced game can run on existing hardware, or really exists.

DLSS, FSR, and XeSS are far from solutions. They have certainly gotten better and can serve as a crutch for certain hardware, but ultimately you’re degrading the visual quality of your game in the end. Whether it’s ghosting, increased latency, artifacting, or improperly rendered UI/inserted frames, it will never be as good as a non-upscaled frame.

Also, why do you feel the need to defend Nvidia? How is that different from wanting to complain about something?
Well, it's GFE only, so we know that much. Even if it's not 100% of the crowd, it still counts. I would confidently bet that the majority of RTX owners don't go through the hassle of removing GFE. TPU members are not the majority.

CP2077 is path traced now. Cards can do PT, just not at 4K like everyone wants to compare with. I checked out CP2077 with PT on my 3090; it's about 50-60 fps at 1080p. Not ideal, but also not unplayable.

Not really defending them, just looking at what they're saying without hate goggles on.
Posted on Reply
#20
Mistral
"83% of 40 series gamers turn RT on"...

OK, and where are the stats on how many turn it off after?
And the other 17%? Imagine paying for a 40-series card and not using RT...
Posted on Reply
#21
rv8000
Upgrayedd: Well, it's GFE only, so we know that much. Even if it's not 100% of the crowd, it still counts. I would confidently bet that the majority of RTX owners don't go through the hassle of removing GFE. TPU members are not the majority.

CP2077 is path traced now. Cards can do PT, just not at 4K like everyone wants to compare with. I checked out CP2077 with PT on my 3090; it's about 50-60 fps at 1080p. Not ideal, but also not unplayable.

Not really defending them, just looking at what they're saying without hate goggles on.
It doesn’t matter if it’s GFE only. There is no distinction between someone actively turning DLSS on, enabling RT (which auto-enables DLSS in most games), or letting GFE handle all of their graphics settings. The distinction is important; it is otherwise horrendously disingenuous to state that 70+% of users actively choose to use it and laud it like it’s every gamer under the sun. It doesn’t help that, of the entire collection of games, very VERY few even support DLSS. Regardless of how interesting the tech is, it’s important that people understand marketing stunts like this from any company.

Your 3090 gets 50 to 60 fps? A 4090 gets sub-30 fps. Enable DLSS 3 and you’re borderline getting playable fps, which brings you back to the huge caveat: why are you paying $1,500+ for a GPU for better and more realistic image quality, and then actively making the image quality worse by enabling upscalers? The universal truth is that no current hardware is capable of running a path-traced game, let alone one that hasn’t been designed from the ground up in an exclusively path-traced engine. The idea that it’s anything but is an oxymoron.

Ray tracing is cool, software technologies are cool and interesting, but see it for what it actually is.
Posted on Reply
#22
Vayra86
Upgrayedd: Well, it's GFE only, so we know that much. Even if it's not 100% of the crowd, it still counts. I would confidently bet that the majority of RTX owners don't go through the hassle of removing GFE. TPU members are not the majority.

CP2077 is path traced now. Cards can do PT, just not at 4K like everyone wants to compare with. I checked out CP2077 with PT on my 3090; it's about 50-60 fps at 1080p. Not ideal, but also not unplayable.

Not really defending them, just looking at what they're saying without hate goggles on.
It is kinda true the tech is moving forward, and it had better, because in its current state it's still hard to see it beat raster out of the equation.

But the way it does doesn't sit well with me. I much prefer the AMD approach, where raster is basically what drives the GPU and there's no supposed special-sauce core doing very limited work. I also much prefer the chiplet approach moving into GPUs now. After all, does it really matter if the execution of RT is 20-30% slower if you can offer that much more actual GPU for the money? That's where the real performance win is going to be if we want generational improvements to actually be, and remain, more than 'shrink me and add a bigger feature set'. Because that is what Ada is, what Ampere was, and what Turing started: Nvidia needed a shrink at every turn to actually create something new.

For me, RT is mostly just a fancy way to absolutely destroy performance so companies get to sell smaller dies at higher cost, and we have live examples of that as we speak. As long as they're playing that game, I'm not buying a card 'for RT'. The tech will only truly become commonplace if it is in fact supported in common ways.
Posted on Reply
#23
HOkay
Mistral: "83% of 40 series gamers turn RT on"...

OK, and where are the stats on how many turn it off after?
And the other 17%? Imagine paying for a 40-series card and not using RT...
Exactly this. Obviously I turn it on in every new game I get, to have a look, and then I turn it off to actually play the game, because I prefer the higher fps over the RT effects any day. I'm actually amazed 17% of people with 40-series cards haven't even tried RT!
Posted on Reply
#24
Upgrayedd
rv8000: It doesn’t matter if it’s GFE only. There is no distinction between someone actively turning DLSS on, enabling RT (which auto-enables DLSS in most games), or letting GFE handle all of their graphics settings. The distinction is important; it is otherwise horrendously disingenuous to state that 70+% of users actively choose to use it and laud it like it’s every gamer under the sun. It doesn’t help that, of the entire collection of games, very VERY few even support DLSS. Regardless of how interesting the tech is, it’s important that people understand marketing stunts like this from any company.

Your 3090 gets 50 to 60 fps? A 4090 gets sub-30 fps. Enable DLSS 3 and you’re borderline getting playable fps, which brings you back to the huge caveat: why are you paying $1,500+ for a GPU for better and more realistic image quality, and then actively making the image quality worse by enabling upscalers? The universal truth is that no current hardware is capable of running a path-traced game, let alone one that hasn’t been designed from the ground up in an exclusively path-traced engine. The idea that it’s anything but is an oxymoron.

Ray tracing is cool, software technologies are cool and interesting, but see it for what it actually is.
You're talking about 4K again.
Posted on Reply
#25
MarsM4N
Mistral: "83% of 40 series gamers turn RT on"...

OK, and where are the stats on how many turn it off after?
And the other 17%? Imagine paying for a 40-series card and not using RT...
Ohh NO, someone over here is exposing our deceptive PR strategies! Get him! :D Recessions are the perfect time for scam artists out to make a quick buck off naive and desperate people, and you have tons of people happy to fall for it. Instead of starting to learn to think for themselves, you have folks out there who just smell a bashing of their loyal brand. Which is fine, because it just exposes the intellect of the people you're dealing with.

The Nvidia statistic (or let's say the part of the statistic they used in their favour) just shows that xx% activated RT/DLSS once in a given timeframe. It doesn't tell you for how long, or what percentage of their whole gaming time they used it. They also don't tell you whether they designed their software in a deceptive way to steer customers into clicking on it. They also don't tell you whether they paid off game devs to design their setups/games in a deceptive way to peddle folks into activating given features. Deceptive PR and statistics manipulation has been a science for decades, invented by Edward Bernays.

Posted on Reply