
NVIDIA Reveals Some RT and DLSS Statistics

GFreeman

News Editor
Staff member
Joined
Mar 6, 2023
Messages
1,530 (2.43/day)
Following the launch of the newest GeForce RTX 40 series graphics card, the GeForce RTX 4070, NVIDIA has revealed some numbers regarding the usage of ray tracing (RT) and Deep Learning Super Sampling (DLSS). Bear in mind that these numbers come only from users willing to share their data with GeForce Experience, so they do not show the complete picture; on the other hand, they do show a rise in adoption. Of course, the number of games supporting RT and DLSS has also risen over the last few years.

According to NVIDIA, 83 percent of users running RTX 40 series graphics cards enabled RT, and 79 percent enabled DLSS. On the RTX 30 series, 56 percent of users enabled ray tracing and 71 percent enabled DLSS. The numbers were much lower for users running RTX 20 series cards back in 2018, when 37 percent of users enabled RT and 26 percent enabled DLSS.



View at TechPowerUp Main Site | Source
 
Joined
May 7, 2020
Messages
261 (0.16/day)
DLSS 1 is.... well, I still have PTSD from that thing; no wonder 20 series people didn't turn it on back in 2018.

DLSS 2 has been great so far. Version 2.5.1 and later made DLSS usable at more resolutions; nowadays even 1080p can use DLSS to a reasonable degree.

Thanks to competition from FSR and XeSS, NVIDIA seems to be working hard on improving DLSS. It would be great if they made it open source; at least that would finally settle the question of how, and whether, DLSS utilizes the tensor cores.
 
Joined
Nov 11, 2016
Messages
3,411 (1.16/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
DLSS 1 is.... well, I still have PTSD from that thing; no wonder 20 series people didn't turn it on back in 2018.

DLSS 2 has been great so far. Version 2.5.1 and later made DLSS usable at more resolutions; nowadays even 1080p can use DLSS to a reasonable degree.

Thanks to competition from FSR and XeSS, NVIDIA seems to be working hard on improving DLSS. It would be great if they made it open source; at least that would finally settle the question of how, and whether, DLSS utilizes the tensor cores.

Streamline is open source :D, but AMD-sponsored titles seem reluctant to implement Streamline, for some reason.
 
Joined
Jan 8, 2017
Messages
9,436 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
More like users don't have a choice but to use DLSS if they want playable framerates with RT on. Most new games are not playable with RT at all on the 2000 series, at least not without major compromises to resolution, and DLSS 1 sucked badly, so this isn't exactly an achievement.
 
Joined
Apr 6, 2015
Messages
250 (0.07/day)
Location
Japan
System Name ChronicleScienceWorkStation
Processor AMD Threadripper 1950X
Motherboard Asrock X399 Taichi
Cooling Noctua U14S-TR4
Memory G.Skill DDR4 3200 C14 16GB*4
Video Card(s) AMD Radeon VII
Storage Samsung 970 Pro*1, Kingston A2000 1TB*2 RAID 0, HGST 8TB*5 RAID 6
Case Lian Li PC-A75X
Power Supply Corsair AX1600i
Software Proxmox 6.2
More like users don't have a choice but to use DLSS if they want playable framerates with RT on. Most new games are not playable with RT at all on the 2000 series, at least not without major compromises to resolution, and DLSS 1 sucked badly, so this isn't exactly an achievement.
Came to say the same. They basically tune the games to a state where DLSS/FSR is an essential component for a game to be playable at the advertised quality. Not sure I am happy to see that; especially with DLSS 3 locked to the 4000 series, it means 2000/3000 series owners can expect crap quality in upcoming titles.
 
Joined
Jun 27, 2019
Messages
2,109 (1.07/day)
Location
Hungary
System Name I don't name my systems.
Processor i5-12600KF 'stock power limits/-115mV undervolt+contact frame'
Motherboard Asus Prime B660-PLUS D4
Cooling ID-Cooling SE 224 XT ARGB V3 'CPU', 4x Be Quiet! Light Wings + 2x Arctic P12 black case fans.
Memory 4x8GB G.SKILL Ripjaws V DDR4 3200MHz
Video Card(s) Asus TuF V2 RTX 3060 Ti @1920 MHz Core/@950mV Undervolt
Storage 4 TB WD Red, 1 TB Silicon Power A55 Sata, 1 TB Kingston A2000 NVMe, 256 GB Adata Spectrix s40g NVMe
Display(s) 29" 2560x1080 75Hz / LG 29WK600-W
Case Be Quiet! Pure Base 500 FX Black
Audio Device(s) Onboard + Hama uRage SoundZ 900+USB DAC
Power Supply Seasonic CORE GM 500W 80+ Gold
Mouse Canyon Puncher GM-20
Keyboard SPC Gear GK630K Tournament 'Kailh Brown'
Software Windows 10 Pro
DLSS 1 is.... well, I still have PTSD from that thing; no wonder 20 series people didn't turn it on back in 2018.

DLSS 2 has been great so far. Version 2.5.1 and later made DLSS usable at more resolutions; nowadays even 1080p can use DLSS to a reasonable degree.

Thanks to competition from FSR and XeSS, NVIDIA seems to be working hard on improving DLSS. It would be great if they made it open source; at least that would finally settle the question of how, and whether, DLSS utilizes the tensor cores.

I never had the experience with DLSS 1, since my first RT card is a 3060 Ti, and DLSS was a selling point for me when I was deciding between cards, because I try to keep my cards for at least 2-3 years, so having one more upscaler option sounded better to me than FSR only.
So far I like DLSS 2 a lot, at least on the Quality setting, and I use it in almost every game that has it, even without RT turned on; to my eyes it often looks better than native TAA on a 2560x1080 monitor/res.

It also fixes some weird flickering issues I've noticed in some games, and I like the more detailed distant objects like wires and whatnot. 'this I notice a lot more easily than the upscaling resolution difference'

More like users don't have a choice but to use DLSS if they want playable framerates with RT on.

That's also pretty much true in most cases. 'Guardians of the Galaxy I could run maxed out + RT on high with no DLSS, but that's an older game by now'
Tbh I like RT as a tech and it does look good imo in some games like Cyberpunk/Control, but it's not a deal breaker for me if I can't use it, and I mainly bought into it out of curiosity. 'those old retro games with RT reworks are kinda cool tho'
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,029 (1.99/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 white
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper Pro V2 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
DLAA is the champion here for IQ. The only real downside is that many games don't support it yet. It can be patched into single-player games, but it's risky to do so in multiplayer games due to anti-cheat.
 
Joined
Aug 12, 2019
Messages
2,179 (1.13/day)
Location
LV-426
System Name Custom
Processor i9 9900k
Motherboard Gigabyte Z390 arous master
Cooling corsair h150i
Memory 4x8 3200mhz corsair
Video Card(s) Galax RTX 3090 EX Gamer White OC
Storage 500gb Samsung 970 Evo PLus
Display(s) MSi MAG341CQ
Case Lian Li Pc-011 Dynamic
Audio Device(s) Arctis Pro Wireless
Power Supply 850w Seasonic Focus Platinum
Mouse Logitech G403
Keyboard Logitech G110
Wait, what? Where do these numbers come from?
 
Joined
Jun 18, 2018
Messages
158 (0.07/day)
Processor i5 3570K - 4x @ 5GHz (1.32V) - de-lid
Motherboard ASRock Z77 Extrem 4
Cooling Prolimatech Genesis (3x Thermalright TY-141 PWM) - direct die
Memory 2x 4GB Corsair VENGEANCE DDR3 1600 MHz CL 9
Video Card(s) MSI GTX 980 Gaming 4G (Alpenföhn Peter + 2x Noctua NF-A12) @1547 Mhz Core / 2000 MHz Mem
Storage 500GB Crucial MX500 / 4 TB Seagate - BarraCuda
Case IT-9001 Smasher -modified with a 140mm top exhaust
Audio Device(s) AKG K240 Studio
Power Supply be quiet! E9 Straight Power 600W
Joined
Mar 17, 2011
Messages
117 (0.02/day)
Location
Satu Mare, Romania
System Name It's a box, it games, what more do you want?
Processor Ryzen 7 5800X3D
Motherboard TUF GAMING B550-PLUS
Cooling EVGA CLC 240 Liquid
Memory 2 x 16 Gb DDR4 G.Skill Trident Z Neo @ 3600Mhz CL14
Video Card(s) XFX AMD Radeon RX 7900 XT MERC310
Storage WD BLACK™ SN770 Gen.4 1TB NVME M.2, Crucial MX300 275 GB M.2 2280, WD Red 4TB SATA-III 256M
Display(s) Dell Alienware AW3423DWF
Case Phanteks Eclipse P400A Digital RGB Black
Power Supply SuperFlower Legion GX Pro 850W
Mouse Mionix Naos QG
Keyboard Logitech G513 Carbon RGB Romer-G Tactile
Software Windows 11 x64
And this is how you make misleading statistics. They are not lying; it's probably 100% true, but it is misleading. You can easily make any statistic show absolutely whatever you want, as long as you put it in the right context. A more to-the-point statistic would be how many people with 4090s use RT vs. how many people with 2060s use RT. If you have the horsepower, of course you'll turn on RT, and if you don't have the horsepower, you won't turn it on. Another proper statistic would be how many users in total, across all graphics cards (old, new, AMD, Intel, integrated GPUs, etc.), actually turn on RT, or how many games of all the games in existence actually have RT. Context is everything. For example, 99% of cars use wheels. Well, no s**t.
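To make the per-tier point concrete, here is a toy calculation with completely made-up adoption rates and sample sizes (none of these figures come from NVIDIA): if the surveyed population skews toward high-end cards, the headline number mostly reflects who bought the cards, not how broadly appealing RT is.

```python
# Hypothetical per-model RT adoption rates and sample sizes, for illustration only.
cohorts = {
    "RTX 4090": (0.95, 40_000),  # (share enabling RT, users sampled)
    "RTX 4080": (0.88, 60_000),
    "RTX 4070": (0.70, 50_000),
}

enabled = sum(rate * n for rate, n in cohorts.values())
total = sum(n for _, n in cohorts.values())

# Aggregate comes out around 84%, dominated by the high-end cohorts even
# though the cheapest card in this made-up sample sits at 70%.
print(f"Aggregate RT adoption: {enabled / total:.0%}")
```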
 
Joined
Apr 14, 2018
Messages
655 (0.27/day)
Wait, what? Where do these numbers come from?

GeForce Experience data collection.

Correct me if I'm wrong, but don't the optimized settings for some cards, based on NVIDIA's testing, auto-enable DLSS as part of the recommended settings? These numbers are BS on top of BS if a user chooses to let Experience decide their graphics settings.
 
Joined
Sep 28, 2005
Messages
3,323 (0.47/day)
Location
Canada
System Name PCGR
Processor 12400f
Motherboard Asus ROG STRIX B660-I
Cooling Stock Intel Cooler
Memory 2x16GB DDR5 5600 Corsair
Video Card(s) Dell RTX 3080
Storage 1x 512GB Mmoment PCIe 3 NVME 1x 2TB Corsair S70
Display(s) LG 32" 1440p
Case Phanteks Evolve itx
Audio Device(s) Onboard
Power Supply 750W Cooler Master sfx
Software Windows 11
My major gripe with DLSS and FSR is the ghosting from the TAA. It's horrendous, and on my 1440p LG monitor it's even worse. Shimmering and ghosting, smearing of edges. Most noticeable in Sons of the Forest and Red Dead Redemption 2, where skinning an animal or moving the character around a body of water gives that horrendous effect.

I feel the tech is necessary for lower-end cards, to give them enough oomph to play at higher detail and higher resolution. But the fact that you need it on $2,000 cards just to use some effects at a semi-playable state is troubling. Although I'm more inclined to blame game developers and their piss-poor optimization skills than the hardware makers for that.
 
Joined
Mar 14, 2014
Messages
1,390 (0.36/day)
Processor 11900K
Motherboard ASRock Z590 OC Formula
Cooling Noctua NH-D15 using 2x140mm 3000RPM industrial Noctuas
Memory G. Skill Trident Z 2x16GB 3600MHz
Video Card(s) eVGA RTX 3090 FTW3
Storage 2TB Crucial P5 Plus
Display(s) 1st: LG GR83Q-B 1440p 27in 240Hz / 2nd: Lenovo y27g 1080p 27in 144Hz
Case Lian Li Lancool MESH II RGB (I removed the RGB)
Audio Device(s) AKG Q701's w/ O2+ODAC (Sounds a little bright)
Power Supply Seasonic Prime 850 TX
Mouse Glorious Model D
Keyboard Glorious MMK2 65% Lynx MX switches
Software Win10 Pro
I feel like everyone here just wants to bitch at NVIDIA and isn't seeing the larger picture.

They're using the user metrics to do marketing for them. What this says is: "More people turn on RT the newer/faster their card is. If you want RT but have been waiting for consensus on when it will actually be viable for the mainstream, then it's finally here."

As far as all the RTX technologies go, y'all realize AMD isn't exactly out there innovating these things. They're like the Bollywood of graphics: "Oh, that's a good idea, we should come along about 10 months late with something similar."
 
Joined
Apr 14, 2018
Messages
655 (0.27/day)
I feel like everyone here just wants to bitch at NVIDIA and isn't seeing the larger picture.

They're using the user metrics to do marketing for them. What this says is: "More people turn on RT the newer/faster their card is. If you want RT but have been waiting for consensus on when it will actually be viable for the mainstream, then it's finally here."

As far as all the RTX technologies go, y'all realize AMD isn't exactly out there innovating these things. They're like the Bollywood of graphics: "Oh, that's a good idea, we should come along about 10 months late with something similar."

This couldn't be further from the truth. We have no idea what percentage of users actually install Experience, and Experience deploys recommended settings based on hardware and internal testing, which can mean many users aren't actively choosing to enable these features, afaik.

Current RT implementations are typically shadows, AO, and reflections, far from a true ray-traced experience; no fully ray/path-traced game can run on existing hardware, or really exists.

DLSS, FSR, and XeSS are far from solutions. They have certainly gotten better and can serve as a crutch for certain hardware, but ultimately you're degrading the visual quality of your game. Whether it's ghosting, increased latency, artifacting, or improperly rendered UI/inserted frames, it will never be as good as a non-upscaled frame.

Also, why then do you feel the need to defend NVIDIA? How is that different from wanting to complain about something?
 
Joined
Oct 28, 2012
Messages
1,190 (0.27/day)
Processor AMD Ryzen 3700x
Motherboard asus ROG Strix B-350I Gaming
Cooling Deepcool LS520 SE
Memory crucial ballistix 32Gb DDR4
Video Card(s) RTX 3070 FE
Storage WD sn550 1To/WD ssd sata 1To /WD black sn750 1To/Seagate 2To/WD book 4 To back-up
Display(s) LG GL850
Case Dan A4 H2O
Audio Device(s) sennheiser HD58X
Power Supply Corsair SF600
Mouse MX master 3
Keyboard Master Key Mx
Software win 11 pro
GeForce Experience data collection.

Correct me if I'm wrong, but don't the optimized settings for some cards, based on NVIDIA's testing, auto-enable DLSS as part of the recommended settings? These numbers are BS on top of BS if a user chooses to let Experience decide their graphics settings.
Yes, but the user can preview what the optimized settings are and choose to apply them. GFE doesn't auto-optimize all games upon installation. If the user doesn't change a setting, it means they're not bothered by it. GFE doesn't point a gun at the user's head if they want to see how the game looks with DLSS vs. native.
 
Joined
Apr 14, 2018
Messages
655 (0.27/day)
Yes, but the user can preview what the optimized settings are and choose to apply them. GFE doesn't auto-optimize all games upon installation. If the user doesn't change a setting, it means they're not bothered by it. GFE doesn't point a gun at the user's head if they want to see how the game looks with DLSS vs. native.

Again, not the point. If the user just installs Experience and gives it free rein, it pollutes the data. I'm not saying NVIDIA is holding a gun to the end user's head, but without knowing what the average user decides, DLSS could be enabled and the majority of less tech-savvy people would never know.
 
Joined
Mar 4, 2011
Messages
320 (0.06/day)
Location
Canada
System Name Something Esoteric 2
Processor Ryzen 7 7800X3D
Motherboard ASUS Prime B650-Plus
Cooling Corsair H150i Elite Capellix 360MM AIO
Memory 64GB Corsair Vengeance 6000Mhz DDR5
Video Card(s) MSI Ventus RTX 3090 OC 24GB
Storage WD Black SN850X 2TB NVMe
Display(s) 2 x Dell S2721DGF IPS
Case Corsair 4000D Airflow Tempered Glass
Audio Device(s) Samsung Buds2 Pro, SteelSeries Siberia 800
Power Supply EVGA 1200W P3 80+ Platinum
Mouse Logitech G903
Keyboard Microsoft Sidewinder X6
Software Windows 11 Pro
DLSS 1 is.... well, I still have PTSD from that thing; no wonder 20 series people didn't turn it on back in 2018.

DLSS 2 has been great so far. Version 2.5.1 and later made DLSS usable at more resolutions; nowadays even 1080p can use DLSS to a reasonable degree.

Thanks to competition from FSR and XeSS, NVIDIA seems to be working hard on improving DLSS. It would be great if they made it open source; at least that would finally settle the question of how, and whether, DLSS utilizes the tensor cores.

I've always been a little confused about DLSS 2 and whether it's the "correct" version.

I get that I can go into a game and select the setting that turns it on, but is there a way...
  • To know what version the game is running?
  • To know what the latest version is?
  • Is upgrading to the latest merely replacing a DLL?
  • Why isn't this just continually deployed with driver packages, like the PhysX driver is?
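On the first and third bullets: a common way to check is to look at the nvngx_dlss.dll that ships in the game's install folder; its file version is the DLSS version the game is using, and upgrading is generally just swapping in a newer copy of that DLL (though not every game accepts a swapped DLL). A minimal sketch, assuming Windows and the pywin32 package, with a hypothetical game path:

```python
# Rough sketch: find nvngx_dlss.dll under a game folder and read its file version.
from pathlib import Path
import win32api  # pip install pywin32

def dlss_dll_version(game_dir: str) -> str | None:
    """Return the file version of the first nvngx_dlss.dll found under game_dir."""
    for dll in Path(game_dir).rglob("nvngx_dlss.dll"):
        info = win32api.GetFileVersionInfo(str(dll), "\\")
        ms, ls = info["FileVersionMS"], info["FileVersionLS"]
        return f"{ms >> 16}.{ms & 0xFFFF}.{ls >> 16}.{ls & 0xFFFF}"
    return None  # no DLSS DLL found

print(dlss_dll_version(r"C:\Games\SomeGame"))  # hypothetical path; e.g. prints '2.5.1.0'
```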
 
Joined
Dec 28, 2013
Messages
151 (0.04/day)
They should not be allowed to include DLSS performance as benchmark numbers in their promotional materials.
NVIDIA should improve real performance and lower prices.
Not interested in fake frames that create on-screen glitches.
 
Joined
Oct 28, 2012
Messages
1,190 (0.27/day)
Processor AMD Ryzen 3700x
Motherboard asus ROG Strix B-350I Gaming
Cooling Deepcool LS520 SE
Memory crucial ballistix 32Gb DDR4
Video Card(s) RTX 3070 FE
Storage WD sn550 1To/WD ssd sata 1To /WD black sn750 1To/Seagate 2To/WD book 4 To back-up
Display(s) LG GL850
Case Dan A4 H2O
Audio Device(s) sennheiser HD58X
Power Supply Corsair SF600
Mouse MX master 3
Keyboard Master Key Mx
Software win 11 pro
Again, not the point. If the user just installs Experience and gives it free rein, it pollutes the data. I'm not saying NVIDIA is holding a gun to the end user's head, but without knowing what the average user decides, DLSS could be enabled and the majority of less tech-savvy people would never know.
Well, that's an issue common to a lot of data collected by telemetry: the default setting of any product will be the most common occurrence vs. something that's been manually tweaked... (unless something in the default is really hampering the experience).
I don't have an issue with the fact that you're among the people who would rather have upscaling disappear, but if turning DLSS off by default is, for you, the honest way of doing things, then for the corporation that would send the message "we don't actually believe that DLSS provides an increased framerate with good image quality", at which point they might as well just give up on the tech. (They also know that people who are not tech-savvy would... well, not even know DLSS exists, and therefore never try it out. That's always an issue when you introduce a new feature. Some software has those pop-ups telling you to try the new features, which people find annoying, but so many people don't bother to read the update notes and just stick to what they already know.)
It's an eternal debate that I'm seeing everywhere: one side thinks that pure rasterization without upscaling should be the only way until we can somehow use DXR without any performance hit on the first try, at the same level as an offline renderer. The other side is interested in playing around with these new techs.
NVIDIA and pure gamers have had a massive conflict of interest since Turing. :D Someone at NVIDIA strongly believes that making a GPU that's only good at raster games would not be a good business decision, and that person will not give up.
 
Last edited:
Joined
Mar 14, 2014
Messages
1,390 (0.36/day)
Processor 11900K
Motherboard ASRock Z590 OC Formula
Cooling Noctua NH-D15 using 2x140mm 3000RPM industrial Noctuas
Memory G. Skill Trident Z 2x16GB 3600MHz
Video Card(s) eVGA RTX 3090 FTW3
Storage 2TB Crucial P5 Plus
Display(s) 1st: LG GR83Q-B 1440p 27in 240Hz / 2nd: Lenovo y27g 1080p 27in 144Hz
Case Lian Li Lancool MESH II RGB (I removed the RGB)
Audio Device(s) AKG Q701's w/ O2+ODAC (Sounds a little bright)
Power Supply Seasonic Prime 850 TX
Mouse Glorious Model D
Keyboard Glorious MMK2 65% Lynx MX switches
Software Win10 Pro
This couldn't be further from the truth. We have no idea what percentage of users actually install Experience, and Experience deploys recommended settings based on hardware and internal testing, which can mean many users aren't actively choosing to enable these features, afaik.

Current RT implementations are typically shadows, AO, and reflections, far from a true ray-traced experience; no fully ray/path-traced game can run on existing hardware, or really exists.

DLSS, FSR, and XeSS are far from solutions. They have certainly gotten better and can serve as a crutch for certain hardware, but ultimately you're degrading the visual quality of your game. Whether it's ghosting, increased latency, artifacting, or improperly rendered UI/inserted frames, it will never be as good as a non-upscaled frame.

Also, why then do you feel the need to defend NVIDIA? How is that different from wanting to complain about something?
Well, it's GFE only, so we know that. Even if it's not 100% of the crowd, it still counts. I would confidently bet that the majority of RTX owners don't go through the hassle of removing GFE. TPU members are not the majority.

CP2077 is path traced now. Cards can do PT, just not at 4K like everyone wants to compare with. I checked out CP2077 with PT on my 3090; it's about 50-60 fps at 1080p. Not ideal, but also not unplayable.

Not really defending them, just looking at what they're saying without the hate goggles on.
 
Joined
Feb 23, 2008
Messages
1,064 (0.17/day)
Location
Montreal
System Name Aryzen / Sairikiki / Tesseract
Processor 5800x / i7 920@3.73 / 5800x
Motherboard Steel Legend B450M / GB EX58-UDP4 / Steel Legend B550M
Cooling Mugen 5 / Pure Rock / Glacier One 240
Memory Corsair Something 16 / Corsair Something 12 / G.Skill 32
Video Card(s) AMD 6800XT / AMD 6750XT / Sapphire 7800XT
Storage Way too many drives...
Display(s) LG 332GP850-B / Sony w800b / Sony X90J
Case EVOLV X / Carbide 540 / Carbide 280x
Audio Device(s) SB ZxR + GSP 500 / board / Denon X1700h + ELAC Uni-Fi 2 + Senn 6XX
Power Supply Seasonic PRIME GX-750 / Corsair HX750 / Seasonic Focus PX-650
Mouse G700 / none / G602
Keyboard G910
Software w11 64
Benchmark Scores I don't play benchmarks...
"83% of 40 series gamers turn RT on"...

OK, and where are the stats on how many turn it off after?
And the other 17%? Imagine paying for a 40-series card and not using RT...
 
Joined
Apr 14, 2018
Messages
655 (0.27/day)
Well, it's GFE only, so we know that. Even if it's not 100% of the crowd, it still counts. I would confidently bet that the majority of RTX owners don't go through the hassle of removing GFE. TPU members are not the majority.

CP2077 is path traced now. Cards can do PT, just not at 4K like everyone wants to compare with. I checked out CP2077 with PT on my 3090; it's about 50-60 fps at 1080p. Not ideal, but also not unplayable.

Not really defending them, just looking at what they're saying without the hate goggles on.

It doesn't matter if it's GFE only. There is no distinction between someone actively turning DLSS on, vs. enabling RT, which auto-enables DLSS in most games, vs. letting GFE handle all of their graphical settings. The distinction is important; it is otherwise horrendously disingenuous to state that 70+% of users actively choose to use it and laud it like it's every gamer under the sun. It doesn't help that, of the entire collection of games, very VERY few even support DLSS. Regardless of how interesting the tech is, it's important that people recognize marketing stunts like this from any company.

Your 3090 gets 50 to 60 fps? A 4090 gets sub-30 FPS. Enable DLSS 3 and you're borderline getting playable fps, which brings you back to the huge caveat: why are you paying $1,500+ for a GPU for better and more realistic image quality, and then actively making the image quality worse by enabling upscalers? The universal truth is that no current hardware is capable of running a path-traced game, let alone one that hasn't been designed from the ground up in an exclusively path-traced engine. The idea that it's anything but is an oxymoron.

Ray tracing is cool, software technologies are cool and interesting, but see it for what it actually is.
 
Joined
Sep 17, 2014
Messages
22,438 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Well, it's GFE only, so we know that. Even if it's not 100% of the crowd, it still counts. I would confidently bet that the majority of RTX owners don't go through the hassle of removing GFE. TPU members are not the majority.

CP2077 is path traced now. Cards can do PT, just not at 4K like everyone wants to compare with. I checked out CP2077 with PT on my 3090; it's about 50-60 fps at 1080p. Not ideal, but also not unplayable.

Not really defending them, just looking at what they're saying without the hate goggles on.
It is kinda true that the tech is moving forward, and it had better, because in its current state it's still hard to see it push raster out of the equation.

But the way it's doing so doesn't sit well with me. I much prefer the AMD approach, where raster is basically what drives the GPU and there's no supposed special-sauce core doing very limited work. I also much prefer the chiplet approach moving into GPUs now. After all, does it really matter if the execution of RT is 20-30% slower if you can offer that much more actual GPU for the money? That's where the real performance win is going to be if we want generational improvements to actually be, and remain, more than 'shrink me and add a bigger featureset'. Because that is what Ada is, what Ampere was, and what Turing started. NVIDIA needed a shrink at every turn to actually create something new.

For me, RT is mostly just a fancy way to absolutely destroy performance so companies get to sell smaller dies at higher cost - and we have live examples of that as we speak. As long as they're playing that game, I'm not buying a card 'for RT'. The tech will only truly become commonplace if it is in fact supported in common ways.
 
Last edited:
Joined
Apr 9, 2013
Messages
289 (0.07/day)
Location
Chippenham, UK
System Name Hulk
Processor 7800X3D
Motherboard Asus ROG Strix X670E-F Gaming Wi-Fi
Cooling Custom water
Memory 32GB 3600 CL18
Video Card(s) 4090
Display(s) LG 42C2 + Gigabyte Aorus FI32U 32" 4k 120Hz IPS
Case Corsair 750D
Power Supply beQuiet Dark Power Pro 1200W
Mouse SteelSeries Rival 700
Keyboard Logitech G815 GL-Tactile
VR HMD Quest 2
"83% of 40 series gamers turn RT on"...

OK, and where are the stats on how many turn it off after?
And the other 17%? Imagine paying for a 40-series card and not using RT...
Exactly this. Obviously I turn it on in every new game I get, to have a look, and then I turn it off to actually play the game, because I prefer the higher fps over the RT effects any day. I'm actually amazed 17% of people with 40-series cards haven't even tried RT!
 
Joined
Mar 14, 2014
Messages
1,390 (0.36/day)
Processor 11900K
Motherboard ASRock Z590 OC Formula
Cooling Noctua NH-D15 using 2x140mm 3000RPM industrial Noctuas
Memory G. Skill Trident Z 2x16GB 3600MHz
Video Card(s) eVGA RTX 3090 FTW3
Storage 2TB Crucial P5 Plus
Display(s) 1st: LG GR83Q-B 1440p 27in 240Hz / 2nd: Lenovo y27g 1080p 27in 144Hz
Case Lian Li Lancool MESH II RGB (I removed the RGB)
Audio Device(s) AKG Q701's w/ O2+ODAC (Sounds a little bright)
Power Supply Seasonic Prime 850 TX
Mouse Glorious Model D
Keyboard Glorious MMK2 65% Lynx MX switches
Software Win10 Pro
It doesn't matter if it's GFE only. There is no distinction between someone actively turning DLSS on, vs. enabling RT, which auto-enables DLSS in most games, vs. letting GFE handle all of their graphical settings. The distinction is important; it is otherwise horrendously disingenuous to state that 70+% of users actively choose to use it and laud it like it's every gamer under the sun. It doesn't help that, of the entire collection of games, very VERY few even support DLSS. Regardless of how interesting the tech is, it's important that people recognize marketing stunts like this from any company.

Your 3090 gets 50 to 60 fps? A 4090 gets sub-30 FPS. Enable DLSS 3 and you're borderline getting playable fps, which brings you back to the huge caveat: why are you paying $1,500+ for a GPU for better and more realistic image quality, and then actively making the image quality worse by enabling upscalers? The universal truth is that no current hardware is capable of running a path-traced game, let alone one that hasn't been designed from the ground up in an exclusively path-traced engine. The idea that it's anything but is an oxymoron.

Ray tracing is cool, software technologies are cool and interesting, but see it for what it actually is.
You're talking in 4K again.
 