
NVIDIA Extends DirectX Raytracing (DXR) Support to Many GeForce GTX GPUs

Joined
Feb 15, 2019
You must be young. Every single graphics technology that came before this took years until developers learned how to use it properly; RTX/DXR is no different. Also, no game supports every effect in DX12, DX11, or DX10, so I don't see how supporting everything DXR can do is relevant.
And again, this is not a discussion to have in a thread about RTX being sort of supported on Pascal cards.

It is relevant.
This "RTX being sort of supported on Pascal cards" move has one purpose only: to convince people to "upgrade/side-grade" to RTX cards.
So what is the point of that "upgrade/side-grade" when there isn't even a handful of ray-tracing games out there?

RTX is based on DXR, which is just one part of the DX12 API; RTX by itself is not comparable to the fully featured DX10/DX11/DX12.
Compare it to Nvidia's last hardware-accelerated marketing gimmick, a.k.a. PhysX, though.
It was the same thing again: hardware PhysX got 4 games back in 2008 and 7 in 2009, then hardware PhysX simply faded out and is now open sourced.
Now we've got 3 ray-traced games in 7 months. Sounds familiar?
 
Joined
Feb 1, 2019
I was expecting a much bigger gap, to be honest; specialised hardware with optimised routines usually has much more than a 2-3x performance improvement. This RT was optimised for RTX, and it only has a 200-300% advantage depending on the game.

All this data has told me is that if games get made with compute versions of RT optimised for them, performance can potentially be quite close. I don't understand what Nvidia is doing here; it seems they enabled support for the large Pascal userbase out of desperation, to try to entice developers. RT cards may be dead consumer tech within 1-2 generations.
 

FordGT90Concept
"I go fast!1!11!1!"
Joined
Oct 13, 2008
I still think RT cores are basically lean Kepler (or similar architecture) cores that are reserved so they don't interfere with the graphics pipeline. If this is the case, GCN using async compute should be able to do very well at DXR without modification.

NVIDIA to date hasn't technically described what the RT cores are.

I guess we'll find out when AMD debuts DXR support.
 
Joined
Sep 27, 2014
It was the same thing again: hardware PhysX got 4 games back in 2008 and 7 in 2009, then hardware PhysX simply faded out and is now open sourced.
There were plenty of PhysX games, and PhysX added to the games' realism. The fact that you didn't get to observe it doesn't change the facts.
Fallout 4 (FleX added in 2017), the Witcher 3 family (2016), and COD Ghosts (2013) really benefited from it. At least I personally liked the effects; I couldn't stop using grenades :)
Was it doomed by being closed source? Maybe. But that doesn't mean it didn't work. Unreal Engine 4 still uses it.

I would like an identical approach for RTX: let me add another card and dedicate it to RTX. That might make me take the bait and upgrade sooner to a single-card solution.
 
Joined
Feb 3, 2017
I was expecting a much bigger gap, to be honest; specialised hardware with optimised routines usually has much more than a 2-3x performance improvement. This RT was optimised for RTX, and it only has a 200-300% advantage depending on the game.
All this data has told me is that if games get made with compute versions of RT optimised for them, performance can potentially be quite close.
The current hybrid solution means only a small part of frame rendering uses DXR. Even then, only specific operations are done on the RT cores; data setup and management still happen on the shaders.
Compare results from BF5, which uses little RT, Metro/SoTR, which use a little more, and benchmarks like Port Royal or tech demos that use a lot of it. The more RT is used, the bigger the performance gap gets.
The other part is that Nvidia chose to put front and center results with DXR Low/Medium and modest resolutions. These paint Pascal in a better light than DXR High/Ultra results would.
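To sketch the same point in code: below is the rough shape of a hybrid-rendered frame. Every function name is a hypothetical placeholder (an empty stub so the sketch compiles), not any real engine's API; what matters is how small the DXR slice is relative to the whole frame.

```cpp
// Hypothetical hybrid frame; stubs only, not a real engine API.
void rasterizeGBuffer() {}        // classic raster pass: albedo, normals, depth
void rasterizeShadowMaps() {}     // conventional shadow work, still on shaders
void traceReflectionRays() {}     // the DXR slice of the frame
void denoiseRaytracedResult() {}  // cleanup pass over the sparse RT output
void shadeAndComposite() {}       // lighting, post-processing, UI: shaders again

void renderHybridFrame() {
    rasterizeGBuffer();
    rasterizeShadowMaps();
    // Only this slice touches DXR. Even here, ray setup and data
    // management run on the regular shader cores; only the traversal
    // and intersection inner loop maps onto Turing's RT cores.
    traceReflectionRays();
    denoiseRaytracedResult();
    shadeAndComposite();
}
```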

For a visual representation of what I am trying to say, look at the Metro Exodus frame graphs from Nvidia's original announcement; the middle part represents the work the RT cores deal with:
https://www.techpowerup.com/253759/...10-and-16-series-gpus-in-april-drivers-update
https://www.techpowerup.com/img/Qr86CtLnbFWCRcfc.jpg

NVIDIA to date hasn't technically described what the RT cores are.
They have not described the units very precisely. However, it is not quite correct to say we do not know what the RT cores do: they run a couple of raytracing operations (BVH traversal and ray-triangle intersection tests) implemented in hardware. Anandtech's article has a pretty good overview:
https://www.anandtech.com/show/13282/nvidia-turing-architecture-deep-dive/5
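If anyone wants a feel for what those two hardware operations compute, here is a minimal software sketch, assuming a simple AABB-based BVH: the ray/box "slab" test run while walking the tree, and a Moller-Trumbore ray/triangle test run at the leaves. Plain illustrative C++; the fixed-function hardware obviously does not look like this.

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Ray/AABB slab test: does the ray hit the box within [tMin, tMax]?
// BVH traversal runs this at every node it visits. invDir is the
// precomputed componentwise reciprocal of the ray direction.
static bool hitAABB(Vec3 orig, Vec3 invDir, Vec3 lo, Vec3 hi,
                    float tMin, float tMax) {
    float t0 = (lo.x - orig.x) * invDir.x, t1 = (hi.x - orig.x) * invDir.x;
    tMin = std::max(tMin, std::min(t0, t1));
    tMax = std::min(tMax, std::max(t0, t1));
    t0 = (lo.y - orig.y) * invDir.y; t1 = (hi.y - orig.y) * invDir.y;
    tMin = std::max(tMin, std::min(t0, t1));
    tMax = std::min(tMax, std::max(t0, t1));
    t0 = (lo.z - orig.z) * invDir.z; t1 = (hi.z - orig.z) * invDir.z;
    tMin = std::max(tMin, std::min(t0, t1));
    tMax = std::min(tMax, std::max(t0, t1));
    return tMin <= tMax;
}

// Moller-Trumbore ray/triangle intersection, run at BVH leaves.
// Returns the hit distance t, or a negative value on a miss.
static float hitTriangle(Vec3 orig, Vec3 dir, Vec3 v0, Vec3 v1, Vec3 v2) {
    Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
    Vec3 p = cross(dir, e2);
    float det = dot(e1, p);
    if (std::fabs(det) < 1e-8f) return -1.0f;  // ray parallel to triangle
    float inv = 1.0f / det;
    Vec3 s = sub(orig, v0);
    float u = dot(s, p) * inv;
    if (u < 0.0f || u > 1.0f) return -1.0f;
    Vec3 q = cross(s, e1);
    float v = dot(dir, q) * inv;
    if (v < 0.0f || u + v > 1.0f) return -1.0f;
    return dot(e2, q) * inv;                   // distance along the ray
}
```

A GPU without RT cores runs exactly this kind of arithmetic on its shader ALUs, which is why Pascal can do DXR at all; the RT core does the same work in dedicated silicon without occupying the SM.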
 
FordGT90Concept
They have not described the units very precisely. However, it is not quite correct to say we do not know what the RT cores do: they run a couple of raytracing operations (BVH traversal and ray-triangle intersection tests) implemented in hardware. Anandtech's article has a pretty good overview:
https://www.anandtech.com/show/13282/nvidia-turing-architecture-deep-dive/5
Something like a Kepler core could be doing everything the "RT core" does.

Remember, RTX has an *extremely* limited capability to ray trace: it complements existing rendering techniques in games rather than replacing them.
 
Joined
Aug 20, 2007
If AMD came up with something, let's say "FreeRay", which runs on and is optimized for typical graphics cards instead of RTX cards, would that benefit RTX card sales?

Yes, it would, because it's literally impossible to do this in a performant way on standard non-tensor GPUs.

As RTX cards are the only GPUs with tensor cores right now, it would run like shit, driving upgrades to RTX.

That is his point in a nutshell.

Nvidia does not own DXR.
If AMD could come up with their own solution to optimize DXR without dedicated "cores", RTX cards would become utterly pointless.

They would need to add tensor cores first.
 

FordGT90Concept
Tensor cores are quite useless for DXR. NVIDIA is using tensor cores to up-sample resolution to compensate for the framerate loss due to DXR. If the RT cores weren't rubbish and/or the GPU could properly do async compute (like GCN can) so it could raytrace without impacting framerate, DLSS would be useless. A proper raytracing ASIC could be the solution... assuming DXR is a problem worth solving, which I don't believe it is. There would have to be a monumental jump in compute capabilities (as in, a ton of cheap performance to waste) to warrant pursuing DXR as a useful technology in games.
 
Joined
Feb 15, 2019
Yes, it would, because it's literally impossible to do this in a performant way on standard non-tensor GPUs.

As RTX cards are the only GPUs with tensor cores right now, it would run like shit, driving upgrades to RTX.
That is his point in a nutshell.
They would need to add tensor cores first.

The Tensor cores in RTX cards don't do DXR.

Nvidia describes Tensor cores as "specialized execution units designed specifically for performing the tensor/matrix operations that are the core compute function used in Deep Learning".

They have nothing to do with Ray Tracing.
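For reference, the operation that quote describes boils down to a small fused multiply-accumulate on matrix tiles, D = A x B + C (4x4 FP16 tiles on Volta/Turing). A scalar C++ sketch of what gets computed, not how the hardware computes it:

```cpp
#include <array>

using Mat4 = std::array<std::array<float, 4>, 4>;

// One "tensor op": D = A * B + C on a 4x4 tile. A tensor core does this
// in hardware each clock; this loop is just a reference for the math.
Mat4 tensorOp(const Mat4& A, const Mat4& B, const Mat4& C) {
    Mat4 D{};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j) {
            float acc = C[i][j];             // start from the accumulator
            for (int k = 0; k < 4; ++k)
                acc += A[i][k] * B[k][j];    // multiply-accumulate
            D[i][j] = acc;
        }
    return D;
}
```

Nothing in there traces a ray; whether tensor cores matter for RTX rests entirely on whether the denoising step needs them.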
 
Joined
Feb 18, 2019
If raytracing hadn't been demonstrated running on a Radeon card (and rather well) by a third party, none of this would be happening.
Full "damage control" mode.
 
Joined
Aug 20, 2007
They have nothing to do with Ray Tracing.

They do the "denoising" that makes raytracing possible on present hardware (we can't possibly push enough raw rays).

Thus, they have everything to do with it.
 
Joined
Feb 15, 2019
They do the "denoising" that makes raytracing possible on present hardware (we can't possibly push enough raw rays).

Thus, they have everything to do with it.

lol, you are mixing things up.
RTRT doesn't require a denoiser to work.
The denoiser is an after-effect applied to the final image.
 
Joined
Sep 27, 2014
lol, you are mixing things up.
RTRT doesn't require a denoiser to work.
The denoiser is an after-effect applied to the final image.
Sure, it's not necessary... if you are able to trace all the rays. But the current generation of RTX cards is not.
They trace only a couple of rays per pixel and mix some textures in; hence they have to apply denoising to that to make it look good.
I guess you didn't read the excellent article linked above? Here's a quote:
Essentially, this style of ‘hybrid rendering’ is a lot less raytracing than one might imagine from the marketing material. Perhaps a blunt way to generalize might be: real time raytracing in Turing typically means only certain objects are being rendered with certain raytraced graphical effects, using a minimal amount of rays per pixel and/or only raytracing secondary rays, and using a lot of denoising filtering; anything more would affect performance too much.
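To see why "a minimal amount of rays per pixel" forces all that denoising: a Monte Carlo estimate's standard deviation shrinks only as 1/sqrt(N). A toy C++ experiment (averaging random samples of a stand-in function instead of real lighting, purely as an illustrative assumption) shows the trend:

```cpp
#include <cmath>
#include <cstdio>
#include <random>

int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> sample(0.0, 1.0);
    const double truth = 0.5;            // exact mean of our stand-in f(x) = x
    const int counts[] = {1, 2, 4, 64, 1024};

    for (int n : counts) {
        const int trials = 10000;        // many "pixels" at this sample count
        double sumSqErr = 0.0;
        for (int t = 0; t < trials; ++t) {
            double estimate = 0.0;
            for (int i = 0; i < n; ++i)
                estimate += sample(rng); // one "ray" = one random sample
            estimate /= n;
            sumSqErr += (estimate - truth) * (estimate - truth);
        }
        std::printf("%5d rays/pixel -> RMS error %.4f\n",
                    n, std::sqrt(sumSqErr / trials));
    }
    return 0;
}
```

At 1-2 samples the per-pixel error is enormous (that is the grain you see in raw RT output); beating it down by brute force takes hundreds of rays, which is exactly why every current implementation leans on a denoiser instead.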
 
Joined
Feb 15, 2019
Sure, it's not necessary... if you are able to trace all the rays. But the current generation of RTX cards is not.
They trace only a couple of rays per pixel and mix some textures in; hence they have to apply denoising to that to make it look good.

Yes, that is exactly the point.

Nvidia offers an AI-based denoiser powered by tensor cores.
It will denoise any image it is given, no matter whether it is an in-game frame or a photo.

If it is just the denoiser that matters, then it is the denoiser, NOT the Tensor cores.
If AMD could come up with an efficient denoising method without any dedicated hardware, Tensor cores would also become utterly pointless.
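To make the "efficient denoising without dedicated hardware" idea concrete, here is a toy depth-guided blur in plain C++. It is a crude stand-in for the much smarter filters games actually ship (the struct and function here are made up for illustration), but every operation in it is ordinary arithmetic that any shader core can run:

```cpp
#include <cmath>
#include <vector>

struct Frame {
    int w, h;
    std::vector<float> color;  // noisy low-sample-count radiance
    std::vector<float> depth;  // G-buffer depth, used as an edge guide
};

// Average each pixel with its neighbours, down-weighting neighbours
// across depth discontinuities so object edges don't get smeared.
std::vector<float> denoise(const Frame& f, int radius, float depthSigma) {
    std::vector<float> out(f.color.size());
    for (int y = 0; y < f.h; ++y)
        for (int x = 0; x < f.w; ++x) {
            float centerZ = f.depth[y * f.w + x];
            float sum = 0.0f, wsum = 0.0f;
            for (int dy = -radius; dy <= radius; ++dy)
                for (int dx = -radius; dx <= radius; ++dx) {
                    int nx = x + dx, ny = y + dy;
                    if (nx < 0 || ny < 0 || nx >= f.w || ny >= f.h) continue;
                    float dz = f.depth[ny * f.w + nx] - centerZ;
                    float w = std::exp(-(dz * dz) / (2.0f * depthSigma * depthSigma));
                    sum += w * f.color[ny * f.w + nx];
                    wsum += w;
                }
            out[y * f.w + x] = sum / wsum;  // center weight is 1, so wsum > 0
        }
    return out;
}
```

Real game denoisers add temporal accumulation, normal guidance, and variance estimation on top of this idea, but none of that is tensor-specific either.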
 
Joined
Jul 9, 2015
Something like a Kepler core could be doing everything the "RT core" does.

Remember, RTX has an *extremely* limited capability to ray trace: it complements existing rendering techniques in games rather than replacing them.
Well, but what about "fully RT Quake 3"?
Of course, the models rendered there are simple, but still.

This boils down to "it does in RT cores what the DXR API is about", namely intersection testing.
Uh, who would have thought.
 
Joined
Apr 12, 2013
Yes, that is exactly the point.

Nvidia offers an AI-based denoiser powered by tensor cores.
It will denoise any image it is given, no matter whether it is an in-game frame or a photo.

If it is just the denoiser that matters, then it is the denoiser, NOT the Tensor cores.
If AMD could come up with an efficient denoising method without any dedicated hardware, Tensor cores would also become utterly pointless.
So what happens with Pascal? I'm guessing the lack of tensor cores isn't the (biggest) reason why RT tanks performance on that thing :wtf:
 
Joined
Feb 15, 2019
So what happens with Pascal? I'm guessing the lack of tensor cores isn't the (biggest) reason why RT tanks performance on that thing :wtf:
Only the leather jacket himself knows.
Without any comparison data from the red team, we have no idea whether the Pascal cards received any optimization for RTRT at all.

After all, Nvidia naturally wants to sell more Turing cards; optimizing old Pascal cards for the selling feature of Turing is the exact opposite of that.
 

FordGT90Concept
Well, but what about "fully RT Quake 3"?
Of course, the models rendered there are simple, but still.
Judging by AAA games, publishers aren't willing to sacrifice so much for raytracing. On that note, if raytracing were more accessible, indie developers would probably use it, because a lot of them go for a minimalist graphics style anyway.
 

rtwjunkie
PC Gaming Enthusiast
Joined
Jul 25, 2008
Judging by AAA games, publishers aren't willing to sacrifice so much for raytracing. On that note, if raytracing were more accessible, indie developers would probably use it, because a lot of them go for a minimalist graphics style anyway.
Funny you should mention that. The PC game Abducted is adding raytracing, for those who have the hardware, in one of its soon-to-be-released early access patches. The game has been in EA for three years and is almost ready. The dev just announced a couple of weeks ago that RT would be added before final release. This game, btw, is not using minimalist graphics.
 

FordGT90Concept
Then I have to assume that NVIDIA probably gave the devs RTX cards on the condition that they add RTX support.

It's way too early for indie devs to be adding raytracing as a cost-saving measure.
 

rtwjunkie
Then I have to assume that NVIDIA probably gave the devs RTX cards on the condition that they add RTX support.

It's way too early for indie devs to be adding raytracing as a cost-saving measure.
I think you're right. I don't think it is a cost-saving measure. I think they are just trying to offer it as a nice perk. They are very responsive, and I think they really just want to make the best product they can.
 
Joined
Feb 3, 2017
Something like a Kepler core could be doing everything the "RT core" does.
Remember, RTX has an *extremely* limited capability to ray trace: it complements existing rendering techniques in games rather than replacing them.
It could, but it would be rather inefficient at it. A Turing SM is faster than a Pascal SM in practically every aspect; Pascal is faster than Maxwell, which is faster than Kepler. If Nvidia thought RT was best done on good old shaders, they would simply add more shader units and not bother with RT cores.
Tensor cores are quite useless for DXR. NVIDIA is using tensor cores to up-sample resolution to compensate for the framerate loss due to DXR.
I am not too sure this is exactly the case here. True, they have no real purpose for the RT calculations themselves. However, Nvidia has claimed (or at least did so initially) that their denoising algorithm runs on tensor cores. It is definitely not the only denoising algorithm, and maybe/likely not the best one.
Judging by AAA games, publishers aren't willing to sacrifice so much for raytracing. On that note, if raytracing were more accessible, indie developers would probably use it, because a lot of them go for a minimalist graphics style anyway.
This is exactly what Nvidia has been going after. They say there are potential cost savings from reduced time in the workflow of creating a game: fewer workarounds, less artist/designer work. Some developers have supported that claim, so they might actually have a point. Making raytracing more accessible is exactly what DXR support for GTX cards is about.
 

FordGT90Concept
It could, but it would be rather inefficient at it. A Turing SM is faster than a Pascal SM in practically every aspect; Pascal is faster than Maxwell, which is faster than Kepler. If Nvidia thought RT was best done on good old shaders, they would simply add more shader units and not bother with RT cores.
All I know is that the organization of RT cores in Turing doesn't make sense.

This is exactly what Nvidia has been going after. They say there are potential cost savings from reduced time in the workflow of creating a game: fewer workarounds, less artist/designer work. Some developers have supported that claim, so they might actually have a point. Making raytracing more accessible is exactly what DXR support for GTX cards is about.
That is only true if the game is coded from the ground up to exclusively use raytracing. If any time is put into traditional lighting/rendering techniques, then raytracing is added cost.
 
Joined
Jul 9, 2015