Thursday, April 11th 2019

NVIDIA Extends DirectX Raytracing (DXR) Support to Many GeForce GTX GPUs

NVIDIA today announced that it is extending DXR (DirectX Raytracing) support to several GeForce GTX graphics models beyond its GeForce RTX series. These include the GTX 1660 Ti, GTX 1660, GTX 1080 Ti, GTX 1080, GTX 1070 Ti, GTX 1070, and GTX 1060 6 GB. The GTX 1060 3 GB and lower "Pascal" models don't support DXR, nor do older generations of NVIDIA GPUs. NVIDIA has brought real-time raytracing to GPUs that lack specialized components such as RT cores or tensor cores by running the entire raytracing path on shaders, in this case CUDA cores. DXR support will be added through a new GeForce graphics driver later today.

The GPU's CUDA cores now have to calculate BVH traversal, intersection, reflection, and refraction. The GTX 16-series chips have an edge over "Pascal" despite lacking RT cores, as "Turing" CUDA cores support concurrent INT and FP execution, allowing more work to be done per clock. NVIDIA, in a detailed presentation, listed the kinds of real-time ray-tracing effects exposed by the DXR API, namely reflections, shadows, advanced reflections and shadows, ambient occlusion, global illumination (unbaked), and combinations of these. The company put out detailed performance numbers for a selection of GTX 10-series and GTX 16-series GPUs, and compared them to RTX 20-series SKUs that have specialized hardware for DXR.
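To illustrate the kind of work that now lands on the CUDA cores: BVH traversal boils down to testing each ray against axis-aligned bounding boxes before any triangle is ever shaded. The sketch below is a minimal, illustrative version of that "slab test" (not NVIDIA's actual implementation), the sort of inner loop that RT cores execute in fixed-function hardware but that the shader path must run in software:

```python
def ray_aabb_hit(origin, inv_dir, box_min, box_max):
    """Slab test: does a ray hit an axis-aligned bounding box?

    inv_dir holds the per-axis reciprocals of the ray direction,
    precomputed so the loop needs only multiplies (a very large
    value such as 1e30 stands in for a zero direction component).
    """
    t_near, t_far = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t1 = (lo - o) * inv
        t2 = (hi - o) * inv
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    return t_near <= t_far

# A ray from the origin along +X hits a unit box centred at (5, 0, 0)...
print(ray_aabb_hit((0, 0, 0), (1.0, 1e30, 1e30), (4.5, -0.5, -0.5), (5.5, 0.5, 0.5)))  # True
# ...but misses the same box shifted up by 3 units:
print(ray_aabb_hit((0, 0, 0), (1.0, 1e30, 1e30), (4.5, 2.5, -0.5), (5.5, 3.5, 0.5)))   # False
```

A real renderer runs millions of such tests per frame while walking the BVH, which is why doing this on general-purpose shader cores is so much slower than on dedicated traversal hardware.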
Update: Article updated with additional test data from NVIDIA.

According to NVIDIA's numbers, GPUs without RT cores are significantly slower than the RTX 20-series. No surprises here. But at 1440p, the resolution NVIDIA chose for these tests, you would need at least a GTX 1080 or GTX 1080 Ti for playable frame-rates (above 30 fps). This is especially true in the case of Battlefield V, in which only the GTX 1080 Ti manages 30 fps. The gap between the GTX 1080 Ti and GTX 1080 is vast, with the latter serving up only 25 fps. The GTX 1070 and GTX 1060 6 GB spit out really fast PowerPoint presentations, at under 20 fps.
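To put those frame-rates in perspective, each fps figure maps directly to a per-frame time budget; a quick illustrative conversion:

```python
def frame_time_ms(fps: float) -> float:
    """Per-frame render budget in milliseconds for a given frame rate."""
    return 1000.0 / fps

# The figures quoted above: 30 fps (GTX 1080 Ti), 25 fps (GTX 1080),
# and the sub-20 fps cards.
for fps in (30, 25, 20):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
# 30 fps -> 33.3 ms, 25 fps -> 40.0 ms, 20 fps -> 50.0 ms
```

In other words, the GTX 1080 at 25 fps needs 40 ms to produce each frame, well past the 33.3 ms budget that a 30 fps floor allows.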
It's important to note that NVIDIA tested at the highest DXR settings for Battlefield V, and lowering the DXR Reflections quality could improve frame-rates, although we remain skeptical about the slower SKUs such as the GTX 1070 and GTX 1060 6 GB. The story repeats with Shadow of the Tomb Raider, which uses DXR shadows, although frame-rates are marginally higher than in Battlefield V. You still need a GTX 1080 Ti for 34 fps.
Atomic Heart uses Advanced Reflections (reflections of reflections, and non-planar reflective surfaces). Unfortunately, no GeForce GTX card manages performance over 15.4 fps. The story repeats with 3DMark Port Royal, which uses both Advanced Reflections and DXR Shadows: single-digit frame-rates for all GTX cards. Performance is better in the Justice tech-demo, although far from playable, as only the GTX 1080 and GTX 1080 Ti manage over 20 fps. Advanced Reflections and AO, in the case of the Star Wars RTX tech-demo, are another torture for these GPUs - single-digit frame-rates all over. Global Illumination with Metro Exodus is another slog for these chips.
Overall, NVIDIA has managed to script the perfect advertisement for the RTX 20-series. Real-time ray-tracing on compute shaders is horrendously slow, and it pays to have specialized hardware such as RT cores for it, while tensor cores accelerate DLSS to improve performance even further.
It remains to be seen if AMD takes a swing at DXR on GCN stream processors any time soon. The company has already had a technical effort underway for years under Radeon Rays, and is reportedly working on DXR.

Update:
NVIDIA posted its test data for 4K and 1080p in addition to 1440p, and for medium through low DXR settings. The entire test data set is posted below.


111 Comments on NVIDIA Extends DirectX Raytracing (DXR) Support to Many GeForce GTX GPUs

#26
HellZaQ
TheLostSwede: Yeah, at 10fps...
10-15 FPS on GTX 1070 Ti, res 1080p all high.
londiste: Did you try it? Honest question.

Edit:
The new set of drivers implements DXR, which is DirectX 12.
I really do not know if Nvidia actually made the Vulkan extensions Q2VKRT uses work on non-RTX hardware. Probably not, as Vulkan's extensions should be low-level enough.
About 15 FPS on my GPU, but it looks very good :D
#27
Crackong
jabbadap: You might want to reconsider that and look back at how successful the Ageia PPU was at the beginning.
But there were more games that supported PhysX at the beginning of PhysX's launch.
Not just 3, right?
#28
biffzinker
My RTX 2060 ran the Atomic Heart demo better than I expected, with or without DLSS. The RTRT is impressive, but I'm not sure it's worth the performance impact.
#29
Fluffmeister
It would be nice if AMD added DXR support to their relevant GPUs too, but maybe they don't want to sell more Turing RTX cards either. ;)
#30
Crackong
Fluffmeister: It would be nice if AMD added DXR support to their relevant GPUs too, but maybe they don't want to sell more Turing RTX cards either. ;)
I highly doubt that.
Did FreeSync help sell more G-Sync monitors?
#31
Fluffmeister
Crackong: I highly doubt that.
Did FreeSync help sell more G-Sync monitors?
Yes, a shit ton of shit Freesync monitors achieved nothing, beyond creating a market for Nvidia to come in and gobble up.

Well played Nvidia, well played.
#32
HossHuge
Did anyone else go and check the price of a GTX 1060 6 GB?
#33
Crackong
Fluffmeister: Yes, a shit ton of shit Freesync monitors achieved nothing, beyond creating a market for Nvidia to come in and gobble up.

Well played Nvidia, well played.
But G-Sync monitors did not benefit from that, right?
Same case.
If AMD came up with something, let's say "FreeRay", which runs on and is optimized for typical graphics cards instead of RTX cards, would that benefit RTX card sales?

Nope.
#35
GreiverBlade
It's not like DXR is something special... since it impairs the fps on the 20XX series, it's kinda unwanted on 10XX... even if it's an option... what's the point of an option that makes the card run like sh!t? (Also, since the current gen runs like sh!t with it too... double the framerate, but still sh!t.) Ohhhh wait... to make users believe that their card is obsolete and they need a new gen... well, that might work with some...

One more stone for the castle of: "I'll wait for next-gen GPUs"
#36
TheLostSwede
News Editor
Crackong: But there were more games that supported PhysX at the beginning of PhysX's launch.
Not just 3, right?
No and yes. I think many people forget that PhysX isn't Nvidia technology; it was initially developed by a Swiss university spin-off and then bought by Ageia, which made the first PPU cards. As such, there were a handful of games that supported it long before Nvidia got involved and bought Ageia. Nvidia bought Ageia four years in, which makes a huge difference in terms of software support, whereas this is the first widely available consumer ray-tracing hardware. There have been several earlier attempts, but none have been a commercial success. Caustic Graphics was the closest, but for whatever reason, Imagination Technologies shut them down a year or so after acquiring them. en.wikipedia.org/wiki/Ray-tracing_hardware#Implementations
#37
bug
Crackong: I highly doubt that.
Did FreeSync help sell more G-Sync monitors?
Different situation. Freesync is an alternative to G-Sync, whereas another implementation of DXR only increases exposure of the API.
#38
SoNic67
I have installed the drivers, and I am sad that I didn't see an option for RTX similar to the one for PhysX - to dedicate another video card just for that function (RTX).
In the past I was able to use a weaker GTX 960 in games that use PhysX (Fallout 4, Metro 2033) with the help of a GT 740 dedicated to PhysX. During games, I could monitor the usage of both cards with GPU-Z, and I saw it working nicely.

A similar approach for RTX would allow people like me, who already own a GTX 1080, to get into using RTX by just adding another 1060 or 1660 dedicated to that.

I guess Nvidia's intent was not to make the technology truly usable, but just to frustrate us and hopefully in this way make us buy the new RTX cards.
That's not gonna work, and I actually think it will generate a backlash against RTX adoption among game developers.
#39
bug
@SoNic67 Just think of the amount of data that would need to be transferred between the cards if one did the rendering and the other the lighting. And keep in mind weaker cards don't support SLI anymore.
#40
SoNic67
@bug Based on the amount of CUDA calculations involved, I think that the bottleneck will be in the GPU cores trying to process RTX, not in the PCIe bus.

PCI Express 3.0's 8 GT/s bit rate effectively delivers 985 MB/s per lane. With two x16 slots... I think that is enough.
Or heck, if the latencies are too high, they could finally use NVLink for something useful for a change, because SLI never scaled right... I'll get a 1070 if needed. They are so proud of that link...
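The per-lane figure quoted above does check out: PCIe 3.0 signals at 8 GT/s per lane with 128b/130b encoding, so effective bandwidth can be sketched as a quick back-of-the-envelope calculation (an illustrative check, not an official figure):

```python
def pcie3_bandwidth_mb_s(lanes: int) -> float:
    """Effective PCIe 3.0 bandwidth in MB/s for a given lane count."""
    raw_gt_s = 8.0            # 8 gigatransfers per second per lane
    encoding = 128.0 / 130.0  # 128b/130b line-coding overhead
    # One transfer carries one bit per lane; divide by 8 for bytes.
    return raw_gt_s * 1e9 * encoding / 8.0 * lanes / 1e6

print(round(pcie3_bandwidth_mb_s(1)))   # per lane: ~985 MB/s
print(round(pcie3_bandwidth_mb_s(16)))  # full x16 slot: ~15754 MB/s
```

So a full x16 slot offers roughly 15.75 GB/s each way, which is the number to weigh against the inter-card traffic a split rendering/raytracing setup would generate.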

PS: Or just cut to the chase and, similar to how they released a Turing card without the RT cores, release a card that has only those cores, like a co-processor, for those of us who want to upgrade without paying $1000.
#41
Ruru
S.T.A.R.S.
I can't see any difference with ray tracing on or off.
#42
BorgOvermind
So another check-box for my 1070 that does absolutely nothing. 'Great.'
#43
bug
Chloe Price: I can't see any difference with ray tracing on or off.
If you were trying to mix off-topic with irrelevant, you've nailed it.
#44
Fluffmeister
Chloe Price: I can't see any difference with ray tracing on or off.
I believe ray tracing turns a round object into a hexagon.
#45
Crackong
bug: Different situation. FreeSync is an alternative to G-Sync, whereas another implementation of DXR only increases exposure of the API.
Nvidia does not own DXR.
If AMD could come up with their own solution to optimize DXR without dedicated "cores", RTX cards would become utterly pointless.
#46
Slizzo
FordGT90Concept: More like developer interest has been... nonexistent... and they're hoping expanding support will spur more developer interest. I think developers seeing these numbers will laugh and walk away.
rtwjunkie: It's not an about-face. It's a clever move by Nvidia to show that the top Pascal card can only barely do RTX at a playable FPS.

It's meant to convince hesitant people to get RTX cards after they see what their Pascals cannot do, or can only barely do.
It's only sort of meant to do that.

What it REALLY is for is to allow developers that don't have RTX cards to get in and start developing for RTX, and to see their results before releasing the features to the masses in their games. Going from a very small install base of developers to a considerably larger one means that you have more people developing for DXR and Vulkan ray tracing than you had before.
#47
bug
Crackong: Nvidia does not own DXR.
If AMD could come up with their own solution to optimize DXR without dedicated "cores", RTX cards would become utterly pointless.
Maybe so, but how is that relevant to this context?
#48
Crackong
bug: Maybe so, but how is that relevant to this context?
If AMD doesn't do something, we will keep getting this "Demo... Demo... Demo... Demo... Where are the actual GAMES???" situation.
It has been 7 months, only 3 games support this "just works" technology, and none of them fully supports all DXR effects in a single game.
#49
bug
Crackong: If AMD doesn't do something, we will keep getting this "Demo... Demo... Demo... Demo... Where are the actual GAMES???" situation.
It has been 7 months, only 3 games support this "just works" technology, and none of them fully supports all DXR effects in a single game.
You must be young. Every single gfx technology that came before this took years until developers learned how to use it properly; RTX/DXR is no different. Also, no game supports all effects in DX12 or DX11 or DX10, so I don't see how supporting everything DXR can do is relevant.
And again, this is not a discussion to have in a thread about RTX being sort of supported on Pascal cards.
#50
Crackong
bug: You must be young. Every single gfx technology that came before this took years until developers learned how to use it properly; RTX/DXR is no different. Also, no game supports all effects in DX12 or DX11 or DX10, so I don't see how supporting everything DXR can do is relevant.
And again, this is not a discussion to have in a thread about RTX being sort of supported on Pascal cards.
It is relevant.
This "RTX being sort of supported on Pascal cards" move has one purpose only: to try to convince people to "upgrade / side-grade" to RTX cards.
Then what is the point of this "upgrade / side-grade" when there isn't even a handful of ray-tracing games out there?

RTX is based on DXR, which is just one part of the DX12 API; RTX itself is not comparable to the fully featured DX10 / DX11 / DX12.
Compare it to Nvidia's last hardware-accelerated marketing gimmick, a.k.a. PhysX, though.
It was the same thing again: hardware PhysX got 4 games back in 2008 and 7 in 2009, then hardware PhysX simply faded out and is now open-sourced.
Now we've got 3 ray-traced games in 7 months. Sounds familiar?