
NVIDIA to Enable DXR Ray Tracing on GTX (10- and 16-series) GPUs in April Drivers Update

Joined
Jun 28, 2016
Messages
3,595 (1.17/day)
I'm not saying RTRT is a gimmick. It definitely adds to the quality of the graphics, but it is a mixture of ray tracing and rasterization.
The whole point of GPUs is to utilize rasterization.
If our chips were fast enough to do full RT in games (the way CGI films are rendered), you wouldn't need a GPU.
CPUs are more suited to ray tracing in general; it's just that they're usually optimized for other tasks (low latency, etc.).
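For a rough sense of scale, here is a back-of-the-envelope ray budget (a sketch only; the resolution, sample and bounce counts are illustrative assumptions, not figures from this thread):

```cpp
#include <cstdint>
#include <cstdio>

int main() {
    // Illustrative assumptions (not figures from the thread): 4K output,
    // 60 fps, 4 samples per pixel, 3 secondary bounces per sample.
    const std::uint64_t pixels  = 3840ULL * 2160ULL;  // ~8.3 million
    const std::uint64_t fps     = 60;
    const std::uint64_t samples = 4;                  // primary rays per pixel
    const std::uint64_t bounces = 3;                  // extra rays per sample

    const std::uint64_t rays_per_second = pixels * fps * samples * (1 + bounces);
    std::printf("~%.1f billion rays per second\n", rays_per_second / 1e9);
    // Prints roughly 8.0 billion rays/second - and each ray still needs many
    // BVH node tests and triangle intersections, which is why games settle
    // for a hybrid of rasterization plus a few ray-traced effects.
    return 0;
}
```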

I doubt it.

The time you've wasted on writing that post could have been spent on reading a few paragraphs of this:
https://en.wikipedia.org/wiki/Ray_tracing_(graphics)
 
Joined
Feb 3, 2017
Messages
3,747 (1.32/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
I see.
So nVidia spent years upon years "doing something with RT" to come up with the glorious (stock-market-crashing :))))) RTX stuff that absolutely had to use RTX cards, because, for those too stupid to get it: it has that R in it! It stands for "ray tracing". Unlike GTX, where G stands for something else (gravity, perhaps).
Did I say "years"? Right, so years of development to bring you that RTX thing.
Nvidia, AMD and Intel have all spent years "doing something with RT". Nvidia has OptiX (wiki, Nvidia), AMD has Radeon Rays as the latest iteration of what they are doing, and Intel has, for example, OSPRay. These are just examples; the work, research and software in this area are extensive.

The current real-time push that Nvidia (somewhat confusingly) branded as RTX is not a thing in itself; it is an evolution of all the previous work. RT cores and the operations they are able to perform were not chosen randomly - Nvidia has extensive experience in the field. While you keep lambasting Nvidia for RTX, the APIs for RTRT seem to be taking a fairly open route. DXR is in DX12, open for anyone to implement. Vulkan has Nvidia-specific RTX extensions, but there is a discussion about whether something more generic is needed.

Nobody has claimed RT or RTRT cannot be done on anything else. Any GPU or CPU can do it; it is simply a question of performance. So far, DXR and OptiX tests done on Pascal, Volta and Turing suggest RT cores do provide a considerable boost to RT performance.
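To make "any GPU or CPU can do it" concrete: the inner loop of a ray tracer is ordinary arithmetic like the classic Möller-Trumbore ray/triangle test sketched below (a generic textbook routine, not code from DXR, OptiX or any driver); RT cores simply run this kind of intersection and traversal work much faster than general-purpose shader code does.

```cpp
#include <array>
#include <cmath>
#include <optional>

using Vec3 = std::array<float, 3>;

static Vec3  sub(const Vec3& a, const Vec3& b)   { return {a[0] - b[0], a[1] - b[1], a[2] - b[2]}; }
static float dot(const Vec3& a, const Vec3& b)   { return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]; }
static Vec3  cross(const Vec3& a, const Vec3& b) {
    return {a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]};
}

// Möller-Trumbore ray/triangle intersection: returns the hit distance t along
// the ray, or std::nullopt if the ray misses the triangle (v0, v1, v2).
std::optional<float> intersect(const Vec3& orig, const Vec3& dir,
                               const Vec3& v0, const Vec3& v1, const Vec3& v2) {
    const float eps = 1e-7f;
    const Vec3 edge1 = sub(v1, v0);
    const Vec3 edge2 = sub(v2, v0);

    const Vec3 pvec = cross(dir, edge2);
    const float det = dot(edge1, pvec);
    if (std::fabs(det) < eps) return std::nullopt;   // ray parallel to triangle plane

    const float invDet = 1.0f / det;
    const Vec3 tvec = sub(orig, v0);
    const float u = dot(tvec, pvec) * invDet;
    if (u < 0.0f || u > 1.0f) return std::nullopt;   // outside barycentric bounds

    const Vec3 qvec = cross(tvec, edge1);
    const float v = dot(dir, qvec) * invDet;
    if (v < 0.0f || u + v > 1.0f) return std::nullopt;

    const float t = dot(edge2, qvec) * invDet;
    if (t > eps) return t;                            // hit in front of the origin
    return std::nullopt;
}
```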
 
Joined
Jul 9, 2015
Messages
3,413 (1.00/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores The 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
The current real-time push that Nvidia (somewhat confusingly) branded as RTX is not a thing in itself; it is an evolution of all the previous work.
A pinnacle of the effort.
The best thing in RT ever produced.
Brought to us by the #leatherman.

Thank you so much for your insight!

Unfortunately it is somewhat misplaced. The context of the argument was whether RTX for GTX cards is something nVidia had prepared for GDC all along, or something nVidia had to pull out of its back pocket to counter the thunder of the Crytek demo.

RT cores and the operations they are able to perform were not chosen randomly - Nvidia has extensive experience in the field.
Development at NVDA is not done randomly, got it.
Not least because they are led by, you know, Lisa's uncle.

Vulkan has Nvidia-specific RTX extensions, but there is a discussion about whether something more generic is needed.
Because it is not obvious that a "more generic" (if that's what we call open standards these days) ray-tracing standard is needed, or for some even more sophisticated reason?
 
Joined
Feb 3, 2017
Messages
3,747 (1.32/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
Because it is not obvious that a "more generic" (if that's what we call open standards these days) ray-tracing standard is needed, or for some even more sophisticated reason?
RT cores perform specific operations that are exposed via the current Nvidia extensions. If I remember the discussion in the threads correctly, the argument was that Vulkan does not need to provide standard API calls for RT, because RT is, at its core, compute: when IHVs expose the necessary GPU compute capabilities via their own extensions, developers can leverage those extensions to implement RT. In essence, Vulkan is a low-level API, and perhaps anything toward more generic RT is too high-level for it to address by design. It is not a wrong argument, and in a way it is a question of principle.
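For the curious, "exposed via the current Nvidia extensions" looks roughly like this from the application side: the program asks the Vulkan driver whether the vendor extension VK_NV_ray_tracing is present and only then enables it and loads its entry points (a minimal sketch against the standard Vulkan C API; hasNvRayTracing is just an illustrative helper name, not part of any SDK).

```cpp
#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

// Returns true if the physical device advertises Nvidia's ray tracing
// extension. A generic, vendor-neutral equivalent was still under
// discussion at the time of this thread.
bool hasNvRayTracing(VkPhysicalDevice device) {
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(device, nullptr, &count, nullptr);

    std::vector<VkExtensionProperties> props(count);
    vkEnumerateDeviceExtensionProperties(device, nullptr, &count, props.data());

    for (const auto& p : props) {
        if (std::strcmp(p.extensionName, "VK_NV_ray_tracing") == 0) {
            // The extension can now be enabled at device creation and its
            // entry points (e.g. vkCreateRayTracingPipelinesNV) loaded.
            return true;
        }
    }
    return false;   // fall back to plain compute-shader ray tracing
}
```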
 
Joined
Sep 17, 2014
Messages
22,431 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
RT cores perform specific operations that are exposed via the current Nvidia extensions. If I remember the discussion in the threads correctly, the argument was that Vulkan does not need to provide standard API calls for RT, because RT is, at its core, compute: when IHVs expose the necessary GPU compute capabilities via their own extensions, developers can leverage those extensions to implement RT. In essence, Vulkan is a low-level API, and perhaps anything toward more generic RT is too high-level for it to address by design. It is not a wrong argument, and in a way it is a question of principle.

Stop feeding him; you might as well be talking to a wall. The best you'll get is your posts misconstrued and ripped up in quotes, and a -1 spree on everything you do. You've been warned ;)

The bait is so obvious, the only reason he's not on my ignore list is for entertainment purposes.
 
Joined
Jun 22, 2014
Messages
446 (0.12/day)
System Name Desktop / "Console"
Processor Ryzen 5950X / Ryzen 5800X
Motherboard Asus X570 Hero / Asus X570-i
Cooling EK AIO Elite 280 / Cryorig C1
Memory 32GB Gskill Trident DDR4-3600 CL16 / 16GB Crucial Ballistix DDR4-3600 CL16
Video Card(s) RTX 4090 FE / RTX 2080ti FE
Storage 1TB Samsung 980 Pro, 1TB Sabrent Rocket 4 Plus NVME / 1TB Sabrent Rocket 4 NVME, 1TB Intel 660P
Display(s) Alienware AW3423DW / LG 65CX Oled
Case Lian Li O11 Mini / Sliger CL530 Conswole
Audio Device(s) Sony AVR, SVS speakers & subs / Marantz AVR, SVS speakers & subs
Power Supply ROG Loki 1000 / Silverstone SX800
VR HMD Quest 3
Anyway, this is a great way to tease the technology and boost adoption rates. Overall, what's coming out now looks a whole lot more like actual industry effort: broad adoption and multiple ways to attack the performance problem. The one-sided RTX approach carried only by a near-monopolist wasn't healthy. This, however, looks promising.

I'm not sure I follow you on that last line, because the wording almost makes it sound like a G-Sync or PhysX walled-garden approach, which it isn't. DXR is an open standard, developed with both AMD and nVidia, available to anyone through DX12. RTX is just nVidia's implementation of that open API on their hardware. Only AMD is to blame for not showing up and offering support for a standard that they helped create. If/when AMD releases something, they will also have an equivalent name that is proprietary to their hardware (i.e. Radeon Rays or something similar).

And let's be real for a second: if it weren't for nVidia releasing RTX, there would be no Crytek RT demos, no RT support in Unreal Engine, and no one else working on solutions. So to give credit where it is due, RTX most definitely - single-handedly - got the ball rolling on a tech that I (and seemingly now you) can both agree is gaining industry-wide attention and looks promising... when less than a year ago, the 'ball' didn't even exist.

However, I completely agree with the rest of that assessment.
 
Joined
Sep 17, 2014
Messages
22,431 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I'm not sure I follow you on that last line, because the wording almost makes it sound like a G-Sync or PhysX walled-garden approach, which it isn't. DXR is an open standard, developed with both AMD and nVidia, available to anyone through DX12. RTX is just nVidia's implementation of that open API on their hardware. Only AMD is to blame for not showing up and offering support for a standard that they helped create. If/when AMD releases something, they will also have an equivalent name that is proprietary to their hardware (i.e. Radeon Rays or something similar). And let's be real for a second: if it weren't for nVidia releasing RTX, there would be no Crytek RT demos, no RT support in Unreal Engine, and no one else working on solutions. So to give credit where it is due, RTX most definitely - single-handedly - got the ball rolling on a tech that I (and seemingly now you) can both agree is gaining industry-wide attention and looks promising... when less than a year ago, the 'ball' didn't even exist.

However, I completely agree with the rest of that assessment.

RTX is just like G-Sync, except in how they sell it (and even there the similarities exist). The walled garden is artificial, and Nvidia's RTX approach is too expensive to last in the marketplace; it will be eclipsed by cheaper, more easily marketed alternatives. G-Sync likewise requires a separate bit of hardware to 'get it right', according to Nvidia, while the rest of the industry works towards solutions that can actually go mainstream - FreeSync as part of the VESA standards.
 
Joined
Jun 28, 2016
Messages
3,595 (1.17/day)
RTX is just like G-Sync, except in how they sell it (and even there the similarities exist). The walled garden is artificial, and Nvidia's RTX approach is too expensive to last in the marketplace; it will be eclipsed by cheaper, more easily marketed alternatives. G-Sync likewise requires a separate bit of hardware to 'get it right', according to Nvidia, while the rest of the industry works towards solutions that can actually go mainstream - FreeSync as part of the VESA standards.
At least they try. That's the whole point of being successful in business: you try five things, and one sticks.

And I don't agree - RTX is here to stay. We don't know for how long: 2, 3, 5 years? For a few years it'll cement Nvidia's supremacy. That's all they want; they've already won the race for market share and profits.
By 2025 GPUs will be twice as fast, and mainstream PCs will have PCIe 5.0 and 10-16 cores. At that point CPUs could take over some of the RTRT workload (and they're much better suited to it).
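If CPUs do end up picking up part of the RTRT workload, the reason it is even plausible is that the problem is embarrassingly parallel: every pixel's rays are independent, so the work splits cleanly across however many cores are available. A toy sketch (trace_pixel is a placeholder stand-in, not any engine's actual code):

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <thread>
#include <vector>

// Placeholder for the real per-pixel ray tracing work.
std::uint32_t trace_pixel(int x, int y) {
    return static_cast<std::uint32_t>((x ^ y) & 0xFF);  // dummy pattern
}

// Trace one frame by handing interleaved scanlines to every hardware thread.
void trace_frame(std::vector<std::uint32_t>& framebuffer, int width, int height) {
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> pool;

    for (unsigned w = 0; w < workers; ++w) {
        pool.emplace_back([&, w] {
            for (int y = static_cast<int>(w); y < height; y += static_cast<int>(workers)) {
                for (int x = 0; x < width; ++x) {
                    framebuffer[static_cast<std::size_t>(y) * width + x] = trace_pixel(x, y);
                }
            }
        });
    }
    for (auto& t : pool) t.join();
}

int main() {
    std::vector<std::uint32_t> fb(1920 * 1080);
    trace_frame(fb, 1920, 1080);
    return 0;
}
```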

RTX makes another two big wins for Nvidia probable. Imagine RTRT sticks and people learn to like it, and imagine AMD doesn't make a competing ASIC.
Datacenter and console makers will think twice before they choose AMD as the GPU supplier for next-gen streaming/console gaming.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.46/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Doesn't all that apply to DX12 and Vulkan as well?
No, because DX12 and Vulkan can result in a 10%+ FPS uplift in all future games they make with the engine. That means lowering the requirements to run the game, which translates to more money now and into the future. Well... that wasn't the case until Microsoft announced D3D12 support for Windows 7... now it translates to more money. :roll:

Ehm, just by transitioning to D3D12 and abandoning D3D11, they can spend far less time optimizing for that 10%+ down the road. Time is money.

What does DXR offer that saves money? Nothing, because it isn't fully real-time raytraced, which is what would save them money. And because so little hardware can even do it, dropping the D3D11/D3D12/Vulkan path is out of the question in DXR games.

Quite the opposite. DXR promises to shave thousands of development hours by removing the need to paint lightmaps.
In a decade or two, sure. Not until then. Gamers want 1440p, 4K, higher framerates, HDR, and finally raytracing. Until an ASIC is developed and integrated into GPUs that can do raytracing at 4K in real time with negligible framerate cost, it will not become mainstream - and not for a decade after those products start launching.

Crytek has not implemented DXR - they have a custom algorithm for ray-traced reflections.
Using RadeonRays/OpenCL? Makes sense, seeing as AMD doesn't technically support DXR yet.
 
Joined
Sep 26, 2012
Messages
871 (0.20/day)
Location
Australia
System Name ATHENA
Processor AMD 7950X
Motherboard ASUS Crosshair X670E Extreme
Cooling ASUS ROG Ryujin III 360, 13 x Lian Li P28
Memory 2x32GB Trident Z RGB 6000Mhz CL30
Video Card(s) ASUS 4090 STRIX
Storage 3 x Kingston Fury 4TB, 4 x Samsung 870 QVO
Display(s) Acer X38S, Wacom Cintiq Pro 15
Case Lian Li O11 Dynamic EVO
Audio Device(s) Topping DX9, Fluid FPX7 Fader Pro, Beyerdynamic T1 G2, Beyerdynamic MMX300
Power Supply Seasonic PRIME TX-1600
Mouse Xtrfy MZ1 - Zy' Rail, Logitech MX Vertical, Logitech MX Master 3
Keyboard Logitech G915 TKL
VR HMD Oculus Quest 2
Software Windows 11 + Universal Blue
In a decade or two, sure. Not until then. Gamers want 1440p, 4K, higher framerates, HDR, and finally raytracing. Until an ASIC is developed and integrated into GPUs that can do raytracing at 4K in real time with negligible framerate cost, it will not become mainstream - and not for a decade after those products start launching.

I'm not sold on it taking decades, nor do I think a dedicated ASIC is needed. What is needed, however, is the ability to divide FP32 and INT32 units down to the smallest instruction possible, pack instructions into one unit, and execute at the lowest cost. That would keep on-die units from being wasted for most of the clock cycle, by letting them do something else rather than sit idle or stall to maintain coherence back into the main render pipeline.

IMO, the period after next-gen consoles come out and devs stop supporting current-gen consoles is when we will see a shift over.
 
Joined
Feb 18, 2017
Messages
688 (0.24/day)
Vega cards beat Titan in productivity - Nvidia allows more performance via drivers.
Crytek shows ray tracing running on Vega - Nvidia allows ray tracing on Pascal cards.

So pathetic. :D
 
Joined
Mar 23, 2005
Messages
4,085 (0.57/day)
Location
Ancient Greece, Acropolis (Time Lord)
System Name RiseZEN Gaming PC
Processor AMD Ryzen 7 5800X @ Auto
Motherboard Asus ROG Strix X570-E Gaming ATX Motherboard
Cooling Corsair H115i Elite Capellix AIO, 280mm Radiator, Dual RGB 140mm ML Series PWM Fans
Memory G.Skill TridentZ 64GB (4 x 16GB) DDR4 3200
Video Card(s) ASUS DUAL RX 6700 XT DUAL-RX6700XT-12G
Storage Corsair Force MP500 480GB M.2 & MP510 480GB M.2 - 2 x WD_BLACK 1TB SN850X NVMe 1TB
Display(s) ASUS ROG Strix 34” XG349C 180Hz 1440p + Asus ROG 27" MG278Q 144Hz WQHD 1440p
Case Corsair Obsidian Series 450D Gaming Case
Audio Device(s) SteelSeries 5Hv2 w/ Sound Blaster Z SE
Power Supply Corsair RM750x Power Supply
Mouse Razer Death-Adder + Viper 8K HZ Ambidextrous Gaming Mouse - Ergonomic Left Hand Edition
Keyboard Logitech G910 Orion Spectrum RGB Gaming Keyboard
Software Windows 11 Pro - 64-Bit Edition
Benchmark Scores I'm the Doctor, Doctor Who. The Definition of Gaming is PC Gaming...
There is a reason why RTX GPUs are not selling well. Last I heard, Nvidia had over $1.6 billion in unsold inventory :nutkick: :D. That's what happens when you overprice GPUs that don't deserve such high prices.
 
Joined
Jun 22, 2014
Messages
446 (0.12/day)
System Name Desktop / "Console"
Processor Ryzen 5950X / Ryzen 5800X
Motherboard Asus X570 Hero / Asus X570-i
Cooling EK AIO Elite 280 / Cryorig C1
Memory 32GB Gskill Trident DDR4-3600 CL16 / 16GB Crucial Ballistix DDR4-3600 CL16
Video Card(s) RTX 4090 FE / RTX 2080ti FE
Storage 1TB Samsung 980 Pro, 1TB Sabrent Rocket 4 Plus NVME / 1TB Sabrent Rocket 4 NVME, 1TB Intel 660P
Display(s) Alienware AW3423DW / LG 65CX Oled
Case Lian Li O11 Mini / Sliger CL530 Conswole
Audio Device(s) Sony AVR, SVS speakers & subs / Marantz AVR, SVS speakers & subs
Power Supply ROG Loki 1000 / Silverstone SX800
VR HMD Quest 3
RTX is just like Gsync except in how they sell it (and even there, the similarities exist). The walled garden is artificial and Nvidia's RTX approach is too expensive to last in the marketplace, it will be eclipsed by cheaper, more easily marketed alternatives. Gsync is also a separate bit of hardware that is 'required' to get it right, according to Nvidia, while the rest of the industry works towards solutions that can actually turn mainstream; Freesync as part of VESA standards.

I think the thing that will be eclipsed by easier alternatives is DXR, not necessarily RTX. My reasoning is that nVidia chose to add dedicated hardware to accelerate ray tracing, but that hardware is not exclusively bound to DXR. Crytek has already hinted at optimizations in their new ray tracing solution that will also utilize RT cores, so they won't automatically be relegated to the 'useless bin'. I would think they will still more than likely offer a substantial boost over doing the entire computation in general-purpose shaders.

Until this week, no one but Microsoft offered a method of easily incorporating ray tracing into games, so nVidia basically had only DXR to work with. Apparently they thought the hardware-accelerated route was correct/required for that particular method. Is that monolithic chip the most elegant and efficient solution? Nope, obviously not. If other options to DXR had existed, perhaps the architecture of Turing would be different - it might be a compute-heavy card, or maybe not and it would be just the same. Everyone is quick to bag on RTX and blame it for RT's failure, but maybe it is in fact DXR that is the fat pig hogging all of the GPU power? That seems a likely possibility when a Vega 56 is doing what it's doing. There are so many talented individuals in the world who for decades have come up with countless ways of doing things better than Microsoft. I have faith in these people, and I think there will be many changes and shifts in the early days to figure out the sweet spot and what is/isn't required to get acceptable results. But someone had to be the one to step out and take a chance in order to get the rest of these great individuals, and the industry, involved.

My opinion/prediction/hope is that Crytek showed this tech in order to try to be more relevant in the console market. They can now offer a way for the next console to run ray tracing on AMD hardware (hint hint) and make it a huge selling point (the new 4K, if you will). Now Unreal Engine has also added ray tracing. RT was never going to take off on the PC until consoles could also adopt it. Several people have declared it dead because a console wouldn't be able to ray trace for another 10 years, but apparently all it needs is a Vega 56-equivalent GPU to get 4K/30 at what looks to be slightly less than current-gen console-quality geometry and textures. That should definitely be attainable in the next-gen release. And if this ends up being the case, then as a PC enthusiast I benefit.
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,992 (2.35/day)
Location
Louisiana
Processor Core i9-9900k
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax ETS-T50 Black CPU cooler
Memory 32GB (2x16) Mushkin Redline DDR-4 3200
Video Card(s) ASUS RTX 4070 Ti Super OC 16GB
Storage 1x 1TB MX500 (OS); 2x 6TB WD Black; 1x 2TB MX500; 1x 1TB BX500 SSD; 1x 6TB WD Blue storage (eSATA)
Display(s) Infievo 27" 165Hz @ 2560 x 1440
Case Fractal Design Define R4 Black -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic Focus GX-1000 Gold
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
Joined
Mar 23, 2005
Messages
4,085 (0.57/day)
Location
Ancient Greece, Acropolis (Time Lord)
System Name RiseZEN Gaming PC
Processor AMD Ryzen 7 5800X @ Auto
Motherboard Asus ROG Strix X570-E Gaming ATX Motherboard
Cooling Corsair H115i Elite Capellix AIO, 280mm Radiator, Dual RGB 140mm ML Series PWM Fans
Memory G.Skill TridentZ 64GB (4 x 16GB) DDR4 3200
Video Card(s) ASUS DUAL RX 6700 XT DUAL-RX6700XT-12G
Storage Corsair Force MP500 480GB M.2 & MP510 480GB M.2 - 2 x WD_BLACK 1TB SN850X NVMe 1TB
Display(s) ASUS ROG Strix 34” XG349C 180Hz 1440p + Asus ROG 27" MG278Q 144Hz WQHD 1440p
Case Corsair Obsidian Series 450D Gaming Case
Audio Device(s) SteelSeries 5Hv2 w/ Sound Blaster Z SE
Power Supply Corsair RM750x Power Supply
Mouse Razer Death-Adder + Viper 8K HZ Ambidextrous Gaming Mouse - Ergonomic Left Hand Edition
Keyboard Logitech G910 Orion Spectrum RGB Gaming Keyboard
Software Windows 11 Pro - 64-Bit Edition
Benchmark Scores I'm the Doctor, Doctor Who. The Definition of Gaming is PC Gaming...


Exaggerate much?
Nope U?
 
Joined
Mar 11, 2019
Messages
201 (0.10/day)
Location
over the HoRYZEN
System Name Not an Intel Piece of Shite
Processor Superior AMD Glorious Master Race 2700SEX
Motherboard Glorious low cost Awesome Motherboard 4
Cooling A piece of metal that cools the amazing Ryzen CPU
Memory SAMMY BEEE DAI BABEH
Video Card(s) Turding
Storage irelevant
Display(s) monitor
Case It's red because AMD = red and AMD = awesome
Power Supply 1000W,. but not needed as Glorious RYZEN CPU is extremely afficient unlike that recylced 14nm++ Junk
Mouse *gets cat*
Keyboard RUHGUBUH!
Software Not Linux
Benchmark Scores Higher than Intel shite
Baby Turing (the 1660 series) should do a lot better than Pascal at equal core count, partly because TU116 has dedicated 2xFP16 pathways alongside the dedicated INT32/FP32 ones. I think this MIGHT mean TU116 can use the 2xFP16 acceleration for BVH work and take some load off the other CUDA cores. Pascal, OTOH, has to make do with jack-of-all-trades CUDA cores. I wouldn't be surprised if the GTX 1660 is faster than the GTX 1080 in RTX. Please correct me if I'm wrong.
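For context on what "2xFP16 acceleration for BVH" would actually be accelerating: BVH traversal is essentially a long stream of ray-vs-bounding-box slab tests like the one below. A part with double-rate FP16 could in principle push more of these through per clock if the reduced precision is acceptable - whether TU116's driver actually does that is the poster's speculation, not something confirmed here. The sketch uses plain FP32 C++:

```cpp
#include <algorithm>
#include <array>
#include <utility>

using Vec3 = std::array<float, 3>;

// Classic "slab" test: does origin + t*dir cross the axis-aligned box
// [boxMin, boxMax] for some t in [tMin, tMax]? BVH traversal repeats this
// for every node a ray visits; invDir holds the precomputed reciprocals
// of the ray direction components.
bool rayHitsBox(const Vec3& origin, const Vec3& invDir,
                const Vec3& boxMin, const Vec3& boxMax,
                float tMin, float tMax) {
    for (int axis = 0; axis < 3; ++axis) {
        float t0 = (boxMin[axis] - origin[axis]) * invDir[axis];
        float t1 = (boxMax[axis] - origin[axis]) * invDir[axis];
        if (t0 > t1) std::swap(t0, t1);   // handle negative direction components
        tMin = std::max(tMin, t0);
        tMax = std::min(tMax, t1);
        if (tMax < tMin) return false;    // slab intervals do not overlap
    }
    return true;
}
```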
 
Joined
Mar 23, 2005
Messages
4,085 (0.57/day)
Location
Ancient Greece, Acropolis (Time Lord)
System Name RiseZEN Gaming PC
Processor AMD Ryzen 7 5800X @ Auto
Motherboard Asus ROG Strix X570-E Gaming ATX Motherboard
Cooling Corsair H115i Elite Capellix AIO, 280mm Radiator, Dual RGB 140mm ML Series PWM Fans
Memory G.Skill TridentZ 64GB (4 x 16GB) DDR4 3200
Video Card(s) ASUS DUAL RX 6700 XT DUAL-RX6700XT-12G
Storage Corsair Force MP500 480GB M.2 & MP510 480GB M.2 - 2 x WD_BLACK 1TB SN850X NVMe 1TB
Display(s) ASUS ROG Strix 34” XG349C 180Hz 1440p + Asus ROG 27" MG278Q 144Hz WQHD 1440p
Case Corsair Obsidian Series 450D Gaming Case
Audio Device(s) SteelSeries 5Hv2 w/ Sound Blaster Z SE
Power Supply Corsair RM750x Power Supply
Mouse Razer Death-Adder + Viper 8K HZ Ambidextrous Gaming Mouse - Ergonomic Left Hand Edition
Keyboard Logitech G910 Orion Spectrum RGB Gaming Keyboard
Software Windows 11 Pro - 64-Bit Edition
Benchmark Scores I'm the Doctor, Doctor Who. The Definition of Gaming is PC Gaming...
Baby Turing (the 1660 series) should do a lot better than Pascal at equal core count, partly because TU116 has dedicated 2xFP16 pathways alongside the dedicated INT32/FP32 ones. I think this MIGHT mean TU116 can use the 2xFP16 acceleration for BVH work and take some load off the other CUDA cores. Pascal, OTOH, has to make do with jack-of-all-trades CUDA cores. I wouldn't be surprised if the GTX 1660 is faster than the GTX 1080 in RTX. Please correct me if I'm wrong.
Ray tracing will eventually be as readily available as AA, for example, as more and more games adopt it. But right now, and for the foreseeable future, it's not worth the headache IMO.
 
Joined
Mar 11, 2019
Messages
201 (0.10/day)
Location
over the HoRYZEN
System Name Not an Intel Piece of Shite
Processor Superior AMD Glorious Master Race 2700SEX
Motherboard Glorious low cost Awesome Motherboard 4
Cooling A piece of metal that cools the amazing Ryzen CPU
Memory SAMMY BEEE DAI BABEH
Video Card(s) Turding
Storage irelevant
Display(s) monitor
Case It's red because AMD = red and AMD = awesome
Power Supply 1000W,. but not needed as Glorious RYZEN CPU is extremely afficient unlike that recylced 14nm++ Junk
Mouse *gets cat*
Keyboard RUHGUBUH!
Software Not Linux
Benchmark Scores Higher than Intel shite
Ray tracing will eventually be as readily available as AA, for example, as more and more games adopt it. But right now, and for the foreseeable future, it's not worth the headache IMO.
mmhmm, i was just stating a bit of trivia lol
 
Joined
Mar 23, 2005
Messages
4,085 (0.57/day)
Location
Ancient Greece, Acropolis (Time Lord)
System Name RiseZEN Gaming PC
Processor AMD Ryzen 7 5800X @ Auto
Motherboard Asus ROG Strix X570-E Gaming ATX Motherboard
Cooling Corsair H115i Elite Capellix AIO, 280mm Radiator, Dual RGB 140mm ML Series PWM Fans
Memory G.Skill TridentZ 64GB (4 x 16GB) DDR4 3200
Video Card(s) ASUS DUAL RX 6700 XT DUAL-RX6700XT-12G
Storage Corsair Force MP500 480GB M.2 & MP510 480GB M.2 - 2 x WD_BLACK 1TB SN850X NVMe 1TB
Display(s) ASUS ROG Strix 34” XG349C 180Hz 1440p + Asus ROG 27" MG278Q 144Hz WQHD 1440p
Case Corsair Obsidian Series 450D Gaming Case
Audio Device(s) SteelSeries 5Hv2 w/ Sound Blaster Z SE
Power Supply Corsair RM750x Power Supply
Mouse Razer Death-Adder + Viper 8K HZ Ambidextrous Gaming Mouse - Ergonomic Left Hand Edition
Keyboard Logitech G910 Orion Spectrum RGB Gaming Keyboard
Software Windows 11 Pro - 64-Bit Edition
Benchmark Scores I'm the Doctor, Doctor Who. The Definition of Gaming is PC Gaming...

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.46/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
IMO, that period after next gen consoles come out and after devs stop supporting current gen consoles is when we will see a shift over.
Console gamers want 4K. Navi can deliver that, but not with raytracing. Console raytracing ain't happening until the generation after next at the earliest... close to a decade out. Considering how fabrication improvements have been coming at a snail's pace and slowing further with each generation, I think it's likely consoles won't see raytracing adoption for two to three generations past Navi. It is too expensive with virtually no benefit, because we're still talking hybrid raytracing here, which represents more work, not less. Raytracing-only is four+ generations past Navi at best - which is also probably moot, because by then there will be a push to 8K over raytracing.

Raytracing is better suited to professional software than to games, and that's not going to change for a long time. If a change of substrates leads to a huge jump in performance (e.g. graphene and terahertz processors) at little cost, then we could see raytracing flood in to take advantage of all that untapped potential... and that's at least a decade out too.
 
Joined
Sep 15, 2011
Messages
6,716 (1.39/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Sorry, but the RTX in this game is a complete and total waste of resources, since the quality difference is mediocre. Nothing to brag about.
The devs would be better off focusing on the gameplay, story and atmosphere of the game instead of wasting time on this nonsense.
 
Joined
Oct 2, 2015
Messages
3,129 (0.94/day)
Location
Argentina
System Name Ciel / Akane
Processor AMD Ryzen R5 5600X / Intel Core i3 12100F
Motherboard Asus Tuf Gaming B550 Plus / Biostar H610MHP
Cooling ID-Cooling 224-XT Basic / Stock
Memory 2x 16GB Kingston Fury 3600MHz / 2x 8GB Patriot 3200MHz
Video Card(s) Gainward Ghost RTX 3060 Ti / Dell GTX 1660 SUPER
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB / NVMe WD Blue SN550 512GB
Display(s) AOC Q27G3XMN / Samsung S22F350
Case Cougar MX410 Mesh-G / Generic
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W / Gigabyte P450B
Mouse EVGA X15 / Logitech G203
Keyboard VSG Alnilam / Dell
Software Windows 11
Sorry, but the RTX in this game is a complete and total waste of resources, since the quality difference is mediocre. Nothing to brag about.
The devs would be better off focusing on the gameplay, story and atmosphere of the game instead of wasting time on this nonsense.
That's the motto of Nintendo right there.
 
Joined
Jan 17, 2006
Messages
932 (0.14/day)
Location
Ireland
System Name "Run of the mill" (except GPU)
Processor R9 3900X
Motherboard ASRock X470 Taich Ultimate
Cooling Cryorig (not recommended)
Memory 32GB (2 x 16GB) Team 3200 MT/s, CL14
Video Card(s) Radeon RX6900XT
Storage Samsung 970 Evo plus 1TB NVMe
Display(s) Samsung Q95T
Case Define R5
Audio Device(s) On board
Power Supply Seasonic Prime 1000W
Mouse Roccat Leadr
Keyboard K95 RGB
Software Windows 11 Pro x64, insider preview dev channel
Benchmark Scores #1 worldwide on 3D Mark 99, back in the (P133) days. :)
@notb I can't really imagine AMD not doing whatever is needed - ASIC integration or otherwise - unless they decide to get out of GPUs altogether, which is unlikely.

AMD also has the option of integrating RTRT via a CPU-side ASIC, perhaps a chiplet dedicated to it. Maybe part of the reason for the "more cores" push is this kind of thing: making the CPU + GPU more balanced in graphics workloads. A 7nm (and smaller) chiplet would be far more economical than having to add the same die area to a monolithic GPU die. With e.g. Infinity Fabric, chiplets could also be put on graphics cards.

IMO this is the way things will start to go; the only downside is that we may end up with even more SKUs. ;)

Thinking further out, I wonder whether chiplets (on cards) could allow for a user-customisable layout, i.e. you buy a card with maybe 8 or 16 sockets, into which you install modules depending on your workload. Whether you want rasterisation, RT, GPGPU, or a custom ASIC for some blockchain or AI task, you add the power you need for the job. It would also allow gradual upgrades and customisation at a per-game level (to a point - it would be a hassle to swap chips every time you change game), i.e. when a new engine comes out, you may need to "rebalance" your "chip-set" <G>.
 
Joined
Jun 28, 2016
Messages
3,595 (1.17/day)
There is a reason why RTX GPUs are not selling well. Last I heard, Nvidia had over $1.6 billion in unsold inventory :nutkick::D. That's what happens when you overprice GPUs that don't deserve such high prices.
It would be nice to provide some data for sales. I don't have any, but from what I see on store websites and on forums like this one, RTX already seems to have surpassed Vega in popularity among heavy gamers. On one hand, Nvidia is a lot larger, so this was expected. On the other, RTX is a premium lineup - it's as if Radeon VII were outselling Vega 56. Unlikely.

The inventory stems from the crypto crash, and that's common knowledge you also possess. Stop manipulating. They had already reported $1.4B in October 2018.
Nvidia has warehouses full of small Maxwell and Pascal chips they struggle to move. It'll most likely be written off - hence the drop in the stock lately.

And it seems you haven't looked into AMD's financial statements lately.
They have $0.8B in inventory (12% of FY revenue), so their situation is not exactly better than Nvidia's (14% of FY revenue).
And AMD's inventory is likely mostly GPUs, while its revenue is CPU+GPU.
 
Joined
Jan 17, 2006
Messages
932 (0.14/day)
Location
Ireland
System Name "Run of the mill" (except GPU)
Processor R9 3900X
Motherboard ASRock X470 Taich Ultimate
Cooling Cryorig (not recommended)
Memory 32GB (2 x 16GB) Team 3200 MT/s, CL14
Video Card(s) Radeon RX6900XT
Storage Samsung 970 Evo plus 1TB NVMe
Display(s) Samsung Q95T
Case Define R5
Audio Device(s) On board
Power Supply Seasonic Prime 1000W
Mouse Roccat Leadr
Keyboard K95 RGB
Software Windows 11 Pro x64, insider preview dev channel
Benchmark Scores #1 worldwide on 3D Mark 99, back in the (P133) days. :)
Is AMD's inventory stated as being GPU specifically? Just asking.
 