
Dying Light 2 Benchmark Test & Performance Analysis

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.03/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
An interesting way to appear to dismantle my point without really addressing it. So where do you believe I'm coming from? I've extrapolated numbers from professional reviews; feel free to share others and I'll happily analyze them, where possible, to show the true cost and performance loss of enabling RT.
You seem determined to take away even the smallest win AMD has with RT (probably in traditional rendering too), so I won't consider your arguments; we all know what this behavior is. I have better things to do. If I want arguments like this, I'll go to YouTube/Reddit (which I don't).

PS. It's astonishing that people expect a debate or civil conversation after being toxic/aggressive elsewhere five minutes earlier. Won't happen; learn some manners. Just because you sit anonymously behind a keyboard somewhere doesn't give you the right to be an ***.

/thread
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,246 (1.28/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
You seem determined to take away even the smallest win AMD has with RT (probably in traditional rendering too), so I won't consider your arguments; we all know what this behavior is.
Not at all; in fact, not even close. I don't want to take anything away from AMD, certainly not any wins, of which there are plenty with RDNA2.

I am concerned with understanding the performance impacts and architectural differences (among many other things) as best I can, if only to aid myself and satisfy my curiosity, but also to help others make informed decisions. It is in this spirit that I quoted what you originally said, as it is contrary to the evidence I have seen and presented.

If you don't want to have the discussion, consider my arguments and evidence, and either make a great point that teaches me something and broadens my perspective, or be willing to learn yourself, then I guess we both have better things to do? This can certainly be civil.
 
Joined
Jul 19, 2011
Messages
540 (0.11/day)
This game is demanding on the new consoles. It's the only game I know of that limits them to 1080p 60. The Series S only manages 30 fps, a first AFAIK, since 60 fps would likely force a sub-720p resolution.
 
Joined
Jun 21, 2015
Messages
66 (0.02/day)
Location
KAER MUIRE
System Name Alucard
Processor M2 Pro 14"
Motherboard Apple thingy all together
Cooling no Need
Memory 32 Shared Memory
Video Card(s) 30 units
Storage 1 TB
Display(s) Acer 2k 170Hz, Benq 4k HDR
Mouse Logictech M3
Keyboard Logictech M3
Software MacOs / Ubuntu
Poor me, not getting this game then. My AMD card will be struggling.
 
Joined
Dec 12, 2012
Messages
777 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
No HDR support in 2022. :banghead:

I played for two hours with DF optimized settings (with RT GI and AO). The game barely manages to keep 4K60 with DLSS Performance most of the time; I saw drops to the mid-40s in a couple of scenes.

I think draw distance is the main culprit behind these high requirements. There is almost no pop-in, no visible LOD changes. The first game had a crazy draw distance at launch, which was killing CPUs. They drastically lowered it in a patch and the game ran much better.
The CPU side is not a problem this time, but the GPU definitely is. They should expose more settings to adjust, and draw distance should be one of them.
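To illustrate why that draw distance is so expensive: engines typically pick each object's level of detail from camera distance, so pushing the thresholds out keeps far more geometry at high detail. A toy sketch (the thresholds and levels are made up, not Techland's):

```python
# Illustrative sketch of distance-based LOD selection. Thresholds are
# invented for the example; real engines tune these per asset.

def lod_for_distance(distance_m, thresholds=(50, 150, 400)):
    """Return LOD level 0 (full detail) .. 3 (lowest) for an object."""
    for level, limit in enumerate(thresholds):
        if distance_m < limit:
            return level
    return len(thresholds)  # beyond the last threshold: lowest LOD (or culled)

# Raising the far thresholds keeps distant objects at higher detail,
# which is roughly why a long draw distance costs so much GPU time.
print(lod_for_distance(30))   # 0
print(lod_for_distance(200))  # 2
print(lod_for_distance(999))  # 3
```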
 
Deleted member 24505

Guest
I'm playing it on my 980ti at 60fps with peasant settings and am perfectly happy. :laugh:
 
Joined
Nov 11, 2016
Messages
3,459 (1.17/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
I tried force-enabling Resizable BAR using NV Inspector, since Dying Light 2 is not a whitelisted game in the latest driver, and gained about 6% higher FPS.
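For anyone who wants to sanity-check a gain like that from their own logs, the arithmetic is just a ratio of average frame rates. A minimal sketch (the frame times below are invented for illustration, not from my runs):

```python
# Illustrative before/after comparison from two frame-time logs.
# The numbers are made up to land near the ~6% figure quoted above.

def average_fps(frame_times_ms):
    """Average FPS from a list of per-frame times in milliseconds."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

baseline = [16.9, 17.1, 16.8, 17.2]   # ReBAR off
rebar_on = [16.0, 15.9, 16.1, 16.0]   # ReBAR forced on

gain = average_fps(rebar_on) / average_fps(baseline) - 1.0
print(f"{gain:+.1%}")  # roughly +6% with these illustrative numbers
```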
 
Joined
Jul 19, 2011
Messages
540 (0.11/day)
I'm playing it on my 980ti at 60fps with peasant settings and am perfectly happy. :laugh:
I guess the lack of dynamic resolution keeps the consoles down on resolution. The 3060 just misses 1440p 60 by a few frames. The 1660ti is not far off from 1080p 60. Strange they wouldn't make a few tweaks to the consoles in order to hit 1440p 60 on the SX/PS5 and 1080p 60 on the SS.
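For reference, the dynamic resolution the consoles are missing here is usually just a feedback loop on frame time: drop the render scale when over budget, raise it when comfortably under. A toy sketch (constants are illustrative, not from any real engine):

```python
# Illustrative dynamic-resolution feedback loop targeting a 60 fps budget.
# Step size, hysteresis margin, and clamp range are all made-up values.

def adjust_scale(scale, frame_ms, budget_ms=16.6, step=0.05,
                 lo=0.5, hi=1.0):
    """Lower render scale when over budget, raise it when well under."""
    if frame_ms > budget_ms:
        scale -= step
    elif frame_ms < 0.9 * budget_ms:
        scale += step
    return max(lo, min(hi, scale))

scale = 1.0
for frame_ms in (20.0, 19.0, 17.0, 15.0, 14.0):  # simulated frame times
    scale = adjust_scale(scale, frame_ms)
print(round(scale, 2))  # settles at 0.9 for this simulated sequence
```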
 
Deleted member 24505

Guest
I guess the lack of dynamic resolution keeps the consoles down on resolution. The 3060 just misses 1440p 60 by a few frames. The 1660ti is not far off from 1080p 60. Strange they wouldn't make a few tweaks to the consoles in order to hit 1440p 60 on the SX/PS5 and 1080p 60 on the SS.

I'm on 1080p, monitor is 165hz too :laugh:
 

Joined
Sep 4, 2020
Messages
93 (0.06/day)
Good review once again! Could you add CPU core-count tests with games, i.e. test it running on 4 cores, then 6, 8, 12, etc., to see what you really need as a minimum for decent frame rates?

Thanks
 

Kanan

Tech Enthusiast & Gamer
Good review once again! Could you add CPU core-count tests with games, i.e. test it running on 4 cores, then 6, 8, 12, etc., to see what you really need as a minimum for decent frame rates?

Thanks
Depends on your GPU, but normally a 4-core (8-thread) or 6-core CPU is enough. This is a typical single-player game; it will be more demanding on the GPU side.
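If you want to approximate that test yourself on Windows, one rough approach is to launch the game with `start /affinity <hex mask> game.exe` and benchmark at each core count. The helper below just computes the mask; it assumes 2-way SMT with the first N physical cores mapping to the first 2N logical CPUs, which may not match every topology:

```python
# Illustrative helper: build the hex affinity mask for `start /affinity`
# covering the first n physical cores, assuming 2 logical CPUs per core.

def affinity_mask(n_cores, threads_per_core=2):
    """Bitmask selecting the logical CPUs of the first n physical cores."""
    n_logical = n_cores * threads_per_core
    return (1 << n_logical) - 1

for cores in (4, 6, 8):
    print(f"{cores} cores -> start /affinity {affinity_mask(cores):X}")
# 4 cores -> FF, 6 cores -> FFF, 8 cores -> FFFF
```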
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,968 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
The 5800X is a little low-end versus a 12900K now. I hope to see the config updated soon.
Makes only a tiny difference. Maybe you can come here, bring the hardware, and set it up? Very little time to mess around these days; gotta get work done first.
 
Joined
Sep 17, 2014
Messages
22,679 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
They didn't skip it. They needed time to develop it for RDNA 2. Nvidia developed this over 10 years. That's why Nvidia's is so much better, including a way bigger RT core, not comparable to AMD's, which is tiny.

Nvidia developed this over 10 years? Don't believe every marketing line you hear. The drive to push RT on consumer GPUs is purely financial and marketing-related. Every GPU eats rasterized workloads for breakfast right about now, even at very high resolutions. GPU makers have been hitting a wall in terms of potential sales since 2017. Realistically, RT has been technically 'feasible' since Volta. That's not 10 years old; the cores are another iteration of what they built for enterprise, repurposed.

This echoes in the implementations of RT today. You barely get a noticeable difference on top of just solid rasterized lighting methods. It's grossly expensive to run RT in real time on current GPUs, and the benefits are barely tangible. So not only do you get knockoff, bottom-barrel cut-die GPUs, you get them with lackluster VRAM, very high TDPs, and a technology that mostly just kills your FPS.

It also echoes in the amount of RT-capable content at launch. The whole industry got caught by surprise, and it really still is. Look at the implementation here and you have yet another confirmation. If they were really at it for a decade, they sure did a shit job getting people involved.

10 years my ass ;)
 

Kanan

Tech Enthusiast & Gamer
Nvidia developed this in 10 years? Don't believe every marketing line you hear. The drive to push RT on consumer GPUs is purely financial and marketing-related. Every GPU eats rasterized workloads for breakfast right about now, even at very high resolutions. GPU mfgrs are hitting a wall in terms of potential sales since 2017. Realistically, RT was technically 'feasible' since Volta. That's not 10 years old, the cores are another iteration of what they built for enterprise. Repurposed.
I don't agree at all. First of all, if you tell me the "10 years" is wrong: source? Give me an exact amount of time they needed to do it; otherwise I really don't see a point. :) The RT core works well, and it (including the tensor cores) could very well have taken a long time to develop.

Secondly, RT is way, way more than just financially motivated; that's highly cynical of you. Nvidia isn't a perfect company and does a lot of things for money, but they love graphics, and THIS is more than financially motivated.

Thirdly, RT was not only technically feasible since Volta; if you want to be exact, it was "feasible" way sooner. But you stretch the meaning of the word "feasible" here. We are talking about REAL-TIME ray tracing, not whatever you are thinking about, so your whole "point" really isn't a point.

Fourth, a source for an RT core that came before the 20 series? There was none; the 20 series has the first-gen RT core. That is also wrong.
This echoes in the implementations of RT as of today. You barely get a noticeable difference on top of just solid rasterized lighting methods
Let me stop you right there. You don't have an RT card, and I don't think you've ever really seen it in action. It looks great in a NUMBER of games, including ones I've played myself: CP77, Control, MW5. You're 100% wrong about this.
10 years my ass ;)
Sorry, bro, you don't have a point. I really don't care about empty banter.
 
Joined
Sep 17, 2014
Messages
22,679 (6.05/day)
Where is your source that it's 10 years in the making? Nvidia documents things quite well; shouldn't be a problem, eh? Their GPUs and products show no evidence of a strong push for (RT)RT until Volta/Turing.

As for the 'real-time' part of ray tracing, in fact... you'd be wrong. What carries RTX cards are the updated Quadros for content producers, which is where the real money is. *removed diff node - this was Turing (TSMC).

Otherwise, the best evidence we have is what is on the market at moment X/Y, which is what I posted about.

The only source I know of for "10 years in the making" is Jensen saying so at SIGGRAPH. He also said the industry was all in on this... meanwhile, Nvidia is throwing money at devs left and right to get their pre-emptive nonsense into games.

I mean, really, just look at what happens in the gaming market and you can distill the evidence quite easily. AMD was confident enough to postpone this completely, and a full console gen is doing half-assed RT. That's not broad industry support in the slightest.
 

Kanan

Tech Enthusiast & Gamer
Where is your source its 10 years in the making? Nvidia documents things quite well. Shouldn't be a problem eh?
I asked first? :)
Otherwise, the best evidence we have is what we have on the market at moment X/Y. Which is what I posted about.
The best evidence for what?
The only source I know for 10 years in the making is Jensen saying so on SIGGRAPH. He also said the industry was all in on this... meanwhile Nvidia is throwing money at devs left and right to get their pre-emptive nonsense in games.
Well, they did that for a long time; it has upsides and downsides. The upside is "sometimes" better graphics, or generally new features getting implemented; the downside is that it *can* make things worse in general if you don't have an Nvidia card. But honestly, "TWIMTBP" is so old nobody talks about it anymore. Nvidia optimizations are still there, but mostly for RT/DLSS now; everything else seems to be mostly gone. And I really don't mind their RT/DLSS optimizations, since everyone who seriously wants RT buys an Nvidia card anyway.
I mean really, just look at what happens in the gaming market, and you can distill the evidence quite easily.
I don't see things that cynically. I'm trying to be positive.

As for the 'Real time' part of ray-tracing, in fact... you'd be wrong. What carries RTX cards are the updated Quadros for content producers, which is where the real money is. *removed diff node - this was Turing (TSMC).
Your ninja edit, my ninja edit here: you're saying TURING, and TURING IS the 20 series, so you're confirming what I said. Turing is a GAMING arch first and a workstation arch second. AMPERE is a mixed arch, more optimized for compute workloads; it's not that efficient for gaming. Look at RDNA 2.
 

bug

Joined
May 22, 2015
Messages
13,843 (3.95/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
The example I keep using is Hollywood effects. Those have all been ray traced since the original Jurassic Park (if not earlier). Without having to resort to as many tricks, ray tracing is simply capable of more realistic results. Of course, it can be argued that games don't always need to look realistic, but that misses two important points. First, games are not the only intended target for video cards. And second, even if the baseline looks more realistic, there's no reason you can't have artistic effects while employing ray tracing.

Also, leave Vayra alone. He decided years ago that RT is a gimmick; arguments won't change his mind. He's entitled to his opinion; you just have to acknowledge it's just his opinion and nothing more.
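For the curious, the primitive underneath all of that film-style rendering is plain intersection math; a toy ray-sphere test (illustrative only, not any renderer's actual code):

```python
# Illustrative ray-sphere intersection: the core query a ray tracer
# answers billions of times per frame. Assumes a normalized direction.
import math

def ray_sphere_t(origin, direction, center, radius):
    """Distance along the ray to the nearest sphere hit, or None on a miss."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # quadratic discriminant; a == 1 here
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# A ray from the origin along +z toward a unit sphere centered at z=5:
print(ray_sphere_t((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```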
 
Joined
Sep 17, 2014
Messages
22,679 (6.05/day)
I've already told you what I base my statements on, didn't I?
How can I find evidence of something 'not being said'? Interesting paradox, that :)

We won't get a conclusive answer on this, and you know it. It's just a choice of what you want to believe.

I'm a realist. You're an optimist. NP.

As for my opinion... what is the major issue we have right now in GPU land? Now scroll back to the early Turing days and see what I said about this.
The cost of GPUs has exploded, the shortages happened, and RT is available to only a tiny subset of gamers. Node shrinks later, we're looking at 350 W TDPs instead of 250 W.

And we call it progress.
 

Kanan

Tech Enthusiast & Gamer
The example I keep using is Hollywood effects. Those have all been ray traced since the original Jurassic Park (if not earlier). Without having to resort to as many tricks, ray tracing is simply capable of more realistic results. Of course, it can be argued that games don't always need to look realistic, but that's missing two important points. First, that games are not the only intended target for video cards. And second, even if the baseline looks more realistic, there's no reason you can't have artistic effects while employing ray tracing.
I agree that RTRT could be better implemented today; everyone knows this. It's only in its 2nd gen now, and AMD also slowed devs down with their "anti-optimizations" tuned for their inferior "Ray Accelerators", which are too small to be really good.

However, that said, it is still great in some games, and that shows it CAN make a dent.
Also, leave Vayra alone. He's decided years ago RT is a gimmick, arguments won't change his mind. He's entitled to his opinion, you just have to acknowledge it's just his opinion and nothing more.
He should really move on from his outdated GPU and try it out for himself, in my opinion. Otherwise, whatever.

How can I find evidence of something 'not being said'? Interesting paradox, that :)
Yeah, well, then there's no point in saying I'm wrong, right? Just leave it be then.
I'm a realist. You're an optimist. NP.
Honestly, realists are mostly pessimists in disguise. I know; I was like that too. It's too hard to be a realist without being kind of negative. But I don't care about that; you have some cynical views, and before I came back I also saw that in multiple other posts. I think you can do better. Just my opinion; you can disregard it if you want.

And we call it progress.
I don't, but I can't comment on it further now. :)
 
Joined
Sep 17, 2014
Messages
22,679 (6.05/day)
The example I keep using is Hollywood effects. Those have all been ray traced since the original Jurassic Park (if not earlier). Without having to resort to as many tricks, ray tracing is simply capable of more realistic results. Of course, it can be argued that games don't always need to look realistic, but that's missing two important points. First, that games are not the only intended target for video cards. And second, even if the baseline looks more realistic, there's no reason you can't have artistic effects while employing ray tracing.

Also, leave Vayra alone. He's decided years ago RT is a gimmick, arguments won't change his mind. He's entitled to his opinion, you just have to acknowledge it's just his opinion and nothing more.

See... thát part of RT I don't contest at all. Yes, it can and does improve visual fidelity. At the same time, gaming is not film, and we've seen RT implementations that forgot that. Even Dying Light 2 shows this: RT makes things a lot darker, which, while realistic, is not always preferable in games.

Note: not always. I see how it works for some. Definitely open to arguments in that sense. I've also seen the tech live, contrary to what's been stated above by others.

I'm really still watching it evolve from the sidelines, because buying in today seems like a seriously bad buy.

However, that said, it is still great in some games and that shows that it CAN make a dent.

He should really move on from his outdated GPU and try it out for himself, in my opinion. Otherwise, whatever.
Absolutely and absolutely.

But the market right now is simply denying it.
 

Kanan

Tech Enthusiast & Gamer
Even Dying Light 2
This game is such a mess. Without RT it looks like an old game; with it, it looks properly ray traced. Weird. The difference is "night" and day, pun intended.
 

bug

Joined
May 22, 2015
Messages
13,843 (3.95/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
As for my opinion... what is the major issue we have right now in GPU land? Now scroll back to early Turing days and see what I've said about this.
Cost for GPUs has exploded, the shortages happened, and RT is now available to a tiny subset of gamers. Node shrinks later, we're looking at 350W TDPs instead of 250W.

And we call it progress.
Honestly, the first 3D accelerators added a whole new card to your system. And they were progress.
Never judge new tech by its first iterations. Potential is a far better indicator ;)
 
Last edited:

Kanan

Tech Enthusiast & Gamer
Never judge new tech by its first iterations.
Turing was way ahead of its time and has aged admirably. With RT enabled, it only loses about 5% more FPS than comparable newer GPUs like the 3080. The Tensor cores work well too.
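For clarity on how that kind of comparison is usually derived: the RT cost per card is just the relative FPS drop when the feature is toggled. A quick sketch with made-up FPS numbers (purely illustrative, not measured data):

```python
def rt_cost_pct(fps_raster, fps_rt):
    """Relative FPS loss from enabling RT, as a percentage."""
    return (1.0 - fps_rt / fps_raster) * 100.0

# Hypothetical example: 100 FPS rasterized dropping to 60 FPS with RT on
print(round(rt_cost_pct(100, 60)))  # 40
```

Comparing that percentage across two cards, rather than raw FPS, is what lets you say one architecture "loses ~5% more" than another.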
 
Top