
Half-Life 2 RTX Demo is out!

Are you interested in Half-Life 2 with Ray Tracing?

  • Yes! Bring it on!

    Votes: 32 42.7%
  • Yes, worth a try.

    Votes: 21 28.0%
  • No, I like the original look more.

    Votes: 13 17.3%
  • Something else, comment below.

    Votes: 9 12.0%

  • Total voters
    75

Mindweaver

Moderato®™
Staff member
Joined
Apr 16, 2009
Messages
8,389 (1.44/day)
Location
Charleston, SC
System Name Tower of Power / Delliverance
Processor i7 14700K / i9-14900K
Motherboard ASUS ROG Strix Z790-A Gaming WiFi II / Z690
Cooling CM MasterLiquid ML360 Mirror ARGB Close-Loop AIO / Air
Memory CORSAIR - VENGEANCE RGB 32GB (2x16GB) DDR5 7200MHz / DDR5 2x 16gb
Video Card(s) ASUS TUF Gaming GeForce RTX 4070 Ti / GeForce RTX 4080
Storage 4x Samsung 980 Pro 1TB M.2, 2x Crucial 1TB SSD / NVMe PC801 SK hynix 1TB
Display(s) Samsung 32" Odyssey G5 Gaming 144hz 1440p, 2x LG HDR 32" 60hz 4k / 2x LG HDR 32" 60hz 4k
Case Phantek "400A" / Dell XPS 8960
Audio Device(s) Realtek ALC4080 / Sound Blaster X1
Power Supply Corsair RM Series RM750 / 750w
Mouse Razer Deathadder V3 Hyperspeed Wireless / Glorious Gaming Model O 2 Wireless
Keyboard Glorious GMMK with box-white switches / Keychron K6 Pro with blue switches
VR HMD Quest 3 (512gb) + Rift S + HTC Vive + DK1
Software Windows 11 Pro x64 / Windows 11 Pro x64
Benchmark Scores Yes
I have to say I enjoyed playing it last night. It ran great on my system with my 14700K + 4070 Ti @ 1440p. I didn't touch any settings, just loaded in; I'll play around with them later. It's probably set to Balanced by default. Was I blown away? Nah, it looks good, but honestly I'm ready for HLX. lol I would rather the team convert it to Source 2.
 
Joined
Dec 9, 2024
Messages
286 (2.80/day)
Location
Missouri
System Name The
Processor Ryzen 7 5800X
Motherboard ASUS PRIME B550-PLUS AC-HES
Cooling Thermalright Peerless Assassin 120 SE
Memory Silicon Power 32GB (2 x 16GB) DDR4 3200
Video Card(s) RTX 2080S FE | 1060 3GB & 1050Ti 4GB In Storage
Display(s) Gigabyte G27Q (1440p / 170hz DP)
Case SAMA SV01
Power Supply Firehazard in the making
Mouse Corsair Nightsword
Keyboard Steelseries Apex Pro
I'll throw my computer at this thing to see. I don't doubt the struggle of Turing cards, though. I could barely get Portal RTX to run even after an hour of tuning, so I don't expect this to be any different. I can't really judge it if I can't run it, though.

I think the more disappointing thing, besides the 'oh wow, another RTX game that is designed to push graphics, pushes graphics, and runs badly on most hardware', is the fact that the team behind this game is so wonderfully talented, and the game is already getting dragged through the MUD. Hopefully they can see through the fog and realize it's just people upset the same way they were with Portal RTX.
Still, I think the team should have avoided working with NVIDIA outside of the software they used to make the game (you should look it up btw, it's actually pretty cool, on an unrelated note), because the trailers are a bad look when I don't think their intention was to sell graphics cards. I could be misinterpreting how that whole thing is handled anyway, though.

Also, I've heard rumors about GPU utilization leaks.

As a diehard HL fan, I'll also judge it based on how the lighting and RT implementation is done, if I can get it playable with good enough fidelity.
 
Last edited:
Joined
Jul 5, 2013
Messages
29,851 (6.98/day)
Sorry, you get that when immovable object meets impenetrable wall.
Immovable objects don't have conflicts with impenetrable walls. Your analogy is as flawed as your attitude, which desperately needs improvement.
Tell me again who's frustrated here.
Don't need to, everyone can see it.
It was an objective question
No, it was you trying to stir the pot while at the same time flying in the face of reason when you know damn well what the answer is. Anyone who cannot see the advantages of ray-traced lighting and other light-ray-based FX is someone oblivious to reality or just being belligerent for the sake of being annoying and unpleasant.

Moving on... and back on topic.

RandomGaminginHD just did another video with the Radeon 780M iGPU.
It was an unplayable slideshow. Even at 480p it was just sad.
 
Joined
Sep 17, 2014
Messages
23,542 (6.14/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Just for shits n giggles.

(8 screenshots attached)


If you look at the gist: performance. Performance. Performance. Overall looks are different, artistic style impacted, and 'I game to play games and not look at a blurry mess'.
 
Joined
Dec 9, 2024
Messages
286 (2.80/day)
Location
Missouri
System Name The
Processor Ryzen 7 5800X
Motherboard ASUS PRIME B550-PLUS AC-HES
Cooling Thermalright Peerless Assassin 120 SE
Memory Silicon Power 32GB (2 x 16GB) DDR4 3200
Video Card(s) RTX 2080S FE | 1060 3GB & 1050Ti 4GB In Storage
Display(s) Gigabyte G27Q (1440p / 170hz DP)
Case SAMA SV01
Power Supply Firehazard in the making
Mouse Corsair Nightsword
Keyboard Steelseries Apex Pro
Just for shits n giggles.


If you look at the gist: performance. Performance. Performance. Overall looks are different, artistic style impacted, and 'I game to play games and not look at a blurry mess'.
So, rather unsurprisingly, it's the same talking points people used for / against Portal RTX. Who woulda thunk?

Anyone who cannot see the advantages of ray-traced lighting and other light-ray-based FX is someone oblivious to reality or just being belligerent for the sake of being annoying and unpleasant.
I don't see RT as purely an advantage. I think stuff like Lumen is awesome, and it still looks good enough that I'm happy with it. Plus, it's pretty optimized.
I should preface that I don't mean to imply anything by saying this, but I don't think RT has a place in every game. I think raster can still look very good and can sometimes be the much better option overall for a game's sake, even if the game looks worse off for it.

Obviously most games don't use RT as their only lighting, and if they do, it's either as a tech demo of sorts like these RTX games have been, for benchmarking, or out of stupidity (Lumen, though, is fine). The future of RT is in optimized (but wisely used) RT, like Lumen kind of is (though it's only really the first half of that, but eh, whatever, it fits my opinion well enough).

I don't see RT as 'the future', not the way it's currently being used. I see stuff like Lumen, and that approach to RT, as the future. And who knows, I could still be wrong.

As for how it affects this game, and its lighting compared to the original? I'll see. I think I'm setting myself up for disappointment, though.

I haven't had a chance to throw my RT opinion out there, so rant over.
 
Joined
Jul 30, 2019
Messages
3,536 (1.72/day)
System Name Still not a thread ripper but pretty good.
Processor Ryzen 9 7950x, Thermal Grizzly AM5 Offset Mounting Kit, Thermal Grizzly Extreme Paste
Motherboard ASRock B650 LiveMixer (BIOS/UEFI version P3.08, AGESA 1.2.0.2)
Cooling EK-Quantum Velocity, EK-Quantum Reflection PC-O11, D5 PWM, EK-CoolStream PE 360, XSPC TX360
Memory Micron DDR5-5600 ECC Unbuffered Memory (2 sticks, 64GB, MTC20C2085S1EC56BD1) + JONSBO NF-1
Video Card(s) XFX Radeon RX 5700 & EK-Quantum Vector Radeon RX 5700 +XT & Backplate
Storage Samsung 4TB 980 PRO, 2 x Optane 905p 1.5TB (striped), AMD Radeon RAMDisk
Display(s) 2 x 4K LG 27UL600-W (and HUANUO Dual Monitor Mount)
Case Lian Li PC-O11 Dynamic Black (original model)
Audio Device(s) Corsair Commander Pro for Fans, RGB, & Temp Sensors (x4)
Power Supply Corsair RM750x
Mouse Logitech M575
Keyboard Corsair Strafe RGB MK.2
Software Windows 10 Professional (64bit)
Benchmark Scores RIP Ryzen 9 5950x, ASRock X570 Taichi (v1.06), 128GB Micron DDR4-3200 ECC UDIMM (18ASF4G72AZ-3G2F1)
NEWS FLASH, 8GB of VRAM is still enough for gaming in 2025! Also, raytracing is here to stay.
My 4060 LP with 8GB of VRAM was doing just fine, getting 45 to 55 FPS last night before the game crashed with the latest Game Ready driver. I don't really see the problem, but I've kind of been stuck under the umbrella of 60 Hz gaming for a long time. Everyone wants the best graphics they can get, but at some point you just need to relax and enjoy the game for what it is, using whatever adjustments you can make to get there.
 
Joined
Jul 5, 2013
Messages
29,851 (6.98/day)
I don't see RT as 'the future', not the way it's currently being used.
You just highlighted the key point: there's more than one way to skin a cat. Ray tracing is flexible and can be done in many varying ways. The general concept of tracing light rays is the same, but how it's done is very customizable for the specific needs of the use case.

However, until a better way to replicate lighting comes about, mimicking nature with ray tracing is now the standard and isn't going anywhere.

My 4060 LP with 8GB of VRAM was doing just fine, getting 45 to 55 FPS last night before the game crashed with the latest Game Ready driver.
I'm betting real money you could get above 60 consistently if you tweak some more.
 
Joined
Jul 30, 2019
Messages
3,536 (1.72/day)
System Name Still not a thread ripper but pretty good.
Processor Ryzen 9 7950x, Thermal Grizzly AM5 Offset Mounting Kit, Thermal Grizzly Extreme Paste
Motherboard ASRock B650 LiveMixer (BIOS/UEFI version P3.08, AGESA 1.2.0.2)
Cooling EK-Quantum Velocity, EK-Quantum Reflection PC-O11, D5 PWM, EK-CoolStream PE 360, XSPC TX360
Memory Micron DDR5-5600 ECC Unbuffered Memory (2 sticks, 64GB, MTC20C2085S1EC56BD1) + JONSBO NF-1
Video Card(s) XFX Radeon RX 5700 & EK-Quantum Vector Radeon RX 5700 +XT & Backplate
Storage Samsung 4TB 980 PRO, 2 x Optane 905p 1.5TB (striped), AMD Radeon RAMDisk
Display(s) 2 x 4K LG 27UL600-W (and HUANUO Dual Monitor Mount)
Case Lian Li PC-O11 Dynamic Black (original model)
Audio Device(s) Corsair Commander Pro for Fans, RGB, & Temp Sensors (x4)
Power Supply Corsair RM750x
Mouse Logitech M575
Keyboard Corsair Strafe RGB MK.2
Software Windows 10 Professional (64bit)
Benchmark Scores RIP Ryzen 9 5950x, ASRock X570 Taichi (v1.06), 128GB Micron DDR4-3200 ECC UDIMM (18ASF4G72AZ-3G2F1)
I'm betting real money you could get above 60 consistently if you tweak some more.
I might need some guidance with that as I'm not quite familiar with Nvidia settings.
 
Joined
Jul 5, 2013
Messages
29,851 (6.98/day)
I might need some guidance with that as I'm not quite familiar with Nvidia settings.
The settings I'm talking about are in game. You just need to tinker with dialing things down until you reach your target FPS while hitting the quality you like. Keep in mind, this is an unoptimized beta, so things in the engine are going to change and get better.
 
Joined
Jul 30, 2019
Messages
3,536 (1.72/day)
System Name Still not a thread ripper but pretty good.
Processor Ryzen 9 7950x, Thermal Grizzly AM5 Offset Mounting Kit, Thermal Grizzly Extreme Paste
Motherboard ASRock B650 LiveMixer (BIOS/UEFI version P3.08, AGESA 1.2.0.2)
Cooling EK-Quantum Velocity, EK-Quantum Reflection PC-O11, D5 PWM, EK-CoolStream PE 360, XSPC TX360
Memory Micron DDR5-5600 ECC Unbuffered Memory (2 sticks, 64GB, MTC20C2085S1EC56BD1) + JONSBO NF-1
Video Card(s) XFX Radeon RX 5700 & EK-Quantum Vector Radeon RX 5700 +XT & Backplate
Storage Samsung 4TB 980 PRO, 2 x Optane 905p 1.5TB (striped), AMD Radeon RAMDisk
Display(s) 2 x 4K LG 27UL600-W (and HUANUO Dual Monitor Mount)
Case Lian Li PC-O11 Dynamic Black (original model)
Audio Device(s) Corsair Commander Pro for Fans, RGB, & Temp Sensors (x4)
Power Supply Corsair RM750x
Mouse Logitech M575
Keyboard Corsair Strafe RGB MK.2
Software Windows 10 Professional (64bit)
Benchmark Scores RIP Ryzen 9 5950x, ASRock X570 Taichi (v1.06), 128GB Micron DDR4-3200 ECC UDIMM (18ASF4G72AZ-3G2F1)
The settings I'm talking about are in game. You just need to tinker with dialing things down until you reach your target FPS while hitting the quality you like. Keep in mind, this is an unoptimized beta, so things in the engine are going to change and get better.
Ah ok. I did go through those last night. I will double-check, but I recall being underwhelmed by the available options.
 
Joined
Dec 26, 2006
Messages
4,027 (0.60/day)
Location
Northern Ontario Canada
Processor Ryzen 5700x
Motherboard Gigabyte X570S Aero G R1.1 BiosF5g
Cooling Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm
Memory Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A)
Video Card(s) AMD RX 6800 - Asus Tuf
Storage Kingston KC3000 1TB & 2TB & 4TB Corsair MP600 Pro LPX
Display(s) LG 27UL550-W (27" 4k)
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220-VB
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos Pro
Keyboard Corsair Strafe with browns
Software W10 22H2 Pro x64
Joined
Jan 14, 2019
Messages
15,200 (6.73/day)
Location
Midlands, UK
System Name My second and third PCs are Intel + Nvidia
Processor AMD Ryzen 7 7800X3D
Motherboard MSi Pro B650M-A Wifi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000 CL36
Video Card(s) PowerColor Reaper Radeon RX 9070 XT
Storage 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda
Display(s) Dell S3422DWG 34" 1440 UW 144 Hz
Case Kolink Citadel Mesh
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply 750 W Seasonic Prime GX
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
This is why I don't get all the kvetching about RT being slow; rasterisation used to be slow too and, guess what, the hardware eventually caught up. Everyone seems to have forgotten how truly wretched it was trying to get big-name games to run at playable framerates on graphics hardware between 1995 and 2010; since then we've been truly spoiled by how powerful that hardware has become. RT is that decade-and-a-half cycle repeating: just like back then, people are frustrated at how slowly things seem to move, and just like back then, it is getting, and will get, better.

Yes, there is the Moore's Law wall to contend with this time around, but we also have other solutions like frame generation to help us overcome the limitations of physics. You may not like them, but they are solutions, they do work well enough, and they too are getting and will get better. And eventually the hardware will become powerful enough to not need them, and we will discard them.
The problem back then was that technology moved so fast, and games pushed boundaries with every release, that you had to buy a new graphics card, or even do a complete system upgrade, every year. I got the PC I mentioned above in 1998. The GeForce 256 came out only a year later; the difference was night and day.

The problem now is that (some) game developers treat things like RT as a tick-box exercise, something they have to use at all costs, instead of something that makes their game better. And this has been going on for at least 6 years, without much advancement in hardware. A 2080 Ti is still a decent piece of hardware, even though it's 3 generations behind. Like you said, there's the Moore's Law wall, but we're closing our eyes, pretending it isn't there.

If RT really were the future, I'd like to see Nvidia and AMD making advancements on that front. I mean, AMD has with RDNA 4, as it runs RT miles better than the 7900 XTX, but if you look at Nvidia's performance charts, you'll see that Blackwell is gimped by RT to the same degree as Ada, Ampere and Turing were, so nothing has changed there architecturally.

Frame generation doesn't solve anything as long as it can't deal with a low-frame-rate input without introducing severe lag and graphical glitches. Lipstick on a pig.
 
Joined
Jul 5, 2013
Messages
29,851 (6.98/day)
Ah ok. I did go through those last night. I will double-check, but I recall being underwhelmed by the available options.
The only thing I can think of that applies to everyone is AA. Turn off any form of AA, regardless of type, and things should improve. How dramatic the improvement will be, I can't say.

you'll see that Blackwell is gimped by RT to the same degree as Ada, Ampere and Turing were, so nothing has changed there architecturally.
There's a reason for that: there are only a few ways to calculate light-ray trajectories, and it's very mathematically intensive. There is no avoiding that fact, and at the moment there are no shortcuts that work well like there are with rasterization. To do ray tracing properly, it has to be done the long (right) way. We can't fake simulating physics accurately.

Make no mistake though, accurately simulating physics in games is the future and the way forward. RT is here to stay.
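A quick aside on how intensive that per-ray math is. The toy sketch below has nothing to do with how the demo or RT cores actually implement anything; it just shows that even the simplest possible primitive test, one ray against one sphere, takes a full quadratic solve, and a 1080p frame casts over two million primary rays before a single bounce or shadow ray is considered:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance to the nearest ray-sphere intersection, or None.

    Solves |o + t*d - c|^2 = r^2, a quadratic in t: a handful of
    multiply-adds plus a discriminant test per ray/object pair.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None                        # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2.0 * a)  # nearer of the two roots
    return t if t > 0.0 else None

# One primary ray per pixel at 1080p, before any bounces:
rays_per_frame = 1920 * 1080
print(rays_per_frame)  # 2073600 intersection tests per object, per frame
```

Multiply that by thousands of objects (even with acceleration structures pruning most of them), several bounces per ray, and multiple rays per pixel for the denoiser to chew on, and the cost explodes. That's the math the RT hardware is grinding through every frame.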
 
Last edited:
Joined
Jan 19, 2023
Messages
443 (0.56/day)
I would only add that, in addition to RT, I want physics in games to really take off, and this seems to have gone backwards instead of forwards. If we have RT lighting that lets dynamic time of day and a dynamic environment look good, then make the damn environment really dynamic. Let us break stuff, move it, destroy it, do whatever. The CPU power is there, I believe, so there should be no reason not to improve that aspect.

It's hard to sell RT when baked lighting would work in a game. Make that game have advanced destruction; it will show off both RT and physics.
 
Joined
Jan 14, 2019
Messages
15,200 (6.73/day)
Location
Midlands, UK
System Name My second and third PCs are Intel + Nvidia
Processor AMD Ryzen 7 7800X3D
Motherboard MSi Pro B650M-A Wifi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000 CL36
Video Card(s) PowerColor Reaper Radeon RX 9070 XT
Storage 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda
Display(s) Dell S3422DWG 34" 1440 UW 144 Hz
Case Kolink Citadel Mesh
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply 750 W Seasonic Prime GX
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
There's a reason for that: there are only a few ways to calculate light-ray trajectories, and it's very mathematically intensive. There is no avoiding that fact, and at the moment there are no shortcuts that work well like there are with rasterization. To do ray tracing properly, it has to be done the long (right) way. We can't fake simulating physics accurately.

Make no mistake though, accurately simulating physics in games is the future and the way forward. RT is here to stay.
I mean, the 9070 XT, which roughly equals a 7900 XT in raster, is faster than the XTX in RT. On that logic, the 5070 Ti, which equals a 4070 Ti Super in raster, should be much faster than that in RT, right? But it's not. That's why I'm saying that RT isn't moving anywhere. Not forward, not back; it just is. After 4 Nvidia generations now. It still eats your PC for breakfast. Nothing is changing. This is not how I envisioned RT being the future 6 years ago.

Back in the HL 1 days, new games obliterated your PC, but you bought the next graphics card a year later and you were good. This isn't the case now.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,613 (1.32/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte B650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30 tuned
Video Card(s) Palit Gamerock RTX 5080 oc
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Back in the HL 1 days, new games obliterated your PC, but you bought the next graphics card a year later and you were good. This isn't the case now.
Yeah, it's RT, but it's also the genuine stagnation and slowing of progress in making chips smaller and faster in every metric; those days are long gone. AMD definitely did great by disproportionately improving RT in RDNA4. Nvidia has changed the RT cores in Blackwell, but there could be reasons we don't see the benefits: a bottleneck elsewhere, or current games not being coded to take advantage of them. Still disappointing. AMD got a big leap perhaps because they were further behind and still had bigger wins to conquer that had already been conquered in Nvidia hardware (after all, Nvidia is still faster at RT).
 
Joined
Jul 5, 2013
Messages
29,851 (6.98/day)
I mean, the 9070 XT, which roughly equals a 7900 XT in raster, is faster than the XTX in RT. On that logic, the 5070 Ti, which equals a 4070 Ti Super in raster, should be much faster than that in RT, right? But it's not.
That's not a fair comparison; AMD's implementation of RT is different from Nvidia's.
Back in the HL 1 days, new games obliterated your PC, but you bought the next graphics card a year later and you were good. This isn't the case now.
That's because of the "Moore's Law" thing. As we edge closer to the atomic scale, advances in IC scaling diminish. We can't just drop the lithography node a notch and triple our compute power anymore. Those days are behind us.
After 4 Nvidia generations now. It still eats your PC for breakfast.
Yeah, that's how intense the math is.
This is not how I envisioned RT being the future 6 years ago.
100% agree with you on that one. Greater advances were expected. That's why optimizations in coding are so important right now.

My guess is that coders will find ways to do RT and related physics in games in a very optimized way, but it's going to take more time. Personally, I see someone coming up with a task-oriented ASIC specifically designed for RT/physics that will offload from, or perhaps supplement, GPUs to good effect.
 
Joined
Sep 10, 2018
Messages
7,890 (3.31/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
Yeah, it's RT, but it's also the genuine stagnation and slowing of progress in making chips smaller and faster in every metric; those days are long gone. AMD definitely did great by disproportionately improving RT in RDNA4. Nvidia has changed the RT cores in Blackwell, but there could be reasons we don't see the benefits: a bottleneck elsewhere, or current games not being coded to take advantage of them. Still disappointing. AMD got a big leap perhaps because they were further behind and still had bigger wins to conquer that had already been conquered in Nvidia hardware (after all, Nvidia is still faster at RT).

My hope is that this is a one-off for Nvidia, that Blackwell was rushed to meet AI demand and the next couple of generations will be better, or that they'll come out with something to mitigate process-node advances not being as robust as in the past. My worry is that the AI bubble will need to pop before they even care to invest in making better gaming GPUs.

Now that I am retired, I am likely in that 700-1k range for GPU upgrades, and honestly Blackwell worries me. Even though AMD made some good strides, it's not hard when you're so far behind at a specific GPU task; kind of how Zen 1 murdered Bulldozer but was only at Haswell levels of IPC: amazing, but at the time Intel was still ahead. Hopefully they can carry the same momentum, and by the time I need my next upgrade they'll have a killer 700-900 USD GPU.

I guess only time will tell, but even really old games like this still struggle on cutting-edge hardware unless you really compromise the image quality. I get that it's more of a tech demo and likely not as optimized as a ground-up product would be, but just imagine how something like Hellblade 2 would perform with even this level of RT. lol
 
Joined
Jan 14, 2019
Messages
15,200 (6.73/day)
Location
Midlands, UK
System Name My second and third PCs are Intel + Nvidia
Processor AMD Ryzen 7 7800X3D
Motherboard MSi Pro B650M-A Wifi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000 CL36
Video Card(s) PowerColor Reaper Radeon RX 9070 XT
Storage 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda
Display(s) Dell S3422DWG 34" 1440 UW 144 Hz
Case Kolink Citadel Mesh
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply 750 W Seasonic Prime GX
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
Yeah it's RT but also just the actual stagnation / slowing of progress in actually making chips smaller and faster in every metric, those days are long gone. AMD definitely did great by disproportionately improving RT in RDNA4, Nvidia has changed the RT cores in Blackwell but there could be other reasons we don't see the benefits, there could be a bottleneck elsewhere, or the way current games are coded don't take advantage of it, still disappointing. AMD got a big leap perhaps because they were further behind and they still had bigger wins to conquer that were already conquered in Nvidia hardware (after all they are still faster at RT).
I know it sounds mad, but I don't think Nvidia has done much with the RT cores since Turing. If the 2080 lost, let's say, 48% performance in game X with RT on vs off, then so did the 3080, and the 4080, and the 5080. The only reason we see more FPS is that we see more FPS in general, from having more cores running at higher frequencies, not because those cores have improved. The relation between raster and RT performance has stayed the same. This is not the "RT is the future" promise we've been fed for the last 6 years.
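For what it's worth, that claim is easy to frame as a measurement: compare the fraction of raster performance each card loses when RT is switched on, rather than raw FPS. The numbers below are made up purely to show the calculation, not real benchmark data:

```python
def rt_penalty(fps_raster, fps_rt):
    """Fraction of performance lost when enabling RT on the same card."""
    return 1.0 - fps_rt / fps_raster

# Hypothetical FPS figures: each generation is faster overall,
# yet loses the same ~48% share of its raster FPS with RT on,
# which is exactly the "nothing changed per-core" scenario.
cards = {
    "2080": (100, 52),
    "3080": (150, 78),
    "4080": (220, 114.4),
}
for name, (raster, rt) in cards.items():
    print(name, round(rt_penalty(raster, rt), 2))  # 0.48 for each
```

If the RT cores genuinely improved relative to the rest of the chip, that penalty fraction should shrink generation over generation; if it holds steady, the raw FPS gains are just the whole GPU getting faster.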

Sure, AMD made a huge leap with RDNA 4 on the RT front because they had more ground to cover, I give you that.

That's not a fair comparison. AMD's implementation of RT is different from NVidia's.
And? I don't care about the implementation; I care about what I see on screen.

That's because of the "Moore's Law" thing. As we edge closer to the atomic scale, advances in IC scaling diminish. We can't just drop the lithography node a notch and triple our compute power anymore. Those days are behind us.
Exactly.

My guess is that coders will find ways to do RT and related physics in games in a very optimized way, but it's going to take more time. Personally, I see someone coming up with a task-oriented ASIC specifically designed for RT/physics that will offload from, or perhaps supplement, GPUs to good effect.
That wouldn't be bad. Just like dedicated AI chips wouldn't be bad for those who only need that, while leaving gaming GPUs alone.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,613 (1.32/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte B650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30 tuned
Video Card(s) Palit Gamerock RTX 5080 oc
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
I know it sounds mad, but I don't think Nvidia has done much with the RT cores since Turing. If the 2080 lost, let's say, 48% performance in game X with RT on vs off, then so did the 3080, and the 4080, and the 5080. The only reason we see more FPS is that we see more FPS in general, from having more cores running at higher frequencies, not because those cores have improved. The relation between raster and RT performance has stayed the same. This is not the "RT is the future" promise we've been fed for the last 6 years.
I suppose. I believe them when they say they made XYZ improvements to the RT core (double ray-triangle intersection speed?, and lying about that must be a serious form of fraud), but that facet mustn't be what current games with RT are actually benefitting from, or there's a bottleneck to the RT load itself elsewhere in the GPU, or some other variable I haven't considered, because I don't know what I don't know. Looking at the architecture breakdown of Blackwell, it's fairly typical of Nvidia from the last few generations: forward-looking, but it will take time for those features to actually get leveraged. It'll be interesting to watch whether the delta (or lack thereof) between it and Ada, and even further back, widens as the years go on or stays exactly the same. I'd certainly be interested in an expert engineer's detailed take on Ada->Blackwell and RDNA3->4 and why the RT improvements are or aren't realised, as the best I/we can do is speculate.
Sure, AMD made a huge leap with RDNA 4 on the RT front because they had more ground to cover, I give you that.
However they did it, bloody well done to them.
 
Joined
Sep 10, 2018
Messages
7,890 (3.31/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
I suppose. I believe them when they say they made XYZ improvements to the RT core (double ray-triangle intersection speed?, and lying about that must be a serious form of fraud), but that facet mustn't be what current games with RT are actually benefitting from, or there's a bottleneck to the RT load itself elsewhere in the GPU, or some other variable I haven't considered, because I don't know what I don't know. Looking at the architecture breakdown of Blackwell, it's fairly typical of Nvidia from the last few generations: forward-looking, but it will take time for those features to actually get leveraged. It'll be interesting to watch whether the delta (or lack thereof) between it and Ada, and even further back, widens as the years go on or stays exactly the same. I'd certainly be interested in an expert engineer's detailed take on Ada->Blackwell and RDNA3->4 and why the RT improvements are or aren't realised, as the best I/we can do is speculate.

However they did it, bloody well done to them.

It's also probably best to isolate the RT cores and compare.

2080 Ti to 3080 Ti only saw an increase in RT cores of about 18%, but at least on my end I was seeing a 50% uplift in RT-heavy games.

3080 to 4080 only saw an 11% gain in RT cores, but a much larger gain in RT performance.

While the overall performance of the GPU matters, most of it has been Nvidia being more reserved about adding RT cores, for whatever reason.
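A rough back-of-the-envelope sketch of the comparison above, using RT-core counts from public spec sheets. The 50% uplift figure is the observation claimed above, not a benchmark of mine, and this ignores clock speed, cache, and memory bandwidth differences between generations, so treat the per-core number as illustrative only.

```python
# RT-core counts per public spec sheets for each card.
cores = {"2080 Ti": 68, "3080 Ti": 80, "3080": 68, "4080": 76}

def pct_increase(old: int, new: int) -> float:
    """Percentage increase from old to new."""
    return (new / old - 1) * 100

# Generational growth in raw RT-core count.
turing_to_ampere = pct_increase(cores["2080 Ti"], cores["3080 Ti"])  # ~17.6%
ampere_to_ada = pct_increase(cores["3080"], cores["4080"])           # ~11.8%

# If RT-heavy games really gained ~50% from 2080 Ti to 3080 Ti,
# the implied per-RT-core uplift is the total gain divided by
# the increase in core count.
per_core = (1.50 / (cores["3080 Ti"] / cores["2080 Ti"]) - 1) * 100  # ~27.5%

print(f"Turing->Ampere RT cores: +{turing_to_ampere:.1f}%")
print(f"Ampere->Ada RT cores:    +{ampere_to_ada:.1f}%")
print(f"Implied per-core gain:   +{per_core:.1f}%")
```

So under those assumptions, most of the claimed 50% would have to come from per-core improvements (or clocks), not core count.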
 
Joined
Jan 14, 2019
Messages
15,200 (6.73/day)
Location
Midlands, UK
System Name My second and third PCs are Intel + Nvidia
Processor AMD Ryzen 7 7800X3D
Motherboard MSi Pro B650M-A Wifi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000 CL36
Video Card(s) PowerColor Reaper Radeon RX 9070 XT
Storage 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda
Display(s) Dell S3422DWG 34" 1440 UW 144 Hz
Case Kolink Citadel Mesh
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply 750 W Seasonic Prime GX
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
I suppose I believe them when they say they made XYZ improvements to the RT core (double ray-triangle intersection speed? Lying about that would be a serious form of fraud), but either that facet isn't what current games with RT are actually benefitting from, or the RT load is bottlenecked elsewhere in the GPU, or there's another variable I haven't considered, because I don't know what I don't know. Looking at the architecture breakdown of Blackwell, it's fairly typical of Nvidia's last few generations: forward-looking, but it will take time for those features to actually get leveraged. It'll be interesting to watch whether the delta (or lack thereof) between it and Ada, and even further back, widens as the years go on or stays exactly the same. I'd certainly be interested in some expert/engineer's detailed take on Ada->Blackwell and RDNA3->4 and why the RT improvements are or aren't realised, as the best I/we can do is speculate.
I believe Nvidia when they say they've done XYZ to the RT cores, but where are the benefits? It's like AMD with the chiplet design in RDNA 3. It's highly advanced tech, but did it make the cards faster? Or cheaper? No. It didn't even go into the margins, as we all see they lost some massive cash on Radeon last year. A technology is only as good as the implementation of it. AMD has proved that with FX, then chiplets on RDNA 3, and now Nvidia is proving it with Blackwell, it seems.

I also don't think Blackwell is "forward thinking" in any way. Looking at the diagrams, it's still the same architecture as Turing, only with the INT/FP roles of the shader cores mixed up to be more versatile - which again, shows no gains in real life.

It's also probably best to isolate the RT cores and compare.

2080 Ti to 3080 Ti only saw an increase in RT cores of about 18%, but at least on my end I was seeing a 50% uplift in RT-heavy games.

3080 to 4080 only saw an 11% gain in RT cores, but a much larger gain in RT performance.

While the overall performance of the GPU matters, most of it has been Nvidia being more reserved about adding RT cores, for whatever reason.
What gain in RT performance? Do you mean gain in performance in general? RT performance alone hasn't improved much since Turing (see my explanation above).
 
Joined
Sep 10, 2018
Messages
7,890 (3.31/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
I believe Nvidia when they say they've done XYZ to the RT cores, but where are the benefits? It's like AMD with the chiplet design in RDNA 3. It's highly advanced tech, but did it make the cards faster? Or cheaper? No. It didn't even go into the margins, as we all see they lost some massive cash on Radeon last year. A technology is only as good as the implementation of it. AMD has proved that with FX, then chiplets on RDNA 3, and now Nvidia is proving it with Blackwell, it seems.

I also don't think Blackwell is "forward thinking" in any way. Looking at the diagrams, it's still the same architecture as Turing, only with the INT/FP roles of the shader cores mixed up to be more versatile - which again, shows no gains in real life.

I honestly think it has more to do with them not wanting to stick a bunch of RT cores on the GPU, probably due to cost and wanting to keep margins up, than with how much the cores are or aren't improved.

My theory is that they improve them so they can keep performance similar without having to add a bunch more.

Totally unfounded and not based in reality, but it's what I think lol.
 
Joined
Jan 14, 2019
Messages
15,200 (6.73/day)
Location
Midlands, UK
System Name My second and third PCs are Intel + Nvidia
Processor AMD Ryzen 7 7800X3D
Motherboard MSi Pro B650M-A Wifi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000 CL36
Video Card(s) PowerColor Reaper Radeon RX 9070 XT
Storage 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda
Display(s) Dell S3422DWG 34" 1440 UW 144 Hz
Case Kolink Citadel Mesh
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply 750 W Seasonic Prime GX
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
I honestly think it has more to do with them not wanting to stick a bunch of RT cores on the GPU, probably due to cost and wanting to keep margins up, than with how much the cores are or aren't improved.

My theory is that they improve them so they can keep performance similar without having to add a bunch more.
It definitely comes down to costs vs benefits. Since AI is their cash cow, and their gaming GPUs deserve a "good enough I guess" award, why improve? Just like Intel during the quad core era.
 
Joined
Apr 30, 2020
Messages
1,094 (0.61/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Cappellx
Memory Corsair Vengeance pro RGB 3200mhz 32Gbs
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western digital Sata 6.0 SDD 500gb + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Cosair K55 Pro RGB
It's also probably best to isolate the RT cores and compare.

2080 Ti to 3080 Ti only saw an increase in RT cores of about 18%, but at least on my end I was seeing a 50% uplift in RT-heavy games.

3080 to 4080 only saw an 11% gain in RT cores, but a much larger gain in RT performance.

While the overall performance of the GPU matters, most of it has been Nvidia being more reserved about adding RT cores, for whatever reason.

It's not 18%, it's less than 10%. If you compared with equal rasterization, shader count, and ROPs, Nvidia has only gained at most a 6% increase in RT efficiency every generation. The 3080 Ti has much higher rasterization than a 2080 Ti, and every one of the comparisons you made here also increased rasterization. It's very hard to compare Nvidia's generations to each other when Nvidia keeps changing the rasterization parts while only claiming their RT is better.
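Extending the normalization idea above: a hedged sketch that divides the earlier-quoted 1.50x uplift (the other poster's observation, not a measured benchmark) by both the RT-core-count ratio and the reference boost-clock ratio, to estimate the per-core, per-clock improvement. RT-core counts and boost clocks are taken from the public reference spec sheets; this still ignores confounders like cache and memory bandwidth.

```python
# Reference specs for the two cards being compared.
rt_cores = {"2080 Ti": 68, "3080 Ti": 80}
boost_mhz = {"2080 Ti": 1545, "3080 Ti": 1665}

uplift = 1.50  # claimed gain in RT-heavy games, 2080 Ti -> 3080 Ti

# Scale out the extra cores and the higher clock to isolate what is
# left as per-core, per-clock architectural improvement.
core_ratio = rt_cores["3080 Ti"] / rt_cores["2080 Ti"]    # ~1.18x
clock_ratio = boost_mhz["3080 Ti"] / boost_mhz["2080 Ti"]  # ~1.08x
per_core_per_clock = uplift / (core_ratio * clock_ratio)   # ~1.18x

print(f"Residual per-core, per-clock gain: "
      f"{(per_core_per_clock - 1) * 100:.1f}%")
```

Under those assumptions, roughly 18% of the claimed gain is left unexplained by core count and clocks; whether that residual is RT-core efficiency or something else (raster throughput, caches) is exactly the ambiguity being argued here.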
 