
AMD to Redesign Ray Tracing Hardware on RDNA 4

Joined
Feb 24, 2023
Messages
3,027 (4.72/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC
Processor i5-12400F / 10600KF
Motherboard Gigabyte B760M DS3H / Z490 Vision D
Cooling Laminar RM1 / Gammaxx 400
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333
Video Card(s) RX 6700 XT / R9 380 2 GB
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 + 1 TB WD HDD
Display(s) Compit HA2704 / MSi G2712
Case Matrexx 55 / Junkyard special
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / DQ550ST [backup]
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 10 and 11
We already have realistic graphics
Where? I find it impossible to confuse real life with computer games, unless you're completely whacked. Not even close. Shadows, textures, reflections: none of them behave the same way as in real life.
It's a fallacy to think you can get everything photorealistic and still have a pleasant gaming experience
Not my fallacy. I love some games precisely for being far from representing reality. The Fallout series, for example, has it way off, but that's what makes the series even better. The same applies to Quake, Doom, etc. These games are graphically the bee's knees.

RT is a great tool to master nonetheless.
 
Joined
Nov 4, 2005
Messages
11,983 (1.72/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
I dunno if it has been said or not.

But do you guys remember PhysX and Hairworks? How AMD struggled with them, or couldn't run them at all? I mean, there were dedicated cards for it; heck, I had one for that space bugs game on that cold planet. I can't remember the name. But yeah, it was used heavily and AMD couldn't work with it. You had to get another card.

Anyway, what I am getting at is that AMD is late to the game, as usual. RT is the new PhysX and Hairworks. Even bigger, actually. And a game changer for lighting. Hell, it is fantastic for horror games.

I am glad they are now being active in looking into it. But at this point, for midrange, I don't care who it is (AMD, Intel, Nvidia), so long as I can get a cheaper GPU that can implement RT, then I will go for it.

There was a profit-driven idea behind PhysX. The company that built it claimed a GPU couldn't do what its hardware did; after it was bought, PhysX was simply merged into a compute language that was proven to work fine on AMD cards.

PhysX was also mostly pre-cooked calculations with a few finishing touches applied on the fly, as Nvidia's own CUDA programming documentation showed.

Nvidia has done amazing things, but they aren't the technology pioneer they are seen as.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,175 (1.27/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
It makes it better, but only slightly
That's where we disagree, I suppose. It varies game to game: some are not worth the hit, some are, and I won't cherry-pick a single example to hang my hat on. It's especially the case if the upscaling doesn't turn it into a shimmering, garbled mess and can genuinely help. Like I said, I've been enjoying it for years now, and AMD seem to finally be admitting it and jumping on the bandwagon too; they know that if they want more Radeon customers, they have quite a steep RT/upscaling hill to climb.
 
Joined
Oct 6, 2021
Messages
1,605 (1.40/day)
they could still distribute their R&D money differently. Really, they should give up on ray tracing and focus solely on matching DLSS. I plan to buy a 5090 or 6090 and sell my 7900 XT when I do, largely because DLSS is the future and there is no stopping it. Unfortunately.
What a hell of a future you expect, huh? Better to wait for the zombie apocalypse. In either scenario, I'll stand ready, finely trained and armed with a katana by my side.

My optimistic wish for the future: TAA must die, and consequently DLSS and its analogues. Games will cease to be plagued by glitches/blur/bugs, and companies will refrain from dismissing QA and launching flawed games. Instead, they'll prioritize innovation and build games that are truly enjoyable.

Yes, this future exists for me and you; see the doors of heaven opening. :p
 
Joined
Mar 29, 2014
Messages
483 (0.12/day)
That's your opinion. The industry moves towards what consumers want (better graphics and performance), along with techniques that allow for simpler and easier production workloads. The fact that you offer resistance doesn't help your preferred GPU vendor, who also seemed to think these technologies weren't important, and look at where they are now: adding proper RT hardware, going the way of AI (finally) for FSR, etc.


I'd agree, except AMD does not exist in a vacuum. NVIDIA is also releasing new cards which will be improved too, and they're starting ahead.
You are so wrong it's not even funny. This has everything to do with AI and nothing to do with gaming.
 
Joined
Jan 14, 2019
Messages
12,340 (5.76/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
The small things obviously add up. The 3090 going from 22 to 49 fps there is a ~120% jump. That's more than substantial.
No. It went up from 22 to 46 by enabling DLSS Quality. Then it gained a whole 3 FPS more from ray reconstruction. That's an unnoticeable ~6% increase.
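The disagreement here is just arithmetic over which baseline you measure from. A quick sanity check using the figures quoted in this exchange (the numbers are the posters', not mine):

```python
# Numbers from the exchange above: 3090 at 22 fps native,
# 46 fps with DLSS Quality, 49 fps with ray reconstruction on top.
native, dlss_q, dlss_rr = 22, 46, 49

total_gain = (dlss_rr - native) / native * 100    # what the first post counts
upscale_gain = (dlss_q - native) / native * 100   # attributable to DLSS Quality
rr_gain = (dlss_rr - dlss_q) / dlss_q * 100       # attributable to ray reconstruction

print(f"total:              +{total_gain:.1f}%")   # ~123%, the '~120% jump'
print(f"DLSS Quality:       +{upscale_gain:.1f}%")  # ~109% of it is upscaling
print(f"ray reconstruction: +{rr_gain:.1f}%")       # ~6.5%, the 'unnoticeable' part
```

Both readings are internally consistent; the ~120% figure measures against native, while the ~6% figure measures ray reconstruction against the DLSS Quality result it stacks on.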

As for the AMD card, why not? It's supposed to be focused on power efficiency. If the price is right, sounds like fun to me.
Fair enough. I'm actually of the same opinion. :)

If AMD manages to fix only the video playback power consumption, and add nothing else, I'll gladly swap my 7800 XT for the new model. I might even sell it early as I could use the cash for some upcoming vacations. Then, buying the new model will come with a lot less buyer's remorse, as I'll essentially be upgrading from a 1660 Ti or 6500 XT. :D
 
Joined
Sep 15, 2011
Messages
6,725 (1.39/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Yeah they can finally check out how the game looks almost indistinguishable from the regular RT, a real treat.
I keep saying this.
RT is the most overrated thing in Cyberpunk: there is only a slight difference, and not a big enough one to be worth the FPS tank. Not at all.
 
Joined
Sep 17, 2014
Messages
22,449 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Where? I find it impossible to confuse real life with computer games, unless you're completely whacked. Not even close. Shadows, textures, reflections: none of them behave the same way as in real life.

Not my fallacy. I love some games precisely for being far from representing reality. The Fallout series, for example, has it way off, but that's what makes the series even better. The same applies to Quake, Doom, etc. These games are graphically the bee's knees.
Well then, we're in agreement here.

This is exactly what I'm saying. Realism is abstract in gaming. If it's not real, it's not real, and gaming never is. Most games are what they are exactly because they're not real. So why would you need RT to improve realism over the dynamic lighting we already had? Is the desire for better graphics really something that hinges on having RT, or is that just what Nvidia wants to tell us?

Because frankly, I haven't seen a single game where RT made or broke the visuals or the realism. It's either already there in the rasterized picture, or it's not there at all. RT won't make the surreal real. It just applies minor changes to the image's lighting and reflective qualities.

What I DID see a lot of in my gaming decades is that the real gaming eye-openers were those with painstakingly hand-crafted scenery. The most immersive games are the ones that just breathe that they are works of art, where real effort went into them and things just click. RT doesn't do that. Rather, for being so 'ubiquitous' in gaming if it's 'so easy to apply', it will make a lot of games look the same. A bit like how Auto-Tune destroyed pop music: everything sounds the same, the vocals are computerized; heard one, you've heard them all.

That's the fallacy of this chase for realistic graphics and 'the holy grail of graphics'. It's reducing an art form to something based on universal metrics, but at that point it stops being art and starts being a spreadsheet, a system with recognizable patterns, much like that procedural 'random' world you walk through that's still boring AF.
 
Last edited:
Joined
Sep 4, 2022
Messages
309 (0.38/day)
I keep saying this.
RT is the most overrated thing in Cyberpunk: there is only a slight difference, and not a big enough one to be worth the FPS tank. Not at all.
The best feature in terms of return on investment of resources vs. performance hit, IMO, is RT global illumination. It makes the image pop, especially on an HDR display, compared to traditional rasterized ambient occlusion. It's a night-and-day difference, subjectively speaking.
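The difference being described can be illustrated with a toy calculation (this is a made-up numeric sketch, not renderer code; all constants are invented for illustration): ambient occlusion only dims a flat ambient term, while even one bounce of global illumination picks up light, and colour, reflected from nearby surfaces.

```python
# Toy comparison: AO just scales a constant grey ambient term by how
# occluded a point is; a one-bounce GI estimate additionally gathers
# light bounced off a nearby coloured wall. All numbers are made up.

ambient = 0.2          # flat ambient light level
occlusion = 0.6        # 0 = fully occluded, 1 = fully open

# AO: the same grey ambient everywhere, merely dimmed by occlusion
ao_shading = ambient * occlusion

# One-bounce GI: a red wall (per-channel albedo) reflecting direct
# light onto our point; form_factor is the fraction of the hemisphere
# the wall covers as seen from the point
wall_albedo = (0.8, 0.1, 0.1)
form_factor = 0.3
gi_shading = tuple(ambient * occlusion + a * form_factor
                   for a in wall_albedo)

print(f"AO only:     {ao_shading:.2f} in every channel (flat grey)")
print("AO + bounce:", tuple(round(c, 2) for c in gi_shading))
# AO gives 0.12 grey; GI gives (0.36, 0.15, 0.15) -- a red tint
# bleeding onto the surface, which AO can never produce.
```

That colour bleeding from indirect light is the "pop" being described: AO can only darken, while GI changes both brightness and hue based on the surroundings.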
 
Joined
Jan 29, 2024
Messages
87 (0.29/day)
For me personally, the biggest quality uplift for graphics is getting HDR right in every game. I can see the need for RT in certain games (and it will only be certain games; not every game needs it), but HDR does add a certain something to all games. When I play PS5 games on my LG OLED TV with HDR, it can look stunning.
 
Joined
Sep 17, 2014
Messages
22,449 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
For me personally, the biggest quality uplift for graphics is getting HDR right in every game. I can see the need for RT in certain games (and it will only be certain games; not every game needs it), but HDR does add a certain something to all games. When I play PS5 games on my LG OLED TV with HDR, it can look stunning.
Yeah, I gotta say, watching Dune on the OLED with HDR on turned my perspective around on HDR. It's amazing when done right (and viewed on something with much better dynamic range than the average LCD). Unfortunately, most panels are simply incapable of good HDR, and on those it only serves as an excuse to give you either a pale image or heavy oversaturation and crushed highlights/blacks.
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,031 (1.99/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 white
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper Pro V2 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
Yeah, I gotta say, watching Dune on the OLED with HDR on turned my perspective around on HDR. It's amazing when done right (and viewed on something with much better dynamic range than the average LCD). Unfortunately, most panels are simply incapable of good HDR, and on those it only serves as an excuse to give you either a pale image or heavy oversaturation and crushed highlights/blacks.
Dune, especially Part II, made me remember what a well-made film can be.

Hollywood really dropped the ball these last couple decades. Nice to see masters at work.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,175 (1.27/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Of course, for those of us not living in 2018 there is great demand for good upscaling/DLAA and ray tracing performance, but it's easy to just write off that majority when you can simply call them fanboys and therefore excuse AMD not delivering.
Yeah, people definitely didn't ask for more realistic/better lighting effects (aka better graphics) and more performance, which is what both of these technologies actually deliver... But they'll never admit it, because we didn't specifically ask for exactly what was delivered on a technical level; we're all just 100% riding jacket man's dong and they're waaaay smarter than us, duh.
 
Joined
Apr 29, 2014
Messages
4,290 (1.11/day)
Location
Texas
System Name SnowFire / The Reinforcer
Processor i7 10700K 5.1ghz (24/7) / 2x Xeon E52650v2
Motherboard Asus Strix Z490 / Dell Dual Socket (R720)
Cooling RX 360mm + 140mm Custom Loop / Dell Stock
Memory Corsair RGB 16gb DDR4 3000 CL 16 / DDR3 128gb 16 x 8gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector)
Storage Samsung 970 1tb NVME and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5
Display(s) Acer XG270HU, Samsung G7 Odyssey (1440p 240hz)
Case Thermaltake Cube / Dell Poweredge R720 Rack Mount Case
Audio Device(s) Realtec ALC1150 (On board)
Power Supply Rosewill Lightning 1300Watt / Dell Stock 750 / Brick
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 11 Pro / Windows Server 2016
I really want to see some actual people who legitimately use ray tracing, at least to the point that it's a must-have. I hear all the time how amazing ray tracing is, yet everyone I know and speak to either uses it on one of the lower settings or turns it off in most cases, because it kills performance. Granted, only one of my friends has a 4090 and can truly use it on the higher settings without dropping to sub-60 performance in games; most of my friends have 4070 Ti cards and lower (those that have Nvidia cards).

I am not complaining about ray tracing existing, but all the talk about it being the only way AMD can be competitive got old a long time ago. It's a niche tech, and most of the games that use it do so in a small way. Not a ton of games are like Cyberpunk, which truly shows what it can do, and I agree that when it's used to its fullest, it looks great! But there is a reason most games that use it do so half-heartedly (at least in my opinion). Again, the only card I think truly shows off what ray tracing can do is the 4090, which is very expensive.

As for AMD redesigning it, good. I mean, I am all for them making it more competitive; if the tech is staying, they might as well invest in it. It will be interesting to see how it performs on next-generation hardware this year!
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,031 (1.99/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 white
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper Pro V2 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
I really want to see some actual people who legitimately use ray tracing, at least to the point that it's a must-have. I hear all the time how amazing ray tracing is, yet everyone I know and speak to either uses it on one of the lower settings or turns it off in most cases, because it kills performance. Granted, only one of my friends has a 4090 and can truly use it on the higher settings without dropping to sub-60 performance in games; most of my friends have 4070 Ti cards and lower (those that have Nvidia cards).

I am not complaining about ray tracing existing, but all the talk about it being the only way AMD can be competitive got old a long time ago. It's a niche tech, and most of the games that use it do so in a small way. Not a ton of games are like Cyberpunk, which truly shows what it can do, and I agree that when it's used to its fullest, it looks great! But there is a reason most games that use it do so half-heartedly (at least in my opinion). Again, the only card I think truly shows off what ray tracing can do is the 4090, which is very expensive.

As for AMD redesigning it, good. I mean, I am all for them making it more competitive; if the tech is staying, they might as well invest in it. It will be interesting to see how it performs on next-generation hardware this year!
When every console uses borderline unusably slow RT hardware (an RDNA 2 APU), the majority of game developers will build for the lowest common denominator (the Xbox Series S).

UE5.4 has Lumen as the default, and the PS5 Pro is coming soon with slightly less useless RT hardware from an RDNA 3.5 APU, so expect to see stronger and more detailed implementations.

Game devs are mandated to have every release run on the Series S, so blame Microsoft for stupid segmentation and AMD for behind-the-times hardware.

 
Joined
Nov 11, 2016
Messages
3,413 (1.16/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
I really want to see some actual people who legitimately use ray tracing, at least to the point that it's a must-have. I hear all the time how amazing ray tracing is, yet everyone I know and speak to either uses it on one of the lower settings or turns it off in most cases, because it kills performance. Granted, only one of my friends has a 4090 and can truly use it on the higher settings without dropping to sub-60 performance in games; most of my friends have 4070 Ti cards and lower (those that have Nvidia cards).

I am not complaining about ray tracing existing, but all the talk about it being the only way AMD can be competitive got old a long time ago. It's a niche tech, and most of the games that use it do so in a small way. Not a ton of games are like Cyberpunk, which truly shows what it can do, and I agree that when it's used to its fullest, it looks great! But there is a reason most games that use it do so half-heartedly (at least in my opinion). Again, the only card I think truly shows off what ray tracing can do is the 4090, which is very expensive.

As for AMD redesigning it, good. I mean, I am all for them making it more competitive; if the tech is staying, they might as well invest in it. It will be interesting to see how it performs on next-generation hardware this year!

If technologies were held back by what the majority have or use, you wouldn't have nice things like the PC you have today.

Another example: are OLED screens pointless because the majority of users still have LCD screens?
 
Last edited:
Joined
Sep 28, 2005
Messages
3,327 (0.48/day)
Location
Canada
System Name PCGR
Processor 12400f
Motherboard Asus ROG STRIX B660-I
Cooling Stock Intel Cooler
Memory 2x16GB DDR5 5600 Corsair
Video Card(s) Dell RTX 3080
Storage 1x 512GB Mmoment PCIe 3 NVME 1x 2TB Corsair S70
Display(s) LG 32" 1440p
Case Phanteks Evolve itx
Audio Device(s) Onboard
Power Supply 750W Cooler Master sfx
Software Windows 11
If Silent Hill 2 is good and not shit (oh God, I'm hoping it's good; otherwise that's it, it will be a dog day afternoon for me), then RT will play an important role in it. For atmosphere, lighting, shadows, the fog, etc. That is where RT will shine.

I agree that in Cyberpunk it's kinda meh. Those effects can be done without it, or at least without such a performance penalty. But I've tried it on my 3080 and I was impressed. When running around in an open environment you won't notice it; in more closed environments, or somewhere like a forest or a jungle, you would.
 
Joined
Jan 29, 2024
Messages
87 (0.29/day)
If technologies were held back by what the majority have or use, you wouldn't have nice things like the PC you have today.

Another example: are OLED screens pointless because the majority of users still have LCD screens?
That makes RT a bit pointless, then, as the vast MAJORITY of graphics card users are on a 4060 or equivalent and under.
 
Joined
Nov 11, 2016
Messages
3,413 (1.16/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
That makes RT a bit pointless, then, as the vast MAJORITY of graphics card users are on a 4060 or equivalent and under.

It seems you didn't get it.
New bleeding-edge technologies aren't going to be ubiquitous in a single day, but they will slowly replace the current tech.
For example: LCD replaced CRT, the optical mouse replaced the trackball mouse, DDR2 - DDR3 - DDR4 - DDR5 RAM, etc...

Kinda stupid to think new tech is pointless because the majority aren't using it yet, LOL
 
Joined
Jan 14, 2019
Messages
12,340 (5.76/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Well then, we're in agreement here.

This is exactly what I'm saying. Realism is abstract in gaming. If it's not real, it's not real, and gaming never is. Most games are what they are exactly because they're not real. So why would you need RT to improve realism over the dynamic lighting we already had? Is the desire for better graphics really something that hinges on having RT, or is that just what Nvidia wants to tell us?

Because frankly, I haven't seen a single game where RT made or broke the visuals or the realism. It's either already there in the rasterized picture, or it's not there at all. RT won't make the surreal real. It just applies minor changes to the image's lighting and reflective qualities.

What I DID see a lot of in my gaming decades is that the real gaming eye-openers were those with painstakingly hand-crafted scenery. The most immersive games are the ones that just breathe that they are works of art, where real effort went into them and things just click. RT doesn't do that. Rather, for being so 'ubiquitous' in gaming if it's 'so easy to apply', it will make a lot of games look the same. A bit like how Auto-Tune destroyed pop music: everything sounds the same, the vocals are computerized; heard one, you've heard them all.

That's the fallacy of this chase for realistic graphics and 'the holy grail of graphics'. It's reducing an art form to something based on universal metrics, but at that point it stops being art and starts being a spreadsheet, a system with recognizable patterns, much like that procedural 'random' world you walk through that's still boring AF.
My only question to the general audience is: why are we celebrating the paintbrush instead of the painter?

No graphical tool is worth a penny if the game is jack shite, and if the game is good, then graphics only add to the immersion, not make it, nor break it.

Or someone please tell me that Half-Life is unplayable garbage because it doesn't have RT (or even DirectX 9 for that matter).

Yeah, people definitely didn't ask for more realistic/better lighting effects (aka better graphics) and more performance, which is what both of these technologies actually deliver... But they'll never admit it, because we didn't specifically ask for exactly what was delivered on a technical level; we're all just 100% riding jacket man's dong and they're waaaay smarter than us, duh.
I'm not sure if you meant this in a sarcastic way, but actually, I never asked for more realistic lighting. I've been asking for characters that don't look like plastic dolls in the rain, but I don't seem to be getting them.

I remember being amazed by coloured lights in SW: Dark Forces 2 (Jedi Knight) and Half-Life. Then, there was Doom 3 and Half-Life 2. Then Crysis. Now we have Cyberpunk and Alan Wake 2. Every single game improved on lights for decades without anyone asking for anything, so no, I don't want to pretend that the whole idea came out of leather jacket man's arse on a Sunday afternoon while sipping tea just to make us all feel jolly good about... more lights?

If technologies were being held back by what the majority have or use, you wouldn't have such nice thing like a PC today

Another example, is OLED screen pointless because the majority of users still have LCD screen?
That's a bad example. I specifically avoid OLED because of burn-in.

Or maybe not a bad example at all... the same way OLED doesn't replace LCD, ray tracing doesn't replace rasterization, either.
 
Joined
Jan 29, 2024
Messages
87 (0.29/day)
It seems you didn't get it.
New bleeding-edge technologies aren't going to be ubiquitous in a single day, but they will slowly replace the current tech.
For example: LCD replaced CRT, the optical mouse replaced the trackball mouse, DDR2 - DDR3 - DDR4 - DDR5 RAM, etc...

Kinda stupid to think new tech is pointless because the majority aren't using it yet, LOL
Oh, I get it alright. I work with tech in games development that makes my home system look rather pathetic in comparison. I'm talking about the average Joe Schmo with a 3060/4060; the likes of 4090 performance isn't reaching them for another two to three generations at least. That's the critical-mass point. The problem is that the cores that do the RT/upscaling heavy lifting are not cheap to make. If you think graphics cards are staying at their current price points, then it seems you don't get it. The talk is that the 5090 will be around £2000. If true, that's another £400 price rise in a generation. Filtering down the scale, that means a 5060 or equivalent at a £500-600 price point, or getting more cut down to hit its current price point. You sure are going to pay for that sweet RT/upscaling goodness. I more than get it; it seems other people might not be able to afford it.

Oh, and another thing: all those techs you mentioned eventually reached a point where the replacement cost the same as the old tech. Graphics cards? Ummm, no.
 
Last edited:
Joined
Sep 17, 2014
Messages
22,449 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
When every console uses borderline unusably slow RT hardware (an RDNA2 APU), the majority of game developers will build for the lowest common denominator (the Xbox Series S).

UE5.4 has Lumen as the default, and the PS5 Pro is coming soon with slightly less useless RT hardware from an RDNA 3.5 APU, so expect to see stronger and more detailed implementations.

Game devs are mandated to have every release run on the Series S, so blame Microsoft for stupid segmentation and AMD for behind-the-times hardware.


Blame whoever; this has been the gist of consoles since forever. They're not bleeding-edge tech because they need to be cost effective. Don't for a second think Nvidia is/was capable of pushing a more competitive deal here - if they were able or willing to, we would've had Nvidia-based consoles. Especially if the RT promise was seen industry-wide as a killer feature.

And yet here we are. The market proves how important RT really is. Blame has no place; the economy doesn't care about blame. The indisputable fact is that (third-gen!) RT has proven to be even less of a buzzword outside of DIY PC than AI is in its current infancy. People don't give a shit about it, despite lots of marketing and pushed games/content. The fact that we're still talking about Cyberpunk's extra special RT features right now speaks volumes. It reminds me a lot of how people talk about the tiny handful of VR games, because there simply isn't anything else. Yeah... have fun with your little box of toys... it ain't going places though.

The developments are there, but we can be much more dismissive about what's actually going to work for us and what's not. Others say that all technology starts small and then takes over, but that's not true. The majority of developments do, in fact, fail. Many things are thrown at walls; only the best things stick. I put a lot more stock in engine developments like Nanite and the slow crawl of AMD's RT hardware improvements than in proprietary pushes and per-game TLC to 'show off' what it can do. Sure, you can use a few of those poster children to push the tech, but it's really that, and nothing more. It won't ever become a thing in that way, shape or form. Cyberpunk's PT, the excessive TLC on that game's graphics... fun project, but not indicative in the slightest of the gaming market for the (near) future. Not the next gen, not the gen after it either. 10 years? Maybe. Maybe not.

RT is currently mostly selling an idea and has a few tech demos to show it off. There is no proof of its economic feasibility whatsoever yet. Given the abysmal raster performance in the new crop of games, you can also rest assured that Ye Upscale won't save RT. The performance bar for games keeps moving up regardless.
 
Joined
Nov 11, 2016
Messages
3,413 (1.16/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
Oh I get it alright. I work with tech in games development that makes my home system look rather pathetic in comparison. I'm talking about the average Joe Schmo with a 3060/4060. The likes of 4090 performance ain't reaching them for another two to three generations at least. That's the critical mass point. The problem is that the cores that do the RT/upscaling heavy lifting are not cheap to make. If you think that graphics cards are staying at the price point they are now, then it seems you don't get it. The talk is that the 5090 will be around £2000. If true, that's another £400 price rise in a generation. Filtering down the scale, that means a 5060 or equivalent getting a price point of £5/600, or getting more cut down to hit its current price point. You sure are going to pay for that sweet RT/upscaling goodness. I more than get it; it seems other people might not be able to afford it.

RT is getting cheaper each generation: in the RTX 2000 era, only the 2080 Ti was somewhat capable of RT, while now the $600 4070S is about good enough (PT is another beast).
Next gen we will definitely see some $450-500 GPUs capable of handling RT GI, reflections and AO at 1440p (or 4K with upscaling + frame gen).

Pay for better visuals? Gladly. After all, I'm already paying $70 for an AAA game these days.
 
Joined
Sep 17, 2014
Messages
22,449 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
That's a bad example. I specifically avoid OLED because of burn-in.

Or maybe not a bad example at all... the same way OLED doesn't replace LCD, ray tracing doesn't replace rasterization, either.
It's a perfectly fine example. Both LCD and OLED are inferior in their own ways; we've arrived at a point where there are no definitive, decisively better technologies, just advantages to each one for specific environments/use cases. Even if you look at LCD: first we had a battle between TN and IPS. Then VA got better. Now you can buy monitors in all three corners, and they all have unique selling points.

The same thing applies to RT. It has a purpose, and it works if used sparingly. If it's used to brute force something that raster can do at half the performance hit or less, we're looking at something that's not viable in the mid or long term. There is no future where RT replaces everything before it. There's no point in doing so until it costs equal or less performance than raster while providing a better visual.
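The "performance hit" reasoning above can be put into rough numbers. A minimal sketch, with entirely made-up cost multipliers (not benchmarks of any real GPU or game), just to show how stacking RT passes compounds frame time:

```python
# Illustrative only: hypothetical frame-time multipliers for RT passes
# layered on top of a raster baseline. The numbers are assumptions,
# not measurements.

def fps_with_effects(base_fps: float, cost_multipliers: list[float]) -> float:
    """Apply each effect's frame-time multiplier to a raster baseline FPS."""
    frame_time_ms = 1000.0 / base_fps
    for m in cost_multipliers:
        frame_time_ms *= m  # each pass stretches the frame further
    return 1000.0 / frame_time_ms

# Hypothetical costs: RT reflections +40% frame time, RT GI +60% on top.
raster_only = fps_with_effects(120.0, [])
with_reflections = fps_with_effects(120.0, [1.4])
with_full_rt = fps_with_effects(120.0, [1.4, 1.6])

print(f"raster only:      {raster_only:.1f} fps")
print(f"+ RT reflections: {with_reflections:.1f} fps")
print(f"+ RT GI as well:  {with_full_rt:.1f} fps")
```

With those assumed multipliers, 120 fps raster drops to roughly 86 fps with one pass and about 54 fps with two, which is why "sparingly" matters: the costs multiply, they don't add.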
 
Joined
Jan 29, 2024
Messages
87 (0.29/day)
RT is getting cheaper each generation: in the RTX 2000 era, only the 2080 Ti was somewhat capable of RT, while now the $600 4070S is about good enough (PT is another beast).
Next gen we will definitely see some $450-500 GPUs capable of handling RT GI, reflections and AO at 1440p (or 4K with upscaling + frame gen).
You think RT is getting cheaper each generation. Tell me again about the price rises from last gen to this gen. The 3080 to 4080 price rise was insane. But okay, graphics cards are getting cheaper. Like I said, if you want the RT/upscaling goodies to get better each gen, then expect to pay more.
 