Friday, May 3rd 2024
AMD to Redesign Ray Tracing Hardware on RDNA 4
AMD's next-generation RDNA 4 graphics architecture is expected to feature a completely new ray tracing engine, claims Kepler_L2, a reliable source of GPU leaks. Currently, AMD uses a component called the Ray Accelerator, which performs the most compute-intensive portion of the ray intersection and testing pipeline, while the rest of AMD's hardware ray tracing approach still relies heavily on the shader engines. The company debuted the Ray Accelerator with RDNA 2, its first architecture to meet the DirectX 12 Ultimate spec, and improved the component with RDNA 3 by optimizing certain aspects of its ray testing, bringing about a 50% improvement in ray intersection performance over RDNA 2.
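To make that division of labor concrete, here is a rough CPU-side sketch of the hybrid model, with hypothetical C++ standing in for shader code (none of these names are AMD's; `intersectBox` plays the role of the fixed-function Ray Accelerator, and the loop in `traverse` represents the work that stays on the shader engines):

```cpp
#include <algorithm>
#include <limits>
#include <vector>

// CPU-side sketch only: on real RDNA hardware the box test is a shader
// ISA instruction serviced by the Ray Accelerator, not a C++ function.
struct Ray  { float o[3], d[3]; };
struct AABB { float lo[3], hi[3]; };
struct Node {                      // simplified binary BVH node
    AABB box;
    int  left = -1, right = -1;    // child indices; -1 means none
    int  tri  = -1;                // triangle index when this is a leaf
};

// "Ray Accelerator" stand-in: slab test of one ray against one box.
// This is the compute-heavy part RDNA 2/3 offload to fixed function.
bool intersectBox(const Ray& r, const AABB& b) {
    float t0 = 0.0f, t1 = std::numeric_limits<float>::max();
    for (int a = 0; a < 3; ++a) {
        float inv = 1.0f / r.d[a];
        float tn  = (b.lo[a] - r.o[a]) * inv;
        float tf  = (b.hi[a] - r.o[a]) * inv;
        if (inv < 0.0f) std::swap(tn, tf);
        t0 = std::max(t0, tn);
        t1 = std::min(t1, tf);
        if (t0 > t1) return false;
    }
    return true;
}

// The traversal loop itself still runs as ordinary shader code on
// RDNA 2/3 (stack management, node ordering, loop control), which is
// why overall RT throughput leans so heavily on the shader engines.
int traverse(const Ray& r, const std::vector<Node>& bvh) {
    std::vector<int> stack = {0};              // begin at the root
    while (!stack.empty()) {
        const Node& n = bvh[stack.back()];
        stack.pop_back();
        if (!intersectBox(r, n.box)) continue; // fixed-function on RDNA
        if (n.tri >= 0) return n.tri;          // report first hit (sketch)
        if (n.left  >= 0) stack.push_back(n.left);
        if (n.right >= 0) stack.push_back(n.right);
    }
    return -1;                                 // ray missed everything
}
```

The point of the sketch is the split: the box test is cheap to hard-wire and dominates the instruction count, while the bookkeeping in `traverse` is what RDNA 2 and RDNA 3 still leave to shader code.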
The way Kepler_L2 puts it, RDNA 4 will feature ray tracing hardware fundamentally different from that of RDNA 2 and RDNA 3. It could delegate more of the ray tracing workload to fixed-function hardware, unburdening the shader engines further. AMD is expected to debut RDNA 4 with its next line of discrete Radeon RX GPUs in the second half of 2024. Given the chatter about a power-packed AMD event at Computex, where the company is expected to unveil its "Zen 5" CPU microarchitecture for both server and client processors, we might expect some talk of RDNA 4, too.
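If the rumor pans out, the change would look, from the shader's point of view, like that traversal loop collapsing into a single opaque query. Continuing the sketch above (it reuses the `Ray`, `Node`, and `traverse` definitions; `traceRayFixedFunction` is an invented name, not RDNA 4's actual interface):

```cpp
// What "more fixed-function" could mean for the shader: the whole
// traversal loop becomes opaque hardware, and shader code only submits
// rays and consumes hits. Implemented in software here purely to keep
// the sketch runnable; the rumor is that this body becomes silicon.
struct Hit { int tri; };   // -1 on miss, else the triangle that was hit

Hit traceRayFixedFunction(const Ray& r, const std::vector<Node>& bvh) {
    return Hit{ traverse(r, bvh) };  // stack and scheduling now hidden
}
```

That is broadly the model NVIDIA's RT cores and Intel's RTUs already follow, which is why the rumor is widely read as AMD closing that gap.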
Sources:
HotHardware, Kepler_L2 (Twitter)
227 Comments on AMD to Redesign Ray Tracing Hardware on RDNA 4
PhysX was also mostly still pre-cooked calculations with a few finishing touches on the fly, as Nvidia's own CUDA programming documentation showed.
Nvidia has done amazing things but they aren't the technology pioneer they are seen as.
My optimistic wish for the future: TAA must die, and consequently DLSS and its analogues. Games will cease to be plagued by glitches, blur, and bugs, and companies will stop dismissing QA and launching flawed games. Instead, they'll prioritize innovation and build games that are truly enjoyable.
Yes, this future exists for me and you; see the doors of heaven opening. :p
If AMD manages to fix only the video playback power consumption, and add nothing else, I'll gladly swap my 7800 XT for the new model. I might even sell it early as I could use the cash for some upcoming vacations. Then, buying the new model will come with a lot less buyer's remorse, as I'll essentially be upgrading from a 1660 Ti or 6500 XT. :D
RT is the most overrated thing in Cyberpunk; there's only a slight difference, and it's not big enough to be worth the FPS tank. Not at all.
This is exactly what I'm saying. Realism is abstract in gaming. If it's not real, it's not real, and gaming never is. Most games are what they are exactly because they're not real. So why would you need RT to improve realism over the dynamic lighting we already had? Does the desire for better graphics really hinge on having RT or not, or is that just what Nvidia wants to tell us?
Because frankly, I haven't seen a single game where RT made or broke the visuals or the realism. It's either already there in the rasterized picture, or it's not there at all. RT won't make the surreal real. It just applies minor changes to the image's lighting and reflective qualities.
What I DID see a lot of in my gaming decades is that the real eye-openers were the games with painstakingly hand-crafted scenery. The most immersive games are the ones that just breathe that they are works of art, where real effort went into them and things just click. RT doesn't do that. Rather, for being so 'ubiquitous' in gaming, if it's 'so easy to apply', it will make a lot of games look the same. A bit like how Auto-Tune destroyed pop music: everything sounds the same, vocals are computerized; heard one, you've heard them all.
That's the fallacy of this chase for realistic graphics and 'the holy grail of graphics'. It reduces an art form to something based on universal metrics, but at that point it stops being art and starts being a spreadsheet, a system with recognizable patterns, much like that procedural 'random' world you walk through that's still boring AF.
Hollywood really dropped the ball these last couple decades. Nice to see masters at work.
I am not complaining about ray tracing existing, but the talk about it being the only way AMD can be competitive got old a long time ago. It's a niche tech, and most of the games that use it do so in a small way. Not a ton of games are like Cyberpunk, which truly shows what it can do, and I agree that when it's used to its fullest, it looks great! But there is a reason most games that use it do so half-heartedly (at least in my opinion). Again, the only card I think truly shows off what ray tracing can do is the 4090, which is very expensive.
As for AMD redesigning it: good. I am all for them making it more competitive; if the tech is staying, they might as well invest in it. It will be interesting to see how it performs on next-generation hardware this year!
UE5.4 has Lumen as the default, and the PS5 Pro is coming soon with slightly-less-useless RT hardware from an RDNA 3.5 APU, so expect to see stronger and more detailed implementations.
Game devs are mandated to have every release run on the Series S, so blame Microsoft for stupid segmentation and AMD for behind-the-times hardware.
Another example: is the OLED screen pointless because the majority of users still have LCD screens?
I agree that in Cyberpunk it's kinda meh. Those effects can be done without it, or at least without such a performance penalty. But I've tried it on my 3080 and I was impressed. When running around in an open environment you won't notice it; in more closed environments, or something like a forest or a jungle, you would.
New bleeding-edge technologies aren't going to be ubiquitous overnight, but they will slowly replace the current tech.
For example: LCD replaced CRT, the optical mouse replaced the trackball mouse, and DDR2 gave way to DDR3, DDR4, and DDR5 RAM, etc.
It's kinda stupid to think new tech is pointless because the majority aren't using it yet LOL
No graphical tool is worth a penny if the game is jack shite, and if the game is good, then graphics only add to the immersion; they don't make or break it.
Or someone please tell me that Half-Life is unplayable garbage because it doesn't have RT (or even DirectX 9, for that matter). I'm not sure if you meant this in a sarcastic way, but actually, I never asked for more realistic lighting. I've been asking for characters that don't look like plastic dolls in the rain, but I don't seem to be getting them.
I remember being amazed by coloured lights in SW: Dark Forces 2 (Jedi Knight) and Half-Life. Then there was Doom 3 and Half-Life 2. Then Crysis. Now we have Cyberpunk and Alan Wake 2. Every single game improved on lighting for decades without anyone asking for anything, so no, I don't want to pretend that the whole idea came out of leather jacket man's arse on a Sunday afternoon while sipping tea, just to make us all feel jolly good about... more lights? That's a bad example, by the way. I specifically avoid OLED because of burn-in.
Or maybe not a bad example at all... the same way OLED doesn't replace LCD, ray tracing doesn't replace rasterization, either.
Oh, and another thing: all those techs you mentioned as replacements eventually reached a point where the price was the same as the old tech. Graphics cards? Ummm, no.
And yet here we are. The market proves how important RT really is. Blame has no place; the economy doesn't care about blame. The indisputable fact is that (third-gen!) RT has proven to be even less of a buzzword outside of DIY PC than AI is in its current infancy. People don't give a shit about it, despite lots of marketing and pushed games/content. The fact that we're still talking about Cyberpunk's extra-special RT features right now speaks volumes. It reminds me a lot of how people talk about the tiny handful of VR games, because there simply isn't anything else. Yeah... have fun with your little box of toys... it ain't going places, though.
The developments are there, but we can be much more dismissive about what's actually going to work for us and what's not. Others say that all technology starts small and then takes over, but that's not true: the majority of developments do, in fact, fail. Many things are thrown at walls; only the best things stick. I put a lot more stock in engine developments like Nanite and the slow crawl of AMD's RT hardware improvements than in proprietary pushes and per-game TLC to 'show off' what the tech can do. Sure, you can use a few of those poster children to push the tech, but it's really that and nothing more. It won't ever become a thing in that way, shape, or form. Cyberpunk's PT and the excessive TLC on that game's graphics were a fun project, but not indicative in the slightest of the gaming market of the (near) future. Not the next gen, not the gen after it either. Ten years? Maybe. Maybe not.
RT is currently mostly selling an idea, with a few tech demos to show it off. There is no proof of its economic feasibility whatsoever yet. Given the abysmal raster performance of the new crop of games, you can also rest assured that Ye Upscale won't save RT. The game performance bar keeps moving up regardless.
Next gen, we will definitely see some $450-500 GPUs capable of handling RT GI, reflections, and AO at 1440p (or 4K with upscaling + frame gen).
Pay for better visuals? Gladly. After all, I'm already paying $70 for a AAA game these days.
The same thing applies to RT. It has a purpose, and it works, if used sparingly. If it's used to brute-force something that raster can do at half the performance hit or less, we're looking at something that's not viable in the long or even mid-term. There is no future where RT replaces everything before it; there's no point in doing so until it costs the same as or less than raster in performance while providing better visuals.
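For what 'used sparingly' can look like in practice, here is a toy decision function (all names are invented; this is not any engine's real API) that rasterizes everything and spends the per-frame ray budget only where raster approximations visibly break down:

```cpp
// Toy illustration of "RT used sparingly": prefer cheap raster-era
// techniques, and fall back to rays only for the few surfaces where
// they clearly pay off (e.g. near-mirror reflections of things that
// screen-space data can't reproduce). All names are hypothetical.
struct Surface { float roughness; bool screenSpaceDataValid; };

enum class ReflectionPath { ScreenSpace, RayTraced, EnvProbe };

// Pick the cheapest technique that still looks right for this surface.
ReflectionPath pickReflection(const Surface& s, int raysLeftThisFrame) {
    if (s.screenSpaceDataValid)           // raster SSR is nearly free
        return ReflectionPath::ScreenSpace;
    if (s.roughness < 0.2f && raysLeftThisFrame > 0)
        return ReflectionPath::RayTraced; // only near-mirror surfaces
    return ReflectionPath::EnvProbe;      // cheap prebaked fallback
}
```

The budget check is the whole argument in miniature: rays go where they buy the most image quality, and everything else keeps the raster path.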
Again, both can be true. I know you don't like the recent push and have other things higher on your personal list. Be that as it may, others do want RT performance to increase.