Monday, July 22nd 2024
Several AMD RDNA 4 Architecture Ray Tracing Hardware Features Leaked
We've known since May that AMD is giving its next-generation RDNA 4 graphics architecture a significant ray tracing performance upgrade, and we've had clues since then that the company is moving more of the ray tracing workflow onto dedicated, fixed-function hardware, further unburdening the shader engines. Kepler_L2, a reliable source for GPU leaks, sheds light on some of the many new hardware features AMD is introducing with RDNA 4 to better accelerate ray tracing, which should reduce the performance cost of having ray tracing enabled on its GPUs. Kepler_L2 believes these hardware features should also make it into the GPU of the upcoming Sony PlayStation 5 Pro.
To begin with, the RDNA 4 ray accelerator introduces the new Double Ray Tracing Intersect Engine, which should mean at least a 100% increase in ray intersection performance over RDNA 3, which in turn offered a 50% increase over RDNA 2. A new RT instance node transform instruction should improve the way the ray accelerators handle geometry. Other listed features, which we have less detail on, include a 64-byte RT node, ray tracing tri-pair optimization, change flags encoded in barycentrics to simplify detection of procedural nodes, an improved BVH footprint (possibly memory footprint), and RT support for oriented bounding box and instance node intersection. AMD is expected to debut Radeon RX series gaming GPUs based on RDNA 4 in early 2025.
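For readers unfamiliar with what an "intersect engine" actually computes: the core operation is testing a ray against a triangle and, on a hit, reporting the hit distance plus the barycentric coordinates of the hit point (the same coordinates the leak says RDNA 4 reuses to encode change flags). The sketch below is the classic Möller–Trumbore algorithm in plain software form; it is illustrative only, and is not a claim about AMD's actual hardware implementation.

```python
# Software sketch of a ray/triangle intersection test, the operation a GPU
# ray tracing intersect engine performs in fixed-function hardware.
# Classic Moller-Trumbore; illustrative only, not AMD's implementation.

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def sub(a, b):
    return (a[0]-b[0], a[1]-b[1], a[2]-b[2])

def ray_triangle(origin, direction, v0, v1, v2, eps=1e-8):
    """Return (t, u, v) on a hit -- distance along the ray plus the
    barycentric coordinates of the hit point -- or None on a miss."""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < eps:               # ray is parallel to the triangle plane
        return None
    inv_det = 1.0 / det
    s = sub(origin, v0)
    u = dot(s, p) * inv_det          # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = dot(direction, q) * inv_det  # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv_det         # hit distance along the ray
    return (t, u, v) if t > eps else None
```

A doubled intersect engine means the hardware can run more of these tests per clock; the barycentrics (u, v) fall out of the math for free, which is presumably why they are a convenient place to encode extra per-hit flags.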
Sources:
Kepler_L2 (Twitter), VideoCardz
247 Comments on Several AMD RDNA 4 Architecture Ray Tracing Hardware Features Leaked
As usual it's from bad to worse with you every time you post something.
Patience... :D
With just a BIOS update, you could get a performance uplift that matches the 9800X3D, so don't make the upgrade to the ARL platform just yet (a new LGA1851 motherboard would also be required).
For me, the level of RT performance on mid-range cards is still not enough to consider it an improvement and pay so much for it.
It doesn't add as much to image quality and the gameplay itself as it takes away in FPS.
I will watch closely how it goes, though.
Control, Cyberpunk with tweaked settings and some RT stuff on Ultra, Ghostwire: Tokyo, and some other smaller RT games.
Same with DLSS in general: I enable it even when I don't need the extra performance, because if nothing else it at least fixes the flickering issues and gets rid of the crappy TAA, which looks worse to me in most games.
I've played several RTX showcase titles on a 3090 and 4070S, and for the most part they look worse. Sure, in a pretty static screenshot they look arguably better, but RT in motion on current-gen hardware is kind of ugly. RT reflections are an improvement over screen-space reflections IMO, but shadows and lighting done with RT instead of shaders are truly awful. There simply aren't enough samples per frame to give any kind of reasonable effect, so shadows crawl like they have lice, and lighting that is supposed to be smooth and static is lumpy and wriggles around. Any time you pan your view, the newly revealed part of the scene can take a good 20-30 frames to look even remotely correct while the denoiser and temporal blender get their shit together to make it even vaguely the right colour and brightness.
If you don't believe me, find your favourite RT title, and take some screenshots of areas that are rich in RT shadows or occlusion as you pan the camera around, and then try to match one of those in-motion screenshots when you're not moving. The difference is stark and the image quality of the actual, in-motion RT you are experiencing in real gameplay is absolute dogshit.
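The 20-30 frame settling time described above is consistent with how temporal accumulation works: real-time RT denoisers typically blend each noisy new frame into a history buffer with an exponential moving average, so a freshly revealed pixel only converges geometrically toward the correct value. The sketch below illustrates the arithmetic; the blend factor of 0.1 is an illustrative assumption, not any particular game's or denoiser's actual value.

```python
# Hedged sketch of temporal accumulation lag in RT denoising.
# Each frame the history buffer is blended toward the new noisy sample:
#     history <- (1 - alpha) * history + alpha * new_sample
# so the weight of stale (wrong) data after n frames is (1 - alpha)^n.
import math

def temporal_blend(history, new_sample, alpha):
    """One exponential-moving-average accumulation step."""
    return (1.0 - alpha) * history + alpha * new_sample

def frames_to_converge(alpha, tolerance):
    """Frames until the stale history's weight (1 - alpha)^n falls below
    `tolerance`, i.e. the pixel is within `tolerance` of the true value."""
    return math.ceil(math.log(tolerance) / math.log(1.0 - alpha))
```

With the assumed alpha of 0.1, `frames_to_converge(0.1, 0.05)` gives 29 frames to get within 5% of the correct value, which lines up with the 20-30 frames of visible lag the commenter reports.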
It's pointless to try any GI or indirect lighting on a mid range card unless you know what you're doing with other settings and you know what to expect.
PC gamers should already know that. That's the reason they have PCs instead of stupid consoles:
the ability to recognize the bottleneck, the heavy parts of a game, and so on.
RT is always better when you know what and when to enable it.
If you expect miracles, just get a PS5 and call it a day.
This reminds me of a news article from a couple of months back (I think I actually discussed this issue with you, @Chrispy_) when Diablo IV added RT support and especially touted awesome new RT shadows with included screenshots… and in every instance RT looked worse. Sure, it was "realistic", but it was obvious that the original crisp, high-contrast shadows were deliberate on the part of the game artists, meant to create a certain mood and make the scene readable at a glance from the pseudo-iso perspective. The RT shadows just looked muddy and undefined instead and, funnily enough, made the whole image look lower-res overall.
I don't have anything against RT, I just feel that most current implementations are analogous to tasteless, over-saturated ENB packs and 8K textures for old games: they miss the point and mostly make things look worse, but gamers with no taste lap them up because they're ostensibly new and high-tech, so they must be better.
It's the reason AMD's share price has still gained a good 400% in the face of "losing battles" against much larger chip behemoths. Not a bad result.