Friday, May 3rd 2024
AMD to Redesign Ray Tracing Hardware on RDNA 4
AMD's next-generation RDNA 4 graphics architecture is expected to feature a completely new ray tracing engine, claims Kepler_L2, a reliable source of GPU leaks. Currently, AMD uses a component called the Ray Accelerator, which performs the most compute-intensive portion of the ray intersection and testing pipeline, while the rest of its hardware ray tracing approach still relies heavily on the shader engines. The company debuted the Ray Accelerator with RDNA 2, its first architecture to meet the DirectX 12 Ultimate spec, and improved the component with RDNA 3 by optimizing certain aspects of its ray testing, bringing about a 50% improvement in ray intersection performance over RDNA 2.
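For context on the split described above, here is a minimal illustrative sketch of the kind of per-ray arithmetic a fixed-function intersection unit like the Ray Accelerator speeds up: a standard Möller–Trumbore ray/triangle test. This is plain C++ written for this article, not AMD's hardware pipeline or any real driver API.

```cpp
// Illustrative only: the ray/triangle intersection test is the kind of
// fixed arithmetic a hardware intersection unit accelerates.
#include <array>
#include <cmath>
#include <optional>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3  cross(Vec3 a, Vec3 b) { return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x}; }
static float dot(Vec3 a, Vec3 b)   { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Moller-Trumbore: returns the hit distance t along the ray, or nothing on a miss.
std::optional<float> intersect(Vec3 orig, Vec3 dir, const std::array<Vec3, 3>& tri) {
    const float kEps = 1e-7f;
    Vec3 e1 = sub(tri[1], tri[0]);
    Vec3 e2 = sub(tri[2], tri[0]);
    Vec3 p  = cross(dir, e2);
    float det = dot(e1, p);
    if (std::fabs(det) < kEps) return std::nullopt;  // ray parallel to triangle
    float inv = 1.0f / det;
    Vec3 s = sub(orig, tri[0]);
    float u = dot(s, p) * inv;
    if (u < 0.0f || u > 1.0f) return std::nullopt;   // outside barycentric range
    Vec3 q = cross(s, e1);
    float v = dot(dir, q) * inv;
    if (v < 0.0f || u + v > 1.0f) return std::nullopt;
    float t = dot(e2, q) * inv;
    return t > kEps ? std::optional<float>(t) : std::nullopt;
}

int main() {
    // A triangle facing the ray at z = 5; expect a hit at t = 5.
    std::array<Vec3, 3> tri{Vec3{0, 0, 5}, Vec3{1, 0, 5}, Vec3{0, 1, 5}};
    auto t = intersect({0.2f, 0.2f, 0.0f}, {0, 0, 1}, tri);
    return t ? 0 : 1;
}
```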
The way Kepler_L2 puts it, RDNA 4 will feature a ray tracing hardware solution fundamentally different from the ones on RDNA 2 and RDNA 3. This would likely delegate more of the ray tracing workflow to fixed-function hardware, unburdening the shader engines further. AMD is expected to debut RDNA 4 with its next line of discrete Radeon RX GPUs in the second half of 2024. Given the chatter about a power-packed AMD event at Computex, where the company is expected to unveil the "Zen 5" CPU microarchitecture for both server and client processors, we might expect some talk on RDNA 4, too.
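To make "delegating more of the workflow to fixed-function hardware" concrete, here is a hedged sketch of the other half of the pipeline: the BVH traversal loop that, on RDNA 2/3, runs as shader code wrapped around the hardware intersection tests. The node layout and names below are invented for illustration and do not reflect AMD's actual data structures.

```cpp
// Hedged sketch: the traversal control flow that currently runs on the
// shader engines; "more fixed function" would mean hardware walking this.
#include <cfloat>
#include <vector>

struct Aabb { float min[3], max[3]; };
struct BvhNode {
    Aabb bounds;
    int  left  = -1;                  // child indices; -1 marks a leaf
    int  right = -1;
    int  firstTri = 0, triCount = 0;  // leaf payload (triangle range)
};

// Slab test: on RDNA 2/3 this box test is what the Ray Accelerator does.
bool hitAabb(const float o[3], const float invDir[3], const Aabb& b, float tMax) {
    float t0 = 0.0f, t1 = tMax;
    for (int a = 0; a < 3; ++a) {
        float near = (b.min[a] - o[a]) * invDir[a];
        float far  = (b.max[a] - o[a]) * invDir[a];
        if (near > far) { float tmp = near; near = far; far = tmp; }
        if (near > t0) t0 = near;
        if (far  < t1) t1 = far;
        if (t0 > t1) return false;
    }
    return true;
}

// Stack-based BVH walk; returns how many leaves the ray had to visit.
int traverse(const std::vector<BvhNode>& nodes, const float o[3], const float invDir[3]) {
    int stack[64], top = 0, visitedLeaves = 0;
    stack[top++] = 0;  // start at the root
    while (top > 0) {
        const BvhNode& n = nodes[stack[--top]];
        if (!hitAabb(o, invDir, n.bounds, FLT_MAX)) continue;
        if (n.left < 0) { ++visitedLeaves; continue; }  // leaf: triangle tests go here
        stack[top++] = n.left;
        stack[top++] = n.right;
    }
    return visitedLeaves;
}

int main() {
    std::vector<BvhNode> nodes(3);
    nodes[0].bounds = {{0, 0, 0}, {2, 2, 2}}; nodes[0].left = 1; nodes[0].right = 2;
    nodes[1].bounds = {{0, 0, 0}, {1, 1, 1}};                    // leaf
    nodes[2].bounds = {{1, 1, 1}, {2, 2, 2}};                    // leaf
    float o[3] = {-1.f, 0.5f, 0.5f}, invDir[3] = {1.f, 1e8f, 1e8f};  // ray along +x
    return traverse(nodes, o, invDir) == 1 ? 0 : 1;              // hits one leaf only
}
```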
Sources:
HotHardware, Kepler_L2 (Twitter)
227 Comments on AMD to Redesign Ray Tracing Hardware on RDNA 4
The AI comment was just a random thought - I'm by no means an expert in this area. But it could be possible that a game would not require models, textures, materials, or even an engine in a traditional sense, just an AI model which takes inputs and follows rules described by the developers. Just as Sora can generate a video from a text prompt, would it not be possible for a different type of game engine to take the input of the player, motion vectors, the scene description, the rule set, the source of truth, etc., and generate a frame? I mean, a very primitive form exists with Nvidia's DLSS frame generation...
I think the main question would be stability (AI hallucinations) and consistency of the final presentation between players and playthroughs.
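To make my random thought slightly more concrete, here's a purely hypothetical sketch: the whole "engine" reduces to one model call per tick, with structured inputs in and a frame out. Every name here (GameInputs, Frame, infer) is invented, the model is a stub, and nothing below is how DLSS or any real engine actually works.

```cpp
// Purely speculative: a "game engine" that is just a generative model.
#include <cstdint>
#include <vector>

struct GameInputs {
    std::vector<float> playerActions;     // controller/keyboard state this tick
    std::vector<float> motionVectors;     // per-pixel motion, as in frame generation
    std::vector<float> sceneDescription;  // embedding of the rules / source of truth
    uint64_t seed;                        // fixed seed -> consistency across playthroughs
};

struct Frame { std::vector<uint8_t> rgba; int width, height; };

// Stand-in for a learned model; a real system would run a network forward pass here.
Frame infer(const GameInputs& in, const Frame& previous) {
    (void)in;              // inputs would condition the generation
    Frame out = previous;  // placeholder: echo the last frame
    return out;
}

int main() {
    Frame frame{std::vector<uint8_t>(1920 * 1080 * 4, 0), 1920, 1080};
    GameInputs inputs{{}, {}, {}, /*seed=*/42};
    for (int tick = 0; tick < 3; ++tick)
        frame = infer(inputs, frame);  // the whole "engine" is one model call per tick
    return 0;
}
```

A fixed seed plus identical inputs is what would, hopefully, give the consistency between playthroughs I mentioned.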
This hasn't changed; I have zero interest in the state of the tech as it is now. It doesn't add much, if anything, but it does cost latency and performance. I don't think PT is going to happen anytime soon, if ever. Far too expensive.
AMD's cards are so utterly incompetent at this that it's baffling. It's not like nobody saw it coming, though. Nvidia has been developing its ray tracing technology for more than 15 years; I find it hard to believe no one at AMD had the same foresight.
This image represents my point very well. If I have to spend $1500+ on a graphics card (which is about 3x my theoretical limit) just to be able to enjoy a technology, then that technology is not ready to be enjoyed.
I'm probably going to buy the AMD card next generation and keep my 4080. That way I'll have both handy, as I expect Ada to be well supported for some time to come. Most games don't really take advantage of its hardware features yet, and I'm having some difficulty picturing what Blackwell can do outside of efficiency improvements and "AI".
Why would you buy an AMD card next generation? It is rumoured to be around 7900 XT level, which isn't faster than your 4080.
Multiple disingenuous arguments, like usual... Sigh. Many consumers have absolutely been wanting [the result of] this tech for years, but typically consumers don't know the nuts and bolts of how it will actually work. For example, for one or two decades before RT, I absolutely wanted more realistic lighting, up to and including photorealism. Did I know the technical implementation I wanted? No, I wasn't that well read back then, but did I want the result? You bet. As if consumers not explicitly asking for RT in games before Nvidia innovated is an argument worth a damn anyway, yet it's presented as if it were a mic-drop moment lol.
Introducing new RT concepts and optimising them in RDNA 4 is cool and all that, but it's overdue. Knowing AMD, one can only hope it won't be a complete disaster. Feels like they do enjoy trailing. Also, some gents were pointing out that AMD has to introduce its own innovations. TRUE THAT. "I switched to AMD because their GPUs have some cool technique that allows lossless grass rendering and much higher FPS whilst roaming over vegetation-heavy game areas" would've been a grass-touching moment for NV shareholders.
Realistic graphics will probably come... but not this decade I'm afraid.
It's like some other aspects of life. You pay way more than you should on basic necessities just because some asshole companies said so, but don't worry, you can enjoy the latest Disney series in 4K. Yay!
But that's really off topic now. :ohwell:
So if you're one who prefers RT, then it's fine for you to want major improvements in RT. But I reckon most people would prefer performance over visuals and mostly play games which don't have RT; not everyone buys the newest RT/PT games, and not every new game ships with them anyway, so most people would evaluate the whole package over just RT performance.
Do I think RT/PT is the future for high-fidelity lighting? Possibly, but we're nowhere near that future; we're still in the bleeding-edge experimental phase. Just because AMD's RT may not match Blackwell's RT performance doesn't mean AMD is "finished" or "not a competitor" in the GPU space. AMD focusing on providing a high-value GPU with "decent enough" RT performance is very welcome in my opinion, as most people are unwilling to upgrade due to high prices or low VRAM.
As for the AMD card, why not? It's supposed to be focused on power efficiency. If the price is right, sounds like fun to me.
It's a fallacy to think you can get everything photorealistic and still have a pleasant gaming experience. You will find yourself in hand-crafted scenery nonetheless, simply because life ain't a movie reel. We've seen this in early RT implementations, like Metro Exodus' GI implementation, which in RT mode can give scenes lighting that really isn't preferable or additive to a good gaming experience. It also hits home in many places. But where it doesn't, you see that nothing will change in the end: devs will still have to craft scenes to their liking. Is it easier? I just think the toolbox has expanded, and generally that doesn't make things easier; the simple fact is, devs now have to optimize for raster-based lighting AND do an RT pass (roughly sketched below).
RT is a great tool to master nonetheless.
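To put that "two lighting paths" burden in concrete terms, here's a rough hypothetical sketch (names invented, not any real engine's code) of a shipped game keeping a baked raster result next to an optional RT pass, with artists still biasing the final look:

```cpp
// Minimal sketch of the dual lighting path: the same surface needs a
// raster/baked result and an RT result, and the game picks or blends.
struct Vec3 { float r, g, b; };

Vec3 lerp(Vec3 a, Vec3 b, float t) {
    return { a.r + (b.r - a.r) * t,
             a.g + (b.g - a.g) * t,
             a.b + (b.b - a.b) * t };
}

Vec3 shade(Vec3 bakedLightmapGI,  // authored and optimized for the raster path
           Vec3 rayTracedGI,      // produced by the optional RT pass
           bool rtEnabled,
           float artistBias)      // hand-tuned so RT doesn't wreck the mood
{
    if (!rtEnabled) return bakedLightmapGI;                  // raster-only fallback
    return lerp(bakedLightmapGI, rayTracedGI, artistBias);   // devs still curate the look
}

int main() {
    Vec3 baked{0.5f, 0.4f, 0.3f}, rt{0.7f, 0.6f, 0.5f};
    Vec3 withRt = shade(baked, rt, true, 0.75f);   // mostly the RT result
    Vec3 noRt   = shade(baked, rt, false, 0.75f);  // pure raster path
    return (withRt.r > noRt.r) ? 0 : 1;
}
```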