Monday, July 22nd 2024
Several AMD RDNA 4 Architecture Ray Tracing Hardware Features Leaked
We've known since May that AMD is giving its next-generation RDNA 4 graphics architecture a significant ray tracing performance upgrade, and have had some clues since then that the company is moving more of the ray tracing workflow onto dedicated, fixed-function hardware, further unburdening the shader engine. Kepler_L2, a reliable source for GPU leaks, sheds light on some of the many new hardware features AMD is introducing with RDNA 4 to better accelerate ray tracing, which should reduce the performance cost of enabling ray tracing on its GPUs. Kepler_L2 believes these hardware features should also make it to the GPU of the upcoming Sony PlayStation 5 Pro.
To begin with, the RDNA 4 ray accelerator introduces the new Double Ray Tracing Intersect Engine, which should mean at least a 100% ray intersection performance increase over RDNA 3, which in turn offered a 50% increase over RDNA 2. The new RT instance node transform instruction should improve the way the ray accelerators handle geometry. Some of the other features we have trouble describing include a 64-byte RT node; ray tracing tri-pair optimization; change flags encoded in barycentrics to simplify detection of procedural nodes; an improved BVH footprint (possibly memory footprint); and RT support for oriented bounding box and instance node intersection. AMD is expected to debut Radeon RX series gaming GPUs based on RDNA 4 in early 2025.
Sources:
Kepler_L2 (Twitter), VideoCardz
Disclaimer: I'm not an AMD or Nvidia fan, and I'm not bashing RT. My ultimate point is that RT currently offers little benefit for the performance penalty on any hardware. On a 7800 XT with a 1440p ultrawide 144 Hz screen like mine? Over 40 watts for the card total.
AMD is following the market. Everyone does that, and a feature needs to be addressed whether it's great or lackluster. If not, the company that does have it will market it and get more sales. That is exactly what is happening with NV vs AMD. Every industry follows that scheme, and for a reason.
And like I said, I'm not an AMD fan.
So if you buy a 7900 XTX, you are buying for raster performance, even though it can do RT.
A 4080 you buy for RT, because the 7900 XTX is for raster.
A 4090, you just want the best regardless of cost. When did I say this? I said the difference is 90-100 EUR between the two cards. 80 EUR does not get you RT, it gets you better RT, so you buy for RT. You have FSR and also FG with AMD cards. Better or worse, but it is there.
The cheapest XTX I can find ready to ship is at 965 on NBB. Still, that's a 35 euro difference... that's peanuts.
In Norway the difference is 90EUR at least.
Personally, I fail to see any progress here:
Yes, I'd pay 35 more for DLSS, RT and lower power draw. Who the hell wouldn't? How is what you're showing on the graph progress? What am I missing?
I'd pay 0 to 50 euros more for an identical card in raster that has much better RT, yes. Again, even AMD acknowledges this, and they are trying to push their RT performance.
BTW, there are games showing that DLSS is not always better than native. Actually, I've never seen it better across the board. There is also a tradeoff here; it has its flaws. So that's a bit of a stretch on your part.
The 4080 Super doesn't have more units and isn't more expensive than the 3090, yet it's faster. It obviously depends on the price point. If one card is 200 and the other 300, no, I wouldn't pay that for RT if everything else is the same. If one card is 1000 and the other 1090, yes, absolutely I'd pay. For a high-end PC that will cost 2.5k minimum (monitor included), 90 euros is nothing.
It makes little sense to me.
The price difference is still about 100 EUR anyway.
It only proves that RT is clearly a marketing stunt for the media and "benchmarks"...
In the end, I don't think RT R&D is worth the investment, when in fact the most practical and beneficial features in GPUs are performance/watt in raster, upscaling tech, and encoding/decoding.
If AMD succeeds in delivering these features at top level... they can start to gain some market share...
Yes, RT is a nice thing... but totally unnecessary. A ~35% performance hit just to see minor ambient light reflections... no.
They look good when not in motion, which is great for screenshots and marketing material, but the reality is a splotchy, crawling mess.
I've posted screenshots using CP2077, since that's the game that's had the most effort spent on RT to date, with a lot of help from Nvidia.
www.techpowerup.com/forums/threads/nvidia-builds-exotic-rtx-4070-from-larger-ad103-by-disabling-nearly-half-its-shaders.321976/post-5245009
This is their flagship RT example, and it's dogshit in motion. That's not subjective; it's objectively shown in the captures to be a splotchy, ugly, incorrect mess that looks nothing like advertised and nothing like the artists' intent. On top of the awful lighting in motion, you're also losing motion clarity, because DLSS kinda sucks in motion too; it only really sharpens up when you stop moving the camera. I think on a midrange GPU you just go with reflections. That's the most significant visual upgrade in many games for a relatively low RT performance cost.
GI or RT shadow occlusion is too expensive and just doesn't look much better, if you can even see the improvement at all.
Nitro+, Merc, TUF, Gaming OC and similar top cooling solutions were €1100 and above.
Funny thing: the RTX 4080 that I have (Zotac Trinity) with a 5-year warranty was €989. Also, some KFA and PNY entry models were priced the same.
For the whole of 2023 until that moment, 7900 XTX models were at best the same price as the 4080. More often, they were even more expensive. So I opted for Nvidia because I didn't want to spend over 1000 on a GPU. Over a year now, and I've been happy with this Zotac card. Temp/noise is OK.
20% cannot be the vast majority!