Monday, July 22nd 2024

Several AMD RDNA 4 Architecture Ray Tracing Hardware Features Leaked

We've known since May that AMD is giving its next-generation RDNA 4 graphics architecture a significant ray tracing performance upgrade, and we've had some indication since then that the company is moving more of the ray tracing workflow onto dedicated, fixed-function hardware, further unburdening the shader engine. Kepler_L2, a reliable source of GPU leaks, sheds light on some of the many new hardware features AMD is introducing with RDNA 4 to better accelerate ray tracing, which should reduce the performance cost of having ray tracing enabled on its GPUs. Kepler_L2 believes that these hardware features should also make it into the GPU of the upcoming Sony PlayStation 5 Pro.

To begin with, the RDNA 4 ray accelerator introduces the new Double Ray Tracing Intersect Engine, which should mean at least a 100% ray intersection performance increase over RDNA 3, which in turn offered a 50% increase over RDNA 2. The new RT instance node transform instruction should improve the way the ray accelerators handle geometry. Some of the other features we have trouble describing include a 64-byte RT node; ray tracing tri-pair optimization; change flags encoded in barycentrics to simplify detection of procedural nodes; an improved BVH footprint (possibly memory footprint); and RT support for oriented bounding box and instance node intersection. AMD is expected to debut Radeon RX series gaming GPUs based on RDNA 4 in early 2025.
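For readers wondering what these intersect engines actually compute, below is a minimal software sketch (plain C++, purely illustrative - not AMD's actual node format or hardware logic) of the classic slab test for a ray against an axis-aligned bounding box, the operation a BVH traversal performs at every node; oriented bounding box support essentially means transforming the ray into the box's local frame before running the same test.

#include <algorithm>
#include <array>
#include <utility>

struct Ray {
    std::array<float, 3> origin;
    std::array<float, 3> invDir;  // precomputed 1.0f / direction, per axis
};

struct AABB {
    std::array<float, 3> min;
    std::array<float, 3> max;
};

// Classic "slab test": intersect the ray with the three pairs of axis-aligned
// planes and check whether the resulting entry/exit intervals still overlap.
bool rayIntersectsAABB(const Ray& ray, const AABB& box, float tMax)
{
    float tEnter = 0.0f;
    float tExit  = tMax;
    for (int axis = 0; axis < 3; ++axis) {
        float t0 = (box.min[axis] - ray.origin[axis]) * ray.invDir[axis];
        float t1 = (box.max[axis] - ray.origin[axis]) * ray.invDir[axis];
        if (t0 > t1) std::swap(t0, t1);   // handle negative ray directions
        tEnter = std::max(tEnter, t0);
        tExit  = std::min(tExit, t1);
        if (tEnter > tExit) return false; // intervals no longer overlap: miss
    }
    return true;
}

If the leak is accurate, a "double" intersect engine presumably means each ray accelerator can evaluate two such box or triangle-pair tests per clock instead of one, which is where a 100% intersection throughput increase would come from.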
Sources: Kepler_L2 (Twitter), VideoCardz

247 Comments on Several AMD RDNA 4 Architecture Ray Tracing Hardware Features Leaked

#201
AusWolf
PunkenjoyRT is just a tool, a very nice one, but just a tool.

If you have static light, you could just bake the lighting like they did for so long. What is that? You pre-render the lighting into a texture (using RT!) and apply it as light maps (a rough sketch of that baking step follows further down).

Looks super nice, but it's static. With RT, you can have dynamic lights, a dynamic day/night cycle, etc. RT is not the best solution all the time, but it is a great solution when dynamic lighting is required. It's still a young tech, but it will not go away. Games will continue to be made without it, as they might not need it. There is no point in using RT if you have no dynamic lights (or very few).

But if you want a game with many dynamic lights, or where the gameplay requires it, you will get much better results with RT. There are still 2D games being made, after all.

The whole RT on/RT off thing in the same game is a bit dull. If the game is made with dynamic lighting in mind, RT will always look better. If not, well, it will be barely noticeable.

But let's be realistic: RT is nice, but it still needs at least 2-3 generations to become widespread, so that probably means 2030+.

And it's only one of the things we need for better graphics. We also need way better textures. Games need to be made to utilize the full 24 GB of the latest graphics cards.

Low-to-mid range should have at least 12 GB of VRAM, mid range at least 16-20 GB, and the ultra high end should already be at 32 GB. But there is little to no progress there, sadly. Older games that were released when the first 8 GB cards appeared still look great because they have about the same texture budget as current games.

Anyway, glad that AMD is working on it. But they need to stop being the catch-up player at some point.
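To make the baking step described in the quote concrete, here is a minimal sketch of what a lightmap baker does per texel, assuming simple point lights and a placeholder visibility function; a real baker would trace shadow rays against the scene's BVH and also gather bounced light. The names and structure are illustrative only, not taken from any particular engine.

#include <algorithm>
#include <array>
#include <cmath>
#include <vector>

using Vec3 = std::array<float, 3>;

struct PointLight { Vec3 position; float intensity; };

// Placeholder for a real visibility query: a production baker would trace a
// shadow ray through the scene's BVH here. This sketch assumes no occluders.
bool visible(const Vec3& /*from*/, const Vec3& /*to*/) { return true; }

// Bake direct lighting for one lightmap texel at 'worldPos' with surface
// normal 'normal'. The result is stored in the lightmap texture offline and
// simply sampled at runtime, which is why baked lighting is so cheap - and
// also why it cannot react to lights that move after the bake.
float bakeTexel(const Vec3& worldPos, const Vec3& normal,
                const std::vector<PointLight>& lights)
{
    float irradiance = 0.0f;
    for (const PointLight& light : lights) {
        Vec3 toLight = { light.position[0] - worldPos[0],
                         light.position[1] - worldPos[1],
                         light.position[2] - worldPos[2] };
        float dist2 = toLight[0] * toLight[0] + toLight[1] * toLight[1] + toLight[2] * toLight[2];
        float dist  = std::sqrt(dist2);
        Vec3 dir = { toLight[0] / dist, toLight[1] / dist, toLight[2] / dist };
        float nDotL = std::max(0.0f, normal[0] * dir[0] + normal[1] * dir[1] + normal[2] * dir[2]);
        if (nDotL > 0.0f && visible(worldPos, light.position))
            irradiance += light.intensity * nDotL / dist2; // inverse-square falloff
    }
    return irradiance;
}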
Except that dynamic day-night cycles existed long before RT set foot in games. Other than that, I agree.
fevgatosyou just cannot use prebaked lighting properly EVERYWHERE in, e.g., an open-world game.
Who says that? Open world games have been doing fine with baked lighting since forever.

Disclaimer: I'm not an AMD fan, an Nvidia fan, or any other kind of fan, and I'm not bashing RT. My ultimate point is that RT currently offers little benefit for the performance penalty on any hardware.
fevgatosHow bad is the playback power draw?
On a 7800 XT with a 1440 UW 144 Hz screen such as mine? Over 40 Watts for the card total.
Posted on Reply
#202
JustBenching
AusWolfWho says that? Open world games have been doing fine with baked lighting since forever.
What do you mean, fine? Reflections disappear as you move the camera around, and the light is basically static no matter what angle you look at it from or walk around it, etc. Someone could argue games were doing fine with vector graphics before raster, since forever.
AusWolfOn a 7800 XT with a 1440 UW 144 Hz screen such as mine? Over 40 Watts for the card total.
32w for my 4090. So it's not that bad. But I assume the bigger amd cards will probably draw even more.
AusWolfMy ultimate point is that RT currently offers little benefit for the performance penalty on any hardware.
Maybe it does, but how will you get there if you don't start somewhere? If Nvidia doesn't push like they do, we will never get to the point where RT is manageable by an xx60-tier card.
Posted on Reply
#203
ratirt
fevgatosYeah, of course, I don't think anybody buys a card strictly for RT. But when there are 2 similar cards in raster, roughly at the same price point, people buy the one that also has RT. No reason not to. The fans might not acknowledge this, but thank god AMD does and tries to rectify it by working on their RT performance.
There are plenty of people here on this forum who buy graphics cards strictly because of RT, and you are one of them. If raster is the same on both cards and RT is better on the other, and you buy the other, then you buy it for RT, right? So you can't say people are not buying cards for RT, because it is simply not true. In Norway the 7900 XTX is cheaper than the 4080 Super by 90-100 EUR; that isn't nothing. If you buy the RTX 4080, you buy for RT, and that is exactly what you would do. Not to mention, the 7900 XTX also has RT. Yet you still buy the RTX 4080. Which means you buy for RT, because RT performance on an RTX 4080 is better. You pay more, though. It is a tradeoff: same raster for a lower price, or a higher price for better RT. If you pick the latter, you don't buy it because the raster is the same but the RT is better. You buy it for RT.
AMD is following the market. Everyone does that, and a feature needs to be addressed whether it is a great one or a lackluster one. If not, the company doing it will market it and get more sales. That is exactly what is happening with NV vs AMD. Every industry follows that schematic, and it is for a reason.
Posted on Reply
#204
AusWolf
fevgatosWhat do you mean, fine? Reflections disappear as you move the camera around, and the light is basically static no matter what angle you look at it from or walk around it, etc. Someone could argue games were doing fine with vector graphics before raster, since forever.
These are minor things if you ask me. I usually don't even notice such things until someone points them out.
fevgatos32w for my 4090. So it's not that bad. But I assume the bigger amd cards will probably draw even more.
Oh, that's kind of fine, then. Still, I would prefer my 8800 XT not to spin its fans up from time to time just for watching a YouTube video.
fevgatosIf Nvidia doesn't push like they do...
Are they pushing, though? :wtf: We've been getting the same architecture with minor modifications for 3 generations straight now. The ratio of raster-to-RT hardware, and the performance drop when enabling RT, have been the same since RT was introduced with Turing. Of course, reviews sing praises about the million changes in the RT engine, but I don't see them manifest anywhere in practice. The difference is that when Nvidia switches language to technobabble, fans piss their pants in joy and hand over big lumps of cash for the latest 9090 GTRTX Titanium Platinum FE card, but when AMD does the same, people bash them for not delivering the real-world performance promised. Where is Nvidia's real-world performance? All I see is pricier GPUs generation by generation that achieve more performance only by having more of the same hardware as before. Pure marketing.

And like I said, I'm not an AMD fan.
Posted on Reply
#205
JustBenching
ratirtThere are plenty of people here on this forum who buy graphics cards strictly because of RT, and you are one of them. If raster is the same on both cards and RT is better on the other, and you buy the other, then you buy it for RT, right?
That doesn't mean I'm buying it strictly for RT, though. If 2 cards are pretty much identical in raster and one of them is faster in RT, then yes, I'll buy the one with RT, provided the price is around the same. Which it is right now.
ratirtYou pay more, though. It is a tradeoff: same raster for a lower price, or a higher price for better RT.
If you are in the EU you don't really pay more; the prices are roughly the same. Even in Norway, which you mentioned, the XTX costs 1004€ and the 4080 Super costs 1083€. 80€ gets you RT, DLSS, FG, and lower power draw. It's not even a question of which one you should buy at these prices. The XTX at 900€, sure, then we'd have an argument.
AusWolfAre they pushing, though? :wtf: We've been getting the same architecture with minor modifications for 3 generations straight now. The ratio of raster-to-RT hardware, and the performance drop when enabling RT, have been the same since RT was introduced with Turing. Of course, reviews sing praises about the million changes in the RT engine, but I don't see them manifest anywhere in practice. The difference is that when Nvidia switches language to technobabble, fans piss their pants in joy and hand over big lumps of cash for the latest 9090 GTRTX Titanium Platinum FE card, but when AMD does the same, people bash them for not delivering the real-world performance promised. Where is Nvidia's real-world performance? All I see is pricier GPUs generation by generation that achieve more performance only by having more of the same hardware as before. Pure marketing.

And like I said, I'm not an AMD fan.
The ratio of raster to RT will remain the same because raster also gets a boost? I mean, it makes sense, no? Games that the 2080 Ti or even the 3090 struggled to play, the 4090 just cruises through. There is definitely progress. Not in the low end, though (4060 etc.), but that's something that both companies ignore right now.
Posted on Reply
#206
ratirt
fevgatosThat doesn't mean I'm buying it strictly for RT, though. If 2 cards are pretty much identical in raster and one of them is faster in RT, then yes, I'll buy the one with RT, provided the price is around the same. Which it is right now.
The price is not the same and you know it. Check all the sites (except sale offers) and you will see the difference. You buy for RT performance. You don't buy the RTX 4080 because raster is the same as the 7900 XTX, do you? You buy it because the 4080 has better RT performance. That is the factor for your purchase: RT performance. So you buy for RT.
fevgatosIf you are in the EU you don't really pay more; the prices are roughly the same. Even in Norway, which you mentioned, the XTX costs 1004€ and the 4080 Super costs 1083€. 80€ gets you RT, DLSS, FG, and lower power draw. It's not even a question of which one you should buy at these prices. The XTX at 900€, sure, then we'd have an argument.
I live in the EU and the prices are not the same. The 7900 XTX is cheaper by around 90-100 EUR depending on where you buy it. Sometimes the price difference drops to 60 EUR if there are sales, and sometimes it can grow to 130 EUR, depending on which cards are on sale. The point is, the 7900 XTX is in general the cheaper product you can buy.
So if you buy the 7900 XTX, you are buying for raster performance, even though it can utilize RT.
The 4080 you buy for RT, because the 7900 XTX is for raster.
The 4090, you just want the best regardless of cost.
fevgatosEven in Norway, which you mentioned, the XTX costs 1004€ and the 4080 Super costs 1083€. 80€ gets you RT, DLSS, FG, and lower power draw.
When did I say this? I said the difference is 90-100 EUR between the two cards. 80 EUR does not get you RT, it gets you better RT, so you buy for RT. You have FSR and also FG with AMD cards. Better or worse, but it is there.
Posted on Reply
#207
JustBenching
ratirtThe price is not the same and you know it. Check all the sites (except sale offers) and you will see the difference. You buy for RT performance. You don't buy the RTX 4080 because raster is the same as the 7900 XTX, do you? You buy it because the 4080 has better RT performance. That is the factor for your purchase: RT performance. So you buy for RT.

I live in the EU and the prices are not the same. The 7900 XTX is cheaper by around 90-100 EUR depending on where you buy it. Sometimes the price difference drops to 60 EUR if there are sales, and sometimes it can grow to 130 EUR, depending on which cards are on sale. The point is, the 7900 XTX is in general the cheaper product you can buy.
So if you buy the 7900 XTX, you are buying for raster performance, even though it can utilize RT.
The 4080 you buy for RT, because the 7900 XTX is for raster.
The 4090, you just want the best regardless of cost.
Well, I checked Alternate.de, the place I always buy my tech stuff, and both cards are at 999.

The cheapest XTX I can find ready to be shipped is at 965 on NBB. Still, that's a 35 euro difference... that's peanuts.
Posted on Reply
#208
ratirt
fevgatosWell, I checked Alternate.de, the place I always buy my tech stuff, and both cards are at 999.

The cheapest XTX I can find ready to be shipped is at 965 on NBB. Still, that's a 35 euro difference... that's peanuts.
Yes, it is, if that is the case. So you add 35 EUR to buy the RTX 4080, so the peanuts are for better RT, meaning you buy for RT performance.
In Norway the difference is 90EUR at least.
Posted on Reply
#209
AusWolf
fevgatosThat doesn't mean I'm buying it strictly for RT, though. If 2 cards are pretty much identical in raster and one of them is faster in RT, then yes, I'll buy the one with RT, provided the price is around the same. Which it is right now.

If you are in the EU you don't really pay more; the prices are roughly the same. Even in Norway, which you mentioned, the XTX costs 1004€ and the 4080 Super costs 1083€. 80€ gets you RT, DLSS, FG, and lower power draw. It's not even a question of which one you should buy at these prices. The XTX at 900€, sure, then we'd have an argument.
Which means you'd pay 100 bucks more for better RT, which is fine (even though I wouldn't), so I don't know why you keep contradicting yourself to hide the fact.
fevgatosThe ratio of raster to RT will remain the same because raster also gets a boost? I mean, it makes sense, no? Games that the 2080 Ti or even the 3090 struggled to play, the 4090 just cruises through. There is definitely progress. Not in the low end, though (4060 etc.), but that's something that both companies ignore right now.
So if I give you two bars of chocolate for $2 instead of just one bar for $1, you'll call it progress? That's a very strange way of thinking.

Personally, I fail to see any progress here:
Posted on Reply
#210
JustBenching
ratirtYes, it is, if that is the case. So you add 35 EUR to buy the RTX 4080, so the peanuts are for better RT, meaning you buy for RT performance.
In Norway the difference is 90EUR at least.
As I've said, I checked Norway as well. 60 euro difference.

Yes, I'd pay 35 for DLSS, RT and lower power draw. Who the hell wouldn't?
AusWolfWhich means you'd pay 100 bucks more for better RT, which is fine, so I don't know why you keep contradicting yourself to hide the fact.


So if I give you two bars of chocolate for $2 instead of just one bar for $1, you'll call it progress? That's a very strange way of thinking.

Personally, I fail to see any progress here:
How is what you're showing on the graph progress? What am I missing?
Posted on Reply
#211
ratirt
fevgatosAs I've said, I checked Norway as well. 60 euro difference.

Yes, I'd pay 35 for DLSS, RT and lower power draw. Who the hell wouldn't?
Still, you are buying for RT performance, as was previously said. All those upscalers and FG you mentioned from day one are there for you to be able to play RT, so RT it is.
Posted on Reply
#212
JustBenching
ratirtStill, you are buying for RT performance, as was previously said. All those upscalers and FG are there for you to be able to play RT, so RT it is.
I'm using DLSS whether the game has RT or not. It just looks better than native; it has nothing to do with RT.

I'd pay 0 to 50 euros more for a card identical in raster that has much better RT, yes. Again, even AMD acknowledges this, and they are trying to push their RT performance.
Posted on Reply
#213
AusWolf
fevgatosHow is what you're showing on the graph progress? What am I missing?
Turing, Ampere and Ada go through the exact same RT performance loss when enabling the feature. So Ada doesn't ray trace faster than Ampere, just like Ampere doesn't ray trace faster than Turing. The only reason your 4090 is faster than the 3090 is because it has more hardware units and is also more expensive. Like I said, 2 chocolate bars for 2 bucks instead of one for one. This is not progress.
Posted on Reply
#214
ratirt
fevgatosI'm using DLSS whether the game has RT or not. It just looks better than native; it has nothing to do with RT.

I'd pay 0 to 50 euros more for a card identical in raster that has much better RT, yes. Again, even AMD acknowledges this, and they are trying to push their RT performance.
So you wouldn't pay 90 EUR? Because that is the difference in Norway at the moment.
BTW, there are games showing that DLSS is not always better than native. Actually, I've never seen it better all the way through. There is also a tradeoff here; it has its flaws. So it is a bit of a stretch on your part.
Posted on Reply
#215
JustBenching
AusWolfTuring, Ampere and Ada go through the exact same RT performance loss when enabling the feature. So Ada doesn't ray trace faster than Ampere, just like Ampere doesn't ray trace faster than Turing. The only reason your 4090 is faster than the 3090 is because it has more hardware units and is also more expensive. Like I said, 2 chocolate bars for 2 bucks instead of one for one. This is not progress.
Because they are also faster in raster? If you increase RT performance by 50% and raster performance by 50%, then the performance drop will be the same; it doesn't mean RT didn't progress.

The 4080 Super doesn't have more units and isn't more expensive than the 3090, yet it's faster?
ratirtSo you wouldn't pay 90 EUR? Because that is the difference in Norway at the moment.
BTW, there are games showing that DLSS is not always better than native. Actually, I've never seen it better all the way through. There is also a tradeoff here; it has its flaws. So it is a bit of a stretch on your part.
It obviously depends on the price point. If one card is 200 and the other 300, no, I wouldn't pay that for RT if everything else is the same. If one card is 1000 and the other 1090, yes, I'd absolutely pay. For a high-end PC that will cost like 2.5k minimum (monitor included), 90 euros is nothing.
Posted on Reply
#216
AusWolf
fevgatosBecause they are also faster in raster? If you increase RT performance by 50% and raster performance by 50%, then the performance drop will be the same; it doesn't mean RT didn't progress.
If raster performance improves by 50% due to having 50% more shader cores while costing less than 50% more, and RT performance doesn't drop by 40%, but only 25%, I'll call it an improved RT core design. Until then, let's stay with my chocolate bar analogy. More of the same for a proportionally higher price is not progress. Period.
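To make the arithmetic both sides are invoking explicit, here is a toy calculation; the frame rates are invented purely for illustration, not measured from any card. If a new generation scales raster and RT throughput by the same factor, the relative hit from enabling RT stays constant even though absolute RT frame rates rise.

#include <cstdio>

int main()
{
    // Hypothetical frame rates, purely for illustration.
    const double oldRaster = 100.0, oldRT = 60.0;  // "last gen": RT off vs RT on
    const double scale     = 1.5;                  // both pipelines 50% faster
    const double newRaster = oldRaster * scale;    // 150 fps
    const double newRT     = oldRT * scale;        // 90 fps

    const double oldHit = 1.0 - oldRT / oldRaster; // 40% drop when enabling RT
    const double newHit = 1.0 - newRT / newRaster; // still a 40% drop

    std::printf("old gen: %.0f -> %.0f fps (%.0f%% hit)\n", oldRaster, oldRT, oldHit * 100.0);
    std::printf("new gen: %.0f -> %.0f fps (%.0f%% hit)\n", newRaster, newRT, newHit * 100.0);
    // The relative hit is unchanged even though RT frame rates rose from 60 to 90.
    // A smaller relative hit only appears if RT throughput scales *more* than
    // raster throughput - the criterion used above for "improved RT cores".
    return 0;
}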
Posted on Reply
#217
ratirt
fevgatosIt obviously depends on the price point. If one card is 200 and the other 300, no, I wouldn't pay that for RT if everything else is the same. If one card is 1000 and the other 1090, yes, I'd absolutely pay. For a high-end PC that will cost like 2.5k minimum (monitor included), 90 euros is nothing.
That is almost the same difference, 100 EUR vs 90 EUR. Just because the price is in the thousands, the difference is OK for you? You can be very easily manipulated, my friend. Even though you can buy a card for 800 and also have the RT experience? You don't pay for RT, you pay for better RT, mind you.
It makes almost no sense to me.
Posted on Reply
#218
JustBenching
ratirtThat is almost the same difference, 100 EUR vs 90 EUR. Just because the price is in the thousands, the difference is OK for you? You can be very easily manipulated, my friend. Even though you can buy a card for 800 and also have the RT experience? You don't pay for RT, you pay for better RT, mind you.
It makes almost no sense to me.
Yes, it makes a difference, because in one case the product is 50% more expensive and in the other it is 10% more expensive.
AusWolfIf raster performance improves by 50% due to having 50% more shader cores while costing less than 50% more, and RT performance doesn't drop by 40%, but only 25%, I'll call it an improved RT core design. Until then, let's stay with my chocolate bar analogy. More of the same for a proportionally higher price is not progress. Period.
Again, the 4080 Super (or even the 4070 Ti, according to your graph) is both faster and cheaper than the 3090. What the heck, man?
Posted on Reply
#219
AusWolf
fevgatosAgain, the 4080 Super (or even the 4070 Ti, according to your graph) is both faster and cheaper than the 3090. What the heck, man?
The 4080 Super is a great example: it's the only 40-series card I consider worth its price (the fact that I don't want to spend that much on a GPU is a different story). A real outlier in the Ada generation.
Posted on Reply
#220
ratirt
fevgatosYes, it makes a difference, because in one case the product is 50% more expensive and in the other it is 10% more expensive.
What if you got 50% more performance?
The price is still up by 100 EUR anyway.
Posted on Reply
#221
Kn0xxPT
AusWolfPersonally, I fail to see any progress here:
Interesting chart indeed.
It only proves that RT is clearly a marketing stunt for the media and "benchmarks"...
In the end, I don't think RT R&D is worth the investment, when in fact the most practical and beneficial features in GPUs are performance/watt in raster, upscaling tech, and encoding/decoding.
If AMD succeeds in delivering these features at the top level... they can start to get some market share...
Yes, RT is a nice thing... but totally unnecessary. A ~35% performance hit just to see minor ambient light reflections... no.
Posted on Reply
#222
AusWolf
Kn0xxPTInteresting chart indeed.
It only proves that RT is clearly a marketing stunt for the media and "benchmarks"...
In the end, I don't think RT R&D is worth the investment, when in fact the most practical and beneficial features in GPUs are performance/watt in raster, upscaling tech, and encoding/decoding.
If AMD succeeds in delivering these features at the top level... they can start to get some market share...
Yes, RT is a nice thing... but totally unnecessary. A ~35% performance hit just to see minor ambient light reflections... no.
This wasn't the point I was trying to prove, but it's a point nonetheless. :)
Posted on Reply
#223
Chrispy_
Vya DomusIt's not that they look awful, but they radically change the look and feel of the game in a negative way for something that you're meant to play in real time. Lighting in a game is not supposed to be ultra realistic, in the same way that lighting on a movie set isn't; it's artificial and carefully controlled, because otherwise movies would look terrible. If you then try to manipulate light sources to work around this in a game, you've made the use of RT redundant, as it's no longer realistic.
I mean, they *do* look awful.
They look good when not in motion, which is great for screenshots and marketing material, but the reality is a splotchy, crawling mess.

I've posted screenshots using CP2077, since that's the game that's had the most effort spent on RT to date, with a lot of help from Nvidia.
www.techpowerup.com/forums/threads/nvidia-builds-exotic-rtx-4070-from-larger-ad103-by-disabling-nearly-half-its-shaders.321976/post-5245009

This is their flagship RT example, and it's dogshit in motion - that's not subjective; the captures objectively show a splotchy, ugly, incorrect mess that looks nothing like advertised and nothing like the artists' intent. On top of the awful lighting in motion, you're also losing all the motion clarity, because DLSS kinda sucks in motion too; it only really sharpens up when you stop moving the camera.
gffermariIf you have a mid-range GPU, you just try RT reflections or shadows.
It's pointless to try any GI or indirect lighting on a mid-range card unless you know what you're doing with other settings and you know what to expect.
PC gamers should already know that. That's the reason they have PCs instead of stupid consoles.
The ability to recognize the bottleneck, the heavy part in a game, etc.
I think on a midrange GPU you just go with reflections. That's the most significant visual upgrade in many games for a relatively low RT performance cost.
GI or RT shadow occlusion is too expensive and just doesn't look much better, if you can even see the improvement at all.
Posted on Reply
#224
Darkholm
fevgatosIn the EU it isn't. Just checked; you can find both at 999. Talking about the 4080 Super.
Yep. In June last year, when I was buying a GPU, the 7900 XTX was my first pick, and there was only a single model below €1000 on German Amazon and Alternate, and it was the Sapphire Pulse. €999 :D
Nitro+, Merc, TUF, Gaming OC and similar top cooling solutions were €1100 and above.
Funny thing: the RTX 4080 that I have (Zotac Trinity), with a 5-year warranty, was €989. Also, some KFA and PNY entry models were priced the same.

For the whole of 2023 up until that moment, 7900 XTX models were at best priced the same as the 4080; more often they were even more expensive. So I opted for Nvidia because I didn't want to spend over 1000 on a GPU. Over a year now, and I've been happy with this Zotac card. Temps/noise are OK.
Posted on Reply
#225
redeye
fevgatosMost people on the internet (you know, the vocal ones) are fans of AMD. You can tell on forums, Reddit, etc. that they are the vast majority.
Dude, you bought a 2000 dollar GPU. Nvidia has a market share of 80%; AMD has a market share of 20%… you are the one with selective bias.
20% cannot be the vast majority!
Posted on Reply