Monday, July 22nd 2024

Several AMD RDNA 4 Architecture Ray Tracing Hardware Features Leaked

We've known since May that AMD is giving its next-generation RDNA 4 graphics architecture a significant upgrade in ray tracing performance, and we've had indications since then that the company is moving more of the ray tracing workflow onto dedicated, fixed-function hardware, further unburdening the shader engine. Kepler_L2, a reliable source for GPU leaks, sheds light on some of the new hardware features AMD is introducing with RDNA 4 to better accelerate ray tracing, which should reduce the performance cost of enabling it on its GPUs. Kepler_L2 believes that these hardware features should also make it to the GPU of the upcoming Sony PlayStation 5 Pro.

To begin with, the RDNA 4 ray accelerator introduces a new Double Ray Tracing Intersect Engine, which should mean at least a 100% ray intersection performance increase over RDNA 3, which in turn offered a 50% increase over RDNA 2. A new RT instance node transform instruction should improve the way the ray accelerators handle instanced geometry. Other listed features are harder to interpret: a 64-byte RT node, ray tracing tri-pair optimization, change flags encoded in barycentrics to simplify detection of procedural nodes, an improved BVH footprint (possibly memory footprint), and RT support for oriented bounding box and instance node intersection. AMD is expected to debut Radeon RX series gaming GPUs based on RDNA 4 in early 2025.
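For context on what an intersect engine actually accelerates: traversing a BVH means repeatedly testing a ray against the axis-aligned bounding boxes stored in BVH nodes (and against triangles at the leaves), so doubling intersection throughput means more of these tests per ray per clock. The snippet below is a minimal, purely illustrative slab test in Python; it is not AMD's hardware algorithm, and the function name and structure are our own assumptions for clarity.

```python
# Illustrative only: a basic slab test for ray vs. axis-aligned bounding box.
# Fixed-function RT hardware performs tests like this (plus triangle tests)
# for every BVH node a ray visits; a "double intersect engine" simply means
# more of them can be resolved per clock. Names here are assumptions.

def ray_aabb_intersect(origin, inv_dir, box_min, box_max):
    """Return True if the ray hits the box. inv_dir = 1 / direction per axis."""
    t_near, t_far = 0.0, float("inf")
    for axis in range(3):
        t1 = (box_min[axis] - origin[axis]) * inv_dir[axis]
        t2 = (box_max[axis] - origin[axis]) * inv_dir[axis]
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    return t_near <= t_far


# Example: a ray along +X starting at the origin hits a unit box at x = 2..3.
origin = (0.0, 0.5, 0.5)
direction = (1.0, 0.0, 0.0)
inv_dir = tuple(1.0 / d if d != 0.0 else float("inf") for d in direction)
print(ray_aabb_intersect(origin, inv_dir, (2.0, 0.0, 0.0), (3.0, 1.0, 1.0)))  # True
```

A GPU performs millions of such tests per frame, which is why moving them from the shaders into fixed-function units matters.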
Sources: Kepler_L2 (Twitter), VideoCardz

247 Comments on Several AMD RDNA 4 Architecture Ray Tracing Hardware Features Leaked

#51
JustBenching
AnarchoPrimitivIn 2023 Nvidia had a $7.4 billion R&D budget, almost the entirety of which was spent on GPU development.

In 2023 AMD had a $5.8 billion R&D budget that was primarily spent on x86 development, as that is by far their largest revenue stream.

Now, I want everybody in these comments who just assumes that AMD should be able to match Nvidia to explain to me how that is to be accomplished. Because the way 99% of you speak about it, you act like these two companies are on a level playing field, that they have access to the same sort of resources, and that for AMD it's just a problem of "not pricing videocards cheap enough", while completely ignoring the fact that the current capitalist paradigm is stock price above all and quarterly earnings above all....tell me how AMD is supposed to go out there, undercut Nvidia at each tier by $150+ and still keep stock prices up and investors happy while quarterly profits decrease....PLEASE explain that to me. If I remember correctly, Intel sold Alder Lake with a diminished profit margin; how has that worked out for them? Oh that's right, AMD surpassed them in value.

The other lot of you act like it's merely a willpower problem, that AMD just doesn't "want it bad enough"; well, please explain to me why AMD should be focusing on videocards when they make the overwhelming majority of their money from x86?
If AMD spends a fraction of Nvidia's R&D, shouldn't their cards also cost a fraction of the price?
Posted on Reply
#52
Vya Domus
AnarchoPrimitivI want everybody in these comments that just assumes that AMD should just be able to match Nvidia explain to me
AMD can always just build a huge chip on the smallest node available, like Nvidia does all the time; it's not that difficult. The issue is that people still wouldn't buy it in the same quantity that they would an equivalent Nvidia GPU, so it simply does not make sense from a business standpoint.
fevgatosThose are the averages.
Right because picking the extremes is more realistic :roll:

As usual it's from bad to worse with you every time you post something.
Posted on Reply
#53
JustBenching
Vya DomusAMD can always just build a huge chip like Nvidia does all the time; it's not that difficult. The issue is that people still wouldn't buy it in the same quantity that they would an equivalent Nvidia GPU, so it simply does not make sense from a business standpoint.


Right because picking the extremes is more realistic :roll:

As usual it's from bad to worse with you every time you post something.
If you want a 4080 Super you can get it for $999. If you want an XTX you can also get it for $999. You lied and suggested otherwise, let's get over it.
Posted on Reply
#54
P4-630
Sunny and 75The right path to take, for sure!
Yeah I can wait, I can still play my games with 2070 Super @ 1440p..
Patience... :D
Posted on Reply
#55
Sunny and 75
P4-630Will most likely do an Arrow Lake platform upgrade at the end of this year, though.
Wait for BTL (to be released in Q3 2025).

With just a BIOS update, you could get a performance uplift that matches the 9800X3D, so don't make the upgrade to the ARL platform just yet (a new LGA1851 motherboard is also required).
Posted on Reply
#56
InVasMani
RT boils down to where you place the bar of expectations. I'm far more excited about AI: at least parts of it are more readily usable and can be leveraged today. That said, it's contingent on developers doing so in the right ways. I can foresee a lot of cool stuff coming down the pike with AI and games. It'll be fascinating to see what triple-A developers do with AI in the coming years, because it's absolutely on the horizon; it's much too powerful a tool not to be leveraged and used creatively. You can already run Python scripts for Minecraft, as one example, which should send your mind and imagination into a tailspin if you grasp that.
Posted on Reply
#58
ratirt
Still too early for RT in my opinion, but let the companies compete.
For me, the level of RT performance is still not enough for mid-range cards to consider it an improvement and pay so much for it.
Because it does not add as much to image quality and the gameplay itself as it takes away in FPS.
I will watch closely how it goes, though.
Posted on Reply
#59
wolf
Better Than Native
Vya DomusAs usual it's from bad to worse with you every time you post something.
Oh the irony, I damn near spat out my drink :roll:
InVasManiRT boils down to where you place the bar of expectations
I think this is key. Some are happy to adopt early and play with bleeding edge features, some want it to be thoroughly normalised and be able to buy much lower end hardware and still enjoy a modern feature set. Neither side of that coin should make blanket statements as if it should or does apply to everyone.
Posted on Reply
#60
Sunny and 75
ratirtFor me, the level of RT performance is still not enough for mid-range cards to consider it an improvement and pay so much for it.
Because it does not add as much to image quality and the gameplay itself as it takes away in FPS.
The same.
Posted on Reply
#61
Sithaer
fevgatosYou definitely don't need a 4090 to play RT games. It entirely depends on your resolution. I'm playing RT on my 3060 Ti, for example Hogwarts.

DLSS only drops quality for AMD users; whoever has an Nvidia card prefers it over native. Heck, I activate it even in games that can easily run at native.
Same here with a 3060 Ti, I've played and finished multiple games with RT on and it was perfectly enjoyable for me with DLSS on Quality.
Control, Cyberpunk with tweaked settings and some RT stuff on Ultra, Ghostwire Tokyo, and some other smaller RT games.

Also the same with DLSS in general: I enable it even when I don't need the extra performance, because if nothing else it at least fixes the flickering issues and gets rid of the crappy TAA, which looks worse to me in most games.
Posted on Reply
#62
Vya Domus
wolfOh the irony
Irony is when you post factually correct information; average TPU discourse never goes above double-digit IQ takes. Keeping it classy.
Posted on Reply
#63
Chrispy_
john_AMD lost a generation by not improving RT performance with 7000 series. Who knows, maybe they didn't have enough time and resources to do so back then.
As a counterpoint, how many games on a mid-range RTX card are actually better with RT?

I've played several RTX showcase titles on a 3090 and a 4070S, and for the most part they look worse. Sure, in a pretty static screenshot they arguably look better, but RT in motion on current-gen hardware is kind of ugly. RT reflections are an improvement over screen-space reflections IMO, but shadows and lighting done with RT instead of shaders are truly awful - there simply aren't enough samples per frame to give any kind of reasonable effect, so shadows crawl like they have lice, and lighting that is supposed to be smooth and static is lumpy and wriggles around. Any time you pan your view, the new part of the scene can take a good 20-30 frames to look even remotely correct as the denoiser and temporal blender get their shit together to make it even vaguely the right colour and brightness.

If you don't believe me, find your favourite RT title, and take some screenshots of areas that are rich in RT shadows or occlusion as you pan the camera around, and then try to match one of those in-motion screenshots when you're not moving. The difference is stark and the image quality of the actual, in-motion RT you are experiencing in real gameplay is absolute dogshit.
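A rough way to see why freshly revealed areas need tens of frames: most temporal denoisers blend each new frame into an exponential moving average of history. The toy sketch below is a heavily simplified assumption (the blend factors and the 5% threshold are arbitrary, not any specific engine's values), but it shows how slowly stale history decays after a camera pan.

```python
# Toy illustration (not any real denoiser): temporal accumulation keeps
# (1 - alpha) of the previous history each frame, so the old, now-wrong
# history after a pan still contributes noticeably for dozens of frames.

def frames_until_converged(blend_factor, tolerance=0.05):
    """Frames needed until the stale history's weight drops below `tolerance`."""
    weight, frames = 1.0, 0
    while weight > tolerance:
        weight *= (1.0 - blend_factor)  # each frame keeps (1 - alpha) of history
        frames += 1
    return frames

for alpha in (0.05, 0.1, 0.2):
    print(f"blend factor {alpha}: ~{frames_until_converged(alpha)} frames to settle")
# Roughly: 0.05 -> ~59 frames, 0.1 -> ~29 frames, 0.2 -> ~14 frames.
```

With a typical-looking blend factor around 0.1, that is right in the 20-30 frame range described above.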
Posted on Reply
#65
gffermari
If you have a mid-range GPU, you should just try RT reflections or shadows.
It's pointless to try any GI or indirect lighting on a mid-range card unless you know what you're doing with the other settings and you know what to expect.
PC gamers should already know that. That's the reason they have PCs instead of stupid consoles:
the ability to recognize the bottleneck, the heavy part of a game, and so on.

RT is always better when you know what to enable and when to enable it.
If you expect miracles, just get a PS5 and call it a day.
Posted on Reply
#66
wolf
Better Than Native
Vya Domuskeeping it classy.
It certainly wouldn't be the same without you.
W3RN3Rrather focus on FSR.
The reality is, if they want market share from Nvidia they need to double down on both RT and FSR, and at least the signs are positive that it will be the case for both too, so here's hoping.
Posted on Reply
#67
Vya Domus
Chrispy_but shadows and lighting done with RT instead of shaders is truly awful
It's not that they look awful, but they radically change the look and feel of the game in a negative way, for something that you're meant to play in real time. Lighting in a game is not supposed to be ultra-realistic, in the same way lighting on a movie set isn't either; it's artificial and carefully controlled, because otherwise movies would look terrible. If you then try to manipulate light sources to work around this in a game, you've made the use of RT redundant, as it's no longer realistic.
Posted on Reply
#68
AusWolf
wolfAhh yeah fair enough, no big regrets then like the other thread?
Regrets? No, not at all. :) Although part of me misses the 7800 XT, I would regret not being able to spend to my heart's content when out and about a lot more.
wolfInteresting, I didn't do either of those, it wasn't extortionate and no RT games have been a slide show, I guess it's all relative.

Exactly this, my 3080 has allowed me to enjoy many games with RT effects on, and even taste playable path tracing; as you say a 7900 XTX can do this, but that's yesteryear's performance, so it's not all that impressive.
Because you said numerous times that you play with DLSS. Personally, I don't call anything with DLSS/FSR on "decent performance". If this is my only choice, then I'd rather play at native resolution with RT off. I prefer sharpness to pretty lights.
Posted on Reply
#69
tommo1982
AnarchoPrimitivIn 2023 Nvidia had a $7.4 billion R&D budget, almost the entirety of which was spent on GPU development.

In 2023 AMD had a $5.8 billion R&D budget that was primarily spent on x86 development, as that is by far their largest revenue stream.

Now, I want everybody in these comments who just assumes that AMD should be able to match Nvidia to explain to me how that is to be accomplished. Because the way 99% of you speak about it, you act like these two companies are on a level playing field, that they have access to the same sort of resources, and that for AMD it's just a problem of "not pricing videocards cheap enough", while completely ignoring the fact that the current capitalist paradigm is stock price above all and quarterly earnings above all....

Tell me how AMD is supposed to go out there, undercut Nvidia at each tier by $150+ and still keep stock prices up and investors happy while quarterly profits decrease, AND all the while LITERALLY paying the same or even higher costs than Nvidia for the materials used to make the card (Nvidia probably gets components cheaper due to larger volume)....PLEASE explain that to me. If I remember correctly, Intel sold Alder Lake with a diminished profit margin; how has that worked out for them? Oh that's right, AMD surpassed them in value.

The other lot of you act like it's merely a willpower problem, that AMD just doesn't "want it bad enough"; well, please explain to me why AMD should be focusing on videocards when they make the overwhelming majority of their money from x86? Why should they dump money into videocards when you consumers have proven numerous times in the past that even when they make a product that is OBJECTIVELY a better value, 90% of you STILL buy the Nvidia card (that's right, you're not as rational as you think you are, and research into consumer psychology has proven this time and time again)? If I was a business, that wouldn't sound like a good investment to me...

We literally live in a world where money and profit dictate reality, yet in over a decade of observing these "discussions" I honestly cannot think of a single instance where anyone even addressed the fact that Nvidia just plain has more resources to compete with, which is arguably the MOST determinant factor in this competition.

The other part of it that seemingly everybody ignores is the fact that the overwhelming majority, 99% of all consumers, including ALL OF YOU, make purchasing decisions based on IRRATIONAL factors like how the product makes you "feel", and we KNOW that's true for videocards, because even when AMD offers a compelling videocard that on paper is an OBJECTIVELY better value, the Nvidia competitor still outsells it 10 to 1.

I'm sure so much of this is motivated by FOMO, as well as the fact that some of you probably just don't like the idea of coming onto forums like this and saying you have an AMD GPU, so you buy the Nvidia one because you want to be associated with the "winning side"...and don't laugh, because there are literally decades of consumer psychology research that prove the existence and primacy of these phenomena. How are you going to get irrational consumers to switch to a competitor based on something rational like a product being a "better value"?
Thank you, it's well said. I'm amazed that AMD, being the smaller company, is able to compete with Intel and Nvidia at all. The hard truth is, given the chance, AMD could engage in the same anticompetitive practices as the other two. Like you said, it's all about making the investors happy.
Posted on Reply
#70
AusWolf
fevgatosIf AMD spends a fraction of Nvidia's R&D, shouldn't their cards also cost a fraction of the price?
Because manufacturing and shipping cost nothing? Have you seen what TSMC charges for wafers recently?
Posted on Reply
#71
Sunny and 75
wolfThe reality is, if they want market share from Nvidia they need to double down on both RT and FSR, and at least the signs are positive that it will be the case for both too, so here's hoping.
That's the way.
Posted on Reply
#72
Onasi
@Chrispy_ @Vya Domus
This reminds me of a news article from a couple of months back (I think I actually discussed this issue with you, @Chrispy_) when Diablo IV added RT support and especially touted awesome new RT shadows, with included screenshots… and in every instance RT looked worse. Sure, it was “realistic”, but it was obvious that the original crisp, high-contrast shadows were deliberate on the part of the game artists, to create a certain mood and make the scene readable at a glance from the pseudo-iso perspective. The RT shadows just looked muddy and undefined instead and, funnily enough, made the whole image look lower-res overall.

I don’t have anything against RT, I just feel that most current implementations are analogous to tasteless over-saturated ENB packs and 8K textures for old games: they miss the point and mostly make things look worse, but gamers with no taste lap it up because it’s ostensibly new and high tech, so it must be better.
Posted on Reply
#73
AusWolf
Onasi@Chrispy_ @Vya Domus
This reminds me of a news article from a couple of months back (I think I actually discussed this issue with you, @Chrispy_) when Diablo IV added RT support and especially touted awesome new RT shadows, with included screenshots… and in every instance RT looked worse. Sure, it was “realistic”, but it was obvious that the original crisp, high-contrast shadows were deliberate on the part of the game artists, to create a certain mood and make the scene readable at a glance from the pseudo-iso perspective. The RT shadows just looked muddy and undefined instead and, funnily enough, made the whole image look lower-res overall.

I don’t have anything against RT, I just feel that most current implementations are analogous to tasteless over-saturated ENB packs and 8K textures for old games: they miss the point and mostly make things look worse, but gamers with no taste lap it up because it’s ostensibly new and high tech, so it must be better.
Interesting thought. Sure, RT makes a scene look more realistic, but I wonder how much realism is a key factor when making a game such as Minecraft, for example. Probably not much. And if that's the case, then how does tilting the game's overall appearance away from the artist's original intention towards a kind of forced realism add to the game's value? I'll let everyone answer this for themselves.
Posted on Reply
#74
wolf
Better Than Native
AusWolfPersonally
Like I said, it's all relative. And given the resolution and hardware you've had, I can absolutely see why you make the choices you do; those parts lacked RT performance, and FSR needs a lot of work. Personally, I don't get too bogged down in how the sausage is made, I just enjoy the meal when it tastes great. Even you've said 4K makes the most sense for DLSS, and indeed it does. Rendering at native res isn't really a consideration or goal for me (if going for sharpness I aim for supersampling); I prefer next-generation visuals to yesteryear's graphics rendered slightly sharper. I really hope AMD can deliver that and tempt me across; I really want them to succeed.
Posted on Reply
#75
Vayra86
AssimilatorBy having something called a "business plan".
Which they have got, except it doesn't align with what we like to see in a discrete consumer GPU lineup. Don't forget AMD still maintains, and historically has maintained, volume sales in its GPU division AND uses that to create and maintain a custom chip business, APUs, a console presence, and it recently owned the handheld PC market just like that: that synergy is value that Nvidia doesn't have, and that Intel at this point doesn't manage to match.

It's the reason AMD's share price has still gained a good 400% in the face of 'losing battles' against much larger chip behemoths. Not a bad result.
Posted on Reply