Friday, May 3rd 2024

AMD to Redesign Ray Tracing Hardware on RDNA 4

AMD's next-generation RDNA 4 graphics architecture is expected to feature a completely new ray tracing engine, claims Kepler_L2, a reliable source of GPU leaks. Currently, AMD uses a component called the Ray Accelerator, which performs the most compute-intensive portion of the ray intersection and testing pipeline, while AMD's hardware approach to ray tracing still relies heavily on the shader engines. The company debuted the Ray Accelerator with RDNA 2, its first architecture to meet the DirectX 12 Ultimate spec, and improved the component with RDNA 3 by optimizing certain aspects of its ray testing, bringing about a 50% improvement in ray intersection performance over RDNA 2.
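As a rough illustration of the hybrid approach described above: the shader engines drive traversal of the scene's acceleration structure, while a fixed-function unit such as the Ray Accelerator handles the compute-heavy intersection tests. Below is a minimal Python sketch of that division of labour; the function names and the simplified slab test are illustrative assumptions, not AMD's actual design.

```python
# Simplified model of a hybrid ray tracing pipeline: "shader" code walks
# the BVH, delegating each expensive box test to a fixed-function unit
# (the stand-in for the Ray Accelerator). Illustrative only.

def ray_accelerator_intersect_box(origin, inv_dir, box_min, box_max):
    """Fixed-function stand-in: slab test for ray vs. axis-aligned box."""
    t_near, t_far = 0.0, float("inf")
    for axis in range(3):
        t1 = (box_min[axis] - origin[axis]) * inv_dir[axis]
        t2 = (box_max[axis] - origin[axis]) * inv_dir[axis]
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    return t_near <= t_far  # hit if the per-axis intervals overlap

def shader_traverse(node, origin, inv_dir, hits):
    """'Shader' side: recursive BVH walk, pruning via the box test."""
    if not ray_accelerator_intersect_box(origin, inv_dir,
                                         node["min"], node["max"]):
        return
    if "leaf" in node:
        hits.append(node["leaf"])  # a triangle test would happen here
        return
    for child in node["children"]:
        shader_traverse(child, origin, inv_dir, hits)

# Tiny two-leaf BVH; a ray from the origin along direction (1, 1, 1)
# (inv_dir is the per-component reciprocal of the direction).
bvh = {"min": (0, 0, 0), "max": (10, 10, 10), "children": [
    {"min": (2, 2, 2), "max": (3, 3, 3), "leaf": "triangle_A"},
    {"min": (2, -3, 2), "max": (3, -2, 3), "leaf": "triangle_B"},
]}
hits = []
shader_traverse(bvh, (0.0, 0.0, 0.0), (1.0, 1.0, 1.0), hits)
# hits is now ["triangle_A"]
```

The point of the sketch is the call boundary: moving more of the traversal loop itself into fixed-function hardware, rather than just the box/triangle tests, is what a redesigned RT engine would plausibly change.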

The way Kepler_L2 puts it, RDNA 4 will feature ray tracing hardware fundamentally transformed from that of RDNA 2 and RDNA 3. This would likely delegate more of the ray tracing workload to fixed-function hardware, unburdening the shader engines further. AMD is expected to debut RDNA 4 with its next line of discrete Radeon RX GPUs in the second half of 2024. Given the chatter about a power-packed AMD event at Computex, where the company is expected to unveil its "Zen 5" CPU microarchitecture for both server and client processors, we might expect some talk on RDNA 4, too.
Sources: HotHardware, Kepler_L2 (Twitter)

227 Comments on AMD to Redesign Ray Tracing Hardware on RDNA 4

#151
Steevo
sepheronxI dunno if it has been said or not.

But do you guys remember PhysX and Hairworks? How AMD struggled or couldn't operate with it? I mean, there were dedicated cards for it; heck, I had one for that space bugs game on that cold planet. I can't remember the name. But yeah, it was used heavily and AMD couldn't work with it. Had to get another card.

Anyway, what I am getting at is that AMD is late to the game, as usual. RT is the new PhysX and Hairworks. Even bigger, actually. And a game changer for lighting. Hell, it is fantastic for horror games.

I am glad they are now actively looking into it. But at this point, for midrange, I don't care who it is (AMD, Intel, Nvidia); as long as I can get a cheaper GPU that can implement RT, I will go for it.
There was a profit-driven idea behind PhysX, which is why, after the company that built it (having claimed a GPU couldn't do what its hardware did) was bought, it was simply merged into a compute language that was proven to work fine on AMD cards.

PhysX was also mostly still pre-cooked calculations with a few finishing touches applied on the fly, as Nvidia's own CUDA programming documentation showed.

Nvidia has done amazing things but they aren't the technology pioneer they are seen as.
#152
wolf
Better Than Native
AusWolfIt makes it better, but only slightly
That's where we disagree, I suppose. It varies game to game; some are not worth the hit, some are, and I won't cherry-pick a single example to hang my hat on. It's especially the case if the upscaling doesn't turn it into a shimmering, garbled mess and can genuinely help. Like I said, I've been enjoying it for years now, and AMD seems to finally be admitting it and is on the bandwagon too; they know that if they want more Radeon customers, they have quite a steep RT/upscaling hill to climb.
#153
Denver
Space Lynxthey could still distribute how they spend their R&D money differently. Really, they should give up on ray tracing and focus solely on matching DLSS. I plan to buy a 5090 or 6090 and sell my 7900 XT when I do, and it's largely because DLSS is the future and there is no stopping it. Unfortunately.
What a hell of a future you expect, huh. Better to wait for the zombie apocalypse. In either scenario, I'll stand ready, finely trained and armed with a katana at my side.

My optimistic wish for the future: TAA must die, and consequently DLSS and its analogues. Games will cease to be plagued by glitches/blur/bugs, and companies will refrain from dismissing QA and launching flawed games. Instead, they'll prioritize innovation and build games that are truly enjoyable.

Yes, this future exists for me and you; see the doors of heaven opening. :p
#154
Redwoodz
dgianstefaniThat's your opinion. Industry moves towards things that result in what consumers want (better graphics and performance), along with techniques that allow for simpler and easier workloads in production. The fact you offer resistance doesn't help your preferred GPU vendor, who also seemed to think that these technologies weren't important, and look at where they are now. Adding proper RT hardware, going the way of AI (finally) for FSR, etc.


I'd agree, except AMD does not exist in a vacuum. NVIDIA is also releasing new cards which will be improved too, and they're starting ahead.
You are so wrong it's not even funny. This has everything to do with AI and nothing to do with gaming.
#155
AusWolf
Dr. DroThe small things obviously add up. The 3090 going from 22 to 49 fps there is a ~120% jump. That's more than substantial.
No. It went from 22 to 46 by enabling DLSS Q. Then it gained a whole 3 extra FPS from ray reconstruction. That's an unnoticeable 6% increase.
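For what it's worth, both percentages are arithmetically consistent; they just use different baselines. A quick check, using the frame rates quoted above:

```python
# Frame rates quoted in the thread for the 3090 example.
native = 22       # fps, no upscaling
dlss_q = 46       # fps, with DLSS Quality
dlss_q_rr = 49    # fps, DLSS Quality + ray reconstruction

total_gain = (dlss_q_rr / native - 1) * 100    # uplift vs. native baseline
rr_only_gain = (dlss_q_rr / dlss_q - 1) * 100  # uplift from RR alone

print(f"total uplift vs native: {total_gain:.0f}%")      # ~123%
print(f"ray reconstruction alone: {rr_only_gain:.0f}%")  # ~7%
```

So "~120%" is the combined DLSS + RR jump over native, while "6%" (6.5%, strictly) is what ray reconstruction adds on top of DLSS Q.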
Dr. DroAs for the AMD card, why not? It's supposed to be focused on power efficiency. If the price is right, sounds like fun to me.
Fair enough. I'm actually of the same opinion. :)

If AMD manages to fix only the video playback power consumption and adds nothing else, I'll gladly swap my 7800 XT for the new model. I might even sell it early, as I could use the cash for some upcoming vacations. Then buying the new model will come with a lot less buyer's remorse, as I'll essentially be upgrading from a 1660 Ti or 6500 XT. :D
#156
Prima.Vera
Vya DomusYeah they can finally check out how the game looks almost indistinguishable from the regular RT, a real treat.
I keep saying this.
RT is the most overrated feature in CP; there is only a slight difference, not big enough to be worth the fps tank. Not at all.
#157
Vayra86
Beginner Macro DeviceWhere? I find it impossible to confuse real life with computer games, unless you're completely whacked. Not even close. Shadows, textures, reflections: none of them behave the same way.

Not my fallacy. I love some games precisely for being far from representing reality. The Fallout series, for example, has it way off, but this is what makes the series even better. The same applies to Quake, Doom, etc. These games are graphically the bee's knees.
Well then, we're in agreement here.

This is exactly what I'm saying. Realism is abstract in gaming. If it's not real, it's not real, and gaming never is. Most games are what they are exactly because they're not real. So why would you need RT to improve realism over the dynamic lighting we already had? Is the desire for better graphics really something that hinges on having RT or not, or is that just what Nvidia wants to tell us?

Because frankly, I haven't seen a single game where RT made or broke the visuals or the realism. It's either already there in the rasterized picture, or it's not there at all. RT won't make the surreal real. It just applies minor changes to the image's lighting and reflective qualities.

What I DID see a lot of in my gaming decades is that the real gaming eye-openers were those with painstakingly hand-crafted scenery. The most immersive games are the ones that just breathe that they are works of art, that real effort went into them, and things just click. RT doesn't do that. Rather, for being so 'ubiquitous' in gaming if it's 'so easy to apply', it will make a lot of games look the same. A bit like how Auto-Tune destroyed pop music: everything sounds the same, vocals are computerized; heard one, you've heard them all.

That's the fallacy of this chase for realistic graphics and 'the holy grail of graphics'. It's reducing an art form to something based on universal metrics, but at that point it stops being art and starts being a spreadsheet, a system with recognizable patterns, much like that procedural 'random' world you walk through that's still boring AF.
#158
Godrilla
Prima.VeraI keep saying this.
RT is the most overrated feature in CP; there is only a slight difference, not big enough to be worth the fps tank. Not at all.
The best feature in terms of return on investment of resources and performance hit, imo, is RT global illumination. It makes the image pop, especially on an HDR display, versus traditional rasterized ambient occlusion. It's a night-and-day difference, subjectively speaking.
#159
RGAFL
For me personally, I think the biggest quality uplift for graphics is getting HDR right in every game. I can see the need for RT in certain games (and it will only be certain games; not every game needs it), but HDR does add a certain something to all games. When I play PS5 games on my LG OLED TV with HDR, it can look stunning.
#160
Vayra86
RGAFLFor me personally, I think the biggest quality uplift for graphics is getting HDR right in every game. I can see the need for RT in certain games (and it will only be certain games; not every game needs it), but HDR does add a certain something to all games. When I play PS5 games on my LG OLED TV with HDR, it can look stunning.
Yeah, I gotta say, watching Dune on the OLED with HDR on turned my perspective around on HDR. It's amazing when done right (and viewed on something with much better dynamic range than the average LCD). Unfortunately, most panels are simply incapable of good HDR, and it only serves as an excuse to give you either a washed-out image or heavy oversaturation and crushed highlights/blacks.
#161
dgianstefani
TPU Proofreader
Vayra86Yeah, I gotta say, watching Dune on the OLED with HDR on turned my perspective around on HDR. It's amazing when done right (and viewed on something with much better dynamic range than the average LCD). Unfortunately, most panels are simply incapable of good HDR, and it only serves as an excuse to give you either a washed-out image or heavy oversaturation and crushed highlights/blacks.
Dune, especially Part II, made me remember what a well-made film can be.

Hollywood really dropped the ball these last couple of decades. Nice to see masters at work.
#162
wolf
Better Than Native
dgianstefaniOf course, for those of us not living in 2018, there is great demand for good upscaling/DLAA and ray tracing performance, but it's easy to just write off that majority when you can simply call them fanboys and thereby excuse AMD not delivering.
Yeah, people definitely didn't ask for more realistic/better lighting effects (aka better graphics) and more performance, which is what both of these technologies actually deliver... But they'll never admit it, because we didn't specifically ask for exactly what was delivered on a technical level; we're all just 100% riding jacket man's dong and they're waaaay smarter than us, duh.
#163
GhostRyder
I really want to see some actual people who legitimately use ray tracing, at least to the point that it's a must-have. I hear all the time how ray tracing is just so amazing, yet all the people I know and speak to either use it on one of the lower settings or turn it off in most cases because it kills the performance. Granted, only one of my friends has a 4090 and can truly use it on the higher settings without his performance dropping below 60 FPS in games; most of my friends have 4070 Ti cards and lower (those that have Nvidia cards).

I am not complaining about ray tracing existing, but it got old a long time ago, all the talk about it being the only way AMD can be competitive. It's a niche tech, and most of the games that use it use it in a small way. Not a ton of games are like Cyberpunk, which truly shows what it can do, and I agree that when it's used to its fullest, it looks great! But there is a reason most games that use it use it half-heartedly (at least in my opinion). Again, the only card I think truly shows off what ray tracing can do is the 4090, which is very expensive.

As for AMD redesigning it: good. I mean, I am all for them making it more competitive; if the tech is staying, they might as well invest in it. It will be interesting to see how it performs on next-generation hardware this year!
#164
dgianstefani
TPU Proofreader
GhostRyderI really want to see some actual people who legitimately use ray tracing, at least to the point that it's a must-have. I hear all the time how ray tracing is just so amazing, yet all the people I know and speak to either use it on one of the lower settings or turn it off in most cases because it kills the performance. Granted, only one of my friends has a 4090 and can truly use it on the higher settings without his performance dropping below 60 FPS in games; most of my friends have 4070 Ti cards and lower (those that have Nvidia cards).

I am not complaining about ray tracing existing, but it got old a long time ago, all the talk about it being the only way AMD can be competitive. It's a niche tech, and most of the games that use it use it in a small way. Not a ton of games are like Cyberpunk, which truly shows what it can do, and I agree that when it's used to its fullest, it looks great! But there is a reason most games that use it use it half-heartedly (at least in my opinion). Again, the only card I think truly shows off what ray tracing can do is the 4090, which is very expensive.

As for AMD redesigning it: good. I mean, I am all for them making it more competitive; if the tech is staying, they might as well invest in it. It will be interesting to see how it performs on next-generation hardware this year!
When every console uses borderline unusably slow RT hardware (an RDNA 2 APU), the majority of game developers will build for the lowest common denominator (the Xbox Series S).

UE5.4 has Lumen as the default, and the PS5 Pro is coming soon with slightly less useless RT hardware from an RDNA 3.5 APU, so expect to see stronger and more detailed implementations.

Game devs are mandated to have every release run on the Series S, so blame Microsoft for stupid segmentation and AMD for behind-the-times hardware.

#165
nguyen
GhostRyderI really want to see some actual people who legitimately use ray tracing, at least to the point that it's a must-have. I hear all the time how ray tracing is just so amazing, yet all the people I know and speak to either use it on one of the lower settings or turn it off in most cases because it kills the performance. Granted, only one of my friends has a 4090 and can truly use it on the higher settings without his performance dropping below 60 FPS in games; most of my friends have 4070 Ti cards and lower (those that have Nvidia cards).

I am not complaining about ray tracing existing, but it got old a long time ago, all the talk about it being the only way AMD can be competitive. It's a niche tech, and most of the games that use it use it in a small way. Not a ton of games are like Cyberpunk, which truly shows what it can do, and I agree that when it's used to its fullest, it looks great! But there is a reason most games that use it use it half-heartedly (at least in my opinion). Again, the only card I think truly shows off what ray tracing can do is the 4090, which is very expensive.

As for AMD redesigning it: good. I mean, I am all for them making it more competitive; if the tech is staying, they might as well invest in it. It will be interesting to see how it performs on next-generation hardware this year!
If technologies were held back by what the majority have or use, you wouldn't have such a nice thing as a PC today.

Another example: are OLED screens pointless because the majority of users still have LCD screens?
#166
sepheronx
If Silent Hill 2 is good and not shit (oh God, I'm hoping it's good; otherwise that's it, it will be a dog day afternoon for me), then RT will play an important role in it. For atmosphere, lighting, shadows, the fog, etc. That is where RT will shine.

I agree that in Cyberpunk it's kinda meh. Those effects can be done without it, or at least without such a penalty in performance. But I've tried it on my 3080 and I was impressed. When running around in an open environment you won't notice it; in more closed environments, or something like a forest or a jungle, you would notice it.
#167
RGAFL
nguyenIf technologies were held back by what the majority have or use, you wouldn't have such a nice thing as a PC today.

Another example: are OLED screens pointless because the majority of users still have LCD screens?
Then that makes RT a bit pointless, as the vast MAJORITY of graphics card users are on a 4060 or equivalent and under.
#168
nguyen
RGAFLThen that makes RT a bit pointless, as the vast MAJORITY of graphics card users are on a 4060 or equivalent and under.
It seems you didn't get it.
New bleeding-edge technologies aren't going to become ubiquitous in a single day, but they will slowly replace the current tech.
For example: LCD replaced CRT, the optical mouse replaced the trackball mouse, DDR2 - DDR3 - DDR4 - DDR5 RAM, etc.

Kinda stupid to think new tech is pointless because the majority aren't using it yet, LOL.
#169
AusWolf
Vayra86Well then, we're in agreement here.

This is exactly what I'm saying. Realism is abstract in gaming. If it's not real, it's not real, and gaming never is. Most games are what they are exactly because they're not real. So why would you need RT to improve realism over the dynamic lighting we already had? Is the desire for better graphics really something that hinges on having RT or not, or is that just what Nvidia wants to tell us?

Because frankly, I haven't seen a single game where RT made or broke the visuals or the realism. It's either already there in the rasterized picture, or it's not there at all. RT won't make the surreal real. It just applies minor changes to the image's lighting and reflective qualities.

What I DID see a lot of in my gaming decades is that the real gaming eye-openers were those with painstakingly hand-crafted scenery. The most immersive games are the ones that just breathe that they are works of art, that real effort went into them, and things just click. RT doesn't do that. Rather, for being so 'ubiquitous' in gaming if it's 'so easy to apply', it will make a lot of games look the same. A bit like how Auto-Tune destroyed pop music: everything sounds the same, vocals are computerized; heard one, you've heard them all.

That's the fallacy of this chase for realistic graphics and 'the holy grail of graphics'. It's reducing an art form to something based on universal metrics, but at that point it stops being art and starts being a spreadsheet, a system with recognizable patterns, much like that procedural 'random' world you walk through that's still boring AF.
My only question to the general audience is: why are we celebrating the paintbrush instead of the painter?

No graphical tool is worth a penny if the game is jack shite, and if the game is good, then graphics only add to the immersion; they don't make it or break it.

Or someone please tell me that Half-Life is unplayable garbage because it doesn't have RT (or even DirectX 9, for that matter).
wolfYeah, people definitely didn't ask for more realistic/better lighting effects (aka better graphics) and more performance, which is what both of these technologies actually deliver... But they'll never admit it, because we didn't specifically ask for exactly what was delivered on a technical level; we're all just 100% riding jacket man's dong and they're waaaay smarter than us, duh.
I'm not sure if you meant this sarcastically, but actually, I never asked for more realistic lighting. I've been asking for characters that don't look like plastic dolls in the rain, but I don't seem to be getting them.

I remember being amazed by coloured lights in SW: Dark Forces 2 (Jedi Knight) and Half-Life. Then there was Doom 3 and Half-Life 2. Then Crysis. Now we have Cyberpunk and Alan Wake 2. Every single game improved on lighting for decades without anyone asking for anything, so no, I don't want to pretend that the whole idea came out of leather jacket man's arse on a Sunday afternoon while sipping tea, just to make us all feel jolly good about... more lights?
nguyenIf technologies were held back by what the majority have or use, you wouldn't have such a nice thing as a PC today.

Another example: are OLED screens pointless because the majority of users still have LCD screens?
That's a bad example. I specifically avoid OLED because of burn-in.

Or maybe not a bad example at all... the same way OLED doesn't replace LCD, ray tracing doesn't replace rasterization, either.
#170
RGAFL
nguyenIt seems you didn't get it.
New bleeding-edge technologies aren't going to become ubiquitous in a single day, but they will slowly replace the current tech.
For example: LCD replaced CRT, the optical mouse replaced the trackball mouse, DDR2 - DDR3 - DDR4 - DDR5 RAM, etc.

Kinda stupid to think new tech is pointless because the majority aren't using it yet, LOL.
Oh, I get it alright. I work with tech in games development that makes my home system look rather pathetic in comparison. I'm talking about the average Joe Schmo with a 3060/4060. The likes of 4090 performance ain't reaching them for another two to three generations at least. That's the critical mass point. The problem is that the cores that do the RT/upscaling heavy lifting are not cheap to make. If you think that graphics cards are staying at the price point they are now, then it seems you don't get it. The talk is that the 5090 will be around £2000. If true, that's another £400 price rise in a generation. Filtering down the scale, that means a 5060 or equivalent getting a price point of £5/600, or getting more cut down to hit its current price point. You sure are going to pay for that sweet RT/upscaling goodness. I more than get it; it seems other people might not be able to afford it.

Oh, and another thing: all those techs you mentioned, the replacements, eventually reached a point where the price was the same as the old tech. Graphics cards? Ummm, no.
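For what it's worth, the generational jump above works out as follows. Note the £2000 figure is the rumour quoted in this thread, and the £1600 baseline is only implied by the "£400 price rise" claim; neither is a confirmed price.

```python
# Rumored/implied UK flagship prices from the post above (assumptions,
# not confirmed figures).
prev_flagship = 1600   # £, implied by "another £400 price rise"
next_flagship = 2000   # £, rumored 5090 price

rise = next_flagship - prev_flagship
rise_pct = rise / prev_flagship * 100
print(f"£{rise} rise, {rise_pct:.0f}% generational increase")
```

That is a 25% generational increase at the top of the stack, which is the scale of price pressure the post expects to filter down to the midrange.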
#171
Vayra86
dgianstefaniWhen every console uses borderline unusably slow RT hardware (an RDNA 2 APU), the majority of game developers will build for the lowest common denominator (the Xbox Series S).

UE5.4 has Lumen as the default, and the PS5 Pro is coming soon with slightly less useless RT hardware from an RDNA 3.5 APU, so expect to see stronger and more detailed implementations.

Game devs are mandated to have every release run on the Series S, so blame Microsoft for stupid segmentation and AMD for behind-the-times hardware.

Blame whoever; this has been the gist of consoles since forever. They're not bleeding-edge tech because they need to be cost-effective. Don't for a second think Nvidia is or was capable of pushing a more competitive deal here; if they were able or willing to, we would've had Nvidia-based consoles, especially if the RT promise were seen industry-wide as a killer feature.

And yet here we are. The market proves how important RT really is. Blame has no place; the economy doesn't care about blame. The indisputable fact is that (third-gen!) RT has proven to be even less of a buzzword outside of DIY PC than AI is in its current infancy. People don't give a shit about it, despite lots of marketing and pushed games/content. The fact that we're still talking about Cyberpunk's extra-special RT features right now speaks volumes. It reminds me a lot of how people talk about the tiny handful of VR games, because there simply isn't anything else. Yeah... have fun with your little box of toys... it ain't going places, though.

The developments are there, but we can be much more dismissive about what's actually going to work for us and what's not. Others say that all technology starts small and then takes over, but that's not true. The majority of developments do, in fact, fail. Many things are thrown at walls; only the best things stick. I put a lot more stock in engine developments like Nanite and the slow crawl of AMD's RT hardware improvements than in proprietary pushes and per-game TLC to 'show off' what it can do. Sure, you can use a few of those poster children to push the tech, but it's really that, and nothing more. It won't ever become a thing in that way, shape or form. Cyberpunk's PT, the excessive TLC on that game's graphics... a fun project, but not indicative in the slightest of the gaming market for the (near) future. Not the next gen, not the gen after it, either. 10 years? Maybe. Maybe not.

RT is currently mostly selling an idea and has a few tech demos to show it off. There is no proof of its economic feasibility whatsoever yet. Given the abysmal raster performance of the new crop of games, you can also rest assured that Ye Upscale won't save RT. The game performance bar keeps moving up regardless.
#172
nguyen
RGAFLOh, I get it alright. I work with tech in games development that makes my home system look rather pathetic in comparison. I'm talking about the average Joe Schmo with a 3060/4060. The likes of 4090 performance ain't reaching them for another two to three generations at least. That's the critical mass point. The problem is that the cores that do the RT/upscaling heavy lifting are not cheap to make. If you think that graphics cards are staying at the price point they are now, then it seems you don't get it. The talk is that the 5090 will be around £2000. If true, that's another £400 price rise in a generation. Filtering down the scale, that means a 5060 or equivalent getting a price point of £5/600, or getting more cut down to hit its current price point. You sure are going to pay for that sweet RT/upscaling goodness. I more than get it; it seems other people might not be able to afford it.
RT is getting cheaper each generation. During the RTX 2000 series, only the 2080 Ti was somewhat capable of RT; now the 600 USD 4070S is about good enough (PT is another beast).
Next gen we will definitely see some 450-500 USD GPUs capable of handling RT GI, reflections and AO at 1440p (or 4K with upscaling + frame gen).

Pay for better visuals? Gladly. After all, I'm already paying 70 USD for a AAA game these days.
#173
Vayra86
AusWolfThat's a bad example. I specifically avoid OLED because of burn-in.

Or maybe not a bad example at all... the same way OLED doesn't replace LCD, ray tracing doesn't replace rasterization, either.
It's a perfectly fine example. Both LCD and OLED are inferior in their own ways; we've arrived at a point where there aren't definitive, decisively better technologies, just advantages to each one for specific environments/use cases. Even if you look at LCD: first we had a battle between TN and IPS, then VA got better. Now you can buy monitors in all three corners, and they all have unique selling points.

The same thing applies to RT. It has a purpose, and it works if used sparingly. If it's used to brute-force something that raster can do at half the performance hit or less, we're looking at something that's not viable in the long or even mid term. There is no future where RT replaces everything before it. There's no point in doing so until it costs the same as or less performance than raster while providing a better visual.
#174
RGAFL
nguyenRT is getting cheaper each generation. During the RTX 2000 series, only the 2080 Ti was somewhat capable of RT; now the 600 USD 4070S is about good enough (PT is another beast).
Next gen we will definitely see some 450-500 USD GPUs capable of handling RT GI, reflections and AO at 1440p (or 4K with upscaling + frame gen).
You think RT is getting cheaper each generation? Tell me again about the price rises from last gen to this gen. The 3080 to 4080 price rise was insane. But okay, graphics cards are getting cheaper. Like I said, if you want the RT/upscaling goodies to get better each gen, then expect to pay more.
#175
wolf
Better Than Native
AusWolfI'm not sure if you meant this sarcastically, but actually, I never asked for more realistic lighting. I've been asking for characters that don't look like plastic dolls in the rain, but I don't seem to be getting them.

I remember being amazed by coloured lights in SW: Dark Forces 2 (Jedi Knight) and Half-Life. Then there was Doom 3 and Half-Life 2. Then Crysis. Now we have Cyberpunk and Alan Wake 2. Every single game improved on lighting for decades without anyone asking for anything, so no, I don't want to pretend that the whole idea came out of leather jacket man's arse on a Sunday afternoon while sipping tea, just to make us all feel jolly good about... more lights?
It shouldn't come as a surprise that your wants don't represent everyone's, and I never said that they did; just that people do in fact want in-game visuals, and specifically lighting, to continue to get better and better, and other members insinuating the opposite are not only incorrect but purposely disingenuous and trite, in an attempt to win an argument they constructed.

Again, both can be true. I know you don't like the recent push and have other things higher on your personal list. Be that as it may, others do want RT performance to increase.