Friday, May 3rd 2024

AMD to Redesign Ray Tracing Hardware on RDNA 4

AMD's next-generation RDNA 4 graphics architecture is expected to feature a completely new ray tracing engine, claims Kepler_L2, a reliable source of GPU leaks. Currently, AMD uses a component called the Ray Accelerator, which performs the most compute-intensive portion of the ray intersection and testing pipeline, while the rest of AMD's hardware ray tracing approach still relies heavily on the shader engines. The company debuted the Ray Accelerator with RDNA 2, its first architecture to meet the DirectX 12 Ultimate spec, and improved the component with RDNA 3 by optimizing certain aspects of its ray testing, bringing about a 50% improvement in ray intersection performance over RDNA 2.
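For context, here is a minimal Python sketch of how that hybrid split works; the data structures and function names are illustrative assumptions, not AMD's actual interfaces. The box-intersection test stands in for the fixed-function work of the Ray Accelerator, while the traversal loop and its stack stand in for the portion that runs on the shader engines.

```python
from dataclasses import dataclass, field

@dataclass
class Ray:
    origin: tuple     # (x, y, z)
    direction: tuple  # (x, y, z)

@dataclass
class AABB:
    lo: tuple
    hi: tuple

@dataclass
class Node:
    bounds: AABB
    children: list = field(default_factory=list)   # inner node
    triangles: list = field(default_factory=list)  # leaf payload

def ray_aabb_hit(ray, box):
    # Slab test: this per-node intersection math is the kind of work
    # RDNA 2/3 offloads to the Ray Accelerator.
    tmin, tmax = 0.0, float("inf")
    for o, d, lo, hi in zip(ray.origin, ray.direction, box.lo, box.hi):
        if abs(d) < 1e-12:
            if not (lo <= o <= hi):
                return False
            continue
        t0, t1 = (lo - o) / d, (hi - o) / d
        tmin, tmax = max(tmin, min(t0, t1)), min(tmax, max(t0, t1))
    return tmin <= tmax

def traverse(ray, root):
    # The loop, branching, and stack management here run as ordinary
    # shader code on RDNA 2/3 -- the part a larger fixed-function
    # engine could absorb.
    stack, candidates = [root], []
    while stack:
        node = stack.pop()
        if not ray_aabb_hit(ray, node.bounds):
            continue  # hardware box test culled this subtree
        if node.children:
            stack.extend(node.children)
        else:
            candidates.extend(node.triangles)  # leaf: triangle tests follow
    return candidates
```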

The way Kepler_L2 puts it, RDNA 4 will feature ray tracing hardware fundamentally transformed from that of RDNA 2 and RDNA 3. This likely means delegating more of the ray tracing workload to fixed-function hardware, unburdening the shader engines further. AMD is expected to debut RDNA 4 with its next line of discrete Radeon RX GPUs in the second half of 2024. Given the chatter about a power-packed AMD event at Computex, where the company is expected to unveil its "Zen 5" CPU microarchitecture for both server and client processors, we might expect some talk of RDNA 4, too.
Sources: HotHardware, Kepler_L2 (Twitter)

227 Comments on AMD to Redesign Ray Tracing Hardware on RDNA 4

#76
Assimilator
Yikes, so AMD is basically gonna be a generation behind at best.
Vya DomusIf you can find even one instance of consumers asking for stuff like path tracing or upscaling before 2018, I'll concede that everyone was in fact begging for these things to be added and I am just being delusional.
Ah, so moving the goalposts again. Well, I'll play: I've been hoping for ray and path tracing to replace raster since I learned how much of an inferior technical compromise the latter is, which would've been around... 1997, I think, when my high school comp sci teacher brought it up. This was before I had internet access at home; he lent me a book about implementing RT in software (TBH, a lot of the mathematics went over my head), but the concept just made so much sense in terms of visual fidelity.
AnarchoPrimitivAlso, ever consider that AMD's reasons for not doing something aren't choice so much as financial constraint? It seems as though whenever somebody compares Nvidia, Intel, and AMD, they just assume that they're all competing on an even playing field, when that couldn't be further from the truth. Nvidia and Intel are so much bigger than AMD and have so many more resources that we shouldn't view it as AMD underachieving compared to Nvidia and Intel; rather, what AMD is able to achieve with so few resources, against competition holding every advantage, is truly impressive.

Up until the last three or four years, Nvidia had an R&D budget over twice as big as AMD's, and Intel had one over 7x as large; despite that, AMD was able to compete and hang with Nvidia and actually beat Intel. Can anybody name an example from any other industry where a company is able to successfully compete while being completely outmatched, resource-wise, by its competition?
The only people who care that AMD has a smaller R&D budget are fanboys trying to explain away why their products are shit.
ToTTenTranzI wonder where all this progressivism was when ATi/AMD tried to introduce TruForm, then the first tessellation hardware in DX10, then TruAudio.
All of which were good enough to be widely adopted in consoles (i.e. a much higher volume of gaming hardware than PC dGPUs), but which the PC crowd for some reason ignored.
You mean consoles with GPUs manufactured by AMD? Yeah, I really wonder why AMD would include its own technologies in its own products... that's a true head-scratcher.
#77
the54thvoid
Super Intoxicated Moderator
I read an argument about delivering what consumers wanted. RT improves certain graphical situations, no doubt, but using the most neutral logic, consumers did not ask for RT tech. This was very much led by the industry. That doesn't make it a bad move, but it's just not true to say 'we, the consumer' had any inclination for it. My old industry was exactly the same. Manufacturers of products needed to 'innovate' in order to move to a new sales model. The manufacturers (via proxies) told us what we wanted to buy. Most times, the new stuff was 100% fad.

There is a difference here though. In the longer term, IIRC, RT and PT will make it easier for devs to code? Once the hardware is up to scratch, it should make things better for all of us. I mean, the movie industry has used RT in its CGI for ages. It's what makes CGI look real.

Then again, to play devil's advocate, I don't need my game to look real. I want to enjoy the game. Case in point: Sony make some stellar titles with amazing immersion. My fave game in a long time is Days Gone. It's got no RT. Same for the Horizon games, I think. There are indie games that don't need it either (Superhot, I'm looking at you). 2D scrollers are another example.

In the push for realism, the graphical hardware requirement goes up and up. More fidelity needs more componentry. RT/PT is the future, but it's not the present, and that's the worst part about PC gaming right now. Nvidia are hardcore pushing the AI narrative and steaming ahead with their hardware to suit. They are undoubtedly the leaders, but they're also leading us on a ride. They need to push AI fidelity solutions because their own hardware isn't powerful enough to run what they say is the best thing ever. They're playing clever language games with us, then have the cheek to make up PR speak about paying more to save... Cringe.

But, without any contest, it's what they can do. There is always a market leader, and the consumer base will either roll with them, vilify them, or (like me) hold their hands up and say, "It's not worth it to me. I see no reason to chase this mighty green dragon." Yet I did buy an Nvidia card this round (though for the first time in years it was 3 tiers down). My criteria were simple: 50% faster than my old 2080 Ti, similar power draw, and quiet fans. RT didn't matter so much, but I'll not lie, it was about 10% of my decision.

Now, as a mod - I'll say this:

This news piece is about RDNA 4 and RT. This isn't a place for rasterisation discussions, or posting benchies of non-RT raster. If you're arguing AMD's point on that here, you're definitely in the wrong thread.
#78
evernessince
ZoneDymoAnywho, there is no denoiser for it, so it's a grainy mess, but obviously in the future we could simply up the bounce count to a point where a denoiser would not be needed.
But like you said, that is future music.
You mean samples per pixel, correct (i.e. the number of primary rays per pixel)? Increasing bounces would enable more realistic light propagation, while casting more primary rays would create higher-quality, less noisy output. Current ray tracing effects in most games use 0.5 primary rays per pixel and 0 bounces. For example, if you look at CP2077 PT, you can see that GI doesn't illuminate the underside of objects from sunlight bouncing off the ground, as it would in real life.
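To make the distinction concrete, here's a toy Python sketch (illustrative only; the scene and camera interfaces are assumed, not any engine's real API): spp controls how many primary rays are averaged to cut noise, while the bounce budget controls how many hops indirect light can take, e.g. sun -> ground -> underside of an object needs at least one bounce.

```python
import random

def trace(ray, scene, bounces_left):
    # Toy scalar-radiance path tracer (hypothetical scene API).
    # Each extra bounce lets light take one more hop; with 0 bounces,
    # GI like ground-bounced sunlight never reaches the underside of objects.
    hit = scene.intersect(ray)
    if hit is None:
        return scene.sky_radiance(ray)
    radiance = hit.emitted
    if bounces_left > 0:
        bounced = hit.scatter(ray)  # new ray leaving the hit point
        radiance += hit.albedo * trace(bounced, scene, bounces_left - 1)
    return radiance

def render_pixel(scene, camera, x, y, spp=1, max_bounces=0):
    # spp = primary rays (samples) per pixel. Averaging more samples is
    # what reduces grain (variance falls roughly as 1/spp) -- with enough
    # of them you wouldn't need a denoiser. max_bounces instead controls
    # how physically complete the light propagation is.
    total = 0.0
    for _ in range(spp):
        jittered = camera.primary_ray(x + random.random(), y + random.random())
        total += trace(jittered, scene, max_bounces)
    return total / spp
```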


On the article: I have to wonder if it's feasible for AMD to separate dedicated RT hardware into a chiplet. This would allow them to pack more cores onto the GPU core die itself while tailoring RT performance to the market. Imagine being able to configure the number of RT chiplets to the task, allowing them to create cheaper GPUs without RT for those who want to save money, and GPUs packed with RT chiplets for those who want the best RT performance. They could even make dedicated RT accelerator GPUs for video production use, all the while using the same basic set of chiplets across their entire stack. It might also be wise to segment the AI cores from the GPU core.

Of course, I'm not a GPU engineer, so I can't say whether latency and bandwidth would be an issue here. We know AMD couldn't do two GPU dies due to bandwidth issues, at least not yet using their current organic substrate. That said, the bandwidth requirements of just a fraction of that die might be a different story.
the54thvoidThere is a difference here though. In the longer term, IIRC, RT and PT will make it easier for devs to code? Once the hardware is up to scratch, it should make things better for all of us. I mean, the movie industry has used RT in its CGI for ages. It's what makes CGI look real.
I'd say it absolutely will not make it easier to code. It'll make it easier to get higher-quality lighting, for sure, but at the end of the day you'll still want to be placing lights by hand and creating custom shaders if you want to achieve a specific kind of visual effect.
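Rough sketch of what I mean (made-up names, no real engine API): the tracer answers visibility queries, but what a surface does with the light is still custom shader code someone has to write for the look they're after.

```python
def stylized_metal(normal, view_dir, light_dir, base_color):
    # A hand-written shading model -- exactly the kind of custom code
    # path tracing doesn't replace. Here: a cheap stylized rim-lit metal.
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    n_dot_l = max(dot(normal, light_dir), 0.0)          # diffuse term
    rim = (1.0 - max(dot(normal, view_dir), 0.0)) ** 3  # artistic rim glow
    return tuple(c * (0.2 + 0.8 * n_dot_l) + 0.5 * rim for c in base_color)
```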
#79
Vya Domus
NoyandAnd you are calling people out on being dishonest
Not a single clue what you're trying to say; I never said anything that wasn't true. I literally cannot see how I am being dishonest. Intel's raster performance is atrocious; that may make comparisons seem strange, but it's not my fault there are situations where they're beaten by mid-range cards from 5 years ago.

I am told Intel is this really competitive new third player in the GPU market that is for sure gonna destroy AMD, but the fact is that performance is in the dumps on Intel GPUs by almost every conceivable metric.
#80
RGAFL
To me, RDNA 4 is a test bed for the gen after it. I can say, having spoken with AMD reps and other devs that do have test cards, that the RT is much improved RELATIVE to the tier they release at. I've heard some say about a 25% improvement. Some of the improvements will also filter down to RDNA 3 in future drivers, but not all, since they have improved the hardware in RDNA 4, so obviously only the software side carries over. On the test cards there is now dedicated hardware for RT and upscaling; how that pans out I don't know yet, but they do know that they have to improve.

Also, from what I'm hearing, don't discount a high-end card.
#81
Denver
dgianstefaniThe Arc A770 obliterates the RX 7600.

The 7600 XT comparison you cite is a three-month-old review with old drivers, and you've specifically chosen 1080p with RT off; the Intel card scales better at higher resolutions and/or with RT on.


Irrelevant huh?


4K? The shaders' ability to handle RT within the rendering pipeline is inherently limited. When the number of rays remains within those constraints, the difference is less pronounced. Exceeding those capacities, however, results in the shader units becoming overloaded. The overload parallels the performance drop experienced when VRAM is insufficient: performance plummets far below the expected power ratio, as the overload impedes the shaders' ability to complete their workload effectively. Plus, the performance discrepancy is also influenced by VRAM limitations.
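A toy model of that cliff (all numbers invented purely for illustration): below the shader units' ray budget, frame time grows linearly with ray count; past it, queued work stalls the pipeline and cost blows up superlinearly, much like spilling out of VRAM.

```python
def frame_time_ms(rays, ray_budget=1_000_000, base_ms=8.0, rt_ms_at_budget=4.0):
    # Invented numbers, illustration only. Under budget: linear RT cost.
    # Over budget: stalls compound, so cost grows superlinearly -- loosely
    # analogous to the cliff you hit when a workload spills out of VRAM.
    rt_ms = rt_ms_at_budget * rays / ray_budget
    overload = max(rays / ray_budget - 1.0, 0.0)
    return base_ms + rt_ms * (1.0 + 3.0 * overload ** 2)

for frac in (0.5, 1.0, 1.5, 2.0):  # fraction of the ray budget
    print(f"{frac:.1f}x budget -> {frame_time_ms(frac * 1_000_000):.1f} ms")
```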

While ASICs optimized for RT would offer better performance under these circumstances, the gameplay remains unplayable on both low-end and dedicated RT hardware. Regardless, for those interested, RDNA 4's implementation, with larger blocks dedicated to RT, promises significantly improved performance (while remaining just as unplayable as the others)...

#82
RGAFL
evernessinceI'd say it absolutely will not make it easier to code. It'll make it easier to get higher-quality lighting, for sure, but at the end of the day you'll still want to be placing lights by hand and creating custom shaders if you want to achieve a specific kind of visual effect.
Someone gets it: you still have to program the shaders. It's not as simple as placing a light.
#83
ToTTenTranz
AssimilatorYikes, so AMD is basically gonna be a generation behind at best.
In the eyes of Nvidia diehard fans and Nvidia marketeers, AMD is basically gonna be at least three generations behind at best at whatever their favorite trillion-dollar corporation orders them to adore at any point in time.

Even if RDNA4 brings RT performance-per-watt parity with Lovelace or even Blackwell, the goalposts will probably shift to whatever Nvidia has at a scale that AMD doesn't. With Fermi it was compute throughput, with Kepler it was geometry performance, with Turing / Ampere / Ada it was RT performance, and with Blackwell it'll probably be AI.

There's no remotely possible catching up when the >80% market-share player uses its dominant position to weasel in new goalposts every generation.
AssimilatorYou mean consoles with GPUs manufactured by AMD? Yeah, I really wonder why AMD would include its own technologies in its own products... that's a true head-scratcher.
Videogame developers happily adopted the 1st-gen tessellation on the X360 and TruAudio on the PS4 and XB1, yet their PC counterparts in AMD dGPUs were promptly ignored.
#84
Makaveli
dgianstefaniThe Arc A770 obliterates the RX 7600.

The 7600 XT comparison you cite is a three-month-old review with old drivers, and you've specifically chosen 1080p with RT off; the Intel card scales better at higher resolutions and/or with RT on.
Would anyone really be using any of these cards (A770, 7600, or 7600 XT) with RT at anything above 1080p and expecting good performance?
#85
dgianstefani
TPU Proofreader
MakaveliWould anyone really be using any of these cards (A770, 7600, or 7600 XT) with RT at anything above 1080p and expecting good performance?
For a £300+ card I want to be able to use whatever resolution and settings I please.

Besides, GPUs are traditionally tested at higher resolutions, and CPUs at lower resolutions, to avoid bottlenecks from other parts of the system.

It's true, 1080p/1440p is the range these cards can comfortably operate in with settings maxed. However, the A770 in particular is decent as a productivity card and doesn't mind higher resolutions for gaming either.
#86
RGAFL
dgianstefaniFor a £300+ card I want to be able to use whatever resolution and settings I please.

Besides, GPUs are traditionally tested at higher resolutions, and CPUs at lower resolutions, to avoid bottlenecks from other parts of the system.

It's true, 1080p/1440p is the range these cards can comfortably operate in with settings maxed. However, the A770 in particular is decent as a productivity card and doesn't mind higher resolutions for gaming either.
Be honest, would you buy a £300 card to play games with all the RT bells and whistles at 1440p? If so, why do you have a 3080 Ti, if that is your card in your specs? A £300 card cannot hit a consistent 60 fps unless settings are turned down.
#87
dgianstefani
TPU Proofreader
RGAFLBe honest, would you buy a £300 card to play games with all the RT bells and whistles at 1440p? If so, why do you have a 3080 Ti, if that is your card in your specs? A £300 card cannot hit a consistent 60 fps unless settings are turned down.
Are we moving the goalposts again?

The original comment thread was how does Arc compare at RT compared to the competition.

Someone tried to say the RX 7600 is a faster card, so I posted some TPU testing that shows that's not the case. RT or no.

If I wanted to play at 1080p60 with RT off I'd buy an APU.
#88
Vya Domus
Such bizarre logic: the A770 is apparently a really decent card for RT, but if you asked the same Nvidia fanboy about the 7900 XTX, which is twice as fast in RT, he'd tell you it's basically worthless if you want to play games with RT on.
#89
dgianstefani
TPU Proofreader
Vya DomusSuch bizarre logic: the A770 is apparently a really decent card for RT, but if you asked the same Nvidia fanboy about the 7900 XTX, which is twice as fast in RT, he'd tell you it's basically worthless if you want to play games with RT on.
3x the price, and pointless if you use RT, compared to the NVIDIA competition at the same price point.

Keep name calling though.
#90
RGAFL
dgianstefaniAre we moving the goalposts again?

The original comment thread was how does Arc compare at RT compared to the competition.

Someone tried to say the RX 7600 is a faster card, so I posted some TPU testing that shows that's not the case. RT or no.
What is it with the goalposts all the time? Don't like being proven wrong, it seems. It was a genuine question: would you buy a £300 card for all the bells and whistles at 1440p 60 fps?

I think someone's got the needle about the Intel thread. I didn't get proven wrong there either. Gaming: Zen 4. Datacentre and servers: Zen 4. Niche cases that should use a GPU instead of the CPU: Intel or AMD.
#91
Vya Domus
dgianstefani3x the price
Ah yes, of course: when Nvidia has faster GPUs in RT, they can charge a disproportionately large premium over AMD and it's all cool; but if AMD has a faster GPU in RT than Intel, they can't, because uhm... uhm...

Ah, shucks, I forgot about the "Nvidia good, AMD bad" infallible logic once again, my bad.
#92
dgianstefani
TPU Proofreader
Vya DomusAh yes, of course: when Nvidia has faster GPUs in RT, they can charge a disproportionately large premium over AMD and it's all cool; but if AMD has a faster GPU in RT than Intel, they can't, because uhm... uhm...

Ah, shucks, I forgot about the "Nvidia good, AMD bad" infallible logic once again, my bad.
Except the cheaper 4070 Ti S beats the 7900 XTX with RT on at any resolution, and it's much, much faster with PT. So where's that premium? Or are we being disingenuous again?

Even the 12 GB 4070 Ti is faster at every resolution except 4K, while using 75 W less.
RGAFLWhat is it with the goalposts all the time? Don't like being proven wrong, it seems. It was a genuine question: would you buy a £300 card for all the bells and whistles at 1440p 60 fps?
If my budget was £300 and I was choosing between a 7600 XT and an Arc A770, I'd absolutely pick the A770.

The cheapest 7600 XT is £20 more, though, at £320.
#93
64K
This is an enthusiast site, so it's natural to think of 4K as a thing, but really it's not and never has been. It probably never will be, because with each new, faster generation of GPUs we also get more demanding games; the 4K goalpost gets pushed endlessly out of the reach of the vast majority of gamers. Even after 12 years on the market, and with 4K monitors having become very reasonably priced in the last few years, the Steam Hardware Survey shows 4K at a paltry 4%. By far the most common resolution is 1080p at 58%, followed by 1440p at 19%.

That's just a reality check. GPUs need to be able to handle RT in small implementations. It's not an all-or-nothing scenario, as some might believe.
#94
RGAFL
dgianstefaniIf my budget was £300 and I was choosing between a 7600 XT and an Arc A770, I'd absolutely pick the A770.
Good, so would I, probably. See, simples. But with all the bells and whistles at 1440p? Not a chance in hell with either. Even at 1080p they would struggle in some titles.
#95
Vya Domus
dgianstefaniOr are we being disingenuous again
Of course you are; the 7900 XTX is still faster in raster, and if you want a GPU from Nvidia that can comfortably beat it at that metric, you actually have to go all the way up to the eye-watering $1,600 4090.

I know you are totally obsessed with RT, and if it were up to you literally nothing else would matter, but that's not how it works in the real world. AMD can ask for a premium for better raster performance in the same way Nvidia can for RT; it must be really hard to wrap your head around the concept, but it's actually a real phenomenon.

The thing is, Intel sucks at both raster and RT, so you shouldn't be so bewildered that AMD cards are much more expensive than Intel's.
#96
dgianstefani
TPU Proofreader
Vya DomusOf course you are; the 7900 XTX is still faster in raster.
What was it the mod said?
the54thvoidNow, as a mod - I'll say this: This news piece is about RDNA 4 and RT. This isn't a place for rasterisation discussions, or posting benchies of non-RT raster. If you're arguing AMD's point on that here, you're definitely in the wrong thread.
#97
the54thvoid
Super Intoxicated Moderator
dgianstefaniWhat was it the mod said?
You've all been guilty :p . But as you've quoted me, fine, I'll say it again.

RT talk, people. Raster isn't the point of this OP.

Reply bans for those unwilling to discuss RT.
#98
Noyand
Vya DomusNot a single clue what you're trying to say; I never said anything that wasn't true. I literally cannot see how I am being dishonest. Intel's raster performance is atrocious; that may make comparisons seem strange, but it's not my fault there are situations where they're beaten by mid-range cards from 5 years ago.

I am told Intel is this really competitive new third player in the GPU market that is for sure gonna destroy AMD, but the fact is that performance is in the dumps on Intel GPUs by almost every conceivable metric.
IIRC both of you were arguing about RT speed, not raster. Alchemist was delayed™ to death, and they settled on competing in the lower mid-range. Here's a reminder of the context in which Arc launched: the RX 6800 XT has always been waaaaaay out of reach for Arc Alchemist, and they even thought they needed to undercut the RX 6700 XT to have a chance... In the current market Arc is just lost; many European stores have even stopped selling them altogether, and the few that are left are so badly priced it's not even funny :D An RX 6800 is currently 50€ more expensive than an A770. Yup. You read that right.


We know that Intel fumbled when it comes to general rendering performance; in theory the A770 should have been faster than it is, but it failed to deliver. However, TPU's review showed that the hit Arc takes when RT is on is smaller than RDNA 2's and not that far behind Ampere's. That doesn't mean Battlemage will manage to battle it out with Blackwell and RDNA 4, but if they can figure out the general rendering performance, they might realistically target overall parity with Ada's upper mid-range (and Nvidia has even made the job easier for them).
#99
Makaveli
dgianstefaniFor a £300+ card I want to be able to use whatever resolution and settings I please.

Besides, GPUs are traditionally tested at higher resolutions, and CPUs at lower resolutions, to avoid bottlenecks from other parts of the system.

It's true, 1080p/1440p is the range these cards can comfortably operate in with settings maxed. However, the A770 in particular is decent as a productivity card and doesn't mind higher resolutions for gaming either.
Wanting to use whatever resolutions and settings you please is usually reserved for high-end cards, not £300+ ones.

I've never heard anyone say that about a mid-range GPU, even more so with RT on.
#100
RGAFL
NoyandIIRC both of you were arguing about RT speed, not raster. Alchemist was delayed™ to death, and they settled on competing in the lower mid-range. Here's a reminder of the context in which Arc launched: the RX 6800 XT has always been waaaaaay out of reach for Arc Alchemist, and they even thought they needed to undercut the RX 6700 XT to have a chance... In the current market Arc is just lost; many European stores have even stopped selling them altogether, and the few that are left are so badly priced it's not even funny :D An RX 6800 is currently 50€ more expensive than an A770. Yup. You read that right.

We know that Intel fumbled when it comes to general rendering performance; in theory the A770 should have been faster than it is, but it failed to deliver. However, TPU's review showed that the hit Arc takes when RT is on is smaller than RDNA 2's and not that far behind Ampere's. That doesn't mean Battlemage will manage to battle it out with Blackwell and RDNA 4, but if they can figure out the general rendering performance, they might realistically target overall parity with Ada's upper mid-range (and Nvidia has even made the job easier for them).
If it were strictly RT games I wanted to play at 1080p/1440p, then I would go Intel, truth be told. But in the real world, for the sake of the 4 or 5 games that some cards do really well in (truth be told, that's all it is), is it worth sacrificing performance in nearly 100% of other games?