Monday, July 22nd 2024

Several AMD RDNA 4 Architecture Ray Tracing Hardware Features Leaked

We've known since May that AMD is giving its next-generation RDNA 4 graphics architecture a significant ray tracing performance upgrade, and we've had some clues since then that the company is working on putting more of the ray tracing workflow through dedicated, fixed-function hardware, further unburdening the shader engine. Kepler_L2, a reliable source of GPU leaks, sheds light on some of the many new hardware features AMD is introducing with RDNA 4 to better accelerate ray tracing, which should reduce the performance cost of enabling ray tracing on its GPUs. Kepler_L2 believes these hardware features should also make it to the GPU of the upcoming Sony PlayStation 5 Pro.

To begin with, the RDNA 4 ray accelerator introduces the new Double Ray Tracing Intersect Engine, which should mean at least a 100% ray intersection performance increase over RDNA 3, which in turn offered a 50% increase over RDNA 2. The new RT instance node transform instruction should improve the way the ray accelerators handle geometry. Some of the other features we have trouble describing include a 64-byte RT node, ray tracing tri-pair optimization, change flags encoded in barycentrics to simplify detection of procedural nodes, an improved BVH footprint (possibly memory footprint), and RT support for oriented bounding box and instance node intersection. AMD is expected to debut Radeon RX series gaming GPUs based on RDNA 4 in early 2025.
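For context on what an "intersect engine" actually computes: BVH traversal boils down to enormous numbers of ray-vs-bounding-box and ray-vs-triangle tests, so a doubled intersect engine can retire roughly twice as many of these tests per clock. The sketch below is a purely illustrative software version of the standard slab-method ray/AABB test — not AMD's implementation, just the kind of work the hardware accelerates:

```python
def ray_aabb_intersect(origin, inv_dir, box_min, box_max):
    """Slab-method ray vs. axis-aligned bounding box test.

    origin:  ray origin as a 3-tuple
    inv_dir: per-axis reciprocal of the ray direction (precomputed,
             as real traversal code typically does)
    box_min, box_max: the AABB's corner points as 3-tuples
    Returns True if the ray hits the box at parameter t >= 0.
    """
    t_near, t_far = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
        # Entry/exit distances for this axis's pair of slab planes
        t0 = (lo - o) * inv
        t1 = (hi - o) * inv
        if t0 > t1:
            t0, t1 = t1, t0
        # Shrink the interval where the ray is inside all slabs so far
        t_near = max(t_near, t0)
        t_far = min(t_far, t1)
        if t_near > t_far:
            return False  # intervals no longer overlap: miss
    return True


# A ray along (1,1,1) from the origin hits a box spanning (1,1,1)-(2,2,2)
print(ray_aabb_intersect((0.0, 0.0, 0.0), (1.0, 1.0, 1.0),
                         (1.0, 1.0, 1.0), (2.0, 2.0, 2.0)))
```

A hardware ray accelerator evaluates tests like this for many BVH child nodes in parallel; the leaked oriented bounding box support would extend the same idea to boxes rotated to fit geometry more tightly than axis-aligned ones can.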
Sources: Kepler_L2 (Twitter), VideoCardz

247 Comments on Several AMD RDNA 4 Architecture Ray Tracing Hardware Features Leaked

#26
ARF
NC37RDNA3 was such a stopgap. Should give RDNA3 owners a rebate. Clear RDNA3 was dead on arrival and AMD knew it.
AusWolfWhat makes you say that?
We all know that Navi 31 is extremely underperforming, with many issues still not fixed.



#27
AusWolf
64KThis is more good news and more hope for a healthier market, because the market share right now is not healthy at all.

Setting aside the Nvidia mindshare issue, which is real: what AMD is doing about RT performance isn't going to turn that around completely, but it could make a dent in it. I hear it over and over that AMD needs to compete more with Nvidia, but sometimes it seems the message is really that AMD needs to compete more with Nvidia by not competing with Nvidia. Nvidia is pushing RT hard. That's going to be the future. There will continue to be software improvements to keep frame rates from tanking, but eventually all those old GPUs will get replaced with GPUs that can handle RT better. Note that I'm not talking about fully ray-traced games, just a mixture of ray tracing and rasterization like we have now.

Inevitably someone will say that there are hardly any games that use RT but that's not true. For now there are over 500 games using some RT and there are guaranteed to be thousands and thousands more as the years roll by.
I'm just wondering when we'll get to the age when even lower-midrange cards can run RT at acceptable frame rates. Turing was a demo of this new and experimental tech, and I'm getting the feeling that now, 6 years and 2 Nvidia generations later, it's still just that (new and experimental). Until RT becomes mainstream (in performance, not in game adoption), my vote goes to raster over RT. Let history prove me wrong.
#28
wolf
Better Than Native
AusWolfto replace the 7800 XT that I sold not long ago
Why did you get rid of it?
AusWolfLet history prove me wrong.
And/or let personal choice be personal choice; plenty of people are enjoying RT in plenty of games already. It's just not for you, and that's OK too.
#29
AusWolf
ARFWe all know that Navi 31 is extremely underperforming, with many issues still not fixed.
The only issue it has is video playback power consumption. Not meeting AMD's performance targets is a sad story, but I wouldn't call future improvements on that front "bug fixes".
wolfWhy did you get rid of it?
To save some cash for holidays this year (1 done, 2 to go). And because its price will probably tank once RDNA 4 is out, which I'm curious about.
wolfAnd/or let personal choice be personal choice; plenty of people are enjoying RT in plenty of games already. It's just not for you, and that's OK too.
I don't mind RT at all. :) What I do mind is running my games as a slide show and/or spending extortionate amounts of money on a single GPU just to be able to run it.
#30
TumbleGeorge
AusWolfI'm just wondering when we'll get to the age when even lower-midrange cards can run RT at acceptable frame rates.
In the 2028-2030 period.
#31
AusWolf
TumbleGeorgeIn the 2028-2030 period.
That sounds realistic. That's why I'm not too worried about RT performance right now. Sure, it's a nice extra, but nothing more.
#32
Vayra86
fevgatosIf amd wants to increase their market share (they don't, gpus are very big dies with minimal profits) they need to compete in RT simply because most of the userbase has nvidia cards. A 4080 user isn't going to upgrade to an 8900xtx if that xtx isn't beating it in rt as well.


Absolutely not. If amd cards compete in both raster and rt with nvidia they will fly off the shelves, especially the high end cards. I've been ready to pull the trigger on an amd card since 2017, but they always fail at the high end.
I think the most plausible reason for AMD to improve their RT is still that they want to remain in the high-volume, low-cost console business. The movement aligns; they don't need 4090+ performance in a console. Nothing even remotely near it, either. They do need efficient RT, because the market demands it.

If they can make a GPU with 7900 XT-level performance that's stronger in RT on a much smaller die / chiplet setup, they're golden; that and continued focus on FSR improvements will carry consoles fine.
#34
TumbleGeorge
Vayra86I think the most plausible reason for AMD to improve their RT is still that they want to remain in the high-volume, low-cost console business. The movement aligns; they don't need 4090+ performance in a console. Nothing even remotely near it, either. They do need efficient RT, because the market demands it.

If they can make a GPU with 7900 XT-level performance that's stronger in RT on a much smaller die / chiplet setup, they're golden; that and continued focus on FSR improvements will carry consoles fine.
This is one of the main reasons for delaying the PS6 until 2028.
#35
Vayra86
64KIt won't take too long. Have a look at this article from over a year ago:

www.tomshardware.com/news/rtx-on-nvidia-data-shows-surprising-amount-of-gamers-use-ray-tracing-dlss

RT has already been more popular with gamers than many here believe. Cards get old and get replaced. Eventually we will all be in the same place together and RT will just be as accepted as if it had always been there.
Absolutely. How long since Turing released now? Back then my prediction was ten years for it to mature... maybe we'll get there in 8?
#36
InVasMani
Seems promising overall; at least in terms of ray tracing, it's a sizable improvement. I'm a bit more interested in the other details, but it's a good sign in tandem with some of the other stuff we've heard about it. It should be a more competitive landscape this generation for GPUs. It's been a while since we've seen a new generation of GPUs offering some strong value.
#37
gffermari
You don't need a flagship to enjoy RT.
In the Turing era, a 2070 could decently play all available games, like Battlefield, Control, etc., with RT on.

In the Ampere era, the requirements changed. Metro Exodus with GI, for example.
You could still play it with a 2080 Ti, but some games needed better RT performance. A 3070 (= 2080 Ti), though, was capable of playing anything.

In the Ada era, path tracing was introduced, and games like Alan Wake 2 are available.
The mid-range cards can play anything now, although the prices are bad.

All these years you could play any RT game with a mid- to high-end Nvidia card, but not an AMD one.
That was the problem. Always a gen or two behind.
The 7000 series, as well as the 6000, were not bad. The problem was that they were released 2 years late.

No one seems to care if the XTX can deliver 3080-3090 levels of RT performance anymore, because that performance was available 4 years ago and the bar has since moved beyond it.
#38
john_
TumbleGeorgeIn period 2028-2030.
Too optimistic. It took 20 years to get to a place where a mid-range GPU can play everything in raster.
#39
wolf
Better Than Native
AusWolfTo save some cash for holidays this year (1 done, 2 to go). And because its price will probably tank once RDNA 4 is out, which I'm curious about.
Ahh yeah fair enough, no big regrets then like the other thread?
AusWolfWhat I do mind is running my games as a slide show and/or spending extortionate amounts of money on a single GPU just to be able to run it.
Interesting, I didn't do either of those; it wasn't extortionate, and no RT games have been a slide show. I guess it's all relative.
gffermariYou don't need a flagship to enjoy RT.
Exactly this, my 3080 has allowed me to enjoy many games with RT effects on, and even taste playable path tracing. As you say, a 7900 XTX can do this too, but that's yesteryear's performance, so it's not all that impressive.

Now if AMD comes out and matches Blackwell (per tier performance in raster and RT) I'll be extremely impressed.
#40
Sunny and 75
btarunrRDNA 4 ray accelerator introduces the new Double Ray Tracing Intersect Engine, which should at least mean a 100% ray intersection performance increase over RDNA 3
If they can achieve RT parity with the 4090 in CP2077 at less than half the price (~$650 US), that'd be impressive. By parity I mean raw RT performance at 100% render resolution.
#41
stimpy88
NC37RDNA3 was such a stopgap. Should give RDNA3 owners a rebate. Clear RDNA3 was dead on arrival and AMD knew it.
RDNA4 is exactly the same. I hope Radeon survives 5 years of non-competitive, overpriced hardware.

RDNA5 needs to be amazeballs.
#42
P4-630
So..In other words... I better buy Nvidia!.. :D
#43
AnarchoPrimitiv
In 2023, Nvidia had a $7.4 billion R&D budget, almost the entirety of which was spent on GPU development.

In 2023, AMD had a $5.8 billion R&D budget that was primarily spent on x86 development, as that is by far their largest revenue stream.

Now, I want everybody in these comments that just assumes AMD should be able to match Nvidia to explain to me how that is to be accomplished. Because the way 99% of you speak about it, you act like these two companies are on a level playing field, that they have access to the same sort of resources, and that for AMD it's just a problem of "not pricing videocards cheap enough," while completely ignoring the fact that the current capitalist paradigm is stock price above all and quarterly earnings above all....

Tell me how AMD is supposed to go out there, undercut Nvidia at each tier by $150+, and still keep stock prices up and investors happy while quarterly profits decrease, all the while LITERALLY paying either the same or even higher costs than Nvidia for the materials used to make the card (Nvidia probably gets components cheaper due to larger volume)....PLEASE explain that to me. If I remember correctly, Intel sold Alder Lake with a diminished profit margin; how has that worked out for them? Oh, that's right, AMD surpassed them in value.

The other lot of you act like it's merely a willpower problem, that AMD just doesn't "want it bad enough". Well, please explain to me why AMD should be focusing on videocards when they make the overwhelming majority of their money from x86? Why should they dump money into videocards when you consumers have proven numerous times in the past that even when they make a product that is OBJECTIVELY a better value, 90% of you STILL buy the Nvidia card (that's right, you're not as rational as you think you are, and research into consumer psychology has proven this time and time again)? If I were a business, that wouldn't sound like a good investment to me...

We literally live in a world where money and profit dictate reality, yet in over a decade of observing these "discussions," I honestly cannot think of a single instance where anyone even addressed the fact that Nvidia just plain has more resources to compete with, which is arguably the MOST determinant factor in this competition.

The other part of it that seemingly everybody ignores is the fact that the overwhelming majority, 99% of all consumers, including ALL OF YOU, make purchasing decisions based on IRRATIONAL factors like how the product makes you "feel", and we KNOW that's true for videocards, because even when AMD offers a compelling videocard that on paper is an OBJECTIVELY better value, the Nvidia competitor still outsells it 10 to 1.

I'm sure much of this is motivated by FOMO, as well as the fact that some of you probably just don't like the idea of coming onto forums like this and saying you have an AMD GPU, so you buy the Nvidia one because you want to be associated with the "winning side"...and don't laugh, because there are literally decades of consumer psychology research that prove the existence and primacy of these phenomena. How are you going to get irrational consumers to switch to a competitor based on something rational, like a product being a "better value"?
#45
Kn0xxPT
It's clear that Nvidia buyers choose Nvidia not because of RT; it's everything else around Nvidia GPUs: efficient perf/W, DLSS (and all the upscaling tech options), NVENC, optimized drivers, and then RT.
Instead of bumping RT performance, AMD should give priority to other systems: compete with NVENC, for example, and work on efficiency. The RT performance hit is still too big; FSR needs to evolve first.
#46
Assimilator
AnarchoPrimitivNow, I want everybody in these comments that just assumes AMD should be able to match Nvidia to explain to me how that is to be accomplished. Because the way 99% of you speak about it, you act like these two companies are on a level playing field, that they have access to the same sort of resources, and that for AMD it's just a problem of "not pricing videocards cheap enough," while completely ignoring the fact that the current capitalist paradigm is stock price above all and quarterly earnings above all....tell me how AMD is supposed to go out there, undercut Nvidia at each tier by $150+, and still keep stock prices up and investors happy while quarterly profits decrease....PLEASE explain that to me. If I remember correctly, Intel sold Alder Lake with a diminished profit margin; how has that worked out for them? Oh, that's right, AMD surpassed them in value.
By having something called a "business plan".
#47
Sunny and 75
P4-630I better buy Nvidia
If you're a 70 class user, wait for the @W1zzard review of the RDNA4 lineup, then decide on the purchase.

There won't be any high-end competition from AMD this time around (they intend to sit this one out).
#48
JustBenching
Vya DomusNo it's not, 4080 super models are a good 100 EUR more expensive most of the time, you just keep lying all the time.

bestvaluegpu.com/en-eu/history/new-and-used-rtx-4080-super-price-history-and-specs/
bestvaluegpu.com/en-eu/history/new-and-used-rx-7900-xtx-price-history-and-specs/
Those are the averages. Nvidia has more expensive cards, like the Noctua 4080, that raise the average.

Just go to alternate.de and you'll find both the 4080 Super and the XTX at 999. Both ready to be shipped. Seems like you are the one lying. All the time.

www.alternate.de/Gainward/GeForce-RTX-4080-SUPER-Panther-OC-Grafikkarte/html/product/100033241?sug=4080%20supwr
#49
P4-630
Sunny and 75If you're a 70 class user, wait for the @W1zzard review of the RDNA4 lineup, then decide on the purchase.

There won't be any high-end competition from AMD this time around (they intend to sit this one out).
I still use my 2070 Super. Will most likely do an Arrow Lake platform upgrade at the end of this year though.
Won't be buying a new GPU before GTA6 for PC is out, somewhere in 2026...
#50
Sunny and 75
P4-630Won't be buying a new GPU before GTA6 PC is out somewhere in 2026...
The right path to take, for sure!