Monday, July 22nd 2024

Several AMD RDNA 4 Architecture Ray Tracing Hardware Features Leaked

We've known since May that AMD is giving its next-generation RDNA 4 graphics architecture a significant ray tracing performance upgrade, and have had some indication since then that the company is moving more of the ray tracing workflow onto dedicated, fixed-function hardware, further unburdening the shader engine. Kepler_L2, a reliable source of GPU leaks, sheds light on some of the many new hardware features AMD is introducing with RDNA 4 to better accelerate ray tracing, which should reduce the performance cost of enabling ray tracing on its GPUs. Kepler_L2 believes these hardware features should also make it into the GPU of the upcoming Sony PlayStation 5 Pro.

To begin with, the RDNA 4 ray accelerator introduces a new Double Ray Tracing Intersect Engine, which should mean at least a 100% ray intersection performance increase over RDNA 3, which in turn offered a 50% increase over RDNA 2. A new RT instance node transform instruction should improve the way the ray accelerators handle geometry. Other features we have trouble describing include a 64-byte RT node; ray tracing tri-pair optimization; change flags encoded in barycentrics to simplify detection of procedural nodes; an improved BVH footprint (possibly memory footprint); and RT support for oriented bounding box and instance node intersection. AMD is expected to debut Radeon RX series gaming GPUs based on RDNA 4 in early 2025.
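If the leaked generational claims hold (a 50% intersection uplift from RDNA 2 to RDNA 3, and at least 100% from RDNA 3 to RDNA 4), the compound effect works out to roughly a 3x lower bound over RDNA 2 per ray accelerator. A quick sketch of that arithmetic; the figures are illustrative, derived only from the leaked percentages, not from measured throughput:

```python
# Relative ray intersection throughput per ray accelerator, normalized to
# RDNA 2, using only the leaked generational multipliers.
rdna2 = 1.0
rdna3 = rdna2 * 1.5  # RDNA 3: +50% over RDNA 2
rdna4 = rdna3 * 2.0  # RDNA 4: at least +100% over RDNA 3 (Double RT Intersect Engine)

print(f"RDNA 3 vs RDNA 2: {rdna3 / rdna2:.1f}x")  # 1.5x
print(f"RDNA 4 vs RDNA 3: {rdna4 / rdna3:.1f}x")  # 2.0x
print(f"RDNA 4 vs RDNA 2: {rdna4 / rdna2:.1f}x")  # 3.0x
```

That 3x figure is a floor for the intersection stage only, before accounting for the other listed features (tri-pair optimization, OBB support, the smaller BVH footprint), and says nothing about end-to-end frame-rate impact.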
Sources: Kepler_L2 (Twitter), VideoCardz

247 Comments on Several AMD RDNA 4 Architecture Ray Tracing Hardware Features Leaked

#101
ARF
Neo_Morpheus: Well, moving from my previous GPU (GTX 970) to a 7900 XTX doesn't feel like a DOA product, and unlike Ngreedia's offerings, mine had a $120 US rebate plus 2 games worth $170 US a couple of months after the original launch. So yes, they offered a rebate. :)
I guess that an upgrade from a GeForce 2, or even better, from a Riva TNT to a Radeon RX 7900 XTX would be super duper impressive.
These are completely different performance tiers, and historical periods. :roll:
#102
AnarchoPrimitiv
Assimilator: By having something called a "business plan".
That's a non-answer... that's the same as if a coach for a sports team came up to you and asked for advice on how to win the game and you told him "Score more points than the other team".
fevgatos: If AMD spends a fraction of Nvidia's R&D, shouldn't their cards also cost a fraction of the price?
That makes zero sense....why would AMD's chips from TSMC or components from other manufacturers cost less because AMD spends less on R&D?
#103
JustBenching
AnarchoPrimitiv: That makes zero sense....why would AMD's chips from TSMC or components from other manufacturers cost less because AMD spends less on R&D?
Huh? Are you asking how it's possible for a company with a fraction of the R&D costs to have lower prices?

They both have to pay TSMC and other components, but Nvidia has to pay 10 times the R&D costs that AMD spends (that was your claim, not mine); isn't it obvious that the latter can offer their products a lot cheaper?
#104
64K
AnarchoPrimitiv: That's a non-answer... that's the same as if a coach for a sports team came up to you and asked for advice on how to win the game and you told him "Score more points than the other team".
Well, that advice would work :p

I think what AMD is doing will work too. Both Intel and AMD know that they need to improve RT performance, and clearly AMD is on the right track, assuming they do pull it off, but there's no need to speculate on that right now. Just wait for samples to drop here and on other trusted sites for review.

I'm just glad to see AMD being more competitive. The current series was just underwhelming to most gamers looking to upgrade.
#105
ARF
AnarchoPrimitiv: why would AMD's chips from TSMC or components from other manufacturers cost less because AMD spends less on R&D?
Chips can have quite different prices, depending on many things, including profit margins, economy of scale (quantity, quality), employee salaries, etc.
#106
Vya Domus
fevgatos: At some point you need to stop pretending AMD is at 10% market share because of brand name.
When you have fanboys implying that yes, they will in fact pay X amount to stare down at reflections for 200 hours I have no option but to consider this is the case.
#107
AnarchoPrimitiv
Kn0xxPT: AMD's GPUs are one gen behind Nvidia... Nvidia is making sure to keep AMD on that step...
There are things that turn a product into better value in the mainstream: performance, efficiency, ecosystem.
Let's see what NVIDIA offers:
+ Better performance per watt
+ Better DLSS implementations (even XeSS got a little better than FSR, go figure)
+ Better encoding capabilities, aka NVENC... used in OBS and even on Discord
+ Better day-1 game-optimized drivers
+ Better software support via CUDA
- Premium price

On every "+", AMD failed to surpass Nvidia.
You're assuming these are things every consumer uses or cares about... there's also the little problem that consumers can CLAIM they care about these things only to literally never actually use them... what I'm trying to point out is that a lot of people around here just ASSUME consumers are rational when LITERALLY for 100 years, since Edward Bernays invented PR using the concepts of his uncle, Sigmund Freud, we have known that consumers are easily manipulated and far from rational.

Even in the past when AMD had an OBJECTIVELY better video card (I'm talking 10+ years ago), everybody still bought Nvidia... AdoredTV did a great multipart investigation on this with tons of empirical data a few years ago... and overcoming mindshare and "mentia" is not as simple as, for example, cutting prices.

I like to use the example of flagship phones. I think it's safe to assume that 90%+ of people who spend $1500 on a phone probably end up using 10% of the compute power of that phone, and instead buy it as a way to silently announce to the world their higher status in the hierarchy. They'd probably be better served by a $200 phone, a $500 laptop, and a $500 DSLR camera, but people still line up to spend $1500 on a phone whose hardware they'll never fully utilize... and the same is probably at play with video cards... they want all the features, but they probably won't actually use them... it's a FOMO thing.
ARF: Chips can have quite different prices, depending on many things, including profit margins, economy of scale (quantity, quality), employee salaries, etc.
No, AMD's expenditures on R&D will NEVER have an effect on the price of OTHER manufacturers' components.
#108
R0H1T
AnarchoPrimitiv: what I'm trying to point out is that a lot of people around here just ASSUME consumers are rational when LITERALLY for 100 years, since Edward Bernays invented PR using the concepts of his uncle, Sigmund Freud, we have known that consumers are easily manipulated and far from rational.
I mean it's not like they're not humans, no wait :slap:
#109
ARF
AnarchoPrimitiv: No, AMD's expenditures on R&D will NEVER have an effect on the price of OTHER manufacturers' components
He never meant that the relationship is one of cause and effect.
#110
JustBenching
Vya Domus: When you have fanboys implying that yes, they will in fact pay X amount to stare down at reflections for 200 hours I have no option but to consider this is the case.
Of course, the whole reason to upgrade your GPU is for better graphics. If €50 (or €0 in the EU) gets you better graphics, then of course you'll pay 0 euros. Why wouldn't you.
#111
AnarchoPrimitiv
fevgatos: Huh? Are you asking how it's possible for a company with a fraction of the R&D costs to have lower prices?

They both have to pay TSMC and other components, but Nvidia has to pay 10 times the R&D costs that AMD spends (that was your claim, not mine); isn't it obvious that the latter can offer their products a lot cheaper?
No, my original claim is that for the BOM on AMD's video cards, they will be paying at best the same as Nvidia does (assuming they go to the same manufacturers), or most likely even more, because they cannot make the same large-volume orders as Nvidia that incur a larger discount. Nobody made the claim that Nvidia "spends 10x what AMD does on R&D"; I literally gave the exact figures on what they spend. Also, I'm sure that expenditures on R&D amount to a small percentage of the overall cost of a video card versus the BOM, labor costs, transportation costs, import taxes, etc.
#112
JustBenching
AnarchoPrimitiv: Even in the past when AMD had an OBJECTIVELY better video card (I'm talking 10+ years ago), everybody still bought Nvidia... AdoredTV did a great multipart investigation on this with tons of empirical data a few years ago... and overcoming mindshare and "mentia" is not as simple as, for example, cutting prices.
That's a great point but there is a bit of an issue. It's just not true. When AMD (ATI) made better cards, they had way higher market share. They had anywhere from 40 to 60% up until Maxwell and Pascal (makes sense, doesn't it?). They lost market share when they were literally making horrible cards or no cards at all (the Vega lineup was delayed for a year).

I bought all of them btw. HD 4770, 5850, 6850 (that was a downgrade), 7850XT, had them all. Then they just stopped competing.
#113
Vya Domus
fevgatos: the whole reason to upgrade your GPU is for better graphics.
It used to be that graphics improvements were very noticeable. That isn't the case anymore; you need to squint to tell which graphical options look better, and even then you'd have no clue which one is actually objectively "better".

Here's a fun test: Can you tell which side is using RT?

#114
Makaveli
fevgatos: That's a great point but there is a bit of an issue. It's just not true. When AMD (ATI) made better cards, they had way higher market share. They had anywhere from 40 to 60% up until Maxwell and Pascal (makes sense, doesn't it?). They lost market share when they were literally making horrible cards or no cards at all (the Vega lineup was delayed for a year).

I bought all of them btw. HD 4770, 5850, 6850 (that was a downgrade), 7850XT, had them all. Then they just stopped competing.
ATI never had 40-60% market share in dGPU. Where are you getting that info from?

www.theregister.com/2006/12/06/q3_06_graphics_market/
#115
Assimilator
AnarchoPrimitiv: That's a non-answer... that's the same as if a coach for a sports team came up to you and asked for advice on how to win the game and you told him "Score more points than the other team".
Oh spare me. You used a lot of words to ask "how does a company get big and successful?" and I gave you an answer in kind. Mainly because I'm sick to death of pathetic fanboy drivel like yours that continually attempts to paint AMD as the poor underdog, that just needs people to be understanding of its failures and buy its products, and all of a sudden it will git gud and be able to compete.

AMD is a company. It exists to make money. It does not give a flying fuck about you. It will never do shit for you. So why, exactly, are you spending your time writing sob stories that make it look good? The number of people on this forum who have a parasocial relationship with a god damn corporation is mindblowing and horrifying in equal parts. Being a nerd is fine, being an unpaid corporate shill is not.

AMD is a company. A company that has traditionally been out-planned and out-executed by its competitors, who strangely are far wealthier. If it wants to be better than them, it needs a better plan and better execution, which surprise surprise is what my answer said. What it does not need, again because it's a god damned corporation, is your sympathy.
#116
AnarchoPrimitiv
ARF: He never meant that the relationship is one of cause and effect.
He said "If AMD spends less on R&D, shouldn't their products be cheaper", I'm pointing out that
fevgatos: That's a great point but there is a bit of an issue. It's just not true. When AMD (ATI) made better cards, they had way higher market share. They had anywhere from 40 to 60% up until Maxwell and Pascal (makes sense, doesn't it?). They lost market share when they were literally making horrible cards or no cards at all (the Vega lineup was delayed for a year).

I bought all of them btw. HD 4770, 5850, 6850 (that was a downgrade), 7850XT, had them all. Then they just stopped competing.
What's just not true? Is English your first language? I'm asking that with ALL respect, but it seems like I'm saying one thing and then you're arguing with another thing I haven't actually said. What I'm arguing is that there have been numerous times in the past when AMD has had a product that offered OBJECTIVELY better value than Nvidia's competing product, and despite this, consumers still overwhelmingly bought the Nvidia card with no empirical reason to do so. I do not see how either company's market share at any given point is relevant to that statement... a consumer does not consider a company's market share when deciding whose product to buy.
#117
JustBenching
Vya Domus: It used to be that graphics improvements were very noticeable. That isn't the case anymore; you need to squint to tell which graphical options look better, and even then you'd have no clue which one is actually objectively "better".

Here's a fun test: Can you tell which side is using RT?

Of course, but the best part is that if you can't tell the difference then you might as well go for a slower card in RT and save $0.

#118
Vya Domus
fevgatos: Of course, but the best
The best part is that you won't say it.

Come on, which is which ?
#119
JustBenching
Makaveli: ATI never had 40-60% market share in dGPU. Where are you getting that info from?
Everywhere. Even the AdoredTV video that AnarchoPrimitiv suggested has it.
#120
AnarchoPrimitiv
Assimilator: Oh spare me. You used a lot of words to ask "how does a company get big and successful?" and I gave you an answer in kind. Mainly because I'm sick to death of pathetic fanboy drivel like yours that continually attempts to paint AMD as the poor underdog, that just needs people to be understanding of its failures and buy its products, and all of a sudden it will git gud and be able to compete.

AMD is a company. It exists to make money. It does not give a flying fuck about you. It will never do shit for you. So why, exactly, are you spending your time writing sob stories that make it look good? The number of people on this forum who have a parasocial relationship with a god damn corporation is mindblowing and horrifying in equal parts. Being a nerd is fine, being an unpaid corporate shill is not.

AMD is a company. A company that has traditionally been out-planned and out-executed by its competitors, who strangely are far wealthier. If it wants to be better than them, it needs a better plan and better execution, which surprise surprise is what my answer said. What it does not need, again because it's a god damned corporation, is your sympathy.
Who's a fanboy? How is listing OBJECTIVE empirical data like R&D expenditures painting AMD as an underdog? I think you're engaging in projection here... I'll boil down my original post for you: I NEVER asked, in any way, "how does a company get big and successful"; what I AM asking is "how is AMD supposed to do all the things everybody thinks they should do?"

How is AMD supposed to do what everyone on this site thinks they should do when they have OBJECTIVELY fewer resources? All I was pointing out is that this is a highly multifaceted issue, and the factors I find to be most important, i.e., MONEY, are the exact ones I NEVER hear anyone discussing here. I'm pointing out that people like to tell companies what to do, but never address how they're supposed to do it.

Final question: Why are you getting personal and in your feelings on this? You're obviously all worked up as the contempt is seething from your words, why are you so offended? I don't think I deserve your obvious contempt.

FYI: If you look at my posting history, there are tons of times where I literally say that we should never cheer for one company or the other, and that we should be cheering for these companies to split market share evenly, because that would bring about the most competition and would be best for us as consumers... in that sense, I cheer for AMD, but if AMD were the dominant party, I'd be cheering against them... how could you ever think that someone with an OBVIOUSLY anti-capitalist worldview (look at my name) thinks a company cares about anybody? Personally, I'm offended by how you are literally inserting motive into my comments that was never there... and again, if you look at my past comments, I'm LITERALLY the one constantly talking about parasocial relationships and consumer irrationality... you're the only one getting emotional here, like somehow you've been personally offended.
#121
Makaveli
fevgatos: Everywhere. Even the AdoredTV video that AnarchoPrimitiv suggested has it.
Go back and look at my previous post; I added a link that shows that is not correct.

They had under 30% during that time.
#122
Assimilator
Vya Domus: It used to be that graphics improvements were very noticeable. That isn't the case anymore; you need to squint to tell which graphical options look better, and even then you'd have no clue which one is actually objectively "better".

Here's a fun test: Can you tell which side is using RT?

I remember the days when your argument was "but look at how crappy the RT image is". How the goalposts have shifted.
#123
JustBenching
Vya Domus: The best part is that you won't say it.

Come on, which is which?
LOL, of course I can't tell; not only is the picture you posted super low quality, it's not even a proper side-by-side. Post it side by side in normal quality.

Can you tell which one is which on my screenshot? Come on, say it.
Makaveli: Go back and look at my previous post; I added a link that shows that is not correct.

They had under 30% during that time.
Your link says ATI had 63%. But does it matter? It was way higher than it is today when they were making competitive cards.
#124
ARF
fevgatos: Of course, but the best part is that if you can't tell the difference then you might as well go for a slower card in RT and save $0.

Does NFS Most Wanted (2005) include RT?



#125
Vya Domus
Assimilator: I remember the days when your argument was "but look at how crappy the RT image is"
No you don't, because I never said RT looks worse.
fevgatos: not only is the picture you posted super low quality, it's not even a proper side-by-side.
It is side by side, what are you talking about? The image I uploaded is almost 1080p, but it doesn't matter because you're clutching at straws. If the use of RT is so blatantly obvious, it should be easy to tell which is which, even if I am not showing you a zoomed-in quadrillion-megapixel image.