Monday, July 22nd 2024
Several AMD RDNA 4 Architecture Ray Tracing Hardware Features Leaked
We've known since May that AMD is giving its next-generation RDNA 4 graphics architecture a significant ray tracing performance upgrade, and we've had indications since then that the company is moving more of the ray tracing workflow onto dedicated, fixed-function hardware, further unburdening the shader engines. Kepler_L2, a reliable source of GPU leaks, sheds light on some of the many new hardware features AMD is introducing with RDNA 4 to better accelerate ray tracing, which should reduce the performance cost of enabling it on its GPUs. Kepler_L2 believes these hardware features should also make it to the GPU of the upcoming Sony PlayStation 5 Pro.
To begin with, the RDNA 4 ray accelerator introduces the new Double Ray Tracing Intersect Engine, which should mean at least a 100% increase in ray intersection performance over RDNA 3, which in turn offered a 50% increase over RDNA 2; if both figures hold, RDNA 4's per-unit intersection rate would be roughly three times that of RDNA 2. A new RT instance node transform instruction should improve the way the ray accelerators handle geometry. Other listed features include a 64-byte RT node, ray tracing tri-pair optimization, change flags encoded in barycentrics to simplify detection of procedural nodes, an improved BVH footprint (likely referring to memory footprint), and RT support for oriented bounding box and instance node intersection. AMD is expected to debut Radeon RX series gaming GPUs based on RDNA 4 in early 2025.
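Of the listed items, oriented bounding box (OBB) intersection is the easiest to pin down conceptually: BVH traversal conventionally tests axis-aligned boxes, and an oriented box adds a rotation that, in software, amounts to projecting the ray onto the box's local axes before running the usual slab test. The C++ sketch below is purely illustrative of that textbook test and is not AMD's fixed-function design, whose data layout and precision are not public; the Ray, OBB, and rayIntersectsOBB names are hypothetical.

// Illustrative only: a conventional software ray vs. oriented-bounding-box test,
// shown to explain what hardware "oriented bounding box intersection" support
// would replace. All names and the node layout here are hypothetical.
#include <algorithm>
#include <array>
#include <cmath>
#include <utility>

struct Ray {
    std::array<float, 3> origin;
    std::array<float, 3> dir;   // need not be normalized
};

struct OBB {
    std::array<float, 3> center;
    std::array<std::array<float, 3>, 3> axes; // orthonormal local axes
    std::array<float, 3> halfExtent;          // half-size along each axis
};

// Classic "slab" test in the box's local frame: project the ray onto each box
// axis and intersect the three resulting parameter intervals. Only hits with
// t in [0, tMax] are accepted.
bool rayIntersectsOBB(const Ray& ray, const OBB& box, float tMax) {
    float tEnter = 0.0f;
    float tExit  = tMax;
    for (int i = 0; i < 3; ++i) {
        // Offset from ray origin to box center (e) and ray direction (f),
        // both expressed along the i-th box axis.
        float e = 0.0f, f = 0.0f;
        for (int k = 0; k < 3; ++k) {
            e += box.axes[i][k] * (box.center[k] - ray.origin[k]);
            f += box.axes[i][k] * ray.dir[k];
        }
        if (std::fabs(f) > 1e-9f) {
            float t1 = (e + box.halfExtent[i]) / f;
            float t2 = (e - box.halfExtent[i]) / f;
            if (t1 > t2) std::swap(t1, t2);
            tEnter = std::max(tEnter, t1);
            tExit  = std::min(tExit, t2);
            if (tEnter > tExit) return false;   // slab intervals do not overlap
        } else if (-e - box.halfExtent[i] > 0.0f || -e + box.halfExtent[i] < 0.0f) {
            return false;   // ray parallel to this slab and origin outside it
        }
    }
    return true;
}

Doing this kind of per-node work in the ray accelerator rather than on the shader cores is consistent with the general direction the leak describes, though how AMD actually implements it remains to be seen.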
Sources:
Kepler_L2 (Twitter), VideoCardz
247 Comments on Several AMD RDNA 4 Architecture Ray Tracing Hardware Features Leaked
If they can make a GPU with 7900 XT-level performance but stronger RT on a much smaller die / chiplet setup, they're golden; that and continued focus on FSR improvements will carry the consoles fine.
www.tomshardware.com/news/rtx-on-nvidia-data-shows-surprising-amount-of-gamers-use-ray-tracing-dlss
RT has already been more popular with gamers than many here believe. Cards get old and get replaced. Eventually we will all be in the same place together and RT will just be as accepted as if it had always been there.
In the Turing era, a 2070 could decently play all available games like Battlefield, Control, etc. with RT on.
In the Ampere era, the requirements changed. Metro Exodus with RT GI, for example.
You could still play it with a 2080 Ti, but some games needed better RT performance. A 3070 (roughly equal to a 2080 Ti) was capable of playing anything.
In the Ada era, path tracing was introduced and games like Alan Wake 2 became available.
The mid-range cards can play anything now, although the prices are bad.
All these years you could play any RT game with a mid to high-end Nvidia card, but not with an AMD card.
That was the problem. Always a gen or two behind.
The 7000 series, as well as the 6000, were not bad. The problem was that they were released two years late.
No one seems to care anymore whether the XTX can deliver 3080-3090 levels of RT performance, because that performance was available four years ago and the bar has moved beyond it.
Now if AMD comes out and matches Blackwell (per tier performance in raster and RT) I'll be extremely impressed.
RDNA5 needs to be amazeballs.
In 2023, AMD had a $5.8 billion R&D budget that was primarily spent on x86 development, as that is by far their largest revenue stream.
Now, I want everybody in these comments who just assumes AMD should be able to match Nvidia to explain to me how that is to be accomplished. Because the way 99% of you speak about it, you act like these two companies are on a level playing field, that they have access to the same resources, and that for AMD it's just a problem of "not pricing video cards cheap enough," while completely ignoring the fact that the current capitalist paradigm is stock price above all and quarterly earnings above all...
Tell me how AMD is supposed to go out there, undercut Nvidia at each tier by $150+, and still keep stock prices up and investors happy while quarterly profits decrease, all while paying the same or even higher costs than Nvidia for the materials used to make the card (Nvidia probably gets components cheaper due to larger volume)... PLEASE explain that to me. If I remember correctly, Intel sold Alder Lake with a diminished profit margin; how has that worked out for them? Oh, that's right, AMD surpassed them in value.
The rest of you act like it's merely a willpower problem, that AMD just doesn't "want it bad enough." Well, please explain to me why AMD should be focusing on video cards when they make the overwhelming majority of their money from x86. Why should they dump money into video cards when consumers have proven numerous times that even when AMD makes a product that is OBJECTIVELY a better value, 90% of you STILL buy the Nvidia card (that's right, you're not as rational as you think you are, and research into consumer psychology has proven this time and time again)? If I were a business, that wouldn't sound like a good investment to me...
We literally live in a world where money and profit dictate reality, yet in over a decade of observing these "discussions" I honestly cannot think of a single instance where anyone even addressed the fact that Nvidia just plain has more resources to compete with, which is arguably the MOST determinant factor in this competition.
The other part that seemingly everybody ignores is the fact that the overwhelming majority of consumers, 99% of them, including ALL OF YOU, make purchasing decisions based on IRRATIONAL factors like how the product makes you "feel," and we KNOW that's true for video cards, because even when AMD offers a compelling video card that on paper is an OBJECTIVELY better value, the Nvidia competitor still outsells it 10 to 1.
I'm sure much of this is motivated by FOMO, as well as the fact that some of you probably just don't like the idea of coming onto forums like this and saying you have an AMD GPU, so you buy the Nvidia one because you want to be associated with the "winning side"... and don't laugh, because there are literally decades of consumer psychology research proving the existence and primacy of these phenomena. How are you going to get irrational consumers to switch to a competitor based on something rational like a product being a "better value"?
bestvaluegpu.com/en-eu/history/new-and-used-rtx-4080-super-price-history-and-specs/
bestvaluegpu.com/en-eu/history/new-and-used-rx-7900-xtx-price-history-and-specs/
Instead of bumping RT performance, AMD should give priority to other things: compete with NVENC, for example, and work on efficiency. The RT performance hit is still too big, and FSR needs to evolve first.
There won't be any high-end competition from AMD this time around (they intend to sit this one out).
Just go to alternate.de and you'll find both the 4080 Super and the XTX at 999, both ready to ship. Seems like you are the one lying, all the time.
www.alternate.de/Gainward/GeForce-RTX-4080-SUPER-Panther-OC-Grafikkarte/html/product/100033241?sug=4080%20supwr
Won't be buying a new GPU before GTA6 PC is out somewhere in 2026...