
Several AMD RDNA 4 Architecture Ray Tracing Hardware Features Leaked

Most people on the internet (you know, the vocal ones) are fans of AMD. You can tell on forums, Reddit, etc. that they are the vast majority.
Dude, you bought a 2000-dollar GPU. Nvidia has a market share of 80%, AMD has a market share of 20%… you are the one with selective bias.
20% cannot be the vast majority!
 
Do you understand what the vocal minority is? It's the people who scream on forums about how bad Intel and Nvidia are. They are indeed the minority, as you pointed out; that's why the industry doesn't care about them. They are just being loud on social media. The industry follows the 80% of customers.

AMD knows this; that's why they are ignoring the "RT bad" crowd and trying to push RT. Let's just hope they do well this time around. A 100% increase over RDNA 3, as they claim, is pretty decent.
 
It's just not relevant anymore. It's as simple as that. I find myself enjoying games like Vampire Survivors (old-school graphics, fantastic mechanics and progression paths) on the Deck, relaxing in the sun, so much more than having top-notch graphics but being forced to sit behind a PC with keyboard/mouse and shitloads of GPU grunt.

At the same time, sure, I spent 100+ hours in Cyberpunk too, but two-thirds of that was without RT on a 1080, and the game isn't better in the last 33 hours with RT on a 7900XT. Graphics upgrade notwithstanding, the game underneath was already 'done' in the first two-thirds of that time; even with better graphics I still focused on finding more 'game' within it, but it's still the same shooter in a halfway-finished open world.

Nvidia's RT push hasn't brought a single game further in terms of gameplay.

312 hours on a PS5 in RT mode… around 130 hours on a PS4 Pro, and around 150 hours on a PC at 4K, ultra, RT shadows (7900XTX, etc).
I think I enjoyed the PS5 more, even at 30 fps (60 fps doesn't look good to me, I'm so inured to a 30 fps rate), although it was in HDR, whereas on the PC it had to be in SDR.
 
60/64CU $650 24GB, 275W - 5-10% faster than 7900XT raster, at least on par with 7900XTX in RT
54/58CU $550 20GB, 250W - On par with 7900XTX raster, slightly slower than 7900XTX in RT
48/52CU $400 16GB, 230W - 15% faster than 7800XT in raster, on par with 7900XT in RT

I have a 6900XT and I would even consider upgrading if these prices are good. Been doing AI stuff, and man, it's time-consuming. Would love to process my images and videos more quickly. Though I think if I were able to find some more modern algorithms, that would help a lot. Seems like all the tools I find came out four years ago, and it's difficult to tell when a given model was last updated. I've been curious what kind of speed improvements, if any, I'd get with RDNA3.

The focus here is on RT improvements, which is nice, but I am also highly curious what kind of AI improvements RDNA4 will have. Will it move to the XDNA architecture? Still use its own "AI accelerators"?
 
I fail to see how that is relevant. At this point people here are just discussing (weirdly enough) the value of RT. I offered a good representation of what RT offers over rasterization; it's not even about the game, it's about the tech, and that would be applied the same way in any other game.

But fine...even more DF material:



DOGMA2 looks gaudy with path tracing. I mean, if you're blind it looks acceptable and you can enjoy that snowy pixel crawl from the denoiser.
 
keep changing the narrative
This^

Low-to-mid range should have at least 12 GB of VRAM, mid-range at least 16-20 GB, and the ultra high end should already be at 32 GB. But there is little to no progress there, sadly. Older games that were released when 8 GB first appeared still look great, because they have about the same texture budget as current games.
This^

It is too expensive to get the benefit of that in a way I would be satisfied with.
The same.

My ultimate point is that RT currently offers little benefit for the performance penalty on any hardware.
Even the 4090 can't provide 60 FPS in CP2077 (not PT, just plain RT) at 4K maxed out settings natively (100% render resolution).

The upcoming 5090 will be able to do that but at what cost, 2000+ EUR?!

I believe RT will truly become mainstream only after the PS6 is out and available for purchase.

By mainstream I mean a 60-class GPU (non-Super, non-Ti) that is able to provide 60 FPS on average in RT-heavy titles at QHD with RT maxed out, natively (100% render resolution).
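Since "native (100% render resolution)" keeps coming up, here is a rough back-of-the-envelope sketch of what the usual upscaler presets actually render internally. The per-axis scale factors are the commonly published DLSS/FSR quality-mode values, nothing specific to RDNA 4 or any leak.

# Back-of-the-envelope internal render resolutions at a 4K output.
# Scale factors are the commonly published per-axis values for DLSS/FSR presets.
PRESET_SCALE = {
    "Native (100%)": 1.00,
    "Quality":       0.67,
    "Balanced":      0.58,
    "Performance":   0.50,
}

def render_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and per-axis scale."""
    return round(out_w * scale), round(out_h * scale)

for name, s in PRESET_SCALE.items():
    w, h = render_resolution(3840, 2160, s)
    print(f"{name:>14}: {w}x{h} ({w * h / (3840 * 2160):.0%} of the 4K pixel count)")

Even "Quality" ends up pushing well under half the pixels of true 4K, which is why the native-vs-upscaled argument matters for these 60 FPS claims.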

AMD is following the market. Everyone does that, and a feature needs to be addressed whether it is great or lackluster. If not, the company doing it will market it and get more sales. That is exactly what is happening with NV vs AMD. Every industry follows that schematic, and for a reason.
They have to if they intend to stay relevant; otherwise Sony will turn to others like Intel (or even Nvidia) for their next console.

If raster performance improves by 50% due to having 50% more shader cores while costing less than 50% more, and enabling RT no longer drops performance by 40% but only by 25%, I'll call it an improved RT core design. Until then, let's stay with my chocolate bar analogy. More of the same for a proportionally higher price is not progress. Period.
This^
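To put some numbers on the criterion quoted above: the fps figures below are made up purely for illustration, only the 50% / 40% / 25% relationships come from the quote.

# Hypothetical numbers, just to show the arithmetic behind the quoted criterion.
def rt_hit(raster_fps: float, rt_fps: float) -> float:
    """Fractional performance drop when RT is enabled."""
    return 1.0 - rt_fps / raster_fps

# Old card: 100 fps raster, 60 fps with RT -> a 40% hit.
old_hit = rt_hit(100.0, 60.0)

# New card with 50% more CUs: 150 fps raster. If RT fps only scaled with the
# CU count (90 fps), the hit stays at 40% -- the "chocolate bar" case.
# If the hit shrinks to 25% (112.5 fps with RT), the RT cores themselves improved.
same_design_hit = rt_hit(150.0, 90.0)
improved_design_hit = rt_hit(150.0, 112.5)

print(f"old: {old_hit:.0%}, same design: {same_design_hit:.0%}, improved: {improved_design_hit:.0%}")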
 
I can't take DF seriously when they're like "the scene on the right looks better"… FOH it does.
 
DOGMA2 looks gaudy with path tracing. I mean, if you're blind it looks acceptable and you can enjoy that snowy pixel crawl from the denoiser.

Did you even watch the video? They explain exactly why it looks the way it looks. Both that and Forza don't have the full official implementation; these are hacks on top of what is running underneath. Come on now.
 
If RT/PT doesn't look like this, I don't want it, and I'll call it like it is: a gimmick.


This is what I always expected from RT, not turning it on, taking a perf hit, and having it look practically the same.
 
All I hope for from this new gen is 7800XT raster performance for $399 or less.
 

Yeah, I miss the days of a next-gen lower-tier card offering last-gen higher-tier performance at lower-tier pricing, instead of today, where a next-gen lower-tier card offers last-gen higher-tier performance at higher-tier pricing.
 
Agreed, but I fear that people want AMD to release such GPUs in the hope that Ngreedia will cut their prices, just so everyone can give their money to… Ngreedia.
 
Even the 4090 can't provide 60 FPS in CP2077 (not PT, just plain RT) at 4K maxed out settings natively (100% render resolution).
Native is irrelevant.

Is this what native enjoyers are really after? The picture on the right is basically every new game's "native" due to freaking TAA. Just click it and see how much detail is gone.

(attached comparison screenshot)
 
TAA is not native. I'd rather play with fewer details at native resolution plus anti-aliasing settings to increase clarity than with ultra settings and RT but needing an upscaler (FSR, DLSS, XeSS).
This is very important especially in War Thunder: I use 4x SSAA even if it makes my graphics card noisier, because I can see distant enemies better. It makes me want a 4K screen, but unfortunately I don't intend to buy one soon, and my PC wouldn't handle it properly for now anyway.
 
The goal of AA is the opposite of upscaling: it's supposed to give you more detail at the cost of some performance. The fact that TAA sometimes fails at that is a different story. But for that matter, TAA isn't any more "native" than 8x MSAA.
 
I think you have it reversed…

I think AA smooths (or blurs) the aliasing: those jagged diagonal lines that should not be there, but are, because of the quantization or screen-door effect that too few pixels produce…

AA blurs… (no new content)

Upscaling generates detail from nothing, so it requires blurring, making a circle out of a square, so you don't notice the blob effect (or it works from clues in the picture: motion vectors, scene content, etc.).

lol, CSI's detail-from-nothing zoom effect could be excused by saying the source they use is actually downsampled to fit the screen, and zooming is just displaying what is there…
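A toy 1D example of that difference (my own sketch, not tied to any particular upscaler): supersampling-style AA takes extra real samples of the scene and averages them, while a plain integer nearest-neighbour upscale just repeats the pixels it already has.

def scene(x: float) -> float:
    """Ground-truth 'image': a hard edge at x = 0.55 (the source of jaggies)."""
    return 1.0 if x >= 0.55 else 0.0

def render(width: int, samples_per_pixel: int = 1) -> list[float]:
    """Sample the scene; several samples per pixel, averaged, is supersampling AA."""
    out = []
    for px in range(width):
        subs = [scene((px + (s + 0.5) / samples_per_pixel) / width)
                for s in range(samples_per_pixel)]
        out.append(sum(subs) / len(subs))
    return out

def upscale_nearest(img: list[float], factor: int) -> list[float]:
    """Integer nearest-neighbour upscale: no new content, the hard edge stays hard."""
    return [v for v in img for _ in range(factor)]

print("native 8 px, 1 spp:", render(8, 1))                      # hard 0/1 step
print("native 8 px, 4 spp:", render(8, 4))                      # edge pixel becomes a grey 0.5
print("4 px upscaled to 8:", upscale_nearest(render(4, 1), 2))  # same hard step, just wider

The 4-sample render ends up with a grey value at the edge because it actually looked at the scene more often there; the upscale can only repeat what the 4-pixel render already had.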
 
AA is an extremely broad term; TAA, SSAA and MSAA are all AA techniques, but they are perceived very differently in practice. In theory, even FSR, DLSS and XeSS can be used as AA: make them render at (or above) your native resolution instead of below it, and you have an "AI" AA.
I believe I've already seen someone talking about DLAA on TechPowerUp, but I've never seen this setting in a game. Going by the name, it should be exactly DLSS used as AA.
In the end I'm a bit confused. :oops:

(attached image)


It kinda helped me a bit.
 
Upscaling is not AA.
It can have the same effect, but it will amplify jaggies when it is integer-value scaling, which then requires further processing to make them not noticeable.
Lanczos (among all the "fancy" upscalers) upscales in non-integer amounts and thus has built-in AA, which "spreads out" the jaggies…

AA is not upscaling. It "cleans up the picture". It in effect smears the pixels, because a stair-stepped line is more noticeable than a blurry line…
and it looks real when moving, because you see blur as speed in real life…
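For anyone curious what that Lanczos "built-in" smoothing actually is: it's a windowed-sinc filter, so a resample at a fractional position blends several neighbouring pixels with the weights below instead of copying a single one. A minimal sketch using the standard textbook kernel, not any specific GPU scaler:

import math

def lanczos(x: float, a: int = 3) -> float:
    """Standard Lanczos kernel with `a` lobes: sinc(x) * sinc(x / a), windowed to |x| < a."""
    if x == 0.0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

def resample_point(src: list[float], pos: float, a: int = 3) -> float:
    """Value at fractional position `pos` in a 1D image, Lanczos-weighted."""
    lo, hi = math.floor(pos) - a + 1, math.floor(pos) + a
    num = den = 0.0
    for i in range(lo, hi + 1):
        if 0 <= i < len(src):
            w = lanczos(pos - i, a)
            num += w * src[i]
            den += w
    return num / den if den else 0.0

# A hard 0/1 edge sampled at a non-integer position comes out as an in-between
# value (roughly 0.38 here) -- that blending is the smoothing being described.
edge = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
print(resample_point(edge, 2.4))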
 
That's interesting.

Yeah, I miss the days of a next-gen lower-tier card offering last-gen higher-tier performance at lower-tier pricing
Sure do.

Agreed, but I fear that people want AMD to release such GPUs in the hope that Ngreedia will cut their prices, just so everyone can give their money to… Ngreedia.
Yep, that's concerning.
 
FACT: Only a subset of a subset of a subset of PC gamers actually cares about RT enough to want to spend significantly more money on an RT-capable card. The vast majority (>85%) would rather buy a card with equivalent raster capability and ZERO RT if it means spending 25-50% less. I've heard the arguments about the necessity of pushing the technology envelope forward while gradually bringing down the cost. That's a stupid argument. No one pays 25-50% more for a car because it has a cool paint job using new paint technology. Nobody. There must be something else driving the RT push other than a few whiny gotta-be-the-besters.
 
You definitely don't need a 4090 to play RT games. It entirely depends on your resolution. I'm playing with RT on my 3060 Ti, for example in Hogwarts.

DLSS only drops quality for AMD users; whoever has an Nvidia card prefers it over native. Heck, I activate it even in games that can easily run natively.

FSR is fine, worse than DLSS, but it's fine. The question is, why would you buy an AMD card for FSR when Nvidia cards also have access to it?
More VRAM
 