
AMD Plans Late-October or Early-November Debut of RDNA3 with Radeon RX 7000 Series

btarunr

Editor & Senior Moderator
AMD is planning to debut its next-generation RDNA3 graphics architecture with the Radeon RX 7000 series desktop graphics cards sometime in late-October or early-November 2022. This is according to Greymon55, a reliable source of AMD and NVIDIA leaks. We had known about a late-2022 debut for AMD's next-gen graphics, but now we have a finer timeline.

AMD claims that RDNA3 will repeat the feat of over 50 percent generational performance/Watt gains that RDNA2 had over RDNA. The next-generation GPUs will be built on the TSMC N5 (5 nm EUV) silicon fabrication process, and debut a multi-chip module design similar to AMD's processors. The logic dies with the GPU's SIMD components will be built on the most advanced node, while the I/O and display/media accelerators will be located in separate dies that can make do on a slightly older node.
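For a rough sense of what a 50% perf/W gain would buy, here is a quick back-of-envelope sketch in Python (the numbers are illustrative, not AMD's official figures):

```python
# Back-of-envelope: what a ">50% perf/W gain" implies.
# All numbers are illustrative, not AMD's official figures.

rdna2_perf = 100.0   # relative performance index
rdna2_power = 300.0  # board power in watts (roughly RX 6900 XT territory)
gain = 1.5           # the claimed 50% perf/W improvement

# Same power budget -> more performance:
print(f"At {rdna2_power:.0f} W: {rdna2_perf * gain:.0f}% relative performance")

# Same performance -> less power:
print(f"At {rdna2_perf:.0f}% perf: {rdna2_power / gain:.0f} W")
```

In other words, a true 50% gain means either 1.5x the performance at the same board power, or the same performance at two-thirds the power.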



View at TechPowerUp Main Site | Source
 
Now the question is whether this is a paper launch or a real one.
 
let me save you the suspense, mildly more efficient, with comparable raster performance to the xx70 tier next generation nvidia card, and abysmal ray tracing performance rendering it obsolete from the get-go.

I don't see how "abysmal" ray tracing performance makes it obsolete when 95% of the users don't give a shit about raytracing...
 
let me save you the suspense, mildly more efficient, with comparable raster performance to the xx70 tier next generation nvidia card, and abysmal ray tracing performance rendering it obsolete from the get-go.
Ray tracing? I don't have a single game that uses it.
 
I feel like the top range of both Nvidia and AMD GPUs will be more memery than anything else with their rumored high TDPs.

I'm actually more interested in the middle range.

let me save you the suspense, mildly more efficient, with comparable raster performance to the xx70 tier next generation nvidia card, and abysmal ray tracing performance rendering it obsolete from the get-go.
RT is worthless outside of the top range for the most part since there's no point for it unless you go fully into it. And even then, traditional lighting has improved enough that RT is a waste of energy.
 
Ray Tracing was foisted upon us by NVIDIA looking for more revenue streams beyond HPC, AI, and automotive. Most people, I suspect, would rather have smoother, higher frame rates at 4K+ than useless pretty lighting.
 
The real question is, will it also be an "improvement" of 50% more $$$/performance?
 
The real question is, will it also be an "improvement" of 50% more $$$/performance?
My guess is that if the prices stay as they are, we're lucky.

Additionally, people voted with their wallets, tacitly approving the graphics card price rise.
 
I feel like the top range of both Nvidia and AMD GPUs will be more memery than anything else with their rumored high TDPs.

I'm actually more interested in the middle range.


RT is worthless outside of the top range for the most part since there's no point for it unless you go fully into it. And even then, traditional lighting has improved enough that RT is a waste of energy.
Not if you can convince the developers to cut the better traditional lighting in favor of RT and put some basic shit in instead.
 
Once in a lifetime, I might just switch to AMD, for an RX 7700 XT / Ryzen 7 7700X build with all the 7s. Performance very similar to a 5800X and RX 6900.
People voted and will vote again, but those were the wealthy or desperate upper 20%; the remaining 80% get to vote this round.
 
Not if you can convince the developers to cut the better traditional lighting in favor of RT and put some basic shit in instead.
Traditional lighting isn't better, it's a much poorer approximation. And remember the performance hits you take when you enable AO or sun rays. Traditional lighting also won't do off-screen reflections and a few other things. There's a reason Hollywood productions are all ray-traced ;)

The reason we're not seeing much difference today is AMD's weak RT hardware which forces developers to use very, very few rays if they want their games to run on AMD cards. But this will fix itself, in time.
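For a rough sense of scale, here is a back-of-envelope sketch (illustrative numbers only, not measured figures from any specific game or GPU):

```python
# Back-of-envelope ray budget for real-time RT (illustrative numbers only).

width, height = 2560, 1440  # 1440p
fps = 60
rays_per_pixel = 1          # hybrid RT in games often casts on the order of 1 ray/pixel per effect

rays_per_second = width * height * fps * rays_per_pixel
print(f"{rays_per_second / 1e9:.2f} billion rays/s")  # ~0.22

# Offline film renderers use hundreds to thousands of samples per pixel,
# which is why real-time RT leans so heavily on denoising.
```

Even a single ray per pixel at 1440p60 works out to roughly 0.22 billion rays per second before any bounces, so it's easy to see why games ration rays so aggressively.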
 
let me save you the suspense, mildly more efficient, with comparable raster performance to the xx70 tier next generation nvidia card, and abysmal ray tracing performance rendering it obsolete from the get-go.
They had similar raster performance to Ampere; I'd be surprised if they don't manage to do the same with RDNA 3.
Ray tracing is rubbish, no GPU is capable of running it properly. The only point I see in ray tracing is when it's used by developers to help create fake lighting; otherwise, use the GPU performance to really improve graphics, like way more polygons for realistic models (what's the point of a cube ball reflecting everything?) and better textures.
 
@bug RT works like shit on Nvidia also, so don't blame AMD for stuff not going "forward".
 
Ray tracing? I don't have a single game that uses it.
DLSS is much more important to me; RT I turn off :) However, FSR 2.0 seems close to DLSS.
 
@bug RT works like shit on Nvidia also, so don't blame AMD for stuff not going "forward".
Nvidia can handle more rays (i.e. takes less of a hit when handling the same number of rays), but yes, overall the hardware is not there yet on either side.
 
Using RDNA1 as the baseline, AMD has been pretty close for two generations:

RX 590: 225 W, 58%
RX 5700 XT: 225 W, 100% (just shy of 50% perf/W)
RX 6900 XT: 300 W, 200% (over 50% perf/W)
RX 7900 XT: 450 W, 400% (promised perf/W)

Percentages from TPU reference-card reviews at the time of release (1440p and 4K).
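If you want to check the arithmetic, here is a quick sketch using the figures above (how close each step lands to 50% depends entirely on which performance and power numbers you plug in):

```python
# Sanity-check the perf/W math from the table above.
# perf = TPU relative performance (%), watts = board power.
cards = [
    ("RX 590",     225,  58),
    ("RX 5700 XT", 225, 100),
    ("RX 6900 XT", 300, 200),
    ("RX 7900 XT", 450, 400),  # rumored power and promised performance, not measured
]

prev_eff = None
for name, watts, perf in cards:
    eff = perf / watts  # relative performance per watt
    if prev_eff is not None:
        print(f"{name}: {eff / prev_eff - 1:+.0%} perf/W vs previous card")
    prev_eff = eff
```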
 
Now the question is whether this is a paper launch or a real one.
AMD didn't announce a date, so it can be neither.

Ray Tracing was foisted upon us by NVIDIA looking for more revenue streams beyond HPC, AI, and automotive. Most people, I suspect, would rather have smoother, higher frame rates at 4K+ than useless pretty lighting.
Man, I want both. And I hope this generation both Nvidia and AMD give us all the hardware power to render proper lighting.

DLSS is much more important to me; RT I turn off :) However, FSR 2.0 seems close to DLSS.
I can't say I'm a fan of DLSS or FSR 2.0; however, I completely understand their importance to the gaming community. They certainly have an uphill battle ahead of them...
 
AMD didn't announce a date, so it can be neither.
I think what he meant is: when AMD announces something in Oct/Nov/whenever, will there be immediate availability? Because there wasn't for the previous generation. Obviously, very few people know the answer to that question and they're all under NDAs.
 
let me save you the suspense, mildly more efficient, with comparable raster performance to the xx70 tier next generation nvidia card, and abysmal ray tracing performance rendering it obsolete from the get-go.

You mean like how the 6900XT is 31% faster than the 3070 and 8% faster than the 3080? At least if you're going to troll, do a better job. And Nvidia's RT is also obsolete, merely slightly less so: trading tiny visual changes for whopping FPS drops.
 
I don't see how "abysmal" ray tracing performance makes it obsolete when 95% of the users don't give a shit about raytracing...

Well said, and I agree 100%

Remember when Nvidia PhysX was all the hype? It made games unplayable (I prefer high-refresh gaming for ultimate smoothness).

eh it's w.e

I like both companies, but I really hope I can get my hands on a high-end RDNA3 GPU this winter. It may be the last build I do in the next ten years if my life plans keep going the way they have been (I may be getting married soon) :rockout:
 
I don't see how "abysmal" ray tracing performance makes it obsolete when 95% of the users don't give a shit about raytracing...

It has its place, but there are areas where hardware resources can still be put to greater use, like bump-mapping quality, ambient occlusion at higher geometry detail, and longer LOD distances. Also better animation quality, which requires higher frame rates that you're not getting through RTRT anytime soon.
 
let me save you the suspense, mildly more efficient, with comparable raster performance to the xx70 tier next generation nvidia card, and abysmal ray tracing performance rendering it obsolete from the get-go.
Man, that's one of the lazier attempts at trolling I've seen lately. Come on, at least make an effort!



To me this sounds like a reasonable time frame (late in the year, but in time for the holidays if there is stock), but of course this is all to be taken with a huge helping of salt. It'll be fun to see how performance pans out in the next generation though. The rumors are all over the place, but all seem to promise more performance (just at wildly varying power costs), so it'll be interesting to see it all play out.
 