Tuesday, June 14th 2022

AMD Plans Late-October or Early-November Debut of RDNA3 with Radeon RX 7000 Series

AMD is planning to debut its next-generation RDNA3 graphics architecture with the Radeon RX 7000 series desktop graphics cards sometime in late October or early November 2022. This is according to Greymon55, a reliable source of AMD and NVIDIA leaks. We already knew about a late-2022 debut for AMD's next-gen graphics, but now we have a finer timeline.

AMD claims that RDNA3 will repeat the feat of over 50 percent generational performance-per-Watt gains that RDNA2 posted over RDNA. The next-generation GPUs will be built on the TSMC N5 (5 nm EUV) silicon fabrication process, and will debut a multi-chip module design similar to that of AMD's processors. The logic dies containing the GPU's SIMD components will be built on the most advanced node, while the I/O and display/media accelerators will sit on separate dies that can make do with a slightly older node.
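For a rough sense of scale, if both claims hold, two consecutive 50 percent performance-per-Watt jumps compound, which at the same board power would put RDNA3 at roughly 2.25 times RDNA's performance. A minimal sketch of that arithmetic (the 1.5x factors below are the claimed gains, not measured figures):

    # Back-of-the-envelope math for AMD's perf/W claims; the 1.5x factors are
    # the claimed ">50%" generational gains, not measured data.
    rdna2_over_rdna = 1.50   # RDNA2's claimed perf/W gain over RDNA
    rdna3_over_rdna2 = 1.50  # RDNA3's claimed perf/W gain over RDNA2

    # At identical board power, a perf/W gain translates directly into performance.
    print(f"RDNA3 vs RDNA at iso-power: {rdna2_over_rdna * rdna3_over_rdna2:.2f}x")  # 2.25x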
Sources: Greymon55 (Twitter), VideoCardz

90 Comments on AMD Plans Late-October or Early-November Debut of RDNA3 with Radeon RX 7000 Series

#1
ixi
Imagine the shock when they delay it. :D
#2
Timelessest
Now the question is whether this is a paper launch or a real one.
#3
Gungar
fancucker: let me save you the suspense, mildly more efficient, with comparable raster performance to the xx70 tier next generation nvidia card, and abysmal ray tracing performance rendering it obsolete from the get-go.
I don't see how "abysmal" ray tracing performance makes it obsolete when 95% of the users don't give a shit about raytracing...
#4
mechtech
fancucker: let me save you the suspense, mildly more efficient, with comparable raster performance to the xx70 tier next generation nvidia card, and abysmal ray tracing performance rendering it obsolete from the get-go.
Ray tracing. I don’t have one game with that.
#5
windwhirl
I feel like the top range of both Nvidia and AMD GPUs will be more memery than anything else with their rumored high TDPs.

I'm actually more interested in the middle range.
fancucker: let me save you the suspense, mildly more efficient, with comparable raster performance to the xx70 tier next generation nvidia card, and abysmal ray tracing performance rendering it obsolete from the get-go.
RT is worthless outside of the top range for the most part since there's no point for it unless you go fully into it. And even then, traditional lighting has improved enough that RT is a waste of energy.
#6
Leiesoldat
lazy gamer & woodworker
Ray tracing was foisted upon us by NVIDIA looking for more revenue streams beyond HPC, AI, and automotive. Most people, I suspect, would rather have smoother, higher frame rates at 4K+ than useless pretty lighting.
#7
bug
The real question is, will it also be an "improvement" of 50% more $$$/performance?
#8
windwhirl
bug: The real question is, will it also be an "improvement" of 50% more $$$/performance?
My guess is that if prices stay as they are, we're lucky.

Additionally, people voted with their wallets, tacitly approving the graphics card price rise.
#9
Kohl Baas
windwhirl: I feel like the top range of both Nvidia and AMD GPUs will be more memery than anything else with their rumored high TDPs.

I'm actually more interested in the middle range.

RT is worthless outside of the top range for the most part since there's no point for it unless you go fully into it. And even then, traditional lighting has improved enough that RT is a waste of energy.
Not if you can convince the developers to cut the better traditional lighting in favor of RT and put in some basic shit instead.
#10
ppn
Once in a lifetime, I might just switch to AMD, for an RX 7700 XT / Ryzen 7 7700X build with 7 7s. Performance very similar to the 5800X and RX 6900.
People voted, and will vote again, but those were the wealthy or desperate upper 20%; the remaining 80% get to vote this round.
#11
bug
Kohl Baas: Not if you can convince the developers to cut the better traditional lighting in favor of RT and put in some basic shit instead.
Traditional lighting isn't better, it's a much poorer approximation. And remember the performance hits you take when you enable AO or sun rays. Traditional lighting also won't do off-screen reflections and a few other things. There's a reason Hollywood productions are all ray-traced ;)

The reason we're not seeing much difference today is AMD's weak RT hardware which forces developers to use very, very few rays if they want their games to run on AMD cards. But this will fix itself, in time.
#12
Unregistered
fancucker: let me save you the suspense, mildly more efficient, with comparable raster performance to the xx70 tier next generation nvidia card, and abysmal ray tracing performance rendering it obsolete from the get-go.
They had similar raster performance to Ampere; I'd be surprised if they don't manage to do the same with RDNA 3.
Ray tracing is rubbish, no GPU is capable of running it properly. The only point I see in ray tracing is developers using it to help create fake lighting; otherwise, use the GPU performance to really improve graphics, like way more polygons for realistic models (what's the point of a cube ball reflecting everything?) and better textures.
#13
Thimblewad
@bug RT works like shit on Nvidia also, so don't blame AMD for stuff not going "forward".
#14
Taraquin
mechtech: Ray tracing. I don’t have one game with that.
DLSS is much more important for me; RT I turn off :) However, FSR 2.0 seems close to DLSS.
#15
Daven
fancucker: let me save you the suspense, mildly more efficient, with comparable raster performance to the xx70 tier next generation nvidia card, and abysmal ray tracing performance rendering it obsolete from the get-go.
What is ray tracing? :)
#16
bug
Thimblewad: @bug RT works like shit on Nvidia also, so don't blame AMD for stuff not going "forward".
Nvidia can handle more rays (i.e. takes less of a hit when handling the same number of rays), but yes, overall the hardware is not there yet on either side.
#17
Daven
Using RDNA1 as the baseline, AMD has been pretty close for two generations:

RX 590: 225 W, 58%
RX 5700 XT: 225 W, 100% (just shy of 50% perf/W)
RX 6900 XT: 300 W, 200% (over 50% perf/W)
RX 7900 XT: 450 W, 400% (promised perf/W)

Percentages are from TPU reference card reviews at the date of release (2.5K and 4K resolutions).
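If you want to sanity-check those ratios yourself, perf/W here is just the performance index divided by board power. A rough sketch using the figures above as inputs (the RX 7900 XT row is rumor, not measurement):

    # Relative perf/W from (TPU performance index %, board power W). Inputs are
    # the rough figures above; the RX 7900 XT row is a rumored number.
    cards = [
        ("RX 590", 58, 225),
        ("RX 5700 XT", 100, 225),
        ("RX 6900 XT", 200, 300),
        ("RX 7900 XT (rumored)", 400, 450),
    ]
    prev = None
    for name, perf, watts in cards:
        eff = perf / watts  # performance points per watt
        note = f" ({eff / prev - 1:+.0%} perf/W vs previous)" if prev else ""
        print(f"{name}: {eff:.2f} pts/W{note}")
        prev = eff

By that math the 6900 XT lands right at +50% over the 5700 XT; whether the rumored 7900 XT figures hold up is anyone's guess.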
#18
Guwapo77
Timelessest: Now the question is whether this is a paper launch or a real one.
AMD didn't announce a date, so it can be neither.
Leiesoldat: Ray tracing was foisted upon us by NVIDIA looking for more revenue streams beyond HPC, AI, and automotive. Most people, I suspect, would rather have smoother, higher frame rates at 4K+ than useless pretty lighting.
Man, I want both. And I hope this generation both Nvidia and AMD give us all the hardware power to render proper lighting.
Taraquin: DLSS is much more important for me; RT I turn off :) However, FSR 2.0 seems close to DLSS.
I can't say I'm a fan of DLSS or FSR 2.0; however, I completely understand their importance to the gaming community. They certainly have an uphill battle ahead of them...
#19
bug
Guwapo77: AMD didn't announce a date, so it can be neither.
I think what he meant is: when AMD announces something in Oct/Nov/whenever, will there be immediate availability? Because there wasn't for the previous generation. Obviously, very few people know the answer to that question and they're all under NDAs.
#20
Lew Zealand
fancucker: let me save you the suspense, mildly more efficient, with comparable raster performance to the xx70 tier next generation nvidia card, and abysmal ray tracing performance rendering it obsolete from the get-go.
You mean like how the 6900XT is 31% faster than the 3070 and 8% faster than the 3080? At least if you're going to troll, do a better job. And Nvidia's RT is also obsolete, merely slightly less so: trading tiny visual changes for whopping FPS drops.
#21
Space Lynx
Astronaut
Gungar: I don't see how "abysmal" ray tracing performance makes it obsolete when 95% of the users don't give a shit about raytracing...
Well said, and I agree 100%

Remember when Nvidia PhysX was all the hype? It made games unplayable (I prefer high-refresh gaming for ultimate smoothness).

Eh, it's whatever.

I like both companies, but I really hope I can get my hands on a high-end RDNA3 GPU this winter. It may be the last build I do in the next ten years if my life plans keep going the way they have been (I may be getting married soon) :rockout:
#22
InVasMani
Gungar: I don't see how "abysmal" ray tracing performance makes it obsolete when 95% of the users don't give a shit about raytracing...
It has its place, but there are areas where hardware resources can still be put to greater use, like bump mapping quality, ambient occlusion at higher geometry detail, and longer LOD distances. Also better animation quality, which requires higher frame rates that you're not getting through RTRT anytime soon.
#23
Valantar
fancucker: let me save you the suspense, mildly more efficient, with comparable raster performance to the xx70 tier next generation nvidia card, and abysmal ray tracing performance rendering it obsolete from the get-go.
Man, that's one of the lazier attempts at trolling I've seen lately. Come on, at least make an effort!

To me this sounds like a reasonable time frame (late in the year, but in time for the holidays if there is stock), but of course this is all to be taken with a huge helping of salt. It'll be fun to see how performance pans out in the next generation though. The rumors are all over the place, but all seem to promise more performance (just at wildly varying power costs), so it'll be interesting to see it all play out.
#24
Timelessest
bug: I think what he meant is: when AMD announces something in Oct/Nov/whenever, will there be immediate availability? Because there wasn't for the previous generation. Obviously, very few people know the answer to that question and they're all under NDAs.
Exactly, nowadays when AMD, Intel, and NVIDIA announce something there's zero availability for a good amount of time. So, it's possible that they announce it in November, but we will only find it in stores in December.
#25
Valantar
Timelessest: Exactly, nowadays when AMD, Intel, and NVIDIA announce something there's zero availability for a good amount of time. So, it's possible that they announce it in November, but we will only find it in stores in December.
It's extremely unlikely for there to be a launch event without stock this late in the year, simply because of the holiday shopping season and the subsequent Q1 lull. That would be a very, very poor business strategy, even if the product can stand on its own merits - most people will have very little money to spend on luxuries at that point. This leads me to interpret "launch" in the source here as the actual launch date, not the announcement date (which in recent years has tended to come a couple of months earlier).