Tuesday, June 14th 2022

AMD Plans Late-October or Early-November Debut of RDNA3 with Radeon RX 7000 Series
AMD is planning to debut its next-generation RDNA3 graphics architecture with the Radeon RX 7000 series desktop graphics cards sometime in late October or early November 2022. This is according to Greymon55, a source with a reliable track record of AMD and NVIDIA leaks. We had known about a late-2022 debut for AMD's next-gen graphics, but now we have a finer timeline.
AMD claims that RDNA3 will repeat the feat of over 50 percent generational performance/Watt gains that RDNA2 had over RDNA. The next-generation GPUs will be built on the TSMC N5 (5 nm EUV) silicon fabrication process, and will debut a multi-chip module design similar to that of AMD's processors. The logic dies with the GPU's SIMD components will be built on the most advanced node, while the I/O and display/media accelerators will be located in separate dies that can make do with a slightly older node.
Sources:
Greymon55 (Twitter), VideoCardz
90 Comments on AMD Plans Late-October or Early-November Debut of RDNA3 with Radeon RX 7000 Series
Also, virtually all new TVs are 4K.
Why does it have to be Ultra? You can make very meaningful reductions in settings to gain performance.
The problem with a $499 SRP for the Navi33 8GB model will be the competition.
A cut-down AD104 (I suspect 184 TC vs. 240 of the full chip) will probably be around 26-32% faster than the RTX 3070 (also 184 TC) depending on frequency, and I don't think an RTX 4060 Ti, or whatever NVIDIA calls it, will be priced higher than the GTX 1080 ($499).
So the full Navi33 will have a very similar performance level to the cut-down AD104, and AD104 has 12 GB of memory. Sure, AMD may price it the same (it has done crazier things in the past), but the street price, at least in Europe, will have to be lower in order to sell...
As for 6900 XT and 3080 Ti performance hitting mid-range cards... only if your definition of mid-range is $500 and up, which used to be the high end a few generations ago.
www.techpowerup.com/review/amd-radeon-rx-6900-xt/38.html
Although it is, when one uses crippleware like Control, which even has a separate codepath for AMD GPUs.
Can you explain the high 1366x768 and 1440x900 results?
1920x1080 67.32%
2560x1440 10.49%
1366x768???? 5.89%
3840x2160 2.57%
1440x900???? 2.15%
I keep regular tabs on 4K, and it has only gone up by 1% over the past 4 years in the Steam Hardware Survey. It probably never will become mainstream, because at the same time as faster GPUs get released, games require ever more resources and speed to run.
IMO, the future for tech enthusiasts must entail a radical shift in attitude, making more conscious long-term purchases at higher prices and keeping them for longer as it simply won't be feasible to quickly upgrade to something tangibly better a few years down the road.
The difference between 3840x2160 at 8.3 MPixels and 2560x1600 at 4.1 MPixels is noticeable.
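The pixel-count comparison above is easy to verify with a quick calculation (the `megapixels` helper is just for illustration, not anything from the thread):

```python
def megapixels(width: int, height: int) -> float:
    """Return the pixel count in megapixels, rounded to one decimal place."""
    return round(width * height / 1_000_000, 1)

print(megapixels(3840, 2160))  # 4K UHD  -> 8.3
print(megapixels(2560, 1600))  # WQXGA   -> 4.1
print(megapixels(1920, 1080))  # 1080p   -> 2.1
```

So 4K UHD pushes roughly twice the pixels of 2560x1600, and four times that of 1080p, which is where the GPU cost of the resolution jump comes from.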
What is 4K PRO-UHD and why does it have lower resolution than standard 4K UHD? | BenQ US
My kids game on 1440x900 screens. If they want something better, they can figure out a way to buy a 1080p screen.
Also, Steam is the only one offering statistics with a reasonably large userbase to draw data from. So while it should be taken with a grain of salt, it's somewhat reliable info.
The reality is that 4K won't steamroll anything for at least a few more years, if it ever does. Not to mention, there might be reasons why a 1080p/1440p panel would be preferred over a 4K one. Also, that's a you problem, down to whatever eye condition you might have (whether your sight is basically perfect or the complete opposite) and the distance from your eyes to your monitor (which also has a say in which panel size you'd prefer).
- I never said higher resolutions don't look better (often by quite a lot!)
- I specifically said in the following sentence that in motion, it isn't very noticeably low resolution
- I never said this applied to every person on earth (myself included!)
I did say, and I quote: that it looks "fine". Not good. Not great. But not crap either - unless you either have very good eyesight or are spoiled by higher resolutions - like I am. Heck, personally I would never go below 1440p for my main monitor - but that's not mainly due to gaming, but rather because of the other uses the monitor has. I sort of agree with you about 1080p - my secondary 24" 1080p monitor could definitely stand to have some higher pixel density for what I use it for. But that isn't gaming, and motion resolution is very different from static pixel resolution on LCDs, and is just as dependent on response times as it is on pixel count. It's pretty easy to find a 1080p panel with better motion resolution than a 2160p one.
Does a good 1440p or 2160p monitor look better than a good 1080p one, each running at native resolution? Yes, all else being equal. But for most people, other factors start getting into the equation at that point - which I also covered above - factors of cost, access, processing power, etc. The 1080p monitor you can afford looks better than the 2160p one you can't afford; 1080p high or equivalent at 60+ fps looks quite a lot better than 2160p at low-to-medium 30fps, etc. And, crucially, you can get a good 1080p monitor in the ~$300 range. For gaming, that is. You won't find even a passable 2160p gaming monitor below $700 - below that they're all 60Hz office monitors with slow response times. Which, again, will likely have significantly worse motion resolution than that $300 1080p gaming monitor.
Also, it's downright hilarious to see someone use one of those terrible "this is what resolution looks like" comparison marketing photos in a discussion. Like ... do you honestly think that is representative? Or that it somehow tells me something I'm not familiar with? Heck, they don't even illustrate resolution well in the first place! (And the photoshop work is really lazy!)
I know.
As for 4K, a lot of people are getting their main family TV upgraded to 4K, so adoption is increasing, but Timmy isn't playing CS:GO on that.
Well, 1080p on a 27" monitor does look bad - it's not 1998, so there's no reason to accept it as "fine".
Look at Retina smartphone displays - they exist precisely because you can no longer put a bad, low-quality 600x400 screen on a 5-6" smartphone.
1080p keeps going for two main reasons:
- entrenched market support without any real justification;
- people don't think and don't care; they just wait for something to fall from the heavens...
Personally, I use 1440p, as with 4K on a 27-inch screen everything would be too small for desktop use, and I also feel it's a decent performance/quality trade-off. Note that downscaling from 4K still has nice benefits, for which you don't need a 4K screen. One thing I've also observed is that modern AA techniques, at least prior to DLSS and FSR, are very poor: play Star Ocean 4 on a PS3 and then again on a PS4 or PC, and there are far more visible jaggies due to how poor the AA is, especially on the characters' hair. AA quality is as important to image quality as resolution.
Lew Zealand seems to have got it right: we seem to be transitioning to a market driven by enthusiasts instead of the mainstream. Seeing 120 Hz support on consoles was a real eye-opener, as I don't know anyone personally who gives a damn about it. The take-up of that feature must be really low; I'd be surprised if it's over 10%.