Monday, November 4th 2024
AMD Falling Behind: Radeon dGPUs Absent from Steam's Top 20
As we enter November, Valve has published the monthly update to its Steam Hardware and Software Survey covering October, showing how hardware trends are shifting in the largest PC gaming community. According to the October data, AMD's discrete GPUs are not in a good place: not a single AMD-based discrete SKU appears among the top 20 most commonly used GPUs, all of which are NVIDIA parts. There is some movement within AMD's own ranks, however. The Radeon RX 580, long the most popular AMD GPU, has been overtaken by the Radeon RX 6600 as the most common choice among AMD gamers; the RX 6600 now holds 0.98% of the GPU market.
NVIDIA's situation paints a different picture, with all top 20 spots occupied by NVIDIA GPUs. The GeForce RTX 3060 remains the most popular GPU at 7.46% of the market, but the number two spot is now held by the GeForce RTX 4060 Laptop GPU at 5.61%. This is a notable change: the laptop chip previously sat in third place, right behind the desktop GeForce RTX 4060. Laptop gamers are out in force, however, pushing the desktop RTX 4060 down to third place at 5.25% usage.
Source:
Steam Survey
222 Comments on AMD Falling Behind: Radeon dGPUs Absent from Steam's Top 20
But each to their own. Yes. But then I conclude what I've always known: people are idiots.
But then it does not enable low-end graphics; it enables higher-end graphics on a console, or higher framerates. So in the end it's only, exclusively, a matter of perspective, I think. The proof is in the pudding: are games actually looking better for X hardware cost, or Y upscaler usage? I'm a big promoter of keeping access to the options, but with the mess that is new engines and their requirement for TAA (to avoid artifacting in almost anything that moves), we're not really keeping access, and the idea is slowly forming in people's minds that you NEED upscaling to make a game look good, when in fact the native experience was already shit to begin with and a sufficient level of blurring hides that efficiently.
There are quite a few ways to look at this, but it's really the perfect storm for computer graphics right now. Total confusion. Between the performance black box that is RT performance (cost vs. gained IQ), upscaling (extra FPS vs. lost IQ), and dynamic rendering, all bets are off with regard to the conclusions one may draw from looking at a video game these days.
Upscalers allow you to target higher output resolutions, which results in better image quality at the same performance. DLSS/FSR is the reason I have a 4K monitor, since DLSS/FSR Quality at 4K looks a lot better than native 1440p with similar performance.
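Some napkin math to back that up. A minimal sketch of the internal render resolutions behind each mode, using the commonly cited per-axis scale factors for DLSS/FSR quality presets (treat the exact factors as assumptions; individual games and versions can override them):

```python
# Back-of-the-envelope internal render resolutions for the usual
# DLSS/FSR quality presets. Scale factors below are the commonly
# cited per-axis defaults (assumptions; games may differ).
OUTPUT = (3840, 2160)  # 4K output target

SCALE = {
    "Quality":           0.667,  # ~2/3 per axis
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 0.333,  # ~1/3 per axis
}

for mode, s in SCALE.items():
    w, h = round(OUTPUT[0] * s), round(OUTPUT[1] * s)
    print(f"{mode:>17}: {w}x{h} internal (~{w * h / 1e6:.1f} MP)")
```

Quality mode at 4K works out to roughly 2560x1440 internal, so you're reconstructing from about 1440p worth of samples onto a 4K grid, which is exactly why it can beat native 1440p on the same panel budget. The flip side: at a 1440p or 3440x1440 output, the same factor pushes the internal resolution below 1080p in height, which lines up with the "worse than native" experience described further down the thread.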
Anyhow, I've strayed way too far from the main topic and I don't have anything else to add, so till next time.
With all due respect, I honestly think people like you are lying to themselves by being happy that DLSS is less shit than 720p when, in fact, nobody wanted to play at 720p in 2024 to begin with (crude example).
Similarly, their attempts at the high end with HBM... first attempt: Fury X. Lots of issues sourcing chips, no OC potential, and the chip wasn't better than competing offerings, while also being stuck with 50% less VRAM than the competition. Their solution for delta compression came far too late, and Nvidia doubled down on it while AMD was pushing Polaris. It is delta compression that allowed Nvidia to avoid Hawaii XT's 512-bit bus (and the looming problem that there's nothing above that, and no faster VRAM either), keeping bus width to 256-bit in everything but their top-end product. But AMD? AMD was happy to keep pushing HBM, which was STILL hard to source, costly, and complicated efficiency clocking too (neither Vega nor Fury could OC worth a damn). Gosh... that failed too. Strange!
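For anyone wondering about the bandwidth math here, a rough sketch. The clocks are the reference specs as I recall them, and the compression gain is the ballpark figure Nvidia claimed for Maxwell, so treat both as assumptions:

```python
# Raw memory bandwidth from bus width and effective data rate:
#   GB/s = (bus_width_bits / 8) * data_rate_Gbps
def bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

# Hawaii XT (R9 290X): wide 512-bit bus, slower 5 Gbps GDDR5
hawaii = bandwidth_gbs(512, 5.0)   # 320 GB/s raw

# GTX 980 (Maxwell): narrow 256-bit bus, faster 7 Gbps GDDR5
gtx980 = bandwidth_gbs(256, 7.0)   # 224 GB/s raw

# Maxwell's delta color compression reportedly recovered roughly
# 25-30% effective bandwidth on typical scenes (assumed figure).
gtx980_eff = gtx980 * 1.28

print(f"Hawaii XT raw:            {hawaii:.0f} GB/s")
print(f"GTX 980 raw:              {gtx980:.0f} GB/s")
print(f"GTX 980 w/ compression:  ~{gtx980_eff:.0f} GB/s effective")
```

Compression narrows the raw gap rather than closing it, but that was the point: Nvidia could ship a cheaper, simpler 256-bit board and still feed the GPU, while AMD kept paying for the 512-bit bus (and later HBM) in cost and power.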
Every time, it is the lack of dedication to push the boundaries further that kills AMD's progress. RDNA4 is more of that, but they say they will push the RT boundary. To what level? Past Nvidia? I hope so, because otherwise they're still behind. It's one step forward and two steps back that way, because by then the gap in raw performance with Nvidia will have pretty much doubled from what it is now.
Same for my laptop; I went for a 1600p display because of FSR.
But yeah, I agree, enough derailment for today. :) That strategy may make them lose market share, but the current strategy (fruitlessly trying hard to compete) makes them lose money. If you ask which of the two I'd rather keep, I'd choose money. Good for you. I'm still just a peasant gaming at 1440 UW, and every single upscaling solution I've seen on my screen is worse than native, and I'm certain that'll never change.
I'll amend what I said above: DLSS/FSR can be a good entry ticket to 4K. Not many people have the money and/or desire to game at 4K, though.
When you leave RT performance on the table for three full generations, that's not doing your damnedest to compete at the high end either, IMHO. They even literally said they'd wait it out until it hit the midrange, back when they launched RDNA. Obviously, if you're all in, you use that time to have a solid RT solution ready by the time you DO need it. And here we are.
If you have good ideas, the money or the market comes anyway. Look at FreeSync. That was a good AMD idea - but even there, it was just using what was already there and pushing it forward a little. Not much money involved. Similarly, G-Sync obviously isn't a very costly solution either; you develop it once and use it ad infinitum. Strategically, AMD won that battle, and it proves that AMD's key values CAN work: affordable & open is where it's at - as long as it doesn't suck.
I'm not sure what you mean by saying 'it sounds more like an MS problem'. Clearly, the lack of RT performance is an AMD problem, because AMD isn't selling GPUs, and they DO make those GPUs to run stuff on the DX12 API. :) Yeah, I remember that was the squeeze. But I think it was a long-term issue for AMD; Nvidia had already been offering less memory bandwidth since Kepler, and the gap kept growing. AMD just kept increasing bus widths, and only started on an efficiency improvement with Tonga, in the rebrandeon age that went literally nowhere.
As you can see, the vast majority of heavy-hitting AAA games are indeed AMD sponsored.
You might want to reconsider that statement ;) Nvidia just sells software performance now, and the vast majority doesn't know what that means.
I was just referring to reviews in general; I didn't see anyone calling the Intel 200 series a "flop" or a failure, yet I know at least one tech channel that called Zen 5 a flop. A page from the company itself promoting games they sponsored - not really a surprise there. I'm definitely not Nvidia's target group, then; imagine buying games to enjoy the story and gameplay itself. Unfortunately, that doesn't seem to be much of a thing anymore, as reviewers promote eye candy over a game actually being enjoyable. Now compare that to the number of Nvidia-sponsored games.
Out of that list, a lot are console-exclusive titles, so it makes sense for AMD to have some sponsored games.
The RT argument is not one I'm a fan of, because it's another one of those new-tech arguments where Nvidia comes out with something and suddenly everyone references it like it's the new standard. Personally, I still see RT as unnecessary or overhyped, especially with how much of a performance drop you get in most high-end games when enabling it. Neither side has it down to where I'd consider turning it on for most games, just to watch my FPS halved in many cases. No doubt Nvidia is better, but of course they are; they started the trend.
I mean, AMD has tried proprietary things in the past, but they never invest enough in getting games to support them. Normally they just open-source it and hope. The problem with Nvidia's proprietary things is that a lot of them have historically been performance killers if you don't have Nvidia hardware, like GameWorks, which I don't think helped anyone other than them.
I agree, RT is way overhyped, and people talk about it like it's the second coming. I have seen some of the best-case scenarios on my friend's 4090 PC, and to be honest I was not that impressed (to be fair, it looked good in Cyberpunk at 4K with settings maxed, but performance was pretty abysmal depending on the scene with DLSS off). I will say I like DLSS a lot more than I like RT. I think many people and places focus on RT way too much.