Monday, November 4th 2024
AMD Falling Behind: Radeon dGPUs Absent from Steam's Top 20
As we enter November, Valve has finished processing October's data for its monthly Steam Hardware & Software Survey, which tracks hardware trends across the largest PC gaming community. According to the October figures, AMD's discrete GPUs are not in a great place: not a single AMD-based discrete SKU appears among the top 20 most commonly used GPUs, all of which are NVIDIA parts. There is some movement within AMD's own entries, however. The Radeon RX 580, long the most popular AMD GPU, has been overtaken by the Radeon RX 6600 as the most common choice among AMD gamers, with the RX 6600 now holding 0.98% of the GPU market.
NVIDIA's situation paints a different picture, with all of the top 20 spots occupied by NVIDIA GPUs. The GeForce RTX 3060 remains the most popular at 7.46% of the GPU market, but second place is now held by the GeForce RTX 4060 Laptop GPU at 5.61%. That is a notable change, as this GPU previously sat in third place, right behind the desktop GeForce RTX 4060. Laptop gamers are evidently out in force, pushing the desktop GeForce RTX 4060 down to third place at 5.25% usage.
Source:
Steam Survey
222 Comments on AMD Falling Behind: Radeon dGPUs Absent from Steam's Top 20
It feels like the releases at that price point amount to "let's release something - anything - at the $300 mark so people stop crying about it." Remember that a GTX 1080 was just 57% faster than a GTX 1060; nowadays an RTX 4080 is roughly 2.2 to 2.6 times faster than an RTX 4060, depending on resolution. The low end has become bottom of the barrel.
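To put that widening gap in concrete terms, here's a minimal sketch (Python; the performance indices are illustrative placeholders taken from the percentages quoted above, not measured benchmark data) that converts the two generational gaps into comparable multiples:

```python
# Convert the generational-gap claims above into comparable multiples.
# The index values are illustrative (x60 card normalised to 100), not benchmarks.

def uplift(faster: float, slower: float) -> str:
    """Express how much faster one card is than another."""
    ratio = faster / slower
    return f"{(ratio - 1) * 100:.0f}% faster ({ratio:.2f}x)"

pascal = {"GTX 1060": 100, "GTX 1080": 157}   # ~57% gap, as stated above
ada = {"RTX 4060": 100, "RTX 4080": 240}      # midpoint of the ~2.2-2.6x claim

print("Pascal x60 -> x80:", uplift(pascal["GTX 1080"], pascal["GTX 1060"]))
print("Ada x60 -> x80:   ", uplift(ada["RTX 4080"], ada["RTX 4060"]))
```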
But at the same time, there's a significant group that's past that, and the performance delta between generations pushes people to move up the stack too. There are also emerging markets in the (semi-)pro segment that use high-end GPUs, and they ain't buying Quadros.
It's a bit like what you see in gaming itself: there's a real market for higher-end gaming. It's big enough to push content for (the RT updates in Cyberpunk are the best example), it generates sales all on its own, and it's super high margin for everyone involved - the fools-and-money part of the market, really, no offense intended; these people just buy what they can. And despite being a growth market, it doesn't take sales away from other segments; there are just more people playing more content continuously. In a relative sense, the share of higher-end GPUs is growing.
And the low end (being the current-day x60... talk about perspective, hehe) was always stuck with some weird configuration of specs: recall the GTX 660 with its 1.5 GB + 0.5 GB memory split, the GTX 970 with a similarly weird bus, the RTX 4060 with its abysmal bandwidth, AMD's string of missteps in RDNA 2/3, or the endless rebrand limbo of Pitcairn... something's always gonna give. But they STILL run games.
Gaming at a bazillion FPS at 4K Ultra is not every gamer's dream. For some (for many, I'd argue), 1080p/1440p at Medium is fine if it means keeping a massive chunk of cash in one's pocket for other things in life.
Just to illustrate my point: the price difference between the 4060 and the 4090 is around £1,300-1,500. That's the price of an average cruise ticket. So which one brings more to the table? A once-in-a-lifetime experience around the world, or a different graphics card that plays the exact same games, just faster? Um... dunno. :rolleyes:
4k DLSS Q looks disgustingly better than 1440p native. Period. Obviously, since only nvidia is making an enthusiast card right now.
Well, here we go. This is directly from Leo at KitGuru while he was reviewing an X870E board.
@KitGuruTech replied: "The explanation is that the vast majority of the enthusiast market buys Nvidia and we want our reviews to reflect the likely experience of the end user. Leo"
Sure, but I also think it's an image thing, in both the name (kinda like how Apple products are considered premium regardless of what the truth is) and the fact that they keep holding the crown for 'Most Powerful Gaming GPU'. A lot of things play into it, but I think the image is what sways the people who aren't doing encoding and are mostly gaming.
It's not about the PPI; PPI is irrelevant. An object on a 4K screen will be made up of more than twice as many pixels as on a 1440p screen, no matter what the screen size actually is. You can't extract more detail just by having a similar PPI when you have fewer pixels to work with. No amount of PPI can replace raw pixel count. Even LODs are higher quality when you're running at a higher resolution.
I'm literally explaining the issue to you. Every single object on a 4K screen will be made up of more than twice the number of pixels compared to a 1440p screen, regardless of the screen sizes. You cannot - by definition - get better image quality out of a 1440p image. A spoon made out of 200 pixels on a 1440p screen will be made out of 450 pixels on a 4K screen.
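For reference, the raw arithmetic behind that "more than twice" figure - a minimal Python sketch using the standard resolutions (the spoon numbers above are the commenter's own illustration):

```python
# Total pixel counts for the standard resolutions discussed above.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def total_pixels(name: str) -> int:
    width, height = RESOLUTIONS[name]
    return width * height

for name in RESOLUTIONS:
    print(f"{name}: {total_pixels(name):,} pixels")

# An object covering the same fraction of the screen scales with the total,
# so the ratio of totals is also the ratio of pixels per object.
print("4K vs 1440p:", total_pixels("4K") / total_pixels("1440p"))  # 2.25
print("4K vs 1080p:", total_pixels("4K") / total_pixels("1080p"))  # 4.0
```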
You do need to be a techie to know what makes an image better, else you'll just drag yourself into the PPI race.
You are technically correct (the best kind, I suppose), but what Aus is driving at is that at sizes where both 1440p and 4K would hypothetically hit a PPI that's high enough for the task - say 150+ at typical monitor distance or 300+ for mobile use - the PERCEIVED image quality will be very close, if not indistinguishable. I certainly can't reliably tell the difference between, say, a recent iPhone at 460 PPI and a new Galaxy Ultra at 500+, not in a meaningful way.
E.g.: PPI matters when you're comparing screens of the same resolution, since the one with the higher PPI will appear sharper while having the same image quality. Cross-comparing different resolutions is pointless.
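For the PPI side of the argument, here's a minimal sketch of how PPI falls out of resolution and diagonal size (the monitor sizes are common examples chosen for illustration, not figures from this thread):

```python
# PPI = diagonal resolution in pixels / diagonal size in inches.
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return hypot(width_px, height_px) / diagonal_in

examples = [
    ("27-inch 1440p", 2560, 1440, 27.0),
    ("27-inch 4K",    3840, 2160, 27.0),
    ("32-inch 4K",    3840, 2160, 32.0),
]

for label, w, h, d in examples:
    print(f"{label}: {ppi(w, h, d):.0f} PPI")
```

At the same 27-inch size this works out to roughly 163 PPI for 4K versus about 109 PPI for 1440p, which is why the same-size comparison favours 4K while a size-adjusted comparison can narrow the perceived gap.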
I don't give a damn how many pixels there are as long as it looks better, and that's down to PPI first and foremost. No, and I don't really care, if I'm honest. I'm not talking about pictures and zooming into them; I'm talking about monitor image quality. You're not working on pixels in a raw image when you're gaming, are you?
The Photoshop example was just to demonstrate the concept. Set it to 1:1 pixel view and draw a horizontal line with each pixel having a unique color. On a 1080p monitor you can only get 1,920 different colors in that line; on a 4K screen you can get double that. This is literally what image detail is - how many unique colors you can get. It's no different in games. What do you think happens when you drop a game's resolution from 4K to 1080p? What do you think happens to the image? There were 8.3 million pixels at 4K, and now we're down to about 2.1 million. Do you think those other 6.2 million pixels were doing nothing?
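A toy version of that horizontal-line example, sketched in Python (a naive column-dropping downscale, just to illustrate the counting argument, not how real scalers work):

```python
# One row of pixels where every column holds a unique value,
# then a naive downscale to 1080p width by keeping every second column.
row_4k = list(range(3840))       # 3,840 unique "colors", one per column
row_1080p = row_4k[::2]          # crude nearest-neighbour-style downscale

print("Distinct values at 4K width:   ", len(set(row_4k)))     # 3840
print("Distinct values at 1080p width:", len(set(row_1080p)))  # 1920
```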
I know everyone likes clowning on Apple (for some reason), but their Retina concept isn’t actually just a marketing meme and there was thought put behind it. And most serious researchers, even those noting some imperfections with it, tend to agree on the principle. Yes, it’s called a laptop screen. And yes, the WHOLE GOAL IS IMPROVING PERCEIVED DISPLAY QUALITY, that’s the whole point, not just jacking off to numbers.
Edit: Also, start a separate discussion or something, guys. I just noticed which thread we're in, and this is a derail if I ever saw one.