Monday, November 4th 2024

AMD Falling Behind: Radeon dGPUs Absent from Steam's Top 20

As we enter November, Valve has finished processing October's data for its monthly Steam Hardware and Software Survey, showcasing trend changes in the largest PC gaming community. According to the October data, AMD's discrete GPUs are not exactly in the best place: not a single AMD-based discrete SKU appears among the top 20 most commonly used GPUs; every discrete entry on the list is an NVIDIA model. There is some movement among AMD's own entries, however. The Radeon RX 580, previously the most popular AMD GPU, has just been bested by the Radeon RX 6600 as the most common choice for AMD gamers. The AMD Radeon RX 6600 now holds 0.98% of the GPU market.

NVIDIA's situation paints a different picture, as the top 20 spots are all occupied by NVIDIA GPUs. The GeForce RTX 3060 remains the most popular GPU at 7.46% of the market, but the number-two spot is now held by the GeForce RTX 4060 Laptop GPU at 5.61%. This is an interesting change, since this GPU was previously in third place, right behind the regular desktop GeForce RTX 4060. Laptop gamers are in abundance, however, and they are showing their strength, pushing the desktop GeForce RTX 4060 into third place at 5.25% usage.
Source: Steam Survey

222 Comments on AMD Falling Behind: Radeon dGPUs Absent from Steam's Top 20

#176
JustBenching
Vayra86If you want to do some half serious, but even casual gaming on a PC that is more than browser activity you are probably getting into something along the x50ti-x60 range of price/GPU. So that is the 250-350 dollar segment; not the glorified IGP segment. I build the occasional system for that target demographic and you generally end up in that segment for GPU, higher is deemed too expensive and 'doesn't pay off' for these people. They don't care about graphics at all, they just want it to run and not look completely shit and rarely have over $1k to spend.

And believe it or not but this is precisely what AMD and Nvidia target for mass markets, its also precisely what Steam surveys show as the most prevalent 'real' gaming GPUs that aren't IGPs. The overwhelming majority of gamers aren't graphics whores, but people who want to just play games.
From personal experience, my mobile 6700s (which is basically a power limited - underclocked 6600xt) is perfectly fine for exactly what you described: playing games without them looking like complete ass. And sure, that $250 price range used to be the bulk of sales and profits, but is it still? Going back to the Steam survey, the 4060 is 5.25% of the market, while the 4090 is 1.2%. Isn't nvidia making a lot more bang from that 1.2%? I don't know; I'd think their margins are much bigger there, but I ain't an expert, could be wrong.

It feels like the releases at that price point are something like "let's release something - anything at the 300$ mark so we stop them from crying about it". Remember that a 1080 was just 57% faster than a 1060, nowadays a 4080 is ~2.2 to 2.6 times faster than a 4060 (depending on resolution). The low end has become bottom of the barrel.
Posted on Reply
#177
Vayra86
fevgatosFrom personal experience, my mobile 6700s (which is basically a power limited - underclocked 6600xt) is perfectly fine for exactly what you described: playing games without them looking like complete ass. And sure, that $250 price range used to be the bulk of sales and profits, but is it still? Going back to the Steam survey, the 4060 is 5.25% of the market, while the 4090 is 1.2%. Isn't nvidia making a lot more bang from that 1.2%? I don't know; I'd think their margins are much bigger there, but I ain't an expert, could be wrong.
I think there's two things going on. The basic premise still is that people who get into pc gaming buy (Lower) midrange.

But at the same time, there's a significant group that's past that and there's a performance delta between generations that pushes people to move up in the stack too. There's also emerging markets in the (semi) pro segment that use high end gpus and they ain't buying Quadros.

It's a bit like what you see in gaming itself: there's a real market for higher-end gaming. It's big enough to push content for (the RT updates in Cyberpunk are the best example), it generates sales all on its own, and those sales are super high margin for all involved - the fools-and-money part of the market, really, no offense intended; these people just buy what they can. And despite being a growth market, it doesn't take any sales away from other segments; there are just more people playing more content continuously. In a relative sense, the amount of higher-end GPUs is growing.
Posted on Reply
#178
AusWolf
fevgatosFrom personal experience, my mobile 6700s (which is basically a power limited - underclocked 6600xt) is perfectly fine for exactly what you described: playing games without them looking like complete ass. And sure, that $250 price range used to be the bulk of sales and profits, but is it still? Going back to the Steam survey, the 4060 is 5.25% of the market, while the 4090 is 1.2%. Isn't nvidia making a lot more bang from that 1.2%? I don't know; I'd think their margins are much bigger there, but I ain't an expert, could be wrong.
5.25% is 4.3x more than 1.2% while the price of the 4090 is roughly 5.3x higher than the 4060. I don't know about the differences in profit margins, but the money moved seems to be similar.
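That back-of-the-envelope comparison is easy to check (a sketch only; the $299 and $1,599 launch MSRPs are my assumption, and share × price is a crude revenue proxy, not actual sales data):

```python
# Steam-share figures from the thread; MSRPs are assumed US launch prices.
share_4060, msrp_4060 = 5.25, 299    # desktop RTX 4060
share_4090, msrp_4090 = 1.20, 1599   # RTX 4090

print(share_4060 / share_4090)       # ~4.4x the install base
print(msrp_4090 / msrp_4060)         # ~5.3x the price
# Crude "money moved" proxy: share * price (arbitrary units)
print(share_4060 * msrp_4060)        # 1569.75
print(share_4090 * msrp_4090)        # ~1918.8
```

On those assumptions the two products do move a similar amount of money, with the 4090 slightly ahead before margins are even considered.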
fevgatosIt feels like the releases at that price point are something like "let's release something - anything at the 300$ mark so we stop them from crying about it".
The vast majority of PC gamers disagree with you.
Posted on Reply
#179
Vayra86
fevgatosIt feels like the releases at that price point are something like "let's release something - anything at the 300$ mark so we stop them from crying about it". Remember that a 1080 was just 57% faster than a 1060, nowadays a 4080 is ~2.2 to 2.6 times faster than a 4060 (depending on resolution). The low end has become bottom of the barrel.
This I think is really a matter of perspective. If you've travelled to the top end of the GPU stack and seen it all, a lot of the lower end stuff looks pretty shitty, but then we're just spoiled, really, because those lower end cards DO run games fine. I mean let's consider for a moment the amount of content a Steam Deck will run on a mere APU at 15W. And it sells. Oh boy does it sell.

And low end (being current day x60... talking bout perspective hehe) was always stuck with some weird configuration of specs: let's recall the GTX 660 with 1.5GB + 0.5GB; the 970 with a similar weird bus; the 4060 with its abysmal bandwidth... AMD's set of failures in RDNA2/3 or the endless rebrand/limbo of Pitcairn... something's always gonna give. But they STILL run games.
Posted on Reply
#180
JustBenching
AusWolf5.25% is 4.3x more than 1.2% while the price of the 4090 is roughly 5.3x higher than the 4060. I don't know about the differences in profit margins, but the money moved seems to be similar.


The vast majority of PC gamers disagree with you.
Well didn't you say the vast majority of people are idiots or something? So doesn't really matter, does it? :D
Posted on Reply
#181
AusWolf
Vayra86This I think is really a matter of perspective. If you've travelled to the top end of the GPU stack and seen it all, a lot of the lower end stuff looks pretty shitty,
I disagree with that too because:
Vayra86but then we're just spoiled, really, because those lower end cards DO run games fine. I mean let's consider for a moment the amount of content a Steam Deck will run on a mere APU at 15W. And it sells. Oh boy does it sell.
Exactly. How many times I've upgraded my hardware to the highest end with great enthusiasm only to have a reaction like "meh" when I saw it in action. Then I sold my stuff and bought something midrange.

Gaming at a bazillion FPS at 4K Ultra is not the dream of every gamer. For some (for many, I'd argue), 1080/1440p Medium is fine if it means leaving some massive cash in one's pocket for other things in life.
Posted on Reply
#182
Vayra86
AusWolfI disagree with that too because:

Exactly. How many times I've upgraded my hardware to the highest end with great enthusiasm only to have a reaction like "meh" when I saw it in action. Then I sold my stuff and bought something midrange.

Gaming at a bazillion FPS at 4K Ultra is not the dream of every gamer. For some (for many, I'd argue), 1080/1440p Medium is fine if it means leaving some massive cash in one's pocket for other things in life.
There's always a sweet spot for gaming, and its generally at or around what consoles do at the moment. I agree with you. Its much better playing at that sweet spot because you'll have the best bang for buck, good support, no early adopting nonsense... etc.
Posted on Reply
#183
AusWolf
fevgatosWell didn't you say the vast majority of people are idiots or something? So doesn't really matter, does it? :D
Touché. :D Anyway, my point stands that it's not a "let's just release something" category. People do care about their money (this isn't what makes them idiots, though).
Vayra86There's always a sweet spot for gaming, and its generally at or around what consoles do at the moment. I agree with you. Its much better playing at that sweet spot because you'll have the best bang for buck, good support, no early adopting nonsense... etc.
Not to mention better thermals, sensible power consumption and a size that fits into your chassis. :)

Just to illustrate my point: the price difference between the 4060 and the 4090 is around £1,300-1,500. That's the price of an average cruise ticket. So which one brings more to one's table? A once-in-a-lifetime experience around the world, or a different graphics card that plays the exact same games just faster? Um... dunno. :rolleyes:
Posted on Reply
#184
LittleBro
OnasiAt OG MSRP considering the 24 gigs of new and expensive GDDR6X and it being a halo card (those are by default bad value)? Nah, it was okay. Transistor per dollar isn’t everything, which is why I am not too hot on you using it as a metric.
What it did end up selling at eventually in real world is another matter entirely.
Since I'm unable to edit my previous post anymore, I'm posting the update here, with more accurate calculations below. By the way, the transistors-per-dollar metric (while being my quick calculation idea) is not complete bullshit and does say something about manufacturers' margins. But here we go again: 24 GB of GDDR6X VRAM seems to be like $600 more expensive than 16 GB, right? Definitely not. Just look at the RTX 4090: same amount of memory, much higher performance, and yet a somewhat similar MSRP to the 3090 and much lower than the 3090 Ti, achieving that despite the ever-rising complexity of production and other things to blame.



Bottom line: We end up paying higher amounts of money for GPUs, but we also keep getting more hardware for it.
Maybe that's what explains Jensen's famous motto: "The more you buy, the more you save." Anyway, don't take those calculations too seriously.

yfn_ratchetMind you everyone, Steam Hardware Survey also contains a huge amount of:
  1. OEM Prebuilts
  2. Chinese PCs
  3. Both.
And in those markets, people generally only want something 'good enough' and will likely be playing primarily esports titles or grindfests (looking at you, War Thunder). Green sells way better than red, because the popular perception of 'Nvidia is the gamer's choice' is still prevalent. They'd have to actually massively flop for that to change. Ada was just disappointing.

Were you to ask PC enthusiasts exclusively, I imagine far more 6800XTs, 7800XTs, and 7900XTXs would turn up.
And as was already pointed out, not everyone participates in the survey. For a hardware survey, one would assume that the bigger the data pool, the better, but Valve apparently thinks otherwise. It is strange that only chosen ones participate each month. It's like trying to evaluate the average final mark in a particular subject for a class of 25 students by picking only 10 of them; those 10 might be the better or the worse students, which renders the statistic badly inaccurate due to the poor sample.
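The classroom analogy is easy to simulate (a toy sketch with made-up marks, not real survey data):

```python
# Toy version of the classroom analogy: estimate a class average of 25
# made-up marks from random samples of 10 and see how far it can drift.
import random
import statistics

random.seed(42)                                    # reproducible toy data
marks = [random.randint(1, 5) for _ in range(25)]  # 25 students, marks 1-5
true_mean = statistics.mean(marks)

sample_means = [statistics.mean(random.sample(marks, 10))
                for _ in range(10_000)]
worst_error = max(abs(m - true_mean) for m in sample_means)
print(f"true mean {true_mean:.2f}, worst 10-student estimate off by {worst_error:.2f}")
```

The spread shrinks as the sample grows, which is exactly the complaint: a small opt-in slice can land on an unrepresentative subset.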
fevgatosAs I've said, I've bought a 4k monitor cause with dlss I can get the same performance to a 1440p monitor but with much higher image quality. . I was literally between the 27" 240hz woled and the 32" 4k woled, and I went for the latter because of dlss. It just looks better with hardly any performance sacrifice.
With DLSS, you can get the same performance on a 4K monitor as with a 1440p one, and even with much higher image quality? Does having distorted and guessed frames improve image quality? From the beginning, DLSS and similar stuff has always been about sacrificing image quality for performance. Your thinking seems familia... Oh, it's you again ...
Posted on Reply
#185
kapone32
Well here we go. This is directly from Leo at Kitguru while he was reviewing a X870E board.

@KitGuruTech replied: "The explanation is that the vast majority of the enthusiast market buys Nvidia and we want our reviews to reflect the likely experience of the end user. Leo"
Posted on Reply
#186
JustBenching
LittleBroWith DLSS, you can get the same performance on a 4K monitor as with a 1440p one, and even with much higher image quality? Does having distorted and guessed frames improve image quality? From the beginning, DLSS and similar stuff has always been about sacrificing image quality for performance. Your thinking seems familia... Oh, it's you again ...
Funny thing cause I checked my eyes 2 weeks ago, 12 / 10 (24/20 if you are a US resident). Flawless vision.

4k DLSS Q looks disgustingly better than 1440p native. Period.
kapone32Well here we go. This is directly from Leo at Kitguru while he was reviewing a X870E board.

@KitGuruTech replied: "The explanation is that the vast majority of the enthusiast market buys Nvidia and we want our reviews to reflect the likely experience of the end user. Leo"
Obviously, since only nvidia is making an enthusiast card right now.
Posted on Reply
#187
kapone32
fevgatosFunny thing cause I checked my eyes 2 weeks ago, 12 / 10 (24/20 if you are a US resident). Flawless vision.

4k DLSS Q looks disgustingly better than 1440p native. Period.

Obviously, since only nvidia is making an enthusiast card right now.
There is no graphical difference between 4K High and 4K Ultra, but the CPU works much harder at High than at Ultra. Try Cities: Skylines II and see what I mean.

Posted on Reply
#188
GhostRyder
TheinsanegamerNAMD has long needed to purge most of their obsolete management system and get newer, younger, more driven people in charge. One of the worst things they did was NOT purge ATi employees in 2006. They never integrated well with AMD employees.

Streamers go with nvidia because of encoding. That's it. We like to meme about all the things nvidia does better than AMD that nobody uses, like encoding, but for streamers NVENC is an absolute game changer, especially for HD or 4k streams or anything high bitrate. AMD simply doesn't compete there, and that's the use case where it makes way more sense.
I agree, they really need to do some overhauling on the people running Radeon.

Sure, but I also think it's an image thing, both in the name (kinda like how Apple products are considered premium regardless of what the truth is) and in the fact that they keep holding the crown for 'Most Powerful Gaming GPU'. I think a lot of things play into it, but the image is what sells to the people who aren't doing encoding and are mostly gaming.
Posted on Reply
#189
AusWolf
fevgatosFunny thing cause I checked my eyes 2 weeks ago, 12 / 10 (24/20 if you are a US resident). Flawless vision.

4k DLSS Q looks disgustingly better than 1440p native. Period.
Do you have a 1440p monitor to test for sure? In my experience, any resolution lower than your monitor's native looks like ass. No wonder a higher res with DLSS looks better.
kapone32Well here we go. This is directly from Leo at Kitguru while he was reviewing a X870E board.

@KitGuruTech replied: "The explanation is that the vast majority of the enthusiast market buys Nvidia and we want our reviews to reflect the likely experience of the end user. Leo"
So who is the end user? Enthusiasts? Or regular people who just want to run games?
Posted on Reply
#190
JustBenching
AusWolfDo you have a 1440p monitor to test for sure? In my experience, any resolution lower than your monitor's native looks like ass. No wonder a higher res with DLSS looks better.
I have 6 monitors, yeah
Posted on Reply
#191
AusWolf
fevgatosI have 6 monitors, yeah
Similar PPI?
Posted on Reply
#192
JustBenching
AusWolfSimilar PPI?
They can't be similar man, in order for a 1440p monitor to have similar ppi to a 32" 4k it needs to be what, 24"? Does that even exist?

It's not about the PPI; ppi is irrelevant. An object on a 4k screen will be made up of more than twice as many pixels as on a 1440p screen, no matter what the screen size actually is. You can't extract more detail just by having similar PPI, simply because you have fewer pixels to work with. No amount of PPI can replace raw pixel count. Even LODs are higher quality when you are running at higher res.
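The exact ratio behind this claim is 2.25×, and it really is independent of panel size (a quick check):

```python
# Resolution alone fixes the pixel count; panel size never enters into it.
uhd = 3840 * 2160   # 4K UHD: 8,294,400 pixels
qhd = 2560 * 1440   # 1440p QHD: 3,686,400 pixels
print(uhd / qhd)    # 2.25
```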
Posted on Reply
#193
AusWolf
fevgatosThey can't be similar man, in order for a 1440p monitor to have similar ppi to a 32" 4k it needs to be what, 24"? Does that even exist?

It's not about the PPI, ppi is irrelevant. An object in a 4k screen will be made up of twice as many pixels than on a 1440p screen, no matter what the screen size actually is. You can't extract more detail just by having similar PPI simply because you have less pixels to work with. No amount of PPI can replace raw pixel count. Even lods are higher quality when you are running higher res.
I don't mean to be rude, but that's bullshit. PPI and colour accuracy are the main deciding factors that determine your image quality. Higher PPI means more pixels in a given area which results in a crisper image. That's monitor basics, man, you don't even need to be a techie to know this. Why else do you think a 1080p phone screen looks better than a 27" 1080p monitor?
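For the record, PPI follows directly from the panel specs (a sketch; the 6.1-inch 1080×2340 phone is a hypothetical but typical example, not anyone's actual device):

```python
# PPI = diagonal pixel count / diagonal size in inches
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(ppi(1920, 1080, 27))   # 27" 1080p desktop monitor: ~82 ppi
print(ppi(1080, 2340, 6.1))  # typical 6.1" 1080p-class phone: ~422 ppi
```

A fivefold PPI gap at the same resolution class is why the phone looks so much crisper up close.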
Posted on Reply
#194
JustBenching
AusWolfI don't mean to be rude, but that's bullshit. PPI and colour accuracy are the main deciding factors that determine your image quality. Higher PPI means more pixels in a given area which results in a crisper image. That's monitor basics, man, you don't even need to be a techie to know this. Why else do you think a 1080p phone screen looks better than a 27" 1080p monitor?
Uhm, no. A phone screen looks better cause you don't have good enough vision to notice that it really doesn't (I don't either; not a personal attack).

I'm literally explaining the issue to you. Every single object on a 4k screen will be made up of more than twice the amount of pixels compared to a 1440p screen, regardless of the screen sizes. You cannot - by definition - have better image quality on a 1440p image. A spoon on a 1440p screen will be made out of 200 pixels; on a 4k screen it will be made out of 450 pixels.

You do need to be a techie to know what makes an image better, else you'll just drag yourself into the ppi race.
Posted on Reply
#195
AusWolf
fevgatosUhm, no. A phone screen looks better cause you don't have good enough vision to notice that it really doesn't (I don't either, not a personal attack).

Im literally explaining the issue to you. Every single object on a 4k screen will be made up of more than twice the amount of pixels compared to a 1440p screen, regardless of the screen sizes. You cannot - by definition - have better image quality on a 1440p image. A spoon on a 1440p screen will be made out of 200 pixels, on a 4k screen it will be made out of 450 pixels.

You do need to be a techie to know what makes an image better, else youll just drag yourself into the ppi race.
Then why does text on my 1200x540 phone screen appear a lot sharper, even when I'm looking at it from a distance of 5 cm, than my 1440p ultrawide monitor from half a metre away? You're talking nonsense, my dude. Have you not noticed that your photos look a lot better on your phone than on the big screen TV?
Posted on Reply
#196
Onasi
@fevgatos
You are technically correct (the best type, I suppose), but what Aus is driving at is that at sizes where hypothetically both 1440p and 4K would hit a certain ppi that is high enough for the task, like 150+ for typical monitor distance or 300+ for mobile usage, the PERCEIVED image quality will be very close, if not indistinguishable. I certainly can't reliably tell the difference between, say, a recent iPhone with 460 ppi and a new Galaxy Ultra with 500+, not in a meaningful way.
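The "high enough for the task" threshold can be put in numbers with the usual pixels-per-degree estimate (a sketch; the 24-inch viewing distance is my assumption, and ~60 ppd is the commonly cited limit for 20/20 acuity, i.e. one pixel per arcminute):

```python
import math

def ppd(width_px, height_px, diagonal_in, distance_in):
    """Pixels per degree of visual angle (small-angle approximation)."""
    ppi = math.hypot(width_px, height_px) / diagonal_in
    return ppi * distance_in * math.pi / 180

print(ppd(2560, 1440, 27, 24))  # 27" 1440p at 24": ~46 ppd
print(ppd(3840, 2160, 32, 24))  # 32" 4K at 24": ~58 ppd
```

By that yardstick the 32" 4K panel sits close to the acuity limit at desk distance, which is roughly where perceived gains start flattening out.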
Posted on Reply
#197
JustBenching
AusWolfThen why does text on my 1200x540 phone screen appear a lot shaper even when I'm looking at it from a distance of 5 cm than my 1440 ultrawide monitor from half a metre away? You're talking nonsense, my dude. Have you not noticed that your photos look a lot better on your phone than on the big screen TV?
Exactly, it appears a lot sharper because you don't have the eye vision to tell that it isn't. You are hiding the imperfections by making the picture tiny on a phone. Text or image isn't suddenly sharper, you just can't see it. Have you ever worked on photoshop?
Onasi@fevgatos
You are technically correct (the best type, I suppose), but what Aus is driving at is that at sizes where hypothetically both 1440p and 4K would hit a certain ppi that is high enough for the task, like 150+ for typical monitor distance or 300+ for mobile usage, the PERCEIVED image quality will be very close, if not indistinguishable. I certainly can't reliably tell the difference between, say, a recent iPhone with 460 ppi and a new Galaxy Ultra with 500+, not in a meaningful way.
Someone that has ever in his life worked in photoshop can quickly tell you that PPI is a laughable metric and raw pixel count is the one and only metric that matters for image quality. Just zoom into a 1080p image vs a 4k image and you'll instantly notice the huge difference in color banding between the two. It's just physics: you can't have as many colors / pixels / as much image quality at a lower resolution, regardless of how tiny your screen is.

EG1. PPI matters when you are comparing same resolution screens, since the one with the higher PPI will appear sharper while having the same image quality. Crosscomparing with different resolutions is pointless.
Posted on Reply
#198
AusWolf
fevgatosExactly, it appears a lot sharper because you don't have the eye vision to tell that it isn't. You are hiding the imperfections by making the picture tiny on a phone. Text or image isn't suddenly sharper, you just can't see it.
Isn't that the whole point of image quality? To make it appear better? :kookoo:

I don't give a damn how many pixels there are as long as it looks better. And that's due to PPI first and foremost.
fevgatosHave you ever worked on photoshop?
No, and I don't really care if I'm honest.
fevgatosSomeone that has ever in his life worked in photoshop can quickly tell you that PPI is a laughable metric and raw pixel count is the one and only metric that matters for image quality. Just zoom in into a 1080p image vs a 4k image and youll instantly notice the huge difference in color banding between the two. It's just physics, you can't have as much colors / pixels / image quality on a small resolution, regardless of how tiny your screen is.

EG1. PPI matters when you are comparing same resolution screens, since the one with the higher PPI will appear sharper while having the same image quality. Crosscomparing with different resolutions is pointless.
I'm not talking about pictures and zooming into them. I'm talking about monitor image quality. You're not working on pixels in a raw image when you're gaming, are you?
Posted on Reply
#199
JustBenching
AusWolfIsn't that the whole point of image quality? To make it appear better? :kookoo:

I don't give a damn how many pixels there are as long as it looks better. And that's due to PPI first and foremost.
Well, if your goal is just to make it appear better, then we might as well be using a 14" 1080p monitor at a distance of 2 meters; it will be so tiny you won't see any mishaps.

The photoshop was just an example to demonstrate the concept. Set it to 1:1 pixel view and just draw a horizontal line with each pixel having a unique color. On a 1080p monitor you can only get 1920 different colors, on a 4k screen you can get double that. This is literally what image detail is, how many unique colors you can get.
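The one-color-per-column ceiling in that example can be sketched like this (a toy illustration, not a Photoshop recipe):

```python
# A horizontal gradient carries at most one sample per pixel column,
# so a 4K row can hold exactly twice as many distinct values as a 1080p row.
def distinct_columns(width_px):
    return len({i / width_px for i in range(width_px)})

print(distinct_columns(1920))  # 1920
print(distinct_columns(3840))  # 3840
```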
AusWolfI'm not talking about pictures and zooming into them. I'm talking about monitor image quality. You're not working on pixels in a raw image when you're gaming, are you?
It's not different in games. What do you think happens in a game when you drop the resolution from 4k to 1080p? What do you think happens to the image? There were 8.3M pixels at 4k; now we are down to 2M. Do you think those extra 6.3M pixels were doing nothing there?
Posted on Reply
#200
Onasi
fevgatosSomeone that has ever in his life worked in photoshop can quickly tell you that PPI is a laughable metric and raw pixel count is the one and only metric that matters for image quality.
Okay, this is just a ridiculous maximalistic statement at this point. Static images and actual screen quality in daily usage aren’t the same thing. There is more to it than raw pixel count. You are arguing for something that nobody brought up. Do you actually, unironically think that there won’t be a point in desktop screens where resolution increases will no longer lead to perceivable improvements in image quality and the performance hit will just not be worth it? Hint - there absolutely will come such a time.
I know everyone likes clowning on Apple (for some reason), but their Retina concept isn’t actually just a marketing meme and there was thought put behind it. And most serious researchers, even those noting some imperfections with it, tend to agree on the principle.
fevgatosWell if your goal is just to make it appear better then we might as well be using a 14" 1080p monitor at a distance of 2 meters, it will be so tiny you won't see any misshaps.
Yes, it’s called a laptop screen. And yes, the WHOLE GOAL IS IMPROVING PERCEIVED DISPLAY QUALITY, that’s the whole point, not just jacking off to numbers.

Edit: Also, make a discussion or something guys, I just noticed the thread we are in and this is a derail if I ever saw one.
Posted on Reply