
AMD Falling Behind: Radeon dGPUs Absent from Steam's Top 20

AMD has long needed to purge most of their obsolete management system and get newer, younger, more driven people in charge. One of the worst things they did was NOT purge ATi employees in 2006. They never integrated well with AMD employees.


Based on the younger people in my industry (software dev), they have very poor critical thinking and problem-solving skills. Sure, they're more enthusiastic and have a lot of knowledge, but knowledge is useless if you don't know how to use it.
 
If you want to do even half-serious, or just casual, gaming on a PC that is more than browser activity, you are probably getting into something along the x50 Ti-x60 range of price/GPU. So that is the $250-350 segment, not the glorified IGP segment. I build the occasional system for that target demographic, and you generally end up in that segment for the GPU; anything higher is deemed too expensive and 'doesn't pay off' for these people. They don't care about graphics at all, they just want it to run and not look completely shit, and they rarely have over $1k to spend.

And believe it or not, this is precisely what AMD and Nvidia target for mass markets; it's also precisely what Steam surveys show as the most prevalent 'real' gaming GPUs that aren't IGPs. The overwhelming majority of gamers aren't graphics whores, but people who just want to play games.
From personal experience, my mobile 6700S (which is basically a power-limited, underclocked 6600 XT) is perfectly fine for exactly what you described: playing games without them looking like complete ass. And sure, that $250 price range used to be the bulk of sales and profits, but is it still? Going back to the Steam survey, the 4060 is 5.25% of the market, while the 4090 is 1.2%. Isn't Nvidia making a lot more bang from that 1.2%? I don't know, I'd think their margins are much bigger there, but I ain't an expert, could be wrong.

It feels like the releases at that price point are something like "let's release something, anything, at the $300 mark so we stop them from crying about it". Remember that a 1080 was just 57% faster than a 1060; nowadays a 4080 is ~2.2 to 2.6 times faster than a 4060 (depending on resolution). The low end has become bottom of the barrel.
 
From personal experience, my mobile 6700S (which is basically a power-limited, underclocked 6600 XT) is perfectly fine for exactly what you described: playing games without them looking like complete ass. And sure, that $250 price range used to be the bulk of sales and profits, but is it still? Going back to the Steam survey, the 4060 is 5.25% of the market, while the 4090 is 1.2%. Isn't Nvidia making a lot more bang from that 1.2%? I don't know, I'd think their margins are much bigger there, but I ain't an expert, could be wrong.
I think there are two things going on. The basic premise still is that people who get into PC gaming buy (lower) midrange.

But at the same time, there's a significant group that's past that, and there's a performance delta between generations that pushes people to move up in the stack too. There are also emerging markets in the (semi-)pro segment that use high-end GPUs, and they ain't buying Quadros.

It's a bit like what you see in gaming itself: there's a real market for higher-end gaming, it's big enough to push content for (the RT updates in Cyberpunk are the best example), it generates sales all on its own, and those sales are super high margin for all involved. That's the fools-and-their-money part of the market, really (no offense intended, these people just buy what they can), and despite being a growth market, it doesn't take any sales away from other segments; there are just more people playing more content continuously. In a relative sense, the share of higher-end GPUs is growing.
 
From personal experience, my mobile 6700S (which is basically a power-limited, underclocked 6600 XT) is perfectly fine for exactly what you described: playing games without them looking like complete ass. And sure, that $250 price range used to be the bulk of sales and profits, but is it still? Going back to the Steam survey, the 4060 is 5.25% of the market, while the 4090 is 1.2%. Isn't Nvidia making a lot more bang from that 1.2%? I don't know, I'd think their margins are much bigger there, but I ain't an expert, could be wrong.
5.25% is about 4.4x more than 1.2%, while the price of the 4090 is roughly 5.3x higher than that of the 4060. I don't know about the differences in profit margins, but the money moved seems to be similar.
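
For what it's worth, a quick back-of-the-envelope check of that, using the survey shares quoted above and launch MSRPs as stand-in prices (a rough sketch; real selling prices and actual unit volumes would differ):

```python
# Rough "money moved" comparison from Steam survey share x launch MSRP.
# Shares are the survey numbers quoted above; MSRPs are launch prices,
# used here only as stand-ins for what people actually paid.
share_4060, msrp_4060 = 5.25, 299    # percent of survey, USD
share_4090, msrp_4090 = 1.20, 1599

rev_4060 = share_4060 * msrp_4060    # arbitrary units (share-points * dollars)
rev_4090 = share_4090 * msrp_4090

print(f"4060: {rev_4060:.0f}   4090: {rev_4090:.0f}   ratio: {rev_4060 / rev_4090:.2f}")
# -> roughly 1570 vs 1919, i.e. similar money moved either way;
#    per-card margins are a separate, unknown question.
```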

It feels like the releases at that price point are something like "let's release something, anything, at the $300 mark so we stop them from crying about it".
The vast majority of PC gamers disagree with you.
 
It feels like the releases at that price point are something like "let's release something, anything, at the $300 mark so we stop them from crying about it". Remember that a 1080 was just 57% faster than a 1060; nowadays a 4080 is ~2.2 to 2.6 times faster than a 4060 (depending on resolution). The low end has become bottom of the barrel.
This, I think, is really a matter of perspective. If you've travelled to the top end of the GPU stack and seen it all, a lot of the lower-end stuff looks pretty shitty, but then we're just spoiled, really, because those lower-end cards DO run games fine. I mean, let's consider for a moment the amount of content a Steam Deck will run on a mere APU at 15 W. And it sells. Oh boy, does it sell.

And the low end (being the current-day x60... talking 'bout perspective, hehe) was always stuck with some weird configuration of specs: let's recall the GTX 660 with 1.5 GB + 0.5 GB, the 970 with a similarly weird bus, the 4060 with its abysmal bandwidth... AMD's set of failures in RDNA2/3, or the endless rebrand/limbo of Pitcairn... something's always gonna give. But they STILL run games.
 
5.25% is about 4.4x more than 1.2%, while the price of the 4090 is roughly 5.3x higher than that of the 4060. I don't know about the differences in profit margins, but the money moved seems to be similar.


The vast majority of PC gamers disagree with you.
Well didn't you say the vast majority of people are idiots or something? So doesn't really matter, does it? :D
 
This, I think, is really a matter of perspective. If you've travelled to the top end of the GPU stack and seen it all, a lot of the lower-end stuff looks pretty shitty,
I disagree with that too because:
but then we're just spoiled, really, because those lower-end cards DO run games fine. I mean, let's consider for a moment the amount of content a Steam Deck will run on a mere APU at 15 W. And it sells. Oh boy, does it sell.
Exactly. How many times have I upgraded my hardware to the highest end with great enthusiasm, only to have a reaction like "meh" when I saw it in action. Then I sold my stuff and bought something midrange.

Gaming at a bazillion FPS at 4K Ultra is not the dream of every gamer. For some (for many, I'd argue), 1080/1440p Medium is fine if it means leaving some massive cash in one's pocket for other things in life.
 
I disagree with that too because:

Exactly. How many times have I upgraded my hardware to the highest end with great enthusiasm, only to have a reaction like "meh" when I saw it in action. Then I sold my stuff and bought something midrange.

Gaming at a bazillion FPS at 4K Ultra is not the dream of every gamer. For some (for many, I'd argue), 1080/1440p Medium is fine if it means leaving some massive cash in one's pocket for other things in life.
There's always a sweet spot for gaming, and it's generally at or around what consoles do at the moment. I agree with you. It's much better playing at that sweet spot because you'll have the best bang for buck, good support, no early-adopter nonsense... etc.
 
Well didn't you say the vast majority of people are idiots or something? So doesn't really matter, does it? :D
Touché. :D Anyway, my point stands that it's not a "let's just release something" category. People do care about their money (this isn't what makes them idiots, though).

There's always a sweet spot for gaming, and it's generally at or around what consoles do at the moment. I agree with you. It's much better playing at that sweet spot because you'll have the best bang for buck, good support, no early-adopter nonsense... etc.
Not to mention better thermals, sensible power consumption and a size that fits into your chassis. :)

Just to illustrate my point: the price difference between the 4060 and the 4090 is around £1,300-1,500. That's the price of an average cruise ticket. So which one brings more to one's table? A once-in-a-lifetime experience around the world, or a different graphics card that plays the exact same games just faster? Um... dunno. :rolleyes:
 
At OG MSRP, considering the 24 gigs of new and expensive GDDR6X and it being a halo card (those are by default bad value)? Nah, it was okay. Transistors per dollar isn't everything, which is why I am not too hot on you using it as a metric.
What it ended up actually selling at in the real world is another matter entirely.
Since I'm unable to edit my previous post anymore, I'm posting an update here and more accurate calculations below. By the way, the transistors-per-dollar metric (while being my quick calculation idea) is not completely bullshit and does say something about manufacturers' margins. But here we go again: 24 GB of GDDR6X VRAM seems to be like $600 more expensive than 16 GB, right? Definitely not. Just look at the RTX 4090: same amount of memory, much higher performance, yet a somewhat similar MSRP to the 3090 and much lower than the 3090 Ti, achieving that despite the ever-rising complexity of production and other things to blame.

[attached image: transistors-per-dollar calculations]
Bottom line: We end up paying higher amounts of money for GPUs, but we also keep getting more hardware for it.
Maybe that's what explains Jensen's famous motto: "The more you buy, the more you save." Anyway, don't take those calculations too seriously.
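
As a rough illustration of that transistors-per-dollar idea, here is a minimal sketch using approximate public transistor counts and US launch MSRPs (ballpark assumptions; cut-down dies, VRAM and board costs are ignored entirely):

```python
# Transistors-per-dollar at launch MSRP, as a crude value metric.
# Transistor counts are for the full GPU die (approximate public figures);
# MSRPs are US launch prices.
cards = {
    #  name        (transistors, launch MSRP in USD)
    "RTX 3090":    (28.3e9, 1499),   # GA102
    "RTX 3090 Ti": (28.3e9, 1999),   # GA102
    "RTX 4090":    (76.3e9, 1599),   # AD102
    "RTX 4060":    (18.9e9,  299),   # AD107
}

for name, (transistors, price) in cards.items():
    print(f"{name:12s} {transistors / price / 1e6:6.1f} M transistors per dollar")
# The 4090 comes out well ahead of the 3090/3090 Ti here, which is the
# "more hardware for the money" point being made above.
```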


Mind you, everyone, the Steam Hardware Survey also contains a huge number of:
  1. OEM Prebuilts
  2. Chinese PCs
  3. Both.
And in those markets, people generally only want something 'good enough' and will likely be playing primarily esports titles or grindfests (looking at you, War Thunder). Green sells way better than red, because the popular perception that 'Nvidia is the gamer's choice' is still prevalent. They'd have to actually massively flop for that to change. Ada was just disappointing.

Were you to ask PC enthusiasts exclusively, I imagine far more 6800XTs, 7800XTs, and 7900XTXs would turn up.
And as was already pointed out, not everyone participates in the survey. For a hardware survey, one would assume that the bigger the data pool, the better, but Valve apparently thinks otherwise. It is strange that only chosen ones participate each month. It's like trying to evaluate the average final mark in a particular subject for a class of 25 students but picking only 10 of them; those might happen to be the better or the worse ones, which renders the statistics badly inaccurate due to the poor statistical sample.
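
To illustrate the class analogy, a toy simulation with made-up marks (purely hypothetical numbers, just to show how much a small sample can drift):

```python
# Toy version of the class-of-25 analogy: how much the "average mark"
# can drift when you only sample 10 of the 25 students each month.
import random

random.seed(1)
marks = [random.randint(1, 10) for _ in range(25)]   # made-up marks for 25 students
true_mean = sum(marks) / len(marks)

for month in range(5):
    sample = random.sample(marks, 10)                # pick only 10 students
    sample_mean = sum(sample) / len(sample)
    print(f"month {month}: sample mean {sample_mean:.2f} vs true mean {true_mean:.2f}")
# The smaller and less representative the sample, the further each month's
# estimate can land from the real average - the worry being raised about the survey.
```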

As I've said, I bought a 4K monitor because with DLSS I can get the same performance as on a 1440p monitor but with much higher image quality. I was literally deciding between the 27" 240 Hz WOLED and the 32" 4K WOLED, and I went for the latter because of DLSS. It just looks better with hardly any performance sacrifice.
With DLSS, you can get the same performance on a 4K monitor as with a 1440p one, and even with much higher image quality? Does having distorted and guessed frames improve image quality? From the beginning, DLSS and similar stuff has always been about sacrificing image quality for performance. Your thinking seems familia... Oh, it's you again...
 
Well, here we go. This is directly from Leo at Kitguru while he was reviewing an X870E board.

@KitGuruTech replied: "The explanation is that the vast majority of the enthusiast market buys Nvidia and we want our reviews to reflect the likely experience of the end user. Leo"
 
With DLSS, you can get the same performance on a 4K monitor as with a 1440p one, and even with much higher image quality? Does having distorted and guessed frames improve image quality? From the beginning, DLSS and similar stuff has always been about sacrificing image quality for performance. Your thinking seems familia... Oh, it's you again...
Funny thing, because I checked my eyes two weeks ago: 12/10 (24/20 if you are a US resident). Flawless vision.

4K DLSS Q looks disgustingly better than 1440p native. Period.
Well, here we go. This is directly from Leo at Kitguru while he was reviewing an X870E board.

@KitGuruTech replied: "The explanation is that the vast majority of the enthusiast market buys Nvidia and we want our reviews to reflect the likely experience of the end user. Leo"
Obviously, since only Nvidia is making an enthusiast card right now.
 
Funny thing, because I checked my eyes two weeks ago: 12/10 (24/20 if you are a US resident). Flawless vision.

4K DLSS Q looks disgustingly better than 1440p native. Period.

Obviously, since only Nvidia is making an enthusiast card right now.
There is no graphical difference between 4K High and 4K Ultra, but the CPU works much harder at High than at Ultra. Try Cities: Skylines 2 and see what I mean.

 
AMD has long needed to purge most of their obsolete management system and get newer, younger, more driven people in charge. One of the worst things they did was NOT purge ATi employees in 2006. They never integrated well with AMD employees.

Streamers go with Nvidia because of encoding. That's it. We like to meme about all the things Nvidia does better than AMD that nobody uses, like encoding, but for streamers NVENC is an absolute game changer, especially for HD or 4K streams or anything high-bitrate. AMD simply doesn't compete there, and that's the use case where it makes way more sense.
I agree, they really need to do some overhauling on the people running Radeon.

Sure, but I also think it's an image thing, both in the name (kind of like how Apple products are considered premium regardless of what the truth is) and in the fact that they keep holding the crown for 'Most Powerful Gaming GPU'. I think a lot of things play into it, but I think the image is what sells to the people who aren't doing encoding and are mostly gaming.
 
Funny thing, because I checked my eyes two weeks ago: 12/10 (24/20 if you are a US resident). Flawless vision.

4K DLSS Q looks disgustingly better than 1440p native. Period.
Do you have a 1440p monitor to test for sure? In my experience, any resolution lower than your monitor's native looks like ass. No wonder a higher res with DLSS looks better.

Well, here we go. This is directly from Leo at Kitguru while he was reviewing an X870E board.

@KitGuruTech replied: "The explanation is that the vast majority of the enthusiast market buys Nvidia and we want our reviews to reflect the likely experience of the end user. Leo"
So who is the end user? Enthusiasts? Or regular people who just want to run games?
 
Do you have a 1440p monitor to test for sure? In my experience, any resolution lower than your monitor's native looks like ass. No wonder a higher res with DLSS looks better.
I have 6 monitors, yeah
 
Similar PPI?
They can't be similar, man; in order for a 1440p monitor to have similar PPI to a 32" 4K, it needs to be what, 24"? Does that even exist?

It's not about the PPI; PPI is irrelevant. An object on a 4K screen will be made up of more than twice as many pixels as on a 1440p screen, no matter what the screen size actually is. You can't extract more detail just by having similar PPI, simply because you have fewer pixels to work with. No amount of PPI can replace raw pixel count. Even LODs are higher quality when you are running a higher res.
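
For reference, the raw pixel counts behind that claim (simple arithmetic, nothing assumed beyond the standard resolutions):

```python
# Total pixel counts for the resolutions being argued about.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name:5s} {count:>9,d} pixels")

print(f"4K / 1440p = {pixels['4K'] / pixels['1440p']:.2f}x")   # 2.25x
print(f"4K / 1080p = {pixels['4K'] / pixels['1080p']:.2f}x")   # 4.00x
# So an on-screen object rendered at 4K gets 2.25x the pixels it would get
# at 1440p, regardless of the physical size of the panel.
```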
 
They can't be similar, man; in order for a 1440p monitor to have similar PPI to a 32" 4K, it needs to be what, 24"? Does that even exist?

It's not about the PPI; PPI is irrelevant. An object on a 4K screen will be made up of more than twice as many pixels as on a 1440p screen, no matter what the screen size actually is. You can't extract more detail just by having similar PPI, simply because you have fewer pixels to work with. No amount of PPI can replace raw pixel count. Even LODs are higher quality when you are running a higher res.
I don't mean to be rude, but that's bullshit. PPI and colour accuracy are the main deciding factors that determine your image quality. Higher PPI means more pixels in a given area, which results in a crisper image. That's monitor basics, man, you don't even need to be a techie to know this. Why else do you think a 1080p phone screen looks better than a 27" 1080p monitor?
 
I don't mean to be rude, but that's bullshit. PPI and colour accuracy are the main deciding factors that determine your image quality. Higher PPI means more pixels in a given area, which results in a crisper image. That's monitor basics, man, you don't even need to be a techie to know this. Why else do you think a 1080p phone screen looks better than a 27" 1080p monitor?
Uhm, no. A phone screen looks better because you don't have good enough vision to notice that it really doesn't (I don't either, not a personal attack).

I'm literally explaining the issue to you. Every single object on a 4K screen will be made up of more than twice the number of pixels compared to a 1440p screen, regardless of the screen sizes. You cannot, by definition, have better image quality on a 1440p image. A spoon on a 1440p screen will be made out of 200 pixels; on a 4K screen it will be made out of 450 pixels.

You do need to be a techie to know what makes an image better, otherwise you'll just drag yourself into the PPI race.
 
Uhm, no. A phone screen looks better because you don't have good enough vision to notice that it really doesn't (I don't either, not a personal attack).

I'm literally explaining the issue to you. Every single object on a 4K screen will be made up of more than twice the number of pixels compared to a 1440p screen, regardless of the screen sizes. You cannot, by definition, have better image quality on a 1440p image. A spoon on a 1440p screen will be made out of 200 pixels; on a 4K screen it will be made out of 450 pixels.

You do need to be a techie to know what makes an image better, otherwise you'll just drag yourself into the PPI race.
Then why does text on my 1200x540 phone screen appear a lot sharper, even when I'm looking at it from a distance of 5 cm, than my 1440p ultrawide monitor from half a metre away? You're talking nonsense, my dude. Have you not noticed that your photos look a lot better on your phone than on the big-screen TV?
 
@fevgatos
You are technically correct (the best type, I suppose), but what Aus is driving at is that at sizes where hypothetically both 1440p and 4K would hit a certain PPI that is high enough for the task, like 150+ for typical monitor distance or 300+ for mobile usage, the PERCEIVED image quality will be very close, if not indistinguishable. I certainly can't reliably tell the difference between, say, a recent iPhone with 460 PPI and a new Galaxy Ultra with 500+, not in a meaningful way.
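
For concreteness, here is a small sketch of the PPI figures being argued about (straightforward geometry; the panel sizes are just the ones mentioned in the thread):

```python
# PPI = pixels along the diagonal / diagonal size in inches.
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(f'32" 4K:    {ppi(3840, 2160, 32):.0f} PPI')   # ~138
print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')   # ~109
print(f'24" 1440p: {ppi(2560, 1440, 24):.0f} PPI')   # ~122

# Diagonal a 1440p panel would need to match the PPI of a 32" 4K panel:
target = ppi(3840, 2160, 32)
print(f'1440p at ~{math.hypot(2560, 1440) / target:.1f}" matches it')   # ~21.3"
```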
 
Then why does text on my 1200x540 phone screen appear a lot sharper, even when I'm looking at it from a distance of 5 cm, than my 1440p ultrawide monitor from half a metre away? You're talking nonsense, my dude. Have you not noticed that your photos look a lot better on your phone than on the big-screen TV?
Exactly, it appears a lot sharper because you don't have the visual acuity to tell that it isn't. You are hiding the imperfections by making the picture tiny on a phone. The text or image isn't suddenly sharper, you just can't see it. Have you ever worked in Photoshop?

@fevgatos
You are technically correct (the best type, I suppose), but what Aus is driving at is that at sizes where hypothetically both 1440p and 4K would hit a certain PPI that is high enough for the task, like 150+ for typical monitor distance or 300+ for mobile usage, the PERCEIVED image quality will be very close, if not indistinguishable. I certainly can't reliably tell the difference between, say, a recent iPhone with 460 PPI and a new Galaxy Ultra with 500+, not in a meaningful way.
Anyone who has ever in their life worked in Photoshop can quickly tell you that PPI is a laughable metric and that raw pixel count is the one and only metric that matters for image quality. Just zoom into a 1080p image vs a 4K image and you'll instantly notice the huge difference in color banding between the two. It's just physics: you can't have as many colors / pixels / as much image quality at a small resolution, regardless of how tiny your screen is.

EG1. PPI matters when you are comparing same-resolution screens, since the one with the higher PPI will appear sharper while having the same image quality. Cross-comparing with different resolutions is pointless.
 
Exactly, it appears a lot sharper because you don't have the visual acuity to tell that it isn't. You are hiding the imperfections by making the picture tiny on a phone. The text or image isn't suddenly sharper, you just can't see it.
Isn't that the whole point of image quality? To make it appear better? :kookoo:

I don't give a damn how many pixels there are as long as it looks better. And that's due to PPI first and foremost.

Have you ever worked in Photoshop?
No, and I don't really care if I'm honest.

Anyone who has ever in their life worked in Photoshop can quickly tell you that PPI is a laughable metric and that raw pixel count is the one and only metric that matters for image quality. Just zoom into a 1080p image vs a 4K image and you'll instantly notice the huge difference in color banding between the two. It's just physics: you can't have as many colors / pixels / as much image quality at a small resolution, regardless of how tiny your screen is.

EG1. PPI matters when you are comparing same-resolution screens, since the one with the higher PPI will appear sharper while having the same image quality. Cross-comparing with different resolutions is pointless.
I'm not talking about pictures and zooming into them. I'm talking about monitor image quality. You're not working on pixels in a raw image when you're gaming, are you?
 
Isn't that the whole point of image quality? To make it appear better? :kookoo:

I don't give a damn how many pixels there are as long as it looks better. And that's due to PPI first and foremost.
Well, if your goal is just to make it appear better, then we might as well be using a 14" 1080p monitor at a distance of 2 meters; it will be so tiny you won't see any mishaps.

Photoshop was just an example to demonstrate the concept. Set it to 1:1 pixel view and just draw a horizontal line with each pixel having a unique color. On a 1080p monitor you can only get 1920 different colors; on a 4K screen you can get double that. This is literally what image detail is: how many unique colors you can get.

I'm not talking about pictures and zooming into them. I'm talking about monitor image quality. You're not working on pixels in a raw image when you're gaming, are you?
It's no different in games. What do you think happens in a game when you drop the resolution from 4K to 1080p? What do you think happens to the image? There were 8.3 million pixels at 4K; now we are down to about 2.1 million. Do you think those extra 6.2 million pixels were doing nothing there?
 