Monday, November 4th 2024

AMD Falling Behind: Radeon dGPUs Absent from Steam's Top 20

As we entered November, Valve finished processing October's data for its monthly Steam Hardware and Software Survey update, showcasing trend changes in the largest gaming community. According to the October data, AMD's discrete GPUs are not in a great place: among the top 20 most commonly used GPUs, not a single discrete SKU is AMD-based; all of them are NVIDIA parts. There is some movement within AMD's lineup, however: the Radeon RX 580, which used to be the most popular AMD GPU, has just been overtaken by the Radeon RX 6600 as the most common choice for AMD gamers. The Radeon RX 6600 now holds 0.98% of the GPU market.

NVIDIA's situation paints a different picture, as every discrete GPU in the top 20 is an NVIDIA part. The GeForce RTX 3060 remains the most popular GPU at 7.46% of the market, but the number two spot is now held by the GeForce RTX 4060 Laptop GPU at 5.61%. This is an interesting change, as the laptop chip previously sat in third place, right behind the regular desktop GeForce RTX 4060. Laptop gamers are in abundance, however, and they have pushed the desktop GeForce RTX 4060 down to third place at 5.25% usage.
Source: Steam Survey

222 Comments on AMD Falling Behind: Radeon dGPUs Absent from Steam's Top 20

#126
JustBenching
Hecate91: I seriously doubt that, Nvidia has tons of sponsored games and pays many AAA game studios to use Nvidia features.
Just go to the Gaming Evolved page? A lot of big-name AAA games are AMD sponsored.
Hecate91: personally, I couldn't care less about what graphics features a game has; I buy games for the gameplay.
Then you are not the target group of either AMD or Nvidia, since you're probably using a 10-year-old card playing at low settings at 1080p.
Posted on Reply
#127
AusWolf
Macro Device: Or to max your ridiculously fast display out. I have proved to be sensitive to that, and gaming at 170 FPS is leagues more pleasant than gaming at <100 FPS, and still substantially better than 100–130 FPS, even if I sacrificed some visuals to upscaling artifacts. Smoothness rocks.
Of course I'd be happy to point-blank blame game devs who make games an unplayable mess if you don't turn these things on or don't get yourself a severely overclocked 4090, when the graphics aren't next-gen but merely implemented right. But these tools are great and not to be abolished.
My sense of smoothness comes from VRR (FreeSync), and the game running within my monitor's VRR range, and not from super-high FPS. I cannot see a difference above 50-70 FPS anyway (depending on game).
But each to their own.
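For what it's worth, the VRR behavior described above can be sketched roughly as below. The 48-170 Hz window is a hypothetical panel range, and the frame-repeating branch is the commonly described low framerate compensation (LFC) idea, not any vendor's exact implementation:

```python
# Rough sketch of VRR-range logic with LFC-style frame repetition.
# Assumption: a hypothetical panel with a 48-170 Hz VRR window.
VRR_MIN, VRR_MAX = 48, 170

def effective_refresh(fps: float) -> float:
    """Approximate refresh rate the panel ends up running at."""
    if fps > VRR_MAX:
        return VRR_MAX            # capped at the panel's maximum
    if fps >= VRR_MIN:
        return fps                # true VRR: one refresh per rendered frame
    # LFC-style compensation: repeat each frame n times so the
    # effective refresh lands back inside the VRR window.
    n = 2
    while fps * n < VRR_MIN:
        n += 1
    return fps * n

print(effective_refresh(60))   # 60 - in range, smooth
print(effective_refresh(30))   # 60 - each frame shown twice
```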
fevgatos: When it's usually people with higher-end GPUs that use them and people with lower-end GPUs being against them, doesn't it make you pause?
Yes. But then I conclude what I've always known: people are idiots.
Posted on Reply
#128
Vayra86
AusWolf: I don't care what anyone says, DLSS, FSR and XeSS are tools to get better performance out of ageing or lower end hardware when even low quality graphics options won't do anymore, nothing more. And that is a sign for me that the need for a GPU upgrade is imminent.
It is exactly the enabler for the PS5 Pro: that entire console can exist at its price point because it uses a weak chip with a shitload of upscaling and FG on top.

But then it does not enable low-end graphics; it enables higher-end graphics on a console, or higher framerates. So in the end it's only, exclusively, a matter of perspective, I think. The proof is in the pudding: are games actually looking better for X hardware cost, or Y upscale usage? I'm a big promoter of keeping access to the options, but with the mess that is new engines and their requirements for TAA (to avoid artifacting in almost anything that moves), we're not really keeping access, and the idea is slowly planted in people's minds that you NEED upscaling to make a game look good, when in fact the native experience was already shit to begin with and a sufficient level of blurring hides that efficiently.

There are quite a few ways to look at this, but it's really the perfect storm for computer graphics right now. Total confusion. Between the performance black box that is RT performance (vs. gained IQ), upscaling (extra FPS vs. lost IQ), and dynamic rendering, all bets are off with regard to the conclusions one may draw looking at a video game these days.
Posted on Reply
#129
AusWolf
Assimilator: Yet they continually try to, and continually lose, and then instead of asking themselves "this doesn't seem to be working, maybe we should try something else?" they do the exact same thing and fail again. The definition of insanity...
That brings us back to RDNA 4, with which I hope AMD has learned its lesson: instead of trying too hard to beat Nvidia at something they can't, they'll just make a decent card at a decent price.
Posted on Reply
#130
JustBenching
AusWolf: Yes. But then I conclude what I've always known: people are idiots.
And then someone will say people who don't use it are idiots, and that doesn't lead anywhere.

Upscalers allow you to target higher resolutions, which results in better image quality at the same performance. DLSS/FSR is the reason I have a 4K monitor, since DLSS/FSR Quality at 4K looks a lot better than native 1440p with similar performance.
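As a rough illustration of that trade-off, here is a sketch using the commonly cited per-axis scale factors for the DLSS/FSR quality presets (approximations, not exact vendor numbers):

```python
# Approximate per-axis render scales for common upscaler presets.
PRESETS = {
    "Quality": 1 / 1.5,      # ~0.667x per axis
    "Balanced": 1 / 1.7,     # ~0.588x per axis
    "Performance": 1 / 2.0,  # 0.5x per axis
}

def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Resolution actually rendered before the upscaler reconstructs the output."""
    s = PRESETS[preset]
    return round(out_w * s), round(out_h * s)

# 4K output with the Quality preset renders roughly 2560x1440 internally,
# i.e. about the same pixel load as driving a native 1440p monitor.
print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440)
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```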
Posted on Reply
#131
Macro Device
AusWolf: My sense of smoothness comes from VRR
That, too. But also having FPS high as a kite is a neat addition. With such monitors being massively affordable, I don't see any reason to stop gaming like that. If my swift display breaks, it's easy to replace.
Anyhow, I've derailed way too far from the main topic and I don't have anything else to add, so till next time.
Posted on Reply
#132
AusWolf
fevgatos: And then someone will say people who don't use it are idiots, and that doesn't lead anywhere.

Upscalers allow you to target higher resolutions, which results in better image quality at the same performance. DLSS/FSR is the reason I have a 4K monitor, since DLSS/FSR Quality at 4K looks a lot better than native 1440p with similar performance.
Except that it's not a higher resolution. You yourself compare your upscaled 4K image to a 1440p one, but why would you? Did you buy a 4K monitor to game at 1440p, or to marvel at a technology that offers better visuals? I don't think you did. Compare your upscaled 4K image to a native 4K one, and then we can talk (although I think we have in another thread, so perhaps there is no point).

With all due respect, I honestly think people like you are lying to themselves by being happy that DLSS is less shit than 720p when in fact, no one wanted to play at 720p anno 2024 to begin with (crude example).
Posted on Reply
#133
Vayra86
AusWolf: That brings us back to RDNA 4, with which I hope AMD has learned its lesson: instead of trying too hard to beat Nvidia at something they can't, they'll just make a decent card at a decent price.
Yeah, and that's still exactly the opposite of the point that @Assimilator and I are making... this strategy (ahem) is a repeat of ye olde strategy that made AMD lose momentum entirely against Nvidia, even though they had solid market share in their GCN period. They literally pissed it away selling Polaris to miners and having nothing above it, while Nvidia was selling Pascal for everyone.

Similarly, their attempts at the high end with HBM... first attempt: Fury X. Lots of issues sourcing chips, no OC potential, and the chip wasn't better than competing offerings, while also being stuck with 50% less VRAM than its competition... Their solution for delta compression came far too late, and Nvidia doubled down on it while AMD was pushing Polaris. It is delta compression that allowed Nvidia to avoid a 512-bit bus like Hawaii XT's (and the looming problem that there's nothing above that, and no faster VRAM either), keeping bus width to 256-bit in everything but their top-end product. But AMD? AMD was happy to keep pushing HBM, which was STILL hard to source, costly, and complicated efficient clocking too (neither Vega nor Fury could OC worth a damn). Gosh... that failed too. Strange!
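For context on that bus-width point, the raw arithmetic looks roughly like this; the card specs are the usual launch figures, and the ~25% delta-compression gain is an illustrative assumption rather than a measured number:

```python
# Theoretical memory bandwidth: data rate (Gbps per pin) * bus width (bits) / 8.
def bandwidth_gb_s(data_rate_gbps: float, bus_bits: int) -> float:
    return data_rate_gbps * bus_bits / 8

# Hawaii XT (R9 290X): 512-bit bus, 5 Gbps GDDR5.
print(bandwidth_gb_s(5.0, 512))   # 320.0 GB/s raw
# GTX 980: 256-bit bus, 7 Gbps GDDR5.
raw = bandwidth_gb_s(7.0, 256)    # 224.0 GB/s raw
# Delta color compression raises *effective* bandwidth; the 25% gain
# used here is an illustrative assumption, not a measured figure.
print(raw * 1.25)                 # ~280 GB/s effective
```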

Every time, it is the lack of dedication to push the boundaries further that kills AMD's progress. RDNA 4 is more of that, but they say they will push the RT boundary. To what level? Past Nvidia? I hope so, because otherwise they're still behind. It's one step forward and two steps back that way, because then the gap in raw performance with Nvidia will have pretty much doubled from what it is now.
Posted on Reply
#134
JustBenching
AusWolf: Except that it's not a higher resolution. You yourself compare your upscaled 4K image to a 1440p one, but why would you? Did you buy a 4K monitor to game at 1440p, or to marvel at a technology that offers better visuals? I don't think you did. Compare your upscaled 4K image to a native 4K one, and then we can talk (although I think we have in another thread, so perhaps there is no point).

With all due respect, I honestly think people like you are lying to themselves by being happy that DLSS is less shit than 720p when in fact, no one wanted to play at 720p anno 2024 to begin with (crude example).
As I've said, I bought a 4K monitor because with DLSS I can get the same performance as on a 1440p monitor, but with much higher image quality. I was literally choosing between the 27" 240 Hz WOLED and the 32" 4K WOLED, and I went for the latter because of DLSS. It just looks better with hardly any performance sacrifice.

Same for my laptop: I went for a 1600p display because of FSR.
Posted on Reply
#135
Hecate91
Vayra86: There is one.
And AMD was in on it since Microsoft launched the DX12 Ultimate label. They just forgot to build the right acceleration for it. This is a typical case of AMD saying but not doing, or put differently, AMD being godawfully slow at implementing features, as usual.

Nvidia was faster. The only reason Nvidia can keep creating proprietary features is because they simply pay off, and part of the reason they're paying off is because AMD is always in wait-and-see mode.
It sounds more like an MS problem, or the MS version being crap. As for doing things first, that takes a lot of money, which Nvidia obviously has more of. Expecting AMD to do anything first is completely unrealistic when Nvidia has the market share, mindshare, and game studios to promote their features.
Assimilator: Now imagine if AMD's own management understood this.
I think they do with RDNA 4; focusing on the midrange makes a lot more sense than chasing the high end.
Assimilator: Yet they continually try to, and continually lose, and then instead of asking themselves "this doesn't seem to be working, maybe we should try something else?" they do the exact same thing and fail again. The definition of insanity...
Because consumers don't seem to care about price/performance; they only care about the card with green on the box.
Assimilator: Because AMD's pricing has been so much lower :rolleyes:
Nice taking part of my post out of context, thanks. AMD's pricing has been lower, or at least better value per dollar in raster performance, while not stagnating on VRAM.
Posted on Reply
#136
AusWolf
Macro Device: That, too. But also having FPS high as a kite is a neat addition. With such monitors being massively affordable, I don't see any reason to stop gaming like that. If my swift display breaks, it's easy to replace.
Anyhow, I've derailed way too far from the main topic and I don't have anything else to add, so till next time.
My reason is simple: I see zero difference between 80 FPS and 100, for example. So why would I lower my image quality for more FPS if it brings nothing to the table (for me)?
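The diminishing returns are easy to put into numbers; a quick sketch of the frame-time math:

```python
# Frame time in milliseconds for a given frame rate.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

# Going from 80 to 100 FPS saves only ~2.5 ms per frame...
print(frame_time_ms(80) - frame_time_ms(100))  # 2.5
# ...while going from 40 to 60 FPS saves ~8.3 ms per frame.
print(frame_time_ms(40) - frame_time_ms(60))   # ~8.33
```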
But yeah, I agree, enough derailment for today. :)
Vayra86: Yeah, and that's still exactly the opposite of the point that @Assimilator and I are making... this strategy (ahem) is a repeat of ye olde strategy that made AMD lose momentum entirely against Nvidia, even though they had solid market share in their GCN period. They literally pissed it away selling Polaris to miners and having nothing above it, while Nvidia was selling Pascal for everyone.
That strategy may make them lose market share, but the current strategy (fruitlessly trying hard to compete) makes them lose money. If you ask which one out of these two I'd rather keep, I'd choose money.
fevgatos: As I've said, I bought a 4K monitor because with DLSS I can get the same performance as on a 1440p monitor, but with much higher image quality. I was literally choosing between the 27" 240 Hz WOLED and the 32" 4K WOLED, and I went for the latter because of DLSS. It just looks better with hardly any performance sacrifice.
Good for you. I'm still just a peasant gaming at 1440 UW, and every single upscaling solution I've seen on my screen is worse than native, and I'm certain that'll never change.

I'll amend what I said above: DLSS/FSR can be a good entry ticket to 4K. Not many people have the money and/or desire to game at 4K, though.
Posted on Reply
#137
Vayra86
AusWolf: That strategy may make them lose market share, but the current strategy (fruitlessly trying hard to compete) makes them lose money. If you ask which one out of these two I'd rather keep, I'd choose money.
Agreed, but then you can't really call it a strategy anymore; it's more a case of grasping at straws. You can smell it from miles away. AMD is stuck between a rock and a hard place, and it's entirely their own doing. They've had their sweet time. So, they were executing their other, different 'strategy' before with RDNA? That lasted for... all of 2 generations? What 'strategy' was that then? :D

When you leave RT performance on the table for 3 full generations, that's not doing your damnedest to compete in the high end either, IMHO. They even literally said they'd wait it out until it hit the midrange, when they launched RDNA. Obviously, if you're all in, you use that time to have a solid RT solution ready by the time you DO need it. And here we are.
Posted on Reply
#138
AusWolf
Vayra86: Agreed, but then you can't really call it a strategy anymore; it's more a case of grasping at straws. You can smell it from miles away. AMD is stuck between a rock and a hard place, and it's entirely their own doing. They've had their sweet time. So, they were executing their other, different 'strategy' before with RDNA? That lasted for... all of 2 generations? What 'strategy' was that then? :D
I see it more like a breather to rethink your strategy and try not to lose too much money in the meantime. If the product is good, I personally don't care about the rest.
Posted on Reply
#139
Assimilator
Vayra86: Yeah, and that's still exactly the opposite of the point that @Assimilator and I are making... this strategy (ahem) is a repeat of ye olde strategy that made AMD lose momentum entirely against Nvidia, even though they had solid market share in their GCN period. They literally pissed it away selling Polaris to miners and having nothing above it, while Nvidia was selling Pascal for everyone.

Similarly, their attempts at the high end with HBM... first attempt: Fury X. Lots of issues sourcing chips, no OC potential, and the chip wasn't better than competing offerings, while also being stuck with 50% less VRAM than its competition... Their solution for delta compression came far too late, and Nvidia doubled down on it while AMD was pushing Polaris. It is delta compression that allowed Nvidia to avoid a 512-bit bus like Hawaii XT's (and the looming problem that there's nothing above that, and no faster VRAM either), keeping bus width to 256-bit in everything but their top-end product. But AMD? AMD was happy to keep pushing HBM, which was STILL hard to source, costly, and complicated efficient clocking too (neither Vega nor Fury could OC worth a damn). Gosh... that failed too. Strange!

Every time, it is the lack of dedication to push the boundaries further that kills AMD's progress. RDNA 4 is more of that, but they say they will push the RT boundary. To what level? Past Nvidia? I hope so, because otherwise they're still behind. It's one step forward and two steps back that way, because then the gap in raw performance with Nvidia will have pretty much doubled from what it is now.
I don't think AMD ever wanted to use HBM on consumer products; they were essentially forced to, because Vega was such a power-hungry POS that coupling it to GDDR would have made their cards' already-terrible power consumption far, far worse. It was either launch a bad product or launch no product, and the former is always the better decision.
Posted on Reply
#140
Vayra86
Hecate91: It sounds more like an MS problem, or the MS version being crap. As for doing things first, that takes a lot of money, which Nvidia obviously has more of. Expecting AMD to do anything first is completely unrealistic when Nvidia has the market share, mindshare, and game studios to promote their features.
Doing things first doesn't take a lot of money; it takes good minds and a company policy that allows those minds to do what they're good at. None of those elements have price tags attached. It's just a stance, an idea, a philosophy you have or don't have about R&D. AMD has sufficient financial room to provide that mindspace to its employees, too. They did that exact thing with Zen, I believe.

If you have good ideas, the money or the market comes anyway. Look at FreeSync. That was a good AMD idea - but even there, it was just using what was already there and pushing it forward a little bit. Not much money involved. Similarly, G-Sync obviously isn't a very costly solution either; you develop it once and use it ad infinitum. Strategically, AMD won that battle, and it proves that AMD's key values CAN work: affordable & open is where it's at - as long as it doesn't suck.

I'm not sure what you mean by saying 'it sounds more like an MS problem'. Clearly, the lack of RT performance is an AMD problem, because AMD's not selling GPUs, and they DO make those GPUs to run stuff on a DX12 API. :)
Assimilator: I don't think AMD ever wanted to use HBM on consumer products; they were essentially forced to, because Vega was such a power-hungry POS that coupling it to GDDR would have made their cards' already-terrible power consumption far, far worse. It was either launch a bad product or launch no product, and the former is always the better decision.
Yeah, I remember that was the squeeze. But I think it was a long-term issue for AMD; Nvidia had already offered less memory bandwidth since Kepler, and the gap kept growing. AMD just kept increasing bus widths, and only started on an efficiency improvement with Tonga, which was in the Rebrandeon age that went literally nowhere.
Posted on Reply
#141
JustBenching
Hecate91: I seriously doubt that, Nvidia has tons of sponsored games and pays many AAA game studios to use Nvidia features.
Just to drive the point home, below are some of the AMD-sponsored games.

Call of Duty BO6
FF XIV
Dragon's Dogma 2
The Last of Us
Avatar
Star Wars Jedi: Survivor
Starfield
Uncharted
Immortals of Aveum
The Callisto Protocol
God of War
Far Cry 6
Assassin's Creed
Horizon Zero Dawn
Resident Evil Village
RDR 2

As you can see, the vast majority of heavy-hitting AAA games are indeed AMD sponsored.
Posted on Reply
#142
Vayra86
Hecate91: Because consumers don't seem to care about price/performance; they only care about the card with green on the box.
Don't forget the 'green on the box' shows you a screenshot of a game running native at 40 FPS and then one next to it running DLSS 3 at 95 FPS.

You might want to reconsider that statement ;) Nvidia just sells software performance now, and the vast majority doesn't know what that means.
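To put a rough number on why those two figures aren't comparable, here is a sketch assuming frame generation inserts one generated frame per rendered frame (an approximation; the real ratio varies per game):

```python
# Presented FPS vs. frames the game actually renders, with frame generation on.
def rendered_fps(presented_fps: float, frame_gen: bool) -> float:
    # Assumption: frame generation roughly doubles presented frames.
    return presented_fps / 2 if frame_gen else presented_fps

print(rendered_fps(95.0, frame_gen=True))  # ~47.5 frames rendered per second
# Input latency tracks the ~47.5 rendered FPS (plus whatever upscaling adds),
# not the 95 FPS the box screenshot advertises.
```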
Posted on Reply
#143
JustBenching
Vayra86: Don't forget the 'green on the box' shows you a screenshot of a game running native at 40 FPS and then one next to it running DLSS 3 at 95 FPS.

You might want to reconsider that statement ;) Nvidia just sells software performance now, and the vast majority doesn't know what that means.
TBF, this is from AMD's presentation.
Posted on Reply
#144
Hecate91
TheinsanegamerN: This ignores history. Evergreen brought AMD to a 49% market share. When AMD was consistent with releases and maintained roughly Nvidia-level performance every generation, they had no trouble selling. So long as they keep doing this wishy-washy "oh we're high end, now we're not, now we are" thing, they'll struggle to sell, because that does not inspire confidence.

I'm not sure what media you were reading; the ones I read lambasted Intel for the poor showing and the 200 series basically being 14th gen but slightly slower.
Except times have changed since then; the narrative now is to promote Nvidia even when there are better alternatives, so it's understandable that AMD keeps dropping out of the high end when consumers show they don't care about having any competition.
I was just referring to reviews in general. I didn't see anyone calling the Intel 200 series a "flop" or a failure, yet I know at least one tech channel that called Zen 5 a flop.
fevgatos: Just go to the Gaming Evolved page? A lot of big-name AAA games are AMD sponsored.
A page from the company itself promoting games they sponsored; not really a surprise there.
fevgatos: Then you are not the target group of either AMD or Nvidia, since you're probably using a 10-year-old card playing at low settings at 1080p.
I'm definitely not the target group of Nvidia then. Imagine buying games to enjoy the story and gameplay itself; unfortunately, that doesn't seem to be much of a thing anymore, as reviewers promote eye candy over a game actually being enjoyable.
fevgatos: Just to drive the point home, below are some of the AMD-sponsored games.

Call of Duty BO6
FF XIV
Dragon's Dogma 2
The Last of Us
Avatar
Star Wars Jedi: Survivor
Starfield
Uncharted
Immortals of Aveum
The Callisto Protocol
God of War
Far Cry 6
Assassin's Creed
Horizon Zero Dawn
Resident Evil Village
RDR 2
As you can see, the vast majority of heavy-hitting AAA games are indeed AMD sponsored.
Now compare that to the number of Nvidia-sponsored games.
Out of that list, a lot are console-exclusive titles, so it makes sense for AMD to have some sponsored games.
Posted on Reply
#145
JustBenching
Man, if you don't care about graphics then you are really not the target group of either company. They are selling hardware that runs games at higher graphics settings. If you are fine with 1080p low, you are not really their demographic. You think AMD released RDNA 3 for people who don't care about graphics? :D
Hecate91: Now compare that to the number of Nvidia-sponsored games.
Out of that list, a lot are console-exclusive titles, so it makes sense for AMD to have some sponsored games.
Why would I do that? You said AMD doesn't have the money to sponsor games, when in reality they sponsor a big majority of the most popular AAA titles.
Posted on Reply
#146
Caring1
Y'all are carrying on like Steam actually matters lol. :roll: :roll: :roll:
Posted on Reply
#147
Vayra86
fevgatos: Man, if you don't care about graphics then you are really not the target group of either company. They are selling hardware that runs games at higher graphics settings. If you are fine with 1080p low, you are not really their demographic. You think AMD released RDNA 3 for people who don't care about graphics? :D
Well, that is a matter of perspective. If you do not own a gaming PC but you do want to game, you can still not care about graphics, yet you're buying a new card regardless. And if you're upgrading, you could certainly still be of the same opinion, but your current GPU is just no longer sufficient.
fevgatos: TBF, this is from AMD's presentation.
Yeah, one presentation... versus nearly daily TPU posts about DLSS this or that ;)
Posted on Reply
#148
Markosz
Literally all 50- and 60-series cards, plus a few basic 70-series ones, but everyone is like "OMG AMD doesn't compete at the high end" :D
Posted on Reply
#149
kapone32
PCWorld were at E3 talking to an editor from another channel. Adam said, "The 4090 is going to be expensive and the 7900 XTX is over $1,000. Is there a card for the person that does not want to pay that much?" The editor said, "Yes, the 7900 XT." Adam had a look of disgust on his face, and the editor said, "What? It is only 7% slower than the XTX and is $699."

Then let's look at MSI Gaming, who in more than one livestream have openly admitted that they actively promote Intel over AMD. Then let's go to Robeytech, who will get triggered on stream if you ask him why his builds never have AMD cards. How about KitGuru and the rest, who make it seem like it's 4090 or bust, when a 4090 costs the same as most of my PC, but never build using AMD cards?

If we are going to use Steam charts, we should also use user reviews on retail sites. If you take the time to go to Newegg or Amazon, you will see that what Vayra86 said about his 7900 XT is echoed. The narrative is strong, though. Nvidia is also being investigated for its business practices, so that China initiative may have made them lots of money but could also lead to lots of problems.

If there were no narrative, DLSS and RT would not be compared to raster. You see, that is the truth: AMD actually caught Nvidia on raster with the entire 6000 stack, but DLSS (which is still in less than 1% of games) and RT became the media's buzzwords, and Nvidia were happy to promote them too. Even TPU were gifted 4000 cards and posted about it.
Posted on Reply
#150
GhostRyder
TheinsanegamerN: He never said that. You're injecting your own argument.

He's right, AMD's RT is almost two generations behind Nvidia now, and FSR is just awful. Full of blurry text and almost hallucinogenic color mixing at times. DLSS Balanced has better image quality than FSR Ultra Quality does. It's just fact.

How about this: if AMD wants to make an open source DLSS, make it good?
Not sure I agree with all of that statement. I mean, the FSR vs DLSS argument in terms of quality is subjective in most cases. In the side-by-sides, I have seen opinions all over the place. Personally, I think they are pretty close to equal in terms of image quality, with each having certain weaknesses.

The RT argument is not one I am a fan of, because it's another one of those new-tech arguments where Nvidia comes out with something and now everyone constantly references it like it's the new standard. Personally, I still see RT as unnecessary or overhyped, especially with how much of a performance drop you get in most high-end games enabling it. Neither side has it down to where I would consider turning it on for most games, just to watch my FPS halved in many cases. No doubt Nvidia is better, but of course they are; they started the trend.
Vayra86: Nvidia was faster. The only reason Nvidia can keep creating proprietary features is because they simply pay off, and part of the reason they're paying off is because AMD is always in wait-and-see mode.
I mean, AMD has tried proprietary things in the past, but they never invest enough in getting games to support them. Normally they just open-source it and hope. The problem with Nvidia's proprietary things is that in the past a lot have been performance killers if you don't have Nvidia hardware, like GameWorks, which I don't think helped anyone other than them.
AusWolf: Let me address these points separately:

AMD's lack of RT performance doesn't matter much when even Nvidia's RT performance is abysmal on anything less than a 4080. I'm not sad that I can't RT on my 6750 XT because I couldn't on a 3070 either. Whether or not RT is really such a significant and meaningful visual upgrade that makes it worth spending more is a separate debate, but I'm personally doubtful.

I don't care what anyone says, DLSS, FSR and XeSS are tools to get better performance out of ageing or lower end hardware when even low quality graphics options won't do anymore, nothing more. And that is a sign for me that the need for a GPU upgrade is imminent.

I'm sick and tired of responding to the mindless "oooh RT, aaah DLSS" comments everywhere, so let's leave it at that, shall we?
I agree, RT is way overhyped, and people talk about it like it's the second coming. I have seen some of the best-case scenarios on my friend's 4090 PC and, to be honest, I was not that impressed (to be fair, it looked good in Cyberpunk at 4K maxed out, but performance was pretty abysmal depending on the scene with DLSS off). I will say I like DLSS a lot more than I like RT. I think many people and places focus on RT way too much.
Posted on Reply