Monday, November 4th 2024
AMD Falling Behind: Radeon dGPUs Absent from Steam's Top 20
As we enter November, Valve has finished processing October data for its monthly Steam Hardware and Software Survey, showcasing trend changes in the largest gaming community. According to the October data, AMD's discrete GPUs are not exactly in the best place: not a single entry in the top 20 most commonly used GPUs is a discrete AMD SKU; all of them are NVIDIA GPUs. There is some movement among AMD's entries, though, as the Radeon RX 580, long the most popular AMD GPU, has just been bested by the Radeon RX 6600 as the most common choice for AMD gamers. The AMD Radeon RX 6600 now holds 0.98% of the GPU market.
NVIDIA's situation paints a different picture, as the top 20 spots are all occupied by NVIDIA-powered systems. The GeForce RTX 3060 remains the most popular GPU at 7.46% of the market, but the number-two spot is now held by the GeForce RTX 4060 Laptop GPU at 5.61%. This is an interesting change, since this GPU previously sat in third place, right behind the regular desktop GeForce RTX 4060. Laptop gamers are out in force, however, pushing the desktop GeForce RTX 4060 down to third place at 5.25% usage.
Source:
Steam Survey
222 Comments on AMD Falling Behind: Radeon dGPUs Absent from Steam's Top 20
When I upgraded from 1080p 24" to 1440p UW 34", my PPI, and therefore my image quality, stayed roughly the same. I just got more FOV and desktop area. Yeah, might as well if you don't mind the size. It was a bad example, an apples-to-oranges comparison. What you're talking about has nothing to do with what I'm talking about. Don't forget you're dropping resolution on the same screen size, which drives your effective PPI through the floor. That's why it'll look like crap.
Of course you have less detail at a lower resolution, but if the screen is also smaller, you won't notice it.
But, as always, you're picking a really weird hill to die on, so I'm out before we derail this any further.
4K can give you a much bigger screen area, or much better image quality (PPI), or a little bit of both. That's what the roughly 6.2 million extra pixels over 1080p buy you.
It's not the pixels alone that give you a better image, but your perception of them, which depends heavily on your PPI and viewing distance (not to mention your eyesight).
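For a sense of the numbers, here's a minimal sketch (Python; the monitor sizes are hypothetical examples, not anyone's actual setup) of how PPI falls out of resolution and diagonal size:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a display of the given resolution and diagonal size."""
    return math.sqrt(width_px ** 2 + height_px ** 2) / diagonal_in

# The same 4K pixel count can be spent on sharpness, on screen area, or on a bit of both.
print(f'1080p 24": {ppi(1920, 1080, 24):.0f} PPI')   # ~92 PPI, the baseline
print(f'4K 27":    {ppi(3840, 2160, 27):.0f} PPI')   # ~163 PPI, much better image quality
print(f'4K 43":    {ppi(3840, 2160, 43):.0f} PPI')   # ~102 PPI, much bigger screen area
print(f'4K 32":    {ppi(3840, 2160, 32):.0f} PPI')   # ~138 PPI, a little bit of both
```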
Edit: I have a question too. You have one image. You open it in Photoshop on your 4K screen and zoom in on it. Then, you open it on your phone and don't zoom in on it. Where will it look better?
Edit: Okay, let's make it simple. Here's two pictures. Lean back in your chair and tell me, which one looks better to you? (Hint: they're the exact same picture)
I don't mind if you PM me your observations. We've probably derailed this thread long enough.
Take a screenshot while playing at a lower resolution, then display that same screenshot on a bigger monitor at the same resolution. If your vision really is as good as you say, you should be able to notice a difference. I once had a discussion with a Mac laptop guy. He bought that laptop with a 13" screen and a shitload of pixels to work with photos: "Retina is like made for this, it's the best you can get." And yet professional video and photo editors use much larger, higher-end EIZO displays. Why? Because on that 13" 2048x1536 Retina panel they can't see sh*t with their own eyes; they can't see what their filters and other applied effects actually do.
While more PPI at the same physical display size means a higher-resolution display, it also makes those extra details harder for the human eye to perceive. This, of course, differs from person to person, just like the audible frequency range of the human ear differs from person to person. And this eye or ear "resolution" degrades over time as a person gets older. I hate it when the AC/DC adapter near my bed emits that constant high-pitched tone while it's charging a phone. My girlfriend can't hear it, and she's even 4 years younger.
When a video, image, or sound recording gets downscaled, a portion of the information is permanently destroyed. You can't really re-create that information (unless it's stored in some other form); you can only guess, or use methods that improve the accuracy of that guessing (e.g. interpolation). That's what DLSS/FSR/XeSS do. Of course, upscaling algorithms are getting better and better, but you still can't re-create the missing information. You can generate something similar, but it will never be the same as the original, meaning it will never be what the authors intended it to be. That's why I call everything but native fake.
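A toy illustration of that point, as a minimal numpy sketch (this is plain linear interpolation, not how DLSS/FSR/XeSS actually reconstruct frames): throw away three quarters of a signal, interpolate it back, and the fine detail does not come back.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.arange(1000)
# A smooth base signal plus fine, high-frequency "detail"
original = np.sin(x / 40.0) + 0.3 * rng.standard_normal(1000)

downscaled = original[::4]                 # keep every 4th sample: 75% of the data is gone

# "Upscale" back by guessing the missing samples with linear interpolation
upscaled = np.interp(x, x[::4], downscaled)

error = np.abs(upscaled - original)
print(f"mean reconstruction error: {error.mean():.3f}")   # nonzero: the discarded detail is guessed, not recovered
```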
Maybe try taking a 24-bit FLAC recording with rich tonal variety, especially in the lower frequencies, convert it to 320 kbps MP3, then convert it back to 24-bit FLAC and compare both using $200 headphones and a standard DAC. You will notice that the lower tones in particular sound thin or are even missing. When you downscale 24-bit FLAC to MP3 at roughly 7-8 times less bitrate, you lose a lot of information that you can't really recreate by upscaling back to the original resolution. Even with so-called "AI", various filters, etc., it won't reach the quality of the original. It may end up as a similarly sized file as the original recording, but it won't be the same; it will be something different from what the author made it to be.
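If anyone wants to try that round trip without relying only on their ears, here's a rough sketch (Python; assumes ffmpeg plus the numpy and soundfile packages are installed, and the file names are made up for the example) that encodes to 320 kbps MP3, decodes back to FLAC, and compares the magnitude spectra:

```python
import subprocess
import numpy as np
import soundfile as sf

# FLAC -> 320 kbps MP3 -> FLAC round trip
subprocess.run(["ffmpeg", "-y", "-i", "original.flac", "-b:a", "320k", "lossy.mp3"], check=True)
subprocess.run(["ffmpeg", "-y", "-i", "lossy.mp3", "roundtrip.flac"], check=True)

orig, sr = sf.read("original.flac")
rt, _ = sf.read("roundtrip.flac")

# MP3 encoding pads/delays the stream, so trim to a common length and mix down to mono
n = min(len(orig), len(rt))
to_mono = lambda a: a[:n].mean(axis=-1) if a.ndim > 1 else a[:n]
orig, rt = to_mono(orig), to_mono(rt)

# Crude comparison: difference of magnitude spectra (whatever the encoder threw away never comes back)
diff = np.abs(np.abs(np.fft.rfft(orig)) - np.abs(np.fft.rfft(rt)))
freqs = np.fft.rfftfreq(n, d=1.0 / sr)
for lo, hi in [(20, 200), (200, 2000), (2000, 16000), (16000, sr / 2)]:
    band = (freqs >= lo) & (freqs < hi)
    print(f"{lo:>6.0f}-{hi:<6.0f} Hz: spectral difference {diff[band].sum():.1f}")
```

The per-band breakdown shows where the encoder's cuts actually landed, instead of leaving it to headphones alone.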
OTs will just get deleted, and the posters reply-banned.
At the same time, it's really a shame that NVIDIA is allowed to raise their prices as much as they want. We need competition, but AMD and Intel aren't capable of competing... the GPU market is SO bad right now.
1. High idle power? I have no idea what you mean. Do you mean more than 10 watts? 20 watts? I would also ask you to show me a GPU that has better idle power than the 6500 XT.
2. Give me a game, I have plenty of old ones too. Do you mean DOS, or Steam titles like Kingdoms of Amalur or Sleeping Dogs? Maybe Just Cause 2? I have not seen what you describe. Maybe Praetorians?
3. Less frequent updates for older products. Yeah it kind of blows that Vega is no longer getting driver support. Except my 5600G has the latest driver.
The truth is that the narrative glosses over the fact that China was openly buying as many 4090/4080 GPUs as they could, and Nvidia was allowed to count that as part of their numbers. It is like when TW3 Kingdoms launched on Steam: it instantly became the most popular TW title in terms of sales, but TWWH is the real driver of the Total War economy. Then you combine that with the tech media all using 4090s, to the point where you will see comments on TPU like "you can't game at 4K unless you have a 4090/4080" and "the 7900 XTX/7900 XT are not as good at RT, so they are not worth the money." Then you look at prices and realize that sales are down across the board, with MBs and GPUs priced to the moon. Even storage, as volatile as it has been, has seen nothing like the price gouging that Nvidia started. As an example, if the 7900 XT was $450 US, there would be no reason to buy anything else. Where I live that is about the cost of a 4060 Ti. Is a 4060 Ti better than a 7900 XT at anything? Before you answer that, read some GPU reviews on TPU and focus on where the 7900 XT sits on the gaming charts.
These modern reviews also do not use the CPU-intensive games that make your PC cry, like Cities: Skylines 2 or Factorio. It is the first game at 4K where I had to turn on HYPR-RX once my population started reaching 1 million. Try that at 4K high and you will clearly see the separation between CPUs in cores and clock speed. In fact, most games at 4K high on these modern systems are a great way to gauge CPU performance. It let me know that a 7800X3D is not as fast as a 7900X3D in Cities: Skylines 2 at 4K, and neither will it produce as many FPS in racing games like AMS2. Reviews use 4K Ultra, or whatever the highest setting the game allows, and that all but ensures that the GPU does all the work, as the frame buffer will always be on when you allow all the candy.
If you want pure raster AMD is a great choice.
2) Unfortunately for me I haven't seen many examples of games where this happened. I read about it at least a year ago, but I've seen people complaining about it at least 3-4 times, like here: www.overclock.net/threads/rx-6000-cards-are-disgustingly-bad-for-old-games.1805753/ and buildapc/comments/127kaqo and AMDHelp/comments/1bqmkbj
"Nvidia spents lots and lots on game-specific optimizations throughout the years that AMD didn't catch up, and right now it makes no financial sense for AMD to work on the old stuff. Plus higher market share of discrete Nvidia means many lesser-known games only optimized for Nvidia instead of AMD. Overall you're more likely to have a worse experience playing older/less popular games with AMD compared to Nvidia."
3) Not even talking about Vega: when RX 7000 released, many people were angry because updates for RX 6000 slowed down so much... Also unrelated, but Destiny 2 had awful performance issues on AMD in 2020 when the new DLC dropped, and AMD took like MONTHS to fix the abysmally low performance. I had 2 friends with a 5700 XT who complained so much about this...
Also, Nvidia has Dynamic Super Resolution, which allows me to play older games at 5120x2880 instead of 1440p (and when I get a 4K monitor I can play older games in 8K). AMD's Virtual Super Resolution can go above 4K, but they don't even advertise it... in fact, until now I thought it couldn't go above 4K at all...
"As an example if the 7900XT was $450 US there would be no reason to buy anything else" I agree but that will never happen
2. Anecdotal from 3 people, but I am not going to argue. I had a 5600 XT but got a 6800 XT at launch and it has been golden; the same thing happened when I got a 7900 XT. When you think about it, older games have fewer features, so the raw performance of modern PCs should be a huge advantage, not a disadvantage. Take TW Rome and see how fast a modern PC is; the max resolution is 1080p. I seriously have no idea where you get that. Do you mean Hairworks in The Witcher? Maybe you mean PhysX? That died as soon as Nvidia locked you out whenever the program detected an AMD card and refused to work. I could give you the features in CP2077, but to be honest the raw raster performance and what 4K looks like on a modern PC is fine for my eyes and everyone else in my circle. This is anecdotal as well, but my friends with 3080s are not as happy as those with 6800 XTs.
3. Yep, those 3 months to add the 7000 series to the universal stack were so long and had such a negative effect on performance that the entire community wet their panties, while the other 90% of users did not even know anything had happened. We can both use individual games to critique both AMD and Nvidia performance; Hogwarts and Starfield come to mind.
4. AMD is not Intel and has also added lots of things like FreeSync and FSR (as much as it is derided, it is universal), but even before that AMD cards have always been cheaper than Nvidia's.
AMD software is that good, and I know that AMD has advertised it, but again you raised an issue that is a nothing burger.
I am not going to deny that most AMD cards do not sell as well as Nvidia cards, but when the entire narrative is against you and people who have not used AMD cards strongly opine on them, it shows. I will give you an example: how many of the main YouTube channels have done a deep dive on AMD software? How many tech media sites have done a deep dive on AMD software? How many even know what AMD software is today if they only use Nvidia cards in their videos? That is sad, because AMD software today can easily give you whatever you want.

I have been playing a lot of Cities: Skylines 2 lately, and the population on my latest build is over 800,000, with 200 km of train tracks, 35 subway stops and 6 interchanges. A third of the city is office buildings, so there is plenty of traffic from outside. Playing at 4K, it tanked my FPS from the 50s and 60s to the low 20s to high 30s. Well, I went into AMD software, clicked on the icon for CS2, and HYPR-RX was instantly activated. Went back into Cities: Skylines 2 and now we are in the 60s and back in FreeSync range for butter-smooth gameplay. That means fast vehicles on the highways and fast-moving foot traffic at train and subway stations.

AMD are actually trying, and the fine wine is in full effect. Just look at how many people are still happy with their 6800 XT. If I was playing at 1440p I would be happy with a 6800 XT too, but I am an enthusiast (not a negative for those on 1440p), and 1440p was my QNIX monitor from like 14 years ago. I just happen to like AMD, as they align more with my thought process on what the PC should be. No AMD user paid for them to implement FreeSync, and the whole world has benefited, with TVs coming with (VRR) FreeSync technology for the masses to enjoy.
What I do buy is that AMD is changing their position in the market and Steam is not a great indicator of their future. Steam doesn't track Playstation or Xbox consoles. It also skips the Switch...but that's an Nvidia product. I...have to acknowledge that the Switch has outsold the Playstation...but if you look at overall gaming PC sales versus console sales the point of contention is still that consoles outsell gaming PCs. That's...a lot of money.
I see AMD succeeding on the CPU side, and thriving with their APUs...and acknowledge that their next generation of video cards will not be high end. It seems like they are going the Nintendo route, where what they put out is not the best. It is not pushing new features. It is pushing value and profitability for a chunk of the market that they think they can milk, as long as they actually pay it attention. It's the same new logic that UserBenchmark has been called out on, in their ever amusing war against everything AMD. They have to acknowledge that AMD wins in some things, but follow it up by insulting their consumers and AMD's marketing team. They always cite that Intel is better...even when the numbers disagree. AMD evaluated the GPU race and has stepped back from the PC enthusiast market to carve out their niche in consoles...which based upon financials seems to have been a good step.
Before anyone calls it: Nvidia made a better bet by selling their hardware as AI development tools. No question there. I just look forward to the next two generations as Nvidia funnels all of their development there... and the eventual collapse of the power-hungry, glorified LLMs that make up current AI models. AMD will have consumers in the $300 window, but if the 4060 is what Nvidia is willing to do for that consumer, hopefully there will be a reckoning.
It lacks many statistical requisites: sampling methodology (sample size, how participants are chosen), deviations, margin of error, etc.
Thus, its data cannot be taken seriously. After all: "I only believe in statistics that I doctored myself." (Joseph Goebbels)
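For what it's worth, even the pure sampling-error part of that complaint is easy to put numbers on. Here's a minimal sketch (Python; the sample sizes are hypothetical, since Valve doesn't publish one) of the 95% margin of error for a reported share like the RX 6600's 0.98%, assuming a simple random sample, which is exactly the assumption the survey never documents:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% margin of error for a proportion from a simple random sample (normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

share = 0.0098  # the RX 6600's reported 0.98% share
for n in (10_000, 100_000, 1_000_000):  # hypothetical sample sizes -- Valve does not publish this number
    moe = margin_of_error(share, n)
    print(f"n = {n:>9,}: 0.98% +/- {moe * 100:.2f} points")
```

With any plausible sample size the pure sampling error is tiny; the real unknown is how participants are selected, which no margin-of-error formula can correct for.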