
Nvidia's GPU market share hits 90% in Q4 2024 (gets closer to full monopoly)

Status
Not open for further replies.
It's very clear Nvidia doesn't care about consumers: increasing prices, shifting the whole product stack around, restricting features to only the latest cards, and monopolizing the market, while reviewers for some reason keep defending a trillion-dollar corporation over its ecosystem-locked features. All of the complaints about a worsening market are valid.

This may come as a shock to you, but Nvidia cares about money, like any other corporation on Earth. Having no real competition makes them behave like shit, but AMD or Intel (who did this for more than a decade) would do the same if they could.
What corporation would make more money selling to one class of buyers and then say, "wait a minute... let's sell for less to those other guys" for no particular reason? You wouldn't do that if you were selling a used card, let alone if you were a corporation.
99% of us would leave our jobs for a better-paying one, and wouldn't stick around for one that paid less, because we should care about *replace consumers with whatever*
 
I am not really in the market for an "AI Card" (duh), but aren't those already really expensive? So in other words, they are already doing that...?
True, they already are, but what I mean is Nvidia isn't doing its customers any favors by selling cards for $2000. Also, going by the 50 series leaks, it seems they're heavily marketing the AI features with "neural rendering". The 5080 and 5090 are the only cards that look to be an upgrade over the previous gen, while the 5070 Ti is just meh, because Nvidia is mainly focusing on AI while gaming consumers get the leftovers unless they spend $2000+. I wouldn't be surprised if Nvidia completely abandons the market below the x70 tier, as the x60 cards keep being mediocre with 8 GB of VRAM, which hasn't been enough for several years.
 
Also going on the 50 series leaks it seems they're heavily marketing the AI features with "neural rendering".
Hmmm, I think that's mostly marketing speak. Isn't AI really interested in huge VRAM pools? Why would they cheap out on VRAM if these cards were meant for AI? Also, they already have professional SKUs for that stuff; why would they muddy their own lineup with "gaming" GPUs? Seems counterproductive.

I think the 8 GB on the 4060 was enough, looking at its relative performance vs the 16 GB model. But it might age poorly; it will be interesting to see. (The main fault of the card is that it's too expensive. But why would they lower the price? It's not exactly bad value in performance per $. We need better value at the lower end, which, for example, the B580 delivers.)
 
what I mean is Nvidia isn't doing its customers any favors by selling cards for $2000.
I mean, they aren't meant to do favors, since it's a company that has to profit. And $2k is still appealing to many customers, especially the ones that will use it for AI, albeit not for gamers.
Isn't AI really interested in huge VRAM pools?
Yes.
Why would they cheap out on VRAM if these cards were meant for AI? Also, they already have professional SKUs for that stuff; why would they muddy their own lineup with "gaming" GPUs? Seems counterproductive.
Two reasons: market segmentation, and still having an entry-level product for people who are getting started.
- Market segmentation is pretty simple: both the 3090 and 4090 topped out at 24 GB; if you needed more for some reason, you'd need to make the jump to a $4k+ product that actually had a slower die. The 3090 and 4090 were actually godsends for people who required more VRAM, since previously your only "entry" option was a Titan RTX for $2.5k, or a jump to a Quadro for way more.
- On the second point, Nvidia only managed to get where they are now by providing consumer-level GPUs with a good API that could do compute without much headache and with amazing performance. This allowed researchers to speed up their tasks considerably, creating tons of new stuff built on top of those GPUs; see AlexNet, probably the most famous paper that made use of GPUs, way back in 2012. Folks can buy a device they can game with, later start to study and get into GPGPU stuff, get used to Nvidia, and just continue in the ecosystem as they become more professional and do more.
 
This is incorrect. Nvidia's margin prior to the AI boom was only up slightly, and Ada represented a margin DECREASE, back to 2019 levels.

Turns out rampant inflation and cost increases mean GPUs go up in price.

Probably another reason they stick with N4P (a 5 nm-class node). TSMC has been increasing wafer costs.
 
AMD needs to make dumb shit up like "super AI visual fidelity", "deep neural NPC AI behavior", "Ultra resolution AI improvement", etc... and it's just complete junk like Nvidia's PhysX, Nvidia's ray reconstruction, etc... Add some minor benefits with those that no one can actually notice, but market it bigger than life. Market it as the next best thing since sliced bread.

That will get the dumb masses to actually buy an otherwise okay-ish product. That is what Nvidia has done with the 2000, 3000 and 4000 series.
 
AMD needs to make dumb shit up like "super AI visual fidelity", "deep neural NPC AI behavior", "Ultra resolution AI improvement", etc...

Nah, AMD's marketing department prefers dumb shit like overpromising and underdelivering on their products; that will get dumb people excited to hype their shit products haha
 
Market it as the next best thing since sliced bread.
I've not got high hopes; they've managed to fumble so much marketing recently, even bread-and-butter stuff around expected performance, to the point of purposely being misleading. But yeah, if they can play into their advantages and market their software stack as being better than Nvidia's, rather than "we have Nvidia features at home", and get the price right at launch, hey, it'd be better than what they've been doing.
 
Low quality post by eidairaman1
Nah, AMD's marketing department prefers dumb shit like overpromising and underdelivering on their products; that will get dumb people excited to hype their shit products haha
You hype dumb shit from team green
 
Low quality post by Hecate91
There is no appeasing the team green mindshare; it always has to be "better" than what Nvidia is doing in order for them to even consider anything but their favorite brand. Even when AMD matches them with similar features, people still find excuses: "But it's not DLSS". It's clear where the bias is, even with reviewers, and those defending Nvidia while they have a monopoly are just gonna buy from Nvidia anyway.
 
I was watching the PC World stream called The Full Nerd. It is obvious that all major media channels do the same thing. One of the members said that the 4090 should be the GPU of the year. Then they spent 20 minutes talking about the Intel B580. People in the chat asked what about AMD, and exactly 10 seconds were spent on the 890M. The kicker is that it was the one who loves handhelds that suggested the 4090.

The truth is so far from the narrative in the GPU space that this thread is part of why they are being investigated. Some of the comments about AMD are so out of touch, with things like driver support and "no performance gains from 6000 to 7000". Now the 5090 will take the place of the 4090 and that will be forgotten by the narrative, just like the 3090. Once AMD catches up, the goalposts are moved again, and people blame AMD for giving them free performance, like how FSR support does not matter what card you have, but of course that meant FSR had to be labelled as worse than DLSS, even though some of the people touting DLSS don't even have access to the latest version.

It has never been as bad as it is on TPU right now; makes me wonder what Twitter looks like. This post may get deleted as it is inferred as pro-AMD. It is funny how people can make baseless claims about AMD and the moderators never chastise them. Media is supposed to be objective, but there is a pattern to tech media, like how everyone reviews the same hardware. The latest case review from TPU was on every other major player's channel. It is also insane how price/performance does not matter to so many people when the reality is that only online retailers stock the 4090.
 
It's just hardware.

Run what you want, run what you can afford, run what you can't afford.

Run what's popular, run what isn't popular. Who really cares in the end?
 
I was watching the PC World stream called The Full Nerd. One of the members said that the 4090 should be the GPU of the year.
I suppose a sort of good thing is that tech channels like PC World are irrelevant compared to many other tech YouTubers, because most people can see through the nonsense of the sponsorships and paid advertisements those channels push. Promoting the 4090 as "GPU of the year" has to be some of the most distorted bias I've seen yet, since the card is unaffordable to like 95% of the gaming market, though it's unsurprising as the flagship cards always get all of the hype. The flagship cards from Nvidia have had problems for a few gens now: the 2080 Ti was affected by artifacting; the 3090 was failing in the game New World, yet everyone blamed the game instead of Nvidia; the 3090 Ti had overheating VRAM because some of it was on the back of the card; and the 4090 was affected by the defective 12VHPWR connector, yet everyone insisted people were plugging it in wrong instead of calling it out as a serious issue. And then Nvidia quietly updated the power connector to one less likely to fail. I recall the media put more attention on AMD for a few defective vapor chamber coolers, much more than the melting power connector ever got.
These reviewers are missing the point entirely, as handheld systems are getting more powerful while being a more accessible gaming platform than a desktop PC. People will pick up a Steam Deck or ROG Ally instead of dropping $4k on a PC with a 4090, as most care more about playing enjoyable games than paying to be a beta tester for buggy games with some shiny RTX thrown in to sell the game.
Yet reviewers will praise the 5090, even though it's rumored to have "AI" ray tracing and new features only available on the 50 series. If a card is over $1000, I expect it to be able to run the latest titles without needing graphics trickery and fake frames, but reviewers and users love Nvidia for feature lock-in while causing the market to stagnate at the low end and midrange. It really should be the complete opposite, because the gaming community would be better off with the same features available on every brand. I haven't seen it this bad here since I've been here, either; interesting that my post gets hit yet someone calling AMD shit doesn't. I'm either being targeted here, or supporters and the mindshare get away with more stuff. Twitter must be an absolute dumpster fire, though I don't use it.
It is indeed disappointing how every reviewer used to focus on price/performance, but that seems to have gone away in recent years as GPU prices have only gone up, and it will probably only get worse as the rift between midrange and high end widens while Nvidia controls the market.
 
Not going to deny that AMD largely made their own bed, but don't discount the power of perception. My favorite example is the RX 6600. It was decisively the best price-performance card of its generation, and even beat Nvidia's offerings in efficiency. It hit all the marks people claim to care about (ignoring RT, which was irrelevant at that performance level), in a graphics card. Yet it got almost as much derision as praise (from what I saw) from commenters, and still sold poorly. Momentum is A Thing. Just look at the response to the B580. Reviewers are almost universally, "This is pretty good!" But witness the amount of negativity about it in the comments on this very site. It doesn't live up to some imaginary Nvidia-derived standard, therefore it is crap. Or something. I don't get it.
Basically said what I've been thinking. I don't take most comments and opinions from people on random sites very seriously, as it's more opinion than fact 99% of the time. I heard at least one person claiming the B580 would be DOA; thankfully it isn't. No matter what, any discussion comparing more than one company leads to a shit-show. It's why I'm glad we've got people here as well who can recognize the things *they* value over a brand or company.
 
Its just hardware.

Run what you want, run what you can afford, run what you cant afford.

Oh, please, if I mention something like running SLI, CrossFire, or mGPU, people become rampant hate-mongers about it & whatnot.
People care so much about what others think about their build that they never build what they want anymore at all.
 
Oh, please, if I mention something like running SLI, CrossFire, or mGPU, people become rampant hate-mongers about it & whatnot.
People care so much about what others think about their build that they never build what they want anymore at all.
It's no different than listening to people scream from the mountaintops about how good AMD is.

No one cares. If they did they would be buying them.

So who cares what people run, it does not affect your system stability in the slightest.
 
I suppose a sort of good thing is tech channels like PC world are irrelevant compared to many other tech youtubers because most people can see through the nonsense of sponsorships and paid advertisements those channels push.
The truth is much different than the narrative. When we were in the depths of Covid, the least expensive way to get into compelling PC gaming was a gaming laptop. The vendors understood that and put 1080p panels on those laptops. That made the 3060 laptop so popular that it showed up on the (once controversial) Steam charts. I also have one. I agree the Ally is basically a Windows PC with some special Asus sauce. For the price, the Ally was 10% the cost of some 4090/14900HX-based laptops.

Unfortunately the problem exists across the entire ecosystem. I will list some investigations on my part.

Robeytech: During a live stream I asked why he never uses AMD GPUs. He actually got triggered and said that AMD systems don't sell well, so he doesn't use them.

KitGuru: Leo did a review of the 9800X3D, and I asked in the comments why they never use AMD GPUs. He responded with the basis of this thread. Another reviewer on KitGuru was reviewing a case and built a PC using the 9800X3D; of course he used a 4070 Ti with the build. I commented that it's too bad a lot of people will never get to see AMD software, as he again was using a 4080. He responded that their Nvidia partners were very good to them.

TPU: I don't remember what card it was; it might have been the 7800 XT review. In that, W1zzard actually listed AMD software as a pro for buying AMD. Then we get the Intel B580 review, and all of a sudden AMD does not matter, even though the 7700 XT is an older 12 GB GPU that, in their very own reviews, blows the B580 away. Of course people are going to tout MSRP like GPUs actually live up to that. I am willing to bet the next round of B580s will cost the same as the 7700 XT, but reviews made it seem like the B580 or Nvidia are the only options in the GPU space.

The truth. I have a cousin and a nephew who are into PC gaming. One had a 2070 and the other had a 3070. I replaced both systems with 6800 XTs that had been used for mining, and they are both blown away by the performance. A couple weeks ago a user on TPU lamented not being able to change his display specs on his PC. He even referenced looking for it in AMD software. I posted a screenshot of the window for changing display specs in AMD software and he was happy to learn something new.

Go on Newegg and filter for the 7900 XT/XTX reviews and wonder where some members on TPU form their opinions. You don't need to go that far though; just go to the 6000/7000 Owners Club and see that plenty of people are enjoying the hell out of their AMD cards. Since I cannot stand Nvidia's business practices it has been all AMD for me: the 6800 was good but the 7950XT was much faster, and Vega was faster than that. 5000 was the same performance for less power, but 6000 was great, and 7000 makes this PC I am typing on the fastest I have ever used. Combine that with X3D and that blows away 4K.

Remember when the journey was about resolution? With these modern LED monitors you can turn the contrast and saturation up to high levels and enjoy colours popping off the screen. Sometimes I finish a race in AMS2 at 4K with 200+ FPS and the car models look spectacular. How about Cities: Skylines 2, which has info on every building, person, vehicle and anything in the game; think about how that latency argument works when my CPU is at 90% but I am still enjoying the game because FreeSync is that good, but it is old so it does not matter to the narrative anymore, even though monitor class is separated by specs.

I am not trying to attack anyone. It is almost Xmas and I have been off for the last 2 weeks. Nvidia can have their 90% but they will never be a monopoly. There are many factors that led to the downturn in GPU sales, but you know the Steam Deck made a dent in dGPU sales for AMD.
 
The 7700 XT's MSRP is almost double that of the Arc B580 (449 vs. 249 USD), that is why W1zz recommends it. Compared to the similarly priced 8 GB RX 7600 ($20 higher MSRP, 269) the Arc B580 offers gamers more for their money. More performance, more memory, etc.
 
The 7700 XT's MSRP is almost double that of the Arc B580 (449 vs. 249 USD), that is why W1zz recommends it. Compared to the similarly priced 8 GB RX 7600 ($20 higher MSRP, 269) the Arc B580 offers gamers more for their money. More performance, more memory, etc.
Thanks for proving my point. The 7600 is $50 cheaper where I live, and the 4060 is $50 more than the 580.
 
Thanks for proving my point. The 7600 is $50 cheaper where I live, and the 4060 is $50 more than the 580.

It is not reasonable to ask a reviewer to take into account every region's market conditions. That is why we have a manufacturer's suggested retail price. MSRP is rarely revisited once a product has launched, especially if the intention is to reduce it. The price of the 7900 XT, for example, is still theoretically $900 instead of the usual $650 you will find it for nowadays. The MSRP is an expected price; anything lower = good for you, but it is pretty much the expected benchmark under a normal supply-and-demand situation. If supply is strained, the product will either be out of stock for an extended period of time or, as we have seen with the 4090, reach stratospheric prices.
 
It is not reasonable to ask for a reviewer to take into account every region's market conditions.

I absolutely disagree with this. You can't have your cake and eat it too. They want a global audience but make no effort to get out of their bubble, and often lead customers in other regions to bad purchases.
 
I absolutely disagree with this. You can't have your cake and eat it too. They want a global audience but make no effort to get out of their bubble, and often lead customers in other regions to bad purchases.

Let's say that there are only 20 regions in the world. Which 20 do you focus on? OK... it's been six weeks, and prices have changed... do you now have to rewrite the review for the new prices in each region? What about supply? Imagine a region where imports are low (e.g. China), or a region where the governmental taxation rate is much higher. Oh wait, you don't have to. What you are requesting is an infinitely evolving, infinitely diverse analysis... one that would require updates so frequently as to be useless.

The reason that stuff doesn't constantly change, and that we compare performance to performance, is to try and mitigate this. If your region has issues getting AMD 7xxx series GPUs, then knowing that the 20% price premium for an 18% better card is as useful as a 20% premium for a 22% better performer. This is beside the obvious bumps up from how a 7700 versus a 4060 performs...etc...
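To put some numbers on that (made-up figures for hypothetical cards, not from any review), here's a quick sketch of why comparing performance premium to price premium still works even when a region's prices shift, as long as the shift hits both cards:

```python
# Toy performance-per-dollar comparison with invented numbers.
# Relative performance is region-independent; only prices change.
cards = {
    "Card A": {"perf": 100, "price": 400},  # baseline card
    "Card B": {"perf": 118, "price": 480},  # 18% faster, 20% pricier
}

def perf_per_dollar(card):
    """Simple value metric: relative performance divided by price."""
    return card["perf"] / card["price"]

for name, card in cards.items():
    print(f"{name}: {perf_per_dollar(card):.4f} perf/$")

# Apply a flat 25% regional tax/markup to BOTH cards: each card's
# perf/$ drops, but the RATIO between the two cards is unchanged,
# so the "which card is better value" conclusion still holds.
taxed = {n: {"perf": c["perf"], "price": c["price"] * 1.25}
         for n, c in cards.items()}

ratio_before = perf_per_dollar(cards["Card A"]) / perf_per_dollar(cards["Card B"])
ratio_after = perf_per_dollar(taxed["Card A"]) / perf_per_dollar(taxed["Card B"])
print(round(ratio_before, 6) == round(ratio_after, 6))
```

Of course, this only holds when regional pricing scales both cards roughly equally; when one brand is scarce or tariffed differently, the ranking genuinely can flip, which is the other side of this debate.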


Imagine for a second your standard applied to this website. It'd require every single driver update to have all the cards rerun... all of the cards compared to all of the other cards (so instead of one 4080 you'd have dozens of models)... all of the cards in history compared to the current crop... and if you don't get it so far, that would mean an ever-increasing load of testing and results which wouldn't be useful for the 90% of people who would have to comb through dozens of variants of the 3060 just to see where the 4060 might place... in multiple graphs, because of how the current 4xxx series tends to match the older 3xxx series in certain cases and not others. It's asking for information overload... where 90% or more of it would be irrelevant.



So, yeah. Referring all of this back to the 90% market share on Steam... I believe this is another false dichotomy of understanding. In the same way that holding reviewers accountable for 100% of available results and 100% of available price comparisons is a stupid metric, I believe treating Steam as the arbiter of Nvidia's dominance is a stupid metric. I'm looking forward to the post-holiday haze, where Intel surges ahead because they actually have a competent offering at pricing that is not eye-wateringly high. Sometimes people on a budget will discover that there's a good alternative... especially as Nvidia's influencer program basically overlooks the old "budget" segment, because they can sell anything with "AI" on the label for another few hundred dollars.
 
That's why each region has their own face, right?

You have the U.S. guys, you have those Aussie blokes, a couple of Brits, an Irish guy.. and on, and on..
 
I absolutely disagree with this. You can't have your cake and eat it too. They want a global audience but make no effort to get out of their bubble, and often lead customers in other regions to bad purchases.

@lilhasselhoffer already covered it; there is no way to completely satisfy every tailored need of a global audience. My addition to the argument is that this is exactly why it is always important for consumers to do their own research and make an informed purchasing decision.
 
Let's say that there are only 20 regions in the world. Which 20 do you focus on?

It's not that difficult to collect prices online for a couple of regions; that's insanely less effort than testing cards across multiple games and settings. We're talking about a couple of minutes compared to countless hours of testing, so it would make no real difference to the workload. That doesn't seem like a valid argument to me. A simple chart would do it; you get charts for everything nowadays that are 99% less relevant to viewers than local pricing.

 