
Nvidia's GPU market share hits 90% in Q4 2024 (gets closer to full monopoly)

Which no one apparently noticed, haha. I do wonder if AIBs like Dell got these cards too.

Probably. It definitely caused them to lot-check the whole laptop GPU batch, which was delayed as a result. NV claims laptops are immune so far, though. The Verge covered it.
 

Monster Hunter (record player counts on Steam, and the new game customers are asking me to build for other than Warzone) sees a gigantic performance dip with 12 GB cards: the 9070 is 30% faster than the 5070, while all 16 GB+ cards are neck and neck with each other.
 
Just like AMD... all you have to do is look around, lol.
Not really; at 10% market share they're not exactly doing superb.
Nvidia is the default to go to, no matter if the product is actually good or not.
 
And why is that?
AMD failing to actually launch products without making a clown show of themselves, for one. The 9070 series is actually the first good, solid launch without many caveats that they've had (availability aside) since… maybe Hawaii? Arguably OG Polaris? That, and a lacking software ecosystem making their cards unusable by default for much of anything apart from playing vidya and, later, mining. Meanwhile, for all those years NV just kind of kept going forward without any real failures and kept advancing both the HW and the SW side. Which made them the “default”, yeah. The GPU market is even more cutthroat than the CPU one. It took AMD a while and a shitton of effort to claw itself back with Zen. And Intel was nowhere near as dominant as NV. It will take them a decade of 9070-level bangers to actually meaningfully compete. This is sad, but unfortunately true.
 
Well, I kind of see Nvidia's advancements as more of the same. Nothing has really changed since Turing. Same architecture, same core layout, same everything. Just more of it at a higher price. AMD at least tries.

I'm not saying either of them is good or bad, it's all a matter of personal taste, imo.
 
Take AMD off, put ATi back on, return to Canada. We will fix it.
 
I'm not saying either of them is good or bad, it's all a matter of personal taste, imo.
For unserious stuff like playing vidya? Yeah, honestly, probably. It just doesn’t matter there. For quite literally anything else? There is no taste involved whatsoever, NV is ahead of Radeon by a decade at least. Their Instinct accelerators are a good attempt and ROCm is… going, but they are still far from where they would probably like to be. UDNA is apparently a paradigm shift, so we’ll see where they are going with that.
 
Yeah, really advanced, I see. :rolleyes: Maybe in compute, but definitely not in gaming.
[Chart: relative performance, 2560x1440]

If you say Nvidia is better for gaming, then I'll say you've been drinking a bit too much of their marketing kool-aid, and it'll be a never-ending conversation, so let's leave it at that.
 
@AusWolf
Neither NV nor AMD cares about gaming. It's just that NV is in a position where they can throw scraps into the gaming market and it doesn't affect them in the slightest, while AMD is actually desperate for any relevance, hence the (attempted) aggressiveness with the 9070.

I would again reiterate, as I have done for a while on TPU: the writing has been on the wall since 2007. dGPUs as gaming devices were always going to slowly fade away or become luxury toys, because vidya just isn't that important. The fact that modern games are made by absolute incompetents and require HPC-tier HW to run is on those devs, not on any HW vendor.
 
That much I can agree with.
 
Yeah, really advanced, I see. :rolleyes: Maybe in compute, but definitely not in gaming.
If you say Nvidia is better for gaming, then I'll say you've been drinking a bit too much of their marketing kool-aid, and it'll be a never-ending conversation, so let's leave it at that.
Here in Italy, the lowest price on a 5070 12 GB is higher than the highest price on a 9070 XT. The former can even crash due to low VRAM in a scenario where the 9070 XT is doing 20 fps, making the latter virtually infinitely better... even in playable scenarios, 1% lows for 12 GB cards are all over the place. The console VRAM being shared means they actually have 14 GB for video assets (less than 1 GB is for the OS, and a bit more for the game engine, physics and such), so that's the size games are optimized for...
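Purely as a back-of-the-envelope sketch of that budget (the ~1 GB OS and ~1 GB engine reservations are the rough figures from the post above, not official platform numbers):

```python
# Rough VRAM budget for a 16 GB shared-memory console, using the estimates quoted above.
total_shared_gb = 16.0
os_reserved_gb = 1.0      # "less than 1 GB is for the OS" (poster's estimate)
engine_reserved_gb = 1.0  # "a bit more for the game engine, physics and such" (poster's estimate)

assets_gb = total_shared_gb - os_reserved_gb - engine_reserved_gb
print(f"Left for video assets: ~{assets_gb:.0f} GB")  # ~14 GB, i.e. more than a 12 GB card can hold
```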
 
Yeah, really advanced, I see. :rolleyes: Maybe in compute, but definitely not in gaming.
If you say Nvidia is better for gaming, then I'll say you've been drinking a bit too much of their marketing kool-aid, and it'll be a never-ending conversation, so let's leave it at that.
The price is completely irrelevant when talking technologically. The graph you just posted paints a pretty obvious picture of why AMD is lagging behind. This is just in pure raster, the area AMD does best in, and still the fastest card they have is number 6. Again, in an area they are doing well in. Bringing MSRP into it is completely pointless, because price is a thing that can change overnight; jumping 6 positions in performance cannot.
 
The price is completely irrelevant when talking technologically. The graph you just posted paints a pretty obvious picture of why AMD is lagging behind. This is just in pure raster, the area AMD does best in, and still the fastest card they have is number 6.
The 9070 series isn't made for people who want the fastest card available. I'm also not talking about the fastest card in existence. I'm talking about value and technological advancement.

Making the fastest is easy: just put a bazillion shader cores in it, raise the power limit to the sky, and you're done - demonstrated very well by Nvidia Blackwell.

Again, in an area they are doing well in. Bringing MSRP into it is completely pointless, because price is a thing that can change overnight; jumping 6 positions in performance cannot.
In such a rapidly changing environment, all you can compare is MSRP, right? I can't talk about retail pricing in my area and expect that something happening tomorrow won't completely invalidate my point.
 
Making the fastest is easy: just put a bazillion shader cores in it, raise the power limit to the sky, and you're done - demonstrated very well by Nvidia Blackwell.
If that's what you got from Blackwell, then you are severely mistaken. The 5080 doesn't have a bazillion shader cores, nor does it pull as much power as AMD's fastest GPU (the 7900 XTX), and yet it's faster, in an area (raster) that Nvidia isn't even trying to be good at. So clearly, it is very hard; if it were easy, AMD would be doing it and grabbing all the enthusiast sales. But they can't.
 
This thread is old news. The latest report is that AMD and Intel took a little share as Nvidia slipped to 82%.

[Chart: latest AIB GPU market share report]
 
@DAPUNISHER
Yeah, but it's a fairly natural progression at this point: the markets are hungry for GPUs (consumer, professional and enterprise alike) and, essentially, everything and anything that the three players can produce does sell. Or rots on the shelves if priced too high for what it is. I wouldn't be surprised if, with the absolute state of Blackwell on the consumer side, AMD manages to take back some market share. What interests ME is what happens then: does NV still give a singular fuck about the consumer side of things to try and fight them back down, or do they just shrug and keep going as they do, basically throwing said consumers the dregs for inflated prices?
 
If that's what you got from Blackwell, then you are severely mistaken. The 5080 doesn't have a bazillion shader cores, nor does it pull as much power as AMD's fastest GPU (the 7900 XTX), and yet it's faster, in an area (raster) that Nvidia isn't even trying to be good at. So clearly, it is very hard; if it were easy, AMD would be doing it and grabbing all the enthusiast sales. But they can't.
The 5080 is a great example. A similar number of cores to what the 4080 (Super) has, slightly higher power consumption, and similar performance. The only way Nvidia could extract more performance out of Blackwell is by pushing more cores and more power into the 5090. The 5080 and everything below is Ada 2.0 (or rather 1.1).

The 9070 XT achieves 7900 XT-level raster, and better than 7900 XTX-level RT, with far fewer cores. This is what improvement means in my books.
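For scale, a quick sketch using shader counts from the public spec sheets (approximate context only; clocks and architectures differ, so this is not a strict efficiency comparison):

```python
# Stream processor counts from public spec sheets; rough context only,
# since clock speeds and architecture differ between RDNA 3 and RDNA 4.
shaders = {
    "RX 9070 XT": 4096,   # 64 CUs
    "RX 7900 XT": 5376,   # 84 CUs
    "RX 7900 XTX": 6144,  # 96 CUs
}

base = shaders["RX 9070 XT"]
for card in ("RX 7900 XT", "RX 7900 XTX"):
    pct_fewer = (1 - base / shaders[card]) * 100
    print(f"The 9070 XT has {pct_fewer:.0f}% fewer shaders than the {card}")
# ~24% fewer than the 7900 XT, ~33% fewer than the 7900 XTX
```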
 
@DAPUNISHER
Yeah, but it's a fairly natural progression at this point: the markets are hungry for GPUs (consumer, professional and enterprise alike) and, essentially, everything and anything that the three players can produce does sell. Or rots on the shelves if priced too high for what it is. I wouldn't be surprised if, with the absolute state of Blackwell on the consumer side, AMD manages to take back some market share. What interests ME is what happens then: does NV still give a singular fuck about the consumer side of things to try and fight them back down, or do they just shrug and keep going as they do, basically throwing said consumers the dregs for inflated prices?
Many think Nvidia will continue to devote most of the wafers to the much more lucrative enterprise market until (if or when) the AI bubble pops. It gives the other vendors an opportunity to win over gamers. The 50 series has been the worst launch Nvidia has ever had, which speaks volumes about where their focus currently is.

The reason I posted this is: how the hell did TPU fail to report the new data, so that we're discussing it in a 3-month-old thread?
The 9070 XT achieves 7900 XT-level raster, and better than 7900 XTX-level RT, with far fewer cores. This is what improvement means in my books.
It's a page right out of Nvidia's playbook. AMD finally nailed a launch. The day-one reviews were the best they have had in a long time. The MSRP issues are not unique to them either, so it does no real PR damage. Gamers have to be frustrated with all of the IHVs.
 
The 5080 is a great example. A similar number of cores to what the 4080 (Super) has, slightly higher power consumption, and similar performance. The only way Nvidia could extract more performance out of Blackwell is by pushing more cores and more power into the 5090. The 5080 and everything below is Ada 2.0 (or rather 1.1).

The 9070 XT achieves 7900 XT-level raster, and better than 7900 XTX-level RT, with far fewer cores. This is what improvement means in my books.
But you are comparing Nvidia to Nvidia, when the point was that AMD is second fiddle. Lol, man. The 5080 is a small chip that uses less power than the 7900 XTX (and the 9070 XT) and still smacks them both in pure raster, let alone RT. That's an obvious example of why AMD isn't competing and cannot compete with Nvidia.
 
Sure, we can compare Nvidia to AMD, too. $750 vs $600 MSRP. Need I say more?
 
It doesn't even matter at all, because there is no such thing as MSRP; it does not exist. Both products at these current prices are a downgrade from the previous gen (and that's including the fact that the previous gen was already weak for both sides).

When idiots are buying the RX 9070 XT for $/€750+ and the RTX 5070 Ti for $/€900+, there will never be good products.
 
For sure, AMD only looks like they've progressed because RDNA 3 was so far behind to begin with; going by their own performance expectations from the reveal, it wasn't even close to what they expected.

The 4070 Ti Super was doing similar stuff to what we have now, at a similar or even cheaper price, 12-13 months ago.

There is no way around it: this generation is #poop, and it doesn't matter which GPU maker's sticker is on the box.

The 9070 XT has quite a few more transistors than GB203 (which points to it using a slightly more advanced node) and still loses: it's something like 46 billion (GB203) vs 54 billion (9070 XT), yet it's slower than even the heavily cut-down variant.
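Putting those two transistor counts into a ratio (using the round numbers quoted above, so treat it as approximate):

```python
# Approximate transistor counts as quoted in the post above (rounded).
gb203 = 46e9        # GB203, the 5080 / 5070 Ti die
navi_9070xt = 54e9  # the 9070 XT die

premium = navi_9070xt / gb203 - 1
print(f"The 9070 XT spends roughly {premium:.0%} more transistors")  # ~17% more
```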
 
It doesn't even matter at all, because there is no such thing as MSRP; it does not exist. Both products at these current prices are a downgrade from the previous gen (and that's including the fact that the previous gen was already weak for both sides).

When idiots are buying the RX 9070 XT for $/€750+ and the RTX 5070 Ti for $/€900+, there will never be good products.
No one should buy anything much higher than MSRP, that much we can agree on.

The only thing I'd add is that the 5070 Ti isn't even worth it at MSRP. It should come at least $100 lower (at $650 MSRP).
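For what it's worth, the premium implied by the MSRPs thrown around in this thread (a sketch with those round numbers only, not current retail pricing):

```python
# Price premium of the 5070 Ti over the 9070 XT at the MSRPs mentioned in this thread.
msrp_9070xt = 600
msrp_5070ti = 750
suggested_5070ti = 650  # the "at least $100 lower" figure from the post above

print(f"At $750: {msrp_5070ti / msrp_9070xt - 1:.0%} premium over the 9070 XT")      # 25%
print(f"At $650: {suggested_5070ti / msrp_9070xt - 1:.0%} premium over the 9070 XT") # ~8%
```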
 