
Nvidia's GPU market share hits 90% in Q4 2024 (gets closer to full monopoly)

Status
Not open for further replies.
I'm disappointed to find out that AMD used a 5nm process for Strix Halo :( So it will surely have worse efficiency than the M4 series.

Regarding memory bandwidth, the M4 Pro has 273 GB/s (the M4 Max has over 410 GB/s), compared to Strix Halo at 256 GB/s (but with 32 MB of Infinity Cache). AMD had a slide comparing rendering performance:
[Attachment: AMD rendering performance comparison slide]
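Those bandwidth figures are easy to sanity-check, since peak bandwidth is just bus width times transfer rate. A quick sketch; the 256-bit bus widths and LPDDR5X-8000 / LPDDR5X-8533 memory speeds below are my assumptions about the two chips, not something stated in the thread:

```python
def mem_bandwidth_gbps(bus_bits: int, mt_per_s: int) -> float:
    """Peak bandwidth in GB/s = (bus width in bytes) x (million transfers per second) / 1000."""
    return bus_bits / 8 * mt_per_s / 1000

# Assumed configurations: 256-bit LPDDR5X on both chips.
strix_halo = mem_bandwidth_gbps(256, 8000)  # LPDDR5X-8000
m4_pro = mem_bandwidth_gbps(256, 8533)      # LPDDR5X-8533

print(f"Strix Halo: {strix_halo:.0f} GB/s")  # Strix Halo: 256 GB/s
print(f"M4 Pro:     {m4_pro:.0f} GB/s")      # M4 Pro:     273 GB/s
```

Both land exactly on the quoted numbers, so the 256 vs 273 GB/s gap is simply a memory-clock difference on the same bus width.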

Not sure about GPU/gaming performance, but Strix Halo should be at least 2-3x faster than the Ryzen AI 370, which means it should also be faster than the M4 Pro in graphics.

Overall, this solution is still more efficient than a laptop with a separate CPU and dGPU, and with such strong performance AMD should be able to gain market share with Strix Halo against Intel, Nvidia, and Apple.
Don't forget that this graph pits a 120 W CPU (Strix Halo) against a 40 W one (M4 Pro).
 
Don't forget that this graph pits a 120 W CPU (Strix Halo) against a 40 W one (M4 Pro).
Not sure what TDP the Strix Halo chip was set to, as it's configurable from 45 W to 120 W, but it was surely running at much higher power than the M4 Pro.
 
Not sure what TDP the Strix Halo chip was set to, as it's configurable from 45 W to 120 W, but it was surely running at much higher power than the M4 Pro.
For the LNL comparison they clearly stated it was running at 55 W, same for the llama inference. However, they didn't disclose it for the M4 Pro comparison (not even in the actual footnotes on the slides), so I'm going to assume it wasn't a pretty number to show off.
 
Well, at least Huang adjusted prices correctly for the new generation, and I don't think it's a coincidence; something worked, probably too many angry people. :toast:

Still not sure about raster performance because they are hiding it.
 
So get ready for extreme prices with the new RTX 50 series, and if you think the RTX 40 series was bad/expensive, wait until the new RTX series appears. :) It will probably set new standards for how expensive and slow low-to-midrange GPUs can get. (I wouldn't be surprised if this ends up as the worst Nvidia GPU generation ever overall in terms of price/performance.) And of course all thanks for this go to Nvidia buyers, because this is where they brought us all.
I wasn't that far off from my December 13th prediction. :toast:

Some good key points about the new GPUs from HW Unboxed (main reasons why GPUs are expensive and slow):

 
For low-end and midrange cards, Nvidia is often inferior to AMD. AMD's high-end card is also fairly similar to Nvidia cards at the same price.

And Intel is making progress in the GPU market; their lag is only around 15 months, and they will probably be fully competitive with Nvidia and AMD in 2027.

 
While that may be true for PC, AMD has 100% share in Xbox and PlayStation GPUs. I imagine that is far more profitable than the PC market, so they aren't quite dead and could easily turn it around with some good, cheap hardware you can actually buy.
 
This was a midrange GPU 8 1/2 years ago.

[attached image]
 
Show me how much of that market share is controlled via OEMs.

I recall only about 16% of it is DIY, so uh oh.
 
On a more serious note, you have a point with what you said ^^^ The Radeon RX 7600 is very slow and expensive (not to mention the pathetic RX 6400 and RX 6500 XT). If it had been made on a newer 4nm node, it could have been a success.
AMD can't keep producing old, slow, and expensive graphics cards and hope for sales. No, that's a recipe for disaster.
Nvidia still makes the GT 720 and GT 1030...
 
IMO, I've been thinking for quite a while that despite valid arguments around AMD, there's just a huge cult fanboyism around the Nvidia brand among younger gamer demographics, plus standard consumer behavior: people are easily manipulated by marketing, and they think they got themselves a huge deal just because their GPU scores 5-10 FPS/% higher, along with other insignificant differences that aren't going to make any practical difference in the overall gaming experience or otherwise. I'm talking from the ground up: is there any significant feature difference? Do some mouse buttons not work? Is there a resolution limit? Is there no DisplayPort? Are you limited to 12-bit audio? I mean, come on, the differences are small if you really pull back and make a fair comparison.

It's just a lot of unfortunate circumstances working to AMD's detriment and inflating Nvidia's dominance, perhaps unfairly.

Then there's most likely subjectivity and unequal treatment by pre-built system integrators and manufacturers down the line.
 
Marketing? Who pays attention to that?

That is for sheep, don't be a sheep.

If AMD made a better product, more people would buy it, plain and simple.
 
IMO, I've been thinking for quite a while that despite valid arguments around AMD, there's just a huge cult fanboyism around the Nvidia brand... [snip]
Just look at MSI, a company that openly states it will not make AMD GPUs. It was so bad that even the 6800 XT was never announced by them, just quietly launched. The worst was probably the Claw: everyone else releases handhelds with AMD APUs, but they marketed the Claw as being able to compete with the Ally, with better battery life than the Steam Deck. It's so bad that, because of the 5070 Ti backlash, there was no weekly livestream from them for the first time in years, even though the 5070 Ti was supposed to be its focus.
 
IMO, I've been thinking for quite a while that despite valid arguments around AMD, there's just a huge cult fanboyism around the Nvidia brand... [snip]

Most of AMD's wounds are self-inflicted, out of sheer stubbornness and the stupidity of those who run the company's executive suite.

Like their latest pea-brained idea of deciding not to sell reference-design MBA RX 9070 series cards. Why?

 
What's the need for reference designs? Third-party coolers are usually quite a bit better.
And it's not really so much about poor decisions; Nvidia has an almost cult-like following, and people tend to do the lazy thing and go with the masses.
 
What's the need for reference designs? Third-party coolers are usually quite a bit better.
And it's not really so much about poor decisions; Nvidia has an almost cult-like following, and people tend to do the lazy thing and go with the masses.

Nvidia buyers are either normies (hence such an extreme market share) or enthusiasts whose needs AMD's products can't satisfy.

To sell products to normies, you use marketing. Something AMD is positively terrible at. And to sell products to extreme enthusiasts, you need to pull an RTX 5090. Hopefully without the bugged cores and house fire connectors.

Now let's not pretend that the first-party MBA cards haven't been favorites for generations, because they have. Especially with RDNA 2, MBA cards sold well, even through AIBs.

Not fielding an MBA product gives the first-party market... again, to Nvidia. It's not hard to figure this out. If there's any cult, it's those who keep pretending things are fine, that the experience with an AMD card is currently equivalent, and who will spend an unreasonable amount of time defending it all, even when 9 out of 10 buyers are choosing the competition.
 
If AMD made a better product, more people would buy it, plain and simple.
That didn't happen the last two times AMD made better, cheaper cards than Nvidia.
 
That didn't happen the last two times AMD made better, cheaper cards than Nvidia.
To me they lacked in a couple of areas... RT, I guess, but mostly compute. This GPU has about a billion points on it from F@H.

And where I live, they were more expensive than Nvidia when I was buying. That is a big ask to experiment with.
 
To sell products to normies, you use marketing.
Normies don't buy graphics cards, they buy a console, gaming laptop, or at the most a pre-built gaming PC.
Now let's not pretend thats the first party MBA cards haven't been favorites for generations, because they have.
They haven't; finding an AMD MBA card has been rare since RDNA 2, and the AIB cards are usually much better in cooler quality and performance.
Not having an MBA product fielded gives the first party market... again, to Nvidia.
How exactly does not having an MBA card give the first party market to Nvidia? The Nvidia FE cards are pretty much unobtainium unless you're a tech influencer or lucky enough to live near a Microcenter.
if there's any cult it's those who keep pretending things are fine, that the experience with an AMD card is currently equivalent and will spend an unreasonable amount of time defending it all, even when 9 out of 10 buyers are choosing the competition.
If you want to call people who simply want a GPU to play games on, and don't give a sh*t about pointless marketing, a cult, then sure, go ahead. Nvidia has a cult-like following, much like Apple. The cult are the ones buying Nvidia regardless of melting power connectors or planned obsolescence in the form of insufficient bandwidth and VRAM.
And it's quite the opposite here anyway, seeing Nvidia users spend an unreasonable amount of time defending their favorite multi-trillion-dollar company over things like proprietary feature lock-in, which has only damaged the entire gaming market, while always criticizing competition they have no intention of ever buying.
 
That didn't happen the last two times AMD made better, cheaper cards than Nvidia.

Because... AMD hasn't made a better card than Nvidia in recent memory. They came close with RDNA 2 and Navi 21, but the last time they truly had a GPU that beat the pants off NV's flagship while being cheaper was with Hawaii back in 2013 (290X vs. Titan).

390X: Grenada was a memory-doubled re-release of Hawaii; it offered absolutely nothing new. Most of its proposal hinged on value, and on the fact that it was a very good card vs. the GTX 980 despite being earlier-generation technology.
Fury X: Despite the new tech behind the Fiji core, 4 GB hampered the card, and it was poorly marketed ("4 GB HBM = 12 GB GDDR5"). Driver support was dropped unceremoniously, six years to the day.
Polaris: GTX 980-like gaming performance, years behind. Its appeal to gamers hinged solely on price, and it was eventually mass-marketed to cryptocurrency miners. "Wait for Vega."
Vega and VII: "Poor Volta" pretty much got bodied by its Pascal contemporaries, especially the 1080 Ti. "Wait for RDNA."
5700 XT: Hilariously problematic silicon, with black-screen issues that took driver devs a full year to debug and work around, and several hardware steppings issued throughout its production run. It still got bodied by Turing; sure, it was far more affordable, but it's down-level hardware that doesn't qualify for DirectX 12 Ultimate. You can argue Turing + RT isn't worth it all day; that won't change.
6900 XT: Great card, sadly released under horrible market conditions. It was far more of a hit than a miss, and I'd say the best thing AMD has released since Hawaii.
7900 XTX: Buggy silicon, undercooked and poorly supported by the drivers. The "fine wine" was really AMD taking a couple of years of driver releases to get it running more or less as intended, while owners were subject to ridiculously shoddy QA slip-ups like the Anti-Lag+ VAC ban incident.

I just can't follow this "better, cheaper cards than Nvidia" line because, frankly, it hasn't happened in practically 12 years.

Normies don't buy graphics cards, they buy a console, gaming laptop, or at the most a pre-built gaming PC.

They haven't, finding an AMD MBA card has been rare since RDNA2, and the AIB cards are usually much better in cooler quality and performance.

How exactly does not having an MBA card give the first party market to Nvidia? The Nvidia FE cards are pretty much unobtainium unless you're a tech influencer or lucky enough to live near a Microcenter.

Normies very much do buy GPUs and build gaming PCs. It's not nearly as gatekept as one would argue; stuff is made easy to access nowadays, and there are lots of tech resources to help newcomers, from forums to Discord to YouTube.

MBA cards have been available even here in Brazil. In fact, you can still buy new in box, AMD-branded 6750 XT's here. Nvidia FE availability is bad, sure, I'll agree. But they do have them available.


As for the cult... I won't go there. It's just needless flinging.
 
To me they lacked in a couple of areas... RT, I guess, but mostly compute.

Right, an area of little practical significance in overall computing, development, administration, management, web browsing, movie watching, music listening, and most likely gaming experience... unless all you do in gaming is slow-motion, frame-by-frame pixel peeping for the sake of the argument.

Resourcefulness, efficiency, approach, and a bit of patience can get you a long way on something weaker and cheaper. When other factors and bottlenecks are involved, like human interaction, higher performance gives you less and less return, until all it buys you is edge-case convenience. Take 3D development, for example: would a faster GPU make your 3D design business more successful? It may, it may not. If you kick off a render of a scene or animation at the end of the day and go to sleep, having to ship it in the morning, does it matter when during the night the render finishes? A faster GPU might be done at 3:00 in the morning, a slower one at 6:50... and you've still got 10 minutes to spare before you actually need the finished product.

So it can make no difference... you take a slightly longer lunch break and a chat in a restaurant, the slower GPU finishes rendering anyway, and you get away with it.
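To put numbers on the overnight-render point, here's a quick sketch. The frame count, per-frame times, and deadline are hypothetical, chosen only to land on the 3:00 / 6:50 finish times mentioned above:

```python
from datetime import datetime, timedelta

def finish_time(start: datetime, frames: int, secs_per_frame: float) -> datetime:
    """When an overnight render job completes, given a per-frame render time."""
    return start + timedelta(seconds=frames * secs_per_frame)

# Hypothetical job: 600 frames kicked off at 22:00, due at 07:00 next morning.
start = datetime(2025, 2, 21, 22, 0)
deadline = datetime(2025, 2, 22, 7, 0)

fast = finish_time(start, 600, 30)  # fast GPU, 30 s/frame: done at 03:00
slow = finish_time(start, 600, 53)  # slow GPU, 53 s/frame: done at 06:50

print(fast, fast <= deadline)  # 2025-02-22 03:00:00 True
print(slow, slow <= deadline)  # 2025-02-22 06:50:00 True
```

Both GPUs make the deadline; the faster one just finishes while you're still asleep, which is the diminishing-returns argument in a nutshell.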

Am I really using my 1 gigabit internet connection efficiently? Am I really getting that much better an experience than I was 10 years ago on 30 megabit?
 
Right, an area of little practical significance in overall computing, development, administration... [snip]
I don't render anything but frames in game. I do fold proteins though.

A typical assumption.

Feel better?
 
AMD's problem isn't even the performance, the performance might not be the absolute highest but it doesn't have to be. Even the outstanding software issues wouldn't count as much IF they knew how to do marketing, after all, if the hardware is in use, the software will eventually catch up because software follows hardware around. The problem is that AMD simply doesn't know how to market their products. Ryzen's successful market penetration was simply because the product has proven itself to be too bloody good; not because AMD cleverly marketed it and struck deals to make it all happen.

Look at Nvidia. Weekly press releases about new games on GeForce Now. For almost every high-profile game release, they also put out another PR statement about how this cool new game runs great on GeForce (with a day-one game-ready driver to back that statement up, usually released a week in advance), plus a complete showcase of what their technology can do to improve the experience for gamers. The few outstanding problems are fixed at a lightning-quick pace. In addition to the mainstream game-ready drivers, there are branches aimed at both graphics developers (the experimental Vulkan branch) and content creators (Studio drivers). They offer a whole ecosystem of AI features to back up their claims about their cards' capabilities - an easy-to-set-up LLM chatbot, a canvas application that generates a landscape out of a few scribbles, a whole broadcasting platform, etc. - all of these things add value to their products and maximize their marketable value.

On the other hand, AMD is always struggling to get anything shipped, let alone stable, and eventually releases lukewarm clones of NV features that, as we've seen recently, are lazily and shoddily implemented and can even get you banned from online games, because the implementation is so sloppy and badly thought out that it triggers anti-cheat software. And this actually got greenlit and shipped :kookoo:

They're kind of like Xbox - again, the Xbox is a pretty decent platform... it's just that Microsoft doesn't have the slightest clue how to market and maintain it, which pretty much concedes almost the entire market to Sony and their PlayStation. The irony is that the PlayStation is actually a worse console and platform than the Xbox, but it has high-value production games, exclusivity deals, and marketing backing it up - which just proves how important this question is.

AMD: release the card with a stupidly high MSRP → get bad reviews → lower the price in the real market.
Nvidia: release the card with a fake low MSRP → get great reviews → actual real-market price is much higher.

They have been doing this for YEARS, yet AMD still cannot understand this basic issue. I agree marketing is their worst part.

Also, nobody presses on the real problem of Nvidia driver overhead in any way; only Hardware Unboxed has ever had the balls to cover it in depth in their articles. The reason is that you'll get a lot of dislikes if you bring it to light, since the average gamer is an Nvidia fanboy. They would still get a 3050 with a crappy i5 for their prebuilt instead of an RX 7600 at the same price, which is much better for both GPU and CPU performance.
 
I don't render anything but frames in game. I do fold proteins though.
I personally know quite a lot of 3D artists and modelers. Literally not a single one uses an AMD card. It's not even a point of discussion in the industry; anyone unironically suggesting Radeons for Blender/Maya/3ds Max work would be laughed out of the room. One might say that's just vendor lock-in, mind share, whatever, but when you look at something like this, yeaaaaah:
[attached images]
 