
Nvidia's GPU market share hits 90% in Q4 2024 (gets closer to full monopoly)

4k 60 Hz with half dynamic range, no problem. Full dynamic range only works at 30 Hz, though. For a while, I thought it was my TV's limitation, but then I found out that it works on both of my AMD cards and on the Intel Xe iGPU in my 11700 - it's only on Nvidia that it doesn't work.

Also, my brother's DAC is in constant conflict with the Nvidia drivers, giving him popping sounds in games at random places. Again, something that doesn't happen with AMD.

The only difference is that I'm not roaming the forums shouting "Nvidia drivers are shit" left and right, like some people do with AMD, because I know nuance, and I recognise that my issues are probably one in a million.
Man, what you are saying makes no sense. You think nobody with a Turing card has ever run 4k 60 because of driver issues... Really?
 
No. I'm saying that I can't do it with my system config for some reason. I did say that my issue isn't common, didn't I?

My point is that I could run around yelling "Nvidia drivers are shit" like some people do with AMD, but I don't, because I consider myself more intelligent than that and realise that my problem might not affect other people.
 
Not sure what you guys are doing to your poor computers.

Smooth sailing for years.. decades even.
 
Nothing. It's just Nvidia drivers not wanting to play nice with my TV. I'm not generalising and projecting my problems onto others like the usual AMD bashers do, just stating what should be obvious: nothing is perfect.
 
Must have a lot of Nvidia and AMD shareholders in this thread; the passion with which some defend their favourite is inspirational. Sometimes the best defence is a good offence, too, so if there's no good defence for your brand, just make sure to pick something to attack about the other.

Those poor, poor multi-billion-dollar companies copping an absolute bashing here on TechPowerUp; they need all the volunteer help they can get. I'll light a candle for them both and say a prayer.
 
If you were pointing at me with that, I'll say again: I'm not picking a fight with any company, nor am I projecting my personal problems onto millions of other users who don't share them. Every system is unique, every software environment is unique, so making blanket statements based on personal anecdotes would be poor form. I'm not stooping so low (unlike some others on this forum). I'm not defending or deflecting, either. There's a million reasons to hate basically anything on this planet (multimillion $ companies especially), but what's the point?

If you weren't pointing at me, then carry on, nothing to see here. :)
 
Nothing. It's just Nvidia drivers not wanting to play nice with my TV. I'm not generalising and projecting my problems onto others like the usual AMD bashers do, just stating what should be obvious: nothing is perfect.
But how the hell did you conclude it's the drivers, is my question. Something in your system doesn't wanna play ball, and most of the time the problem is either the cable or the TV. E.g., my Switch dock connects less than 20% of the time with my brand new shiny monitor (34GS95QE), while it connects every single time with my TV.
 
Did you read my post properly? I said that my first guess was the TV. But then, it works perfectly with AMD, and it works perfectly with Intel, but it doesn't work with Nvidia. Why do you think that is?

Why is it that Nvidia cannot be blamed for anything - any issue anyone ever has always has to be due to something else, just not Nvidia? :banghead:
Whereas if I complain that the paint is chipping off of my chassis, (some) people just have a quick check of my system specs, and shoot "bam, it's the bad AMD driver's fault". :banghead:
 
There are lots of things to blame Nvidia for. Not being able to display 4k 60 just isn't one of them...
 
Do you have a Samsung UE50TU7100KXXU television? Have you tested it with graphics cards from all vendors?

Edit: I also didn't say "4k 60". I said 4k 60 with full dynamic range colours. Nvidia has this "half dynamic range" thing in the driver - that's the only thing that works at 4k 60. If I switch it to full, I'll get 4k 30.
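For what it's worth, the raw link-budget arithmetic is consistent with that failure pattern. Here's a rough sketch in Python, using the standard CTA-861 4K timing (4400x2250 total, including blanking) - a plausibility check on the bandwidth numbers, not a diagnosis of what the Nvidia driver is actually negotiating:

```python
# Rough HDMI link-budget check for the 4k 60 issue above. Uses the
# standard CTA-861 4K timing (4400 x 2250 total incl. blanking); this is
# a back-of-the-envelope check, not a statement about driver behaviour.

def tmds_rate_gbps(h_total, v_total, refresh_hz, bits_per_pixel):
    """Required TMDS rate in Gbit/s, including 8b/10b encoding overhead."""
    pixel_clock = h_total * v_total * refresh_hz      # Hz
    return pixel_clock * bits_per_pixel * 10 / 8 / 1e9

full_60 = tmds_rate_gbps(4400, 2250, 60, 24)  # 4k60, RGB 8 bpc
full_30 = tmds_rate_gbps(4400, 2250, 30, 24)  # 4k30, RGB 8 bpc
sub_60  = tmds_rate_gbps(4400, 2250, 60, 12)  # 4k60, 4:2:0 8 bpc

print(f"4k60 RGB 8bpc:   {full_60:.1f} Gbps (needs HDMI 2.0, 18.0 Gbps max)")
print(f"4k30 RGB 8bpc:   {full_30:.1f} Gbps (fits HDMI 1.4, 10.2 Gbps max)")
print(f"4k60 4:2:0 8bpc: {sub_60:.1f} Gbps (fits HDMI 1.4, 10.2 Gbps max)")
```

4k 60 with full RGB needs nearly the whole 18 Gbps of an HDMI 2.0 link, while 4k 30 full RGB and 4k 60 with subsampled colour both fit comfortably in HDMI 1.4's 10.2 Gbps. So if the GPU-TV handshake falls back to HDMI 1.4 rates for whatever reason, the symptom described above is exactly what you'd get.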
 
That's my point: if it takes that specific television, then the problem is either the cable or the TV itself - I already said that. It's the same with my LG monitor; it's very picky with its inputs.
 
How the hell is the TV the problem if it works flawlessly with both AMD and Intel GPUs? :banghead:
 
Well you mentioned the TV, why mention it if you don't think it's part of the issue? Anyways, if you think it's the drivers, sure whatever.
 
Look, I'm not expecting Nvidia, or any other hardware manufacturer to test their products in every imaginable hardware configuration. But if something works, it works, and if something doesn't work, it doesn't work. And my TV doesn't work well with Nvidia, but it does with AMD and Intel just fine. It doesn't matter who you blame, the end result is the same: I have to use an AMD or Intel card if I want 4k 60 with true 8-bit colour.

So yes, I'm blaming the Nvidia driver. And no, I'm not saying that it's a universal issue, just my own. And that's that. If you don't like it, it's not my problem.
 
And then this: Nvidia gaming GPUs an afterthought as AI generates mountains of cash — RTX 50-series shortages mentioned, not explained | Tom's Hardware
It shocks me that it doesn't occur to media commentators that Nvidia intentionally limited supply to ensure continued high pricing. This is a strategy they have clearly cultivated over the last three generations (with the assistance of COVID restrictions). No offence meant to blind people, but blind Freddy could see the neglect and strategy coming.
 

It's pretty apparent by now that they've not really cared about gaming graphics since at least Ampere.
 
The last couple of gens, the chips have been designed for high-performance compute; the gaming cards are essentially made from the leftover chips.

The yields on Ada were low, and on Blackwell they are even lower. If the yields were good, they could use most of the chips for workstation/AI cards and leave the gaming market.

There's lots of leftover chips, and Nvidia have figured out how to sell AI to gamers.
 
Yes, that video is almost 3 years old now. And VRAM requirements go up, not down. They also clearly show how some games, instead of losing performance, render empty spaces - they actually run faster, but look awful.

It seems Nvidia actually made that a thing with the new cards. HUB made a video about it: you can actually use less VRAM with the new software/hardware that comes with the 5000 series.

The last couple of gens, the chips have been designed for high-performance compute; the gaming cards are essentially made from the leftover chips.

The yields on Ada were low, and on Blackwell they are even lower. If the yields were good, they could use most of the chips for workstation/AI cards and leave the gaming market.

There's lots of leftover chips, and Nvidia have figured out how to sell AI to gamers.

Is that how it works? I don't think so - a 5090 is not a leftover from AI cards; they are completely different. The wafers can be the same, but that's another story: by the time you know the yields are bad, you've already committed the wafer to either a gaming card or an AI card. You can't take an H-whatever and turn it into a 5090, I don't think so. Maybe I'm wrong and someone can correct me.
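On the yields point, the sensitivity to die size is at least easy to illustrate. A back-of-the-envelope sketch using the classic Poisson yield model and the standard gross-dies-per-wafer approximation - the defect density is a made-up illustrative number, since TSMC doesn't publish figures for its current nodes:

```python
# Back-of-the-envelope die yield: Poisson defect model
# (yield = exp(-D0 * area)) plus a standard gross-die estimate for a
# 300 mm wafer. D0 is assumed for illustration, not a real TSMC figure.
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Gross dies per wafer, with a first-order edge-loss correction."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, d0_per_cm2):
    """Fraction of dies that catch zero random defects."""
    return math.exp(-d0_per_cm2 * die_area_mm2 / 100)  # mm^2 -> cm^2

D0 = 0.07  # defects per cm^2 -- assumed, for illustration only
for area in (100, 300, 750):  # small die, mid-size die, ~GB202-class die
    gross = dies_per_wafer(area)
    y = poisson_yield(area, D0)
    print(f"{area:>4} mm^2: {gross:>3} gross dies, {y:.0%} defect-free, "
          f"~{gross * y:.0f} good")
```

This doesn't settle whether gaming cards are "leftovers" (the post above is right that the big datacenter dies are separate tape-outs from GB202), but it does show why cut-down salvage SKUs exist: at reticle-class sizes, a big fraction of dies catches at least one defect, and selling those with units fused off beats scrapping them.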
 
Not sure what you guys are doing to your poor computers.

Smooth sailing for years.. decades even.

My computers must be unicorns - I haven't had driver issues on either side since SLI/CrossFire was a thing. Sure, there has been a game here or there that occasionally didn't want to cooperate with one or the other, but those mostly ended up being game engine issues; Pascal on Gears 4 comes to mind...

Occasionally I've had to clean-install my drivers or roll back to an older one, but that's just being a PC gamer. If you want the true plug-and-play experience, a console is the best in that regard - unless their network goes down, of course.
 
It seems Nvidia actually made that a thing with the new cards. HUB made a video about it: you can actually use less VRAM with the new software/hardware that comes with the 5000 series.

Is that how it works? I don't think so - a 5090 is not a leftover from AI cards; they are completely different. The wafers can be the same, but that's another story: by the time you know the yields are bad, you've already committed the wafer to either a gaming card or an AI card. You can't take an H-whatever and turn it into a 5090, I don't think so. Maybe I'm wrong and someone can correct me.
Imo, the problem is that Nvidia has been using the same architecture since Pascal. They just added RT and Tensor cores with Turing, and have been playing around with the ratio of FP/INT-capable cores since then. But everything else is the same. The only performance uplift comes from the higher clock speeds doable on smaller manufacturing nodes. Blackwell is the first architecture that didn't get a node shrink and clock speed bump, and look what happened: Ada 2.0.

At least AMD isn't pretending that their architectures are vastly different by calling them RDNA 1, 2, 3 and 4.
 

I already said it: Nvidia could have a card that doubles the 5090's performance sitting in the closet right now, and I think if I were them, I also wouldn't release it - there is no need to, there is no competition. Eventually, when they need it, they'd already have a card ready to launch.
The abuse in the pricing, that's another story. That's a shitty move, one that people can make them pay for later on - people don't forget those things.
 
Man, the 5090 already has a 750 mm² die, and a TDP of 575 W. How much more do you want for a card that's twice as fast? What makes you think it's doable?
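To put numbers on that objection: assuming, crudely, that performance scales with power at fixed perf/W, and taking the roughly 858 mm² single-exposure reticle limit as the ceiling on die growth (both assumptions, for illustration only):

```python
# Crude sanity check on a hypothetical "2x the 5090" card. Assumes perf
# scales linearly with power at fixed perf/W, and that die area is capped
# by the ~858 mm^2 single-exposure EUV reticle limit. Illustrative only.
die_mm2, tdp_w, reticle_mm2 = 750, 575, 858

print(f"Max die growth on one reticle: {reticle_mm2 / die_mm2:.2f}x")  # ~1.14x
print(f"TDP for 2x perf at today's perf/W: ~{2 * tdp_w} W")            # 1150 W
```

Without a node shrink to lift perf/W, doubling performance means roughly doubling power, and there's barely 15% of silicon headroom left before hitting the reticle.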
 
575 W..

That is about 125 W more than a 6-core X58 system with a solid tune running a 10 GB load in Linpack Xtreme.

Through that little plug, lol. I trust it for what I have, but I don't know if I would trust it for that.
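The worry is easy to put numbers on. A quick sketch assuming the load splits perfectly evenly across the six 12 V pins of the 12V-2x6 connector (in practice it often doesn't, which is the whole concern):

```python
# Per-pin current on the 12V-2x6 / 12VHPWR connector at 575 W, assuming
# a perfectly even split across the six 12 V pins. Real cards can split
# unevenly, which is exactly why people worry about this plug.
watts, volts, pins = 575, 12.0, 6

total_amps = watts / volts   # ~47.9 A
per_pin = total_amps / pins  # ~8.0 A
print(f"Total: {total_amps:.1f} A, per pin: {per_pin:.1f} A "
      f"(vs the ~9.5 A per-terminal rating commonly cited)")
```

Roughly 8 A per pin against a commonly cited ~9.5 A terminal rating isn't much margin, and any uneven split eats into it fast.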
 

Yeah, I'd get one of those Thermal Grizzly cable/power monitors - WireView Pro, I think it's called, lol - and a brand new high-quality cable.

Honestly, anyone spending 2500+ and likely 3k should be able to afford one, lol.

Although I'd immediately drop it to 400-450 W, because above that is really silly for a gaming GPU that isn't making you money.
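For anyone wanting to do the same, the driver exposes a power cap directly through nvidia-smi; here's a minimal sketch via Python (it needs admin rights, and the driver clamps the value to the card's supported min/max range):

```python
# Minimal sketch: cap the GPU's board power using nvidia-smi.
# Requires admin/root; values outside the card's supported power-limit
# range are rejected by the driver.
import subprocess

# Show the current and maximum power limits first
subprocess.run(["nvidia-smi", "--query-gpu=power.limit,power.max_limit",
                "--format=csv"], check=True)

# Cap the card at 450 W (resets on reboot unless reapplied)
subprocess.run(["nvidia-smi", "-pl", "450"], check=True)
```

MSI Afterburner's power-limit slider does the same job if you'd rather not script it.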
 