
Next Gen GPUs will be even more expensive

So more people are buying AMD CPUs these days, but those same people are also buying Nvidia GPUs because of some mind virus against AMD products? Very weird.

Also, AMD's market cap is like 2.5x that of Intel now; AMD has been investing in more lucrative sectors such as AI/data center, while Intel is stuck making cheap CPUs/GPUs.
I believe @tfdsaf didn't say exactly that.
If I understand it correctly, the point was "even when AMD has better products, the majority of buyers still buy Intel/nVidia".

To be honest, I believe it's true, but not because of some "virus"... It's simply ignorance on the subject.
In my workplace I'm dealing with 50 people every day, and many more less often. I've had discussions about PCs with a number of them throughout two decades.
70% of the time, people know brand names like Intel Pentium, i5, i7 and GeForce, but not AMD or Radeon. And many of those who did hear about the red team don't trust buying it, simply because they don't spend time learning about it. 90% of all buyers just want a PC without getting deep into it. A "yeah, give me that, I know it and it works" kind of thing.

"We" are forgetting that the vast majority of users never clicked on a hardware forum, let alone an analytic review. Its a waste of time for them.
AMD has had the better processors for the past 4 years, and the better-value platform for at least 7 years now, yet Intel is still at a 70% market share. And in 3 years' time, Intel might get back on top and regain 90% market share again.
I think you are forgetting about the mobile market, which is still dominated by Intel-based laptops and the same ignorance I talked about.

If I do a search for laptops on one of my local market/store Amazon-type catalogs, there are 4110 different laptops:

Apple: 396
AMD: 1044
Intel: 2670
 
It's gonna be super interesting when nVidia finally populates that list.
I've never met any normie who knows what a GeForce is, and it's unrelated to the kind of G-Force mentioned at the track.
That's a whole other realm of enthusiasts that are fast af boi.

Maybe I grew up in a better timeline, but Rage/Radeon/Voodoo were kind of the norm.
It wasn't until team green started pumping out FX cards that they had something cool.
 
I agree with the pricing you listed there, though the xx50 seems to be dead and the xx60 is bottom of the barrel; with Nvidia stagnating on the low end, xx60 cards need to have 12GB of VRAM.
Though I don't see pricing changing from what it is, because people showed they were still willing to pay scalpers as cards sold well over MSRP, and people are still willing to pay for a GPU and then criticize anyone not happy with the pricing.

Exactly, and the line only keeps getting thinner as the gap between the high end and the flagship grows wider.

Agreed, the current GPU market is in shambles: team green has a near monopoly, team red left the high end, and, well, Intel is trying, but they have to gain some market share first.
People are rightfully upset, and of course many are going to complain as the hobby becomes unaffordable. If some want to enforce only positivity, I don't see that going over well. And only blaming one side or "team" shows a clear bias for the other "team", so I would suggest people with that thinking try to be more neutral.
I don't think PC gaming is unaffordable. One just needs to shop with their brain and not with their ass. You need to establish your needs and your budget and plan accordingly. Sure, the days of putting the most expensive gear into your basket and clicking pay without a second thought have gone, but that doesn't mean you can't build a decent gaming PC on a budget as long as you're not targeting 4K 120+ FPS.

Correct. However, with a 256-bit bus you can add 16 or 32GB of VRAM, while on a 320-bit bus you can add 10, 20 or 40GB of VRAM. While 32 and 40GB of VRAM are definitely the future, especially for 8K gaming, right now I am happy with something that can stay future-proof for at least the next 4 to 5 years: 20GB. I'm keeping 384-bit bus cards out of it, since they will be out of my budget.
16GB is perfect for now, but since the cards are getting ridiculously expensive, I definitely don't want to buy a new card next year, or the year after, just because I run out of VRAM in future games.
I think that comes down to chip design. Basically, the entire Ada lineup consists of relatively small chips, and I think Nvidia wants to keep it that way with Blackwell (except for the 5090). A wider memory bus would make these chips bigger and more expensive.
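To make the bus-width arithmetic above concrete: GDDR memory hangs one chip off each 32-bit channel, so the bus width fixes the chip count and the per-chip density fixes the total. A minimal sketch, where the chip sizes are illustrative assumptions (4GB standing in for a 2GB clamshell pair sharing one channel):

Code:
# One GDDR chip per 32-bit channel: bus width -> chip count -> VRAM total.
# Chip densities are assumptions for illustration; 4 GB models a 2 GB
# clamshell pair sharing one channel.
def vram_options(bus_width_bits, chip_gb=(1, 2, 4)):
    channels = bus_width_bits // 32
    return [channels * size for size in chip_gb]

print(vram_options(256))  # [8, 16, 32] -> the 16/32 GB configs above
print(vram_options(320))  # [10, 20, 40]
print(vram_options(384))  # [12, 24, 48]

This also illustrates the die-size point: every extra 32-bit channel is more memory-controller silicon on the chip's perimeter, which is why a wider bus makes the chip bigger and more expensive.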

So really, I have 2 kids and my inner child to provide for... I should have made better life choices :laugh:
One of the reasons why I don't want to have kids. :laugh:
 
Sure, the days of putting the most expensive gear into your basket and clicking pay without a second thought have gone
I sure don't remember days like that. Ever since I've been a PC gamer with my own earned disposable income, simply putting the most expensive gear in my basket has never really been an option; I'd be lucky to have one part be the best one. It would be an interesting cost exercise to pick a year, build a best-of-the-best, then add inflation and see how much worse it really is.

Things have certainly slowed down too; I remember a span of several years when I was buying more than one video card per year.
 
I sure don't remember days like that. Ever since I've been a PC gamer with my own earned disposable income, simply putting the most expensive gear in my basket has never really been an option; I'd be lucky to have one part be the best one. It would be an interesting cost exercise to pick a year, build a best-of-the-best, then add inflation and see how much worse it really is.

Things have certainly slowed down too; I remember a span of several years when I was buying more than one video card per year.
Around 2006-07, I sold my Radeon 9600 XT, put some pocket money into it and bought an X800 XT, then a GeForce 7800 GS.
Then in 2012-13, I built an FX-8150 rig with a Radeon HD 7970 using cash left over from my student loan one summer.

Now, as a full-time employee, I couldn't afford a top-level gaming PC without taking on debt or dipping into my savings, which is a terrible idea for mere entertainment and something no one should ever do, imo.

Times have changed.
 
Times have changed
While I agree that's the case, most of those parts you listed aren't the most expensive / best of their respective generations either, especially since, if we're being honest about it, for a very long time the best of the best was 2-4 of the best video cards.

I'd also argue that, to some degree, the best today is a more versatile and feature-rich product.
 
I believe @tfdsaf didn't say exactly that.
If I understand it correctly, the point was "even when AMD has better products, the majority of buyers still buy Intel/nVidia".

To be honest, I believe it's true, but not because of some "virus"... It's simply ignorance on the subject.
In my workplace I'm dealing with 50 people every day, and many more less often. I've had discussions about PCs with a number of them throughout two decades.
70% of the time, people know brand names like Intel Pentium, i5, i7 and GeForce, but not AMD or Radeon. And many of those who did hear about the red team don't trust buying it, simply because they don't spend time learning about it. 90% of all buyers just want a PC without getting deep into it. A "yeah, give me that, I know it and it works" kind of thing.

"We" are forgetting that the vast majority of users never clicked on a hardware forum, let alone an analytic review. Its a waste of time for them.

I think you are forgetting about the mobile market, which is still dominated by Intel-based laptops and the same ignorance I talked about.

If I do a search for laptops on one of my local market/store Amazon-type catalogs, there are 4110 different laptops:

Apple: 396
AMD: 1044
Intel: 2670

Yeah, that's unfortunately the truth, but the tide is turning on the CPU side, and if AMD could outright make a better GPU, or GPUs with killer price-to-performance multiple generations in a row, they might be able to turn the tide.

Nvidia doesn't even try very hard in the sub-$600 market anymore, and yet AMD's offerings are still generally slower. The one exception is the 7700 XT vs the 4060 Ti, but they botched that launch with stupid pricing, and it isn't so much better that it deters buyers from the Nvidia product.

They need to be the clearly better option, not the toss-up, flip-a-coin option, because if that's all they offer, people are going to pick Nvidia 9 out of 10 times.

I'm not sure there is anything AMD can actually do, though; the one card they can play is giving more VRAM, but that and a slightly lower MSRP isn't enough. Just look at how much their cards cave in price over time.

We are definitely in the minority, though. It's fun to debate, throw barbs and argue, but at the end of the day we should all want the same thing: for the hardware to be better and give us more for our money each generation.
 
Spend your money all you want, just don't expect me to praise you like a god. Not everybody who doesn't own a 4090 is envious. Some people just want to live in peace. Not everything is about "other people".
Envious, no, but everybody wants a 4090. If price wasn't an issue, every gamer would have one, no reason not to.
 
While I agree that's the case, most of those parts you listed aren't the most expensive / best of their respective generations either, especially since, if we're being honest about it, for a very long time the best of the best was 2-4 of the best video cards.
SLi/CF didn't exist back in my X800 XT days. That GPU was top of the line (until the X850 XT came out a few months later).

I'd also argue that, to some degree, the best today is a more versatile and feature-rich product.
What does that mean? What versatility and features does a 4090 have that a 4060 doesn't?

Envious, no, but everybody wants a 4090. If price wasn't an issue, every gamer would have one, no reason not to.
No, I don't want one. It's big (I prefer micro-ATX or even ITX cases), too power-hungry and just obnoxious. The same way I don't want a Ferrari, either. Not even for free (unless I can sell it straight away).

Some people (rich people) find it hard to understand, but not everybody craves the most expensive things. Some of us are happy with the small things in life.
 
SLi/CF didn't exist back in my X800 XT days. That GPU was top of the line (until the X850 XT came out a few months later).
You could SLI Voodoo cards if memory serves, then there was a hiatus of course until PCI Express, but for many a year the enthusiast tier was an SLI or CrossFire rig.
What does that mean? What versatility and features does a 4090 have that a 4060 doesn't?
It means a 4090 is a more feature-rich and versatile product than a 7900 GTX, or even a GTX 780 Ti. Times have changed, as you say.
 
You could SLI Voodoo cards if memory serves, then there was a hiatus of course until PCI Express, but for many a year the enthusiast tier was an SLI or CrossFire rig.
Yes, you could SLi Voodoo cards (as SLi was 3DFx's thing, Nvidia just bought it together with the company), but there was nothing during the AGP era, which is when I had my X800 XT. It was the top.

It means a 4090 is a more feature-rich and versatile product than a 7900 GTX, or even a GTX 780 Ti. Times have changed, as you say.
That's just technological progress. Every generation has more features than the last. No need to praise Nvidia/AMD for it, and definitely not a reason to pay more.
 
No, I don't want one. It's big (I prefer micro-ATX or even ITX cases), too power-hungry and just obnoxious. The same way I don't want a Ferrari, either. Not even for free (unless I can sell it straight away).

Some people (rich people) find it hard to understand, but not everybody craves the most expensive things. Some of us are happy with the small things in life.
Not fitting your case is a practical issue; that doesn't mean you don't want one, it just means you can't fit it in the case. Saying it's power-hungry is just borderline silly: you can power limit it to whatever power you desire, and it will be the fastest card at that power, therefore the most efficient, therefore the one to get if you really care about power.

It's not about rich people. I specifically said that if price is no concern (let's say you get a card of your choosing for free, without a resell option), everybody would just get the 4090, no reason not to (I mean sure, there are people here who wouldn't because they hate Nvidia, but that's beside the point and has nothing to do with the card itself).

EG1. Especially regarding the power thing, just FYI, my 4090 needs less power to lock 1440p 120 Hz in POE2 than my 3060 Ti needs. One of the reasons I opted for the 4090 was efficiency ^^.
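For what it's worth, the power limiting described here is a single driver call on Nvidia cards rather than a per-game tweak. A minimal sketch using nvidia-smi's power-limit flag; the 250 W target and GPU index 0 are illustrative assumptions, and the command needs administrator rights:

Code:
import subprocess

# Cap the board power of one GPU via nvidia-smi. The driver holds the
# card to this wattage and boost clocks adapt on their own; values
# outside the card's allowed min/max are rejected.
def set_power_limit(watts, gpu_index=0):
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
        check=True,  # raise if the driver refuses the request
    )

set_power_limit(250)  # e.g. hold a 450 W-rated card to a 250 W cap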

EG2. The Ferrari is a different beast, because it's not the best car for everything, for example, travelling with your family. The 4090 is the best at everything you are going to use a GPU for.
 
We have a better chance of AMD actually competing than of ever getting these prices again. With every node shrink, the price is going to go up; it's probably why the 5000 series is using basically the same node as the 4000 series...

I do think they can probably get away with $2,500 on the 5090, though, due to the 32GB of VRAM more than the actual performance increase, and then replace it a year later with a 5090 Super at $2,000 to offload stock, if they even have any. To maintain the same margins as the 4090, it would need to be around $2,000 from what I've seen, and I doubt they want to take smaller margins on it.

I do wonder at what point even the Nvidia die-hards who think Jensen Huang is their lord and savior start to say enough is enough with the price gouging.

It's cute that you still say 'the node' is the cause of inflated prices - that's bullshit when you know said companies have 50-60% profit margins.

The reality is that demand never ceases. What happened with dGPUs is not that they became oh so costly all of a sudden. EVERY SHRINK is only ever done if there is economic sense to it; that's why 28/22nm lasted so long... And if you look at the margins, well... there is a shitload of wiggle room we are just ponying up, no questions asked. And then we parrot the nonsense that life in foundry land is hard? Lmao
 
It's cute that you still say 'the node' is the cause of inflated prices - that's bullshit when you know said companies have 50-60% profit margins.

The reality is that demand never ceases. What happened with dGPUs is not that they became oh so costly all of a sudden. EVERY SHRINK is only ever done if there is economic sense to it. And if you look at the margins, well... there is a shitload of wiggle room we are just ponying up, no questions asked. And then we parrot the nonsense that life in foundry land is hard? Lmao

No, what I meant is that Nvidia is going to keep its current margins, and if N3 is 50% more expensive than N4, guess what, we will have to pay for it.
 
It's cute that you still say 'the node' is the cause of inflated prices - that's bullshit when you know said companies have 50-60% profit margins.

The reality is that demand never ceases. What happened with dGPUs is not that they became oh so costly all of a sudden. EVERY SHRINK is only ever done if there is economic sense to it; that's why 28/22nm lasted so long... And if you look at the margins, well... there is a shitload of wiggle room we are just ponying up, no questions asked. And then we parrot the nonsense that life in foundry land is hard? Lmao
Genuine question: have the profit margins (specifically for the gaming dGPU market) gone up? Were Nvidia's margins at 30% during Pascal, and have they skyrocketed to 60% nowadays? I'm really curious about that.
 
Not fitting your case is a practical issue; that doesn't mean you don't want one, it just means you can't fit it in the case. Saying it's power-hungry is just borderline silly: you can power limit it to whatever power you desire, and it will be the fastest card at that power, therefore the most efficient, therefore the one to get if you really care about power.
I... do... not... want... one. Which part is so hard to understand? :confused:

I do not want to power limit anything. I'm on Linux, I'm running my AMD card with the kernel-integrated driver, and no 3rd party tool installed. It works great. If a card can't do this, then it's not made for me. Simple as that. Not everybody wants a bazillion tools to control a bazillion things on their gaming rig. Some people just want a decent out-of-the-box experience. Or is this equally hard to understand?

It's not about rich people. I specifically said that if price is no concern (let's say you get a card of your choosing for free, without a resell option), everybody would just get the 4090, no reason not to (I mean sure, there are people here who wouldn't because they hate Nvidia, but that's beside the point and has nothing to do with the card itself).
No, I wouldn't. See my reasons above.

EG2. The Ferrari is a different beast, because it's not the best car for everything, for example, travelling with your family. The 4090 is the best at everything you are going to use a GPU for.
No, it's not. It's not the best at fitting into a m-ATX or ITX case without heating up too much. It's also not the best at providing the best and most efficient out-of-the-box experience on Linux.
 
Genuine question: have the profit margins (specifically for the gaming dGPU market) gone up? Were Nvidia's margins at 30% during Pascal, and have they skyrocketed to 60% nowadays? I'm really curious about that.

With their professional cards likely selling for 80-90% margins, it's hard to know what they are making on each RTX card, but my guess is that with Turing, Ampere and Ada, margins were pretty good. They went with Samsung last generation because it was hella cheap, after all.
 
Doesn't matter when it was; the point is it did not change anything. They had better processors for 4 years and Intel still came out on top. AMD has had the better processors for the past 4 years, and the better-value platform for at least 7 years now, yet Intel is still at a 70% market share. And in 3 years' time, Intel might get back on top and regain 90% market share again.

Right now, the market share should be 60-40 in AMD's favor, because Intel's 13th and 14th series have had major reliability issues, their 12th series was much worse value than its Ryzen counterparts, and their new 200 series is inferior in almost all departments and still very expensive; so you'd think market share would be in AMD's favor.

Same thing with GPUs: AMD has absolutely dominated Nvidia in terms of performance and value and everything, yet Nvidia has kept a 70%+ market share at its lowest, and realistically more like 85%, for over 2 decades.

So if it's not the competition coming up with better products, which I've shown they have multiple times, then you have to admit that it's a mind virus that makes consumers sick and unable to think rationally. They go for their abuser.
AMD 'dominated' nothing at all. They never had the mindshare, never led on feature set, and the price argument was never compelling enough either. AMD had no answer to the GTX 970 or 980 Ti. Those answers came FAR too late, and they didn't cover the whole stack either. They had no answer for ANYTHING in Pascal or Turing. Availability of Nvidia GPUs was almost always better, and when AMD finally did catch up again, they neutered volume production on RDNA2. Then they lost all momentum with RDNA3.

I mean, lol. What do we expect here? It's a miracle they're still here.
 
Yes, you could SLi Voodoo cards (as SLi was 3DFx's thing, Nvidia just bought it together with the company), but there was nothing during the AGP era, which is when I had my X800 XT. It was the top.
Cool, well then yes, I completely agree: based on your single example of one card that was the top at the time, you bought the best one and it's cheaper than the best one today. My point is that from before then until well after, enthusiast-tier setups were also bonkers expensive and essentially unnecessary for most gamers, but many of us still drooled over the idea of running those setups.
That's just technological progress. Every generation has more features than the last. No need to praise Nvidia/AMD for it, and definitely not a reason to pay more.
Ima agree to disagree on that one.
 
I... do... not... want... one. Which part is so hard to understand? :confused:

I do not want to power limit anything. I'm on Linux, I'm running my AMD card with the kernel-integrated driver, and no 3rd party tool installed. It works great. If a card can't do this, then it's not made for me. Simple as that. Not everybody wants a bazillion tools to control a bazillion things on their gaming rig. Some people just want a decent out-of-the-box experience. Or is this equally hard to understand?


No, I wouldn't. See my reasons above.


No, it's not. It's not the best at fitting into a m-ATX or ITX case without heating up too much. It's also not the best at providing the best and most efficient out-of-the-box experience on Linux.
Ok, sorry, I'll change my argument: there is no legitimate reason for someone not wanting a 4090 besides price. Sure, there is the odd person who might not like the box art or doesn't want to click the power limit button, but those are the exceptions.

With their professional cards likely selling for 80-90% margins, it's hard to know what they are making on each RTX card, but my guess is that with Turing, Ampere and Ada, margins were pretty good. They went with Samsung last generation because it was hella cheap, after all.
Yeah, but between Pascal (1080 Ti) and Ada (4090), prices doubled. Did their margins double, or is it just the BOM that skyrocketed? We will never know, I guess.
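To show why the doubled MSRP alone can't settle the margin question: the margin percentage hinges on the BOM, which we can't see. A toy calculation; the launch MSRPs ($699 for the 1080 Ti, $1,599 for the 4090) are the real ones, but both BOM figures are pure assumptions:

Code:
# Margin percentage from price and bill of materials.
def margin_pct(price, bom):
    return (price - bom) / price * 100

# Real launch MSRPs; BOM numbers are made up for illustration.
cards = [("GTX 1080 Ti", 699, 350), ("RTX 4090", 1599, 700)]
for name, msrp, bom in cards:
    print(f"{name}: {margin_pct(msrp, bom):.0f}% margin")
# -> ~50% vs ~56% under these assumed BOMs; pick different BOMs and the
#    story changes, which is exactly why we will never know.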
 
AMD 'dominated' nothing at all. They never had the mindshare, never led on feature set, and the price argument was never compelling enough either. AMD had no answer to the GTX 970 or 980 Ti. Those answers came FAR too late, and they didn't cover the whole stack either. They had no answer for ANYTHING in Pascal or Turing. Availability of Nvidia GPUs was almost always better, and when AMD finally did catch up again, they neutered volume production on RDNA2. Then they lost all momentum with RDNA3.

I mean, lol. What do we expect here? It's a miracle they're still here.

Their biggest mistake with RDNA3 was thinking people would pay $900 for a 7900 XT, other than the die-hard-for-life fanboys. That card could have been a real winner if it had started at the price it ended up at...

They made a similar move with the 7700 XT, to a lesser extent.
 
Genuine question: have the profit margins (specifically for the gaming dGPU market) gone up? Were Nvidia's margins at 30% during Pascal, and have they skyrocketed to 60% nowadays? I'm really curious about that.
Get ready


[attached image: IMG_6774.png]


'Moore's law is dead, we took it to our profit margin now'
Signed, J Huang
 
Ok, sorry, I'll change my argument: there is no legitimate reason for someone not wanting a 4090 besides price. Sure, there is the odd person who might not like the box art or doesn't want to click the power limit button, but those are the exceptions.
I just gave you my legitimate reasons. Do you have a reading comprehension problem or something? :confused:
 
But is that regarding gaming GPUs?
They are all chips out of the same wafers, and gaming GPUs are leftover chips enterprise won't even look at. You tell me :D
 