
Report Suggests "Extreme" Stock Limits for GeForce RTX 5090 & 5080 GPUs in Germany

The Nvidia shit show continues.

"Look what we can make!"

But we don't want to make too many at once because if you really want one, you are going to have to sacrifice something that you really like.

And we really enjoy seeing our dedicated, loyal customers suffer while we reap the incredible benefits that have been owed to us for so long.

So, to our beloved customers, we just want to say "Fuck You".

A little dramatic, mate :laugh:

But like I said, this could very much be attributed to AMD and the 9800X3D too... Neither company does this on purpose, IMHO. Good products will always experience high demand. I think the only company that ever managed a steady launch-day supply was Apple, and that's because they stockpile units and stagger global releases. For example, the iPhone 15 Pro had 7 global release dates, and that's with their insane vertical integration and B2B/B2C capability, using a node pretty much to themselves, etc.



Besides, I think we are jumping the gun a bit. Stuff isn't out yet; no point in being outraged beforehand.
 
OK!

So maybe I was being a tad dramatic... I had just woken up and was having my first sips of coffee.

Bad agent, bad.

Sorry :)
 
Fools and Money shall be parted.
Episode 4.
In this episode: a repeat of all you've seen before, but likely worse. A culmination of human idiocy, built on impatience and ego.
I am that fool! :D

If I can get the 5090 FE at MSRP and sell my Zotac Amp Extreme Airo 4090 then it's about a £1k upgrade, so could be worse.
 
Nvidia wants people to buy low-end cards like the 3050 and 3060 (4060, 5060, etc.), then renew them each year when another milking card is released.
This way you subsidize Nvidia to make and sell AI video cards to companies. You get a 5% graphics performance increase; Nvidia gets the money.
 
I am that fool! :D

If I can get the 5090 FE at MSRP and sell my Zotac Amp Extreme Airo 4090 then it's about a £1k upgrade, so could be worse.

Yeah, once my 4080 goes I should be able to recover at least 40% of the value of the upgrade, and that's assuming I manage to grab a ROG Astral. The rest goes onto my credit card and I'll pay a little off each month.
 
It isn't surprising if stock is going to be limited for consumers and Nvidia decides to sell to corporations instead. Lying about gaming supply when it isn't being sold to the intended customer should be illegal, but as already mentioned, Nvidia hasn't been punished for it.
And Nvidia has been saying "f#&$ you" to their gaming customers for several GPU generations; it's interesting that some have finally realized it.
 
Nvidia wants people to buy low-end cards like the 3050 and 3060 (4060, 5060, etc.), then renew them each year when another milking card is released.
This way you subsidize Nvidia to make and sell AI video cards to companies. You get a 5% graphics performance increase; Nvidia gets the money.
At this point, Nvidia is just releasing those low-end cards as a subsidy to get people to jump into the CUDA ecosystem and get used to it during their professional lives.
Yeah, once my 4080 goes I should be able to recover at least 40% of the value of the upgrade, and that's assuming I manage to grab a ROG Astral. The rest goes onto my credit card and I'll pay a little off each month.
You'll be missing out on that sweet Pix discount most of our stores have.
 
That's fine if you are just a gamer who can make do with their previous graphics card.
If you need a new machine because your previous one no longer works and you can't get anything, that's another problem.

And you can't get anything? How is that in any way a realistic scenario... come on, man, what the heck.
 
I am that fool! :D

If I can get the 5090 FE at MSRP and sell my Zotac Amp Extreme Airo 4090 then it's about a £1k upgrade, so could be worse.
Oh, but if you can nab one AT MSRP... that's not fool's territory. What I just don't get is people paying inflated prices because they can't wait. They make life harder for everyone else, too.

For myself, I just take it one step further and always lag about one gen behind; so when AMD announces 9000-series CPUs, I buy the 7000 series at a decent price. Similarly, when Nvidia introduced a better 1080, I bought the not-better version at a very good price. It's great: you pay the real price, the early-adopter issues are known and/or fixed, and you can still resell your old stuff at a great price too.
 
This would be epic if Nvidia were playing the scalpers at their own game: letting them buy up stock while having more than enough to serve the public, so the scalpers end up unable to resell at inflated prices.

Justice!
 
Can't you see the reality, people?
AMD is the company that actually cares about gamers!
The RX 9070 XT is already in stock at EU resellers!
We just don't know the specs and price, and they can't start selling lol.

:D
 
OK!

So maybe I was being a tad dramatic... I had just woken up and was having my first sips of coffee.

Bad agent, bad.

Sorry :)
I think you just let your heart speak for once instead of the cognitive dissonance... the veil falls off. It's about time, IMHO.

Don't live in denial... it's a little rough sometimes, but it keeps ya real.
 
I think you just let your heart speak for once instead of the cognitive dissonance... the veil falls off. It's about time, IMHO.
Lol... all I run is Nvidia. But this time I might buy an AMD... but shh. Or I might not :D

And if I don't, there will be many people calling me all kinds of stuff ;)
 
Lol... all I run is Nvidia. But this time I might buy an AMD... but shh. Or I might not :D

And if I don't, there will be many people calling me all kinds of stuff ;)
All things in moderation, they say... perhaps a brand switch can add insight. What are you really missing, or not... nice to discover. If it doesn't work out, you're just one sale away from a replacement. It's one of the main reasons I went for another camp this time. Just exploring and experiencing.
 
All things in moderation, they say... perhaps a brand switch can add insight. What are you really missing, or not... nice to discover. If it doesn't work out, you're just one sale away from a replacement. It's one of the main reasons I went for another camp this time. Just exploring and experiencing.
But I fold (Folding@home) and do some video stuff now and then, and from what I know, AMD cards don't fold well at all.
 
Linus Torvalds was way ahead of his time.


Yep, and Nvidia still treats Linux with disdain. They're getting a little better, but most gaming-oriented Linux distro forums and subreddits are full of tales of woe over Nvidia drivers screwing up.

Back to the topic at hand: this isn't an excuse for Nvidia skimping on VRAM across the board, but most people doing serious AI work need a bare minimum of 64 GB of VRAM. The models have gotten massive, to the point that 32 GB won't cut it.

Obviously that's not going to stop AI Brain Geniuses from buying 5090s simply to generate pictures of naked large-breasted elf girls, but bless them for doing this necessary service to humanity.
 
Linus may be stupid and unable to understand that we have hit the limit of silicon on the hardware side and of raster improvements on the software side.
So you either get DLSS and RT/PT, or you stay where you were in 2018 for the next 15-20 years.
RTX 5000 is proof that, since Nvidia didn't get their 3 nm chips, they had no choice but to release software upgrades and... whatever architectural changes were possible at that point.

Regarding the prices, I find it utterly normal for Nvidia and AMD to ask whatever they want for their GPUs, since this is the most important part of a gaming (and not only gaming) PC.
So many companies sell cases, PSUs, motherboards, etc. at ridiculous prices without any risk or innovation, while these two sell top-notch tech, and what? They should price it affordably? Why should they?
Remember when Intel made quad-cores for a decade?! People legitimately thought this was it. Bigger CPUs weren't for the consumer market, because Intel didn't position them as such.

And then we got Zen, and today we have rumors that Intel is for sale. Do you need bigger writing on the wall?

Now keep in mind we already have chiplet GPUs, and Nvidia is already grasping at straws to sell their product; the marketing machine is booked top to bottom, maximally misleading in their presentation and selling everything EXCEPT raster/raw performance. Huang tells you how amazing it all is, but he's really selling you last-gen hardware with new artificial boundaries. DLSS4 didn't need another hardware release.

And you believe this. Painful... being so gullible. Especially when the competition produces nearly the same thing, or is getting there with not much more than a little extra time.

Here is a news flash: salesmen lie to you.
 
Oh well, I'm getting my hands on the 5090 on the 30th :cool:.

All the necessary hardware swaps have been done too: 13700KF --> 9800X3D, Corsair HX850 --> HX1200i.
 
Remember when Intel made quad-cores for a decade?! People legitimately thought this was it. Bigger CPUs weren't for the consumer market, because Intel didn't position them as such.

And then we got Zen, and today we have rumors that Intel is for sale. Do you need bigger writing on the wall?

Now keep in mind we already have chiplet GPUs, and Nvidia is already grasping at straws to sell their product; the marketing machine is booked top to bottom, maximally misleading in their presentation and selling everything EXCEPT raster/raw performance. Huang tells you how amazing it all is, but he's really selling you last-gen hardware with new artificial boundaries. DLSS4 didn't need another hardware release.

And you believe this. Painful... being so gullible. Especially when the competition produces nearly the same thing, or is getting there with not much more than a little extra time.

Here is a news flash: salesmen lie to you.

And when did these dual-CCD CPUs perform well in games? Never.
Actually, when they managed to put 8 cores in one CCD.

If that doesn't work with CPUs, where 90% of the time in gaming the extra CCD offers nothing, imagine how bad it will be latency-wise in a GPU.

I agree with you, though, that something else has to be introduced, even if that is dual- or triple-CCD GPUs with separate shader, tensor, whatever cores.

Until then, although I don't like paying for software-only updates at all, we're stuck with the leather jacket man's approach.
 
The wording could have been better; to me it sounded like limited OC for AIBs, not low stock in stores.

@kondamin
In the last 20 years I have only "waited" longer on a GPU purchase because I wanted a certain model or a special deal, and in the last 10 years even for those I bought for customer builds.
At least for the US/Germany and their online stores.

One reason I love it when people tell someone to "wait" (short of a new release coming that affects previous-gen pricing).
 
Can't you see the reality, people?
AMD is the company that actually cares about gamers!
The RX 9070 XT is already in stock at EU resellers!
We just don't know the specs and price, and they can't start selling lol.

:D
That card's far too slow :)

One reason they dropped out of the very high end: they can't keep up.
 
And when did these dual-CCD CPUs perform well in games? Never.
Actually, when they managed to put 8 cores in one CCD.

If that doesn't work with CPUs, where 90% of the time in gaming the extra CCD offers nothing, imagine how bad it will be latency-wise in a GPU.

I agree with you, though, that something else has to be introduced, even if that is dual- or triple-CCD GPUs with separate shader, tensor, whatever cores.

Until then, although I don't like paying for software-only updates at all, we're stuck with the leather jacket man's approach.
The same 8 cores that we used to not be able to use... are now used in games, enabling that single-CCD 8-core monstrosity that the competition can't even beat with 3x the power and core count.

Now look at GPUs. We had virtually all GPUs from the midrange on up simply destroy anything we threw at them. And then came RT. And then DLSS. And engine updates. What was really gained here? Or is this a similar brutal inefficiency to what we saw in gaming CPU performance before DX12? We wanted more actors and assets on screen... but neither the API nor the hardware was suited for it. Now it is, and the CPUs destroy all the things again. Granted, no, you can't run everything at 300 FPS. But that's another segment/niche, and way beyond common sense.

Glad you're seeing the parallel there, though, and I don't think you're wrong either; we have what we've got.
 
Remember when Intel made quad-cores for a decade?! People legitimately thought this was it. Bigger CPUs weren't for the consumer market, because Intel didn't position them as such.

And then we got Zen, and today we have rumors that Intel is for sale. Do you need bigger writing on the wall?

Now keep in mind we already have chiplet GPUs, and Nvidia is already grasping at straws to sell their product; the marketing machine is booked top to bottom, maximally misleading in their presentation and selling everything EXCEPT raster/raw performance. Huang tells you how amazing it all is, but he's really selling you last-gen hardware with new artificial boundaries. DLSS4 didn't need another hardware release.

And you believe this. Painful... being so gullible. Especially when the competition produces nearly the same thing, or is getting there with not much more than a little extra time.

Here is a news flash: salesmen lie to you.
From what I've heard a few times, AMD/Nvidia didn't go with the MCM approach that's already being used in the datacenter (chip-to-chip rather than chip-to-cache) because games don't seem to enjoy that kind of setup. But Apple's Ultra chips don't really seem to have an issue with it, even though their interconnect is half the speed of AMD's Infinity Fabric. So I don't really know what's happening there, with AMD even going back to monolithic when it should in theory be more costly.

The analogy with Intel might not work in the GPU space, with AMD seemingly moving in a similar direction as Nvidia when it comes to design: RDNA4 is the last 100% gamer-focused arch we'll be getting; next is a consumer version of their HPC GPUs, which is what Nvidia has been doing for a while.

IMHO, it's not that DLSS4/3 is artificially locked, but more that Nvidia never looked at the problem from the angle of making it work on older hardware, the exact opposite of AMD's FSR3. Even if you hack DLSS3 onto older hardware, it's going to run like crap, while FSR3 just works... not as well as it does on AMD's cards, but better than a hacked DLSS3.
XeSS can work everywhere, but works best on Arc. AMD has been very careful about not saying outright that FSR4 will be available on RDNA3. It might, but it's not guaranteed; anything older than that won't have it.
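To sum up the compatibility picture above, here's a minimal sketch of those claims as a lookup table; purely illustrative, and the FSR4/RDNA3 entry reflects this thread's speculation, not anything AMD has confirmed:

```python
# Rough summary of the upscaler-support claims discussed above.
# The FSR4-on-RDNA3 entry is forum speculation, not confirmed by AMD.
UPSCALER_SUPPORT = {
    "DLSS 3 frame gen": ["NVIDIA RTX 40 series and newer"],
    "FSR 3":            ["runs broadly across vendors"],
    "XeSS":             ["any GPU with DP4a support", "best on Intel Arc"],
    "FSR 4":            ["RDNA4", "RDNA3 (unconfirmed)"],
}

for upscaler, hardware in UPSCALER_SUPPORT.items():
    print(f"{upscaler}: {', '.join(hardware)}")
```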
 
From what I've heard a few times, AMD/Nvidia didn't go with the MCM approach that's already being used in the datacenter (chip-to-chip rather than chip-to-cache) because games don't seem to enjoy that kind of setup. But Apple's Ultra chips don't really seem to have an issue with it, even though their interconnect is half the speed of AMD's Infinity Fabric. So I don't really know what's happening there, with AMD even going back to monolithic when it should in theory be more costly.

The analogy with Intel might not work in the GPU space, with AMD seemingly moving in a similar direction as Nvidia when it comes to design: RDNA4 is the last 100% gamer-focused arch we'll be getting; next is a consumer version of their HPC GPUs, which is what Nvidia has been doing for a while.

IMHO, it's not that DLSS4/3 is artificially locked, but more that Nvidia never looked at the problem from the angle of making it work on older hardware, the exact opposite of AMD's FSR3. Even if you hack DLSS3 onto older hardware, it's going to run like crap, while FSR3 just works... not as well as it does on AMD's cards, but better than a hacked DLSS3.
XeSS can work everywhere, but works best on Arc. AMD has been very careful about not saying outright that FSR4 will be available on RDNA3. It might, but it's not guaranteed; anything older than that won't have it.
I think a big part of it has to do with the difficulty of scaling up performance. With too much data to move across the interconnect(s), the latency hit gets in the way of performance. To counteract that you need more power, a bigger chip, or both, and the power budget is already fully spent maximizing chip output while minimizing size... so what you'll get is bigger chips, and they cost more. High-performance GPUs don't just render more demanding graphics; we also want them to render simpler graphics at very high FPS, i.e. ultra-low latency. The interconnect may not be that dynamic.
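As a rough illustration of that latency argument, here's a back-of-the-envelope sketch; every figure in it is an assumption picked for the example, not a measured value:

```python
# Back-of-the-envelope: how much of a high-FPS frame budget could
# die-to-die traffic eat? All numbers are illustrative assumptions.
target_fps = 240
frame_budget_us = 1_000_000 / target_fps       # ~4167 us per frame

hop_latency_us = 0.5          # assumed one-way die-to-die hop latency
crossings_per_frame = 2_000   # assumed serialized cross-die transfers

overhead_us = hop_latency_us * crossings_per_frame
share = overhead_us / frame_budget_us
print(f"Frame budget: {frame_budget_us:.0f} us")
print(f"Interconnect overhead: {overhead_us:.0f} us ({share:.0%} of budget)")
```

The shape of the math is the point: the higher the target FPS, the smaller the frame budget, so the same fixed interconnect cost swallows a bigger share of every frame.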
 