Friday, February 9th 2024

Widespread GeForce RTX 4080 SUPER Card Shortage Reported in North America

NVIDIA's decision to shave $200 off its GeForce RTX 4080 GPU tier has caused a run on retail since the launch of the SUPER variants late last month, and VideoCardz has investigated an apparent North American supply shortage. The adjusted $999 base MSRP appears to be an irresistible prospect for discerning US buyers; today's report notes that "a week after its release, GeForce RTX 4080 SUPER cards are not available at any major US retailer for online orders." At the time of writing, no $999 models are available to purchase via e-tailers (for delivery). Best Buy and Micro Center have a smattering of baseline MSRP cards (including the Founders Edition), but for in-store pickup only. Across the pond, the AD103 SUPER's supply status is a bit different: "On the other hand, in Europe, the situation appears to be more favorable, with several retailers listing the cards at or near the MSRP of €1,109."

The cheapest custom GeForce RTX 4080 SUPER SKU, at $1,123, appears to be listed by Amazon.com. Almost all of Newegg's product pages display an "Out of Stock" notice; ZOTAC GAMING's GeForce RTX 4080 SUPER Trinity OC White Edition is on "back order" for $1,049.99, while the only "in stock" option is MSI's GeForce RTX 4080 SUPER Expert card (at $1,149.99). VideoCardz notes that GeForce RTX 4070 SUPER and RTX 4070 Ti SUPER models are in plentiful supply, which highlights a stark contrast in market conditions across NVIDIA's latest Ada Lovelace families. The report also mentions an ongoing shortage of GeForce RTX 4080 (non-SUPER) cards going back weeks before the official January 31 rollout: "Similar to the RTX 4090, finding the RTX 4080 at its $1200 price point has proven challenging." Exact sales figures are not available to media outlets, and it is unusual to see official metrics presented a week or two after a product's launch, so we will have to wait a little longer to find out whether demand has far outstripped supply in the USA.
Source: VideoCardz

95 Comments on Widespread GeForce RTX 4080 SUPER Card Shortage Reported in North America

#26
chodaboy19
I got the Asus ProArt RTX 4080 Super OC from B&H today.
#27
R0H1T
Dr. DroThere's a reason they're currently at 1.8 trillion dollars market cap and still going up.
Market cap means very little in the real world ~ there's a reason 2008, 2000, 1984(?) or 1929 happened. Do you also not remember Evergrande or Country Garden? Markets worldwide are sitting on a powder keg, at least IMO, and any one of the potential triggers we see these days could push it over the edge! The bigger issue is that, unlike in the past, the structural flaws that caused many of these boom/bust cycles aren't really being fixed or reformed like they used to be ~ instead they're being "fixed" by pouring in more money :shadedshu:

And of course money from various sectors is being recycled to prop up others ~ like crypto>stocks>commodities>bonds>property et al. This usually wasn't possible at the rate it is today. It's like a never-ending cycle of chasing (high) returns at all costs!
#28
JAB Creations
"Shortage" Ha! Those are some extremely expensive lower-mid-range cards! It took me a week to max out my 16GB card about ~18 months ago. 24GB needs to become the new mid-range GPU RAM capacity.
#29
evernessince
Dr. DroSure, because absolutely everyone buying is buying strictly high-end enterprise gear on a practically unlimited budget, and buying a 4090 totally isn't a valid way to get a cheap and reasonably fast/efficient accelerator for freelance work.
The 4090 is indeed a valid card for AI. After all, I have a rig with a 4090 just for that. That said, it doesn't change the fact that it does not represent a major portion of demand:
Nvidia's data-center segment is up 141% while its gaming segment is up 11%. While yes, some gaming cards are suitable for AI, the VAST majority of the growth in profits is in the data-center/enterprise segment. At 11% growth, I'm not sure there's much basis for arguing that the gaming segment is seeing growth due to AI, given that's subpar even for pre-AI earnings reports.
People are not buying high-end gaming graphics cards in bulk for hobbyist AI, as the data demonstrates. A large number of enterprises are buying high-end enterprise cards, though, with a nearly unlimited budget, and thus we get the numbers above. You may have some outliers, but the data does not lie.
Dr. DroBuying a 4090 or 7900 XTX, or even the 4080/S, for this level of work (which I firmly believe is the vast majority of the market outside of AI-focused enterprise) is a way better investment than, say, buying a previous-generation RTX A4500 or something like that. Being honest, this is where used RTX 3090s start to look particularly juicy.
There was no "level of work" discussed or defined here. You implied that gaming GPUs were a significant driver of demand with the following:
Gaming sector overlaps and its demand is also fueled by AI. They're buying Radeons by the crate for AI too.
There's a reason they're currently at 1.8 trillion dollars market cap and still going up. $NVDA market cap will hit $2 trillion before the month is out. Don't be foolish. There is no intentional restriction of supply by "nGreedia" here; the cards are selling out because they're desirable across every segment.
To which I replied with the sales data showing that the gaming segment is at best performing mediocre, if not under-performing historically, at only 11% growth. Any defining of "level of work" after the fact would amount to moving the goalposts, if that's what you are trying to say here.

You keep implying there are massive sales of gaming cards for AI (as quoted above), but the data clearly shows that's not the case. Only the top-end cards this generation are a good idea for hobbyist AI. You can do AI on cards with less VRAM, but for the reasons stated earlier you will be limited. A 4080 SUPER is terrible value compared to the 4090 for AI. Not only is the performance difference between the two much greater in AI workloads than in gaming workloads (with the 4090 pulling much further ahead), the VRAM severely curtails what the 4080 SUPER can do. The 4080 SUPER is going to be restricted to small last-gen models by virtue of its VRAM limits. Why they didn't up the VRAM to 20 GB likely comes down to greed, but the card needs to be several hundred dollars cheaper to make any sense for hobbyist AI. People do not drop that kind of money and expect only to be able to dabble in last-gen stuff. At $1,200-1,300 you might as well get a 4090.
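To put rough numbers on that VRAM ceiling, a minimal back-of-envelope sketch (illustrative model sizes, assuming weights-only memory at params × bytes-per-param; real use adds KV-cache and activation overhead on top):

```python
# Back-of-envelope VRAM sizing for local LLM inference. Weights alone need
# roughly params * bytes-per-param; KV cache and activations add more.
# Model sizes below are illustrative assumptions, not benchmarks.

def weights_gb(params_billion: float, bytes_per_param: float) -> float:
    """Approximate VRAM (GB) needed just to hold the model weights."""
    # 1e9 params * N bytes/param ~= N GB per billion parameters
    return params_billion * bytes_per_param

for params, label in [(7, "7B"), (13, "13B"), (33, "33B")]:
    fp16 = weights_gb(params, 2.0)  # 16-bit weights
    q4 = weights_gb(params, 0.5)    # 4-bit quantized weights
    print(f"{label}: ~{fp16:.0f} GB at fp16, ~{q4:.1f} GB at 4-bit")

# 7B at fp16 (~14 GB) is already tight on a 16 GB 4080 SUPER once overhead
# lands; 13B+ at fp16 wants the 4090's 24 GB, or aggressive quantization.
```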
Dr. DroCome on, man, it's not an ideal world out there. I'd be surprised if the AI-generated sukebei you get to look at on pixiv was even generated by a guy that has a 4090 to begin with. Probably making do with way less.
You could make do with a 4080 SUPER, but for AI it's relatively less value than a 4090 and has the limitations stated above that further kill its value for the hobbyist AI segment. The 4080 has been Nvidia's worst-selling 4000-series card this generation. I don't see any reason why a refresh wherein performance, VRAM, and pricing are essentially identical to the 4080 would suddenly cause a surge in demand. Nothing has changed, other than perhaps Nvidia pulling more and more supply towards the enterprise.
Darmok N JaladIf it’s a slight performance improvement over the 4080, then they would undoubtedly limit production until those dry up in the market. Curious, does the non-S get the same discount, or is it still selling for $200 more? Perhaps a bit of bait ‘n switch?
It's still selling for $200+ above MSRP. That doesn't make any sense, given you could have picked up 4080s off eBay for around $950-$1,000, including warranty, for over a year now if you buy MSI or ASUS. Even then, I don't think it's a great deal. It needs to be $750-$850 at this point in the lifecycle. If it were a 20GB card, it'd be a slightly different story.
#30
stimpy88
All part of nGreedia's plan to price cut for a week or two, get some good press for once, then push prices higher than the original 4080.

I just checked my country's prices, and indeed, they are higher than the original 4080's, with only the most expensive models available, which cost far more than the original 4080 ever has over the last 12 months.

Nice work nGreedia.
#31
Dave65
evernessinceI really wish the false logic that being sold out = high demand would stop being used. Without knowing the actual sales numbers, a product being sold out tells us nothing of its desirability or the number actually sold.

There isn't a single 4080 SUPER on Amazon's top-selling GPU list.

Most likely because Nvidia is allocating all its wafers to AI chips, not because the 4080 non-SUPER has high sales or is popular. It was the least desirable card in the entire line-up, which just further proves the fallacy of sold out = desirable.
THIS!
#32
Icon Charlie
stimpy88All part of nGreedia's plan to price cut for a week or two, get some good press for once, then push prices higher than the original 4080.

I just checked my country's prices, and indeed, they are higher than the original 4080's, with only the most expensive models available, which cost far more than the original 4080 ever has over the last 12 months.

Nice work nGreedia.
IMHO it is worse than that. I'm getting real sick and tired of these... "launches", because they are not true launches per se, but limited launches. Limited launches are designed for the marketing hype of the "sellout ploy," so they can claim that their product has sold out, thereby...

1. Saying that their product is so damn good and awesome... because it sold out... so it must be good!
2. Ensuring that whatever stock is left on the shelf can, and most likely will, be sold at higher prices, by retailers and/or AIBs, and/or on future products.
#33
The Von Matrices
JAB Creations"Shortage"? Ha! Those are some extremely expensive lower-mid-range cards! It took me a week to max out my 16GB card ~18 months ago. 24GB needs to become the new mid-range GPU RAM capacity.
No game shows any performance improvement with more than 16GB. Even 12GB is more than enough except in edge cases that are way too stressful for the market these cards serve anyway.
"Across all the 100+ game tests ((25 raster + 10 RT) x 3 resolutions), we only identified two cases where 16 GB results in a meaningful improvement over 12 GB: The Last of Us at 4K and Alan Wake 2 RT at 4K."
#34
evernessince
The Von MatricesNo game shows any performance improvement with more than 16GB. Even 12GB is more than enough except in edge cases that are way too stressful for the market these cards serve anyway.
Hardware leads software, not the other way around. It's backwards to imply that because games don't use X amount of VRAM today, X amount of VRAM isn't worthwhile. That kind of logic only perpetuates the stagnation of VRAM capacity. Devs aren't going to make games with VRAM requirements the vast majority of cards can't meet. It's up to the hardware vendors to increase the GPU VRAM allowance so that game devs can then create games that utilize the extra capacity.

I will never understand people who fight against the idea of increasing VRAM amounts of video cards. VRAM is cheap, enables devs to do more with their games, and increases card longevity. Whether it benefits you at this very moment is an extremely narrow way to look at things.
#35
The Von Matrices
evernessinceHardware leads software, not the other way around. It's backwards to imply that because games don't use X amount of VRAM today, X amount of VRAM isn't worthwhile. That kind of logic only perpetuates the stagnation of VRAM capacity. Devs aren't going to make games with VRAM requirements the vast majority of cards can't meet. It's up to the hardware vendors to increase the GPU VRAM allowance so that game devs can then create games that utilize the extra capacity.

I will never understand people who fight against the idea of increasing VRAM amounts of video cards. VRAM is cheap, enables devs to do more with their games, and increases card longevity. Whether it benefits you at this very moment is an extremely narrow way to look at things.
I look at the most extreme scenario at present and then assume that will be the "normal" two years from now. If that's the case, then games won't use more than 16GB two years from now, and current-generation cards will still be limited by their compute power before VRAM capacity becomes the bottleneck. The current-gen consoles having 16GB of memory also helps that argument. There are only three situations I can think of where a mid/high-end GPU was hampered by having too little VRAM at the time it was released: the GTX 570/580, the RTX 2080 Ti, and the R9 Fury X. All other cards were bottlenecked by the speed of their ASIC long before VRAM capacity became an issue.

The other problem I have with additional VRAM is just how much power it consumes. The 3090 I used to have, which had 24 memory chips, idled around 100W when using multiple monitors. 90% of that power is from the RAM since it has to run at full speed with multiple monitors. The 4090 is much better at idle due to having half the number of RAM chips and only uses about 55W in the same scenario.
#36
evernessince
The Von MatricesI look at the most extreme scenario at present and then assume that will be the "normal" two years from now. If that's the case, then games won't use more than 16GB two years from now
"Most extreme scenario"? So like the 52GB VRAM Stable Diffusion will use with with an SDXL model at 1024 x 1024 resolution? Or the 26GB games will use at 8K resolution?

I'm sure you will reply with something akin to "reasonable extreme scenario," but at the end of the day it's just a subjective definition that'll conform to whatever you need it to be to justify your narrative. Again, that's backwards logic; software conforms to hardware, not the other way around.
The Von Matricescurrent-generation cards will still be limited by their compute power before VRAM capacity becomes the bottleneck. The current-gen consoles having 16GB of memory also helps that argument. There are only three situations I can think of where a mid/high-end GPU was hampered by having too little VRAM at the time it was released: the GTX 570/580, the RTX 2080 Ti, and the R9 Fury X. All other cards were bottlenecked by the speed of their ASIC long before VRAM capacity became an issue.
Both GamersNexus and Hardware Unboxed have already thoroughly called out the nonsense argument that a given card is not powerful enough to utilize a larger VRAM buffer. That's fundamentally not how VRAM works. More VRAM allows a card to store more graphics data regardless of its raw performance. A card with little raw horsepower might not get amazing FPS, but 30 FPS is a LOT better than the same GPU, memory-starved, running at 12 FPS with 1 FPS 1% lows.
The Von MatricesThe other problem I have with additional VRAM is just how much power it consumes. The 3090 I used to have, which had 24 memory chips, idled around 100W when using multiple monitors. 90% of that power is from the RAM since it has to run at full speed with multiple monitors. The 4090 is much better at idle due to having half the number of RAM chips and only uses about 55W in the same scenario.
No, your 3090 consumed a lot of power because your specific setup prevented your specific GPU from going into a lower power state.

The single-monitor idle for the 3090 is a mere 15W (which, by the way, is lower than that of the 4090), and the multi-monitor idle is 30W.

I have no idea how you saw your 3090's high idle consumption and assumed that, out of everything, the VRAM was at fault, when high idle power consumption with multi-monitor setups is a well-known issue. You essentially brushed aside everything else that could have been the culprit to try to forward a false narrative.
#37
stimpy88
The Von MatricesNo game shows any performance improvement with more than 16GB. Even 12GB is more than enough except in edge cases that are way too stressful for the market these cards serve anyway.
It's a known fact that some game developers had been asking nGreedia for more VRAM on the 40x0 series but were ignored, so some of them fought back by releasing games that need 12GB to run properly. I applaud them for that.

The fact is that most AAA games on the market NOW require 12GB, and given that devs are holding back, especially with UE5-based games, I'd say 16GB will be the new minimum for AAA games at high settings within 6 months.

In 2 years' time, 16GB will be holding back some games at high settings. But we all know that the consumer slop nGreedia will be serving us in 2 years' time will all be 16GB, with only the 5090 getting 24GB or more. I'd be shocked if the 5080 has more than 16GB next year.
#38
GhostRyder
It's very clear this is just a supply issue, not that the card is super popular. Besides the price drop, it's the same overall performance (give or take 1%) for $200 less; people are not lining up on the street to buy it, there just hardly were any out there. Sales numbers could change minds, but considering how reviews went from every major outlet, I doubt gamers were chomping at the bit for this card.
#39
evernessince
stimpy88It's a known fact that some game developers had been asking nGreedia for more VRAM on the 40x0 series but were ignored, so some of them fought back by releasing games that need 12GB to run properly. I applaud them for that.

The fact is that most AAA games on the market NOW require 12GB, and given that devs are holding back, especially with UE5-based games, I'd say 16GB will be the new minimum for AAA games at high settings within 6 months.

In 2 years' time, 16GB will be holding back some games at high settings. But we all know that the consumer slop nGreedia will be serving us in 2 years' time will all be 16GB, with only the 5090 getting 24GB or more. I'd be shocked if the 5080 has more than 16GB next year.
Yep, GN queried some developers anonymously about 2 years ago, and they replied that optimizing for 8GB as the midrange was extremely resource-intensive and increasingly infeasible. We were stuck at 8GB as the average for far, far too long.
#40
Vya Domus
One of the reasons Nvidia is stingy with VRAM is to prevent what's been happening with the 4090: they want you to pay an arm and a leg if you want to use their cards for any serious ML work, and to buy Quadros or rent their data-center GPUs instead.

For games, the real limiting factor is the consoles; that's what devs target first and foremost.
#41
trsttte
stimpy88It's a known fact that some game developers had been asking nGreedia for more VRAM on the 40x0 series but were ignored, so some of them fought back by releasing games that need 12GB to run properly. I applaud them for that.

The fact is that most AAA games on the market NOW require 12GB, and given that devs are holding back, especially with UE5-based games, I'd say 16GB will be the new minimum for AAA games at high settings within 6 months.

In 2 years' time, 16GB will be holding back some games at high settings. But we all know that the consumer slop nGreedia will be serving us in 2 years' time will all be 16GB, with only the 5090 getting 24GB or more. I'd be shocked if the 5080 has more than 16GB next year.
As soon as the new generation of consoles released with 16GB of VRAM, any graphics card in the supposed "mid range" or above should have at least matched that. It could be somewhat excused in the 30-series, but in the 40-series it makes no sense; anyone buying a 4070 SUPER with 12GB is willingly getting screwed by nvidia.
#42
Dr. Dro
Do any of you even remotely believe what you've been saying? It's been proven time and time again that you need to push unrealistic, unreasonable settings to run out of VRAM on 12 GB GPUs today. Certainly settings far beyond the means of an RTX 4070 Ti, and certainly far beyond the means of AMD's equivalents.

There are two games in W1zzard's suite that, at 4K resolution and ultra-high settings, require more than 12 GB. If you're playing at ultra-high 4K, you aren't running a midrange card.
#43
Vya Domus
Dr. DroThere are two games in W1zzard's suite that, at 4K resolution and ultra-high settings, require more than 12 GB. If you're playing at ultra-high 4K, you aren't running a midrange card.
The 12GB 4070 Ti can hardly be considered mid-range at its price.
#44
Dr. Dro
Vya DomusThe 12GB 4070 Ti can hardly be considered mid-range at its price.
For the price, agreed, but:

- 4090
- 4080 SUPER
- 4080
- 4070 Ti SUPER
- 4070 Ti
- 4070 SUPER
- 4070
- 4060 Ti
- 4060

Smack in the middle
#45
trsttte
Dr. DroDo any of you even remotely believe what you've been saying? It's been proven time and time again that you need to push unrealistic, unreasonable settings to run out of VRAM on 12 GB GPUs today. Certainly settings far beyond the means of an RTX 4070 Ti, and certainly far beyond the means of AMD's equivalents.

There are two games in W1zzard's suite that, at 4K resolution and ultra-high settings, require more than 12 GB. If you're playing at ultra-high 4K, you aren't running a midrange card.
That's arguable, but this is not even about needing it or not. VRAM is not that expensive and nvidia's profits have been over the moon, yet they're selling you a $600+ card with only 12GB of VRAM in 2024. I don't care if every game this year will be able to run with that; with 12GB at that price, you're getting fleeced hard. $500 consoles launched 3 years ago had 16GB, think about that for a second.
#46
Dr. Dro
trsttteThat's arguable, but this is not even about needing it or not. VRAM is not that expensive and nvidia's profits have been over the moon, yet they're selling you a $600+ card with only 12GB of VRAM in 2024. I don't care if every game this year will be able to run with that; with 12GB at that price, you're getting fleeced hard. $500 consoles launched 3 years ago had 16GB, think about that for a second.
www.digikey.com/en/products/detail/micron-technology-inc/MT61K512M32KPA-24-U-TR/17632186

This is the exact memory chip used on the RTX 4080. At $55,700 per 2,000-unit reel, the per-unit cost comes to $27.85.

$27.85 * 8 = $222.80 in memory IC costs alone.

www.digikey.com/en/products/detail/micron-technology-inc/MT61K512M32KPA-21-U/17631914

The same situation applies to the slower 21 Gbps chip used in the 4070 Ti and 4090, at $25.31 per unit in bulk.

$25.31 * 6 = $151.86 in memory costs alone (six 2 GB chips for the 4070 Ti's 12 GB)

Not only does the cost add up, but factor in PCB costs, power-delivery component costs, etc., plus losses and the fact that everyone along the chain needs a profit, yada yada; you get what I'm going at. I'm fairly sure the BoM for a 4070 Ti must be somewhere in the vicinity of $500 as a rough estimate. The rest covers profit, distribution costs, driver and software development costs, etc.
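That memory-cost arithmetic is easy to sanity-check; a minimal sketch assuming the Digi-Key reel pricing quoted above and 2 GB (16 Gb) modules, with chip counts inferred from each card's capacity rather than from a teardown:

```python
# Sanity check of the memory-cost figures above, assuming Digi-Key's
# 2,000-unit reel pricing and 2 GB (16 Gb) GDDR6X modules; chip counts
# are inferred from card capacity (16 GB -> 8 chips, 12 GB -> 6 chips).

REEL_PRICE_24GBPS = 55_700.00  # quoted reel price, 24 Gbps part (RTX 4080)
REEL_SIZE = 2_000              # units per reel

per_unit_24 = REEL_PRICE_24GBPS / REEL_SIZE
print(f"24 Gbps per-chip cost:  ${per_unit_24:.2f}")       # $27.85

# RTX 4080: 16 GB / 2 GB per chip = 8 chips
print(f"RTX 4080 memory BoM:    ${per_unit_24 * 8:.2f}")   # $222.80

# RTX 4070 Ti: 12 GB / 2 GB per chip = 6 chips of the 21 Gbps part
per_unit_21 = 25.31
print(f"RTX 4070 Ti memory BoM: ${per_unit_21 * 6:.2f}")   # $151.86
```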
#47
trsttte
Dr. DroSmack in the middle
That's complete bullshit and you know it ;)
#48
Vya Domus
Dr. DroSmack in the middle
It's really not. I'll remind you that Nvidia originally intended for this thing to be a $900 "4080," most likely with no "SUPER" models (not that they really change anything). They moved the labels around, but that doesn't change what these products were originally intended to be.

Not that it matters; your comment is trying to imply that it's unreasonable to expect such a card to be used for 4K gaming. $700 is a truckload of money. No matter how you spin it, that's not mid-range, nor is it unreasonable to expect it not to be limited by VRAM at 4K ultra.
Dr. DroNot only does the cost add up, but factor in PCB costs, power delivery component costs, etc. - losses, the fact that everyone needs profit, yada yada, you get what I'm going at.
Strange how AMD, despite being the budget option, can always include a truckload of VRAM.
#49
Dr. Dro
Vya DomusIt's really not. I'll remind you that Nvidia originally intended for this thing to be a $900 "4080," most likely with no "SUPER" models (not that they really change anything). They moved the labels around, but that doesn't change what these products were originally intended to be.

Not that it matters; your comment is trying to imply that it's unreasonable to expect such a card to be used for 4K gaming. $700 is a truckload of money. No matter how you spin it, that's not mid-range, nor is it unreasonable to expect it not to be limited by VRAM at 4K ultra.

Strange how AMD, despite being the budget option, can always include a truckload of VRAM.
I'm well aware, and like I said, I agree: $700 is a lot of money, but it remains the middle option in Nvidia's product stack. We just have to accept that the days of a $699 flagship aren't coming back, as much as we'd want them to.

AMD adds VRAM because they have to stand out somehow; it's not a big secret. Also, they've been mercilessly discontinuing driver support for their previous-generation hardware as of late. That alone would justify a cheaper asking price, IMHO.
#50
Vya Domus
Dr. DroAMD adds VRAM because they have to stand out somehow, it's not a big secret.
The reason this is wrong is that Nvidia adds more memory to their cards all the time, case in point the 4070 Ti SUPER.

So how am I supposed to interpret this? Nvidia needs to stand out as well now? Why?

Nah, that's not it. Nvidia is simply penny-pinching and intentionally crippling the VRAM on their cards. They've been at it since the early 2000s; that's why their cards have always had these bizarre VRAM configurations: 384MB, 768MB, 896MB, 1.2GB, 1.5GB, etc.
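Those odd capacities fall straight out of the memory bus width: each GDDR chip exposes a 32-bit interface, so chip count = bus width / 32, and total VRAM = chips × per-chip capacity. A minimal sketch (the example cards are illustrative matches, not ones named in the thread):

```python
# Why old GeForce cards had "bizarre" VRAM sizes: each GDDR chip has a
# 32-bit interface, so chip count = bus width / 32, and total VRAM is
# chips * per-chip capacity. Example cards are illustrative assumptions.

def vram_mb(bus_width_bits: int, chip_mb: int) -> int:
    """Total VRAM in MB for a given bus width and per-chip capacity."""
    chips = bus_width_bits // 32  # one 32-bit channel per GDDR chip
    return chips * chip_mb

examples = [
    ("192-bit, 64 MB chips (e.g. 9600 GSO)", 192, 64),   # 384 MB
    ("384-bit, 64 MB chips (e.g. 8800 GTX)", 384, 64),   # 768 MB
    ("448-bit, 64 MB chips (e.g. GTX 260)", 448, 64),    # 896 MB
    ("320-bit, 128 MB chips (e.g. GTX 470)", 320, 128),  # 1280 MB (~1.2 GB)
    ("384-bit, 128 MB chips (e.g. GTX 480)", 384, 128),  # 1536 MB (1.5 GB)
]

for label, bus, chip in examples:
    print(f"{label}: {vram_mb(bus, chip)} MB")
```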

AMD isn't trying to stand out as much as Nvidia is trying to skimp on VRAM, that much is clear.