Friday, February 9th 2024
Widespread GeForce RTX 4080 SUPER Card Shortage Reported in North America
NVIDIA's decision to shave $200 off its GeForce RTX 4080 GPU tier has caused a run on retail since the launch of SUPER variants late last month—VideoCardz has investigated an apparent North American supply shortage. The adjusted $999 base MSRP appears to be an irresistible prospect for discerning US buyers—today's report explains that, "a week after its release, GeForce RTX 4080 SUPER cards are not available at any major US retailer for online orders." At the time of writing, no $999 models are available to purchase via e-tailers (for delivery)—Best Buy and Micro Center have a smattering of baseline-MSRP cards (including the Founders Edition), but for in-store pickup only. Across the pond, the AD103 SUPER's supply status is a bit different: "On the other hand, in Europe, the situation appears to be more favorable, with several retailers listing the cards at or near the MSRP of €1109."
The cheapest custom GeForce RTX 4080 SUPER SKU, at $1,123, seems to be listed by Amazon.com. Almost all of Newegg's product pages are displaying an "Out of Stock" notice—ZOTAC GAMING's GeForce RTX 4080 SUPER Trinity OC White Edition model is on "back order" for $1,049.99, while the only "in stock" option is MSI's GeForce RTX 4080 SUPER Expert card (at $1,149.99). VideoCardz notes that GeForce RTX 4070 SUPER and RTX 4070 Ti SUPER models are in plentiful supply, which highlights a stark contrast in market conditions across NVIDIA's latest Ada Lovelace families. The report also mentions an ongoing shortage of GeForce RTX 4080 (non-SUPER) cards going back weeks prior to the official January 31 rollout: "Similar to the RTX 4090, finding the RTX 4080 at its $1200 price point has proven challenging." Exact sales figures are not available to media outlets—it is unusual to see official metrics presented a week or two after a product's launch—so we will have to wait a little longer to find out whether demand has far outstripped supply in the USA.
Source:
VideoCardz
95 Comments on Widespread GeForce RTX 4080 SUPER Card Shortage Reported in North America
And of course money from various sectors is being recycled to prop up others ~ like crypto > stocks > commodities > bonds > property et al. That usually wasn't possible at the rate it is today. It's like a never-ending cycle of chasing (high) returns at all costs!
You keep implying there are massive sales of gaming cards for AI (as quoted above), but the data clearly shows that's not the case. Only the top-end cards this generation are a good idea for hobbyist AI. You can do AI on cards with less VRAM, but for reasons stated earlier you will be limited. A 4080 SUPER is terrible value compared to the 4090 for AI. Not only is the performance difference between the two much greater in AI workloads (with the 4090 pulling much further ahead) than in gaming workloads, the VRAM severely curtails what the 4080 SUPER can do. The 4080 SUPER is going to be restricted to small last-gen models purely by virtue of its VRAM limits. Why in the world they didn't up the VRAM to 20 GB likely comes down to greed, but it needs to be several hundred dollars cheaper to make any sense for hobbyist AI. People do not drop that kind of money and only expect to be able to dabble in last-gen stuff. At $1,200-1,300 you might as well get a 4090. You could make do with a 4080 SUPER, but for AI it's comparatively worse value than a 4090 and has the limitations stated above that further kill its value for the hobbyist AI segment.

The 4080 has been NVIDIA's worst-selling 4000-series card this generation. I don't see any reason why a refresh with performance, VRAM, and price essentially identical to the 4080 would suddenly cause a surge in demand. Nothing has changed, other than perhaps NVIDIA pulling more and more supply towards the enterprise. It's still selling for $200+ above MSRP. That doesn't make any sense given you could have picked up 4080s for over a year now off eBay for around $950-1,000 including warranty if you buy MSI or ASUS. Even then, though, I don't think it's a great deal. It needs to be $750-850 at this point in the life cycle. If it were a 20 GB card it'd be a slightly different story.
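To put rough numbers on the VRAM ceiling, here's a minimal back-of-the-envelope sketch in Python. The bytes-per-parameter figures are the usual rules of thumb for model weights only; activations and KV cache add more on top, so real requirements run higher:

```python
# Rough weights-only VRAM estimate for local LLM inference.
# Rule-of-thumb figures; activations and KV cache add several GB on top.
def approx_vram_gb(params_billions: float, bytes_per_param: float) -> float:
    # params_billions * 1e9 params * bytes_per_param bytes / 1e9 = GB
    return params_billions * bytes_per_param

for name, params in [("7B", 7), ("13B", 13), ("34B", 34)]:
    fp16 = approx_vram_gb(params, 2.0)  # fp16/bf16 weights
    q4 = approx_vram_gb(params, 0.5)    # ~4-bit quantized weights
    print(f"{name}: ~{fp16:.0f} GB at fp16, ~{q4:.1f} GB at 4-bit")
```

On those numbers, a 34B model quantized to 4-bit (~17 GB) fits in a 4090's 24 GB but not in the 4080 SUPER's 16 GB, which is exactly the gap being described.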
I just checked my country's prices, and indeed, they are higher than the original 4080, with only the most expensive models available, which are far more expensive than the original 4080 has ever been over the last 12 months.
Nice work nGreedia.
1. Saying that their product is so damn good and awesome... because it sold out... so it must be good!
2. By having a limited launch, whatever stock is left on the shelf (and/or AIB models) can and most likely will be sold at higher prices, and it sets pricing expectations for future products.
I will never understand people who fight against the idea of increasing VRAM amounts of video cards. VRAM is cheap, enables devs to do more with their games, and increases card longevity. Whether it benefits you at this very moment is an extremely narrow way to look at things.
The other problem I have with additional VRAM is just how much power it consumes. The 3090 I used to have, which had 24 memory chips, idled around 100 W when using multiple monitors. 90% of that power is from the RAM, since it has to run at full speed with multiple monitors. The 4090 is much better at idle due to having half the number of RAM chips and only uses about 55 W in the same scenario.
I'm sure you will reply with something akin to "reasonable extreme scenario," but at the end of the day it's just a subjective definition that'll conform to whatever you need it to be to justify your narrative. Again, that's backwards logic: software conforms to hardware, not the other way around. Both GamersNexus and Hardware Unboxed have already thoroughly called out the nonsense argument that a given card is not powerful enough to utilize a larger VRAM buffer. That's fundamentally not how VRAM works. More VRAM allows a card to store more graphics data regardless of its raw performance. A card with little raw horsepower might not get amazing FPS, but 30 FPS is a LOT better than the same GPU memory-starved at 12 FPS with 1 FPS 1% lows. No, your 3090 consumed a lot of power because your specific setup prevented your specific GPU from going into a lower power state.
The single-monitor idle for the 3090 is a mere 15 W (which, by the way, is lower than that of the 4090), and the multi-monitor idle is 30 W.
I have no idea how you saw your 3090's high idle consumption and assumed that, out of everything, the VRAM was at fault, when high idle power consumption with multi-monitor setups is a well-known issue. You essentially brushed aside everything else that could have been the culprit to try to push a false narrative.
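For what it's worth, this is easy to check on your own machine instead of arguing from a one-off anecdote. Here's a minimal sketch using the nvidia-ml-py (pynvml) bindings, assuming an NVIDIA driver and that package are installed; a memory clock pinned at maximum while the desktop idles is the usual sign of the multi-monitor issue:

```python
# Minimal sketch: sample GPU power draw and memory clock at idle.
# Assumes an NVIDIA driver plus the nvidia-ml-py (pynvml) package.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

for _ in range(5):
    mw = pynvml.nvmlDeviceGetPowerUsage(handle)  # reported in milliwatts
    mem = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
    # A memory clock stuck at maximum on an idle multi-monitor desktop is
    # the classic symptom of the GPU refusing to enter a low-power state.
    print(f"power: {mw / 1000:.1f} W, memory clock: {mem} MHz")
    time.sleep(1)

pynvml.nvmlShutdown()
```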
The fact is that most of the AAA games on the market NOW require 12 GB, and given that devs are holding back, especially with UE5-based games, I'd say 16 GB will be the new minimum for AAA games at high settings within 6 months.
In 2 years' time, 16 GB will be holding back some games at high settings. But we all know that the consumer slop which nGreedia will be serving us in 2 years' time will all be 16 GB, with only the 5090 having 24 GB or more. I'd be shocked if the 5080 has more than 16 GB next year.
For games, the real limiting factor is the consoles; that's what devs target first and foremost.
There are two games in W1zzard's test suite that, at 4K resolution and ultra-high settings, require more than 12 GB. If you're playing at ultra-high 4K, you aren't running a midrange card.
- 4090
- 4080 SUPER
- 4080
- 4070 Ti SUPER
- 4070 Ti
- 4070 SUPER
- 4070
- 4060 Ti
- 4060
Smack in the middle
This is the exact memory chip used on the RTX 4080: at $55,700 for a 2,000-unit bulk order, the per-unit cost works out to $27.85.
$27.85 * 8 = $222.80 in memory IC costs alone.
www.digikey.com/en/products/detail/micron-technology-inc/MT61K512M32KPA-21-U/17631914
The same situation applies to the slower 21 Gbps chip used in the 4070 Ti and 4090, at $25.31 per unit in bulk:
$25.31 * 6 = $151.86 in memory costs alone
Not only does the cost add up, but factor in PCB costs, power-delivery component costs, etc., plus losses and the fact that everyone along the chain needs a profit, yada yada; you get what I'm getting at. I'm fairly sure the BoM for a 4070 Ti must be somewhere in the vicinity of $500 as a rough estimate. The rest makes up for profit, distribution costs, driver and software development costs, etc.
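The arithmetic above is trivial to sanity-check; here's a tiny sketch reproducing it. The chip counts follow from the 16 GB and 12 GB configurations using 2 GB GDDR6X modules, the per-chip prices are the Digi-Key bulk figures quoted above, and the ~$500 BoM is the rough guess stated here, not a sourced number:

```python
# Reproduce the memory-IC cost math from the Digi-Key bulk prices above.
def memory_cost(price_per_chip: float, chips: int) -> float:
    return price_per_chip * chips

# RTX 4080: 16 GB = 8 x 2 GB modules; RTX 4070 Ti: 12 GB = 6 x 2 GB modules.
rtx_4080 = memory_cost(27.85, 8)
rtx_4070_ti = memory_cost(25.31, 6)
print(f"RTX 4080 memory ICs:    ${rtx_4080:.2f}")     # $222.80
print(f"RTX 4070 Ti memory ICs: ${rtx_4070_ti:.2f}")  # $151.86
```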
Not that it matters; your comment is trying to imply that it's unreasonable to expect such a card to be used for 4K gaming. $700 is a truckload of money. No matter how you spin it, that's not midrange, nor is it unreasonable to expect it not to be limited by VRAM at 4K ultra. Strange how AMD, despite being the budget option, can always include a truckload of VRAM.
AMD adds VRAM because they have to stand out somehow, it's not a big secret. Also, they've been discontinuing driver support for their previous generation hardware mercilessly as of late. That alone would justify a cheaper asking price, IMHO.
So how am I supposed to interpret this? Nvidia needs to stand out as well now? Why?
Nah, that's not it. Nvidia is simply penny-pinching and crippling the VRAM on their cards intentionally. They've been at it since the early 2000s; that's why their cards have always had these bizarre VRAM configurations: 384 MB, 768 MB, 896 MB, 1.2 GB, 1.5 GB, etc.
AMD isn't trying to stand out as much as Nvidia is trying to skimp on VRAM, that much is clear.