Friday, February 9th 2024
Widespread GeForce RTX 4080 SUPER Card Shortage Reported in North America
NVIDIA's decision to shave $200 off its GeForce RTX 4080 GPU tier has caused a run on retail since the launch of SUPER variants late last month—VideoCardz has investigated an apparent North American supply shortage. The adjusted $999 base MSRP appears to be an irresistible prospect for discerning US buyers—today's report finds that, "a week after its release," GeForce RTX 4080 SUPER cards "are not available at any major US retailer for online orders." At the time of writing, no $999 models are available to purchase via e-tailers (for delivery)—Best Buy and Micro Center have a smattering of baseline MSRP cards (including the Founders Edition), but for in-store pickup only. Across the pond, the AD103 SUPER's supply status is a bit different: "On the other hand, in Europe, the situation appears to be more favorable, with several retailers listing the cards at or near the MSRP of €1109."
The cheapest custom GeForce RTX 4080 SUPER SKU, at $1123, seems to be listed by Amazon.com. Almost all of Newegg's product pages are displaying an "Out of Stock" notice—ZOTAC GAMING's GeForce RTX 4080 SUPER Trinity OC White Edition model is on "back order" for $1049.99, while the only "in stock" option is MSI's GeForce RTX 4080 Super Expert card (at $1149.99). VideoCardz notes that GeForce RTX 4070 SUPER and RTX 4070 Ti SUPER models are in plentiful supply, which highlights a big contrast in market conditions for NVIDIA's latest Ada Lovelace families. The report also mentions an ongoing shortage of GeForce RTX 4080 (Non-SUPER) cards, going back weeks prior to the official January 31 rollout: "Similar to the RTX 4090, finding the RTX 4080 at its $1200 price point has proven challenging." Exact sales figures are not available to media outlets—it is unusual to see official metrics presented a week or two after a product's launch—so we will have to wait a little longer to find out whether demand has far outstripped supply in the USA.
Source:
VideoCardz
95 Comments on Widespread GeForce RTX 4080 SUPER Card Shortage Reported in North America
As the party that holds both the market share and the mind share, Nvidia isn't the one that needs to do something to stand out; it's the other way around.
They are the ones in control of the design. I think even they get surprised by what they're getting away with sometimes; sometimes they push their luck and have to pull back, like the 4060 Ti and the 4080 12GB, but the 4070 seems like a clear example of throwing shit into the market and getting away with it, as it's still selling anyway.
I had a 1080 Ti for nearly 7 years, and it wasn't until the tail end of that period that a few games started showing up that challenged its VRAM. None of this day-1 lacking-VRAM nonsense. That card cost me $650 and was the top-end card of that generation. Now that much money doesn't even get you an xx70 Ti class card; you get a 4070 with a mere 1GB more VRAM after a 7-year period. Historically, VRAM capacity would have doubled two to three times in the same timespan: the 780 Ti had 3GB, the 980 Ti had 6GB, and the 1080 Ti had 11GB. Imagine that, VRAM doubling or nearly doubling gen over gen was pretty normal, and there was a mere $50 price difference across those three generations.
The 1080 Ti also does a good job of demonstrating that if VRAM allowances on cards don't increase, devs won't create games that utilize that additional VRAM. 8GB was the standard that game devs were forced to target for a whopping 7 years, and heck, even today they still have to heavily consider it given the high price tag of 12GB+ cards on the Nvidia side (which sells some 80%+ of all video cards).
Also, lest we forget, another major factor:
"Hardware leads software, not the other way around. It's backwards to imply that because games don't use x amount of VRAM today, x amount of VRAM isn't worthwhile. That kind of logic only perpetuates stagnation of VRAM capacity. Devs aren't going to make games with VRAM requirements the vast majority of cards can't meet. It's up to the hardware vendors to increase GPU VRAM allowance so that game devs can then create games that utilize that extra capacity.
"I will never understand people who fight against the idea of increasing VRAM amounts of video cards. VRAM is cheap, enables devs to do more with their games, and increases card longevity. Whether it benefits you at this very moment is an extremely narrow way to look at things."
Anyway, let's be realistic for a moment here. When the 1080 Ti launched 7 years ago in the Pascal refresh cycle, 11 GB was truly massive. And we all know that the 11 GB figure came about because Nvidia simply removed a memory chip to lower the bandwidth, ensuring that the 2016 Titan X retained a similar level of performance (despite having slower memory) and that there was a significant enough gap between the 1080 Ti and the then-new Titan Xp, which came with the complete GP102 chip and the fastest memory they had at the time. For most of its lifetime you could simply enable all settings and never worry about VRAM.
GPU memory capacities began to rise after the Turing and RDNA generation, but we were still targeting 8 GB for the performance-segment parts (RTX 2080, 5700 XT), and those cards weren't really all that badly choked by it. Then came RDNA 2, which started giving everyone 12 and 16 GB like candy, but that still didn't give them a very distinct advantage over Ampere, which was still lingering on 8-10 GB, and eventually 12 GB for the high end. Nvidia positioned the RTX 3090 with 24 GB to basically assume both the role of gaming flagship and that of a replacement for the Titan RTX, while removing a few perks of the latter and technically lowering the MSRP by $1000 (which was tbh an acceptable tradeoff). We know reality is different, but if we exclude the RTX 3090, then only AMD really offered a high VRAM capacity.
This generation started to bring higher VRAM capacities on both sides, and it's only now, with games designed first and foremost for the PS5 and Xbox Series X (all featuring advanced graphics, new texture streaming systems, tons of high-resolution assets, etc.) starting to become part of gamers' rosters and benchmark suites, that we've begun to see that capacity utilized to a bigger extent. IMO, this places cards like the RTX 3080 in a bad spot, but is it really a deal breaker? With the exception of a noteworthy game I'm about to mention, we've yet to see even the 3080 fall dramatically behind its AMD counterpart despite having 6 GB less.
I agree, more VRAM is better. But realistically speaking, it's not a life-or-death situation, at least not yet, and I don't think it'll be one for some time to come, especially if you're willing to drop textures from "ultra" to "high" and not run extreme raytracing settings. Perhaps the only game where 10-12 GB has become a true limitation at this point is The Last of Us, which seems to use VRAM so aggressively that it's not going to perform well on a GPU with less than 16 GB of VRAM, as evidenced by the RX 6800 XT winning out against the otherwise far superior 4070 Ti.
Frankly, a game that *genuinely* needs so much VRAM probably has more than a few optimization issues, but I digress. We're not "fighting against memory increases"; I'm just making the case that VRAM amounts are currently adequate, if not slightly lacking across most segments, and as you can see, even then the aforementioned scenario involves 4K and very high graphics settings.
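Since texture streaming systems came up above, here's a minimal, purely illustrative sketch of the basic idea in Python; the texture names and sizes are made up, and no real engine is this simple:

```python
# Minimal sketch of the idea behind a texture-streaming system: assets are
# paged into a fixed VRAM budget and the least-recently-used ones get evicted
# when the budget overflows. Purely illustrative; no real engine works exactly
# like this, and the texture names/sizes are made up.
from collections import OrderedDict

class TextureStreamer:
    def __init__(self, budget_mb: int):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()   # texture name -> size in MB, oldest first
        self.used_mb = 0

    def request(self, name: str, size_mb: int) -> None:
        """Mark a texture as needed this frame, streaming in / evicting as required."""
        if name in self.resident:
            self.resident.move_to_end(name)                        # refresh LRU order
            return
        while self.used_mb + size_mb > self.budget_mb and self.resident:
            evicted, evicted_mb = self.resident.popitem(last=False)  # drop LRU texture
            self.used_mb -= evicted_mb
            print(f"  evict {evicted} ({evicted_mb} MB)")
        self.resident[name] = size_mb
        self.used_mb += size_mb
        print(f"stream in {name} ({size_mb} MB), resident total: {self.used_mb} MB")

# A smaller budget just means more churn (and more visible pop-in), not a hard failure.
streamer = TextureStreamer(budget_mb=2048)
for tex, size in [("rock_4k", 700), ("wall_4k", 700), ("floor_4k", 700), ("rock_4k", 700)]:
    streamer.request(tex, size)
```

The takeaway is just that a smaller VRAM budget means more churn and more visible pop-in rather than an automatic hard failure, which is part of why the same title can scale across 8 GB and 16 GB cards.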
It's cool that you mention the 3090 as well; that one was problematic for an entirely different set of reasons: the Titan cards used to have professional features enabled, and bringing back the x90-class card was supposed to replace the Titan (it was even often advertised as such), but they cut off access to those features.
Vote with your wallet and all that; it doesn't matter all that much when mindless people will buy whatever is available, but anyway... here's hoping the 50 series isn't as bad as this one, though I'm not holding my breath. Given the current AI boom, we'll need to wait for an AI winter.
The 3090 (and its variants) is the only card that actually increased VRAM on the Nvidia side. The rest of the stack saw little to no movement, especially if you factor in price (with or without inflation). The 3090 was also super expensive. You don't seem to understand that my argument has never been "hey, look at this chart, it shows more VRAM = better." Yes, you can show that, but the point I've been making, and have repeated 4 times now, is:
"Hardware leads software, not the other way around. It's backwards to imply that because games don't use x amount of VRAM today, x amount of VRAM isn't worthwhile. That kind of logic only perpetuates stagnation of VRAM capacity. Devs aren't going to make games with VRAM requirements the vast majority of cards can't meet. It's up to the hardware vendors to increase GPU VRAM allowance so that game devs can then create games that utilize that extra capacity."
There isn't ever going to be a point where games release en masse with requirements that make most people's experience unenjoyable. You don't seem to understand that one has to come before the other, instead arguing that because the latter hasn't come, the former is fine. Again, backwards logic.
And again, let's be honest. Games are generally developed with average hardware in mind, and primarily target consoles. Yes, I brought all of that up and I'm in agreement. But remember that the consoles also have a unified memory pool; those 16 GB count for system and graphics memory simultaneously.
1) Both consoles have additional reserved memory just for the OS
2) Consoles have less overhead
3) Both consoles have proprietary texture decompression chips that allow them to dynamically stream data off their SSDs at high speeds and low latency, drastically reducing the amount of data that has to be stored in memory / VRAM.
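To put rough numbers on points 1 and 3, here's a quick back-of-the-envelope sketch; the OS reserve and the GPU-share factor are my own illustrative assumptions, not official figures:

```python
# Back-of-the-envelope: how much of a console's 16 GB unified pool is actually
# left for the game, and roughly how much of that ends up holding GPU assets.
# The OS reserve and the GPU-share factor are illustrative assumptions only.
UNIFIED_MEMORY_GB = 16.0       # PS5 / Xbox Series X unified pool, shared by CPU and GPU
ASSUMED_OS_RESERVE_GB = 2.5    # assumption; commonly cited estimates are ~2.5-3.5 GB
ASSUMED_GPU_SHARE = 0.65       # assumption: fraction of the game's pool holding GPU assets

available_to_game = UNIFIED_MEMORY_GB - ASSUMED_OS_RESERVE_GB
gpu_asset_budget = available_to_game * ASSUMED_GPU_SHARE

print(f"Left for the game:      {available_to_game:.1f} GB of {UNIFIED_MEMORY_GB:.0f} GB")
print(f"Rough GPU-asset budget: {gpu_asset_budget:.1f} GB")
# Per point 3, hardware decompression + SSD streaming lets even that budget
# stretch further than a like-for-like pool of PC VRAM would.
```

Under those assumptions, the slice of a console's 16 GB that actually holds GPU assets lands closer to 9 GB than 16 GB, which is the point of the list above.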
But, for all we know, nVidia™ might not be able to design a memory controller as well as others, and bad yields in the controller area might be making die costs way higher than AMD's or Intel's.
VRAM is the second most costly part of a graphics card, and that right there is the reason we don't see 16GB as standard on all but the lowest-end cards in 2024. I truly worry about Blackwell: if nGreedia doesn't up the VRAM for that generation, then we will begin to see a stutter fest in all AAA games from next year onwards. But I suspect that we will see only the unobtanium card getting a 24 or possibly 32GB frame buffer, while the rest will top out at 16GB and the midrange and lower will top out at 12GB. If so, that generation will be very bad value, even worse than the bad value of this awful generation.
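Just to sketch the kind of math behind that cost claim, here's a toy calculation with placeholder numbers (neither the per-GB price nor the BOM total is a reported figure):

```python
# Made-up numbers purely to illustrate the scale of the memory-cost argument;
# neither the per-GB price nor the BOM total is a reported figure.
ASSUMED_GDDR6_USD_PER_GB = 4.0    # hypothetical spot price
ASSUMED_CARD_BOM_USD = 450.0      # hypothetical total bill of materials

for vram_gb in (8, 12, 16, 24):
    vram_cost = vram_gb * ASSUMED_GDDR6_USD_PER_GB
    share = 100 * vram_cost / ASSUMED_CARD_BOM_USD
    print(f"{vram_gb:>2} GB -> ~${vram_cost:>3.0f} of memory, ~{share:.0f}% of the assumed BOM")
```

With placeholders like these, going from 8 GB to 16 GB adds a few tens of dollars of memory, which is why the "VRAM is cheap" side of the argument keeps coming back.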
How he wants to spend his money is his prerogative, but if you ask me, a fool and his money are soon parted. The card alone cost almost the same as the rest of the items he was getting (SSD, MB, CPU, case, RAM, PSU). In all he was looking at around $2300 for everything after taxes, based on the prices of the items (it is possible some were priced lower than the sticker cost).
To be fair, I asked a guy in the DIY department how many 4080S cards they see come in, and he said just a couple here and there, and the couple that do come in sell within a few days. They are selling, but by no means are they coming in by the truckload.
It proves time and time again that most people make pretty poor choices about their “needs” when it comes to tech.
Another data point is the price of DDR5 on Digikey; notice the listing is for 16 Gbit parts. Now, we all know that this price is much higher than the actual price of DRAM: DDR5 now starts at $73 for two 16 GB DIMMs, which is less than a quarter of what the Digikey quote would work out to if they were selling one IC at a time. The logical conclusion is that the reel, i.e. one unit, doesn't correspond to one DRAM IC.
It goes down to $13.5 when ordering in large quantities.
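For what it's worth, here's the same math written out; the chips-per-DIMM count is my assumption (a single-rank 16 GB DDR5 DIMM built from eight 16 Gbit ICs), not something stated in the listing:

```python
# Quick sanity check of the per-IC price implied by retail DIMM pricing.
# Assumption (not from the post): a single-rank 16 GB DDR5 DIMM is built from
# eight 16 Gbit (2 GB) ICs; a dual-rank module would use sixteen smaller ones.
KIT_PRICE_USD = 73.0          # two 16 GB DIMMs, the figure quoted above
DIMM_CAPACITY_GB = 16
IC_DENSITY_GBIT = 16          # the "16Gbit" parts on the Digikey listing
ICS_PER_DIMM = DIMM_CAPACITY_GB * 8 // IC_DENSITY_GBIT   # 16 GB * 8 / 16 Gbit = 8

total_ics = 2 * ICS_PER_DIMM
implied_usd_per_ic = KIT_PRICE_USD / total_ics

print(f"ICs in the kit:          {total_ics}")
print(f"Implied price per IC:    ${implied_usd_per_ic:.2f}")
print(f"Digikey bulk 'unit':     $13.50")
print(f"Retail-implied vs bulk:  {implied_usd_per_ic / 13.50:.2f}x")
```

Under that assumption, the retail kit works out to roughly $4.50 per IC, which is the gap the post is pointing at when it concludes that a Digikey "unit" must be a whole reel rather than a single chip.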