Friday, January 15th 2021

NVIDIA RTX 3080 Ti, Eventual SUPER Revisions Allegedly Postponed Indefinitely Amidst Supply Woes

Everyone and their mother expected NVIDIA to announce, if not a SUPER refresh of its existing graphics cards with increased memory sizes, then at least the RTX 3080 Ti. That card surfaced as NVIDIA's planned counter to AMD's preemptive $999 pricing on the RX 6900 XT (which, to be fair, is itself about as abundant as unicorns this side of the galaxy). GamersNexus has reported comments from NVIDIA partners on the indefinite postponement of the RTX 3080 Ti and possible SUPER derivatives of the RTX 30-series lineup. Word is that NVIDIA decided (smartly, I would say) to ensure consistent supply of its existing lineup to sate demand, instead of dispersing its limited chip production across even more product lines.

Spreading production thinner would, I have no doubt, result in NVIDIA having even more SKUs out of stock than they currently do. Considering the market's current state of mind regarding NVIDIA's lineup, this seems like the most sensible decision possible. TechPowerUp has in the meantime confirmed this information with NVIDIA partners directly.
Source: GamersNexus

85 Comments on NVIDIA RTX 3080 Ti, Eventual SUPER Revisions Allegedly Postponed Indefinitely Amidst Supply Woes

#76
Valantar
Adc7dTPU: It is an ugly practice, I don't like it one bit but sadly that's how it is right now.
Ugly? I'd call it necessary. When you're launching a new product based on a new, large chip on a (relatively) new manufacturing node, yields are bound to be less than optimal. Announcing a day-one SKU based on a fully enabled die thus invites scarcity, as that bin will always be in limited supply for the first few months. That is why you first launch a high-end cut-down SKU, then wait half a year or so while you improve yields and build up stocks of fully enabled chips, and only then release the full-fat product. There's nothing wrong with this whatsoever, outside of pissing off people who lack the self-control to wait until the full product stack has arrived, yet still somehow demand that their purchase decisions be ideal.
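
As a hedged illustration of why the fully enabled bin is scarce early on, here is a minimal sketch using the classic Poisson defect-yield model. The defect densities are invented for illustration; the only public figure used is GA102's roughly 628 mm² die area.

```python
# Minimal sketch: Poisson defect-yield model. Early in a node's life the
# defect density D is high, so dies with zero defects (the fully enabled
# bin) are rare and harvested, cut-down SKUs dominate supply.
import math

def full_die_yield(defect_density_per_cm2: float, die_area_cm2: float) -> float:
    """Probability that a die has zero defects: exp(-D * A)."""
    return math.exp(-defect_density_per_cm2 * die_area_cm2)

die_area = 6.28  # GA102 is roughly 628 mm^2
for d in (0.5, 0.2, 0.1):  # defect density falling as the node matures
    print(f"D = {d}/cm^2 -> {full_die_yield(d, die_area):.0%} of dies fully enabled")
```

With invented but plausible numbers, only a few percent of early dies qualify for the full-fat bin, which is the whole argument above in one line of math.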
#77
Sunny and 75
Valantar: pissing off people who lack the self-control to wait until the full product stack has arrived, yet still somehow demand that their purchase decisions be ideal.
Interesting point of view! Though the GTX 980 (full GM204 GPU) and GTX 1080 (full GP104 GPU) were full-fat products at launch, and no SUPER or Ti refresh variants were needed the year after.
So it must be that the scale is so much higher now with Turing and Ampere that it forces NVIDIA's hand and leaves them no choice but to practice "first produce chips, then harvest, then release a bumped-up version".

Thankfully AMD didn't feel the need to do so and released the full Navi 21 GPU with their 6900 XT. God bless AMD. God bless the United States of America. God bless the universe.
#78
Locotes_Killa
newtekie1: Creators that have projects that actually could use more than 6GB of VRAM are also not buying 3060s, because those projects need processing power more than VRAM. They're buying 3070s and 3080s, or even 3060 Tis with 8GB of VRAM.

No, there is no game that legitimately needs more than 6GB of VRAM for 1080p, and there really isn't any that needs more than that for 1440p. There are games that will "use" more than that, but not ones that actually need it, and that's the big difference.

Plus, you have to wonder if the card is even powerful enough for that extra VRAM to matter. By the time you raise the graphics settings to the point where 6GB would be a limit, a 3060 would likely be chugging anyway.

I mean, yeah, you can use ultra-high-res textures in games and get VRAM usage pretty damn high (textures being the biggest VRAM hog these days). But when you can still get by with 6GB by just turning the texture setting down one notch, I have a hard time saying more than 6GB is needed at this level of card.

I think people really overestimate how much VRAM games are actually using. We're talking about a card that, in modern games, will really struggle at 1440p. This is going to be this generation's 1080p card. Having 6GB of VRAM doesn't make the card useless; it's barely going to affect the card in the performance segment it's aimed at (if it affects it at all). However, the extra cost of doubling the VRAM to 12GB is going to result in a high price. So you end up with a card that is priced too closely to the 3060 Ti to really be a compelling buy.
Not all creators can afford the more powerful graphics cards, but that's OK: today's midrange GPUs are highly effective for their horsepower. Having enough VRAM to handle the projects being worked on matters more, as a workstation that is hitting a memory bottleneck is going to feel much worse to use than one that performs its tasks nice and smoothly.

As demonstrated here...

www.pugetsystems.com/recommended/Recommended-Systems-for-Adobe-Premiere-Pro-143/Hardware-Recommendations

Check out the 3060 Ti's rendering-time improvements that NVIDIA is showing off versus the more expensive previous-gen GPUs, too...

www.cgchannel.com/2020/12/nvidia-launches-the-geforce-rtx-3060-ti/

You've not been reading the previous comments, have you now? No, because if you did... then you would know that I have said yes, there is a game that wants more than 6GB at 1080p to max it out... and DOOM Eternal's built-in OSD shows actual, not allocated, VRAM usage.

www.techspot.com/article/1999-doom-eternal-benchmarks/

You would know that, due to how memory buses work, the right choice is to double the VRAM, not to make it 8GB or 10GB for the 3060.
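
For the unfamiliar, a minimal sketch of that bus arithmetic: a GDDR6 chip hangs off its own 32-bit channel and commonly comes in 1GB or 2GB densities, so a symmetrically populated card's capacity options are fixed by its bus width. The SKU comments are the obvious mappings, not official statements.

```python
# Symmetric VRAM capacities = (bus_width / 32) * chip_density, because each
# GDDR6 chip occupies one 32-bit channel and ships in 1 GB or 2 GB densities.

def valid_capacities(bus_width_bits, densities_gb=(1, 2)):
    """Return the symmetric VRAM capacities a given bus width allows."""
    chips = bus_width_bits // 32  # one chip per 32-bit channel
    return [chips * d for d in densities_gb]

print(valid_capacities(192))  # RTX 3060 (192-bit): [6, 12] -> 6 GB or 12 GB, never 8 or 10
print(valid_capacities(256))  # 256-bit cards like the 3060 Ti: [8, 16]
print(valid_capacities(320))  # 3080 (320-bit): [10, 20]
```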
newtekie1: Task Manager doesn't show real VRAM usage either. The fact is, there isn't really a way currently to see real VRAM usage, only allocated size. You'd have to track the actual game to see how much of that allocated VRAM the game is actually accessing. And, at least AFAIK, there isn't anything that does that. The way developers do things these days is they just cram as many textures as possible into VRAM, even if a texture isn't anywhere near the player and isn't being seen by the player. They start where the player is, and just load outward until VRAM is full or there aren't any more textures to load.

There used to be a very good quick video showing how games used to handle textures: they would only load into VRAM what the player was actually seeing, plus a small cone around the field of view. This was back in the days when VRAM was actually limited. Now developers have become spoiled by huge VRAM sizes, so they just fill that VRAM with as many textures as they can. But then there are textures sitting in VRAM that are never accessed, so the reported VRAM usage is drastically inflated.

Getting back to why this affects the 3060: there are going to be people who say "but what if I run a 4K texture mod on [game], that uses like 10GB of VRAM?!?" OK, so you're using a 4K texture mod on a card that is really only capable of playing at 1080p. Do you really think those stupidly high texture resolutions are really helping you any?
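
To make the allocated-versus-touched distinction concrete, here is a toy model of the "load outward until VRAM is full" strategy described in the quote above. It is not any real engine; every texture size, count and distance is made up for illustration.

```python
# Toy model: stream textures in nearest-first until the VRAM budget runs out,
# then compare what was loaded (allocated) with what a frame actually samples.
from dataclasses import dataclass

@dataclass
class Texture:
    name: str
    size_mb: int
    distance: float  # distance from the player, arbitrary units

def stream_in(textures, vram_budget_mb):
    """Load textures nearest-first until the VRAM budget is exhausted."""
    loaded, used_mb = [], 0
    for tex in sorted(textures, key=lambda t: t.distance):
        if used_mb + tex.size_mb > vram_budget_mb:
            break
        loaded.append(tex)
        used_mb += tex.size_mb
    return loaded, used_mb

world = [Texture(f"tex{i}", 256, float(i)) for i in range(64)]
loaded, allocated_mb = stream_in(world, vram_budget_mb=8192)

# Only textures near the player are actually sampled in a given frame.
touched_mb = sum(t.size_mb for t in loaded if t.distance < 8)
print(f"allocated: {allocated_mb} MB")  # what monitoring tools report
print(f"touched:   {touched_mb} MB")    # what the frame actually needs
```

In this invented example the overlay would report 8 GB "in use" while the frame only ever touches 2 GB, which is the inflation being argued about.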
I have already shown there is a tool available that can monitor actual rather than allocated VRAM usage, in a link I have posted that also goes into detail about what you're saying here and the future of VRAM usage in games.

Do please watch the Unreal Rebirth demo too. If you can't see that the textures look far beyond anything seen before in games, even watching the video at 1080p, then you really need to get your eyes checked... and it's fair to say that a GPU will want a bit more than 6GB of VRAM to handle this in games, considering what DOOM Eternal is already doing, don't ya think?

So, can you bother to actually pay attention to what has been said so far? That would be nice, as then we wouldn't have to keep going around in circles... repeating ourselves. Don't be a bottleneck... :wtf:
#79
Sunny and 75
mdbrotha03: Don't know why this game is using 20GB @ 1440p
Are you using the HD texture pack?
#80
Vayra86
newtekie1: The good ol' marketing tactic of sticking way more RAM on a card than it needs or could ever actually use, just to catch the eye of the uneducated buyer...
Or the reality that there are now chips available that offer the required and actually intended sizes for all SKUs.

We know by now that Nvidia rushed Ampere to market; it echoes in everything, and that includes VRAM capacities. Your idea of the 'uneducated buyer' buying into high VRAM caps belongs to the ultra-low-end/budget class of GPUs, where we also saw said capacities turn out to be DDR instead of GDDR memory. It's old news; those buyers have been getting laptops, tablets or smartphones for a decade or more now.

Nvidia NEVER pushed the marketing button like that for any of its x50/x60-and-up cards, so I'm not sure where you're coming from. They always marketed the normal-VRAM versions first and foremost, and the double-capacity ones were obviously meant for the (then still normal) SLI setups. Your frame doesn't fit the current situation at all either: where is the 20GB 3080, if people are so eager to buy inflated capacities? Let's remember what you're defending: 10GB is enough, but 'because marketing' a double-capacity card is on offer. If it'd sell that well, why wasn't it sold first, especially knowing a 16GB Radeon card is also available?

EVEN the odd one out of late, the 1060, was marketed first and foremost as a 6GB card, which also had a budget-y 3GB cut-down version. Today, the 3GB one is sinking fast, and cheap becomes expensive sooner rather than later: you'll be looking to upgrade it by now, as it won't do much anymore without heavily cutting back on IQ. Ironically, around the same time, a 970 is hardly ever recommended despite having "4"GB ;)
nvidianews.nvidia.com/news/a-quantum-leap-for-every-gamer:-nvidia-unveils-the-geforce-gtx-1060

People are apparently just making shit up now to defend 10GB. Wow. As if it couldn't be that Huang made a timing decision that isn't the best design/balance decision for his GPUs going forward. No, that couldn't possibly be the truth, right? Surely he isn't in it for the money?

:kookoo::banghead:

Here's some fun reading material
pcmasterrace/comments/kkl86c
All the ingredients are there - none of them speak of 10GB being a solid choice for anything other than pure necessity.

Ampere is rushed, baked on an inferior node, badly balanced, and not quite the value you'd want even with the competitive price tags. It's not bad... but it certainly isn't great. Let's stop deluding each other. 200% core power, but 120% VRAM compared to past-gen same-tier examples; just face the numbers and draw the logical conclusion.
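
A quick back-of-the-envelope check of those ratios, using public spec-sheet numbers (the 2080/3080 pairing is my choice of example). Note that raw Ampere CUDA-core counts overstate real throughput because of the doubled FP32 datapath, which is why delivered performance sits nearer the 200% figure than the raw core ratio.

```python
# Same-tier, gen-on-gen comparison from the spec sheets.
specs = {
    "RTX 2080": {"cuda_cores": 2944, "vram_gb": 8},
    "RTX 3080": {"cuda_cores": 8704, "vram_gb": 10},
}
core_ratio = specs["RTX 3080"]["cuda_cores"] / specs["RTX 2080"]["cuda_cores"]
vram_ratio = specs["RTX 3080"]["vram_gb"] / specs["RTX 2080"]["vram_gb"]
print(f"raw cores: {core_ratio:.0%}, VRAM: {vram_ratio:.0%}")  # ~296% and 125%
```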
#81
Max(IT)
chriszhxi: no matter actually which is which or how you see it, once you hit the cap, the game is unplayable.
The point is: you don't hit the cap with allocated VRAM.
On a 24GB card, you see 10GB or more allocated; the same game on a 10GB card would show 8GB.
There are absolutely no reports of unplayable titles on a 3080 due to VRAM limitations. Actually, Cyberpunk 2077 at 4K runs much better on a 10GB 3080 than on a 16GB 6800 XT.
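
A minimal sketch of the kind of budget-proportional allocation heuristic being described here; the 80% fraction is invented for illustration and not pulled from any real engine:

```python
# Hypothetical engine heuristic: reserve a fixed fraction of whatever VRAM
# the card exposes, so the "usage" readout scales with the card, not the game.
def vram_budget_gb(card_vram_gb: float, fraction: float = 0.8) -> float:
    return card_vram_gb * fraction

for card in (24, 16, 10, 8):
    print(f"{card} GB card -> engine allocates ~{vram_budget_gb(card):.1f} GB")
```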
Vayra86: Or the reality that there are now chips available that offer the required and actually intended sizes for all SKUs. [full post quoted above]
Amazing how many (wrong) conclusions one can draw from a technical article written by someone on Reddit...
Rushed? Based on what? The 3080 is still better than the not-rushed 6800 XT with its 16GB frame buffer.
Inferior node? Inferior compared to what? To the "non-existent" TSMC 7nm node that is causing so much availability trouble for everything made on it (Zen 3, RDNA2, PS5, Xbox... name it)?
#82
Sunny and 75
Vayra86: 970 is hardly ever recommended despite having "4"GB ;)
Don't hurt my 970's feelings :p
#83
Vayra86
Max(IT): Rushed? Based on what? The 3080 is still better than the not-rushed 6800 XT with its 16GB frame buffer.
Inferior node? Inferior compared to what? To the "non-existent" TSMC 7nm node that is causing so much availability trouble for everything made on it (Zen 3, RDNA2, PS5, Xbox... name it)?
You ask questions that are answered in the very post you reply to ;) You are at liberty to disagree of course.

And are you seriously comparing Nvidia's GPU supply with the supposedly nonexistent TSMC 7nm output? Last I checked, several million consoles had already been sold with those chips inside. How many Ampere GPUs on the not-inferior Samsung node now? Oh yeah... Q1, right? Last I checked, Nvidia booked with Samsung because Samsung could at least offer some chips. Emphasis on some.

:slap:

The Reddit post specifically mentions the limitations connected to Nvidia's choice of GDDR6X, the time to market (rushing it), and the problems this presents for their product stack. This is why we are looking at these silly VRAM capacities right now.

Wrong conclusions, you say... I wonder if you even read the piece. Here, I'll make it simple for you.

"Now if you count the memory chips on a 3080, you will find ten of them, but with twelve footprints on the PCB, two of which are always empty. People have been speculating that Nvidia could simply add two more chips to these spots, for the 12GB that would have been more comfortable. But this is where die yield and binning comes into play. The 3090 and 3080 use the same die design and production line, with the 3080s being built with the dies that have defects in limited areas. Perfect dies go into the 3090 with all components operating – critically, all twelve memory controllers – and dies with a defective controller or two get two controllers disabled and become 3080s."

Summary:
- Nvidia didn't want to repeat a 970-style asymmetric bus that they got burned for in the past
- 3080s are in fact failed 3090s, with twelve potential VRAM chip footprints on the PCB but only ten populated
- GDDR6X was not available in the required density at the time the 3080 was built, and we do all agree 20GB was a bit overkill for that class of GPU/price point (the sketch below runs the controller math)
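
For reference, a small sketch of the controller arithmetic in the quoted passage. The GB-per-channel values are chosen to match the shipping configurations (the 3090 reaches 2GB per channel via clamshell pairs of 1GB chips):

```python
# GA102 carries twelve 32-bit memory controllers; the quoted binning story is
# that disabling two of them turns a would-be 3090 die into a 3080.
def memory_config(disabled=0, gb_per_channel=1, total_controllers=12):
    active = total_controllers - disabled
    return {"bus_bits": active * 32, "capacity_gb": active * gb_per_channel}

print(memory_config(disabled=0, gb_per_channel=2))  # 3090: 384-bit, 24 GB
print(memory_config(disabled=2))                    # 3080: 320-bit, 10 GB
print(memory_config(disabled=0))                    # the "comfortable" 12 GB option
```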

Tell me again that the 3080 is a well-rounded, balanced product without letting your eyes bleed. Nvidia simply had no other options, but here we have dozens of people selling it as the way it was supposed to be.
#84
Tom Sunday
Tom Sunday: Indeed, a lot of gamers with deep pockets are still waiting or holding off for a card with the specs of a 3080 Ti, as game developers throw more and more graphically demanding games onto the market. Actually, many games are being purchased solely based on their 'pretty worlds' and over-the-top sandbox bling. There seems to be no end to this; it essentially started when gamers were delighted to see water and cloud reflections making their days and nights. From what I hear, a 20GB GPU is seen as sort of a minimum investment for (gaming) future-proofing, at least for a few more years. I am also told, hanging around the computer shops with the boys from Mumbai, that if the 3080 Ti does not come about for whatever reason, most enthusiasts will get the RTX 3090 instead and call it a day. Surely NVIDIA and their investors will love that! My tech buddy Harry said: "What the hell, I paid $1,480 years ago for an MSI Gaming Trio 2080 Ti, so an RTX 3090 is not out of reach at all. Now a 20GB card is my sweet spot." As for me, I am always short of cash, being a simple man on the street, but I do have a blonde, who is my sweet spot.
Simply an aside... hanging around the Mumbai computer shop today with the usual suspects, somebody wanted to bet on whether I'd get my first vaccine shot before or after all the 3000-series cards hit the market in ready availability. I in turn wanted to bet on a (20GB) 3080 Ti timetable, but nobody was willing to take that bet. It's a pipe dream, everyone said.