
NVIDIA RTX 3080 Ti, Eventual SUPER Revisions Allegedly Postponed Indefinitely Amidst Supply Woes

Joined
Apr 15, 2020
Messages
409 (0.24/day)
System Name Old friend
Processor 3550 Ivy Bridge x 39.0 Multiplier
Memory 2x8GB 2400 RipjawsX
Video Card(s) 1070 Gaming X
Storage BX100 500GB
Display(s) 27" QHD VA Curved @120Hz
Power Supply Platinum 650W
Mouse Light² 200
Keyboard G610 Red
first produce chips, then harvest, then release bumped up version
It's an ugly practice and I don't like it one bit, but sadly that's how it is right now. That's why I'm ignoring all these early bait cards and I'm gonna get me one of them Super (Ti) ones, exactly like what they did with the Turing cards.

The 3090 is the ultimate graphics card, as a matter of fact, and the 6900 XT is the same. But not everyone can afford them. I, for one, can get an RX 6800 non-XT at best, which can be had for roughly 950 US dollars, but I'm waiting for Nvidia's response first and will check the price/perf ratio before pulling the trigger.


[980 Ti] Spoiled the Fury X launch. [1070 Ti] Undermined Vega.
And now AMD has returned the favor by launching RDNA 2. That 12 GB 3060 is just the beginning of a plethora of Nvidia cards changing for the better.


All hail the competition, which drives us to improve, to be the best we can be, and pushes the advancement of technology further ahead.
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150MHz, CO -7,-7,-20(x6)
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
It's an ugly practice and I don't like it one bit, but sadly that's how it is right now.
Ugly? I'd call it necessary. When you're launching a new product based on a new, large chip on a (relatively) new manufacturing node, yields are bound to be less than optimal. Announcing a day-one SKU based on a fully enabled die would thus invite scarcity, as that bin will always be in limited supply for the first few months. That is why you first launch a high-end cut-down SKU, then wait half a year or so while yields improve and stocks of fully enabled chips build up, and then release the full-fat product. There's nothing wrong with this whatsoever, outside of pissing off people who lack the self-control to wait until the full product stack has arrived, yet still somehow demand that their purchase decisions be ideal.
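If you want to see why that fully enabled bin stays scarce, here's a back-of-the-envelope sketch using the classic Poisson yield model. The die area is roughly GA102-sized and the defect densities are invented for illustration; treat it as a sketch of the trend, not Nvidia's actual numbers.

```python
# Poisson die-yield sketch: fraction of dies with zero defects at a given
# defect density. Die area is ~GA102-sized; D0 values are invented to show
# how the perfect-die bin grows as the process matures.
import math

DIE_AREA_CM2 = 6.28  # ~628 mm^2 (assumption, roughly GA102)

def perfect_die_yield(d0_per_cm2: float) -> float:
    """Poisson model: P(zero defects) = exp(-D0 * A)."""
    return math.exp(-d0_per_cm2 * DIE_AREA_CM2)

for stage, d0 in [("launch", 0.5), ("+6 months", 0.3), ("+12 months", 0.15)]:
    print(f"{stage:>10}: D0={d0}/cm^2 -> {perfect_die_yield(d0):.0%} perfect dies")

# launch: ~4% of dies fully enabled; +12 months: ~39%. Dies with a defect can
# still ship as cut-down SKUs, which is why the cut-down card launches first.
```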
 
Joined
Apr 15, 2020
Messages
409 (0.24/day)
System Name Old friend
Processor 3550 Ivy Bridge x 39.0 Multiplier
Memory 2x8GB 2400 RipjawsX
Video Card(s) 1070 Gaming X
Storage BX100 500GB
Display(s) 27" QHD VA Curved @120Hz
Power Supply Platinum 650W
Mouse Light² 200
Keyboard G610 Red
pissing off people who lack the self-control to wait until the full product stack has arrived, yet still somehow demand that their purchase decisions be ideal.
Interesting point of view! Though the GTX 980 (full GM204 GPU) and GTX 1080 (full GP104 GPU) were full-fat products at launch, and no Super (Ti) refresh variants were needed the year after.
So it must be that the scale is so much higher now with Turing and Ampere that it forces Nvidia's hand and leaves them no choice but to practice "first produce chips, then harvest, then release bumped up version".

Thankfully AMD didn't feel the need to do so and released the full Navi 21 GPU with their 6900 XT. God bless AMD. God bless the United States of America. God bless the universe.
 

Locotes_Killa

New Member
Joined
Jan 16, 2021
Messages
6 (0.00/day)
Creators that have projects that actually could use more than 6GB of VRAM are also not buying 3060s, because those projects need processing power more than VRAM. They're buying 3070s and 3080s, or even 3060 Tis with 8GB of VRAM.

No, there is no game that legitimately needs more than 6GB of VRAM for 1080p, and there really isn't any that needs more than that for 1440p. There are games that will "use" more than that, but not ones that actually need it, and that's the big difference.

Plus, you have to wonder if the card is even powerful enough for that extra RAM to matter. By the time you raise the graphics settings to the point where 6GB would be a limit, a 3060 would likely be chugging anyway.

I mean, yeah, you can use ultra-high-res textures in games and get VRAM usage pretty damn high (textures being the biggest VRAM hog these days). But when you can get back under 6GB just by turning the texture setting down one notch, I have a hard time saying more than 6GB is needed in this level of card.

I think people really overestimate how much VRAM games are actually using. We're talking about a card that, in modern games, will really struggle at 1440p. This is going to be this generation's 1080p card. Having 6GB of VRAM doesn't make the card useless; it is barely going to affect the card at the performance segment it is aimed at (if it affects it at all). However, the extra cost of doubling the VRAM to 12GB is going to result in a high price. So you end up with a card that is priced too closely to the 3060 Ti to really be a compelling buy.

Not all creators are able to afford the more powerful graphics cards, but that's OK, as today's midrange GPUs are capable of being highly effective with their horsepower. Having enough VRAM to handle the projects being worked on holds a higher importance, as a workstation that is hitting a memory bottleneck is going to feel much worse to use than one which performs its tasks nice and smoothly.

As demonstrated here...


Check out the 3060 Ti's rendering-time improvements that Nvidia is showing off vs. the more expensive previous-gen GPUs too...


You've not been reading the previous comments, have you? Because if you had, then you would know that I have said yes, there is a game that wants more than 6GB at 1080p to max it out... and Doom Eternal's built-in OSD shows actual, not allocated, VRAM usage.

www.techspot.com/article/1999-doom-eternal-benchmarks/

You would also know that, due to how memory buses work, the right choice is to double the VRAM, not to make it 8GB or 10GB for the 3060 (see the sketch below).
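For illustration, here's the arithmetic behind that claim as a sketch (chip densities are the commonly available GDDR6 ones of the time, not Nvidia's actual BOM): each GDDR6 chip sits on a 32-bit channel, so the chip count is fixed by the bus width, and capacity can only move in whole-chip steps.

```python
# Capacity options on a 192-bit bus: chip count is fixed by bus width,
# so capacity changes only by swapping chip density (1GB vs 2GB per chip).
BUS_WIDTH_BITS = 192   # RTX 3060
CHANNEL_BITS = 32      # one GDDR6 device per 32-bit channel

chips = BUS_WIDTH_BITS // CHANNEL_BITS  # -> 6 chips
for density_gb in (1, 2):               # 8Gb and 16Gb GDDR6 dies
    print(f"{chips} chips x {density_gb}GB = {chips * density_gb}GB")

# -> 6GB or 12GB. An 8GB or 10GB 3060 would need a different bus width or a
#    970-style asymmetric layout.
```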

Task Manager doesn't show real VRAM usage either. The fact is, there isn't really a way currently to see real VRAM usage, only the allocated size. You'd have to trace the actual game to see how much of that allocated VRAM it is actually accessing, and, at least AFAIK, there isn't anything that does that. The way developers do things these days is to cram as many textures as possible into VRAM, even if a texture isn't anywhere near the player and isn't being seen by the player. They start where the player is and just load outward until VRAM is full or there aren't any more textures to load.
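For what it's worth, even Nvidia's own management library only exposes allocation, not the working set. A minimal sketch with the pynvml bindings (assumes an Nvidia GPU and the nvidia-ml-py or pynvml package installed):

```python
# Read the per-GPU memory counters via NVML. Note this is *allocated* memory,
# the same figure Task Manager reports; NVML cannot see how much of the
# allocation a game actually touches each frame.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)      # first GPU
info = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"total:     {info.total / 2**30:.1f} GiB")
print(f"allocated: {info.used / 2**30:.1f} GiB")   # allocation, not true usage
pynvml.nvmlShutdown()
```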

There used to be a very good short video showing how games used to handle textures: they would only load into VRAM what the player was actually seeing, plus a small cone around the field of view. This was back in the days when VRAM was genuinely limited. Now developers have become spoiled by huge VRAM sizes, so they just fill that VRAM with as many textures as they can. But then there are textures sitting in VRAM that are never accessed, so the VRAM "usage" is drastically inflated.
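Here's a toy sketch of the contrast between the two approaches; every name and number in it is hypothetical, it just shows budgeted nearest-first streaming versus filling VRAM outright.

```python
# Budgeted texture streaming: load nearest-first until the budget is hit.
# The "modern" strategy would simply keep appending until VRAM is full,
# inflating the usage readout with textures that are never sampled.
def stream_textures(textures, budget_bytes):
    """textures: list of (size_bytes, distance) tuples; returns resident set."""
    resident, used = [], 0
    for size, dist in sorted(textures, key=lambda t: t[1]):  # nearest first
        if used + size > budget_bytes:
            break
        resident.append((size, dist))
        used += size
    return resident, used

textures = [(256 * 2**20, d) for d in range(40)]       # 40 imaginary 256MB sets
resident, used = stream_textures(textures, 6 * 2**30)  # 6GB budget
print(f"resident: {len(resident)} textures, {used / 2**30:.1f} GiB")
# -> 24 of 40 textures resident; the rest stream in on demand as you move.
```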

Getting back to why this affects the 3060. There are going to be people saying, "but what if I run a 4K texture mod on [game], that uses like 10GB of VRAM?!?" OK, so you're using a 4K texture mod on a card that is really only capable of playing at 1080p. Do you really think those stupidly high texture resolutions are actually helping you any?

I have already shown there is a tool available which can monitor actual rather than allocated VRAM usage, in a link I posted that also goes into detail about what you're saying here and the future of VRAM usage in games.

Do please watch the Unreal Rebirth demo too. If you can't see that the textures look far beyond anything seen before in games, even watching the video at 1080p, then you really need to get your eyes checked... and it's fair to say that a GPU will want a bit more than 6GB of VRAM to handle this in games, considering what Doom Eternal is already doing, don't ya think?

So, can you bother to actually pay attention to what has been said so far? That would be nice, as then we wouldn't have to keep going around in circles... repeating ourselves. Don't be a bottleneck... :wtf:
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
The good ol' marketing tactic of sticking way more RAM on a card than it needs or could ever actually use, just to catch the eye of the uneducated buyer...
Or the reality that there are now chips available that offer the required and actually intended sizes for all SKUs.

We know by now that Nvidia rushed Ampere to market; it echoes in everything, and that includes VRAM capacities. Your idea of the 'uneducated buyer' buying into high VRAM caps belongs to the ultra-low-end/budget class of GPUs, where we also saw those capacities turn out to be DDR instead of GDDR memory. It's old news; those buyers have been getting laptops, tablets or smartphones for a decade or more now.

Nvidia NEVER pushed the marketing button like that for any of its x50/x60 and up cards, so I'm not sure where you're coming from. They always marketed the normal-VRAM versions first and foremost, and the double-capacity ones were obviously meant for the (then still normal) SLI setups. Your frame doesn't fit the current situation either: where is the 20GB 3080, if people are so eager to buy inflated capacities? Let's remember what you're defending: 10GB is enough, but 'because marketing' a double-capacity version is on offer. If it'd sell that well, why wasn't it sold first, especially knowing a 16GB Radeon card is also available?

EVEN the recent odd one out, the 1060, was marketed first and foremost as a 6GB card, which also had a budget 3GB cut-down version. Today, the 3GB one is sinking fast, and cheap becomes expensive sooner rather than later: you'll be looking to upgrade it by now, as it won't do much anymore without heavily cutting back on IQ. Ironically, around the same time, a 970 is hardly ever recommended despite having "4"GB ;)

People are apparently just making shit up now to defend 10GB. Wow. As if it couldn't be that Huang made a timing decision that isn't the best design/balance decision for his GPUs going forward. No, that couldn't possibly be the truth, right? Surely he isn't in it for the money?

:kookoo::banghead:

Here's some fun reading material
https://www.reddit.com/r/pcmasterrace/comments/kkl86c
All the ingredients are there, and none of them speak of 10GB being a solid choice for anything other than pure necessity.

Ampere is rushed, baked on an inferior node, badly balanced, and not quite the value you'd want even with the competitive price tags. It's not bad... but it certainly isn't great. Let's stop deluding each other: roughly 200% core power but 120% of the VRAM compared to past-gen same-tier examples. Just face the numbers and draw the logical conclusion.
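Run the spec-sheet numbers yourself; shader count is only a crude proxy for "core power" (and it even flatters Ampere's dual-FP32 SMs), but the trend is hard to miss:

```python
# Generation-on-generation ratios from public spec sheets.
cards = {                 # name: (shader_count, vram_gb)
    "GTX 1080": (2560, 8),
    "RTX 2080": (2944, 8),
    "RTX 3080": (8704, 10),
}
base_cores, base_vram = cards["RTX 2080"]
for name, (cores, vram) in cards.items():
    print(f"{name}: shaders {cores / base_cores:.0%} of a 2080, "
          f"VRAM {vram / base_vram:.0%}")

# RTX 3080: ~296% of the shaders but 125% of the VRAM of a 2080. Even if raw
# shader count overstates real performance, the frame buffer clearly didn't scale.
```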
 
Joined
May 10, 2020
Messages
738 (0.44/day)
Processor Intel i7 13900K
Motherboard Asus ROG Strix Z690-E Gaming
Cooling Arctic Freezer II 360
Memory 32 Gb Kingston Fury Renegade 6400 C32
Video Card(s) PNY RTX 4080 XLR8 OC
Storage 1 TB Samsung 970 EVO + 1 TB Samsung 970 EVO Plus + 2 TB Samsung 870
Display(s) Asus TUF Gaming VG27AQL1A + Samsung C24RG50
Case Corsair 5000D Airflow
Power Supply EVGA G6 850W
Mouse Razer Basilisk
Keyboard Razer Huntsman Elite
Benchmark Scores 3dMark TimeSpy - 26698 Cinebench R23 2258/40751
no matter which is which or how you see it, once you hit the cap, the game is unplayable.
The point is: you don't hit the cap; that's allocated VRAM. On a 24GB card you'll see 10GB or more; the same game on a 10GB card would show 8GB.
There are absolutely no reports of unplayable titles on a 3080 due to VRAM limitations. Actually, Cyberpunk 2077 at 4K runs much better on a 10GB 3080 than on a 16GB 6800 XT.

Or the reality that there are now chips available that offer the required and actually intended sizes for all SKUs. [...] Ampere is rushed, baked on an inferior node, badly balanced, and not quite the value you'd want even with the competitive price tags.
Amazing how many (wrong) conclusions one can draw from a technical article written by someone on Reddit...
Rushed? Based on what? The 3080 is still better than the not-rushed 6800 XT with its 16GB frame buffer.
Inferior node? Inferior compared to what? To the "nonexistent" TSMC 7nm node that is causing so much availability trouble for everything made on it (Zen 3, RDNA 2, PS5, Xbox... you name it)?
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Rushed? Based on what? The 3080 is still better than the not-rushed 6800 XT with its 16GB frame buffer.
Inferior node? Inferior compared to what? To the "nonexistent" TSMC 7nm node that is causing so much availability trouble for everything made on it (Zen 3, RDNA 2, PS5, Xbox... you name it)?
You ask questions that are answered in the very post you reply to ;) You are at liberty to disagree of course.

And are you seriously comparing Nvidia's GPU supply with the supposedly nonexistent TSMC 7nm output? Last I checked, several million consoles had already been sold with those chips inside. How many Ampere GPUs on the not-inferior Samsung node now? Oh yeah... Q1, right? Last I checked, Nvidia booked with Samsung because they could at least offer some chips. Emphasis on some.

:slap:

The Reddit post specifically mentions the limitations connected to Nvidia's choice of GDDR6X and time to market (rushing it), and the problems this presents for their product stack. This is why we are looking at these silly VRAM capacities right now.

Wrong conclusions, you say... I wonder if you even read the piece. Here, I'll make it simple for you.

"Now if you count the memory chips on a 3080, you will find ten of them, but with twelve footprints on the PCB, two of which are always empty. People have been speculating that Nvidia could simply add two more chips to these spots, for the 12GB that would have been more comfortable. But this is where die yield and binning comes into play. The 3090 and 3080 use the same die design and production line, with the 3080s being built with the dies that have defects in limited areas. Perfect dies go into the 3090 with all components operating – critically, all twelve memory controllers – and dies with a defective controller or two get two controllers disabled and become 3080s."

Summary:
- Nvidia didn't want to repeat the 970's asymmetric bus that they got burned for in the past
- 3080s are in fact failed 3090s, with 12 potential VRAM chip footprints but only 10 populated
- GDDR6X was not available in the required density when the 3080 was built, and we all agree 20GB was a bit overkill for that class of GPU / price point (the arithmetic is sketched below)
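A quick sketch of that controller/bus arithmetic (chip density per the quoted post; the numbers are illustrative):

```python
# GA102 carries twelve 32-bit memory controllers; disabling two turns a
# would-be 3090 (384-bit) into a 3080 (320-bit) and drops two chip footprints.
CONTROLLER_BITS = 32
CHIP_GB = 1  # only 1GB (8Gb) GDDR6X chips existed at launch, per the post

for name, controllers in [("RTX 3090", 12), ("RTX 3080", 10)]:
    bus = controllers * CONTROLLER_BITS
    single = controllers * CHIP_GB   # one chip per channel
    clamshell = 2 * single           # two chips per channel, back to back
    print(f"{name}: {bus}-bit, {single}GB single-sided / {clamshell}GB clamshell")

# RTX 3090: 384-bit, 12GB/24GB -> ships as 24GB clamshell.
# RTX 3080: 320-bit, 10GB/20GB -> ships as 10GB; 20GB was the rumored Ti.
```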

Tell me again that the 3080 is a well-rounded, balanced product without letting your eyes bleed. Nvidia simply had no other options, but here we have dozens of people selling it as the way it was supposed to be.
 
Joined
Sep 24, 2020
Messages
227 (0.15/day)
Location
Stehekin, Washington
System Name (2008) Dell XPS 730x H2C
Processor Intel Extreme QX9770 @ 3.8GHz (No OC)
Motherboard Dell LGA 775 (Dell Proprietary)
Cooling Dell AIO Ceramic Water Cooling (Dell Proprietary)
Memory Corsair Dominator Platinum 16GB (4 x 4) DDR3
Video Card(s) EVGA GTX 980ti 6GB (2016 ebay-used)
Storage (2) WD 1TB Velociraptor & (1) WD 2TB Black
Display(s) Alienware 34" AW3420DW (Amazon Warehouse)
Case Stock Dell 730x with "X" Side Panel (65 pounds fully decked out)
Audio Device(s) Creative X-FI Titanium & Corsair SP2500 Speakers
Power Supply PSU: 1000 Watt (Dell Proprietary)
Mouse Alienware AW610M (Amazon Warehouse)
Keyboard Corsair K95 XT (Amazon Warehouse)
Software Windows 7 Ultimate & Alienware FX Lighting
Benchmark Scores No Benchmarking & Overclocking
Indeed, a lot of gamers with deep pockets are still holding off for a card with the specs of a 3080 Ti, as developers throw more and more graphically demanding games into the market. Many games are being purchased solely based on their 'pretty worlds' and over-the-top sandbox bling. There seems to be no end to this, and it essentially started when gamers were delighted to see water and cloud reflections making their days and nights. From what I hear, a 20GB GPU investment is seen as a sort of minimum (gaming) future-proofing, at least for a few more years. I am also told, hanging around the computer shops with the boys from Mumbai, that if the 3080 Ti doesn't come about for whatever reason, most enthusiasts will get the RTX 3090 instead and call it a day. Surely NVIDIA and their investors will love that! My tech buddy Harry said: "What the hell, I paid $1,480 years ago for an MSI Gaming Trio 2080 Ti, so an RTX 3090 is not out of reach at all. Now a 20GB card is my sweet spot." As for me, I am always short of cash, being a simple man on the street, but I do have a blonde, which is my sweet spot.
Simply an aside... hanging around the Mumbai computer shop today with the usual suspects, somebody wanted to bet on whether I'd get my first vaccine shot before or after all the 3000-series cards hit the market in ready availability. I, in turn, wanted to bet on a (20GB) 3080 Ti timetable, but nobody here was willing to take that one. It's a pipe dream, everyone said.
 