Friday, January 15th 2021

NVIDIA RTX 3080 Ti, Eventual SUPER Revisions Allegedly Postponed Indefinitely Amidst Supply Woes

Everyone and their mother expected NVIDIA to announce, if not a SUPER refresh of their existing graphics cards with increased memory sizes, then at least the RTX 3080 Ti. That card surfaced as a planned NVIDIA counter to AMD's preemptive $999 pricing of its RX 6900 XT graphics card (which, to be fair, is itself about as abundant as unicorns this side of the galaxy). GamersNexus has reported NVIDIA partners' comments on the indefinite postponement of the RTX 3080 Ti and possible SUPER derivatives of the RTX 30-series lineup. NVIDIA has reportedly decided (smartly, I would say) to ensure consistent supply of its existing lineup to sate demand, instead of dispersing its limited chip production across even more product lines.

Dispersing production that way would, I have no doubt, only result in NVIDIA having even more SKUs out of stock than they currently do. Considering the market's current state of mind regarding NVIDIA's lineup, this seems like the most sensible decision possible. TechPowerUp has in the meantime confirmed this information with NVIDIA partners.
Source: GamersNexus

85 Comments on NVIDIA RTX 3080 Ti, Eventual SUPER Revisions Allegedly Postponed Indefinitely Amidst Supply Woes

#1
Camm
The 3060 has more memory than a 3080, and likely will for quite some time to come. Crazy times.
#2
ViperXTR
yep, waiting for RTX 4000 series then
#3
Ravenmaster
The sensible thing to do would have been to make the 3080 EOL (end of life) and use the silicon to make a 3080 Ti instead, because the 3080 has an inadequate amount of VRAM and should never have been released in the first place.
#4
Doc-J
10GB of VRAM is sufficient today; you don't need more VRAM to play modern games at 4K resolution.

A 3080 with 10GB at $700 is much more affordable than a 3080 with 20GB at $1,000.

Today, 16GB of VRAM is pure marketing for noobs. Maybe it will be necessary on the next generation of GPUs, but not now.

The 3060 with 12GB is a counterpart to the 6700 with 12GB; neither card needs that much VRAM to play at 1080p or 2K, but marketing is marketing, and it sells these cards better.
#5
cst1992
The 3060 is actually worse value than the 3060 Ti.
I don't know about Ultra settings, but the 3060 offers ~78% of the performance for ~82% of the price (according to local prices).
That makes it about 5% worse in price/performance.
Some might say that's because of the 4GB of extra VRAM, but I don't think we'll need that much VRAM for this card. It'd be a shame if it couldn't use more than 7-8GB, but NVIDIA had to put in 12GB just because of the 192-bit bus.
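To make the arithmetic concrete, here is a minimal Python sketch of both points: the value comparison (the 78%/82% figures are the commenter's own local estimates, not official benchmarks or MSRPs) and why a 192-bit bus steers NVIDIA toward 6GB or 12GB, given that GDDR6 chips ship in 1GB and 2GB densities, one per 32-bit channel:

```python
# Value comparison: performance per unit of price, relative to the 3060 Ti.
# The 0.78 and 0.82 figures are the commenter's local estimates, not MSRPs.
perf_ratio = 0.78   # 3060 delivers ~78% of a 3060 Ti's performance...
price_ratio = 0.82  # ...at ~82% of a 3060 Ti's price
print(f"Relative value: {perf_ratio / price_ratio:.2f}")  # ~0.95, i.e. ~5% worse

# Memory options on a 192-bit bus: one GDDR6 chip per 32-bit channel,
# and chips come in 1GB or 2GB densities, so the total is 6GB or 12GB.
channels = 192 // 32
for density_gb in (1, 2):
    print(f"{channels} chips x {density_gb}GB = {channels * density_gb}GB")
```

With 6GB arguably too little for a midrange card in 2021, 12GB was the only other option on that bus.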

I thought I had made a hasty decision getting the 3060 Ti instead of waiting for the 3060, but now I think it was the right call: the 3060 doesn't seem to have a reference design, and the price NVIDIA is quoting is as much of a "virtual" price as any other 30-series card's.
Ravenmaster said: "The sensible thing to do would have been to make the 3080 EOL (end of life) and use the silicon to make a 3080 Ti instead, because the 3080 has an inadequate amount of VRAM and should never have been released in the first place."
EOL three months after launch? That'd be a PR nightmare.
#6
Valantar
Seriously, can you please stop using that ugly and erroneous fan render when covering future RTX GPUs? We'll never see the end of people asking about PCIe power plugs on the end of the card if you keep using it.
#7
spnidel
They (AMD and NVIDIA) said a couple of days ago that supply was going to get way better; now they're backtracking on that? lol
#8
ZoneDymo
So instead of making the SUPER variants that should have been the norm(al variants) and outright replacing the current cards, we are going to keep getting the inferior products... makes sense to me.
#9
Julhes
Ravenmaster said: "The sensible thing to do would have been to make the 3080 EOL (end of life) and use the silicon to make a 3080 Ti instead, because the 3080 has an inadequate amount of VRAM and should never have been released in the first place."
That's false. A GPU becomes outdated when its memory is too small; history proves it.
#10
cst1992
spnidel said: "They (AMD and NVIDIA) said a couple of days ago that supply was going to get way better; now they're backtracking on that? lol"
ZoneDymo said: "So instead of making the SUPER variants that should have been the norm(al variants) and outright replacing the current cards, we are going to keep getting the inferior products... makes sense to me."
Man proposes, COVID disposes.
#11
ZoneDymo
Doc-J said: "10GB of VRAM is sufficient today; you don't need more VRAM to play modern games at 4K resolution.

A 3080 with 10GB at $700 is much more affordable than a 3080 with 20GB at $1,000.

Today, 16GB of VRAM is pure marketing for noobs. Maybe it will be necessary on the next generation of GPUs, but not now.

The 3060 with 12GB is a counterpart to the 6700 with 12GB; neither card needs that much VRAM to play at 1080p or 2K, but marketing is marketing, and it sells these cards better."
Again, it's only sufficient because Big Green enforces it to be. Why do you think games got texture packs in the past? Because newer cards with more VRAM could handle them. You can't make use of something that isn't there.

Would a game made today be entirely path traced? No, because nothing could run it... but if such cards already existed, then sure.

If all cards had a minimum of 16GB of VRAM, games could ship with much higher quality textures; but they don't, so those textures aren't made. Don't twist it around.
#12
cst1992
ZoneDymo said: "Again, it's only sufficient because Big Green enforces it to be. Why do you think games got texture packs in the past? Because newer cards with more VRAM could handle them. You can't make use of something that isn't there.

Would a game made today be entirely path traced? No, because nothing could run it... but if such cards already existed, then sure.

If all cards had a minimum of 16GB of VRAM, games could ship with much higher quality textures; but they don't, so those textures aren't made. Don't twist it around."
Price competition amongst manufacturers is stiff right now; no one can afford to put extra memory on cards just like that.
#13
lemkeant
spnidel said: "They (AMD and NVIDIA) said a couple of days ago that supply was going to get way better; now they're backtracking on that? lol"
They said it would, but after Quarter 1. We have a few more months to go.

I was lucky and got mine before the tariff hike. If you're in the US, would you really want a new card right now anyway?
#14
DeathtoGnomes
spnidel said: "They (AMD and NVIDIA) said a couple of days ago that supply was going to get way better; now they're backtracking on that? lol"
Likely the press release for this information was finalized before the supply-improvement hype was announced.
#15
currahee440
Doc-J said: "10GB of VRAM is sufficient today; you don't need more VRAM to play modern games at 4K resolution."
I remember when the "is 512MB of VRAM too much?" debates happened...

8GB is enough for 4K, BUT if you want to do HDR, then you'll probably need 12. My 1070 Ti just can't handle 4K HDR.
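As rough context for why HDR raises VRAM requirements, here is a back-of-the-envelope Python sketch comparing one 4K render target in a typical SDR format versus a typical FP16 HDR format; real games allocate many such targets, so the totals are illustrative only:

```python
# Size of a single 4K render target at common SDR and HDR pixel formats.
width, height = 3840, 2160

formats = {
    "SDR RGBA8 (4 bytes/pixel)": 4,
    "HDR RGBA16F (8 bytes/pixel)": 8,
}
for name, bytes_per_pixel in formats.items():
    mib = width * height * bytes_per_pixel / 2**20
    print(f"{name}: {mib:.1f} MiB")  # ~31.6 MiB vs ~63.3 MiB
# FP16 doubles the per-target footprint, and a modern renderer multiplies
# that across several intermediate buffers, hence the extra VRAM pressure.
```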
#16
kapone32
So, at the end of the day, all of the preamble was just vaporware from NVIDIA.
#17
Xaled
Doc-J said: "10GB of VRAM is sufficient today; you don't need more VRAM to play modern games at 4K resolution.

A 3080 with 10GB at $700 is much more affordable than a 3080 with 20GB at $1,000.

Today, 16GB of VRAM is pure marketing for noobs. Maybe it will be necessary on the next generation of GPUs, but not now.

The 3060 with 12GB is a counterpart to the 6700 with 12GB; neither card needs that much VRAM to play at 1080p or 2K, but marketing is marketing, and it sells these cards better."
The 10GB 3080 at $700 is a pure lie and marketing for noobs too. If you can find one at all, the actual price is $1,200-1,300.

But yeah, I agree: the 12GB 3060 is for noobs and just an answer to the 16GB 6000-series cards.
#18
ObiFrost
NVIDIA releasing new cut-downs instead of increasing the memory capacity of already existing models pretty much reflects how little they care about their customers. All it took was a competitor releasing a product to make NVIDIA go REEEEEEE and spawn another 3-5 abominations of the same card, each 5-10% apart, lol. This is why I prefer AMD: not because they are underperforming and undercompeting, but because they are less confusing from a consumer point of view. They don't rush out Tis, SUPERs, Ultras, and rehashed models with increased memory and different memory types, etc. Though, I'm aware of the XT revisions of 3000-series Ryzen, which were kinda a**.
#19
Valantar
ObiFrost said: "NVIDIA releasing new cut-downs instead of increasing the memory capacity of already existing models pretty much reflects how little they care about their customers. All it took was a competitor releasing a product to make NVIDIA go REEEEEEE and spawn another 3-5 abominations of the same card, each 5-10% apart, lol. This is why I prefer AMD: not because they are underperforming and undercompeting, but because they are less confusing from a consumer point of view. They don't rush out Tis, SUPERs, Ultras, and rehashed models with increased memory and different memory types, etc. Though, I'm aware of the XT revisions of 3000-series Ryzen, which were kinda a**."
Uhm, did you even read the title of this news post, let alone its contents? They have postponed those versions you're talking about indefinitely - i.e., they're not coming.

As for product stacks, Nvidia and AMD tend to be pretty close to each other in how confusing they are, though of course Pascal and the Super revisions were a mess. Also, what is confusing about a Ti SKU? 30xx is base performance, 30xx Ti is a step above. Just like 60xx is base for AMD, with 60xx XT being a step above. Nothing at all confusing about that. Time will tell if we'll have a repeat of the Super debacle down the line or not. And +~10% performance difference SKU to SKU is pretty standard for the GPU industry.
#20
Unregistered
Is it so hard for NVIDIA to just lock their GPUs and prevent them from running any cryptocurrency workload? It's just a waste of energy.
#21
Raevenlord
News Editor
Xex360 said: "Is it so hard for NVIDIA to just lock their GPUs and prevent them from running any cryptocurrency workload? It's just a waste of energy."
Cryptocurrency workloads are pretty specific in their loops and in what they demand from the graphics card, so I (who have no deep technical knowledge of this) believe it wouldn't be impossible to make the cards duds for cryptocurrency mining at the firmware or (maybe even) driver level.

However, none of these companies wants to alienate prospective buyers of their products. A sale is (mostly, but not completely) a sale for both AMD and NVIDIA. They don't want to alienate their loyal users (gamers, professionals), but they don't want to lose out on crypto-led revenue opportunities either. And so balancing acts such as crypto-specific graphics cards come about.
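To illustrate the kind of firmware- or driver-level check being speculated about here, a purely hypothetical Python sketch follows; nothing in it reflects NVIDIA's actual drivers, and the WorkloadSample fields and thresholds are invented for illustration:

```python
# Hypothetical heuristic for flagging mining-like GPU workloads.
# All fields and thresholds are invented; this is a concept sketch only.
from dataclasses import dataclass

@dataclass
class WorkloadSample:
    kernel_launches_per_sec: float  # rate of compute kernel dispatches
    distinct_kernels: int           # variety of kernels seen in the window
    mem_bus_utilization: float      # 0.0-1.0 memory controller load
    display_output_active: bool     # is anything being rendered to a screen?

def looks_like_mining(s: WorkloadSample) -> bool:
    """Ethash-style mining hammers memory with the same kernel in a tight
    loop and renders nothing; games do roughly the opposite."""
    return (
        s.kernel_launches_per_sec > 50
        and s.distinct_kernels <= 2
        and s.mem_bus_utilization > 0.9
        and not s.display_output_active
    )

# A steady, memory-bound, display-less loop trips the heuristic.
print(looks_like_mining(WorkloadSample(120.0, 1, 0.97, False)))  # True
```

As the comment suggests, the obstacle may be less technical feasibility than business willingness.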
#22
TheUn4seen
Xex360 said: "Is it so hard for NVIDIA to just lock their GPUs and prevent them from running any cryptocurrency workload? It's just a waste of energy."
It would cripple the card in many more workloads. Also, why the hell would they do such a thing? A sale is a sale, and while you can argue that shortages for gamers might be bad PR, well, whatever. In times when the market is shaky, companies prefer reliable clientele, and miners who buy tens or hundreds of cards in one go are just that. Know your place and wait for your turn: first come corporate customers, then miners, followed by smaller system integrators. DIY consumers are last on the list.
#23
Tom Sunday
Ravenmaster said: "...because the 3080 has an inadequate amount of VRAM and should never have been released in the first place."
Indeed, a lot of gamers with deep pockets are still holding off for a card with the specs of a 3080 Ti, as game developers throw more and more graphically demanding titles onto the market. Many games are being purchased solely on the basis of their 'pretty worlds' and over-the-top sandbox bling. There seems to be no end to this; it essentially started when gamers were delighted to see water and cloud reflections making their days and nights. From what I hear, a 20GB GPU is seen as a sort of minimum for (gaming) future-proofing, at least for a few more years. I am also told, hanging around the computer shops with the boys from Mumbai, that if the 3080 Ti doesn't come about for whatever reason, most enthusiasts will get the RTX 3090 instead and call it a day. Surely NVIDIA and their investors will love that! My tech buddy Harry said: "What the hell, I paid $1,480 years ago for an MSI Gaming Trio 2080 Ti, so an RTX 3090 is not out of reach at all. Now a 20GB card is my sweet spot." As for me, I'm always short of cash, being a simple man on the street, but I do have a blonde, who is my sweet spot.
#24
jeremyshaw
TheUn4seen said: "It would cripple the card in many more workloads. Also, why the hell would they do such a thing? A sale is a sale, and while you can argue that shortages for gamers might be bad PR, well, whatever. In times when the market is shaky, companies prefer reliable clientele, and miners who buy tens or hundreds of cards in one go are just that. Know your place and wait for your turn: first come corporate customers, then miners, followed by smaller system integrators. DIY consumers are last on the list."
Miners are the definition of unreliable clientele: volatile (tens or hundreds of thousands of orders one month, zero the next), they flood the market when they are done with a card (unlike gamers and workstations, which retire their cards at vastly different rates), and of course there's the stigma that comes from even working with miners.

There is nothing to be gained from catering to miners, unless your GPUs are good at mining and uncompetitive in the core gaming market. Right now anything, even ancient Polaris, is flying off the shelves; everything is wildly competitive and in demand in the gaming market. Overall, it's better to sell to traditional consumers than to cater to mining.

Also, somewhat unrelated, but NVIDIA was long the largest customer of TSMC before being displaced by Bitmain (a mining ASIC company) a few years ago. I wonder how much of TSMC's production allocation still goes to mining. Same with UMC, SMIC, GloFo, and the myriad of other logic fabs out there. Well, probably not SMIC, given China's official policies on mining.
#25
Anymal
Time doesn't matter; performance does.