Tuesday, December 15th 2020

NVIDIA GeForce RTX 3080 Ti Graphics Card Launch Postponed to February

We have previously heard rumors about NVIDIA's upcoming GeForce RTX 3080 Ti graphics card. With a January release reportedly scheduled, we were just a few weeks away from it. The new graphics card is designed to fill the gap between the RTX 3080 and the higher-end RTX 3090 by offering the same GA102 die, with the only difference being that the RTX 3080 Ti uses the GA102-250 variant instead of the GA102-300 found in the RTX 3090. It allegedly has the same CUDA core count of 10,496 cores, the same 82 RT cores, 328 Tensor cores, 328 texture units, and 112 ROPs. However, the RTX 3080 Ti is supposed to bring the GDDR6X memory capacity down to 20 GB, instead of the 24 GB found on the RTX 3090.

However, all of that is going to have to wait a little longer. Thanks to information obtained by Igor Wallossek of Igor's Lab, we now have word that NVIDIA's upcoming high-end GeForce RTX 3080 Ti graphics card has been postponed to a February release. Previous rumors suggested that we would get the card in January with a price tag of $999. That, however, has changed, and NVIDIA has allegedly pushed the launch to February. It is not yet clear what the cause is, but we speculate that the company cannot meet the high demand that the new wave of GPUs is generating.
Sources: Igor's Lab, via VideoCardz

121 Comments on NVIDIA GeForce RTX 3080 Ti Graphics Card Launch Postponed to February

#1
Vayra86
So, one extra week of waiting for every percent of the gap between the 3080 and the 3090.

Totally worth it. Why does this card exist?
#2
EarthDog
I'll bite... what is the news/information you gleaned from Igor that made you post this?

Can you link to the source?
EDIT: Got it... you have to go to the article on the front page.
#3
Animalpak
Now they're starting to get my attention. Let's see, let's see.
#4
Vayra86
6, 8, 10, 12, 20... why, Nvidia, why not just do it right the first time?

Now you've got cards short on VRAM and cards overdone on it throughout the whole stack. It's either too little or far too much. Well played.
#5
Totally
A 3060 with 12 GB? Is there an actual point to that card other than to part a fool from their money? 6 GB should be enough for the resolutions it's going to be effective at, with 8 GB being the upper limit needed.
#6
lepudruk
"Launch" like real launch or the usual? (paper one)
#7
ebivan
Are these regular power connectors in the rendering? Is nVidia abandoning their new nonsense connector after only 4 months?
#8
Vayra86
lepudruk"Launch" like real launch or the usual? (paper one)
This is the news announcement of the pre-announcement event that announces the pre-paper-launch release event sometime around Feb, to tease the actual paper launch later this year.

Expect GPUs by Q4 '21. They might have memory. Reviewers already got their cards, by the way. Well, those that only cover RT and DLSS, that is; the cards rendered themselves to their doorstep faster.
#9
Vya Domus
This is pretty much just an attempt at grabbing back some attention, since there aren't going to be any cards available. Smoke and mirrors to fool investors.
#11
27MaD
Totally: A 3060 with 12 GB?
GT 630 4GB, LOL :roll:
#12
neatfeatguy
ebivan: Are these regular power connectors in the rendering? Is nVidia abandoning their new nonsense connector after only 4 months?
No, they're not abandoning it. But that doesn't mean they'll use it on every card they put out.

Also, the image posted is most likely just a rendering. To me, the card looks kind of fake, like they tried way too hard to utilize RT in the image.
#13
P4-630
I need an RTX 3030 with 20 GB of VRAM. :rockout:
#14
lepudruk
Chomiq: February... 2022.
You optimist!
#15
Dristun
Totally: A 3060 with 12 GB? Is there an actual point to that card other than to part a fool from their money? 6 GB should be enough for the resolutions it's going to be effective at, with 8 GB being the upper limit needed.
The actual "enough" would've been 8 gigs, but because they want to extract some more money, they'll just make people pay for 12 gigs that these folks will never need. Same with 20 gigs for the 3080, lol. For people who want 4K@60+, it won't manage that in 3 years anyway; they'll just upgrade again. Bollocks, all of it.
#16
EarthDog
Dristun: The actual "enough" would've been 8 gigs, but because they want to extract some more money, they'll just make people pay for 12 gigs that these folks will never need. Same with 20 gigs for the 3080, lol. For people who want 4K@60+, it won't manage that in 3 years anyway; they'll just upgrade again. Bollocks, all of it.
Tell that to the vocal and misinformed minority who say it isn't enough. ;)
#17
Caring1
Finally, power connectors at the end of the card, where they should have been.
#18
Vayra86
Dristun: The actual "enough" would've been 8 gigs, but because they want to extract some more money, they'll just make people pay for 12 gigs that these folks will never need. Same with 20 gigs for the 3080, lol. For people who want 4K@60+, it won't manage that in 3 years anyway; they'll just upgrade again. Bollocks, all of it.
If they had just taken it all up a notch from the initial capacities, it would have been very much fine. But what we have today is a reduction from the past gen alongside a substantial increase in core power. You're looking at 3 GB less than a 2080 Ti for that tier of performance; realistically, a 3080 should've had at least 12 GB, preferably 14-16 GB.

And that trickles down the stack. I'll also remind you that Turing > Pascal was ALSO a reduction in net VRAM per % of core power. So what was Pascal then, if Ampere is just fine? Way over the top? I dunno, man; I'm seeing nearly 7 GB of usage at 3440x1440 on an 8 GB 1080 for a fine gameplay balance in Cyberpunk. And consoles already carry at least 10 GB on a far weaker GPU than a 3080.
#19
Valantar
Why are you using some old, misleading fan-made render to represent this?
#20
WonkoTheSaneUK
Caring1: Finally, power connectors at the end of the card, where they should have been.
That's an old render, not a photo. The power connector on the FE will likely remain a 12-pin, in the top middle.
#21
GhostRyder
Well, it is kind of a funny/interesting card. I feel like this is gonna butcher 3090 sales, though. I wonder if it's gonna be severely power-limited to prevent much overclocking, and if so, whether we'll get some of those extreme overclocking editions.

Well, seeing as I apparently can't get any card, I may try to get this one when it releases, lol.
#22
mouacyk
Even if it's confirmed at $999, it's still going to be a ripoff, but at least it's going in the right direction: $1200 -> $999 -> $699?
#23
DeathtoGnomes
WonkoTheSaneUK: That's an old render, not a photo. The power connector on the FE will likely remain a 12-pin, in the top middle.
That's not an FE.
#24
EarthDog
DeathtoGnomes: That's not an FE.
The one in the first post is...
#25
TheinsanegamerN
Totally: A 3060 with 12 GB? Is there an actual point to that card other than to part a fool from their money? 6 GB should be enough for the resolutions it's going to be effective at, with 8 GB being the upper limit needed.
You underestimate the number of fools. Look how many are banging the drum of "10 GB isn't enough" despite the 6800 XT consistently being a bit slower than the 3080.