Tuesday, December 15th 2020

NVIDIA GeForce RTX 3080 Ti Graphics Card Launch Postponed to February

We have previously heard rumors about NVIDIA's upcoming GeForce RTX 3080 Ti graphics card. With a January release reportedly scheduled, we were just a few weeks away from it. The new graphics card is designed to fill the gap between the RTX 3080 and the higher-end RTX 3090 by offering the same GA102 die, the only difference being that the 3080 Ti uses the GA102-250 variant instead of the GA102-300 die found in the RTX 3090. It allegedly has the same 10,496 CUDA cores, the same 82 RT cores, 328 Tensor cores, 328 texture units, and 112 ROPs. However, the RTX 3080 Ti is supposed to bring the GDDR6X memory capacity down to 20 GB, instead of the 24 GB found on the RTX 3090.
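Assuming the rumored 320-bit bus and the same 19 Gbps GDDR6X data rate as the RTX 3080 (neither confirmed by NVIDIA), the bandwidth implications are easy to sketch:

    # Peak memory bandwidth in GB/s = (bus width / 8 bits per byte) * data rate.
    # The RTX 3080 Ti figures below are rumored/assumed, not confirmed.
    def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
        return bus_width_bits / 8 * data_rate_gbps

    print(bandwidth_gb_s(384, 19.5))  # RTX 3090: 936.0 GB/s
    print(bandwidth_gb_s(320, 19.0))  # rumored RTX 3080 Ti: 760.0 GB/s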

However, all of that is going to wait a little bit longer. According to information obtained by Igor Wallosek of Igor's Lab, NVIDIA's upcoming high-end GeForce RTX 3080 Ti graphics card has been postponed to a February release. Previous rumors suggested that we would get the card in January with a price tag of $999. That, however, has changed, and NVIDIA has allegedly pushed the launch to February. It is not yet clear what the cause is, but we speculate that the company cannot meet the high demand that the new wave of GPUs is generating.
Sources: Igor's Lab, via VideoCardz

121 Comments on NVIDIA GeForce RTX 3080 Ti Graphics Card Launch Postponed to February

#51
Ominence
where in the heck is my 7nm $3k Ampere Titan already :kookoo:
#52
MxPhenom 216
ASIC Engineer
mouacyk: NVidia likely can't do that for this generation... but nothing's stopping them in the next round. AMD went all 256-bit because they have the clock speed and SAM to make up for some of it. It is telling that as nodes continue to shrink, clock speeds will continue to climb and big buses won't be needed (sadly).
You can't make up the bandwidth lost to a smaller bus width with clock speed alone, though.
#53
robb
MxPhenom 216: You can't make up the bandwidth lost to a smaller bus width with clock speed alone, though.
If all else were equal, it would make essentially no difference. If you had a 256-bit bus with VRAM fast enough to match a 512-bit bus with slower VRAM, and they had the same effective bandwidth, then they would have the same effective performance. Again, that's if everything else were the same, including ROPs. There is no memory fast enough at this point to make up for dropping down to a 256-bit bus on a 3080, though, and that's something that pea brain can't comprehend.
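A quick sketch of the argument, with hypothetical data rates (the numbers are illustrative, not any card's spec sheet):

    # Equal effective bandwidth means bus width alone doesn't matter; the data
    # rates here are hypothetical, chosen only to make the two configs match.
    def bandwidth_gb_s(bus_bits: int, gbps: float) -> float:
        return bus_bits / 8 * gbps

    assert bandwidth_gb_s(512, 14.0) == bandwidth_gb_s(256, 28.0)  # both 896 GB/s

    # The 3080's 320-bit / 19 Gbps config gives 760 GB/s; matching it on a
    # 256-bit bus would need 23.75 Gbps chips, faster than any GDDR6X shipping
    # at the time of writing.
    print(bandwidth_gb_s(320, 19.0) / (256 / 8))  # 23.75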
#54
mouacyk
MxPhenom 216: You can't make up the bandwidth lost to a smaller bus width with clock speed alone, though.
I was intentionally vague... but you're right, the only way to directly make up for the lack of bandwidth would be to put in faster memory chips. I actually meant that faster GPU clock speeds will make it up elsewhere in your pipeline and still keep your render time about the same (unless your specific use case is memory-bandwidth dependent, like scientific apps).
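One way to picture that is a crude roofline-style model, where frame time is set by whichever resource saturates first (all workload numbers below are hypothetical):

    # Crude roofline-style model: frame time is bounded by the slower of the
    # compute side and the memory side. All workload numbers are hypothetical.
    def frame_time_ms(work_tflop: float, traffic_gb: float,
                      compute_tflops: float, bandwidth_gb_s: float) -> float:
        compute_ms = work_tflop / compute_tflops * 1000
        memory_ms = traffic_gb / bandwidth_gb_s * 1000
        return max(compute_ms, memory_ms)

    print(frame_time_ms(1.0, 5.0, 30.0, 760.0))   # 33.3 ms, compute limited
    # Narrower bus (760 -> 608 GB/s) but 10% higher GPU clocks: still faster.
    print(frame_time_ms(1.0, 5.0, 33.0, 608.0))   # 30.3 ms, compute limited
    # A bandwidth-heavy workload flips to memory limited and does regress:
    print(frame_time_ms(1.0, 30.0, 33.0, 608.0))  # 49.3 ms vs 39.5 ms before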
#55
lexluthermiester
robb: you are too goddamn stupid
Darwin awards are coming to mind...
#56
robb
lexluthermiester: Darwin awards are coming to mind...
Well, I'm sure your name and those awards come up together quite often. But please, feel free to actually prove how a 256-bit 3080 would still match the performance of the 320-bit 3080. I mean, I know you're extremely slow and all, because I've already explained to you that they don't have fast enough memory to make up for the smaller bus width, but maybe you have some magical proposal.
#57
mouacyk
MxPhenom 216: I'm just going to leave this here for anyone that wants to read:

How the Bus of a GPU Affects the Amount of VRAM | ITIGIC
Very useful and interesting. The new A6000 GPU must be using the x8 mode to address two chips of 2 GB each, so there are 12 chips on the front and 12 chips on the back of the card. I guess that's more likely than Micron producing 4 GB chips.

As for the 3080 Ti, the rumored drop to 320-bit is already quite a cut for this SKU. I would have preferred 352-bit, like the 1080 Ti, or even the full 384-bit of the 980 Ti and 780 Ti.
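For reference, capacity falls straight out of bus width, chip density, and whether the card runs the clamshell (x8) arrangement described above; the 3080 Ti line is an assumed configuration:

    # VRAM capacity from bus width and chip configuration. Each GDDR6/6X chip
    # normally gets a 32-bit channel; clamshell (x8) mode splits a channel
    # between two chips, doubling the chip count and the capacity.
    def vram_gb(bus_bits: int, chip_gb: int, clamshell: bool = False) -> int:
        chips = bus_bits // 32 * (2 if clamshell else 1)
        return chips * chip_gb

    print(vram_gb(384, 2, clamshell=True))  # 48 GB: A6000, 12 + 12 chips
    print(vram_gb(384, 1, clamshell=True))  # 24 GB: RTX 3090
    print(vram_gb(320, 1, clamshell=True))  # 20 GB: rumored 3080 Ti (assumed)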
#58
yotano211
Ominence: where in the heck is my 7nm $3k Ampere Titan already :kookoo:
It's on eBay selling for $6k; you wanna buy it from me for $5500?
lexluthermiester: Darwin awards are coming to mind...
I love this place.
#59
Totally
robb: By that I mean running 4K may only use a few hundred megs, or maybe a gig and a half more, even though it's four times the resolution of 1080p. Ray tracing eats up a lot of VRAM, and we've already seen games at 1080p that will easily go right up against the 6 gig buffer and sometimes actually need more.
These aren't 4K cards, barely 1440p cards; they're going to hit other limits before VRAM becomes an issue. That's the argument, but you have your head too far up you-know-where to realize that.
robb: Plus it's obvious that games are going to use more VRAM over the next year or so.
Is that the same crystal ball that told you two years ago that it was obvious RTRT would be ubiquitous in the next year or so?

Yes, it's also pointless to have 12 GB on the 6700 XT.
#60
robb
Totally: These aren't 4K cards, barely 1440p cards; they're going to hit other limits before VRAM becomes an issue. That's the argument, but you have your head too far up you-know-where to realize that.

Is that the same crystal ball that told you two years ago that it was obvious RTRT would be ubiquitous in the next year or so?

Yes, it's also pointless to have 12 GB on the 6700 XT.
Perhaps work on your reading comprehension skills. I said nothing about playing games with this card at 4K. Please actually go back and look at what I said, and pay attention to the context: I was talking about resolution not being the only factor, and in fact nowhere near the factor that some people think it is when it comes to VRAM. And apparently you've been living under a rock, because even at 1440p, 6 gigs will not always be enough with ray tracing on. There are already a couple of games right now where performance will drop off hard or stutter unless you lower textures or some other setting. So right now, buying a brand-new 60-class card with only 6 gigs of VRAM would be beyond idiotic for upcoming games. And if you comprehend how things work, then you know 12 gigs is the only other option, so of course it will not be a waste. If anything, it's some of the other cards that need more VRAM to make any more sense. FFS, even right now, if I crank all the settings in Wolfenstein: Youngblood, the game will crash and run out of VRAM even with an 8 gig card. You can't run the textures on the fully maxed setting; you have to leave it on the default that comes with the highest preset with only 8 gigs of VRAM at 1440p.
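Some rough arithmetic backs up the resolution point; the render-target count and pixel format below are illustrative assumptions, not any particular engine's:

    # Resolution-dependent VRAM is mostly render targets. Target count and
    # bytes per pixel here are illustrative assumptions, not a real engine's.
    def render_targets_mb(width: int, height: int,
                          targets: int = 6, bytes_per_pixel: int = 8) -> float:
        return width * height * targets * bytes_per_pixel / 2**20

    print(render_targets_mb(1920, 1080))  # ~95 MB at 1080p
    print(render_targets_mb(3840, 2160))  # ~380 MB at 4K
    # Four times the pixels adds only ~285 MB here, while textures and
    # ray-tracing acceleration structures don't scale with resolution at all.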
#61
lexluthermiester
Totally: Yes, it's also pointless to have 12 GB on the 6700 XT.
That remains to be seen.
robb: Perhaps work on your reading comprehension skills.
Are you always this nasty to everyone? Seriously...
#62
robb
lexluthermiester: That remains to be seen.

lexluthermiester: Are you always this nasty to everyone? Seriously...
Well, in your case you keep making idiotic comments over and over even when I explain to you how what you're suggesting will not work. In his case, he clearly did not read what I said yet wants to argue about it. And his argument is 100% BS, as even right now 6 gigs is not enough for a couple of games. It's not my fault that you are completely ignorant about how video cards work, or that he does not seem to be up to snuff on how much VRAM is needed for current and upcoming games. Neither one of you wants to accept anything and instead wants to argue, so you both definitely deserve to be treated rudely.
#63
95Viper
Stop the personal jabs.
Keep to the topic.
Remember, the guidelines state:
Be polite and constructive; if you have nothing nice to say, then don't say anything at all.
#64
lexluthermiester
robb: Well, in your case you keep making idiotic comments over and over even when I explain to you how what you're suggesting will not work. In his case, he clearly did not read what I said yet wants to argue about it. And his argument is 100% BS, as even right now 6 gigs is not enough for a couple of games. It's not my fault that you are completely ignorant about how video cards work, or that he does not seem to be up to snuff on how much VRAM is needed for current and upcoming games. Neither one of you wants to accept anything and instead wants to argue, so you both definitely deserve to be treated rudely.
You assume you are correct when history has already PROVEN you incorrect. You can make all the insults you wish; it doesn't make you right. A 16 GB GA102 variant is completely possible and doable. Whether they name it a 3080S or a 3070 Ti is up for debate, but the config is completely possible. If you don't understand that, YOU have the problem. Let it go.
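To ground that claim, here is a sketch of the capacities reachable by cutting GA102's 384-bit memory controller down in 32-bit steps; which of these NVIDIA would actually ship is pure speculation:

    # Capacities on a cut-down GA102 memory controller, with 1 GB or 2 GB
    # chips, in normal or clamshell mode. What ships is pure speculation.
    for bus in (384, 352, 320, 288, 256):
        for chip_gb in (1, 2):
            for clamshell in (False, True):
                chips = bus // 32 * (2 if clamshell else 1)
                print(f"{bus}-bit, {chip_gb} GB chips, "
                      f"clamshell={clamshell}: {chips * chip_gb} GB")
    # 16 GB shows up at 256-bit (8 x 2 GB, or 16 x 1 GB clamshell), which is
    # why a 16 GB GA102 card implies the narrower bus debated above.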
#65
Gmr_Chick
@robb - my dude, seriously...go eat a Snickers. You're being particularly nasty to some here, and over what? A stupid GPU and whether it can or can't have 16, 20, or a fuckton GB of VRAM. :rolleyes:
#66
Fclem
Totally: 3060 12GB? Is there an actual point to that card other than to part a fool with their money? 6 GB should be enough for the resolutions it's going to be effective at, with 8 GB being the upper limit needed.
Playing CoD Warzone in wide 5K (like on the G9) maxes out the 8 GB of VRAM on my RTX 2070. I was thinking of upgrading to an RTX 3080, but considering the performance boost, I don't think gaining 2 GB of VRAM is enough. And a 3060 is at least as powerful as a 2070, if not more, so I'll be glad to see any card, even a 3060, with 12 GB or more of VRAM.
#67
xenocide
Is there really any reason for this GPU to exist?
#68
95Viper
Last warning!
Stop the personal jabs and attacks.
Keep it technical and on topic.
#69
saki630
Can they stop with the paper launches of crap we don't need? Get us some 3070s ASAP.
#70
yotano211
saki630: Can they stop with the paper launches of crap we don't need? Get us some 3070s ASAP.
What's your definition of a paper launch? Nvidia already released the 3000-series cards, and AMD their own cards; they are out in the wild. The only issue is supply and huge demand, since lots of people around the world are home much more.
#71
Sunny and 75
Currently, graphics cards may be fine in the VRAM department, but 12 GB of VRAM is gonna be the bare minimum in late 2021 and beyond. I'm actually glad there will be a 3060 12GB, and AMD is also gonna be offering the 6700 XT with 12 GB, so overall I'm really excited. I just wish the prices would become stable (closer to MSRP) later on (before Q3, I hope); fingers crossed! :)

And don't forget about the blue team: Intel plans to enter the discrete graphics arena sometime in 2H 2021. We'll see how intense this competition is going to be.
#72
spnidel
what a sad product stack
3080 with 10 gb, 3060 with 12 gb
LOL
#73
Valantar
I'm looking forward to when the 3040 comes out with 24GB of DDR3.
#74
xenocide
spnidel: what a sad product stack
3080 with 10 gb, 3060 with 12 gb
LOL
It's almost like VRAM allocation doesn't matter much. We know Nvidia designs the memory system around the performance of the GPU, and we saw for a decade what slapping more VRAM on cards did (spoiler: literally nothing). There is not a single game out there that is currently bottlenecked by VRAM, and I'm not sure there ever has been.
#75
Ominence
xenocide: It's almost like VRAM allocation doesn't matter much. We know Nvidia designs the memory system around the performance of the GPU, and we saw for a decade what slapping more VRAM on cards did (spoiler: literally nothing). There is not a single game out there that is currently bottlenecked by VRAM, and I'm not sure there ever has been.
For someone for whom, at this stage, gaming is like an interactive movie experience, and who cannot stand low-res textures or aliasing, memory matters incredibly, mate. What I'm really :mad: about is that the nvidiots at NVIDIA were banking so heavily on DLSS that they 'forgot' AMD has the freakin' next-gen console market, and VRAM needs to be respectable for next-gen games (Gears 5, CP77: the 3090 actually makes a difference over and above the 3080 o_O). No wonder they are forced into such idiotic product segmentation, where double-VRAM versions are required within six months of the current gen's launch with no process improvement whatsoever. This is the first time a 90 and an 80 Ti card will exist side by side. Crazy shortsightedness to be actually forced into such a launch cycle. This is no rumourville stuff and the like; they are actually releasing this 20 GB variant :wtf:. Can't wait for the Ampere end-of-life SKUs. I'd love to see the lineup below now.
3090 Super Ti $2100
3090 Ti $1900
3090 Super $1700
3090 $1500
3080 Super Ti $1150
3080 Ti $1000
3080 Super $850
3080 $700
This is probably off topic already, so I'll stop here.