Tuesday, December 15th 2020

NVIDIA GeForce RTX 3080 Ti Graphics Card Launch Postponed to February

In the past, we heard rumors about NVIDIA's upcoming GeForce RTX 3080 Ti graphics card. With a January release scheduled, we were just a few weeks away from it. The new graphics card is designed to fill the gap between the RTX 3080 and the higher-end RTX 3090 by offering the same GA102 die, the only difference being that the 3080 Ti uses the GA102-250 variant instead of the GA102-300 die found in the RTX 3090. It allegedly has the same CUDA core count of 10,496 cores, the same 82 RT cores, 328 Tensor Cores, 328 Texture Units, and 112 ROPs. However, the RTX 3080 Ti is supposed to bring the GDDR6X memory capacity down to 20 GBs, instead of the 24 GB found on the RTX 3090.

However, all of that is going to wait a little longer. Thanks to information obtained by Igor Wallossek of Igor's Lab, we have data that NVIDIA's upcoming high-end GeForce RTX 3080 Ti graphics card is being postponed to a February release. Previous rumors suggested that we were going to get the card in January with a price tag of $999. That, however, has changed, and NVIDIA has allegedly postponed the launch to February. It is not yet clear what the cause is; however, we speculate that the company cannot meet the high demand that the new wave of GPUs is producing.
Sources: Igor's Lab, via VideoCardz

121 Comments on NVIDIA GeForce RTX 3080 Ti Graphics Card Launch Postponed to February

#26
dicktracy
Perfect! Just in time to return my 3090.
Posted on Reply
#27
MxPhenom 216
ASIC Engineer
mouacyk: Even if confirmed at $999, still going to be a ripoff but it's going in the right direction: $1200 -> $999 -> $699?
I hope you feel that way about the 6900xt
Posted on Reply
#28
Altair
Vayra86: 6, 8, 10, 12, 20... why Nvidia, why not just do it right the first time.

Now you've got cards short and cards overdone on their VRAM throughout the whole stack. It's either too little or far too much. Well played.
What about 11? ;]
Posted on Reply
#29
BoboOOZ
They decided to wait to have at least 10k GPUs at launch? :p
Posted on Reply
#30
MxPhenom 216
ASIC Engineer
If they switch to TSMC 7nm for this GPU, I will take one. That would probably get the ball rolling for the 3070 and 3080 cards going to TSMC too, with some sort of refresh launch in 2H 2021.
Posted on Reply
#31
BigBonedCartman
Just say they launched today; nobody will know the difference when they're never in stock.
Posted on Reply
#32
mouacyk
MxPhenom 216: I hope you feel that way about the 6900xt
Absolutely not, I'm biased for the underdog.
Posted on Reply
#33
lexluthermiester
AleksandarK: However, the RTX 3080 Ti is supposed to bring the GDDR6X memory capacity down to 20 GBs
You stated 20 GBs. It should be just 20GB.

Additionally, it should be noted that there are rumblings of a 16GB version of the 3080 coming as an answer to the RX6000 cards.
Posted on Reply
#34
Anymal
lepudruk: "Launch" like real launch or the usual? (paper one)
Does it even matter, since its $1000 USD MSRP means 1300 EUR in the EU?
Posted on Reply
#35
SIGSEGV
yes. winter still there..
Posted on Reply
#36
svan71
Awesome news! Another card I can't get due to ZERO stock.
Posted on Reply
#37
Tomgang
Bummer, so now I have to wait even longer for the GPU I want. That will be the RTX 3080 Ti :(

Just one piece of bad news after another this year. 2020 sucks :mad:
Posted on Reply
#38
robb
Totally: 3060 12GB? Is there an actual point to that card other than to part a fool with their money? 6GB should be enough for the resolutions it's going to be effective at, with 8GB being the upper limit needed.
What a stupid comment. If you actually knew anything about vram usage then you would know resolution is not the only thing that matters and in fact in some games it doesn't even have a huge amount of impact. By that I mean running 4K may only use a few hundred megs or maybe a gig and a half more even though it's four times the resolution of 1080p. Ray tracing eats up a lot of vram and we've already seen games at 1080p that will easily go right up against the 6 gig buffer and sometimes actually need more. Plus it's obvious that games are going to use more vram over the next year or so. Hell I mean you do realize the 6700 XT is going to have 12 gigs of vram don't you?
lexluthermiester: You stated 20 GBs. It should be just 20GB.

Additionally, it should be noted that there are rumblings of a 16GB version of the 3080 coming as an answer to the RX6000 cards.
There are no rumblings about a 3080 16 gig card, unless it's by people that have no clue what they're talking about. You can't end up with that amount of vram on a 320-bit card.
Posted on Reply
#39
lexluthermiester
robb: What a stupid comment.
Irony much?
robb: There are no rumblings about 3080 16 gig card unless it's by people that have no clue what they're talking about.
Again with the irony.
robb: You can't end up with that amount of vram on a 320 bit card.
You can with a 256-bit bus. You were saying?
Posted on Reply
#40
robb
lexluthermiester: Irony much?

Again with the irony.

You can with a 256bit bus. You were saying?
I see you have absolutely no clue what you're talking about. Does it really have to be explained to you that if they dropped the 3080 down to a 256-bit bus that it would not be a 3080 anymore?
Posted on Reply
#41
KLMR
Released in February 2021, seen in February 2022.
Posted on Reply
#42
lexluthermiester
robb: I see you have absolutely no clue what you're talking about.
More irony. You're really good at that.
robb: Does it really have to be explained to you that if they dropped the 3080 down to a 256-bit bus that it would not be a 3080 anymore?
Sure it would, it would just be an alternate version with alternate specs. Common practice. Seriously, history much?
Posted on Reply
#43
robb
lexluthermiester: More irony. You're really good at that.

Sure it would, it would just be an alternate version with alternate specs. Common practice. Seriously, history much?
At this point it's obvious that you're too stupid to even realize you're stupid. The 3080 is a certain level of performance, and if they neutered the memory bandwidth by cutting down the bus that much just to add more memory, it would make no sense because you would have a slower card, dimwit. Only a complete and utter moron would think it would make sense to have a 3080 with more vram that's slower than the 3080 that has less vram. It would have to be a differently named card if it was cut down all the way to 256-bit. It could be a 3070 Ti or something like that, but there's no way in hell it could be a 3080.
Posted on Reply
#44
MxPhenom 216
ASIC Engineer
lexluthermiester: More irony. You're really good at that.

Sure it would, it would just be an alternate version with alternate specs. Common practice. Seriously, history much?
They wouldn't do that with Ampere. I don't remember that ever being done, at least in recent history.

GPUs are designed with a certain memory type and bit width in mind. It's not just something they can change from a design standpoint and expect the GPU to perform the same way.

Frame buffer size matters, but bandwidth also matters. You don't want to bandwidth-starve a GPU with the raw power that Ampere has. Look at what the 3080 can do at 4K vs an RDNA2 card. It's a fine balance that is pre-determined by the GPU arch.
Posted on Reply
#45
lexluthermiester
robb: At this point it's obvious that you're too stupid to even realize you're stupid.
Wow, harsh words. Projection much? I mean really, you need a mirror.
robb: The 3080 is a certain level of performance and if they neutered the memory bandwidth by cutting down the bus that much just to add more memory it would make no sense because you would have a slower card, dimwit.
Looks like a nerve was struck. LOL!
robb: Only a complete and utter moron would think
No, "a complete and utter moron" would not think, that's what makes them as you described.
robb: have a 3080 with more vram that's slower than the 3080 that has less vram.
Except that both NVidia and AMD have done exactly that in the past. And they have always equalized performance by boosting VRAM performance or making another change that refined performance.
robb: it could be a 3070 TI or something like that but there's no way in hell it could be a 3080.
That's certainly possible. But NVidia will name their product line as THEY see fit, not you.
MxPhenom 216: They wouldn't do that with Ampere. I dont remember, at least in recent history that that was ever done either.
Then you need to look at specs for past products a little closer.
MxPhenom 216: GPUs are designed with a certain memory type and bit width in mind.
That does not mean that such designs are rigid and inflexible.
Posted on Reply
#46
robb
lexluthermiester: Wow, harsh words. Projection much? I mean really, you need a mirror.

Looks like a nerve was struck. LOL!

No, "a complete and utter moron" would not think, that's what makes them as you described.

Except that both NVidia and AMD have done exactly that in the past. And they have always equalized performance by boosting VRAM performance or making another change that refined performance.

That's certainly possible. But NVidia will name their product line as THEY see fit, not you.
So even when I think you've hit rock bottom of ignorance, you find a way of digging even deeper. And what is this magical way of increasing vram performance? Oh yeah, that would be from faster vram, of course. But guess what, there is no faster vram available right now than 19,000 that they can effectively use. 21,000 is in somewhat limited supply, and rumors suggested that it actually ran too hot and was quite power hungry. And even if they could use 21,000, it would still have less memory bandwidth on a 256-bit bus than 19,000 on the 320-bit bus. And a 16 gig 3080 would cost more money, so again, get it through your pea brain that they would not release a more expensive 3080 with more vram that was actually slower than the 10 gig 3080 model. Now please STFU and go educate yourself, or stick to commenting on YouTube where you will fit in just fine.
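For what it's worth, the bandwidth claim above checks out on paper. A quick sketch (GDDR6X delivers one bit per pin per transfer, so peak bandwidth is bus width times the per-pin data rate, divided by 8 to get bytes; the function name is just for illustration):

```python
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and per-pin rate."""
    return bus_width_bits * data_rate_gbps / 8

# RTX 3080 as shipped: 320-bit bus with 19 Gbps GDDR6X
print(bandwidth_gbs(320, 19))  # 760.0 GB/s
# Hypothetical 256-bit card, even with faster 21 Gbps chips
print(bandwidth_gbs(256, 21))  # 672.0 GB/s -- still slower
```

So even with the faster (hotter, scarcer) 21 Gbps chips, a 256-bit card comes up short of the 320-bit/19 Gbps configuration.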
Posted on Reply
#49
robb
lexluthermiester: You first.
I have already explained to you every possible way why a 16 gig 3080 cannot happen. You have shown no viable way that it can happen and be able to maintain the same performance as the 3080 10 gig. It is your own fault that you are too goddamn stupid to accept that.
Posted on Reply
#50
mouacyk
NVidia likely can't do that for this generation, but nothing's stopping them in the next round: AMD went all 256-bit because they have the clock speed and SAM to make up for some of it. It is telling that as nodes continue to shrink, clock speeds will continue to climb and big busses won't be needed (sadly).

FYI, for those who don't already know... GDDR6/X are only produced in 1 GB and 2 GB capacity chips. It takes 32 bits to address each chip.
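That chip math is what pins down the possible capacities per bus width. A minimal sketch (hypothetical helper; assumes all chips on the card are the same density and ignores clamshell/mixed configurations):

```python
def possible_capacities_gb(bus_width_bits: int) -> list[int]:
    """VRAM capacities reachable with uniform 1 GB or 2 GB GDDR6/X chips,
    one chip per 32-bit channel."""
    chips = bus_width_bits // 32
    return [chips * 1, chips * 2]  # all-1GB config, all-2GB config

print(possible_capacities_gb(320))  # [10, 20] -> 3080 / rumored 3080 Ti
print(possible_capacities_gb(256))  # [8, 16]  -> why 16 GB implies 256-bit
```

Which is why a 320-bit card lands on 10 GB or 20 GB, and a 16 GB card implies a 256-bit bus.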
Posted on Reply