
NVIDIA Reportedly Cancels Launch of RTX 3080 20 GB, RTX 3070 16 GB

Folks are complaining that the RAM currently available on an RTX 3080 isn't enough for their gaming needs...

I'm going to round up some GT 730 4GB cards for you folks to buy at a hugely inflated price over the 1GB and 2GB versions. First, I'll market the 1/2GB versions as must-haves! Tout their performance awesomeness... then, after I soak up the money from the suckers - I mean, customers - I'll bring out my 4GB models, make you all believe they're so much better, and charge a much higher price... hahaha! I will be the winner here! Screw the people, just give me their money.

...don't buy one and stop complaining.
 
I never understood the reasoning behind, or desire for, so much RAM on a gaming card. Even at 4K resolution, some games barely break the 4GB mark.
Witcher 3 didn't even break 2GB at 4K.
Especially since the typical graphics card's life is about 3 years, dumping all that cash into RAM that may never get used before the card is tossed seems a bit wasteful to me.

I would say 8-12GB should be more than adequate for 4K for the next 3 years. I think TPU should start adding GPU VRAM usage to its reviews.

Maybe graphics card makers should put SO-DIMM slots on the cards and make 4GB modules, so people can fit 32GB if they want to burn money.

As for me, I am satisfied with an 8GB card, and I don't feel like spending extra cash on RAM that I will never use.
 
Can we get a decent GPU release, please!?
Vega - Turing - Navi - Ampere... This is getting ridiculous, tiring, and stupid.
 
Cancelled for now - yeah, I believe that. The 3080/3090 shortage must be because of GDDR6X, so doubling the memory per card would be suicide at this moment. Once demand is lower, then launch the 16 and 20GB versions.
 
They will go for TSMC next year

Whilst I think this is logical, with a launch of 3090 Ti/S / 3080 Ti/S etc. next year as the retort to AMD, there is one other reason not to launch a 20GB card now: 2GB modules for GDDR6X are still unavailable, meaning you need to put modules on the back of the card, meaning extra cooling requirements on the back, etc. That feeds back into the 'why increase your BOM when you likely need to make price cuts' argument.

Whilst the 3070 is unaffected by this, since it uses non-X memory, it'd be a weird product stack going 16GB/10GB/24GB.
 
Not surprised - it makes more sense to wait for RDNA2's performance analysis and, if necessary, create Ti models with more VRAM. Alternatively, they may be preparing 7nm models with more VRAM.
 
Whilst I think this is logical, with a launch of 3090 Ti/S / 3080 Ti/S etc. next year as the retort to AMD, there is one other reason not to launch a 20GB card now: 2GB modules for GDDR6X are still unavailable, meaning you need to put modules on the back of the card, meaning extra cooling requirements on the back, etc. That feeds back into the 'why increase your BOM when you likely need to make price cuts' argument.

Whilst the 3070 is unaffected by this, since it uses non-X memory, it'd be a weird product stack going 16GB/10GB/24GB.
They're doing it with the 3090, so they didn't cancel them for that reason. It's just that 20GB would give them nothing more against the competition, and it would be absurd to charge $1000+ for a card that is already close to, or loses to, a card that's much, much cheaper.
 
They're doing it with the 3090, so they didn't cancel them for that reason.

The 3090 is also twice the price of a 3080 - a bit more room in the BOM, even with price cuts.
 
The 3090 is also twice the price of a 3080 - a bit more room in the BOM, even with price cuts.
If I said I understood that, it would be a lie.
If you can, please give more details on your thinking.
 
Well, I also suspect Nvidia would rather wait until they switch to TSMC's far better 7nm node and launch improved versions next year than offer 16/20GB variants on Samsung's 8nm. It's hilarious to see how Nvidia's arrogance bit them in the arse. They thought they could leverage Samsung against TSMC for a better price, and it backfired spectacularly: they had to use an inferior node with low yields, and look at the situation they are in now, with it being barely better than a paper launch.
 
If I said I understood that, it would be a lie.
If you can, please give more details on your thinking.

So, on the 'no 20GB 3080 until 2GB GDDR6X modules are available' argument:

1\ You need to put 1GB dies on the back of the card, and then cool them. It's not a technical issue, as the 3090 does this, but it does increase cooling requirements, and thus cost.
2\ You need to use a double-sided PCB; that PCB is more expensive, increasing BOM, and thus cost.
3\ Traditionally, at equal total capacity, 1GB dies cost more than 2GB dies (same form factor and era).

The 3090 avoids most of these issues by simply being twice the price. In essence, Nvidia already 'has' its 3080 20GB in the 3090, with a price cut if needed. It also gives more room to adjust 3080 pricing without increasing costs.

IMO, even if AMD takes the lead, we will see a 7nm refresh of Ampere on TSMC next year with Ti/Super models, with RAM likely increasing only once GDDR6X 2GB modules are available. Also, a 48GB Titan next year.
 
Do as you please. No need to worry, since you have unlimited dollars and die-hard followers ready to send you gold. :laugh:
 
So, on the 'no 20GB 3080 until 2GB GDDR6X modules are available' argument:

1\ You need to put 1GB dies on the back of the card, and then cool them. It's not a technical issue, as the 3090 does this, but it does increase cooling requirements, and thus cost.
2\ You need to use a double-sided PCB; that PCB is more expensive, increasing BOM, and thus cost.
3\ Traditionally, at equal total capacity, 1GB dies cost more than 2GB dies (same form factor and era).

The 3090 avoids most of these issues by simply being twice the price. In essence, Nvidia already 'has' its 3080 20GB in the 3090, with a price cut if needed. It also gives more room to adjust 3080 pricing without increasing costs.
OK.
You are saying this taking the $700 MSRP into account. Of course that price does not have any room for additional cost; it's hardly making a decent profit margin as it is (3080 10GB).
Nvidia was probably going to set the MSRP for the 3080 20GB at no less than $1000, with AIB cards around $1100-1200 (at least). But after seeing how RDNA2 turned out, it would be stupid to offer an even more expensive card for marginal, if any, performance gains. If AMD weren't matching the 3080, then Nvidia would be 'free' to charge more for 'the best GPU' on the market.

IMO, even if AMD takes the lead, we will see a 7nm refresh of Ampere on TSMC next year with Ti/Super models, with RAM likely increasing with the availability of GDDR6X.
Even if AMD takes the lead?
Even if AMD doesn't take the lead and simply matches them, they are now forced to refresh on TSMC, because right now a 3080 20GB would be a complete waste of money. An extra 10GB of VRAM won't save the day. They need more GPU performance, and 7nm is the only way to achieve it.
 
Maybe Nvidia is now sure of what AMD is coming out with, and it turned out to be less of a threat than expected?

According to the latest leaks, Big Navi clocks super high, so I think it is quite the opposite (i.e. a $900-1000 20GB card looks even worse against a $600-700 16GB card if the performance difference is small).
 
Indeed. It all started as rumors, and it may all end as a rumor as well. Many reasons could be suggested and speculated about, but speculating on products that were never fully announced is nothing more than mental gymnastics.

Gotta love social media.
 
NVIDIA and INTEL are making a mockery of customers... Good for AMD; let AMD take away a big chunk of their market share.
 
I don't know what the executives at Nvidia have been smoking in these pandemic times, but they've certainly shot themselves in the foot this time around.
 
It is called arrogance...
They thought AMD was nowhere near them, went cheap with Samsung, and planned to push 2080 Ti-level prices with the 20GB models, but... AMD happened.
 
They're doing it with the 3090, so they didn't cancel them for that reason. It's just that 20GB would give them nothing more against the competition, and it would be absurd to charge $1000+ for a card that is already close to, or loses to, a card that's much, much cheaper.

Much, much cheaper? Has a price been released yet? AMD could be the same price for all we know. Look at the CPU side: current releases are priced higher because they know they are competitive now. I know the GPU side hasn't had a win in a long while, but many consumers don't differentiate; they just see "AMD".
 
When their 10GB card is selling like crazy, why bother? They are competing against themselves. There is ZERO reason to undermine the 3090.
 
I doubt NVIDIA are under threat. RTX, DLSS, etc.

1) RTX isn't an advantage. We all know AMD has a ray tracing alternative (it'll be in all the consoles), and IMO real-time ray tracing is nowhere near good enough right now.

2) DLSS is in far too few games. I have yet to play a single game that supports it.

Folks are complaining that the RAM currently available on an RTX 3080 isn't enough for their gaming needs...

I'm going to round up some GT 730 4GB cards for you folks to buy at a hugely inflated price over the 1GB and 2GB versions. First, I'll market the 1/2GB versions as must-haves! Tout their performance awesomeness... then, after I soak up the money from the suckers - I mean, customers - I'll bring out my 4GB models, make you all believe they're so much better, and charge a much higher price... hahaha! I will be the winner here! Screw the people, just give me their money.

...don't buy one and stop complaining.

It's a valid point, given that VRAM typically goes up, not down, on newer video cards. The 1080 Ti had 11GB of VRAM, as did the 2080 Ti. Not increasing VRAM after three generations but instead decreasing it? That's pretty sad.

Complaining about people complaining isn't adding anything to the conversation.

Can we get a decent GPU release, please!?
Vega - Turing - Navi - Ampere... This is getting ridiculous, tiring, and stupid.

Agreed
 
Folks are complaining that the RAM currently available on an RTX 3080 isn't enough for their gaming needs...
It's a valid point, given that VRAM typically goes up, not down, on newer video cards. The 1080 Ti had 11GB of VRAM, as did the 2080 Ti. Not increasing VRAM after three generations but instead decreasing it? That's pretty sad.
Allocating VRAM for future use is one thing; actually using it is another. My question is: do games actually use all that memory, or do they simply allocate as much as they can just for the heck of it? Because, as I understand it, we have no insight into that; we just see whether a portion of VRAM has been allocated by whatever game/app you're running.
 
8GB is plenty for video games, but compute applications really want more VRAM: 8GB / (5120 shaders × an occupancy of 4) ≈ 400kB per compute thread.

Going from 400kB to 800kB is a big jump in per-thread memory resources.
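To make that arithmetic concrete, here's a quick back-of-the-envelope sketch; the 5120-shader count and occupancy of 4 are the illustrative figures from the post above, not the specs of any particular card:

# Rough per-thread VRAM budget for a compute workload.
# shaders=5120 and occupancy=4 are the post's illustrative numbers.
def per_thread_kb(vram_gb: float, shaders: int = 5120, occupancy: int = 4) -> float:
    total_bytes = vram_gb * 1024**3          # treat "GB" as GiB
    threads_in_flight = shaders * occupancy  # resident threads sharing VRAM
    return total_bytes / threads_in_flight / 1024

for vram in (8, 16, 20):
    print(f"{vram:>2} GB -> ~{per_thread_kb(vram):.0f} kB per thread")
# Output:
#  8 GB -> ~410 kB per thread
# 16 GB -> ~819 kB per thread
# 20 GB -> ~1024 kB per thread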

Game VRAM requirements increase as video cards release with more VRAM. Given that the 1080 Ti launched with 11GB, the 2080 Ti launched with 11GB, and the 3080 with 10GB, of course VRAM requirements aren't going to increase much. Devs aren't being given more resources to work with.
 
When their 10GB card is selling like crazy, why bother? They are competing against themselves. There is ZERO reason to undermine the 3090.
This could be one of the reasons. In my opinion, creating a 20GB version of the RTX 3080 would create more headaches for Nvidia in terms of getting enough supply to meet demand. Since it shares the same GPU as the higher-end RTX 3090, this should not be a high-volume part, given the size and complexity of the chip. Nvidia's decision to price their top-end retail chip at a mid-high-end price to increase their lead over AMD has resulted in this severe shortage. If the RTX 2080 Ti had retailed at USD 699 at launch instead of over a grand, it would have faced the same supply issue.
 
Allocating VRAM for future use is one thing; actually using it is another. My question is: do games actually use all that memory, or do they simply allocate as much as they can just for the heck of it? Because, as I understand it, we have no insight into that; we just see whether a portion of VRAM has been allocated by whatever game/app you're running.

From what I have read, the Dedicated GPU Memory stat shows memory in use, not just allocated.

GPU-Z also has a GPU memory usage stat that seems to report a value a bit higher than Windows'.
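For anyone who wants to poll those counters themselves, here's a minimal sketch using NVIDIA's NVML bindings (the nvidia-ml-py package), the same driver-level counters that GPU-Z-style tools read. One caveat: even NVML reports memory that is resident/allocated per process, so it still can't tell you whether a game is actively touching every byte it holds.

# Query VRAM usage via NVML (pip install nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Board-wide view: everything currently resident in VRAM.
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"used {mem.used / 2**20:.0f} MiB of {mem.total / 2**20:.0f} MiB")

# Per-process view: how much each running graphics app holds.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    if proc.usedGpuMemory is not None:  # can be unavailable under Windows/WDDM
        print(f"pid {proc.pid}: {proc.usedGpuMemory / 2**20:.0f} MiB")

pynvml.nvmlShutdown()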
 