
NVIDIA GeForce RTX 3080 Ti Graphics Card Launch Postponed to February

What a sad product stack:
3080 with 10 GB, 3060 with 12 GB.
LOL
It's almost like VRAM allocation doesn't matter much. Given we know Nvidia designs the memory system based around the performance of the GPU, and we saw for a decade what slapping more VRAM on cards did (Spoiler: Literally nothing). There is not a single game out there that is bottlenecked by VRAM currently, and I'm not sure there ever has been.
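If anyone wants to sanity-check that on their own system, a quick nvidia-smi query shows what's resident on the card. A minimal sketch, assuming an NVIDIA driver with the nvidia-smi CLI on the PATH (and keep in mind that even the "used" figure is allocation, not what a game actually touches each frame):

[CODE=python]
# Minimal sketch: report used vs. total VRAM per GPU via nvidia-smi.
# Assumes an NVIDIA driver with the nvidia-smi CLI on the PATH.
import subprocess

out = subprocess.check_output(
    ["nvidia-smi",
     "--query-gpu=memory.used,memory.total",
     "--format=csv,noheader,nounits"],
    text=True,
)
# nvidia-smi prints one CSV line per GPU, e.g. "4321, 10240" (values in MiB).
for i, line in enumerate(out.strip().splitlines()):
    used, total = (int(x) for x in line.split(","))
    print(f"GPU {i}: {used} MiB used of {total} MiB ({used / total:.0%})")
[/CODE]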
 
It's almost like VRAM allocation doesn't matter much. Given we know Nvidia designs the memory system based around the performance of the GPU, and we saw for a decade what slapping more VRAM on cards did (Spoiler: Literally nothing). There is not a single game out there that is bottlenecked by VRAM currently, and I'm not sure there ever has been.
For someone for whom, at this stage, gaming is like an interactive movie experience, and who cannot stand low-res textures or aliasing, memory matters incredibly, mate. What I'm really :mad: about is that the nvidiots at NVIDIA were banking so heavily on DLSS that they 'forgot' that AMD has the freakin' next-gen console market, and VRAM needs to be respectable for next-gen games (Gears 5, CP77: the 3090 actually makes a difference over and above the 3080 o_O).

No wonder they are forced into such idiotic product segmentation, where double-VRAM versions are required within six months of the current gen's launch with no process improvement whatsoever. This is the first time a 90 and an 80 Ti card will exist side by side. Crazy shortsightedness to be actually forced into such a launch cycle. And this is no rumourville stuff: they are actually releasing this 20GB variant :wtf:. Can't wait for the Ampere end-of-life SKUs. I'd love to see the stack below now.
3090 Super Ti $2100
3090 Ti $1900
3090 Super $1700
3090 $1500
3080 Super Ti $1150
3080 Ti $1000
3080 Super $850
3080 $700
This is probably off-topic already, so I'll stop here.
 
Don't worry guys, all two cards per store will be in stock.
 
Cool, it gives people who aren't shitting money a chance to save enough.
Does it though? I would think a $1000 GPU is still well outside the reach of most people who aren't shitting money. Most gamers use $200-300 GPUs, after all, so for them the difference between $1000 and $1500 likely doesn't matter much.
 
Does it though? I would think a $1000 GPU is still well outside the reach of most people who aren't shitting money. Most gamers use $200-300 GPUs, after all, so for them the difference between $1000 and $1500 likely doesn't matter much.

I would never buy a $1k+ GPU anyway; they are for true shitters. I will be looking at £500-600 max. There is too much elitism now in PCs; that is exactly why people will pay the inflated prices on eBay for new GPUs: they need to have it now rather than wait. You can't brag online with an old GPU, can you?
Big deal, you have a 3090. Do you have to go on every forum and social media group on the net to post pics of your PC showing the card, and post pics of dumb, meaningless benchies showing how much better your score is? IMO, good luck to the scalpers; they are providing a way for rich shitheads to spend their money.
 
Big deal, you have a 3090. Do you have to go on every forum and social media group on the net to post pics of your PC showing the card, and post pics of dumb, meaningless benchies showing how much better your score is? IMO, good luck to the scalpers; they are providing a way for rich shitheads to spend their money.
I share your sentiment, mate. The social media age means that true progress is stunted and profits are undeservedly elevated: the globally networked herd/hive mentality and its FOMO-compensating behaviours.
 
I have a dream of getting a low-bug version of MSFS next spring and loading it up with 8K textures to look good on my OLED TV. This card could be tempting for that... but so might an AMD card. The price is a bit uncomfortable, but if it would give me a near-steady 30fps with full eye candy... well, that could be worth it.
The 20GB would make me less worried when loading up a bunch of 8K textures.
Maybe more a fantasy than a dream ;)
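For what it's worth, the arithmetic behind that worry is easy to rough out. A back-of-the-envelope sketch, assuming worst-case uncompressed RGBA8 textures with a full mip chain; real games use block compression (BCn), which cuts these numbers by roughly 4-8x:

[CODE=python]
# Back-of-the-envelope VRAM cost of 8K textures.
# Assumptions: uncompressed RGBA8 (4 bytes/texel) and a full
# mip chain (~4/3 size overhead); BCn compression would shrink this a lot.
SIDE = 8192
BYTES_PER_TEXEL = 4
MIP_OVERHEAD = 4 / 3

per_texture = SIDE * SIDE * BYTES_PER_TEXEL * MIP_OVERHEAD
print(f"One 8K texture + mips: {per_texture / 2**20:.0f} MiB")

for count in (16, 32, 64):
    print(f"{count} resident textures: {count * per_texture / 2**30:.1f} GiB")
[/CODE]

At roughly 340 MiB per uncompressed 8K texture, a few dozen resident at once already blows past 10GB, which is exactly where a 20GB card starts to look comforting.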
 
It's almost like VRAM allocation doesn't matter much. Given we know Nvidia designs the memory system based around the performance of the GPU, and we saw for a decade what slapping more VRAM on cards did (Spoiler: Literally nothing). There is not a single game out there that is bottlenecked by VRAM currently, and I'm not sure there ever has been.
massive cope
 
Can they stop with the paper launches of crap we don't need? Get us some 3070s ASAP.

I almost got up and walked out of work yesterday when I saw my local MicroCenter showed they had a limited number of 3070 cards (only one model, from ASUS) in the store, coupled with the fact they also showed 12+ 5600X CPUs in stock... damn stupid work, and me needing to have a job to pay for things! Sadly, I decided it was better to stay employed instead of running off to pay for new hardware.

I really need to stop stalking microcenter.com for hardware right now. I'm still expecting things to settle down and become easily available more towards March... I can wait a few more months.
 
There is too much elitism
Is it elitism, or is it people saving and spending their hard-earned cash and buying what they want because they can? I worked my ass off to get where I am now. If I happen to buy a $1500 GPU (I wouldn't pay the inflated prices, mind you), I don't want any whiny-ass people who can't or won't afford it, for whatever reason, running their yap. People are excited about being able to get a rare, pricey item, so they post about it. Who cares! :)

Some of us are beyond a 1440p/75Hz monitor, where a 3080 or greater is needed to hit 165 FPS or 4K/60+ in the first place. ;)
 
Ah, of course, so that's where that batch of missing Chineeese MSieeee 3090 graphics cards went :cry::roll:
What........

I would never buy a $1k+ GPU anyway; they are for true shitters. I will be looking at £500-600 max. There is too much elitism now in PCs; that is exactly why people will pay the inflated prices on eBay for new GPUs: they need to have it now rather than wait. You can't brag online with an old GPU, can you?
Big deal, you have a 3090. Do you have to go on every forum and social media group on the net to post pics of your PC showing the card, and post pics of dumb, meaningless benchies showing how much better your score is? IMO, good luck to the scalpers; they are providing a way for rich shitheads to spend their money.
I thank the "rich shitheads" for their support; it bought a new "used" radar dome for the boat.
 
More irony. You're really good at that.

Sure it would, it would just be an alternate version with alternate specs. Common practice. Seriously, history much?

History says that an x80 Ti will absolutely have to be faster than the 3080 below it. Come on now...
 
History says that an x80 Ti will absolutely have to be faster than the 3080 below it. Come on now...
Let's take the GTX 460 for example. There were five different variants of that card: a 768MB model, three 1GB models, and a 2GB model, each with varying shader and core counts. Would it not have made more sense to name the 768MB version a GTX 450 Ti? Yes, it would have. But they didn't, and here we are. AMD has done similar; RX 400 and RX 500 series, anyone? I could go on. Logically you might be correct, but in practice, NVidia will do as they please with their product line-up and name their products as they please, as they always have.

Now, as for the aforementioned rumor, if NVidia brings out a 16GB version of the 3080 but with slightly reduced VRAM speed, I'm totally OK with that tradeoff. And I couldn't care less if they called it a 3080SE or just a 3080. The specs are what people should care about, not the name.
 
Let's take the GTX 460 for example. There were five different variants of that card: a 768MB model, three 1GB models, and a 2GB model, each with varying shader and core counts. Would it not have made more sense to name the 768MB version a GTX 450 Ti? Yes, it would have. But they didn't, and here we are. AMD has done similar; RX 400 and RX 500 series, anyone? I could go on. Logically you might be correct, but in practice, NVidia will do as they please with their product line-up and name their products as they please, as they always have.

Now, as for the aforementioned rumor, if NVidia brings out a 16GB version of the 3080 but with slightly reduced VRAM speed, I'm totally OK with that tradeoff. And I couldn't care less if they called it a 3080SE or just a 3080. The specs are what people should care about, not the name.
I mean, Nvidia clearly has the resources to test whether a cut-down memory bus (possibly combined with higher clocked memory) would disadvantage a potential 3080 Ti. It wouldn't even be particularly difficult for them to test, just write a custom BIOS for some test board and run it. If it works well, I have little doubt they'd make that, though I struggle to see a wealth of scenarios where 16GB of VRAM would deliver a performance boost significant enough to outweigh the far more common bandwidth limitations. I don't see it as likely, but I don't have access to Nvidia's engineering resources either.
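To put rough numbers on that tradeoff, here's a quick sketch of theoretical peak bandwidth only; the 19 Gbps figure is the 3080's stock GDDR6X rate, while the 21 Gbps bin is my own assumption of a plausible faster option:

[CODE=python]
# Theoretical peak bandwidth = (bus width in bytes) * (effective data rate).
# 19 Gbps is the 3080's stock GDDR6X rate; 21 Gbps is a hypothetical faster bin.
def bandwidth_gb_s(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

configs = [
    ("3080 10GB, 320-bit @ 19 Gbps", 320, 19.0),
    ("16GB, 256-bit @ 19 Gbps (cut-down bus)", 256, 19.0),
    ("16GB, 256-bit @ 21 Gbps (faster memory)", 256, 21.0),
]
for name, bus, rate in configs:
    print(f"{name}: {bandwidth_gb_s(bus, rate):.0f} GB/s")
[/CODE]

Even with the faster memory, a 256-bit card gives up roughly 12% of the 3080's 760 GB/s, and that bandwidth loss is exactly what would be weighed against the extra capacity.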
 
I mean, Nvidia clearly has the resources to test whether a cut-down memory bus (possibly combined with higher clocked memory) would disadvantage a potential 3080 Ti
Let's be fair: the 20GB version will be the 3080 Ti. A 16GB version will likely just be a 3070 Ti, or will stick with the 3080 name plus a different identifier.
It wouldn't even be particularly difficult for them to test, just write a custom BIOS for some test board and run it. If it works well, I have little doubt they'd make that
True
though I struggle to see a wealth of scenarios where 16GB of VRAM would deliver a performance boost significant enough to outweigh the far more common bandwidth limitations.
Absolute performance isn't always the deciding factor in a product's focus. The extra 6GB of VRAM would be very handy for the non-gaming tasks that many people (myself included) run, which greatly benefit from lots of VRAM, while coming at a price that is not going to be as hard on the pocketbook as a 3080 Ti or 3090.
 
Let's take the GTX 460 for example. There were five different variants of that card: a 768MB model, three 1GB models, and a 2GB model, each with varying shader and core counts. Would it not have made more sense to name the 768MB version a GTX 450 Ti? Yes, it would have. But they didn't, and here we are. AMD has done similar; RX 400 and RX 500 series, anyone? I could go on. Logically you might be correct, but in practice, NVidia will do as they please with their product line-up and name their products as they please, as they always have.

Now, as for the aforementioned rumor, if NVidia brings out a 16GB version of the 3080 but with slightly reduced VRAM speed, I'm totally OK with that tradeoff. And I couldn't care less if they called it a 3080SE or just a 3080. The specs are what people should care about, not the name.

If they would, sure, but it's a fantasy and will not happen. If anything, we might see a refresh with higher GDDR6X speeds.
 
If they would, sure, but it's a fantasy and will not happen.
I don't agree. They need a 16GB model to compete with AMD. Even if there are only limited use-case scenarios for that amount of VRAM ATM, they'll look inferior if they don't match up. And let's be fair, the extra VRAM is going to be useful for future gaming/compute possibilities. There is more than enough of a market for such a card.
If anything, we might see a refresh with higher GDDR6X speeds.
They might do that too!
 
I don't agree. They need a 16GB model to compete with AMD. Even if there are only limited use-case scenarios for that amount of VRAM ATM, they'll look inferior if they don't match up. And let's be fair, the extra VRAM is going to be useful for future gaming/compute possibilities. There is more than enough of a market for such a card.

They might do that too!
I don't know. I'm under the impression that GPU product segmentation and naming, at least what is directed at Western markets, has homogenized and become more systematic in recent years, at least partially alongside hardware segmentation becoming clearer and better defined as GPU core components multiplied and were grouped together. It's been a long, long time since you could buy anything remotely high-end where the same naming tier covered vastly different hardware configurations. The 1060 is the closest, and that was a midrange series. The low end is still a free-for-all, though.
 
It's been a long, long time since you could buy anything remotely high-end where the same naming tier covered vastly different hardware configurations. The 1060 is the closest, and that was a midrange series. The low end is still a free-for-all, though.
Good points. Still, anything is possible. I just don't see NVidia letting that market gap stay unfilled.
 
Good points. Still, anything is possible. I just don't see NVidia letting that market gap stay unfilled.
No, that's true. They do seem to have painted themselves into a bit of a corner, but there are several ways out of it, of course. They did a mixed memory density config on the 970, and the XSX has that too, so I guess they could go that route for a 14/16GB 3080 Ti, though of course those last few GB would be quite slow.
 
They did a mixed memory density config on the 970....though of course those last few GB would be quite slow.
Oh hell NO! Bad idea! Let's not have that crap going on again. 2GB chips x8 (or 1GB chips x16 with a dual-sided PCB) on a 256-bit bus is perfectly acceptable.
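The capacity math for those uniform options is simple enough. A quick sketch, assuming one GDDR6X chip per 32-bit slice of the bus (clamshell dual-sided boards pair two chips per slice), and taking the 2GB chip density as a given, as above:

[CODE=python]
# VRAM capacity for uniform-density GDDR6X configs.
# Assumption: each 32-bit slice of the bus hosts one chip,
# or two chips in clamshell (dual-sided) mode.
def capacity_gb(bus_bits: int, gb_per_chip: int, chips_per_slice: int = 1) -> int:
    return bus_bits // 32 * chips_per_slice * gb_per_chip

print(capacity_gb(256, 2))                     # 8 x 2GB            -> 16 GB
print(capacity_gb(256, 1, chips_per_slice=2))  # 16 x 1GB clamshell -> 16 GB
print(capacity_gb(320, 2))                     # 10 x 2GB           -> 20 GB (the rumored Ti)
[/CODE]

Uniform density keeps every gigabyte at full bus speed, which is the whole point of avoiding a 970-style split.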
 
Oh hell NO! Bad idea! Let's not have that crap going on again. 2GB chips x8 (or 1GB chips x16 with a dual-sided PCB) on a 256-bit bus is perfectly acceptable.

If Jensen is that arrogant after getting burned with the 970... wow.
 