
NVIDIA GeForce RTX 3080 Ti Graphics Card Launch Postponed to February

NVIDIA likely can't do that for this generation... but nothing's stopping them in the next round. AMD went all 256-bit because they have the clock speed and SAM to make up for some of it. It's telling that as nodes continue to shrink, clock speeds will keep climbing and big buses won't be needed (sadly).

FYI, for those who don't already know... GDDR6/X chips are only produced in 1GB and 2GB capacities, and each chip takes a 32-bit memory controller to address it.
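To make the arithmetic concrete, here's a minimal Python sketch of how bus width constrains the VRAM options, assuming today's 1GB/2GB GDDR6/X chips and one chip per 32-bit controller (or two per controller in x8 clamshell mode):

# One GDDR6/X chip per 32-bit controller; clamshell (x8) mode pairs two
# chips on one controller, doubling capacity without widening the bus.
def vram_options(bus_width_bits):
    chips = bus_width_bits // 32        # one chip per 32-bit controller
    opts = []
    for chip_gb in (1, 2):              # the only capacities in production
        opts.append(f"{chips * chip_gb}GB ({chips} x {chip_gb}GB)")
        opts.append(f"{chips * chip_gb * 2}GB ({chips * 2} x {chip_gb}GB, clamshell)")
    return opts

for bus in (192, 256, 320, 384):
    print(f"{bus}-bit:", ", ".join(vram_options(bus)))
# 320-bit, for example, allows 10GB or 20GB normally, 20GB or 40GB clamshell.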
 
where in the heck is my 7nm $3k Ampere Titan already :kookoo:
 
NVIDIA likely can't do that for this generation... but nothing's stopping them in the next round. AMD went all 256-bit because they have the clock speed and SAM to make up for some of it. It's telling that as nodes continue to shrink, clock speeds will keep climbing and big buses won't be needed (sadly).

You can't make up the bandwidth lost to a smaller bus width from clock speeds alone, though.
 
You can't make up the bandwidth lost to a smaller bus width from clock speeds alone, though.
If all else were equal, it would make essentially no difference. A 256-bit bus with VRAM fast enough to match a 512-bit bus with slower VRAM would have the same effective bandwidth, and therefore the same effective performance, again assuming everything else, including ROPs, is the same. There is no memory fast enough at this point to make up for dropping a 3080 down to a 256-bit bus, though, and that's something that pea brain can't comprehend.
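To put numbers on that last point, a rough Python sketch of the bandwidth math (19Gbps is the 3080's GDDR6X data rate; nothing shipping comes close to the per-pin rate a 256-bit bus would need):

# Effective bandwidth in GB/s = (bus width in bits / 8) * per-pin rate in Gbps.
def bandwidth_gb_s(bus_bits, pin_gbps):
    return bus_bits / 8 * pin_gbps

ref = bandwidth_gb_s(320, 19.0)      # 3080: 320-bit @ 19Gbps = 760 GB/s
needed = ref / (256 / 8)             # per-pin rate for a 256-bit bus to match
print(f"320-bit @ 19Gbps = {ref:.0f} GB/s")
print(f"256-bit would need {needed:.2f} Gbps per pin to match")  # 23.75 Gbps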
 
You can't make up the bandwidth lost to a smaller bus width from clock speeds alone, though.
I was intentionally vague... but you're right: the only way to directly make up for a lack of bandwidth is to put in faster memory chips. What I actually meant is that faster GPU clock speeds make up for it elsewhere in the pipeline, netting close to the same render time (unless your specific use case is memory-bandwidth-bound, like scientific apps).
 
Darwin awards are coming to mind...
Well, I'm sure your name and those awards come up together quite often. But please, feel free to actually prove how a 256-bit 3080 would still match the performance of the 320-bit 3080. I know you're extremely slow and all, because I've already explained that they don't have fast enough memory to make up for the smaller bus width, but maybe you have some magical proposal.
 
I'm just going to leave this here for anyone who wants to read it:

How the Bus of a GPU Affects the Amount of VRAM | ITIGIC
Very useful and interesting. The new A6000 GPU must be using x8 (clamshell) mode to address two 2GB chips per 32-bit controller, so there are 12 chips on the front and 12 on the back of the card. I guess that's more likely than Micron producing 4GB chips.

As for the 3080 Ti, the rumored drop to 320-bit is already quite a cut for this SKU. I would have preferred 352-bit, like the 1080 Ti, or even the full 384-bit of the 980 Ti and 780 Ti.
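For reference, the raw bandwidth at each of those bus widths, assuming the 3080's 19Gbps GDDR6X on every configuration (an assumption for comparison, not a spec):

# Same formula as above: GB/s = bits / 8 * Gbps per pin.
for bus in (320, 352, 384):
    print(f"{bus}-bit @ 19Gbps: {bus / 8 * 19.0:.0f} GB/s")
# 320-bit: 760 GB/s, 352-bit: 836 GB/s, 384-bit: 912 GB/s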
 
By that I mean running at 4K may only use a few hundred megabytes, or maybe a gigabyte and a half, more, even though it's four times the resolution of 1080p. Ray tracing eats up a lot of VRAM, and we've already seen games at 1080p that easily run right up against a 6GB buffer and sometimes actually need more.

These aren't 4K cards, barely 1440p cards; they're going to hit other limits before VRAM becomes an issue. That's the argument, but you have your head too far up you-know-where to realize it.

Plus, it's obvious that games are going to use more VRAM over the next year or so.

Is that the same crystal ball that told you two years ago it was obvious RTRT would be ubiquitous within the next year or so?

Yes, it's also pointless to have 12GB on the 6700 XT.
 
These aren't 4K cards, barely 1440p cards; they're going to hit other limits before VRAM becomes an issue. That's the argument, but you have your head too far up you-know-where to realize it.

Is that the same crystal ball that told you two years ago it was obvious RTRT would be ubiquitous within the next year or so?

Yes, it's also pointless to have 12GB on the 6700 XT.
Perhaps work on your reading comprehension. I said nothing about playing games at 4K with this card. Go back and look at what I said, and pay attention to the context: I was talking about resolution not being the only factor, and in fact nowhere near the factor some people think it is when it comes to VRAM.

And apparently you've been living under a rock, because even at 1440p, 6GB will not always be enough with ray tracing on. There are already a couple of games right now where performance drops off hard, or stutters, unless you lower textures or some other setting. So buying a brand-new 60-class card with only 6GB of VRAM right now would be beyond idiotic for upcoming games. And if you understand how these cards work, you know 12GB is the only other option, so of course it's not a waste. If anything, it's some of the other cards that need more VRAM to make sense.

FFS, even right now, if I crank all the settings in Wolfenstein: Youngblood, the game will run out of VRAM and crash even with an 8GB card. You can't run the fully maxed texture setting at 1440p with only 8GB of VRAM; you have to leave it on the default that comes with the highest preset.
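To put rough numbers on the resolution point, here's a back-of-the-envelope Python sketch. The bytes-per-pixel and buffer count are illustrative assumptions; the point is that render targets scale with pixel count, while the textures and ray-tracing structures that dominate VRAM use mostly don't:

# Assumes 4 bytes per pixel per render target and a handful of targets;
# real pipelines vary, but the order of magnitude is the point.
def render_targets_mb(width, height, buffers=3):
    return width * height * 4 * buffers / 2**20

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    print(f"{name}: ~{render_targets_mb(w, h):.0f} MB of render targets")
# 1080p: ~24 MB vs 4K: ~95 MB. Even with far more intermediate buffers,
# resolution alone adds well under a gigabyte; it's ray tracing's BVH and
# denoising buffers that push a 6GB card over the edge.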
 
That remains to be seen.


Are you always this nasty to everyone? Seriously...
Well, in your case, you keep making idiotic comments over and over even when I explain why what you're suggesting won't work. In his case, he clearly didn't read what I said yet wants to argue about it. And his argument is 100% BS, as even right now 6GB is not enough for a couple of games. It's not my fault that you're completely ignorant about how video cards work, or that he doesn't seem to be up to speed on how much VRAM current and upcoming games need. Neither of you wants to accept anything; you just want to argue, so you both deserve the rudeness.
 
Well, in your case, you keep making idiotic comments over and over even when I explain why what you're suggesting won't work. In his case, he clearly didn't read what I said yet wants to argue about it. And his argument is 100% BS, as even right now 6GB is not enough for a couple of games. It's not my fault that you're completely ignorant about how video cards work, or that he doesn't seem to be up to speed on how much VRAM current and upcoming games need. Neither of you wants to accept anything; you just want to argue, so you both deserve the rudeness.
You assume you are correct when history has already PROVEN you incorrect. You can make all the insults you wish; it doesn't make you right. A 16GB GA102 variant is completely possible and doable. Whether they name it a 3080S or a 3070 Ti is up for debate, but the config is completely possible. If you don't understand that, YOU have the problem. Let it go.
 
@robb - my dude, seriously...go eat a Snickers. You're being particularly nasty to some here, and over what? A stupid GPU and whether it can or can't have 16, 20, or a fuckton GB of VRAM. :rolleyes:
 
3060 12GB? Is there an actual point to that card other than to part a fool from their money? 6GB should be enough for the resolutions it's going to be effective at, with 8GB being the upper limit needed.
Playing CoD Warzone in wide 5K (like on the G9) maxes out the 8GB of VRAM on my RTX 2070. I was thinking of upgrading to an RTX 3080, but considering the performance boost, I don't think gaining 2GB of VRAM is enough. And a 3060 is at least as powerful as a 2070, if not more, so I'll be glad to see any card, even a 3060, with 12GB or more of VRAM.
 
Is there really any reason for this GPU to exist?
 
Can they stop with the paper launches of crap we don't need? Get us some 3070s ASAP.
 
Can they stop with the paper launches of crap we don't need? Get us some 3070s ASAP.
What's your definition of a paper launch? NVIDIA has already released the 3000-series cards, and AMD their own cards; they're out in the wild. The only issue is limited supply against huge demand, since lots of people around the world are home much more.
 
Currently, graphics cards may be fine in the VRAM department, but 12GB of VRAM is gonna be the bare minimum in late 2021 and beyond. I'm actually glad there will be a 3060 12GB, and AMD is also gonna be offering the 6700 XT 12GB, so overall I'm really excited. Just wish the prices would become stable (closer to MSRP) later on (before Q3, I hope). Fingers crossed! :)

And don't forget about the blue team: Intel plans to enter the discrete graphics arena sometime in 2H 2021. We'll see how intense this competition is going to be.
 
what a sad product stack
3080 with 10GB, 3060 with 12GB
LOL
 
I'm looking forward to when the 3040 comes out with 24GB of DDR3.
 