
NVIDIA GeForce RTX 3080 Ti Graphics Card Launch Postponed to February

3060 12GB? Is there an actual point to that card other than to part a fool from their money? 6GB should be enough for the resolutions it's going to be effective at, with 8GB being the upper limit needed.
You underestimate the number of fools. Look how many are banging the drum of "10GB isn't enough" despite the 6800 XT consistently being a bit slower than the 3080.
 
Even if confirmed at $999, it's still going to be a rip-off, but it's heading in the right direction: $1200 -> $999 -> $699?

I hope you feel that way about the 6900 XT.
 
They decided to wait so they'd have at least 10k GPUs at launch? :p
 
If they switch to TSMC 7 nm for this GPU, I'll take one. That would probably pave the way for the 3070 and 3080 cards going to TSMC too, with some sort of refresh launch in 2H 2021.
 
However, the RTX 3080 Ti is supposed to bring the GDDR6X memory capacity down to 20 GBs
You stated 20 GBs. It should be just 20GB.

Additionally, it should be noted that there are rumblings of a 16GB version of the 3080 coming as an answer to the RX6000 cards.
 
Yes, winter will still be here...
 
Bummer, so now I have to wait even longer for the GPU I want. That will be the RTX 3080 Ti :(

Just one piece of bad news after another this year. 2020 sucks :mad:
 
3060 12GB? Is there an actual point to that card other than to part a fool from their money? 6GB should be enough for the resolutions it's going to be effective at, with 8GB being the upper limit needed.
What a stupid comment. If you actually knew anything about VRAM usage, you would know resolution is not the only thing that matters; in fact, in some games it doesn't even have a huge impact. By that I mean running 4K may only use a few hundred megs, or maybe a gig and a half more, even though it's four times the resolution of 1080p. Ray tracing eats up a lot of VRAM, and we've already seen games at 1080p that will easily run right up against a 6 GB buffer and sometimes actually need more. Plus it's obvious that games are going to use more VRAM over the next year or so. Hell, you do realize the 6700 XT is going to have 12 GB of VRAM, don't you?

You stated 20 GBs. It should be just 20GB.

Additionally, it should be noted that there are rumblings of a 16GB version of the 3080 coming as an answer to the RX6000 cards.
There are no rumblings about a 16 GB 3080 card, unless they're from people who have no clue what they're talking about. You can't end up with that amount of VRAM on a 320-bit card.
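For anyone who wants to sanity-check that, here's a rough sketch of the capacity math. It assumes the usual non-clamshell layout of one GDDR6X chip per 32-bit channel and the commonly cited chip densities of 8 Gb (1 GB) and 16 Gb (2 GB):

```python
# Rough sketch: possible VRAM capacities for a given memory bus width,
# assuming one GDDR6X chip per 32-bit channel (no clamshell mode) and
# per-chip densities of 1 GB (8 Gb) or 2 GB (16 Gb).

def capacity_options(bus_width_bits, densities_gb=(1, 2)):
    chips = bus_width_bits // 32  # one chip per 32-bit channel
    return [chips * density for density in densities_gb]

print(capacity_options(320))  # [10, 20] -> a 320-bit card gets 10 GB or 20 GB
print(capacity_options(256))  # [8, 16]  -> 16 GB implies a 256-bit bus
```

16 GB simply doesn't divide evenly across ten 32-bit channels; you would have to drop to eight.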
 
Irony much?

Again with the irony.

You can with a 256-bit bus. You were saying?
I see you have absolutely no clue what you're talking about. Does it really have to be explained to you that if they dropped the 3080 down to a 256-bit bus, it would not be a 3080 anymore?
 
I see you have absolutely no clue what you're talking about.
More irony. You're really good at that.
Does it really have to be explained to you that if they dropped the 3080 down to a 256-bit bus, it would not be a 3080 anymore?
Sure it would, it would just be an alternate version with alternate specs. Common practice. Seriously, history much?
 
More irony. You're really good at that.

Sure it would, it would just be an alternate version with alternate specs. Common practice. Seriously, history much?
At this point it's obvious that you're too stupid to even realize you're stupid. The 3080 is a certain level of performance, and if they neutered the memory bandwidth by cutting down the bus that much just to add more memory, it would make no sense because you would have a slower card, dimwit. Only a complete and utter moron would think it would make sense to have a 3080 with more VRAM that's slower than the 3080 that has less VRAM. It would have to be a differently named card if it was cut down all the way to 256-bit. It could be a 3070 Ti or something like that, but there's no way in hell it could be a 3080.
 
More irony. You're really good at that.

Sure it would, it would just be an alternate version with alternate specs. Common practice. Seriously, history much?

They wouldn't do that with Ampere. I don't remember that ever being done, at least in recent history.

GPUs are designed with a certain memory type and bit width in mind. It's not something they can just change from a design standpoint and expect the GPU to perform the same way.

Frame buffer size matters, but so does bandwidth. You don't want to bandwidth-starve a GPU with the raw power Ampere has. Look at what the 3080 can do at 4K vs. an RDNA2 card. It's a fine balance that is pre-determined by the GPU architecture.
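To put rough numbers on that balance: peak memory bandwidth is just bus width times per-pin data rate. A minimal sketch using the launch specs of the RTX 3080 (320-bit GDDR6X at 19 Gbps) and the RX 6800 XT (256-bit GDDR6 at 16 Gbps, which RDNA2 offsets with its large Infinity Cache):

```python
# Peak memory bandwidth in GB/s:
# (bus width in bits / 8 bits per byte) * per-pin data rate in Gbps.

def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(320, 19))  # 760.0 GB/s -- RTX 3080, GDDR6X @ 19 Gbps
print(bandwidth_gbs(256, 16))  # 512.0 GB/s -- RX 6800 XT, GDDR6 @ 16 Gbps
```

That raw-bandwidth gap is part of why the 3080 holds up so well at 4K.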
 
At this point it's obvious that you're too stupid to even realize you're stupid.
Wow, harsh words. Projection much? I mean really, you need a mirror.
The 3080 is a certain level of performance, and if they neutered the memory bandwidth by cutting down the bus that much just to add more memory, it would make no sense because you would have a slower card, dimwit.
Looks like a nerve was struck. LOL!
Only a complete and utter moron would think
No, "a complete and utter moron" would not think, that's what makes them as you described.
have a 3080 with more vram that's slower than the 3080 that has less vram.
Except that both NVidia and AMD have done exactly that in the past. And they have always equalized performance by boosting VRAM performance or making another change that refined performance.
it could be a 3070 TI or something like that but there's no way in hell it could be a 3080.
That's certainly possible. But NVidia will name their product line as THEY see fit, not you.

They wouldn't do that with Ampere. I don't remember that ever being done, at least in recent history.
Then you need to look a little more closely at the specs of past products.
GPUs are designed with a certain memory type and bit width in mind.
That does not mean that such designs are rigid and inflexible.
 
Wow, harsh words. Projection much? I mean really, you need a mirror.

Looks like a nerve was struck. LOL!

No, "a complete and utter moron" would not think, that's what makes them as you described.

Except that both NVidia and AMD have done exactly that in the past. And they have always equalized performance by boosting VRAM performance or making another change that refined performance.

That's certainly possible. But NVidia will name their product line as THEY see fit, not you.
So even when I think you've hit rock bottom of ignorance, you find a way of digging even deeper. And what is this magical way of increasing VRAM performance? Oh yeah, that would be faster VRAM, of course. But guess what, there is no faster VRAM available right now than 19 Gbps that they can effectively use. 21 Gbps is in somewhat limited supply, and rumors suggested that it actually ran too hot and was quite power hungry. And even if they could use 21 Gbps, it would still have less memory bandwidth on a 256-bit bus than 19 Gbps on the 320-bit bus. And a 16 GB 3080 would cost more money, so again, get it through your pea brain that they would not release a more expensive 3080 with more VRAM that was actually slower than the 10 GB 3080 model. Now please STFU and go educate yourself, or stick to commenting on YouTube where you will fit in just fine.
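For what it's worth, the bandwidth math checks out. Here's a rough sketch comparing a hypothetical 256-bit card on 21 Gbps chips against the stock 3080's 320-bit bus at 19 Gbps:

```python
# Same back-of-the-envelope formula:
# (bus width in bits / 8) * per-pin data rate in Gbps.

def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(256, 21))  # 672.0 GB/s -- hypothetical 16 GB, 256-bit card
print(bandwidth_gbs(320, 19))  # 760.0 GB/s -- stock 10 GB RTX 3080
```

Even the faster 21 Gbps chips can't make up for losing two 32-bit memory channels.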
 