
GALAX Confirms GeForce RTX 3080 20GB and RTX 3060, RTX 3060 Matches RTX 2080

Won't the 20GB model piss off the people who purchased the 10GB 3080?

Why would it? Adding a 20GB variant does not make the original model any slower. And you won't need more than 10GB for years to come anyway.
But I like the train of thought: Nvidia releases the 10GB 3080 - omg, not enough VRAM; Nvidia is supposed to release the 20GB 3080 - omg, initial buyers are screwed.
 
As an early adopter (in this case pre-order / day one), you waive your right to be pissed off at anything.
They bought the hype; these are the consequences.
I say screw 'em...
Early adopters are usually suckers. Sometimes they get lucky and accidentally purchase a true gem but this time around it's an over-promised, under-delivering power-hog that's short on VRAM.
 
I think Nvidia should go back to using the old numbering from the GeForce 4 days :). Leave the whole 70/80 thing with GTX and bring back 4200, 4400, 4600 for RTX.
 
Only if Nvidia installed 19 Gbps GDDR6X on it would 2080 Ti-level performance be possible, to match that card's ~600 GB/s of bandwidth; with 14 Gbps it can only go so far. The same goes for the 3060 and 2080. It's delusional to think the 3080 is 2x a 2080.
Invest in a reballing station and do it yourself.
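The bandwidth figures in the post above check out with simple arithmetic. A quick sketch, using publicly listed spec-sheet numbers (2080 Ti: 352-bit GDDR6 at 14 Gbps; 3070: 256-bit GDDR6 at 14 Gbps) and a hypothetical 19 Gbps GDDR6X configuration on the 3070's 256-bit bus:

```python
def bandwidth_gbps(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate times bus width in bytes."""
    return data_rate_gbps * bus_width_bits / 8

rtx_2080_ti    = bandwidth_gbps(14, 352)  # GDDR6, as shipped
rtx_3070       = bandwidth_gbps(14, 256)  # GDDR6, as shipped
rtx_3070_hypo  = bandwidth_gbps(19, 256)  # hypothetical 19 Gbps GDDR6X swap

print(f"2080 Ti:            {rtx_2080_ti:.0f} GB/s")  # 616 GB/s
print(f"3070:               {rtx_3070:.0f} GB/s")     # 448 GB/s
print(f"3070 @ 19 Gbps:     {rtx_3070_hypo:.0f} GB/s")  # 608 GB/s
```

So a 19 Gbps GDDR6X build would land at 608 GB/s, within about 1% of the 2080 Ti's 616 GB/s, which is the "~600 GB/s" the poster is getting at.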
 
Moral of the story: wait 5 years to get a better video card?
Naaaaaaaaaaaaah, gimme a damn 3090 test already :D
My 1080 Ti is still waiting for Cyberpunk *downgrade edition*, so I want to see TPU benches to feed my curiosity, please. No PowerPoints.
I can guarantee you the 4060 Ti will be so powerful it will beat the 3080 by... let's be crazy, at least around 1% minimum FPS!!!!!!! Crazy, isn't it?
And I don't need no crystal ball.
Want more ?
The 5060 will beat the 4080 because it will be on a whole new process. Ok, ok.
Can't wait to see the 3090 in action.
 
Tests will show that 112 ROPs are faster than 96, by about 16%; not hard to see. The 40 series will be on a new node, sometime in the next 10-18 months. Nvidia can't survive on 8 nm for very long if RDNA2 is too good...
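For what it's worth, the 16% figure above is just the ratio of ROP counts. A minimal sketch, assuming performance scales linearly with ROP count at equal clocks (a simplification that ignores bandwidth, shaders, and clock differences):

```python
# Ratio of 112 ROPs to 96 ROPs, expressed as a percentage uplift.
rops_a, rops_b = 112, 96
uplift = (rops_a / rops_b - 1) * 100
print(f"{uplift:.1f}% more raw ROP throughput")  # 16.7%
```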
 
I was expecting the 3070 to be like the 1070 vs. the 980 Ti, but it looks like that won't happen and the 3070 Ti/Super will be doing it instead.
 
10GB is enough; don't fall for the placebo, guys...

It really is. Even in games like Shadow of Mordor and Shadow of War, when you select the graphics option that says 8GB VRAM required, it doesn't matter if it's a 1080 Ti, 2080 Ti, or whatever; the FPS just tanks hard. Optimization is really all that's key, and that goes for any well-made game. This is going to be the case here too.

Even if Cyberpunk 2077, for example, has a 10GB+ VRAM option for an ultra-ultra mode, I doubt it's stable; and even if it is, I doubt it looks that much better than the option just under it. It might not even be possible to tell the difference, honestly. So I think the idea of 10GB VRAM not being enough is kind of silly, even for future games.

At least for 1080p/1440p high-refresh gamers, which is the vast, vast majority of gamers. Even watching Linus play Doom in 8K today with a 3090, I was like, eh... it looked so crappy at 60 FPS, IMO. Sure, it's beautiful when it's not in motion, but I prefer high refresh. So, eh, 1440p for me.
 
Won't the 20GB model piss off the people who purchased the 10GB 3080?

Yes, all 10 of them are going to be very upset.
 
PCMR #1: Kneel before me, console peasants, for I have the latest 3080 10GB.
PCMR #2: LOL, 10GB VRAM peasant, kneel before my 3080 20GB!
 
PCMR #1: Kneel before me, console peasants, for I have the latest 3080 10GB.
PCMR #2: LOL, 10GB VRAM peasant, kneel before my 3080 20GB!


You have to admit though, high-refresh gaming is good stuff. Consoles just can't do it, bb.
 
You have to admit though, high-refresh gaming is good stuff. Consoles just can't do it, bb.

Yeah, it's gonna be a long time before people switch to 4K high refresh, probably after the 3080 has become obsolete.
4K 120Hz is nice, but 1440p 240Hz is still nicer :D, especially when people mix in competitive games.
 
Two interesting takeaways from this set of images.

1. The 3070 is clearly lower in the hierarchy than the 2080 Ti, counter to Nvidia's claims.

2. What is a PG142?

The fact that Jensen said nothing about the RTX 3070, other than a vague one-liner about it being "faster than RTX 2080 Ti", is a good indication it will not be faster across the board. If the performance were that good, I'm sure he would have spent time marketing and boasting about it as well. In some specific cases it may perform better, but I suspect most of the time it will be slower.
 
Was 'lucky' enough to snag an Asus TUF RTX 3080 and already have it in my PC, and I couldn't care less about the 20GB version; I wouldn't have bought it anyway.
 
If the 3060 really matches the 2080, hopefully in the $250-300 price range, it would be an ideal time for a 1060 upgrade.
 
I hope this comes true...
from ampere whitepaper:
The GeForce RTX 3070 incorporates NVIDIA’s GA104 GPU.
This GPU is designed to deliver the best performance and energy efficiency in its class.
In fact, the RTX 3070 delivers performance that is faster than the former flagship GeForce RTX 2080 Ti GPU from the previous generation!
Note that the GA104 GPU includes all GA10x architectural features except NVLink, and while GA104 supports GDDR6X, the RTX 3070 GPU uses GDDR6, not GDDR6X.
 
Nice marketing. NV will do anything to look better than AMD. Since AMD is releasing 16GB cards, NV will do whatever it takes to make theirs 20GB. Good. If the price of the 10GB 3080 drops, I might just buy it for the sake of it :D
I wish to see a proven benchmark of a game that really needs more than 8-10GB. So far, I haven't found one.
In fact, the RTX 3070 delivers performance that is faster than the former flagship GeForce RTX 2080 Ti GPU from the previous generation!
Well, according to the leak here (the photo representing performance), the 3070 is not faster than a 2080 Ti.
 
Nvidia's customer base is muppet central. Lack of patience and willingness to buy whatever they have on offer day one generally comes back to bite you on the backside. You could see this coming a mile off, and with AMD about to release something that has quite clearly spooked Nvidia, you know full well they'll release something to one-up them, and in the process irritate their loyal customers.
 
Nvidia's customer base is muppet central. Lack of patience and willingness to buy whatever they have on offer day one generally comes back to bite you on the backside. You could see this coming a mile off, and with AMD about to release something that has quite clearly spooked Nvidia, you know full well they'll release something to one-up them, and in the process irritate their loyal customers.

List one AMD GPU in the last 5 years that made Nvidia buyers reconsider their purchase decision :banghead:
There is none; people just learned, after waiting for Vega, Radeon VII, and Navi, that AMD has nothing to offer besides a slightly cheaper alternative to Nvidia's midrange...
 
Early adopters are usually suckers. Sometimes they get lucky and accidentally purchase a true gem but this time around it's an over-promised, under-delivering power-hog that's short on VRAM.
I disagree. If you don't know the first thing about GPUs, chances are you're not following new releases anyway. Early adopters are tech-savvy; they know what they're buying into.
 
Not all people prefer lower res with high refresh. I've been playing at 4K since I owned the R9 295X2. 20GB isn't a placebo when you can see a game like Doom using 8.3GB of VRAM.
Next-gen games with RTX are going to take more and more. I can't afford to change my GPU every year.
 
Not all people prefer lower res with high refresh. I've been playing at 4K since I owned the R9 295X2. 20GB isn't a placebo when you can see a game like Doom using 8.3GB of VRAM.
Next-gen games with RTX are going to take more and more. I can't afford to change my GPU every year.
What a game allocates and what it actually needs the card to have as a minimum to avoid bottlenecking are two different stories.
 
Not all people prefer lower res with high refresh. I've been playing at 4K since I owned the R9 295X2. 20GB isn't a placebo when you can see a game like Doom using 8.3GB of VRAM.
Next-gen games with RTX are going to take more and more. I can't afford to change my GPU every year.

I bet even a single 4GB GPU gives a better gaming experience than a CrossFire/SLI setup at 4K. Nvidia has already given up on SLI anyway.
 
I would be surprised if AMD were to pull a rabbit out of their hat and be competitive in the GPU market again.

Are you still going to go Big Navi top tier even if nvidia clobbers it to death in the benchmarks? Just curious.

They have already pulled one rabbit out of the hat... Ryzen.
 