Sunday, January 9th 2022

NVIDIA GeForce RTX 3080 12 GB Edition Rumored to Launch on January 11th

During the CES 2022 keynote, we witnessed NVIDIA update its GeForce RTX 30 series family with the GeForce RTX 3050 and RTX 3090 Ti. However, that is not the end of NVIDIA's updates to the Ampere generation, as industry sources cited by Wccftech suggest that we could see a GeForce RTX 3080 GPU with 12 GB of GDDR6X VRAM launched as a separate product. Compared to the regular RTX 3080, which carries only 10 GB of GDDR6X, the new 12 GB version is supposed to bring a slight bump to the specification list. The GA102-220 GPU SKU found inside the 12 GB variant will feature 70 SMs with 8960 CUDA cores, 70 RT cores, and 280 TMUs.

This represents a minor improvement over the regular GA102-200 silicon inside the 10 GB model. The more significant difference, however, is the memory organization. The new 12 GB model uses a 384-bit memory bus, allowing the GDDR6X modules to achieve a bandwidth of 912 GB/s while running at 19 Gbps. The overall TDP also receives a bump to 350 Watts, compared to the 320 Watts of the regular RTX 3080. For more information regarding final clock speeds and pricing, we will have to wait for the alleged launch date of January 11th.
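For anyone who wants to check the math, here is a minimal sketch of the peak-bandwidth calculation behind that 912 GB/s figure, assuming the rumored bus width and data rate hold:

```python
# Peak memory bandwidth (GB/s) = (bus width in bits / 8) * per-pin data rate in Gbps.
# The figures below are the rumored specs, not confirmed numbers.
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Return the theoretical peak memory bandwidth in GB/s."""
    return (bus_width_bits / 8) * data_rate_gbps

print(peak_bandwidth_gb_s(320, 19))  # regular RTX 3080 10 GB: 760.0 GB/s
print(peak_bandwidth_gb_s(384, 19))  # rumored RTX 3080 12 GB: 912.0 GB/s
```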
Source: Wccftech

18 Comments on NVIDIA GeForce RTX 3080 12 GB Edition Rumored to Launch on January 11th

#1
Space Lynx
Astronaut
more stuff no one but bots can buy, neat
#2
aQi
Keep playing, Nvidia. AMD is already managing to achieve comparable performance while introducing new technologies that harmonise GPU and CPU advantages, as it has its own production of both. Intel will eventually offer the same once we get to see the Arc dGPUs.
Where does the green team lead us? An extra 2 GB and slight bumps?
#3
watzupken
To no fanfare. This new SKU will just end up costing more than the 10 GB version and less than the Ti version. Ultimately, availability is still poor and prices are still absurdly inflated.
#4
gmn17
It's like MSRP is meaningless
#5
watzupken
aQi: Keep playing, Nvidia. AMD is already managing to achieve comparable performance while introducing new technologies that harmonise GPU and CPU advantages, as it has its own production of both. Intel will eventually offer the same once we get to see the Arc dGPUs.
Where does the green team lead us? An extra 2 GB and slight bumps?
They will lead us to more RT and Tensor cores, if not something proprietary with the next gen. This is why Nvidia is so keen to acquire ARM, to mitigate the disadvantage here.
#6
Space Lynx
Astronaut
watzupken: They will lead us to more RT and Tensor cores, if not something proprietary with the next gen. This is why Nvidia is so keen to acquire ARM, to mitigate the disadvantage here.
yeah, they won't get ARM; the world has already made that clear to Nvidia. Jensen can hug his leather jackets at night and cry over it if he wants lol
#7
Unregistered
lynx29: more stuff no one but bots can buy, neat
I can see the cards available; the issue is that prices are already scalped from the start.
#8
aQi
watzupken: They will lead us to more RT and Tensor cores, if not something proprietary with the next gen. This is why Nvidia is so keen to acquire ARM, to mitigate the disadvantage here.
That ARM acquisition could affect the handheld battle quite a bit, as AMD and Samsung are jointly bringing RDNA 2 to portables through Exynos this month. Seeing Nvidia develop a RISC-based processor to compete here would be something neat, rather than playing with VRAM increases and resizing of Tensor cores. Admittedly, Nvidia introduced ray tracing, but it's of no use if the end user cannot enjoy it. That's what Intel and AMD are aiming for, and upcoming iGPUs will also have ray tracing.
#9
TheHughMan
Despite mining at full capacity with no room for more growth, they still buy all the GPUs.
#10
Berfs1
gmn17: It's like MSRP is meaningless
NVIDIA and AMD should stop putting MSRPs on these graphics cards and let the public decide what the price is. Auction them off like Intel did with the 9990XE. No fucking point of MSRP anymore.
#11
stimpy88
Pay $2000 for one of these now, and in less than a year's time 16 GB will be the new standard on all mid-to-high-end cards. nGreedia have been very clever with their drip, drip, dripping of higher memory densities, which should never have been a thing in the first place!
#12
Space Lynx
Astronaut
stimpy88: Pay $2000 for one of these now, and in less than a year's time 16 GB will be the new standard on all mid-to-high-end cards. nGreedia have been very clever with their drip, drip, dripping of higher memory densities, which should never have been a thing in the first place!
ya, and they won't be able to justify these prices in 2024 for next-gen cards; AMD will undercut the living **** out of them. If it wasn't for AMD, Nvidia would never come down in price in 2024. Also, let's hope Intel GPUs are competitive, the more the better to keep these greedy bastards in check.
#13
Vayra86
stimpy88: Pay $2000 for one of these now, and in less than a year's time 16 GB will be the new standard on all mid-to-high-end cards. nGreedia have been very clever with their drip, drip, dripping of higher memory densities, which should never have been a thing in the first place!
Yep, because that's where the consoles are. The trend was already in effect even prior to Ampere's launch; 8 GB required is already 'mid-range territory' even below 4K.

Nvidia is being the Intel right now, pushing metrics nobody cares about at excessive power figures, pioneering wild technologies like RT or E-cores that have no support in the industry, just to save face. Well, someone's gotta be first, right? So far the price of RT is two generations of utter shite GPUs and exploding TDPs. And it's not just crypto making them unobtainium either; the dies are big to begin with.

We're still transitioning, and Turing and Ampere are still early-adopter territory with subpar specs. The industry is still adjusting. I wonder what rabbits Nvidia's Hopper (wasn't it?) will pull out.
#14
Space Lynx
Astronaut
Vayra86: Yep, because that's where the consoles are. The trend was already in effect even prior to Ampere's launch; 8 GB required is already 'mid-range territory' even below 4K.

Nvidia is being the Intel right now, pushing metrics nobody cares about at excessive power figures, pioneering wild technologies like RT or E-cores that have no support in the industry, just to save face. Well, someone's gotta be first, right? So far the price of RT is two generations of utter shite GPUs and exploding TDPs. And it's not just crypto making them unobtainium either; the dies are big to begin with.

We're still transitioning, and Turing and Ampere are still early-adopter territory with subpar specs. The industry is still adjusting. I wonder what rabbits Nvidia's Hopper (wasn't it?) will pull out.
If you have not read my hypothesis on the shortage, I recommend it here:

www.techpowerup.com/forums/threads/my-hypothesis-proposal-on-graphics-card-shortages-there-is-no-shortage-watch-the-spoon-bend-neo.289186/
#16
Space Lynx
Astronaut
Vayra86: Yeah, I think it's nonsense, sorry.
Seriously? It seems quite obvious to me. There are loads of third-party sellers now on Amazon, Walmart, etc., way more than there used to be; it's an easy way to make a quick buck. A lot of them have bots, and supply will never meet demand because there are so many third-party sellers sucking up items before they drop. The only way to beat them is if Walmart, Best Buy, etc. start selling PS5s or GPUs in store only.

No need to be sorry, we can agree to disagree.
#17
WhoDecidedThat
Hmm... I can't help but wonder what would have happened to the TDP if Nvidia had chosen to go with GDDR6 instead of GDDR6X. The RTX 3070 increased its power consumption by 30% when it switched to GDDR6X with the 3070 Ti.

Of course, then it would be a side grade from the RTX 3080's 10 GB of 760 GB/s memory (320-bit × 19 Gbps GDDR6X) to 12 GB of 768 GB/s memory (384-bit × 16 Gbps GDDR6). But we would see much lower power consumption, maybe a sub-300 Watt 3080.
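For illustration, a minimal sketch of that side-grade comparison using the same peak-bandwidth math as above (the 384-bit GDDR6 configuration is purely hypothetical):

```python
# Hypothetical side grade: bandwidth (GB/s) = (bus width in bits / 8) * data rate in Gbps.
configs = {
    "RTX 3080 10 GB, 320-bit GDDR6X @ 19 Gbps": (320, 19),
    "hypothetical 12 GB, 384-bit GDDR6 @ 16 Gbps": (384, 16),
}
for name, (bus_bits, gbps) in configs.items():
    print(f"{name}: {bus_bits / 8 * gbps:.0f} GB/s")  # 760 vs 768 GB/s
```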
#18
goodeedidid
Meanwhile, Apple makes more money from AirPods alone than Nvidia, AMD, and Intel combined.