
NVIDIA GeForce RTX 4080 Comes in 12GB and 16GB Variants

Joined
Jan 14, 2019
Messages
12,548 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Nvidia - The way you're meant to be played.
 
Joined
Nov 20, 2012
Messages
163 (0.04/day)
This rumor is technical bullshit.

To have 12GB, a card would have to have either a 192-bit or a 384-bit memory interface, since NV won't use unequal configurations anymore and half-sized VRAM modules (e.g. 1.5GB) never became a thing. An RTX 4080 with only 192-bit would be crap, while for 384-bit it would have to use AD102.
I can only imagine that whichever of the two SKUs is interpreted as a 4080 12GB here is either intended to be a 4070/Ti with AD104 capped at 192-bit, or a 4080 Ti with AD102.
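A quick back-of-the-envelope sketch of the capacity math behind this argument. It assumes the usual one 1GB or 2GB GDDR6/6X device per 32-bit channel with uniform densities; that is an assumption about standard configurations, not anything confirmed for Ada.

```python
# Minimal sketch: VRAM capacity from bus width, assuming uniform-density
# GDDR6/6X with one 1 GB or 2 GB device per 32-bit channel (no mixed setups).
def vram_capacity_gb(bus_width_bits: int, gb_per_device: int) -> int:
    """Capacity = number of 32-bit channels * device density."""
    channels = bus_width_bits // 32
    return channels * gb_per_device

for bus in (192, 256, 384):
    options = [vram_capacity_gb(bus, d) for d in (1, 2)]
    print(f"{bus}-bit -> {options[0]} GB or {options[1]} GB")
# 192-bit -> 6 or 12 GB, 256-bit -> 8 or 16 GB, 384-bit -> 12 or 24 GB:
# a uniform 12 GB card therefore implies either a 192-bit or a 384-bit bus.
```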
 
Joined
Dec 18, 2005
Messages
8,253 (1.19/day)
System Name money pit..
Processor Intel 9900K 4.8 at 1.152 core voltage minus 0.120 offset
Motherboard Asus rog Strix Z370-F Gaming
Cooling Dark Rock TF air cooler.. Stock vga air coolers with case side fans to help cooling..
Memory 32 gb corsair vengeance 3200
Video Card(s) Palit Gaming Pro OC 2080TI
Storage 150 nvme boot drive partition.. 1T Sandisk sata.. 1T Transcend sata.. 1T 970 evo nvme m.2..
Display(s) 27" Asus PG279Q ROG Swift 165Hz Nvidia G-Sync, IPS.. 2560x1440..
Case Gigabyte mid-tower.. cheap and nothing special..
Audio Device(s) onboard sounds with stereo amp..
Power Supply EVGA 850 watt..
Mouse Logitech G700s
Keyboard Logitech K270
Software Win 10 pro..
Benchmark Scores Firestrike 29500.. Time Spy 14000..
If the new generation of cards hits the market at cheaper prices, it will be for one simple reason.. nobody has the money to buy one.. :)

trog
 
Joined
Jun 21, 2013
Messages
605 (0.14/day)
Processor Ryzen 9 3900x
Motherboard MSI B550 Gaming Plus
Cooling be quiet! Dark Rock Pro 4
Memory 32GB GSkill Ripjaws V 3600CL16
Video Card(s) 3060Ti FE 0.9v
Storage Samsung 970 EVO 1TB, 2x Samsung 840 EVO 1TB
Display(s) ASUS ProArt PA278QV
Case be quiet! Pure Base 500
Audio Device(s) Edifier R1850DB
Power Supply Super Flower Leadex III 650W
Mouse A4Tech X-748K
Keyboard Logitech K300
Software Win 10 Pro 64bit
Let me guess, the 16GB variant will be 50% more expensive.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,417 (4.69/day)
Location
Kepler-186f
Processor 7800X3D -25 all core
Motherboard B650 Steel Legend
Cooling Frost Commander 140
Video Card(s) Merc 310 7900 XT @3100 core -.75v
Display(s) Agon 27" QD-OLED Glossy 240hz 1440p
Case NZXT H710 (Red/Black)
Audio Device(s) Asgard 2, Modi 3, HD58X
Power Supply Corsair RM850x Gold
MSRP I see half that.

Nvidia will use inflation as an excuse. I fully expect the 4080 12GB to cost around $799 and the 16GB to cost $899.
 

ppn

Joined
Aug 18, 2015
Messages
1,231 (0.36/day)
When they did that in the past, the cards were named GTX 660 Ti and GTX 680, for example: one 192-bit and one 256-bit, based on the same die. Except they tricked the 660 Ti into still holding 2GB despite being 192-bit, with an asymmetric setup. I guess the idea is for the 4080 12GB to replace the 3080 Ti at $699.
 
Joined
Nov 20, 2012
Messages
163 (0.04/day)
They have stayed away from asymmetric memory setups since the infamous GTX 970, and I don't expect them to resume this bad habit on the 4080 in particular. Plus, there would be a huge performance difference between 192-bit with 12GB and 256-bit with 16GB.
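For a rough idea of the gap, here is a minimal peak-bandwidth sketch. It assumes both configurations run hypothetical 21 Gbps GDDR6X; that data rate is an illustrative assumption, not a leaked spec.

```python
# Minimal sketch: peak memory bandwidth for two bus widths at the same
# (assumed) 21 Gbps per-pin data rate.
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak GB/s = (bus width in bits / 8 bits per byte) * per-pin Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

narrow = peak_bandwidth_gb_s(192, 21.0)  # 504 GB/s
wide = peak_bandwidth_gb_s(256, 21.0)    # 672 GB/s
print(f"192-bit: {narrow:.0f} GB/s, 256-bit: {wide:.0f} GB/s "
      f"(+{(wide / narrow - 1):.0%} for the wider bus)")
```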
 
Joined
Feb 23, 2019
Messages
6,104 (2.87/day)
Location
Poland
Processor Ryzen 7 5800X3D
Motherboard Gigabyte X570 Aorus Elite
Cooling Thermalright Phantom Spirit 120 SE
Memory 2x16 GB Crucial Ballistix 3600 CL16 Rev E @ 3600 CL14
Video Card(s) RTX3080 Ti FE
Storage SX8200 Pro 1 TB, Plextor M6Pro 256 GB, WD Blue 2TB
Display(s) LG 34GN850P-B
Case SilverStone Primera PM01 RGB
Audio Device(s) SoundBlaster G6 | Fidelio X2 | Sennheiser 6XX
Power Supply SeaSonic Focus Plus Gold 750W
Mouse Endgame Gear XM1R
Keyboard Wooting Two HE
Quick poll missing option: I'm not buying such expensive GPUs.
What the past 2 years have shown us is that people will still buy.
 

Ruru

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
12,958 (2.95/day)
Location
Jyväskylä, Finland
System Name 4K-gaming / media-PC
Processor AMD Ryzen 7 5800X / Intel Core i7-6700K
Motherboard Asus ROG Crosshair VII Hero / Asus Z170-K
Cooling Arctic Freezer 50 / Alphacool Eisbaer 240
Memory 32GB DDR4-3466 / 16GB DDR4-3000
Video Card(s) Asus RTX 3080 TUF OC / Powercolor RX 6700 XT
Storage 3.3TB of SSDs / several small SSDs
Display(s) Acer 27" 4K120 IPS + Lenovo 32" 4K60 IPS
Case Corsair 4000D AF White / DeepCool CC560 WH
Audio Device(s) Sony WH-CN720N
Power Supply EVGA G2 750W / Fractal ION Gold 550W
Mouse Logitech MX518 / Logitech G400s
Keyboard Roccat Vulcan 121 AIMO / NOS C450 Mini Pro
VR HMD Oculus Rift CV1
Software Windows 11 Pro / Windows 11 Pro
Benchmark Scores They run Crysis
I suppose that the 12GB model will have a narrower memory bus, unless they use mixed-density chips (like with the 550 Ti, 660 and 660 Ti). Otherwise, the 12GB model will have a 192-bit bus.

They have stayed away from asymmetric memory setups since the infamous GTX 970, and I don't expect them to resume this bad habit on the 4080 in particular. Plus, there would be a huge performance difference between 192-bit with 12GB and 256-bit with 16GB.
The 970 didn't use mixed-density chips; it was due to the chip architecture that they had to use that 224-bit + 32-bit bus split on it.
 

ppn

Joined
Aug 18, 2015
Messages
1,231 (0.36/day)
The whole idea of RT was not to use any memory, because it's all made out of ray tracing magic, just ray coordinates as opposed to heavy textures. Why the need for extra VRAM?
 
Joined
Oct 6, 2021
Messages
1,605 (1.37/day)
Realistically you will need to double that at least.
You need to multiply by 3, no joke.
Nvidia has every reason to release everything overpriced.

- Massive chips.
- More expensive process, at least 2X more expensive.
- Overstock of current generation GPUs.

4090 = US$1999
4080 = US$1499
 
Joined
Sep 17, 2014
Messages
22,642 (6.04/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
This rumor is technical bullshit.

To have 12GB, a card would have to have either a 192-bit or a 384-bit memory interface, since NV won't use unequal configurations anymore and half-sized VRAM modules (e.g. 1.5GB) never became a thing. An RTX 4080 with only 192-bit would be crap, while for 384-bit it would have to use AD102.
I can only imagine that whichever of the two SKUs is interpreted as a 4080 12GB here is either intended to be a 4070/Ti with AD104 capped at 192-bit, or a 4080 Ti with AD102.
Weren't they going to use some magical cache now?
 
Joined
Aug 4, 2020
Messages
1,623 (1.02/day)
Location
::1
[ ... ]

Euhhh, that makes absolutely not a single drop of sense at all. A 32GB or 16GB GPU for gaming is also complete and utter nonsense right now. You can't get them full even if you try.

A 12GB GPU, sure; 16GB of RAM, sure. That's where it's at right now and for this coming console gen, at best. 32 is only needed if you do more, and that's not gaming.
There are plenty of games (especially the moddable ones) where you'll easily exhaust even 32 GiB of memory. If I were to make a build today, I would put in at least 32 GiB unless it's, like, a hardcore budget build.
 
Joined
Jul 13, 2016
Messages
3,321 (1.08/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
You need to multiply by 3, no joke.
Nvidia has every reason to release everything overpriced.

- Massive chips.
- More expensive process, at least 2X more expensive.
- Overstock of current generation GPUs.

4090 = US$1999
4080 = US$1499

People seem to point to the cost of a new node every time there is a shrink but there are multiple prior examples where costs increased by a lot yet there was no large increase to GPU prices:

[Attached chart: wafer cost by process node]


Looking at the cost of a wafer in a singular point in time is a very narrow examination of the total costs to build a GPU. For starters the cost of a node isn't a static amount, it drops over time. That price was given 2 years ago and it has almost certainly dropped by a good amount. Second, AMD and Nvidia are not paying market rate. They sign a wafer supply agreement with a pre-negotiated price.

On top of those factors, 5nm (according to TSMC) has higher yield than their 7nm node: https://www.anandtech.com/show/16028/better-yield-on-5nm-than-7nm-tsmc-update-on-defect-rates-for-n5

As evidenced by the chart above, 7nm was also an extremely expensive node jump yet Nvidia saw a record 68% profit margin on those GPUs. Clearly there is much much more to the cost of building GPUs than a single static price of a node from 2 years ago.
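To illustrate the point that wafer price is only one input, here is a rough per-die cost sketch. The wafer prices, die size, and defect densities below are made-up placeholder numbers, chosen only to show how die count and yield interact with wafer price; none of them are actual TSMC or Nvidia figures.

```python
import math

# Rough sketch: cost per *good* die = wafer price / (gross dies * yield).
def dies_per_300mm_wafer(die_area_mm2: float) -> int:
    """Standard approximation for gross dies on a 300 mm wafer."""
    d = 300.0
    wafer_area = math.pi * (d / 2) ** 2
    return int(wafer_area / die_area_mm2 - math.pi * d / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(wafer_price_usd: float, die_area_mm2: float,
                      defects_per_cm2: float) -> float:
    """Simple Poisson yield model: yield = exp(-D0 * die area)."""
    yield_rate = math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)
    return wafer_price_usd / (dies_per_300mm_wafer(die_area_mm2) * yield_rate)

# Hypothetical ~600 mm^2 die: pricier wafer, but better defect density.
print(f"5nm-class: ~${cost_per_good_die(17000, 600, 0.07):.0f} per good die")
print(f"7nm-class: ~${cost_per_good_die(9000, 600, 0.09):.0f} per good die")
```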
 
Joined
Jul 29, 2022
Messages
525 (0.60/day)
Those prices would not work with a competitive AMD and in the current market.
AMD can just raise prices to make everyone see that they are serious contenders and not budget options anymore, like they did with the RX 5700.
 
Joined
Oct 6, 2021
Messages
1,605 (1.37/day)
People seem to point to the cost of a new node every time there is a shrink but there are multiple prior examples where costs increased by a lot yet there was no large increase to GPU prices:

View attachment 260747

Looking at the cost of a wafer in a singular point in time is a very narrow examination of the total costs to build a GPU. For starters the cost of a node isn't a static amount, it drops over time. That price was given 2 years ago and it has almost certainly dropped by a good amount. Second, AMD and Nvidia are not paying market rate. They sign a wafer supply agreement with a pre-negotiated price.

On top of those factors, 5nm (according to TSMC) has higher yield than their 7nm node: https://www.anandtech.com/show/16028/better-yield-on-5nm-than-7nm-tsmc-update-on-defect-rates-for-n5

As evidenced by the chart above, 7nm was also an extremely expensive node jump yet Nvidia saw a record 68% profit margin on those GPUs. Clearly there is much much more to the cost of building GPUs than a single static price of a node from 2 years ago.
I don't think the current situation can be compared to any other time in the past. At this point TSMC is alone at the forefront, effectively the only option for high-end products, and all the volume is focused on TSMC; even Qualcomm has given up on manufacturing its high-end chips at Samsung due to the latter's poor efficiency and yields. What stops TSMC from charging as much as they want?

If I remember correctly, the only 7nm chips that Nvidia has are the ones focused on servers, which are sold for many times more than gaming GPUs. So there's no need to ask how Nvidia keeps its profit margin so high...
 
Joined
Sep 17, 2014
Messages
22,642 (6.04/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
There are plenty of games (especially the moddable ones) where you'll easily exhaust even 32 GiB of memory. If I were to make a build today, I would put in at least 32 GiB unless it's, like, a hardcore budget build.

Plenty... a handful ;) And only if you go completely crazy on them. So yeah, sure, for that tiny 1% subset there is a market for more than 16GB of RAM (in gaming).

The vast majority can make everything happen with far less. VRAM, sure, you could get 16GB today and fill it up. But I think that with the new architectures 12GB is fine, and it will also carry all older titles that can't use said architectural improvements - for those, the old 8GB was just on the edge of risk, and it's also why those 11GB Pascal GPUs were in a fine top spot. They will munch through anything you can get right now, and will run into GPU core constraints if you push the latest titles hard. It's similar to the age of 4GB Maxwell while the 980 Ti carried 6GB: it is almost exclusively the 6GB that carried it another few years longer than everything else that competed. I think we're at that point now, but for 12GB versus 8 or 10 - not 16. The reason is console developments and parity with the PC mainstream; we're still dealing with developers, market share, and specs targeted at those crowds. Current consoles have 16GB total, but they have to run an OS, and as a result both run in some crippled combo which incurs speed penalties. At best, either console has 13.5GB available at a (much) lower bandwidth/speed. That's the reason a tiny handful of games can already exhibit small problems with 8 or 10.

16GB, perhaps only in the very top end, is going to matter much like it does now: nice to have at best, in a niche. Newer titles that lean heavily on VRAM will also be leaning on the new architectures. We've also seen that 10GB 3080s are mostly fine, but occasionally/rarely not. 12GB should carry another few years at least, even at the top end.

That said... I think we're very close to 'the jump to 32GB RAM' as a mainstream gaming option for PCs, but that'll really only be useful in more than a few games by the time the next console gen is out, and DDR5 is priced to sanity at decent speeds.

The TL;DR / my point: there is no real 'sense' in chasing way ahead of the mainstream 'high end' gaming spec. Games won't utilize it properly, won't even support it sometimes, or you'll just find a crapload of horsepower wasted on more heat and noise. It's just getting the latest and greatest for giggles. Let's call it what it is - the same type of sense that lives in gamer minds buying that 3090 (Ti) over that 3080 at an immense premium.
 
Last edited:
Joined
Sep 13, 2021
Messages
86 (0.07/day)
There will only be a few games that take advantage of and benefit from the 16GB of VRAM, but there will be benefit!
16GB is a big plus if used in a productivity environment. Spare VRAM is always used as a fast cache, speeding up most operations. For $700, I would buy it to upgrade from my 2070.

This seems like a mistake... 12GB for the future with RT does not seem sufficient
GPUs are bought for actual needs. It's never a good deal to buy a more advanced card to be safe for 3-4 years. Nobody knows what's coming, and GPUs can fall rapidly in price. If it fits your actual needs, it's fine.
 
Joined
Jul 13, 2016
Messages
3,321 (1.08/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
AMD can just raise prices to make everyone see that they are serious contenders and not budget options anymore, like they did with the RX 5700.

AMD's prices are already comparable to Nvidia's save for the ultra-high end. There's not much room to raise prices except there and it's questionable whether consumers are even fine with Nvidia's pricing.

I don't think the current situation can be compared to any other time in the past. At this point TSMC is alone at the forefront, effectively the only option for high-end products, and all the volume is focused on TSMC; even Qualcomm has given up on manufacturing its high-end chips at Samsung due to the latter's poor efficiency and yields. What stops TSMC from charging as much as they want?

If I remember correctly, the only 7nm chips that Nvidia has are the ones focused on servers, which are sold for many times more than gaming GPUs. So there's no need to ask how Nvidia keeps its profit margin so high...

The fact that we already know the market pricing of 5nm says that no, in fact TSMC is not just charging whatever they want. As the chart I linked demonstrated, the price increase of wafers to 5nm isn't the first time the market has seen such a jump.

Clearly AMD doesn't have any issues with the cost of 7nm either. They have both GPUs and CPUs on the node.

What's stopping TSMC from jacking up prices? The fact that if they did, it would disrupt the world's most valuable resource, chips, and cause a government crackdown and a potential loss of the massive subsidies they are receiving to build new fabs. TSMC is going to be relying on help from foreign governments should the Chinese come knocking. It doesn't make any sense for TSMC to disrupt the status quo when they are already receiving tons of profit and subsidies by being a reasonable chip manufacturer.
 
Joined
Sep 13, 2021
Messages
86 (0.07/day)
The whole idea of RT was not to use any memory, because it's all made out of ray tracing magic, just ray coordinates as opposed to heavy textures. Why the need for extra VRAM?
Ray tracing is for simulating photorealistic lighting and shadows; it needs more texture memory if you have surfaces with complex physical behavior, like human skin. Memory usage depends mainly on resolution and distance, not RT.
 
Joined
Jan 3, 2015
Messages
3,024 (0.83/day)
System Name The beast and the little runt.
Processor Ryzen 5 5600X - Ryzen 9 5950X
Motherboard ASUS ROG STRIX B550-I GAMING - ASUS ROG Crosshair VIII Dark Hero X570
Cooling Noctua NH-L9x65 SE-AM4a - NH-D15 chromax.black with IPPC Industrial 3000 RPM 120/140 MM fans.
Memory G.SKILL TRIDENT Z ROYAL GOLD/SILVER 32 GB (2 x 16 GB and 4 x 8 GB) 3600 MHz CL14-15-15-35 1.45 volts
Video Card(s) GIGABYTE RTX 4060 OC LOW PROFILE - GIGABYTE RTX 4090 GAMING OC
Storage Samsung 980 PRO 1 TB + 2 TB - Samsung 870 EVO 4 TB - 2 x WD RED PRO 16 TB + WD ULTRASTAR 22 TB
Display(s) Asus 27" TUF VG27AQL1A and a Dell 24" for dual setup
Case Phanteks Enthoo 719/LUXE 2 BLACK
Audio Device(s) Onboard on both boards
Power Supply Phanteks Revolt X 1200W
Mouse Logitech G903 Lightspeed Wireless Gaming Mouse
Keyboard Logitech G910 Orion Spectrum
Software WINDOWS 10 PRO 64 BITS on both systems
Benchmark Scores Se more about my 2 in 1 system here: kortlink.dk/2ca4x
The 16GB variant would be the most interesting card, but price and performance also play in. Not to forget the power usage, as electricity is far from cheap, especially here in Europe where I live. Also, I don't really know if I would even be able to purchase a card, as the economic situation in Europe is very unstable now that Russia has cut off the gas.

So I might be forced to just stay with my RTX 3080. Not a bad card, but the 10GB of VRAM comes up short a bit too often, and that's annoying.
 
Joined
Jul 19, 2006
Messages
43,609 (6.48/day)
Processor AMD Ryzen 7 7800X3D
Motherboard ASUS TUF x670e-Plus Wifi
Cooling EK AIO 360. Phantek T30 fans.
Memory 32GB G.Skill 6000Mhz
Video Card(s) Asus RTX 4090
Storage WD/Samsung m.2's
Display(s) LG C2 Evo OLED 42"
Case Lian Li PC 011 Dynamic Evo
Audio Device(s) Topping E70 DAC, SMSL SP200 Amp, Adam Audio T5V's, Hifiman Sundara's.
Power Supply FSP Hydro Ti PRO 1000W
Mouse Razer Basilisk V3 Pro
Keyboard Epomaker 84 key
Software Windows 11 Pro
Nvidia will use inflation as an excuse. I fully expect the 4080 12GB to cost around $799 and the 16GB to cost $899.
They still have to contend with supply, demand and competition. The first release of the cards could go as high as you say, but with the current overstock of cards, I don't think it will be as bad.
 
Last edited:
Joined
Jun 2, 2017
Messages
9,354 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Not sure at all, but it could also be the separation between the laptop GPU and the desktop one. With Nvidia you never know.
 
Joined
Feb 11, 2009
Messages
5,569 (0.96/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Aorus Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb NVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTPC case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
That depends on settings and resolution.

I mean sure, but who is going to get an RTX 4080 only to run limited settings because they got the 12GB model... It's just a weird limitation that should not exist.
 