
NVIDIA GeForce RTX 4080 Founders Edition PCB Pictured, Revealing AD103 Silicon

Joined
Aug 2, 2012
Messages
1,987 (0.44/day)
Location
Netherlands
System Name TheDeeGee's PC
Processor Intel Core i7-11700
Motherboard ASRock Z590 Steel Legend
Cooling Noctua NH-D15S
Memory Crucial Ballistix 3200/C16 32GB
Video Card(s) Nvidia RTX 4070 Ti 12GB
Storage Crucial P5 Plus 2TB / Crucial P3 Plus 2TB / Crucial P3 Plus 4TB
Display(s) EIZO CX240
Case Lian-Li O11 Dynamic Evo XL / Noctua NF-A12x25 fans
Audio Device(s) Creative Sound Blaster ZXR / AKG K601 Headphones
Power Supply Seasonic PRIME Fanless TX-700
Mouse Logitech G500S
Keyboard Keychron Q6
Software Windows 10 Pro 64-Bit
Benchmark Scores None, as long as my games run smooth.
Consumers who purchase these make this possible. Vote with your wallet, guys.

Never paid more than 300 EUR for a card myself :laugh:
Wish that were possible; in the Netherlands that means a 1030 or something. I may as well ditch my PC then.

And no, I will never buy second-hand.
 
Joined
Jul 9, 2020
Messages
113 (0.07/day)
Location
RU
System Name N/A
Processor AMD Ryzen 7 5800X3D (BOX)
Motherboard ASUS ROG Crosshair VIII Dark Hero (BIOS v4902)
Cooling Noctua NH-D15 + NA-HC4 + NM-AMB12 (all chromax.black)
Memory 4x8GB Team Group Xtreem DDR4-4133 (3800@1900 15-15-15-15-30-45_T1 (55), V1.48)
Video Card(s) EVGA GeForce RTX 3080 Ti FTW3 Ultra Gaming
Storage 500GB Samsung SSD 980 Pro (System); 1TB Samsung SSD 990 Pro (Games and other)
Display(s) Philips Brilliance 239CQH (IPS, 1080p, 60Hz)
Case Open Stand
Power Supply Seasonic PRIME Ultra 850 Titanium
Keyboard Corsair K70 RGB RAPIDFIRE (1000Hz, with CHERRY MX Speed switches)
Software Microsoft Windows 11 Pro 23H2
$1200 and no secure frame around the GPU die? WTH...
 
Joined
May 11, 2018
Messages
1,257 (0.53/day)
"Finally, we have the Vulkan benchmarks where the NVIDIA GeForce RTX 4080 graphics card ends up being only 5.5% faster than the RTX 3090 Ti"

I'm pretty sure we'll see games where the RTX 4080 16 GB actually loses to the RTX 3090 Ti due to its much lower memory bandwidth!
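For context on how big that bandwidth gap actually is, here is a minimal back-of-the-envelope sketch (Python); the bus widths and per-pin data rates are the published specs for the two cards, so treat them as assumptions if your source lists different figures:

```python
# Theoretical memory bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps,
# which gives GB/s. Specs below are the published figures for each card (assumed, not measured).
def mem_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

rtx_4080    = mem_bandwidth_gb_s(256, 22.4)  # 256-bit GDDR6X @ 22.4 Gbps -> 716.8 GB/s
rtx_3090_ti = mem_bandwidth_gb_s(384, 21.0)  # 384-bit GDDR6X @ 21.0 Gbps -> 1008.0 GB/s

print(f"RTX 4080:    {rtx_4080:7.1f} GB/s")
print(f"RTX 3090 Ti: {rtx_3090_ti:7.1f} GB/s ({rtx_3090_ti / rtx_4080 - 1:.0%} more)")
```

So the 3090 Ti has roughly 40% more raw bandwidth, which is where bandwidth-bound scenarios could still swing in its favour despite Ada's much larger L2 cache.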
 
Joined
Jul 15, 2020
Messages
1,021 (0.64/day)
System Name Dirt Sheep | Silent Sheep
Processor i5-2400 | 13900K (-0.02mV offset)
Motherboard Asus P8H67-M LE | Gigabyte AERO Z690-G, bios F29e Intel baseline
Cooling Scythe Katana Type 1 | Noctua NH-U12A chromax.black
Memory G-skill 2*8GB DDR3 | Corsair Vengeance 4*32GB DDR5 5200Mhz C40 @4000MHz
Video Card(s) Gigabyte 970GTX Mini | NV 1080TI FE (cap at 50%, 800mV)
Storage 2*SN850 1TB, 230S 4TB, 840EVO 128GB, WD green 2TB HDD, IronWolf 6TB, 2*HC550 18TB in RAID1
Display(s) LG 21` FHD W2261VP | Lenovo 27` 4K Qreator 27
Case Thermaltake V3 Black|Define 7 Solid, stock 3*14 fans + 2*12 front & bottom + 1*8 out (on expansion slot)
Audio Device(s) Beyerdynamic DT 990 (or the screen speakers when I'm too lazy)
Power Supply Enermax Pro82+ 525W | Corsair RM650x (2021)
Mouse Logitech Master 3
Keyboard Roccat Isku FX
VR HMD Nop.
Software WIN 10 | WIN 11
Benchmark Scores CB23 SC: i5-2400=641 | i9-13900k=2325-2281 MC: i5-2400=i9 13900k SC | i9-13900k=37240-35500
There was a time, way way back, when you had 16-bit vs. 32-bit charts (it's a fun nostalgic read). Playing a game in 16-bit was faster, until the point when new tech came along and 32-bit became faster than 16-bit. That was the tipping point, and 16-bit died at that moment.

Some day in the future, at least 5 years if not 10 from now, a time will come when RT on is faster than RT off. Then I will use it. Until then, leave it off. It isn't worth the perf hit.

Anyway, to the point: at its current price the 4080 is a nice but pointless GPU outside of professional CUDA usage, just like the 4090.
For gaming, wait for AMD's offering.
 

Ruru

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
12,784 (2.93/day)
Location
Jyväskylä, Finland
System Name 4K-gaming / media-PC
Processor AMD Ryzen 7 5800X / Intel Core i7-6700K
Motherboard Asus ROG Crosshair VII Hero / Asus Z170-A
Cooling Arctic Freezer 50 / Thermaltake Contac 21
Memory 32GB DDR4-3466 / 16GB DDR4-3000
Video Card(s) RTX 3080 10GB / RX 6700 XT
Storage 3.3TB of SSDs / several small SSDs
Display(s) 27" 4K120 IPS + 32" 4K60 IPS + 24" 1080p60
Case Corsair 4000D AF White / DeepCool CC560 WH
Audio Device(s) Creative Omni BT speaker
Power Supply EVGA G2 750W / Fractal ION Gold 550W
Mouse Logitech MX518 / Logitech G400s
Keyboard Roccat Vulcan 121 AIMO / NOS C450 Mini Pro
VR HMD Oculus Rift CV1
Software Windows 11 Pro / Windows 11 Pro
Benchmark Scores They run Crysis
Wish that were possible; in the Netherlands that means a 1030 or something. I may as well ditch my PC then.

And no, I will never buy second-hand.
I've been a PC hobbyist for almost 20 years, and I may have bought about 5 cards new... :laugh:
 
Joined
Jun 16, 2019
Messages
373 (0.19/day)
System Name Cyberdyne Systems Core
Processor AMD Sceptre 9 3950x Quantum neural processor (384 nodes)
Motherboard Cyberdyne X1470
Cooling Cyberdyne Superconduct SC5600
Memory 128TB QRAM
Storage SK 16EB NVMe PCI-E 9.0 x8
Display(s) Multiple LG C9 3D Matrix OLED Cube
Software Skysoft Skynet
Joined
Feb 15, 2019
Messages
1,659 (0.79/day)
System Name Personal Gaming Rig
Processor Ryzen 7800X3D
Motherboard MSI X670E Carbon
Cooling MO-RA 3 420
Memory 32GB 6000MHz
Video Card(s) RTX 4090 ICHILL FROSTBITE ULTRA
Storage 4x 2TB Nvme
Display(s) Samsung G8 OLED
Case Silverstone FT04
It won't be a problem on this card though.
Fire hazard aside, it is still a problem for users of average-sized PC cases trying to close their side panels.
 
Joined
Apr 6, 2021
Messages
1,131 (0.85/day)
Location
Bavaria ⌬ Germany
System Name ✨ Lenovo M700 [Tiny]
Cooling ⚠️ 78,08% N² ⌬ 20,95% O² ⌬ 0,93% Ar ⌬ 0,04% CO²
Audio Device(s) ◐◑ AKG K702 ⌬ FiiO E10K Olympus 2
Mouse ✌️ Corsair M65 RGB Elite [Black] ⌬ Endgame Gear MPC-890 Cordura
Keyboard ⌨ Turtle Beach Impact 500
The first benchmark results are out. :wtf: Will it be enough to battle the 7900XTX?

[Attached benchmark screenshots: Geekbench4080.PNG, 3DMark4080.PNG]
 
Joined
Sep 17, 2014
Messages
22,452 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
There was a time, way way back, when you had 16-bit vs. 32-bit charts (it's a fun nostalgic read). Playing a game in 16-bit was faster, until the point when new tech came along and 32-bit became faster than 16-bit. That was the tipping point, and 16-bit died at that moment.

Some day in the future, at least 5 years if not 10 from now, a time will come when RT on is faster than RT off. Then I will use it. Until then, leave it off. It isn't worth the perf hit.

Anyway, to the point: at its current price the 4080 is a nice but pointless GPU outside of professional CUDA usage, just like the 4090.
For gaming, wait for AMD's offering.
Minor exception: in that 16-to-32-bit age we still had many node shrinks ahead of us.

But today? The whole thing is already escalating just to meet gen-to-gen perf increases... the next 5-10 years had better bring a game changer in that sense, or RT is dead or of too little relevance. Besides, it's not 'raster OR RT', it's 'and', which is something the 16-to-32-bit transition was not. So devices will still need raster perf...
 
Joined
Sep 1, 2009
Messages
1,233 (0.22/day)
Location
CO
System Name 4k
Processor AMD 5800x3D
Motherboard MSI MAG b550m Mortar Wifi
Cooling ARCTIC Liquid Freezer II 240
Memory 4x8Gb Crucial Ballistix 3600 CL16 bl8g36c16u4b.m8fe1
Video Card(s) Nvidia Reference 3080Ti
Storage ADATA XPG SX8200 Pro 1TB
Display(s) LG 48" C1
Case CORSAIR Carbide AIR 240 Micro-ATX
Audio Device(s) Asus Xonar STX
Power Supply EVGA SuperNOVA 650W
Software Microsoft Windows10 Pro x64
All these 4080s will be OOS in 5 minutes on launch day. Plenty of gamers with more money than sense out there.
Don't get me wrong, I do believe these cards will sell out in record time, but not bought by consumers. I believe they will be bought up by scalpers right away and the cards' prices will rise. This will make it seem like the card has sold out and is a hit for NVIDIA, but in actuality it's scalpers trying to make a profit. To be fair, NVIDIA does not care whether scalpers or gamers are buying these cards; they just want them sold out, like the 4090. You can buy plenty of 4090s now, they're just all scalped.
 
Joined
Sep 26, 2022
Messages
231 (0.29/day)
Location
Portugal
System Name Main
Processor 5700X
Motherboard MSI B450M Mortar
Cooling Corsair H80i v2
Memory G.SKILL Ripjaws V 32GB (2x16GB) DDR4-3600MHz CL16
Video Card(s) MSI RTX 3060 Ti VENTUS 2X OC 8GB GDDR6X
Display(s) LG 32GK850G
Case NOX HUMMER ZN
Power Supply Seasonic GX-750
Wish that were possible; in the Netherlands that means a 1030 or something. I may as well ditch my PC then.

And no, I will never buy second-hand.
And are now probably wondering if they should buy a two-year-old Ampere card for launch MSRP
Here in Portugal there are a lot of 1660 Supers at about €300. Even though they're better than a 1030, it's insane to think one might still cost more than the launch MSRP of a 3-year-old, low-to-midrange card...
[Attached price-comparison screenshot: 1667945845971.png]

Oh, and BTW, the listings marked "Melhor Preço" ("Best Price") are the lowest price recorded (for a specific SKU) on this price-comparison website, which I can tell you I've been using to compare this kind of stuff for well over 3 years...
 
Joined
May 3, 2018
Messages
2,881 (1.20/day)
There are tons of people who gave the RTX 20 (Turing) series a miss since it didn't bring any price/performance improvement over GTX 10 (Pascal), and then couldn't get an RTX 30 (Ampere) card because of the cryptoidiotism.

And are now probably wondering if they should buy a two-year-old Ampere card for launch MSRP, or wait for the full RTX 40 (Ada) release and pay even more money for the same performance...
My 2080 Super kicks the crap out of my 1070; it was a plenty big enough upgrade for me for 1440p gaming. I got it for less than half the price 3070s were going for at the time. Now I will upgrade my 1080 Ti to a 7900 XT(X) and give Lovelace a wide berth.
 
Joined
Dec 16, 2010
Messages
1,668 (0.33/day)
Location
State College, PA, US
System Name My Surround PC
Processor AMD Ryzen 9 7950X3D
Motherboard ASUS STRIX X670E-F
Cooling Swiftech MCP35X / EK Quantum CPU / Alphacool GPU / XSPC 480mm w/ Corsair Fans
Memory 96GB (2 x 48 GB) G.Skill DDR5-6000 CL30
Video Card(s) MSI NVIDIA GeForce RTX 4090 Suprim X 24GB
Storage WD SN850 2TB, Samsung PM981a 1TB, 4 x 4TB + 1 x 10TB HGST NAS HDD for Windows Storage Spaces
Display(s) 2 x Viotek GFI27QXA 27" 4K 120Hz + LG UH850 4K 60Hz + HMD
Case NZXT Source 530
Audio Device(s) Sony MDR-7506 / Logitech Z-5500 5.1
Power Supply Corsair RM1000x 1 kW
Mouse Patriot Viper V560
Keyboard Corsair K100
VR HMD HP Reverb G2
Software Windows 11 Pro x64
Benchmark Scores Mellanox ConnectX-3 10 Gb/s Fiber Network Card
People need to keep in mind just how absurdly expensive a 4 nm wafer is.

If AD103 is 372 mm² like the TPU database says, there are ~140 dies per 300 mm wafer (using an online wafer layout calculator). At $18,000 per 5 nm/4 nm wafer, that's ~$130 just for the silicon, even assuming no defective dies. Then you have to add the cost of assembling the chip package, the cost of the memory chips and everything else on the PCB, and the cost of mounting it all to the PCB.

In the end there's no way a card based on an AD103 chip costs less than $250 just to manufacture, let alone pays for R&D and marketing. $1200 may be a bit much to ask for a 4080, but unless TSMC drops its wafer prices, these simple calculations make me conclude that AD103-based cards can't be priced below $650 while still selling at a profit. AMD pricing their new GPU at $1000 is about right to keep the same profit margins as in the past.

Perhaps this demonstrates that there needs to be a new paradigm of rebranding the previous generation's high-end GPUs to sell into the mid-range and low end. Making mid-range and low-end GPUs on the latest process node doesn't seem to make financial sense anymore.
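For anyone who wants to replay that arithmetic, here is a minimal sketch (Python). The $18,000 wafer price and 372 mm² die area are the post's assumptions, not official figures, and the simple gross-die approximation used here lands a bit higher than the ~140 dies the online layout calculator gave:

```python
# Back-of-the-envelope silicon cost per AD103 die, using the post's assumed numbers.
from math import pi, sqrt

WAFER_DIAMETER_MM = 300      # standard 300 mm wafer
DIE_AREA_MM2 = 372           # AD103 area as quoted from the TPU database
WAFER_COST_USD = 18_000      # assumed 5 nm / 4 nm wafer price from the post

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float) -> float:
    """First-order approximation: wafer area / die area, minus an edge-loss term.
    A real layout calculator (scribe lines, edge exclusion, die aspect ratio)
    gives a lower count, e.g. the ~140 mentioned above."""
    radius = wafer_diameter_mm / 2
    return pi * radius**2 / die_area_mm2 - pi * wafer_diameter_mm / sqrt(2 * die_area_mm2)

dies = gross_dies_per_wafer(DIE_AREA_MM2, WAFER_DIAMETER_MM)
print(f"Gross dies per wafer: {dies:.0f}")                    # ~155 with this formula
print(f"Silicon cost per die: ${WAFER_COST_USD / dies:.0f}")  # ~$116 at that count
print(f"At 140 dies/wafer:    ${WAFER_COST_USD / 140:.0f}")   # ~$129, matching the post's ~$130
```

Either way you land in the $115-130 range per die before packaging, memory, board and assembly costs, which is the point the post is making.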
 
Joined
Jul 15, 2020
Messages
1,021 (0.64/day)
System Name Dirt Sheep | Silent Sheep
Processor i5-2400 | 13900K (-0.02mV offset)
Motherboard Asus P8H67-M LE | Gigabyte AERO Z690-G, bios F29e Intel baseline
Cooling Scythe Katana Type 1 | Noctua NH-U12A chromax.black
Memory G-skill 2*8GB DDR3 | Corsair Vengeance 4*32GB DDR5 5200Mhz C40 @4000MHz
Video Card(s) Gigabyte 970GTX Mini | NV 1080TI FE (cap at 50%, 800mV)
Storage 2*SN850 1TB, 230S 4TB, 840EVO 128GB, WD green 2TB HDD, IronWolf 6TB, 2*HC550 18TB in RAID1
Display(s) LG 21` FHD W2261VP | Lenovo 27` 4K Qreator 27
Case Thermaltake V3 Black|Define 7 Solid, stock 3*14 fans + 2*12 front & bottom + 1*8 out (on expansion slot)
Audio Device(s) Beyerdynamic DT 990 (or the screen speakers when I'm too lazy)
Power Supply Enermax Pro82+ 525W | Corsair RM650x (2021)
Mouse Logitech Master 3
Keyboard Roccat Isku FX
VR HMD Nop.
Software WIN 10 | WIN 11
Benchmark Scores CB23 SC: i5-2400=641 | i9-13900k=2325-2281 MC: i5-2400=i9 13900k SC | i9-13900k=37240-35500
People need to keep in mind just how absurdly expensive a 4 nm wafer is.

If AD103 is 372 mm² like the TPU database says, there are ~140 dies per 300 mm wafer (using an online wafer layout calculator). At $18,000 per 5 nm/4 nm wafer, that's ~$130 just for the silicon, even assuming no defective dies. Then you have to add the cost of assembling the chip package, the cost of the memory chips and everything else on the PCB, and the cost of mounting it all to the PCB.

In the end there's no way a card based on an AD103 chip costs less than $250 just to manufacture, let alone pays for R&D and marketing. $1200 may be a bit much to ask for a 4080, but unless TSMC drops its wafer prices, these simple calculations make me conclude that AD103-based cards can't be priced below $650 while still selling at a profit. AMD pricing their new GPU at $1000 is about right to keep the same profit margins as in the past.

Perhaps this demonstrates that there needs to be a new paradigm of rebranding the previous generation's high-end GPUs to sell into the mid-range and low end. Making mid-range and low-end GPUs on the latest process node doesn't seem to make financial sense anymore.
Indeed.
I think we will see a change in the near future: the same architecture built on different process nodes for different performance tiers.
It would be good if only the top tier used the latest process to achieve maximum absolute perf. The people who buy those cards will 'gladly' pay the extra to be on the bleeding edge of tech, and will also pay for the extra work of designing the same architecture for two process nodes.
The mid and low tiers would use an older, more mature, higher-yielding process.
There is no need for the xx30/xx50/xx60 to use 4 nm if cost-to-perf is what you're after and new wafer costs are skyrocketing.
7/6 nm is very much fine with me right now for any mid-level GPU, as long as it comes with enough memory.
Architecture improvements, process refinement and new software tech (DLSS/DLAA, FSR, XeSS, etc.) will take care of the perf improvement.
Basically a one-process lag for the mid and low tiers, so when the NV 5xxx series comes out on a better 2/3 nm node for the 5080/5090, we get the 5030/5050/5060 on a 4/5 nm node.

And with that segmentation, we will be one step closer to a 'GAMERS master race' who pay big, and everyone else who makes the economic decision and doesn't care about races.
 

toooooot

New Member
Joined
Dec 9, 2022
Messages
19 (0.03/day)
Indeed.
I think we will see a change in the near future: the same architecture built on different process nodes for different performance tiers.
It would be good if only the top tier used the latest process to achieve maximum absolute perf. The people who buy those cards will 'gladly' pay the extra to be on the bleeding edge of tech, and will also pay for the extra work of designing the same architecture for two process nodes.
The mid and low tiers would use an older, more mature, higher-yielding process.
There is no need for the xx30/xx50/xx60 to use 4 nm if cost-to-perf is what you're after and new wafer costs are skyrocketing.
7/6 nm is very much fine with me right now for any mid-level GPU, as long as it comes with enough memory.
Architecture improvements, process refinement and new software tech (DLSS/DLAA, FSR, XeSS, etc.) will take care of the perf improvement.
Basically a one-process lag for the mid and low tiers, so when the NV 5xxx series comes out on a better 2/3 nm node for the 5080/5090, we get the 5030/5050/5060 on a 4/5 nm node.

And with that segmentation, we will be one step closer to a 'GAMERS master race' who pay big, and everyone else who makes the economic decision and doesn't care about races.
I wholeheartedly support this. With a mid-range CPU costing $300, a mid-range GPU shouldn't cost $700-800.
 