
NVIDIA "Ada Lovelace" Architecture Designed for N5, GeForce Returns to TSMC

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,293 (7.53/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
NVIDIA's upcoming "Ada Lovelace" architecture, both for compute and graphics, is reportedly being designed for the 5-nanometer silicon fabrication node by TSMC. This marks NVIDIA's return to the Taiwanese foundry after its brief excursion to Samsung with the 8 nm "Ampere" graphics architecture; "Ampere" compute dies continue to be built on TSMC 7 nm nodes. NVIDIA is looking to double compute performance on its next-generation GPUs, with throughput approaching 70 TFLOP/s, through a near-doubling of CUDA cores generation-over-generation, running at clock speeds above 2 GHz. "Ada Lovelace" can be expected only by 2022, as TSMC N5 matures.
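For context, the ~70 TFLOP/s figure is consistent with the usual peak-FP32 estimate of two FLOPs (one fused multiply-add) per shader per clock. Below is a minimal sketch of that arithmetic; the "Ada" shader counts and clocks are illustrative assumptions, not confirmed specifications.

```python
# Back-of-envelope peak FP32 throughput: 2 FLOPs (one FMA) per shader per clock.
# The "Ada" shader counts and clocks below are illustrative assumptions only.

def peak_fp32_tflops(shaders: int, clock_ghz: float) -> float:
    """Peak FP32 throughput in TFLOP/s for a given shader count and clock."""
    return 2 * shaders * clock_ghz / 1000.0

configs = {
    "GA102 (full Ampere die) @ 1.7 GHz": (10_752, 1.7),          # known Ampere figure
    "Hypothetical Ada, ~1.7x shaders @ 2.0 GHz": (18_432, 2.0),  # assumed
    "Hypothetical Ada, 2x shaders @ 2.2 GHz": (21_504, 2.2),     # assumed
}

for name, (shaders, clock) in configs.items():
    print(f"{name}: {peak_fp32_tflops(shaders, clock):.1f} TFLOP/s")
```

Even the more conservative assumed configuration already lands above 70 TFLOP/s, which is where the rumored figure comes from.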



View at TechPowerUp Main Site
 
Joined
Jul 1, 2011
Messages
364 (0.07/day)
System Name Matar Extreme PC.
Processor Intel Core i9-12900KS 5.2GHZ All P-Cores ,4.2GHZ All E-Cores & Ring 4.2GhZ bus speed 100.27
Motherboard NZXT N5 Z690 Wi-Fi 6E
Cooling CoolerMaster ML240L V2 AIO with MX6
Memory 4x16 64GB DDR4 3600MHZ CL15-19-19-36-55 G.SKILL Trident Z NEO
Video Card(s) Nvidia ZOTAC RTX 3080 Ti Trinity + overclocked 100 core 1000 mem. Re-pasted MX6
Storage WD black 1GB Nvme OS + 1TB 970 Nvme Samsung & 4TB WD Blk 256MB cache 7200RPM
Display(s) Lenovo 34" Ultra Wide 3440x1440 144hz 1ms G-Snyc
Case NZXT H510 Black with Cooler Master RGB Fans
Audio Device(s) Internal , EIFER speakers & EasySMX Wireless Gaming Headset
Power Supply Aurora R9 850Watts 80+ Gold, I Modded cables for it.
Mouse Onn RGB Gaming Mouse & Logitech G923 & shifter & E-Break Sim setup.
Keyboard GOFREETECH RGB Gaming Keyboard, & Xbox 1 X Controller & T-Flight Hotas Joystick
VR HMD Oculus Rift S
Software Windows 10 Home 22H2
Benchmark Scores https://www.youtube.com/user/matttttar/videos
I bought an RTX 2070 Super this year and it's rocking at 3440x1440, so I'm not upgrading for a while. I even tried an RTX 3060 but wasn't happy.
 
Joined
Mar 28, 2020
Messages
1,760 (1.02/day)
If this is true, then I think Nvidia is truly worried about AMD's progress in the GPU space. The reason I say that is because Nvidia's "gaming" GPUs have not been manufactured on near-cutting-edge nodes, even while Nvidia was dominating the high-end GPU space over the last few years. When AMD introduced their first TSMC N7 GPU, Turing launched on TSMC 12nm (basically a 16nm derivative), and Nvidia then slowly moved to Samsung 8nm (essentially a 10nm) even though AMD had already been using N7 for a year or two. So now, with competition heating up, continuing to go for cheaper nodes is not going to do them any favors.

I bought an RTX 2070 Super this year and it's rocking at 3440x1440, so I'm not upgrading for a while. I even tried an RTX 3060 but wasn't happy.
The RTX 2070 Super is faster than an RTX 3060 for sure. The only benefit of going with the RTX 3060 is the 50% increase in VRAM, which may be more beneficial in the long run.
 
Joined
Jul 1, 2011
Messages
364 (0.07/day)
System Name Matar Extreme PC.
Processor Intel Core i9-12900KS 5.2GHZ All P-Cores ,4.2GHZ All E-Cores & Ring 4.2GhZ bus speed 100.27
Motherboard NZXT N5 Z690 Wi-Fi 6E
Cooling CoolerMaster ML240L V2 AIO with MX6
Memory 4x16 64GB DDR4 3600MHZ CL15-19-19-36-55 G.SKILL Trident Z NEO
Video Card(s) Nvidia ZOTAC RTX 3080 Ti Trinity + overclocked 100 core 1000 mem. Re-pasted MX6
Storage WD black 1GB Nvme OS + 1TB 970 Nvme Samsung & 4TB WD Blk 256MB cache 7200RPM
Display(s) Lenovo 34" Ultra Wide 3440x1440 144hz 1ms G-Snyc
Case NZXT H510 Black with Cooler Master RGB Fans
Audio Device(s) Internal , EIFER speakers & EasySMX Wireless Gaming Headset
Power Supply Aurora R9 850Watts 80+ Gold, I Modded cables for it.
Mouse Onn RGB Gaming Mouse & Logitech G923 & shifter & E-Break Sim setup.
Keyboard GOFREETECH RGB Gaming Keyboard, & Xbox 1 X Controller & T-Flight Hotas Joystick
VR HMD Oculus Rift S
Software Windows 10 Home 22H2
Benchmark Scores https://www.youtube.com/user/matttttar/videos
If this is true, then I think Nvidia is truly worried about AMD's progress in the GPU space. The reason I say that is because Nvidia's "gaming" GPUs have not been manufactured on near-cutting-edge nodes, even while Nvidia was dominating the high-end GPU space over the last few years. When AMD introduced their first TSMC N7 GPU, Turing launched on TSMC 12nm (basically a 16nm derivative), and Nvidia then slowly moved to Samsung 8nm (essentially a 10nm) even though AMD had already been using N7 for a year or two. So now, with competition heating up, continuing to go for cheaper nodes is not going to do them any favors.


The RTX 2070 Super is faster than an RTX 3060 for sure. The only benefit of going with the RTX 3060 is the 50% increase in VRAM, which may be more beneficial in the long run.
Oh yes, I know the 2070 Super is faster, but I gave the 3060 a try for testing; it's not that I was going to replace my 2070 Super with it.
 
Joined
May 31, 2016
Messages
4,440 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Is that supposed to be the multi-chip design?
 
Joined
Nov 11, 2016
Messages
3,456 (1.17/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
Oh yes, I know the 2070 Super is faster, but I gave the 3060 a try for testing; it's not that I was going to replace my 2070 Super with it.

I would prefer the 3060 over the 2070 Super because of HDMI 2.1; that makes the 3060 very suitable for an HTPC and it goes along well with an OLED TV too :D
 
Joined
Apr 16, 2013
Messages
549 (0.13/day)
Location
Bulgaria
System Name Black Knight | White Queen
Processor Intel Core i9-10940X (28 cores) | Intel Core i7-5775C (8 cores)
Motherboard ASUS ROG Rampage VI Extreme Encore X299G | ASUS Sabertooth Z97 Mark S (White)
Cooling Noctua NH-D15 chromax.black | Xigmatek Dark Knight SD-1283 Night Hawk (White)
Memory G.SKILL Trident Z RGB 4x8GB DDR4 3600MHz CL16 | Corsair Vengeance LP 4x4GB DDR3L 1600MHz CL9 (White)
Video Card(s) ASUS ROG Strix GeForce RTX 4090 OC | KFA2/Galax GeForce GTX 1080 Ti Hall of Fame Edition
Storage Samsung 990 Pro 2TB, 980 Pro 1TB, 850 Pro 256GB, 840 Pro 256GB, WD 10TB+ (incl. VelociRaptors)
Display(s) Dell Alienware AW2721D 240Hz| LG OLED evo C4 48" 144Hz
Case Corsair 7000D AIRFLOW (Black) | NZXT ??? w/ ASUS DRW-24B1ST
Audio Device(s) ASUS Xonar Essence STX | Realtek ALC1150
Power Supply Enermax Revolution 1250W 85+ | Super Flower Leadex Gold 650W (White)
Mouse Razer Basilisk Ultimate, Razer Naga Trinity | Razer Mamba 16000
Keyboard Razer Blackwidow Chroma V2 (Orange switch) | Razer Ornata Chroma
Software Windows 10 Pro 64bit
Ampere is such a flop with Samsung's 8nm.
 
Joined
Aug 21, 2013
Messages
1,936 (0.47/day)
Ampere is such a flop with Samsung's 8nm.
Agreed. Though there are people who argue that it's not that much worse than TSMC's 7nm, that argument only looks at density and not the power characteristics, output quantity, or yields. It does not help matters that Micron's G6X is also very power hungry for a small bump in effective speed over standard 16 Gbps G6 (18 Gbps G6 has existed since Turing).

I hope that Lovelace, or whatever it ends up being called, uses TSMC once again, and that Micron fixes its G6X power draw or Samsung comes out with 20 Gbps G6 to replace G6X. Turing was an insult, with nonexistent (RT) and bad (DLSS 1.0) features at a high price. Ampere is just expensive to produce, hot, low-yielding, and power hungry. Samsung's 8nm process was never meant to produce such large chips; even in smartphones, Samsung's 8nm was always losing to TSMC.
The only reason Ampere is half decent is Nvidia's architecture and the monstrous cooling solutions by Nvidia and AIBs that keep it in check.

If we were not in the middle of a global pandemic, a supply shortage, and a mining boom, the low (at least lower than Turing) MSRPs would have made Ampere tolerable. But not as great as Maxwell or Pascal were, especially the 1080 Ti when it came out: $700 was a steal for it, and even years later Nvidia could only produce a 2080 Ti that was slightly faster. Only with Ampere was the 1080 Ti defeated by midrange cards. Cards that cost more than $700...
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,246 (1.28/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Ampere is such a flop with Samsung's 8nm.
Desktop Ampere is not what it *could* have been on, for example, TSMC 7nm, but a flop?

*checks notes*

Sure doesn't seem that way.
 
Joined
May 3, 2018
Messages
2,881 (1.19/day)
Nope, Lovelace is supposed to be monolithic. They also have Hopper, which is MCM, but that is for data center and HPC customers.
That is not a given. There are leaks suggesting that if RDNA3 is that good, Nvidia will skip Lovelace and go straight to Hopper for desktop. RDNA3 will be MCM on Big Navi at least, but Lovelace is just an evolution of Ampere. It is reportedly 60-80% faster than Ampere, but RDNA3 is at least 100% faster, and the biggest Navi 31 could be 200% faster, though at an obscene $2K price.
 
Joined
Nov 11, 2016
Messages
3,456 (1.17/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
That is not a given. There are leaks suggesting that if RDNA3 is that good, Nvidia will skip Lovelace and go straight to Hopper for desktop. RDNA3 will be MCM on Big Navi at least, but Lovelace is just an evolution of Ampere. It is reportedly 60-80% faster than Ampere, but RDNA3 is at least 100% faster, and the biggest Navi 31 could be 200% faster, though at an obscene $2K price.

I wouldn't bet on an MCM design for gaming at this early stage; SLI and Xfire died for a reason LOL.
 
Joined
May 3, 2018
Messages
2,881 (1.19/day)
I wouldn't bet on an MCM design for gaming at this early stage; SLI and Xfire died for a reason LOL.
I'm only talking about MCM in the flagship, not the mainstream. They might have a 7950 XT, 7900 XT, and 7800 XT. The 7950 XT would be $2K and just for bragging rights. I doubt the 4090 would get near it if the specs are to be believed.
 
Joined
Aug 21, 2013
Messages
1,936 (0.47/day)
That is not a given. There are leaks suggesting that if RDNA3 is that good, Nvidia will skip Lovelace and go straight to Hopper for desktop. RDNA3 will be MCM on Big Navi at least, but Lovelace is just an evolution of Ampere. It is reportedly 60-80% faster than Ampere, but RDNA3 is at least 100% faster, and the biggest Navi 31 could be 200% faster, though at an obscene $2K price.
We don't know. Nvidia is a black (green?) box when it comes to keeping these things close to its chest. The leaks about AMD and Intel products tend to be far more reliable.
I wouldn't bet on an MCM design for gaming at this early stage; SLI and Xfire died for a reason LOL.
MCM is invisible to the OS and games. It's a hardware solution that does not depend on the OS or game developers optimizing for it; as far as they are concerned, they see one monolithic chip. Load balancing is done in hardware, at least according to what AMD's patents have shown so far. SLI and Crossfire being dead is a good thing; nothing good ever came out of those.
 
Joined
Nov 11, 2016
Messages
3,456 (1.17/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
MCM is invisible to the OS and games. It's a hardware solution that does not depend on the OS or game developers optimizing for it; as far as they are concerned, they see one monolithic chip. Load balancing is done in hardware, at least according to what AMD's patents have shown so far. SLI and Crossfire being dead is a good thing; nothing good ever came out of those.

If the MCM design leads to unwanted stuttering, I would rather stick to a huge monolithic chip.
Between 120 FPS with mad stuttering and a smooth 80 FPS, I would pick the latter LOL. I play games, not benchmarks; that's the same reason I haven't gone back to SLI ever since I bought the first-ever SLI GPU (7950GX2).
 
Joined
Aug 21, 2013
Messages
1,936 (0.47/day)
If the MCM design leads to unwanted stuttering, I would rather stick to a huge monolithic chip.
Between 120 FPS with mad stuttering and a smooth 80 FPS, I would pick the latter LOL. I play games, not benchmarks; that's the same reason I haven't gone back to SLI ever since I bought the first-ever SLI GPU (7950GX2).
Why would MCM lead to stuttering? MCM CPUs have been fine, for example. Monolithic chips are getting more and more expensive and have an effective limit of around 800 mm²; MCMs can scale higher, for example four 400 mm² chips, though the first iterations use two, at least in gaming.
 
Joined
Nov 11, 2016
Messages
3,456 (1.17/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
Why would MCM lead to stuttering? MCM CPUs have been fine, for example. Monolithic chips are getting more and more expensive and have an effective limit of around 800 mm²; MCMs can scale higher, for example four 400 mm² chips, though the first iterations use two, at least in gaming.

Well, MCM will have higher latency than monolithic, that's for sure.
The overhead associated with MCM for gaming is not yet known at this point. Nvidia and AMD probably thought about MCM a long time ago and just waited for the right kind of interconnect technology to make it possible.
While AMD is going to use a big pool of Infinity Cache, Nvidia will probably use networking tech from Mellanox, like the PAM4 signaling on GDDR6X. No one knows which interconnect will allow the better MCM design at this point, or whether MCM is suitable for gaming at all or just meant for workstation tasks.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,246 (1.28/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
If the MCM design leads to unwanted stuttering, I would rather stick to a huge monolithic chip.
I guess I'd have to hope, and to an extent bank on, the idea that if they are going to do it, they've figured that out, because nobody wants that stuttery mess.
 
Joined
Aug 21, 2013
Messages
1,936 (0.47/day)
Well, MCM will have higher latency than monolithic, that's for sure.
The overhead associated with MCM for gaming is not yet known at this point. Nvidia and AMD probably thought about MCM a long time ago and just waited for the right kind of interconnect technology to make it possible.
While AMD is going to use a big pool of Infinity Cache, Nvidia will probably use networking tech from Mellanox, like the PAM4 signaling on GDDR6X. No one knows which interconnect will allow the better MCM design at this point, or whether MCM is suitable for gaming at all or just meant for workstation tasks.
It's roughly 40 ns vs 60 ns, monolithic vs MCM, at least on CPUs. On GPUs, latency is far less of an issue: GDDR6 itself has much higher latency than DDR4, for example, but despite that it is still used as system RAM on consoles. GPUs are more about bandwidth and throughput. If they are bringing out MCM GPUs, then I'm assuming it's OK.
 

ppn

Joined
Aug 18, 2015
Messages
1,231 (0.36/day)
Looking at GA100's 65.6 MTr/mm² density on N7, this new N5 should land around 118 MTr/mm². Ampere on 8nm sits at only 44. This means the maximum EUV die of 421 mm² can contain about 50 billion transistors, which is just mind-blowing.
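The estimate above is just density times area. A minimal sketch of that arithmetic, treating the N5 density as the extrapolation from the post rather than a published figure:

```python
# Transistor-count estimate from logic density (MTr/mm²) and die area (mm²).
# GA100's density follows from its published 54.2B transistors over ~826 mm²;
# the N5 value is the extrapolation discussed above, not an official number.

def transistors_billion(density_mtr_per_mm2: float, die_area_mm2: float) -> float:
    return density_mtr_per_mm2 * die_area_mm2 / 1000.0

densities = {
    "GA100 (TSMC N7)": 65.6,
    "GA102-class (Samsung 8nm)": 44.0,
    "Assumed TSMC N5": 118.0,
}

die_area = 421.0  # mm², the EUV die size quoted above

for node, density in densities.items():
    print(f"{node}: ~{transistors_billion(density, die_area):.0f}B transistors in {die_area:.0f} mm²")
```

The assumed N5 density times 421 mm² works out to roughly 50 billion transistors, matching the figure above.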
 
Joined
Nov 23, 2010
Messages
317 (0.06/day)
Would it make sense to stop working with Samsung after just one product launch? I would think that, given the supply constraints, Nvidia would continue to use both TSMC and Samsung. Samsung themselves are investing many billions to fix their manufacturing issues, so how much validity does this news item carry?
 
Joined
Dec 12, 2012
Messages
777 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
Good. The Samsung 8N process is trash for big chips. Ampere efficiency is garbage without severe undervolting.

My 3080 can push over 270 W with ray-tracing, at just 1800 MHz and 0.8 V. That is crazy.

Regular games do 200-230 W. Vsynced, rarely getting past 70-80% GPU usage.

At stock settings the clock can actually drop below 1800 MHz with ray tracing while drawing over 350 W. That is madness.
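Those numbers line up with the usual first-order model for dynamic (switching) power, P ≈ C·V²·f. A minimal sketch of the estimate, assuming a typical ~1.06 V stock boost voltage (an illustrative assumption, not a measured value) and ignoring VRAM and board power:

```python
# First-order dynamic power model: P_dyn is proportional to C * V^2 * f.
# The 1.06 V "stock" point is an assumed typical Ampere boost voltage;
# leakage, VRAM and board power are ignored, so this is only a rough bound.

def relative_dynamic_power(v: float, f_mhz: float, v_ref: float, f_ref_mhz: float) -> float:
    """Dynamic power relative to a reference voltage/frequency operating point."""
    return (v / v_ref) ** 2 * (f_mhz / f_ref_mhz)

ratio = relative_dynamic_power(0.80, 1800, 1.06, 1800)
print(f"Core switching power at 0.80 V / 1800 MHz: ~{ratio:.0%} of the 1.06 V point")
print(f"Scaled from ~350 W total board draw: ~{350 * ratio:.0f} W "
      f"(a lower bound, since VRAM/board power does not scale with core voltage)")
```

The ~43% cut in core switching power at the same clock is roughly why the same card can hold 1800 MHz at around 270 W after undervolting.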
 
Joined
Aug 21, 2013
Messages
1,936 (0.47/day)
Good. The Samsung 8N process is trash for big chips. Ampere efficiency is garbage without severe undervolting.

My 3080 can push over 270 W with ray-tracing, at just 1800 MHz and 0.8 V. That is crazy.

Regular games do 200-230 W. Vsynced, rarely getting past 70-80% GPU usage.

At stock settings the clock can actually drop below 1800 MHz with ray tracing while drawing over 350 W. That is madness.
That's crazy. A TSMC 12nm 2080 Ti with a 380 W limit BIOS can do 2050 MHz+ at 380 W. 1800 MHz stock at 350 W is just bad for an "8nm" process.
 