
NVIDIA Might Consider Major Design Shift for Future 300 GPU Series

Nomad76

News Editor
Staff member
Joined
May 21, 2024
Messages
540 (3.75/day)
NVIDIA is reportedly considering a significant design change for its GPU products, shifting from the current on-board solution to an independent GPU socket design following the GB200 shipment in Q4, according to reports from MoneyDJ and the Economic Daily News quoted by TrendForce. This move is not new in the industry: AMD already introduced a socket design in 2023 with its MI300A series via dedicated Supermicro servers. The B300 series, expected to become NVIDIA's mainstream product in the second half of 2025, is rumored to be the main beneficiary of this design change, which could improve yield rates, though it may come with some performance trade-offs.

According to the Economic Daily News, the socket design will simplify after-sales service and server board maintenance, allowing users to replace or upgrade GPUs quickly. The report also pointed out that, based on the slot design, boards will contain up to four NVIDIA GPUs and a CPU, with each GPU having its own dedicated slot. This will benefit Taiwanese manufacturers like Foxconn and LOTES, who will supply the various components and connectors. The move seems logical: with the current on-board design, once a GPU becomes faulty, the entire motherboard needs to be replaced, leading to significant downtime and high operational and maintenance costs.



View at TechPowerUp Main Site | Source
 
Joined
Nov 27, 2023
Messages
2,010 (6.28/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (23H2)
@lexluthermiester
VRM differences. Trace differences on the board for different memory buses and configs (and PCIe lanes nowadays). Different-sized chips requiring different cold plates on coolers, since there is no unified IHS. Lots of reasons, really. They are POTENTIALLY solvable, but you would have to have a board and cooler combo that would be interchangeable between a hypothetical 4060 and 4090. This would be hilariously costly. Do we really want GPUs costing even more?
 
Joined
Aug 13, 2020
Messages
173 (0.11/day)
I'll fall on that grenade...

Seriously? Why the hell haven't GPU sockets been a thing? VDIMM modules? It's not like it would be all that difficult..
Money? Nvidia selling just the GPU directly would bring in far less money than a full board and chip from a third party... Plus NVIDIA gets to control the wholesale GPU chip price while the third party sets the margins... win-win for the middleman...
 
Joined
Dec 12, 2016
Messages
1,705 (0.60/day)
@lexluthermiester
VRM differences. Trace differences on the board for different memory buses and configs (and PCIe lanes nowadays). Different-sized chips requiring different cold plates on coolers, since there is no unified IHS. Lots of reasons, really. They are POTENTIALLY solvable, but you would have to have a board and cooler combo that would be interchangeable between a hypothetical 4060 and 4090. This would be hilariously costly. Do we really want GPUs costing even more?
Traditional PC architectures haven't been significantly revamped since the beginning of consumer client desktops. While you are correct that just adding another socket to existing motherboards is probably not the answer, we could create a two-daughter-board standard connected through an x16 PCIe Gen 5 interlink. Each daughter board could have its own socket, memory slots, I/O connections, and even separate power supplies/delivery.

Cost is hard to gauge, as you could have two 600 W PSUs (one for the CPU daughter board and one for the GPU daughter board), which could be cheaper than one 1.2 kW power supply. The ability to upgrade just the VRAM and the GPU itself could generate cost savings over time.

Either way, we are stuck in the socketed-CPU, DIMM-slot, multiple-PCIe-slots-off-one-motherboard era, as it has been since the 80386 days (circa late '80s). Only the integration of I/O onto the motherboard has really changed since then, requiring fewer expansion slots. Oh, and I'm glad the era of jumpers is over. I hated those little plastic pieces of you know what. :)
 
Joined
Oct 22, 2014
Messages
13,935 (3.83/day)
Location
Sunshine Coast
System Name Lenovo ThinkCentre
Processor AMD 5650GE
Motherboard Lenovo
Memory 32 GB DDR4
Display(s) AOC 24" Freesync 1m.s. 75Hz
Mouse Lenovo
Keyboard Lenovo
Software W11 Pro 64 bit
@lexluthermiester
VRM differences. Trace differences on the board for different memory buses and configs (and PCIe lanes nowadays). Different-sized chips requiring different cold plates on coolers, since there is no unified IHS. Lots of reasons, really. They are POTENTIALLY solvable, but you would have to have a board and cooler combo that would be interchangeable between a hypothetical 4060 and 4090. This would be hilariously costly. Do we really want GPUs costing even more?
Why not make the GPU socket a standard and all GPUs have to comply so one cooler would suffice for any unit used?
 
Joined
Nov 27, 2023
Messages
2,010 (6.28/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (23H2)
@Caring1
Because it doesn’t work this way even with CPUs? Good luck cooling a 14900KS with a stock cooler. Same here - if you have a hypothetical 600W 5090 and standardize with THAT in mind you’d have to make it so that “any cooler” can reasonably keep THAT cool. Which is costly. I suppose the argument can be made for establishing a ceiling for power usage on GPUs and going from there, but it wouldn’t fly.

Essentially, socketed GPUs, be it for AIBs or directly onto MoBos, would require a complete overhaul of everything about the modern PC form factor, as @Daven has mentioned. And it's just not really feasible as it stands right now - ATX and its offspring have a profoundly ingrained effect on how every single PC part is created and functions. And it has been this way for decades now. Shifting the entire ecosystem just for some potential upgradeability benefits wouldn't fly with any of the current players.
 
Joined
Oct 22, 2014
Messages
13,935 (3.83/day)
Location
Sunshine Coast
System Name Lenovo ThinkCentre
Processor AMD 5650GE
Motherboard Lenovo
Memory 32 GB DDR4
Display(s) AOC 24" Freesync 1m.s. 75Hz
Mouse Lenovo
Keyboard Lenovo
Software W11 Pro 64 bit
@Caring1
Because it doesn’t work this way even with CPUs? Good luck cooling a 14900KS with a stock cooler. Same here - if you have a hypothetical 600W 5090 and standardize with THAT in mind you’d have to make it so that “any cooler” can reasonably keep THAT cool. Which is costly. I suppose the argument can be made for establishing a ceiling for power usage on GPUs and going from there, but it wouldn’t fly.

Essentially, socketed GPUs, be it for AIBs or directly onto MoBos, would require a complete overhaul of everything about the modern PC form factor, as @Daven has mentioned. And it's just not really feasible as it stands right now - ATX and its offspring have a profoundly ingrained effect on how every single PC part is created and functions. And it has been this way for decades now. Shifting the entire ecosystem just for some potential upgradeability benefits wouldn't fly with any of the current players.
You can use the same socket for anything from an i3 to an i9, the cooler can be whatever you want it to be.
The same should apply to GPUs.
 
Joined
Jan 8, 2017
Messages
9,347 (3.30/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Yeah not happening, part of Nvidia's strategy is making upgrading the hardware as expensive as possible because customers don't have a choice.
 
Joined
Nov 27, 2023
Messages
2,010 (6.28/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (23H2)
@Caring1
I mean, sure, in that you can theoretically put the 14900K into the shittiest board possible and slap an Intel stock cooler on top of it. You won’t be able to actually access the full potential of the chip, but I suppose by the metric of “it can be physically done” it works.
You also haven’t provided any actual argument other than “it should be like this just cause” which… okay. I will be sure to forward your feedback to whoever it may concern.
 
Joined
Jul 5, 2013
Messages
27,005 (6.56/day)
I'm not going to respond to anyone separately, instead offering the following thought to all: Anything we can do with CPUs, we can do just as easily with GPUs. It's a design shift and spec that should have happened decades ago. Build a GPU daughter-card that has a socket and VDIMMs, have it plug into the motherboard with power circuitry to support whatever you want to install. I mean after all, we have the MXM standard, and that's fairly close.
 
Joined
Dec 30, 2010
Messages
2,187 (0.43/day)
Waiting for a comment asking why this isn't a thing on consumer cards.

It was.

In the '90s, cards had their own chip(s) for each and every function; 3DFX is one of dozens of examples. Chips and tech got advanced and now house pretty much everything inside one die, but NVIDIA's approach right now is extremely inefficient. High failure rates on wafers are expensive. AMD, on the other hand, now makes the compute dies separate from the memory cache dies, yielding much better on wafers and thus getting more out of each one.
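The yield argument above can be made concrete with the standard Poisson die-yield model. A rough sketch; the defect density and die areas below are illustrative assumptions, not NVIDIA's or AMD's actual numbers:

```python
import math

def die_yield(area_mm2: float, defects_per_mm2: float) -> float:
    """Poisson yield model: fraction of dies with zero defects."""
    return math.exp(-area_mm2 * defects_per_mm2)

# Assumed defect density: 0.1 defects/cm^2 = 0.001 defects/mm^2
D0 = 0.001

mono = die_yield(800, D0)     # one big monolithic die, ~800 mm^2
chiplet = die_yield(200, D0)  # one of four ~200 mm^2 chiplets

print(f"800 mm^2 monolithic yield: {mono:.1%}")   # ~44.9%
print(f"200 mm^2 chiplet yield:    {chiplet:.1%}") # ~81.9%
```

Smaller dies lose a much smaller fraction of the wafer to defects, which is why splitting a big monolithic design into separate compute and cache dies improves yields.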
 
Joined
Mar 16, 2017
Messages
225 (0.08/day)
Location
behind you
Processor Threadripper 1950X (4.0 GHz OC)
Motherboard ASRock X399 Professional Gaming
Cooling Enermax Liqtech TR4
Memory 48GB DDR4 2934MHz
Video Card(s) Nvidia GTX 1080, GTX 660TI
Storage 2TB Western Digital HDD, 500GB Samsung 850 EVO SSD, 280GB Intel Optane 900P
Display(s) 2x 1920x1200
Power Supply Cooler Master Silent Pro M (1000W)
Mouse Logitech G602
Keyboard Corsair K70 MK.2
Software Windows 10
Surprised this is possible given the speed of modern VRAM, unless they're thinking everything will use HBM. Not sure how practical GPU sockets would be on consumer GPUs with separate VRAM for signal integrity reasons, that was ostensibly the reason for ditching upgradable VRAM so long ago (it used to be a thing).
 
Joined
Sep 17, 2014
Messages
22,110 (6.01/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Ah, improved yields! Cheaper GPUs!

Stop laughing pls!
 
Joined
Dec 28, 2012
Messages
3,771 (0.88/day)
System Name Skunkworks
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software openSUSE tumbleweed/Mint 21.2
It was.

In the '90s, cards had their own chip(s) for each and every function; 3DFX is one of dozens of examples. Chips and tech got advanced and now house pretty much everything inside one die, but NVIDIA's approach right now is extremely inefficient. High failure rates on wafers are expensive. AMD, on the other hand, now makes the compute dies separate from the memory cache dies, yielding much better on wafers and thus getting more out of each one.
Yeah, but wafer yields and centralization of components have nothing to do with why GPUs are not socketed.

I'm not going to respond to anyone separately, instead offering the following thought to all: Anything we can do with CPUs, we can do just as easily with GPUs. It's a design shift and spec that should have happened decades ago. Build a GPU daughter-card that has a socket and VDIMMs, have it plug into the motherboard with power circuitry to support whatever you want to install.
But why? To what end?

Let's say you did make a socketed GPU. By the time you are ready to upgrade your GPU, guess what? The VRAM is going to need an upgrade too. Which means a new board. So you get that "upgradeability" and never use it, OR waste tons of $$$ upgrading within a generation or every generation, like the people who bought a Ryzen 1600, 2600, 3600, and 5600 for the same PC and spent twice what it would have cost to simply buy an i7 PC originally and then replace it with a 5600 PC later.

How many people out there are salivating at the idea of putting a 9800X on an X370 board with 2400 MHz DDR4? Not many, I'd reckon.

Speaking of AMD, there's another great reason. AMD controlled their chipsets and firmware, yet getting Ryzen 3000 and 5000 onto the 300-series chipsets was a major clusterfuck. To date we STILL don't have full support. There are 400-series and 500-series boards that never got the X3D update and cannot run those chips. YAY, what a wonderful support nightmare! Intel couldn't pull it off either; remember the issues with PCIe 4.0 and Rocket Lake?

That's before getting into issues with clock scaling on the current DIMM design, or the extra space it would require. And the cost: you'd either have to have multiple socket designs, OR even the cheapest boards would have to support a 4090. So all the people whining about GPU costs now get to pay 4090 board prices for 4060 GPUs! WOOOHHOOOOO!! :slap:

Most people don't upgrade every gen anyway. Much like CPUs, GPUs often go several generations before being upgraded. The primary market for Pascal was Fermi holdouts and first-gen Kepler buyers, not those who owned Maxwell. And if you wait 5-6 years, well, you've gotta upgrade everything anyway.
I mean after all, we have the MXM standard, and that's fairly close.
The only difference between MXM and desktop GPUs is that one is mobile-sized and has the video-out circuitry wired to the motherboard. It's not any more modular than a 4090.
You can use the same socket for anything from an i3 to an i9, the cooler can be whatever you want it to be.
The same should apply to GPUs.
But why? What does this gain you? You buy a 4060, then later buy a 4070 chip, wasting a ton of cash, but WHOOPS, you've choked your new 4070 with a smaller bus; time for a new mobo!

Or maybe you get a 5060, but WHOOPS, you've again choked it, this time with slower DDR6, and to fix that you need a new GPU mobo! Just LOOK at the savings!

How many people do you know who want to run Alder Lake, or even Coffee Lake, chips on Ivy Bridge boards? Or Haswell?

For the same reason we don't put 10+ CPU generations on one motherboard: GPU tech advances quickly, and you can't plug newer tech into an older board. You need updated traces and power components, so socketing the GPU only introduces a new point of failure through thermal expansion.
 
Last edited:
Joined
May 10, 2023
Messages
158 (0.30/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
Uhhhh, is everyone really missing the point, or did I really misread the rumor???

This is talking about their GB200 chip, which has TWO blackwell GPUs (with HBM memory built-in) along with a Grace CPU (with LPDDR5X soldered next to it).
The current device has that entire board as a single piece.

This article is likely proposing that the Grace CPU will be socketed instead of built into a single board alongside the GPUs.

They even compare it to the Instinct MI300A, which is an APU (with HBM on top of it) that can be socketed.
Memory won't be something separate; it'll still be soldered. You can't just put LPDDR5X on a socket easily (I don't think Nvidia will be using those CAMM modules), nor can you with HBM.

Nvidia's GPU-only offerings are already socketed, either in SXM or PCIe. That rumor is NOT referring to those.
The G in GB200 refers to the Grace chip that lives alongside the two Blackwell GPUs on the board.

EDIT: Also funny to see people comparing this to desktop products when those have nothing to do with one another lol
 

SL2

Joined
Jan 27, 2006
Messages
2,309 (0.34/day)
EDIT: Also funny to see people comparing this to desktop products when those have nothing to do with one another lol
I don't see much comparing going on, just a few fantasies.
 