Base models have the two 8-pin connectors and will likely suffer from power-limit issues when overclocked (since I apparently forgot to mention this),
Overclocked to where? All cards produced these days are already running at their limits. And AIB models with two 8-pins will in no way be limited by nonexistent OC potential.
same situation the RTX 3090 had. That's why NV adopted the 2x6 connector, it eliminates that problem and ensures an ample power supply regardless of model.
Nvidia mainly adopted the 12-pin (back then, not the 16-pin or 2x6) because their PCB was so small that it could not properly fit 3x8-pin without either artificially extending the PCB or using a soldered-on add-on board on the cooler. It had nothing to do with 2x8-pin being unable to provide 450W.
Never gonna complain about spreading the power load among more, lower-gauge connectors after seeing the 12VHPWR debacle of the 40 series…
The 8-pin has never had this problem, so load balancing is unnecessary.
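For context on the connector debate above, here's a small sketch of the official sustained power budgets involved, using the well-known PCIe CEM limits (75W through the slot, 75W per 6-pin, 150W per 8-pin, up to 600W for 12V-2x6). Simplified: real cards can draw transient spikes above these figures.

```python
# Official sustained power limits (watts) per the PCIe CEM spec / 12V-2x6 rating.
SLOT_W = 75          # power delivered through the PCIe slot itself
CONNECTOR_W = {"6pin": 75, "8pin": 150, "12v2x6": 600}

def board_power_budget(connectors):
    """Sustained power budget for a card with the given aux connectors."""
    return SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

print(board_power_budget(["8pin", "8pin"]))          # 375 W: dual 8-pin card
print(board_power_budget(["8pin", "8pin", "8pin"]))  # 525 W: triple 8-pin AIB card
print(board_power_budget(["12v2x6"]))                # 675 W: single 12V-2x6 card
```

So a dual 8-pin board already has 375W of headroom on paper, which is why a stock ~300W card isn't actually starved by it.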
I like how most of the 9070s are bigger than the 5090 FE lol... Although people do love them some BFGs...
As if 50 series AIB cards are all two slot models?
DLSS 4 is good; it's people's perception (and I guess Nvidia's push) of how it should be used that's problematic. Sure, you might argue "who needs 240 fps", but I'd say: why not? We have monitors capable of that; we don't have CPUs or GPUs that can drive it natively, and MFG fixes that.
It's not 240fps. It's the perception of 240fps smoothness, but with the latency of the original framerate.
FG always requires a latency-reducing option such as Reflex (and now Reflex 2) to be enabled.
Without this enabled you get 240fps but it doesn't feel like 240 because of the input delay.
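The point about smoothness vs. latency can be put in numbers with a toy model: MFG multiplies the display cadence, but input is still sampled once per real rendered frame. (Simplified: this ignores Reflex, render-queue depth, and FG's own overhead.)

```python
# Toy model of multi-frame generation: display cadence improves by the
# generation factor, but the input-latency floor stays tied to the base
# render rate. Numbers are illustrative, not measurements.
def mfg(base_fps, factor):
    display_fps = base_fps * factor
    display_frame_ms = 1000 / display_fps   # perceived smoothness
    latency_floor_ms = 1000 / base_fps      # input sampled per real frame
    return display_fps, display_frame_ms, latency_floor_ms

fps, frame_ms, latency_ms = mfg(base_fps=60, factor=4)
print(fps, round(frame_ms, 2), round(latency_ms, 2))
# 240 fps on screen at a ~4.17 ms cadence, but a ~16.67 ms latency floor
```

Which is exactly the complaint: it looks like 240, it responds like 60.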
Sure, until you realize the clock speeds here exceed 3 GHz out of the box, and it may perhaps overclock to near 4 GHz? At least 3.5.
Show me a card in the last ten years that did a +1 GHz OC on air. I think the closest might have been the 980 Ti in 2015: it OC'd well, but even that couldn't add a full GHz on top of its boost clocks. At least not on air or without hard mods.
They'd rather stack three 8-pins than use a single 12V-2x6 lol. I wonder if pleasing a vocal minority is really the better option over space efficiency.
"Vocal minority" would be people like you asking for the 16-pin. Most AMD users I know don't want that.
Also this is just typical AIB flexing. Reference designs will have 2x8pin.
And I'm sure "space efficiency" is paramount on a 3.5-slot behemoth of a card...
Well, that kills any intent I had to even consider switching to AMD for my GPU.
That some AIB models include a third connector? (One you don't even need to use, btw.)
What an odd reason to write off an entire brand.
9070XT is supposed to be a 7900XT-class GPU on a much more power-efficient node.
If it consumes more than 300W then AMD have really f***ed up.
Leaks say stock is ~260W. These are all AIB models that are supposedly up to 330W.
By your logic, Nvidia also f***ed up with the 5090 going to 600W from the 4090's 450W, despite using a more power-efficient node with only a slightly larger chip.
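The comparison above is easy to check with the figures quoted in this thread (the 9070 XT numbers are leaked/AIB values, not confirmed specs):

```python
# Gen-over-gen board-power increases implied by the figures in this thread.
# 260 W / 330 W are leaked 9070 XT stock / top-AIB values; 450 W / 600 W
# are the 4090 and 5090 TGP figures.
def pct_increase(old_w, new_w):
    return 100 * (new_w - old_w) / old_w

print(round(pct_increase(260, 330)))  # stock -> top AIB: ~27 %
print(round(pct_increase(450, 600)))  # 4090 -> 5090:     ~33 %
```

So the AIB headroom on the 9070 XT is proportionally smaller than the jump Nvidia made between generations.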
It's not the traces, but solder balls broken due to GPU sag. It's going to be a core-reballing festival.
You're still wrong about this: the closer the GPU core is to the PCIe slot, the smaller the forces exerted on the solder joints. It's like a lever: the further you go from the pivot point (the slot), the more the PCB flexes.
Exactly. Near the PCIe connector, there should be the least PCB warping.
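The lever argument can be sketched with the standard cantilever-beam result, treating the PCB as a uniform beam clamped at the PCIe slot and carrying the cooler's weight as a distributed load. All numbers below are made-up illustrative values, not real PCB properties.

```python
# Simplified model: PCB as a uniform cantilever clamped at the slot (x = 0)
# under a uniformly distributed load q (the cooler's weight).
# Standard beam result: w(x) = q*x^2*(x^2 + 6*L^2 - 4*L*x) / (24*E*I).
def deflection(x, L=0.30, q=30.0, EI=50.0):
    """Deflection (m) at distance x (m) from the slot; illustrative units."""
    return q * x**2 * (x**2 + 6 * L**2 - 4 * L * x) / (24 * EI)

for x in (0.05, 0.15, 0.30):  # near the slot, mid-board, far end (m)
    print(f"x = {x:.2f} m -> deflection = {deflection(x) * 1000:.3f} mm")
```

Deflection grows rapidly with distance from the slot, which is why a die mounted close to the connector sees far less flex (and less solder-ball stress) than one hanging out at the far end of the board.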