Monday, June 12th 2023

GIGABYTE RTX 4090 WindForce V2 Unveiled, with Unique Tail-ended Power Connector

GIGABYTE GeForce RTX 4090 WindForce V2 may not be the company's most premium RTX 4090 custom-design graphics card, but it arguably has the best power connector design. Almost every custom RTX 4080 and RTX 4090 follows a design wherein the PCB spans half or two-thirds the length of the card, with the remainder dedicated to the cooling solution, where the heatsink can take up the full thickness of the card. In the case of the RTX 4090 WindForce V2, this extended portion of the cooler is recessed by half, exposing the tail end of the PCB. The 16-pin 12VHPWR power input points toward the tail end of the card, rather than toward its top.

This makes cabling very convenient, as the cable no longer needs to bend nearly 180° as it emerges from behind your motherboard tray. The power cable runs straight into the connector for roughly 8 cm before making a 90° turn toward the back of the motherboard tray, reducing mechanical strain on the connector. Even the NVIDIA-designed adapter included with the card (the one that converts 4x 8-pin to 12VHPWR) should look neater. As a WindForce-series product, this card likely offers GIGABYTE's lowest tier of factory overclock. You still get dual BIOS, which lets you toggle between this factory OC and a Silent BIOS that runs the card at reference speeds with tighter fan tuning. The selling point, however, is its unique power connector design.
Sources: harukaze5719 (Twitter), VideoCardz

53 Comments on GIGABYTE RTX 4090 WindForce V2 Unveiled, with Unique Tail-ended Power Connector

#26
dyonoctis
AusWolfIt's more like half, but I don't see the point of cutting the rest of the fins just to cram the connector a little bit deeper into the card. It shows what's wrong with modern day gaming in general: form over function.

IMO, leave the fin stack alone, or make the card shorter. This intermediate solution is the worst of both worlds.
But if it's shorter, it's going to be louder :D. Or they will have to make the card thicker again :D. You are always going to lose in some way with the 4090.
Posted on Reply
#27
eidairaman1
The Exiled Airman
Count von SchwalbeMuch better if the cutout was at the bottom of the card. Although it would make it necessary to take the card out to insert or remove the connector.

Why not put the connector on an extension to make it end-mounted?
Yup, and with all of their cards cracking by the PCIe locking lug, it's not worth it.
Posted on Reply
#28
N/A
I was hoping for a dual-fan 100 mm WF2 version: 3.5 slots but 22-25 cm long. From the looks of it, the V2 introduces a new, smaller PCB the height of a normal card. Perfect for a water-cooled mini PC.
Posted on Reply
#30
AusWolf
dyonoctisBut if it's shorter, it's going to be louder :D. Or they will have to make the card thicker again :D. You are always going to lose in some way with the 4090.
IMO, it's better to keep one's expectations in check, play at a resolution that makes sense, and buy a card that isn't overly expensive, has good power consumption and comes with a cooler that fits your case while not being too loud - but it might be just me.
Posted on Reply
#32
TechLurker
AusWolfSo they cut half of the fin stack under the third fan, so it's blowing through nothing. How pointless!
On the contrary, it IS blowing over the connection point, so it should theoretically keep the pins a bit cooler and further reduce the risk of melting.
Posted on Reply
#33
nomdeplume
AusWolfIt's more like half, but I don't see the point of cutting the rest of the fins just to cram the connector a little bit deeper into the card. It shows what's wrong with modern day gaming in general: form over function.

IMO, leave the fin stack alone, or make the card shorter. This intermediate solution is the worst of both worlds.
Practical observations are good. I'd want to see a thermal demonstration of how this design actually works inside a couple of typical builds before passing too strong a comment.

This passthrough had to (should?) have been designed with a fair amount of sculpting, knowing that airflow will immediately be impeded when it hits the cable and connector. I'm fairly sure fan noise and vibration tuning informed the layout. It's also important to note this occurs at the first entry point for cool outside air. There are a lot of question marks here that could well prove... "It shows what's wrong with modern day gaming in general: form over function."
Posted on Reply
#34
Dragokar
Is it just me, or does the gap between the connector and the backplate look pretty narrow?
Posted on Reply
#35
dyonoctis
AusWolfIMO, it's better to keep one's expectations in check, play at a resolution that makes sense, and buy a card that isn't overly expensive, has good power consumption and comes with a cooler that fits your case while not being too loud - but it might be just me.
To be fair, the majority of 4090 coolers turned out to be overbuilt and created a set of fitment issues that don't feel justified, along with the bulkiness of the 12VHPWR adapter. The V1 of that Gigabyte card had no issue staying around 65°C at 32~35 dB. The coolers were meant to cool 600 W, when in reality you don't even come close to that... and those issues trickled down all the way to some 4070 Ti cards.

I won't comment too much on the price/resolution aspect, since IMO the 4090 is only wearing a "gaming" skin; I've seen lots of high-profile motion/3D designers buying those GPUs to do work for big clients... and not caring about gaming at all. (And people with a large disposable income are always outliers anyway :D)
Posted on Reply
#36
Chaitanya
AusWolfLike this?
www.techpowerup.com/309313/asus-shows-concept-geforce-rtx-4070-without-power-connector
It would be nice if it was standardised.


Probably. The problem in my opinion, is not where the cables are, or how they look. The problem is that the third fan cools basically nothing. It's just there for the noise.

Edit: If they had cut the third fan completely, and put the power connector at the rear end of the card, it would have been a much better design, methinks.
That would be up to PCI-SIG to formalize that power connector on the motherboard, though it would mean having to connect multiple PCIe power connectors on the motherboard (which would quickly become a headache on server/WS/HEDT platforms where multiple GPUs are the norm).
Posted on Reply
#37
AusWolf
ChaitanyaThat would be up to PCI-SIG to formalize that power connector on the motherboard, though it would mean having to connect multiple PCIe power connectors on the motherboard (which would quickly become a headache on server/WS/HEDT platforms where multiple GPUs are the norm).
Still better than bothering with the cable every time you swap your GPU.
nomdeplumePractical observations are good. I'd want to see a thermal demonstration of how this design actually works inside a couple of typical builds before passing too strong a comment.

This passthrough had to (should?) have been designed with a fair amount of sculpting, knowing that airflow will immediately be impeded when it hits the cable and connector. I'm fairly sure fan noise and vibration tuning informed the layout. It's also important to note this occurs at the first entry point for cool outside air. There are a lot of question marks here that could well prove... "It shows what's wrong with modern day gaming in general: form over function."
A fair point. I still think the design is kind of pointless, though. If you need a shorter graphics card, then just... dunno... buy a shorter graphics card?
Posted on Reply
#38
Gmr_Chick
Bomby569this dude is just a resonance box for whatever is the latest drama. The real drama queen.
Unsurprising, his content is so bad, it's the way to get views.
Did you even watch the video? The PCB on the Gigabyte card was THICKER than the PCB of the Asus card, yet the Asus card exhibited little to no flex, whereas the Gigabyte card flexed so badly it was making me nervous. :fear:

So you tell me: how can a thicker PCB have so much flex compared to a thinner one in that specific area of the GPU?
Posted on Reply
#39
Bomby569
Gmr_ChickDid you even watch the video? The PCB on the Gigabyte card was THICKER than the PCB of the Asus card, yet the Asus card exhibited little to no flex, whereas the Gigabyte card flexed so badly it was making me nervous. :fear:

So you tell me: how can a thicker PCB have so much flex compared to a thinner one in that specific area of the GPU?
Things that flex are more resistant to breaking; the ones that don't flex are the ones that break easily. Oh boy!
Posted on Reply
#40
awesomesauce
Probably already said, but damn, the connector is so close to the heatsink.

If it gets too hot, I feel like it could be problematic.
Posted on Reply
#41
N/A
Gmr_ChickDid you even watch the video? The PCB on the Gigabyte card was THICKER than the PCB of the Asus card, yet the Asus card exhibited little to no flex, whereas the Gigabyte card flexed so badly it was making me nervous. :fear:

So you tell me: how can a thicker PCB have so much flex compared to a thinner one in that specific area of the GPU?
You mean when Jay was holding the caliper and his hand twitched, so instead of 1.6 it locked at 2.1?

Gigabyte flexed more because the Asus had this pillar standoff on top of the PCB that provided reinforcement in the opposite direction.

And yeah, of course the Gigabyte was probably not 2 oz copper; they skimped on that promise. One could expect that if the motherboards are being marketed as 2 oz copper, the GPUs would employ the same principle.
Posted on Reply
#42
dyonoctis
Bomby569Things that flex are more resistant to breaking; the ones that don't flex are the ones that break easily. Oh boy!
It's not just about the PCB, but the construction of the card itself. The Asus doesn't flex because the structure of the cooler itself helps handle the weight, whereas it seems that Gigabyte just lets the PCB carry a big chunk of the weight. The Founders Edition cards are a good example of a strong structure: they are pretty heavy for their size, but they don't flex... and don't make the news for breaking their PCBs :D
Posted on Reply
#43
Warrior24_7
It better have an ironclad warranty against cracks. It also pales in comparison to Asus’s 4090 Matrix!
Posted on Reply
#44
Bomby569
dyonoctisIt's not just about the PCB, but the construction of the card itself. The Asus doesn't flex because the structure of the cooler itself helps handle the weight, whereas it seems that Gigabyte just lets the PCB carry a big chunk of the weight. The Founders Edition cards are a good example of a strong structure: they are pretty heavy for their size, but they don't flex... and don't make the news for breaking their PCBs :D
"making the news", aka one Ytuber made a video about a old issue

I have gigabyte products and i knew about this on their subreddit (were it's mostly complains of all sorts, warranty especially), hardly news. Happens from time to time, mostly pre build and i bet it is just because of.... being on prebuilds.
Posted on Reply
#45
RH92
That's a stupid idea for a virtually nonexistent problem, considering they are sacrificing a significant portion of the heatsink.

But hey, it's all about cutting corners (pun intended) and pitching it as an advancement.

If Gigabyte wants to fix some real issues on their GPUs, then how about they design a real support bracket and start to honour their warranties?

Oh yeah, that won't happen; they would have to stop cutting corners for that.
Posted on Reply
#46
AusWolf
RH92That's a stupid idea for a virtually nonexistent problem, considering they are sacrificing a significant portion of the heatsink.

But hey, it's all about cutting corners (pun intended) and pitching it as an advancement.

If Gigabyte wants to fix some real issues on their GPUs, then how about they design a real support bracket and start to honour their warranties?

Oh yeah, that won't happen; they would have to stop cutting corners for that.
Stop thinking about problems! Just buy this shiny new thing that you never needed, and RMA it when it breaks! :p
Posted on Reply
#47
Chomiq
AusWolfStop thinking about problems! Just buy this shiny new thing that you never needed, and RMA it when it breaks! :p
RMA and get a free red arrow sticker back!
Posted on Reply
#48
dyonoctis
Bomby569"making the news", aka one Ytuber made a video about a old issue

I have gigabyte products and i knew about this on their subreddit (were it's mostly complains of all sorts, warranty especially), hardly news. Happens from time to time, mostly pre build and i bet it is just because of.... being on prebuilds.
Jay wasn't the only one; Louis Rossmann made the first video, and that video came about because someone had to repair a lot of Gigabyte GPUs sent in by people who didn't get any help from Gigabyte, and even with the 4000 series, he keeps receiving a lot of cracked GPUs from that brand. Other media outlets then picked up the story. That's bad RMA on their part. Keeping quiet about it will just encourage Gigabyte to keep providing bad RMA.
repair.wiki/w/Repairing_a_Cracked_Gigabyte_30_or_40_series
Posted on Reply
#49
Bomby569
dyonoctisJay wasn't the only one; Louis Rossmann made the first video, and that video came about because someone had to repair a lot of Gigabyte GPUs sent in by people who didn't get any help from Gigabyte, and even with the 4000 series, he keeps receiving a lot of cracked GPUs from that brand. Other media outlets then picked up the story. That's bad RMA on their part. Keeping quiet about it will just encourage Gigabyte to keep providing bad RMA.
repair.wiki/w/Repairing_a_Cracked_Gigabyte_30_or_40_series
Gigabyte's bad RMA is legendary; it's hardly news. Their subreddit is 80% bad RMA.
I know he wasn't the first; he just jumped on the drama as usual. He is the ambulance-chaser guy.
Rossmann isn't a PC tech guy; he just reports on bad consumer practices for his right-to-repair cause. For him, I believe the Gigabyte RMA situation can be news. For someone in the business, it can't be.
Rossmann also said it was mostly from prebuilts, but I guess that was lost in the drama.

Like with Asus and many other recent dramas, there is a story, but the drama is overblown (the Gigabyte PSU and that case that caught fire were the exceptions), out of context, and clickbait.
Posted on Reply
#50
N/A
Warrior24_7It better have an ironclad warranty against cracks. It also pales in comparison to Asus’s 4090 Matrix!
The problem mostly appears after improper transport of prebuilt systems, and from user error with misaligned slots and cases. And as jayz2c pointed out, the shroud is not properly attached at the back.

The Aorus and the Matrix look pretty much the same to me, with the obvious design differences.



The V2 PCB is as big as the inside of the Asus frame, so it could lead to some really tiny dual-fan 4090s and such.
Posted on Reply