
Gigabyte GeForce RTX 5090 Gaming OC

What's wrong with Gigabyte? Is it hard to make a good cooling solution? Most of the time, a Gigabyte GPU is the loudest or hottest. Even huge 3-slot, 3-fan solutions are pretty bad.
Something reviewers never talk about with Gigabyte GPUs is how the thermal paste does not cover the entire GPU. Just look at the PCB pictures in this very review and you will see a spot where no paste is applied. The first time I saw it was on my Vega 64, but the exact same issue plagued the 6500 XT, with three fans for a GPU die that is smaller than a VRAM module.
 
Would you say that it feels way smoother than without framegen? I find it hard to believe until I try it out for myself.
Yes it does for me. I play with 4X in CP2077, but yes the base framerate needs to be around 60.
 
"Gigabyte's card also has its default power limit increased to 600 W for extra performance."

When the safety factor of the 12VHPWR/12V-2x6 connector is 1.1x, or 684 W, compared to the 8-pin's safety factor of 1.9x, or 252 W (https://en.wikipedia.org/wiki/16-pin_12VHPWR_connector), it doesn't matter how fast/good the card is when there is a very real fire hazard that could deprive someone of their life. Maybe the MFG acronym should be "Multi Flame Generation".
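Quick back-of-the-envelope on that, as a minimal Python sketch. The per-pin current ratings are my assumptions based on the linked Wikipedia article (9.5 A/pin for the 12V-2x6, a conservative 7 A/pin for the Mini-Fit Jr 8-pin; the 1.9x figure likely assumes higher-rated HCS terminals):

Code:
# Connector headroom = physical capacity / rated power (assumed figures)
V = 12.0

def capacity_w(pins, amps_per_pin):
    # Max sustained power the 12 V pins can physically carry
    return pins * amps_per_pin * V

connectors = {
    # name: (rated watts, 12 V pin count, assumed amps per pin)
    "12V-2x6":    (600.0, 6, 9.5),
    "8-pin PCIe": (150.0, 3, 7.0),
}

for name, (rated, pins, amps) in connectors.items():
    cap = capacity_w(pins, amps)
    print(f"{name}: {cap:.0f} W / {rated:.0f} W rated = {cap / rated:.2f}x")
# 12V-2x6:    684 W / 600 W rated = 1.14x
# 8-pin PCIe: 252 W / 150 W rated = 1.68x

Either way, a 600 W default power limit eats the connector's entire rating and leaves only that ~14% margin.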

I can't tell from the PCB. @W1zzard, do you know if this card has per-pin monitoring of the 6x 12 V connections like the Asus card does?
 
I can't tell from the PCB. @W1zzard, do you know if this card has per-pin monitoring of the 6x 12 V connections like the Asus card does?

I see two shunts, which means it's probably this:
[Diagram of the likely shunt configuration; credit Buildzoid on YouTube]

It's definitely not per-pin monitoring, but it's possible that half of the wires are being divided per shunt like this:
[Diagram: three of the six 12 V wires routed through each shunt]


Outside of a Gigabyte engineer replying, only a CT scan of the PCB layers, or someone probing the board with a multimeter could give you the answer you're looking for. I would expect that with only two shunts, Gigabyte are using a reference design from Nvidia and therefore all 6 12V pins are bridged before the shunts (and the decision to use two shunts to do the job of one shunt still makes no sense to me at all).
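Quick illustration of why that matters; a toy model with made-up currents, not Gigabyte's actual circuit:

Code:
# Pins bridged before the shunt(s): only the total current is measurable
RATED_AMPS = 9.5  # per-pin rating of the 12V-2x6

balanced = [8.3] * 6                          # ~50 A total, all pins healthy
faulty   = [22.0, 14.0, 8.0, 3.0, 2.0, 1.0]   # ~50 A total, one pin cooking

def bridged_total(pin_amps):
    # What the card sees when all 12 V pins are joined before the shunt
    return sum(pin_amps)

print(bridged_total(balanced), bridged_total(faulty))  # 49.8 vs 50.0: look alike
print(any(a > RATED_AMPS for a in faulty))             # True, but invisible to the card

Per-pin shunts (the ASUS approach) would read each element of that list individually, so the 22 A pin would be caught before anything melts.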
 
*Based on a reference design. Interesting that there is a stealthy 6x2 near the slot.

[PCB image]

2060 FE
[PCB image]
 
I hope we see a 5080 Super for $1499 but with like 95% of a 5090 in gaming and 20-24 GB VRAM. I'd probably buy that, honestly. The 5090 with 32 GB VRAM is def targeting non-gamers too, so I think there is room in the market for this, assuming the performance is still there for gaming.
 
What's wrong with Gigabyte? Is it hard to make a good cooling solution? Most of the time, a Gigabyte GPU is the loudest or hottest. Even huge 3-slot, 3-fan solutions are pretty bad.

From my point of view, most graphics cards are unacceptable.

Please ignore the fact that this is my card, and an AMD graphics card at that (that is another topic). Just check the temperature and fan noise, then compare it with this card or similar cards. I bought it on purpose because of such reviews.

Gamers Nexus had a livestream with the ASUS 5090 a day ago. What a noisy card... Just check out that stream on YouTube yourself. 100% fan speed.
 
Correct; so the irony is, it's a useless feature unless your GPU is powerful enough. Anything below a 5080 won't give you that 60 FPS+ with path tracing. You just get to pay a higher price for old-gen tech; ergo, an overpriced turd.

Yeah, yeah, but somehow Lossless Scaling, FSR FG, and AFMF are a godsend.
 
Something about using messy af thermal paste instead of thermal pads on a $2300 GPU just feels horribly wrong. Or is it just me?
 
*Based on a reference design. Interesting that there is a stealthy 6x2 near the slot.

[PCB image]
2060 FE
[PCB image]
Reference designs are jack-of-all-trades. There's a half-decent chance it's on multiple cards:

The Gigabyte 4090 had something similar, presumably for a workstation blower design or a server GPU:
[PCB image]


Palit 4090 (2x EPS12V for server?)
[PCB image]
 
On Quiet BIOS, it seems fine. It's a cheaply made cooler and fan shroud on their "OC" line-up as I mentioned before. Loud and prone to rattles. But, again, on the quiet BIOS, it's fine.
Yeah the Gaming OC is their lowest OC variant so they don't care as much... I'm sure the MASTER OC and XTREME WATERFORCE have much better cooling!
 
Outside of a Gigabyte engineer replying, only a CT scan of the PCB layers, or someone probing the board with a multimeter could give you the answer you're looking for. I would expect that with only two shunts, Gigabyte are using a reference design from Nvidia and therefore all 6 12V pins are bridged before the shunts (and the decision to use two shunts to do the job of one shunt still makes no sense to me at all).
So it's no better than the FE, and could still result in melted connectors/cables.
 
You can try just basic frame gen with Lossless Scaling, and IME even with that, 60 fps + frame gen feels much better than 60 fps without. Latency doesn't change, but visual smoothness goes up a lot.
Imo Lossless Scaling is good, but far from as good as FG/MFG, since it adds a lot more latency due to the driver doing the job, and it has a huge performance hit too!
You lose about 1/3 of performance when using LS, so if you get 4K 60 fps, you need to lock the fps to 40 to make it work as intended. LS doesn't work well with a variable framerate, so it needs to be locked at all times. So you might "get 120 fps" with 3X or "160 fps" with 4X, but the input lag feels a lot closer to 30 fps than anything else, and LS has a lot more artifacts too, since it's not using the game's motion vectors.

LS can be good for people who can't use FG and want more smoothness, but it's definitely not as good as FG. And DLSS 4 has much better image quality too!
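If anyone wants the lock-the-framerate math, here it is as a quick sketch (the ~1/3 overhead is the estimate from this post, not a measured figure):

Code:
# Frame-rate budgeting for LS frame gen, using the rough 1/3 overhead above
def ls_budget(native_fps, multiplier, overhead=1/3):
    base_lock = native_fps * (1 - overhead)    # fps to lock before enabling LS
    return base_lock, base_lock * multiplier   # (lock target, displayed fps)

for mult in (3, 4):
    lock, shown = ls_budget(60, mult)
    print(f"{mult}X: lock to {lock:.0f} fps -> ~{shown:.0f} fps displayed")
# 3X: lock to 40 fps -> ~120 fps displayed
# 4X: lock to 40 fps -> ~160 fps displayed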

I would expect that with only two shunts, Gigabyte are using a reference design from Nvidia and therefore all 6 12V pins are bridged before the shunts (and the decision to use two shunts to do the job of one shunt still makes no sense to me at all).
Idk why, but I feel like all AIBs have been told to do the same thing. Why is no AIB model using 2x 16-pin connectors? And why did none of them use 3 shunts like on the 3090 Ti to balance the load?
It seems like NVIDIA prevented everyone from doing something different. They probably just want people to feel like the FE is the one to buy (lower cost at MSRP, and even if you OC AIB models, you don't get much more performance due to power limits, so you might as well get an FE).

So it's no better than the FE, and could still result in melted connectors/cables.
Nvidia or "how to make AIB models irrelevant" because if they were better & safer then nobody would buy the FE lol
 
Imo Lossless Scaling is good, but far from as good as FG/MFG, since it adds a lot more latency due to the driver doing the job, and it has a huge performance hit too!
You lose about 1/3 of performance when using LS, so if you get 4K 60 fps, you need to lock the fps to 40 to make it work as intended. LS doesn't work well with a variable framerate, so it needs to be locked at all times. So you might "get 120 fps" with 3X or "160 fps" with 4X, but the input lag feels a lot closer to 30 fps than anything else, and LS has a lot more artifacts too, since it's not using the game's motion vectors.

LS can be good for people who can't use FG and want more smoothness, but it's definitely not as good as FG. And DLSS 4 has much better image quality too!
Yeah, for sure. My main point is that that's a technology pretty much everyone can check out, and if even that is better than base fps, then FG is pretty much guaranteed to be an improvement, in the context of who I was responding to.
 
Generally, the RTX 5090 is very expensive, but it is the best and everyone wants to have one.

Simply, no.
 
Correct; so the irony is, it's a useless feature unless your GPU is powerful enough. Anything below a 5080 won't give you that 60 FPS+ with path tracing. You just get to pay a higher price for old-gen tech; ergo, an overpriced turd.
If there weren't so much Nvidia hate, people, even AMD users, would understand that these are options in games, and path tracing is also optional.

We can turn settings down and use FG to get high FPS, make use of that high-Hz PC monitor, and enjoy a smooth ride, even without 5080/5090 GPUs.

I'm sure you also turn settings down in the options, right? ;)

Would you say that it feels way smoother than without framegen? I find it hard to believe until I try it out for myself.
It looks shit without FG.
Even if they're fake frames, it looks smooth and it feels smooth. There's no going back once you're gaming at high FPS, even if it's FG; it makes a huge difference.

Simply, no.
Simply yes, and yes.
It is an expensive GPU, but if people could get one for free, there would be no AMD users left... everyone would want one if it were free.
 
B&Hphotovideo is yet to receive... ONE card so far. Since the launch, they've received none to sell. What are we even talking about here.
 
Simply yes, and yes.
It is an expensive GPU, but if people could get one for free, there would be no AMD users left... everyone would want one if it were free.
I would just sell it if I got one for free, TBH. I don't fuck with the idea of a 600 W space heater as my GPU. But I realize that I am in the minority and have weird tastes.
 
This is just getting more and more ridiculous!

Now the overpriced "overclocking" card that has an MSRP that's $350 over the already overpriced "$2000" RTX 5090 looks like "a better deal", because the Founders Edition (or equivalent) is listed at "street price", and this card of course doesn't have a street price, so it's compared to it at MSRP? And if we don't like it, we have another chart, with all cards listed at their "fantasy" MSRP prices?

All the new cards will be viewed as a good deal in this scalping time. This is nothing but saying "it is what it is", and giving very little focus to the absolutely crap deal we are getting.

For instance, how is the value of this brand new card at street price (you can easily guess it, we already have other RTX 5090s in shops) compared to street prices of previous generations, when they were at normal prices?

Oh, it doesn't matter, you can't buy an RTX 4080 Super for $999 now! You could for most of the year, but that ship has sailed!

This is nothing but apologizing for scalping.
 
This is just getting more and more ridiculous!

Now the overpriced "overclocking" card that has an MSRP that's $350 over the already overpriced "$2000" RTX 5090 looks like "a better deal", because the Founders Edition (or equivalent) is listed at "street price", and this card of course doesn't have a street price, so it's compared to it at MSRP? And if we don't like it, we have another chart, with all cards listed at their "fantasy" MSRP prices?

All the new cards will be viewed as a good deal in this scalping time. This is nothing but saying "it is what it is", and giving very little focus to the absolutely crap deal we are getting.

For instance, how is the value of this brand new card at street price (you can easily guess it, we already have other RTX 5090s in shops) compared to street prices of previous generations, when they were at normal prices?

Oh, it doesn't matter, you can't buy an RTX 4080 Super for $999 now! You could for most of the year, but that ship has sailed!

This is nothing but apologizing for scalping.
There is no scalping time. There are no cards to purchase. Of course the only 5 produced that were snagged will be resold for more. The problem is the supply, not scalpers.
 
There is no scalping time. There are no cards to purchase. Of course the only 5 produced that were snagged will be resold for more. The problem is the supply, not scalpers.
I mean "scalping" as ridiculously increasing the price over MSRP, we don't even have the scalpers here - because all the new Blackwells came "pre-scalped" to shops, for instance all the RTX 5080 came above 1640 EUR, to all shops - as if that's the new base price, fuck the MSRP.
 
This must be the worst, most horribly inefficient computing product ever made. The idle power, the max power draw, the heat: it's just atrocious. Lol at this joke of an overpriced product.
The days of exceptional power-efficiency advancements are behind us. On N3E, GB202 would have used ~32% less power and ~37% less shader area, which still translates to 600 W but 18% faster, because you can always choose 450 W at the same performance later. I would probably hack the cable sense pins to run at 450 W just to be safe. It could go lower than 30 W at idle with the 512-bit memory at 51 MHz and the GPU at 200 MHz, if it allowed lower than 800 mV.

The price of the GB202 die is $400, not to mention other R&D costs and middlemen. The end price now starts at an insane markup and gets gradually reduced over time, but this was to be expected, as they just don't know what the market is willing to pay and have to take its pulse.

The PCB needs to uncover the second fan as well. Right now only one fan has direct flow.
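A quick sanity check on those scaling figures (using this post's own hypothetical percentages, not measured data):

Code:
# Hypothetical N3E shrink arithmetic from the claim above (not measured)
current_power = 600.0
iso_perf_power = current_power * (1 - 0.32)   # ~408 W at today's performance
headroom = current_power - iso_perf_power     # ~192 W left in the budget
print(f"iso-performance: ~{iso_perf_power:.0f} W, headroom: ~{headroom:.0f} W")
# Spending that ~192 W back into clocks/voltage is where the claimed
# "+18% faster at 600 W" would come from; or cap at ~450 W instead.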
 
Gigabyte still using sleeve bearing fans on an expensive card like this lol

Gotta pony up even more for their Aorus Master SKU to get double ball bearing fans
 
This must be the worst, most horribly inefficient computing product ever made. The idle power, the max power draw, the heat: it's just atrocious. Lol at this joke of an overpriced product.
Oh you sweet summer child. I have such sights to show you.
The 5090 isn't great for efficiency due to its hilariously overkill power target, but it still is absolutely not the "most inefficient" GPU in relative terms overall. Far from it. Latter-day GCN, Vega, and Fermi were much worse.
 