
Gigabyte GeForce RTX 5090 Gaming OC

Joined
Jan 19, 2023
Messages
406 (0.53/day)
Oh you sweet summer child. I have such sights to show you.
The 5090 isn't great for efficiency due to its hilariously overkill power target, but it's still absolutely not the "most inefficient" GPU in relative terms overall. Far from it. Latter-day GCN, Vega and Fermi were much worse.
That just reminded me that I really need to go through some of the TPU reviews of old GPUs as a reminder, a history lesson :D
 
Joined
Oct 6, 2021
Messages
1,723 (1.40/day)
System Name Raspberry Pi 7 Quantum @ Overclocked.
That just reminded me that I really need to go through some of the TPU reviews of old GPUs as a reminder, a history lesson :D

Strange... when they mention poor efficiency I automatically think of the 295X2, because it was the first time I saw some YouTubers doing PSU CrossFire. XD

But when I check the performance again, I'm reminded that the card was a monster, and it makes me miss multi-GPU.

It would greatly reduce the "sexy appeal" of the xx90 tier, more than the price already does. Two 9070/5070 Ti cards would brutally run over that.
 
Joined
Jan 19, 2023
Messages
406 (0.53/day)

Strange... when they mention poor efficiency I automatically think of the 295X2, because it was the first time I saw some YouTubers doing PSU CrossFire. XD

But when I check the performance again, I'm reminded that the card was a monster, and it makes me miss multi-GPU.

It would greatly reduce the "sexy appeal" of the xx90 tier, more than the price already does. Two 9070/5070 Ti cards would brutally run over that.
Yeah, I really hope next gen we see true MCM designs, and that that's how AMD makes its comeback to the high end. Just slap two 9070 XT-class GCDs together, or even better four, and let her rip :D
It would probably need to move to a 3 nm process, but it would be progress and could easily be scaled down to lower SKUs.
 
Joined
Nov 27, 2023
Messages
2,888 (6.39/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (24H2)
It would greatly reduce the "sexy appeal" of the xx90 tier, more than the price already does. Two 9070/5070 Ti cards would brutally run over that.
That would require developers taking the time and putting in the effort to make sure their games actually use mGPU properly. With DX12 and Vulkan there is no more crutch in the form of NV creating SLI driver profiles for them. And developers essentially abandoned the idea of implementing explicit mGPU in their games due to it being an extremely niche feature which takes away development resources. So I would say that without such support no combination of cards would run over anything. mGPU didn't die because NV and AMD stopped supporting it, it died because it was suboptimal in many cases and the new development paradigm meant that nobody could be arsed.
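For anyone wondering what "explicit mGPU" actually asks of a developer, here's a minimal sketch of the Vulkan 1.1 entry point for it (illustrative only; instance creation, error handling and the rest of device setup are omitted, and a real integration would also have to split memory, swapchains and submissions across GPUs):

```cpp
#include <vulkan/vulkan.h>
#include <vector>
#include <cstdio>

// Lists linked GPU groups and shows where explicit-mGPU device creation
// would begin. Nothing here happens automatically: the app must opt in.
void listDeviceGroups(VkInstance instance) {
    uint32_t groupCount = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, nullptr);

    std::vector<VkPhysicalDeviceGroupProperties> groups(
        groupCount, {VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES});
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups.data());

    for (const auto& g : groups) {
        std::printf("Group with %u physical device(s)\n", g.physicalDeviceCount);
        if (g.physicalDeviceCount < 2)
            continue;  // single GPU: nothing to link

        // Opting in: chain this into VkDeviceCreateInfo::pNext, then call
        // vkCreateDevice(). From here on, every heap allocation, swapchain
        // image and queue submission needs explicit per-GPU device masks --
        // the per-game work developers stopped signing up for.
        VkDeviceGroupDeviceCreateInfo groupInfo{};
        groupInfo.sType = VK_STRUCTURE_TYPE_DEVICE_GROUP_DEVICE_CREATE_INFO;
        groupInfo.physicalDeviceCount = g.physicalDeviceCount;
        groupInfo.pPhysicalDevices = g.physicalDevices;
        (void)groupInfo;  // sketch only: real code would fill VkDeviceCreateInfo next
    }
}
```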
 
Joined
Oct 6, 2021
Messages
1,723 (1.40/day)
System Name Raspberry Pi 7 Quantum @ Overclocked.
That would require developers taking the time and putting in the effort to make sure their games actually use mGPU properly. With DX12 and Vulkan there is no more crutch in the form of NV creating SLI driver profiles for them. And developers essentially abandoned the idea of implementing explicit mGPU in their games due to it being an extremely niche feature which takes away development resources. So I would say that without such support no combination of cards would run over anything. mGPU didn't die because NV and AMD stopped supporting it, it died because it was suboptimal in many cases and the new development paradigm meant that nobody could be arsed.
This is amusing because, in the past, with smaller budgets, developers created their own assets, maintained their own engines, and a wide range of games worked well with multi-GPU setups. Now, despite million-dollar budgets, third-party engines full of shortcuts, ready-made assets, and heavily automated processes… supposedly, mGPU has become "too complicated". :p
 
Joined
Nov 27, 2023
Messages
2,888 (6.39/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (24H2)
@Denver
A shitton of said "golden past" games were a nightmare of spaghetti code barely adhering to DX standards, often bailed out to a workable state by NVIDIA's and AMD's driver teams. I remember an AMA on Reddit with an ex-NV driver guy who straight up said that a lot of console ports in the late 00s to early 10s came to them essentially unshippable, and they had to scramble with Day 1 drivers. I think one of those might have been an Assassin's Creed. That includes SLI implementations. Don't get it wrong, in the olden (relatively) times a lot of games were just as terribly made as now. It's just that now, with low-level APIs, the onus is entirely on the developers themselves, and NV and AMD can no longer bail them out.

Having their own engine is also a double-edged sword - I despise the modern over-reliance on Unreal, sure, and we had absolute black-magic stuff in the past like the original Source or MT Framework, yeah. But for every one of those there was a RAGE (an absolute piece of shit engine from what I heard from people who worked with it, and seemingly it continues being one to this day), or Bethesda trotting out the raped corpse of NetImmerse/Gamebryo/Creation/whatever for the Xth time, or Hero Engine, or EA trying to force a square peg into a round hole with Frostbite for every game.
 
Joined
Aug 28, 2023
Messages
262 (0.48/day)
Correct; so the irony is, it's a useless feature unless your GPU is powerful enough; anything below a 5080 won't give you that 60 FPS+ with path tracing. You just get to pay a higher price for old-gen tech, ergo, an overpriced turd.
I wish I had that useless frame gen on my current weak GPU to push that 60 FPS to 120 or 180.
 
Joined
Dec 14, 2011
Messages
1,299 (0.27/day)
Location
South-Africa
Processor AMD Ryzen 9 5900X
Motherboard ASUS ROG STRIX B550-F GAMING (WI-FI)
Cooling Noctua NH-D15 G2
Memory 32GB G.Skill DDR4 3600Mhz CL18
Video Card(s) ASUS GTX 1650 TUF
Storage SAMSUNG 990 PRO 2TB
Display(s) Dell S3220DGF
Case Corsair iCUE 4000X
Audio Device(s) ASUS Xonar D2X
Power Supply Corsair AX760 Platinum
Mouse Razer DeathAdder V2 - Wireless
Keyboard Corsair K70 PRO - OPX Linear Switches
Software Microsoft Windows 11 - Enterprise (64-bit)
I wish I had that useless frame gen on my current weak GPU to push that 60 FPS to 120 or 180.

You can: buy and download Lossless Scaling from Steam and you will find out exactly what I mean. The input lag is so bad the game isn't worth playing. You are seeing a smoother picture, sure, but you can't really play the game in "real-time".
 
Joined
Apr 16, 2013
Messages
566 (0.13/day)
Location
Bulgaria
System Name Black Knight | White Queen
Processor Intel Core i9-10940X (28 cores) | Intel Core i7-5775C (8 cores)
Motherboard ASUS ROG Rampage VI Extreme Encore X299G | ASUS Sabertooth Z97 Mark S (White)
Cooling Noctua NH-D15 chromax.black | Xigmatek Dark Knight SD-1283 Night Hawk (White)
Memory G.SKILL Trident Z RGB 4x8GB DDR4 3600MHz CL16 | Corsair Vengeance LP 4x4GB DDR3L 1600MHz CL9 (White)
Video Card(s) ASUS ROG Strix GeForce RTX 4090 OC | KFA2/Galax GeForce GTX 1080 Ti Hall of Fame Edition
Storage Samsung 990 Pro 2TB, 980 Pro 1TB, 850 Pro 256GB, 840 Pro 256GB, WD 10TB+ (incl. VelociRaptors)
Display(s) Dell Alienware AW2721D 240Hz| LG OLED evo C4 48" 144Hz
Case Corsair 7000D AIRFLOW (Black) | NZXT ??? w/ ASUS DRW-24B1ST
Audio Device(s) ASUS Xonar Essence STX | Realtek ALC1150
Power Supply Enermax Revolution 1250W 85+ | Super Flower Leadex Gold 650W (White)
Mouse Razer Basilisk Ultimate, Razer Naga Trinity | Razer Mamba 16000
Keyboard Razer Blackwidow Chroma V2 (Orange switch) | Razer Ornata Chroma
Software Windows 10 Pro 64bit
Suddenly, RTX 40 cards are more valuable for backwards compatibility.

RTX 50 series silently removed 32-bit PhysX support
 
Joined
Jan 19, 2023
Messages
406 (0.53/day)
You can: buy and download Lossless Scaling from Steam and you will find out exactly what I mean. The input lag is so bad the game isn't worth playing. You are seeing a smoother picture, sure, but you can't really play the game in "real-time".
Not really comparable with DLSS FG or FSR FG.
I've used both often, and I certainly did play in "real-time".

Suddenly, RTX 40 cards are more valuable for backwards compatibility.

Yes, totally, everybody will suddenly use PhysX in the 5 games that supported it 10 years ago.
At some point old software needs to be deprecated, if not for cleanup purposes then for security, so that no vulnerability suddenly appears in software that hasn't been updated for 10 years. And I said the same when AMD dropped monthly driver support for older GPUs.
It's not a big deal.
 
Joined
Jun 24, 2017
Messages
194 (0.07/day)
I can't remember such a bad launch of an NVIDIA series since the 2xx series or so.
Bad product: safety, backwards compatibility, RAM, etc. Bad performance: barely improved over last gen. Bad price. Scarcity.

Is it planned, or could TPU report the internal shunt distribution for the 5090 in their reviews from now on? Could this be added to TPU's GPU database as a new field?

The only power we have as consumers is, thanks to this info, to boycott the awful power-input designs they are deliberately using.
Good shunts are not cheap, but neither are their boards: the BOM is far from the final price.
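To put rough numbers on why per-pin sensing matters (my own back-of-envelope sketch, not from any review; figures are illustrative):

```cpp
#include <cstdio>

int main() {
    const double boardPowerW = 575.0;  // approx. RTX 5090 power limit
    const double railVoltage = 12.0;
    const int    powerPins   = 6;      // 12 V pins in a 12V-2x6 connector

    const double totalCurrentA = boardPowerW / railVoltage;  // ~47.9 A
    const double perPinIdealA  = totalCurrentA / powerPins;  // ~8 A

    std::printf("Total draw: %.1f A, ideal share per pin: %.1f A\n",
                totalCurrentA, perPinIdealA);

    // With uneven contact resistance, one wire can carry far more than its
    // share. A board with a single combined shunt still measures only the
    // ~48 A total, which looks perfectly normal; per-pin (or per-pair)
    // shunts are what would let the card catch, say, 20+ A on one wire.
    return 0;
}
```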
 
Joined
Dec 14, 2011
Messages
1,299 (0.27/day)
Location
South-Africa
Processor AMD Ryzen 9 5900X
Motherboard ASUS ROG STRIX B550-F GAMING (WI-FI)
Cooling Noctua NH-D15 G2
Memory 32GB G.Skill DDR4 3600Mhz CL18
Video Card(s) ASUS GTX 1650 TUF
Storage SAMSUNG 990 PRO 2TB
Display(s) Dell S3220DGF
Case Corsair iCUE 4000X
Audio Device(s) ASUS Xonar D2X
Power Supply Corsair AX760 Platinum
Mouse Razer DeathAdder V2 - Wireless
Keyboard Corsair K70 PRO - OPX Linear Switches
Software Microsoft Windows 11 - Enterprise (64-bit)
Not really comparable with DLSS FG or FSR FG.
I've used both often, and I certainly did play in "real-time".

"Above all, do not lie to yourself. The man who lies to himself and listens to his own lie comes to a point that he cannot distinguish the truth within him, or around him, and so loses all respect for himself and for others."
 

Robbmrp

New Member
Joined
Feb 18, 2025
Messages
1 (0.33/day)
Do you have the thermal readings for the Gigabyte 5090 Waterforce GPUs? I'd like to see how those compare with the MSI versions.
 
Joined
Jan 19, 2023
Messages
406 (0.53/day)
"Above all, do not lie to yourself. The man who lies to himself and listens to his own lie comes to a point that he cannot distinguish the truth within him, or around him, and so loses all respect for himself and for others."
Dramatic much?
Great poetry night, but it has nothing to do with the simple fact that Lossless Scaling does not have access to the same data that DLSS or FSR FG get. If you want to compare it with something, compare it with AFMF or Smooth Motion.
And yeah, believe it or not, not everybody is a leet gamer who can detect every 1 ms of input lag with their eyes closed.
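To illustrate the data gap being described (a hypothetical sketch; none of the type names below are real DLSS, FSR or Lossless Scaling APIs): engine-integrated frame generation is fed per-pixel motion vectors and depth by the renderer, while a post-process tool only ever sees the finished frames and has to estimate motion optically.

```cpp
#include <cstdint>

struct Frame { const uint32_t* rgba; int width, height; };

// Engine-integrated FG (the DLSS FG / FSR FG situation): the renderer
// hands over per-pixel motion vectors and depth, so the interpolator can
// follow real object and camera motion.
struct EngineFGInputs {
    Frame        prevFrame, currFrame;
    const float* motionVectors;  // per-pixel, straight from the game engine
    const float* depthBuffer;    // from the renderer
};

// Post-process FG (the Lossless Scaling situation): only two finished
// frames are available; everything else must be inferred from pixels.
struct PostProcessFGInputs {
    Frame prevFrame, currFrame;  // nothing else is available
};
```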
 
Joined
Oct 1, 2014
Messages
2,038 (0.54/day)
Location
Calabash, NC
System Name The Captain (2.0)
Processor Ryzen 7 7700X
Motherboard Asus ROG Strix X670E-A
Cooling 280mm Arctic Liquid Freezer II, 4x Be Quiet! 140mm Silent Wings 4 (1x exhaust 3x intake)
Memory 32GB (2x16) Kingston Fury Beast CL30 6000MT/s
Video Card(s) MSI GeForce RTX 3070 SUPRIM X
Storage 1x Crucial MX500 500GB SSD; 1x Crucial MX500 500GB M.2 SSD; 1x WD Blue HDD, 1x Crucial P5 Plus
Display(s) Aorus CV27F 27" 1080p 165Hz
Case Phanteks Evolv X (Anthracite Gray)
Power Supply Corsair RMx (2021) 1000W 80-Plus Gold
Mouse Varies based on mood/task; is currently Razer Basilisk V3 Pro or Razer Cobra Pro
Keyboard Varies based on mood; currently Razer Blackwidow V4 75% and Hyper X Alloy 65

Strange... when they mention poor efficiency I automatically think of the 295X2, because it was the first time I saw some YouTubers doing PSU CrossFire. XD

But when I check the performance again, I'm reminded that the card was a monster, and it makes me miss multi-GPU.

It would greatly reduce the "sexy appeal" of the xx90 tier, more than the price already does. Two 9070/5070 Ti cards would brutally run over that.

Ah, yes, the behemoth 295X2. 2014 was around the time I really got into the PC DIY hobby, and I remember I saw that card in a computer magazine. It was a review. I thought it was crazy cool to see a card with two GPUs on it. :D
 
Joined
Oct 2, 2020
Messages
1,086 (0.68/day)
System Name Laptop ASUS TUF F15 | Desktop 1 | Desktop 2
Processor Intel Core i7-11800H | Intel Core i5-14600K @ 95W P-Cores only | Intel Core i3-10100
Motherboard ASUS FX506HC | Gigabyte B660M DS3H DDR4 | MSI MAG B560M Bazooka
Cooling Laptop built-in cooling lol | i9-12900 stock box cooler | Intel black stock box cooler with copper
Memory 24 GB @ 3200 | 16 GB @ 3600 | 16 GB @ 3200
Video Card(s) Nvidia RTX 3050 Mobile 4GB | Nvidia GTX 1050 Ti | Nvidia GTX 960 2 GB
Storage Adata XPG SX8200 Pro 512 GB | Samsung M2 SSD 256 GB & 1 TB 2.5" HDD @ 7200| SSD 250 GB & SSD 240 GB
Display(s) Laptop built-in 144 Hz FHD screen | Dell 27" WQHD @ 75 Hz & 49" TV FHD | Samsung 32" TV FHD
Case It's a laptop, it doesn't need case lmfao | Deepcool Mattrexx 55 MESH | Aerocool Cylon PRO
Audio Device(s) laptop built in audio | Logitech stereo speakers | Logitech 2.1 speakers
Power Supply ASUS 180W PSU | SeaSonic Focus GX-550 | MSI MAG A550BN
Mouse Logitech G604 | Corsair Harpoon wired mouse| Logitech G305
Keyboard Laptop built-in keyboard |Razer Blackwidow | Steelseries APEX 7 TKL
VR HMD Quest 2 sold out and don't need VR anymore lol
Software Windows 10 Enterprise 20H2 | Windows 11 24H2 LTSC | Windows 11 24H2 LTSC
Benchmark Scores good enough
What's wrong with Gigabyte? Is it hard to make a good cooling solution? Most of the time a Gigabyte GPU is the loudest or hottest. Even huge 3-slot, 3-fan solutions are pretty bad.
The "wrong" is when their fans rattle. I HOPE this is not an issue with SUCH cards though... :D:rolleyes::oops:
 
Joined
Sep 19, 2014
Messages
167 (0.04/day)

Strange... when they mention poor efficiency I automatically think of the 295X2, because it was the first time I saw some YouTubers doing PSU CrossFire. XD

But when I check the performance again, I'm reminded that the card was a monster, and it makes me miss multi-GPU.

It would greatly reduce the "sexy appeal" of the xx90 tier, more than the price already does. Two 9070/5070 Ti cards would brutally run over that.
It was a very bad, slow GPU that used a lot of power.
Even a 980 Ti was faster at 1080p, and an OC'd 980 Ti was faster at 1440p. There were also so many stuttering problems in games with the 295X2.

You can: buy and download Lossless Scaling from Steam and you will find out exactly what I mean. The input lag is so bad the game isn't worth playing. You are seeing a smoother picture, sure, but you can't really play the game in "real-time".
There is no input lag like that when using FG.
 
Joined
Dec 14, 2011
Messages
1,299 (0.27/day)
Location
South-Africa
Processor AMD Ryzen 9 5900X
Motherboard ASUS ROG STRIX B550-F GAMING (WI-FI)
Cooling Noctua NH-D15 G2
Memory 32GB G.Skill DDR4 3600Mhz CL18
Video Card(s) ASUS GTX 1650 TUF
Storage SAMSUNG 990 PRO 2TB
Display(s) Dell S3220DGF
Case Corsair iCUE 4000X
Audio Device(s) ASUS Xonar D2X
Power Supply Corsair AX760 Platinum
Mouse Razer DeathAdder V2 - Wireless
Keyboard Corsair K70 PRO - OPX Linear Switches
Software Microsoft Windows 11 - Enterprise (64-bit)
Joined
Jun 21, 2022
Messages
163 (0.17/day)
That would require developers taking the time and putting in the effort to make sure their games actually use mGPU properly. With DX12 and Vulkan there is no more crutch in the form of NV creating SLI driver profiles for them. And developers essentially abandoned the idea of implementing explicit mGPU in their games due to it being an extremely niche feature which takes away development resources. So I would say that without such support no combination of cards would run over anything. mGPU didn't die because NV and AMD stopped supporting it, it died because it was suboptimal in many cases and the new development paradigm meant that nobody could be arsed.

Due to the added DLSS 4 latency, dual-GPU metas are being (successfully) tested right now.

Still in the early stages though.
 