
Gigabyte GeForce RTX 5090 Gaming OC

Oh you sweet summer child. I have such sights to show you.
The 5090 isn’t great for efficiency due to its hilariously overkill power target, but it is still absolutely not the “most inefficient” GPU in relative terms. Far from it. Latter-day GCN, Vega, and Fermi were much worse.
That just reminded me that I really need to go through some of the TPU reviews of old GPUs as a reminder, a history lesson :D
 
That just reminded me that I really need to go through some of the TPU reviews of old GPUs as a reminder, a history lesson :D

Strange... when they mention poor efficiency I automatically think of the 295x2, because it was the first time I saw some Youtubers doing PSU CF. XD

But when I check the performance again, I'm reminded that the card was a monster, and it makes me miss the Multi-GPU.

It would greatly reduce the "sexy appeal" of the xx90 tier, more than the price already does. Two 9070/5070ti would brutally run over that.
 

Strange... when they mention poor efficiency I automatically think of the 295x2, because it was the first time I saw some Youtubers doing PSU CF. XD

But when I check the performance again, I'm reminded that the card was a monster, and it makes me miss the Multi-GPU.

It would greatly reduce the "sexy appeal" of the xx90 tier, more than the price already does. Two 9070/5070ti would brutally run over that.
Yeah, I really hope next gen we can see true MCM designs, and that that's how AMD will make its comeback to the high end. Just slap two 9070 XT-class GCDs together, or even better four, and let her rip :D
It would probably need to move to a 3 nm process, but it would be progress, and it could easily be scaled down to lower SKUs.
 
It would greatly reduce the "sexy appeal" of the xx90 tier, more than the price already does. Two 9070/5070ti would brutally run over that.
That would require developers taking the time and putting in the effort to make sure their games actually use mGPU properly. With DX12 and Vulkan there is no more crutch in the form of NV creating driver profiles for SLI for them. And developers have essentially abandoned the idea of implementing explicit mGPU in their games because it is an extremely niche feature that takes away development resources. So I would say that without such support no combination of cards would run over anything. mGPU didn’t die because NV and AMD stopped supporting it; it died because it was suboptimal in many cases and the new development paradigm meant that nobody could be arsed.
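For anyone curious what "explicit mGPU" actually means on the developer side, here's a minimal sketch of the Vulkan 1.1 device-group path. It only shows the enumeration step, uses nothing beyond core Vulkan calls, and everything else (AFR scheduling, per-GPU resources, copies, presentation) would be on the developer:

// Minimal sketch: enumerating linked-GPU device groups in Vulkan 1.1.
// Everything past this point is the developer's problem -- there is no
// driver-side SLI profile to lean on anymore.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1;  // device groups are core in 1.1

    VkInstanceCreateInfo ici{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ici.pApplicationInfo = &app;
    VkInstance instance;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t groupCount = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, nullptr);
    std::vector<VkPhysicalDeviceGroupProperties> groups(
        groupCount, {VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES});
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups.data());

    for (uint32_t i = 0; i < groupCount; ++i)
        std::printf("device group %u: %u GPU(s)\n", i,
                    groups[i].physicalDeviceCount);

    // A logical device spanning the whole group would then be created by
    // chaining VkDeviceGroupDeviceCreateInfo into VkDeviceCreateInfo::pNext,
    // after which every resource, barrier and present is managed per-GPU by hand.
    vkDestroyInstance(instance, nullptr);
    return 0;
}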
 
That would require developers taking the time and putting in the effort to make sure their games actually use mGPU properly. With DX12 and Vulkan there is no more crutch in the form of NV creating driver profiles for SLI for them. And developers have essentially abandoned the idea of implementing explicit mGPU in their games because it is an extremely niche feature that takes away development resources. So I would say that without such support no combination of cards would run over anything. mGPU didn’t die because NV and AMD stopped supporting it; it died because it was suboptimal in many cases and the new development paradigm meant that nobody could be arsed.
This is amusing because, in the past, with smaller budgets, developers created their own assets, maintained their own engines, and a wide range of games worked well with multi-GPU setups. Now, despite million-dollar budgets, third-party engines full of shortcuts, ready-made assets, and heavily automated processes… supposedly, mGPU has become "too complicated". :p
 
@Denver
A shitton of those “golden past” games were a nightmare of spaghetti code barely adhering to DX standards, often bailed out to a workable state by NVIDIA's and AMD's driver teams. I remember an AMA on Reddit with an ex-NV driver guy who straight up said that a lot of console ports in the late 00s to early 10s came to them essentially unshippable and they had to scramble with day-1 drivers. I think one of those might have been an Assassin's Creed. That includes the SLI implementation. Don't get me wrong, in the olden (relatively) times a lot of games were just as terribly made as now. It's just that now, with low-level APIs, the onus is entirely on the developers themselves, and NV and AMD can no longer bail them out.

Having their own engine is also a two-sided coin. I despise the modern over-reliance on Unreal, sure, and we had absolute black-magic stuff in the past like the original Source or MT Framework. But for every one of those there was a RAGE (an absolute piece of shit engine from what I heard from people who worked with it, and seemingly it continues to be one to this day), or Bethesda trotting out the battered corpse of NetImmerse/Gamebryo/Creation/whatever for the Xth time, or Hero Engine, or EA trying to force a square peg into a round hole with Frostbite for every game.
 
Correct; the irony is that it's a useless feature unless your GPU is powerful enough. Anything below a 5080 won't give you 60+ FPS with path tracing. You just get to pay a higher price for old-gen tech; ergo, an overpriced turd.
I wish I had that useless frame gen on my current weak GPU to push that 60 FPS to 120 or 180.
 
I wish I had that useless frame gen on my current weak GPU to push that 60 FPS to 120 or 180.

You can. Buy and download Lossless Scaling from Steam and you will find out exactly what I mean. The input lag is so bad the game isn't worth playing. You are seeing a smoother picture, sure, but you can't really play the game in "real time".
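For rough context on where that lag comes from: interpolation-based frame generation has to hold back the newest real frame so it can blend between two real frames. A back-of-the-envelope sketch (the 60 FPS base rate and the one-frame hold-back are illustrative assumptions, not measurements of Lossless Scaling or any other tool):

// Back-of-the-envelope: why interpolated "120 FPS" doesn't feel like real 120 FPS.
// All numbers here are illustrative assumptions, not measurements.
#include <cstdio>

int main() {
    const double base_fps = 60.0;               // assumed real rendered frame rate
    const double frame_ms = 1000.0 / base_fps;  // ~16.7 ms per real frame

    // Interpolating between the previous and the newest frame means the newest
    // frame is displayed roughly one frame time later than it otherwise would be.
    const double added_latency_ms = frame_ms;

    std::printf("base frame time: %.1f ms\n", frame_ms);
    std::printf("extra latency from holding back a frame: ~%.1f ms\n", added_latency_ms);
    std::printf("motion is shown at %.0f FPS, but input is still sampled at %.0f Hz\n",
                base_fps * 2.0, base_fps);
    return 0;
}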
 
Suddenly, RTX 40 cards are more valuable for backwards compatibility.


RTX 50 series silently removed 32-bit PhysX support
 
You can. Buy and download Lossless Scaling from Steam and you will find out exactly what I mean. The input lag is so bad the game isn't worth playing. You are seeing a smoother picture, sure, but you can't really play the game in "real time".
Not really comparable with DLSS FG or FSR FG.
I used both often, and I surely did play in "real time".

Suddenly, RTX 40 cards are more valuable for backwards compatibility.




Yes, totally, everybody will suddenly use PhysX in the 5 games that supported it 10 years ago.
At some point old software needs to be deprecated, if not for cleanup purposes then for security, so that no vulnerability suddenly appears in software that hasn't been updated for 10 years. I said the same when AMD dropped monthly driver support for older GPUs.
It's not a big deal.
 
I can't remember such a bad launch of an NVIDIA series since the 2xx series or so.
Bad product: safety, backwards compatibility, RAM, etc. Bad performance: barely improved over last gen. Bad price. Scarcity.

Is it planned, or can TPU report the internal shunt distribution for the 5090 in its reviews from now on? Could this be added to TPU's GPU database as a new field?

The only power we have as consumers is, thanks to this info, to boycott the awful power-input designs they are deliberately using.
Good shunts are not cheap, but neither are these boards: the BOM is far from the final price.
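To put some rough numbers on why per-rail shunts matter, here's a quick back-of-the-envelope sketch. The figures (~575 W board power, a 12 V rail, six current-carrying pin pairs on the 12V-2x6 connector, a ~9.5 A per-pin rating) are assumptions for illustration, not measured values for any particular card:

// Rough illustration of why per-pin/per-rail shunt monitoring matters on
// high-power cards. All figures are assumptions for the example only.
#include <cstdio>

int main() {
    const double board_power_w = 575.0;  // assumed board power limit
    const double rail_voltage  = 12.0;   // 12 V rail
    const int    power_pairs   = 6;      // 12V-2x6: six 12 V pin pairs
    const double pin_rating_a  = 9.5;    // assumed nominal per-pin rating

    const double total_current    = board_power_w / rail_voltage;   // ~48 A
    const double balanced_per_pin = total_current / power_pairs;    // ~8 A each

    std::printf("total current: %.1f A, balanced per pin: %.1f A\n",
                total_current, balanced_per_pin);

    // With a single shunt measuring only the summed current, the card cannot
    // tell whether one pin is quietly carrying far more than its share -- the
    // total still looks fine even while that one pin exceeds its rating.
    const double bad_pin_a = 20.0;  // hypothetical badly balanced pin
    std::printf("one pin at %.1f A exceeds a ~%.1f A rating by about %.0f%%\n",
                bad_pin_a, pin_rating_a, (bad_pin_a / pin_rating_a - 1.0) * 100.0);
    return 0;
}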
 
Not really comparable with DLSS FG or FSR FG.
I used both often, and I surely did play in "real time".

"Above all, do not lie to yourself. The man who lies to himself and listens to his own lie comes to a point that he cannot distinguish the truth within him, or around him, and so loses all respect for himself and for others."
 
Do you have the thermal readings for the Gigabyte 5090 Waterforce GPUs? I'd like to see how those compare with the MSI versions.
 
"Above all, do not lie to yourself. The man who lies to himself and listens to his own lie comes to a point that he cannot distinguish the truth within him, or around him, and so loses all respect for himself and for others."
Dramatic much?
Great poetry night, but it has nothing to do with the simple fact that Lossless Scaling does not have access to the same data that DLSS or FSR FG does. If you want to compare it with something, compare it with AFMF or Smooth Motion.
And yeah, believe it or not, not everybody is a leet gamer who can detect every 1 ms of input lag with their eyes closed.
 

Strange... when they mention poor efficiency I automatically think of the 295x2, because it was the first time I saw some Youtubers doing PSU CF. XD

But when I check the performance again, I'm reminded that the card was a monster, and it makes me miss the Multi-GPU.

It would greatly reduce the "sexy appeal" of the xx90 tier, more than the price already does. Two 9070/5070ti would brutally run over that.

Ah, yes, the behemoth 295X2. 2014 was around the time I really got into the PC DIY hobby, and I remember I saw that card in a computer magazine. It was a review. I thought it was crazy cool to see a card with two GPUs on it. :D
 
What's wrong with Gigabyte? Is it that hard to make a good cooling solution? Most of the time a Gigabyte GPU is the loudest or hottest. Even their huge 3-slot, 3-fan solutions are pretty bad.
The "wrong" is when their fans rattle. I HOPE this is not an issue with SUCH cards though... :D:rolleyes::oops:
 

Strange... when they mention poor efficiency I automatically think of the 295x2, because it was the first time I saw some Youtubers doing PSU CF. XD

But when I check the performance again, I'm reminded that the card was a monster, and it makes me miss the Multi-GPU.

It would greatly reduce the "sexy appeal" of the xx90 tier, more than the price already does. Two 9070/5070ti would brutally run over that.
It was a very bad, slow GPU that used a lot of power.
Even a 980 Ti was faster at 1080p, and an overclocked 980 Ti was faster at 1440p. There were also a lot of stutter problems in games with the 295X2.

You can. Buy and download Lossless Scaling from Steam and you will find out exactly what I mean. The input lag is so bad the game isn't worth playing. You are seeing a smoother picture, sure, but you can't really play the game in "real time".
There is no input lag like that when using FG.
 
That would require developers taking the time and putting in the effort to make sure their games actually use mGPU properly. With DX12 and Vulkan there is no more crutch in the form of NV creating driver profiles for SLI for them. And developers have essentially abandoned the idea of implementing explicit mGPU in their games because it is an extremely niche feature that takes away development resources. So I would say that without such support no combination of cards would run over anything. mGPU didn’t die because NV and AMD stopped supporting it; it died because it was suboptimal in many cases and the new development paradigm meant that nobody could be arsed.

Due to the added DLSS 4 latency, dual-GPU setups are being (successfully) tested right now.

Still in the early stages, though.
 