
NVIDIA GeForce RTX 5090 Founders Edition

There is a lot of information to catch up on.

Suddenly AMD cards don't look that bad. I really wonder if "high end" and "mid range" segment cards automatically have higher power consumption in scenarios like video playback, idle, and single-monitor use.
Memory overclocking troubles ... Please do it better than AMD ... so I have an excuse to say, look, Nvidia is awesome.

That fan stuff I expected before I saw any reviews.
  • Very high idle/multi-monitor/video playback power consumption
  • Fans are "not quiet" during gaming
  • Memory overclocking artificially limited by the driver
 
I don't think Mcgee intended any anger or malice.
Neither do I. I wasn't sarcastic with my "hurray," I was genuinely happy to see my tomfoolery having a proper sequel.
 
Good luck finding one. The already "unavailable most of the time" edition with "very limited stock".

Yeah, will have to get an AIB partner card with a block to suit. I've never purchased a founders. Will be waiting for reviews of partner cards.

Looking forward to power limiting this card for better efficiency. Should double my frames over my current 4080.

The 4090's MSRP of $1600 plus the 5090's ~35% higher average fps gives me $2160, so not too shabby tbh. It's just that getting a 5090 @ MSRP is going to be the problem.
 
I've never purchased a founders
You're not alone. These are quite a rare sight here in Russia, always with an extortionate price tag. I prefer buying a higher-tier model at this point.
 
Tech Yes did an undervolt.

Thank you! I'll have to check the video out.

Computerbase tested it. Unlike the 4090, it scales with more power so lowering the power limit does come with tradeoffs, depending upon the game. Space Marine 2 is probably the worst case in their benchmark suite.

[Attachment: Computerbase power-scaling chart]

Appreciate it. So between a 12-15% uplift over the 4090 when limited to 450 W.


More UV testing.

[Attachment: undervolting results chart]

No performance loss at 90 W savings.

6% performance loss at 170 W savings.

Not bad, I think this scaling is more or less in line with the 4090's (but with the wattage values shifted up due to the higher base TDP of the 5090).
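As a rough sanity check on those figures, here's a minimal sketch of what they imply for performance per watt. It assumes the 5090's 575 W stock board power as the baseline (a number not quoted in this thread) and takes the two data points above at face value:

```python
# Rough perf-per-watt arithmetic for the undervolting results quoted above.
# Assumption: 575 W stock board power for the RTX 5090 (not taken from this thread).
STOCK_POWER_W = 575.0

results = [
    # (watts saved, fraction of stock performance retained)
    (90, 1.00),   # "No performance loss at 90 W savings"
    (170, 0.94),  # "6% performance loss at 170 W savings"
]

for saved_w, perf in results:
    limited_w = STOCK_POWER_W - saved_w
    # perf/W relative to stock: (perf / limited_w) divided by (1.0 / STOCK_POWER_W)
    gain = (perf / limited_w) * STOCK_POWER_W - 1.0
    print(f"{limited_w:.0f} W limit: {perf:.0%} performance, "
          f"~{gain:.0%} better perf/W than stock")
```

If that math holds, the 405 W case works out to roughly a third better performance per watt. The limit itself can be set with `nvidia-smi -pl <watts>` (administrator rights required), though the allowed range depends on the card and driver.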

Hopefully TPU does an in-depth power limiting / UV piece with more than just game performance. From what I can tell with my 4090, performance doesn't suffer as much in AI tasks from more aggressive power limiting. This is where the faster VRAM might be especially beneficial.
 
A new GPU generation should be about 50% less power with 50% more performance; this GPU for me is just a 4090 Ti. You can expect a true next-generation GPU with 2 nm. This is the reason Nvidia decided not to charge $3k plus for it.
 
It is a 4090 Ti; power, specs, and performance all increased linearly, as did price and noise. Very disappointed, but I have to get it.
 
A full cover water block that cools everything on both sides of the PCB is very justified for this beast.
 
Not sure the Frame Gen analysis is very useful. Sure, you can measure the stupid number of "frames", but I don't think that's the best measure of this technology. I think it calls for a more subjective analysis: how does a game feel with the technology, better or worse?
 
"On the other hand, the $2000 price point is extremely high, +$400 over the RTX 4090. The RTX 5090 is the best, be prepared to pay for it."

No! What is happening now is that nVidia are price gouging their customers. Sorry @W1zzard, but with respect, you are unfortunately re-amplifying the gaslighting enshittification that nVidia has been pushing. But what's worse is that people are too short-sighted, and the idiots will embolden nVidia et al.
 
Finally, Nvidia took the crown from Intel on the video playback power consumption! Yesss! :rockout:
 
I doubt many understand what efficiency is.

I pay for the power at the wall socket. And the power supply unit should have decent efficiency in the lower regions too. That range, from 1 Watt to 70 Watts on the output of the power supply unit, is hardly ever tested.
A first measurement point at 10 percent load, i.e. 150 Watts on the power supply output, is kinda nonsense.
You will be surprised when you find efficiency curves for 230 V AC or 110 V AC in the 1 to 100 Watt region for your power supply unit.
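To illustrate the point with a hypothetical example (the efficiency figures below are placeholder values chosen only for their typical shape, not measurements of any real unit), low-load losses are what you actually pay for at the wall:

```python
# Hypothetical wall-socket draw for a given DC output at light loads.
# The efficiency values are made-up placeholders, not measured data.
load_points = [
    # (DC output in watts, assumed PSU efficiency at that load)
    (5,   0.55),
    (30,  0.75),
    (70,  0.85),
    (150, 0.92),  # roughly the usual "10% load" point on a 1500 W unit
]

for output_w, eff in load_points:
    wall_w = output_w / eff
    print(f"{output_w:>3} W out -> ~{wall_w:.0f} W from the wall "
          f"({wall_w - output_w:.0f} W lost as heat)")
```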
 
What I learned from this review: this is a great card for Counter Strike 2 players.

[Attachment: Counter-Strike 2 benchmark chart]


Haha, kidding aside, maybe it is time to remove Counter Strike 2 and Witcher 3 from reviews. Witcher 3 is 10 years old this year. CS2 is just more CS, a high frame rate game for old GPUs.

What else did I learn? The Radeon 7900 GRE and 7800 XT are still not dethroned. Hope we see movement on performance per dollar with the RTX 5070. Looking forward to the review of that card, a card that actually might make sense, unlike the awful 5090 card.

We really need 16GB though, so I imagine most heavy gamers are most interested in the RTX 5070 Ti review.
 

I considered Final Fantasy 16, but decided against it because it's not a big commercial hit ...
Bad perf & timing (4 months ago); I don't think anyone in their right mind would consider picking up a minimum $750 card to play it at 1440p, when the new gen's about to come out, if they were holding onto a 30 series.
I saw the numbers Computerbase was reporting for XVI today, but they used DLSS at 1440p; imo DLSS is a band-aid to make playable 4K possible.
@W1zzard would you reconsider FFXVI for the 80/70 tier releases?
 

[Attachment: cb5090.png]
The 5070 may get the biggest gains from lots of extra bandwidth, but then it's still 12 GB only...
 
"On the other hand, the $2000 price point is extremely high, +$400 over the RTX 4090. The RTX 5090 is the best, be prepared to pay for it."

No! What is happening now is that nVidia are price gouging their customers. Sorry @W1zzard, but with respect, you are unfortunately re-amplifying the gaslighting enshittification that nVidia has been pushing. But what's worse is that people are too short-sighted, and the idiots will embolden nVidia et al.
Seriously? W1z doesn't set prices for GPUs. All he can do is review and critique what is delivered. And really, who are YOU calling short-sighted?
 
not sure what some of you clowns were expecting. it's made on the same node as the 4090 AND it has to power an extra 4 (corrected, said 8 initially because of a brain fart, but point still remains) memory chips. GDDR7 uses 11% less power, but there are more memory modules to feed power to. that alone adds around 50W more power draw to the card. people were screaming for more memory and more compute. well, you got it. now the same people are screaming that it's too much

the card did better than i expected in every way imaginable. i would buy one no questions asked, but i don't have time to play games any more. i got old :(
 
not sure what some of you clowns were expecting. it's made on the same node as the 4090 AND it has to power an extra 8 memory chips.

Only 4 more memory chips. The problem with it is that 33% more SMs were added to each GPC instead of giving it 4 extra GPCs, so it retains the same 176 ROPs.

What I care about is the new RTX 5080's performance compared with the RTX 4090. If it doesn't beat it by 30% at least, this generation is one of the worst in nVidia's history.

With MFG it's 2.4x faster than the 4080. And it's not going to matter that those are fake frames.
You do get smoother frames; the 5070 is a 4090 in performance. This is seriously impressive.
 
I'm still confused about how the memory clock speed is 1750 MHz, which would indicate 16 bits transferred per clock to get to 28 GT/s. But GDDR7 has PAM3 instead of GDDR6X's PAM4, so wouldn't it transfer 12 bits per clock and need a clock speed of 2333 MHz to achieve 28 GT/s? Someone enlighten me, please.
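For what it's worth, here is the arithmetic in that question laid out explicitly. This only restates the two hypotheses; it doesn't settle which clocking scheme GDDR7 actually uses:

```python
# Restating the arithmetic from the question above, nothing more.
data_rate_mtps = 28_000.0    # 28 GT/s per pin = 28,000 MT/s
reported_clock_mhz = 1750.0  # memory clock as reported by the review

# Hypothesis 1: transfers per reported clock if 1750 MHz is the real memory clock
print(data_rate_mtps / reported_clock_mhz)  # -> 16.0 transfers per clock

# Hypothesis 2: clock needed if only 12 transfers happen per clock (PAM3 assumption)
print(data_rate_mtps / 12)                  # -> ~2333 MHz
```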
 
Not sure if anyone's posted this already, but someone has uploaded the DLL file for DLSS 4, along with an updated Profile Inspector that allows you to force the new transformer model (preset J).

I don't know if it still has to be done in the Base Profile, but this used to be the case for DLSS3.

 