Friday, September 23rd 2022

NVIDIA RTX 4090 Boosts Up to 2.8 GHz at Stock Playing Cyberpunk 2077, Temperatures around 55 °C

The NVIDIA GeForce RTX 4090 is turning out to be a cool operator, with the GPU reportedly boosting up to 2.8 GHz (2810 to 2850 MHz) at stock settings when playing Cyberpunk 2077 at 1440p in its "Psycho" settings preset, with DLSS and Reflex disabled. At native resolution, the RTX 4090 scores 59 FPS (49 FPS at 1% lows), with a latency of 72 to 75 ms. Even at 100% GPU utilization, the card barely breaks a sweat, with GPU temperatures reported in the region of 50 to 55 °C. With DLSS 3 enabled, the frame-rate nearly doubles, with 1% lows of 119 FPS and an average latency of 53 ms: a net 2x gain in frame-rate with latency reduced by roughly a third. Power draw is also said to be significantly reduced: the card pulls up to 461 W when rendering at native resolution, but this drops to 348 W with DLSS 3 set to "quality," a 25% reduction.
Source: Wccftech
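
As a quick sanity check of the reported figures (a minimal sketch in Python; the numbers come straight from the report above and are not independently verified):

# Reported RTX 4090 Cyberpunk 2077 figures, native vs. DLSS 3 "quality"
native_fps, dlss3_low_fps = 59, 119               # native average vs. DLSS 3 1% lows
native_latency_ms, dlss3_latency_ms = 72.0, 53.0
native_power_w, dlss3_power_w = 461.0, 348.0

print(f"Frame-rate gain:   {dlss3_low_fps / native_fps:.2f}x")               # ~2.02x
print(f"Latency reduction: {1 - dlss3_latency_ms / native_latency_ms:.0%}")  # ~26%, roughly a third
print(f"Power reduction:   {1 - dlss3_power_w / native_power_w:.0%}")        # ~25%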

77 Comments on NVIDIA RTX 4090 Boosts Up to 2.8 GHz at Stock Playing Cyberpunk 2077, Temperatures around 55 °C

#26
Sir_Coleslaw
I was hoping to see more hybrid AIO 2-slot cards, to counteract this trend of needing more and more slots.
#27
The Quim Reaper
Why would you spend nearly $2000 to play at 1440(p)otato resolution...
#28
Unregistered
$1,600 (let's be honest, it's more like $2,000 in the real world) for the RTX name and 3rd-gen RT cores, and it cannot reach 60 fps at 1440p, let alone 4K.
A waste of silicon. I thought that by the next gen of GPUs ray tracing would stop being a gimmick, but it seems we are still far away.
#29
JalleR
They have borrowed Intel's chiller from the 28-core 5 GHz demo :D
#30
b1k3rdude
djuiceCould be hitting a CPU bottleneck at 170+ fps in CP2077; that's probably why it uses less power... seems logical.
So I'm not feeling the wow factor of the 4090. I'm running a 3080 12 GB and a 5900X at 3440x1440, no RT, DLSS 2 on Auto, and am getting 124 fps on average at 350 W. On my LG G-Sync-compatible monitor, the gameplay is flawless imho. I may upgrade to a 4080 16 GB down the road, but more than likely my next upgrade will be to an RX 7800/7900 card, as I'm betting the power draw will be similar to my current card, with rasterisation performance somewhere between the RTX 3000 and RTX 4000 series. I have never used RT, because the visual candy isn't all that noticeable when playing and isn't worth the performance hit. I imagine the RT performance of the new RDNA 3 cards will match or beat that of the RTX 3000 series.
#31
ratirt
Xex360$1,600 (let's be honest, it's more like $2,000 in the real world) for the RTX name and 3rd-gen RT cores, and it cannot reach 60 fps at 1440p, let alone 4K.
A waste of silicon. I thought that by the next gen of GPUs ray tracing would stop being a gimmick, but it seems we are still far away.
It won't stop being a gimmick. RT is driving the sales, and NV has now reached a record $1,600 price for the top card in the segment. I remember the 1080 Ti cost $600. A $1k increase in the price of one card within 5 years? Yeah, it is 5 years since the 1080 Ti was released, or am I making things up? Anyway, Huang can shove those cards up his, to be fair, with the RT promises and DLSS, when the new DLSS 3 does not work on older-gen cards which still cost so damn much. How does support for DLSS 2 look now? Is there going to be continued support for it? I doubt developers will spend time implementing both DLSS 2 and DLSS 3 for individual lines of cards. Heck, the 5000-series cards will probably get an exclusive DLSS 4.
That is an insult and a spit straight in gamers' faces.
#32
Unregistered
ratirtIt won't stop being a gimmick. RT is driving the sales, and NV has now reached a record $1,600 price for the top card in the segment. I remember the 1080 Ti cost $600. A $1k increase in the price of one card within 5 years? Yeah, it is 5 years since the 1080 Ti was released, or am I making things up? Anyway, Huang can shove those cards up his, to be fair, with the RT promises and DLSS, when the new DLSS 3 does not work on older-gen cards which still cost so damn much. How does support for DLSS 2 look now? Is there going to be continued support for it? I doubt developers will spend time implementing both DLSS 2 and DLSS 3 for individual lines of cards. Heck, the 5000-series cards will probably get an exclusive DLSS 4.
That is an insult and a spit straight in gamers' faces.
I think you also have to blame the tech media (which for the most part are a bunch of philistines and fanboys). I remember one outlet praising ray tracing in Control while the damn mug had 2 polygons.
#33
Vayra86
ir_cowIt's 55 °C because the heatsink is nearly 4 slots lol. Soon the case will be just a giant heatsink, kinda like a laptop.
No, it's 55 °C because the GPU isn't fully loaded; this is also why you get peak boost clocks on earlier generations. But at 100% load, suddenly you'll be dropping boost bins left and right.

Yawnfest... we're looking at CP2077 at 1440p...
#34
AusWolf
Bah... my 6500 XT runs at 2.95 GHz all day and night... and at 55-60 °C, with a cooler that's half the size of the one on the 4090. :pimp:

But seriously, when will the gaming community learn that clock speed alone doesn't mean anything?
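
(As an illustration of why raw clocks mislead: peak FP32 throughput scales with shader count as well as clock. A minimal sketch, using the clock figures from this thread and publicly listed shader counts; treat the numbers as rough.)

# Peak FP32 throughput ~= 2 ops (FMA) x shader count x clock; clocks alone say little.
cards = {"RX 6500 XT": (1024, 2.95), "RTX 4090": (16384, 2.8)}  # (shaders, GHz)
for name, (shaders, ghz) in cards.items():
    tflops = 2 * shaders * ghz / 1000
    print(f"{name}: {tflops:.1f} TFLOPS at {ghz} GHz")
# The 6500 XT clocks higher, yet lands around 6 TFLOPS vs. ~92 TFLOPS.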
#35
P4-630
I also see low GPU temps when I've just started up a game...
#36
Micko
djuiceCould be hitting a CPU bottleneck at 170+ fps in CP2077; that's probably why it uses less power... seems logical.
One of the advantages of DLSS 3 frame generation is that the CPU is not involved in creating the interpolated frames. So in this case the CPU is rendering 85 fps on average.

That is why it looks like DLSS 3 will be a godsend for notebooks, which have lower-clocked CPUs and energy limitations for both CPU and GPU.
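
(To put numbers on that: a minimal sketch of interpolation-based frame generation, where every other displayed frame is GPU-generated. The 170 fps figure is from the quoted post, and the straight halving is a simplification, not NVIDIA's exact pipeline.)

# With one interpolated frame between each pair of rendered frames, the display
# rate roughly doubles while the CPU/engine only simulates half the frames.
displayed_fps = 170                    # figure from the quoted post
engine_fps = displayed_fps / 2         # frames the CPU actually has to prepare
print(f"Engine/CPU renders ~{engine_fps:.0f} fps; "
      f"the other ~{displayed_fps - engine_fps:.0f} fps are interpolated on the GPU.")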
#37
TheinsanegamerN
fancuckerIt's 4 slots because the industry has to deal with ridiculous expectations from consumers and deliver biannually, which Nvidia miraculously does, all while innovating on multiple fronts (DLSS, NVENC). If you're unhappy, there's always a 2-slot 3XXX series card until you buy a proper case :).
Is this trolling or bait? :confused:
#39
tehehe
Dirt ChipSo the quality setting for DLSS 3 determines the power use. Nice.

You can have low fps with high power or high fps with low power.

Reminds me of this:
What do you prefer: to be healthy and rich, or poor and ill?
DLSS 3 is not all roses and sunshine. You will get better fps, but latency will not improve. Let's assume you get 30 fps without DLSS 3 (33.3 ms latency); then no matter how many frames you generate outside of the game engine (60 or 3,000 fps), you will still have that 33.3 ms input/world-update latency, because the game engine doesn't know about those frames. It's a mixed bag and IMO not something worth buying the 4000 series for, because you only get half the coin of the promised performance (throughput, but not latency).
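
(The arithmetic, as a minimal sketch under the commenter's assumption of a 30 fps engine rate and frames generated purely by interpolation:)

# Input latency tracks the engine's update rate, not the displayed frame rate.
engine_fps = 30
latency_ms = 1000 / engine_fps                # 33.3 ms between world updates
for multiplier in (1, 2, 4, 100):             # 1 = frame generation off
    displayed_fps = engine_fps * multiplier
    print(f"{displayed_fps:>5} fps displayed -> input latency still ~{latency_ms:.1f} ms")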
#40
mrthanhnguyen
Monster AIB card with an Optimus waterblock. I think I will need a bigger case.
#41
Punkenjoy
teheheDLSS 3 is not all roses and sunshine. You will get better fps, but latency will not improve. Let's assume you get 30 fps without DLSS 3 (33.3 ms latency); then no matter how many frames you generate outside of the game engine (60 or 3,000 fps), you will still have that 33.3 ms input/world-update latency, because the game engine doesn't know about those frames. It's a mixed bag and IMO not something worth buying the 4000 series for, because you only get half the coin of the promised performance (throughput, but not latency).
I think DLSS 3 will be amazing for in-game cutscenes, as they would be able to crank up the details without tanking the frame rate too much. Maybe also in slow-paced games like point-and-click titles.

But for any fast action game, I think it will not be worth it. It would lead to a strange feel, and since you don't know your base FPS (the one the simulation runs at), you won't know when you need more frames.

A game that feels like that is No Man's Sky: in some good games I can handle 50-70 fps and be fine with it, whereas in a game like NMS I set it up to run at over 120 fps and it still feels janky.
#42
sagbobbit
fancuckerIt's 4 slots because the industry has to deal with ridiculous expectations from consumers and deliver biannually, which Nvidia miraculously does, all while innovating on multiple fronts (DLSS, NVENC). If you're unhappy, there's always a 2-slot 3XXX series card until you buy a proper case :).
- Don't use a :kookoo: amount of electricity
- At least offer better price/performance than the previous generation
- Don't be a dickbag to the dwindling pool of companies that seem to somewhat care about consumers (RIP EVGA)
- And gawdayam, don't cost $1,000 for an xx80 and $1,500 for anything.
Are those really that unreasonable??
#43
DeeJay1001
djuiceCould be hitting a CPU bottleneck at 170+ fps in CP2077; that's probably why it uses less power... seems logical.

If these numbers are true, the 4090 is 67% faster than a 3090 Ti, apples to apples.

Nvidia claimed 2X-4X

This is why we don't trust them.
#44
maxfly
mrthanhnguyenMonster AIB card with an Optimus waterblock. I think I will need a bigger case.
If AIBs stick to the reference PCB, you can go ITX, hehe.
Check out the Alphacool 4090 article.
#45
dyonoctis
On topic: I was really surprised when I saw those temps, and questioned why every AIB design was so massive. Something is odd there.
#46
bug
DeeJay1001If these numbers are true, the 4090 is 67% faster than a 3090 Ti, apples to apples.

Nvidia claimed 2X-4X

This is why we don't trust them.
Nvidia talks raw HP. That doesn't translate directly to FPS, the same way more HP in a car doesn't translate directly into a higher top speed.
#47
MentalAcetylide
ir_cowIt's 55 °C because the heatsink is nearly 4 slots lol. Soon the case will be just a giant heatsink, kinda like a laptop.
Heh, with cards like the 4090 it's already reached the point where you need a case that supports at least 7-10 fans for proper airflow. IMO, with anything less you just end up with too much heat building up inside the case.
#48
DeeJay1001
bugNvidia talks raw HP. That doesn't translate directly to FPS, the same way more HP in a car doesn't translate directly into a higher top speed.



This chart says "Relative Performance," which is by definition NOT "raw HP."
#49
Upgrayedd
W1zzardIf that is the actual temp on the final cards, then the fan speed/noise balance is terrible and someone at NVIDIA will get fired after my review
Fired for what exactly? Having a slightly louder card for a much cooler card? Seems like a good tradeoff to me. Most people play games while the card is heating up, so GPU noise is drowned out.

I use open-back headphones to play games. GPU fan noise isn't really a big deal.
#50
AusWolf
UpgrayeddFired for what exactly? Having a slightly louder card for a much cooler card? Seems like a good tradeoff to me. Most people play games while the card is heating up, so GPU noise is drowned out.

I use open-back headphones to play games. GPU fan noise isn't really a big deal.
I disagree. I only use headphones for music, so fan noise is my greatest enemy while playing games. That's why I couldn't bear my 2070 and bought a 6500 XT instead.