Friday, September 23rd 2022

NVIDIA RTX 4090 Boosts Up to 2.8 GHz at Stock Playing Cyberpunk 2077, Temperatures around 55 °C

NVIDIA GeForce RTX 4090 is turning out to be a cool operator, with the GPU reportedly boosting up to 2.8 GHz (2810 to 2850 MHz) at stock settings when playing Cyberpunk 2077 at 1440p in its "psycho" settings preset, with DLSS and Reflex disabled. At native resolution, the RTX 4090 scores 59 FPS (49 FPS at 1% lows), with an end-to-end latency of 72 to 75 ms (at 59 FPS, render frame-time works out to roughly 17 ms, so the reported figure can only be latency). At 100% GPU utilization, the card barely breaks a sweat, with GPU temperatures reported in the region of 50 to 55 °C. With DLSS 3 enabled, frame-rates more than double, with 1% lows rising to 119 FPS, and average latency drops to 53 ms. That is a better-than-2X gain in frame-rate with latency reduced by over a quarter. Power-draw is also said to be significantly reduced: the card pulls up to 461 W when rendering at native resolution, but this drops to 348 W with DLSS 3 on its "quality" preset, a 25% reduction.
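As a quick sanity check of the figures above (the only assumption is taking the midpoint of the 72-75 ms latency range):

```python
# Sanity check of the reported RTX 4090 figures (numbers from the article).
native_fps = 59
native_latency_ms = (72 + 75) / 2  # midpoint of the reported range (assumption)
dlss3_latency_ms = 53
native_power_w = 461
dlss3_power_w = 348

# At 59 FPS a frame takes ~17 ms, so the 72-75 ms figure has to be
# end-to-end latency rather than render frame-time.
print(f"frame-time at {native_fps} FPS: {1000 / native_fps:.1f} ms")  # 16.9 ms

# Latency drop with DLSS 3: ~28%, i.e. a bit over a quarter.
print(f"latency reduction: {1 - dlss3_latency_ms / native_latency_ms:.0%}")

# Power drop with DLSS 3 "quality": ~25%, matching the article's claim.
print(f"power reduction: {1 - dlss3_power_w / native_power_w:.0%}")
```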
Source: Wccftech

77 Comments on NVIDIA RTX 4090 Boosts Up to 2.8 GHz at Stock Playing Cyberpunk 2077, Temperatures around 55 °C

#1
ir_cow
It's 55 °C because the heatsink is nearly 4 slots, lol. Soon the case will be just a giant heatsink, kinda like a laptop.
Posted on Reply
#2
Dirt Chip
So the quality setting for DLSS 3 determines the power use. Nice.

You can have low fps with high power or high fps with low power.

Reminds me of this:
What do you prefer: to be healthy and rich, or poor and ill?
Posted on Reply
#3
fancucker
It's 4 slots because the industry has to deal with ridiculous expectations from consumers and deliver biannually, which Nvidia miraculously does, all while innovating on multiple fronts (DLSS, NVENC). If you're unhappy, there's always a 2-slot 3XXX series card until you buy a proper case :).
Posted on Reply
#4
Hyderz
Now that GPUs are getting massive, I wonder if they can redesign the motherboard...
one socket for the CPU and one socket for the GPU, and then you add a custom fan yourself or watercool it,
instead of the massive heatsink block we get now, hanging off the frail PCIe bracket and needing a support column to hold the GPU up.
Cheaper motherboards don't even have metal-reinforced PCIe slots.
Posted on Reply
#5
sepheronx
Cool

Now, still pissed that:
1) I'll need a new PSU for this thing if I were to get it (yeah, my ITX PSU doesn't have enough PCIe connections for the adapter). ATX 3.0 is nice, I guess, but they kind of pushed this out quickly before the PSUs were really available. At least here.
2) To sidestep the lack of real gain in performance, they need to use AI to "simulate" frames by injecting frames between frames. DLSS 2.0 was nice in that it just did upscaling. This new function isn't a good thing, in my opinion. I could say more about how this will just make developers lazier, but yeah, I think most of you get the idea.
3) Price. Yes, I know, you want to make your money, but it's hard with so many 3000 series cards still sitting unsold on the market. Why not drop the prices of the 3000 series and offer the 4000 series at a reasonable price instead?
4) Does it need to be so big?
Posted on Reply
#6
Crackong
A reduction in power draw simply means something within the silicon is already maxed out and has become an internal bottleneck before the actual compute does.

In this case it means the DLSS hardware is maxed out and cannot take any more from the pipeline.
Posted on Reply
#7
chrcoluk
ir_cow: It's 55 °C because the heatsink is nearly 4 slots, lol. Soon the case will be just a giant heatsink, kinda like a laptop.
Well, in one of your motherboard threads I think, some of us mentioned the direction PCs are going with the reduction of PCIe slots; it's starting to get to the point that the design will be centred just around CPUs and GPUs.
Posted on Reply
#8
steen
Dirt Chip: You can have low fps with high power or high fps with low power.
Sure, higher-perf DLSS modes mean less work for the GPU & higher fps. Native rendering sees higher GPU use, more power draw & lower fps. A native-vs-native rendering comparison will be interesting. Also GA (Ampere) vs AD (Ada) RT without NVAPI RT hooks will be interesting. Let's see who follows the reviewer's guide closest...
sepheronx: 2) To sidestep the lack of real gain in performance, they need to use AI to "simulate" frames by injecting frames between frames. DLSS 2.0 was nice in that it just did upscaling. This new function isn't a good thing, in my opinion. I could say more about how this will just make developers lazier, but yeah, I think most of you get the idea.
Can I ask why? DLSS 2 just reduces precision within a frame & DLSS 3 reduces precision between frames. Logical next step, if you ask me.
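A toy illustration of the distinction being drawn here, treating DLSS 3 as frame interpolation and modeling frames as plain floats; the midpoint stands in for optical-flow-guided synthesis, so this is illustrative only, not NVIDIA's actual pipeline:

```python
# Illustrative only: DLSS 2 reconstructs detail WITHIN each rendered frame,
# while DLSS 3 additionally synthesizes frames BETWEEN rendered ones.
def generate_intermediate(prev_frame: float, next_frame: float) -> float:
    # Stand-in for optical-flow-guided synthesis: midpoint of the neighbors.
    return (prev_frame + next_frame) / 2

def frame_generation(rendered: list[float]) -> list[float]:
    """Interleave one synthesized frame between each pair of rendered
    frames, roughly doubling the presented frame-rate."""
    out = []
    for prev, nxt in zip(rendered, rendered[1:]):
        out.append(prev)                              # real, rendered frame
        out.append(generate_intermediate(prev, nxt))  # generated frame
    out.append(rendered[-1])
    return out

print(frame_generation([0.0, 1.0, 2.0, 3.0]))
# [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0] -> 7 presented frames from 4 rendered
```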
Posted on Reply
#9
djuice
Could be hitting a CPU bottleneck at 170+ fps in CP2077; that's probably why it uses less power. Seems logical.

Posted on Reply
#10
sepheronx
steen: Sure, higher-perf DLSS modes mean less work for the GPU & higher fps. Native rendering sees higher GPU use, more power draw & lower fps. A native-vs-native rendering comparison will be interesting. Also GA (Ampere) vs AD (Ada) RT without NVAPI RT hooks will be interesting. Let's see who follows the reviewer's guide closest...

Can I ask why? DLSS 2 just reduces precision within a frame & DLSS 3 reduces precision between frames. Logical next step, if you ask me.
I thought DLSS 2 was simply a scaling feature? If it reduces precision within a frame, that explains the blurriness.

With that said, I am more into raw performance than using tricks to make games playable. Something that needs to be addressed, imo.
Posted on Reply
#11
ZoneDymo
55 degrees is nice, but it makes it even sillier that the card has such an insanely large cooler... like, what is the point?
Or are they banking on people just OCing the heck out of it and getting an 800 W power draw from the GPU alone?

Also, why does the power consumption drop? I would think that, regardless of settings, the GPU works as hard as it can to pump out as many frames as it can, right?
If it would draw the full 460 watts, would one get higher fps than a mere 110? (And yes, I say a mere because it's a brand-new $1,600 card running Cyberpunk... with DLSS... imo it should do 110 without any aid, but that's just me.)
(It's probably like djuice mentioned, a bottleneck.)
Posted on Reply
#12
Rahmat Sofyan
With a really big cooler, it should be and must be like that...

otherwise, what's it for?
Posted on Reply
#13
Ferrum Master
Winter is coming... heating is expensive...

We have a solution for you. Game on the RTX 4000 series... overall system consumption of over 600 W, minimum. Don't fuss with that DLSS 3, just leave your sweater hanging and enjoy the steaming hot air from the PC.
Posted on Reply
#14
Dirt Chip
ZoneDymo: 55 degrees is nice, but it makes it even sillier that the card has such an insanely large cooler... like, what is the point?
Or are they banking on people just OCing the heck out of it and getting an 800 W power draw from the GPU alone?

Also, why does the power consumption drop? I would think that, regardless of settings, the GPU works as hard as it can to pump out as many frames as it can, right?
If it would draw the full 460 watts, would one get higher fps than a mere 110? (And yes, I say a mere because it's a brand-new $1,600 card running Cyberpunk... with DLSS... imo it should do 110 without any aid, but that's just me.)
(It's probably like djuice mentioned, a bottleneck.)
This cooler will take you to 650 W with >3 GHz on the maxed-out Ada silicon, aka the 4090 Ti.
Posted on Reply
#15
Chomiq
Something doesn't add up here. Either that cooler is super optimized or that's BS data.

460 W to achieve 59 fps at psycho in 1440p, with 98% load on the GPU, and yet somehow it stays at 50-55 °C?

To me this looks like going from a cold boot straight into the game (seeing how the lowest temp is 33 °C). There wasn't enough time for temps to stabilize.
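A back-of-envelope number behind this skepticism, assuming roughly 25 °C ambient and steady state (both assumptions, not from the leak):

```python
# Effective thermal resistance the cooler would need to hold 55 C at 460 W,
# assuming ~25 C ambient and steady state (assumptions, not leaked data).
power_w = 460
gpu_temp_c = 55
ambient_c = 25

r_thermal = (gpu_temp_c - ambient_c) / power_w
print(f"required thermal resistance: {r_thermal:.3f} C/W")  # ~0.065 C/W
# That is exceptionally low for an air cooler, which is why the reading
# suggests either a huge, very effective heatsink or a run that never
# heat-soaked.
```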
Posted on Reply
#16
Hyderz
Chomiq: Something doesn't add up here. Either that cooler is super optimized or that's BS data.

460 W to achieve 59 fps at psycho in 1440p, with 98% load on the GPU, and yet somehow it stays at 50-55 °C?

To me this looks like going from a cold boot straight into the game (seeing how the lowest temp is 33 °C). There wasn't enough time for temps to stabilize.
Let's just wait for benchmarks to see whether all of NVIDIA's claims are true or not.
Posted on Reply
#17
djuice
Chomiq: Something doesn't add up here. Either that cooler is super optimized or that's BS data.

460 W to achieve 59 fps at psycho in 1440p, with 98% load on the GPU, and yet somehow it stays at 50-55 °C?

To me this looks like going from a cold boot straight into the game (seeing how the lowest temp is 33 °C). There wasn't enough time for temps to stabilize.
Or it could be that the 4-slot HS with 12+ heatpipes plus a vapor chamber is extremely effective at cooling 450 W. All the 4090 AIB coolers I've seen so far seem way larger than what's been seen on the 3090 Tis.
Posted on Reply
#18
AsRock
TPU addict
Chomiq: Something doesn't add up here. Either that cooler is super optimized or that's BS data.

460 W to achieve 59 fps at psycho in 1440p, with 98% load on the GPU, and yet somehow it stays at 50-55 °C?

To me this looks like going from a cold boot straight into the game (seeing how the lowest temp is 33 °C). There wasn't enough time for temps to stabilize.
Temps noted are with DLSS 3, maybe? (CPU maxing out with DLSS), and it's when it's turned off that the power draw gets much higher? On top of that, I bet a lot of other things are getting hot too.

No hot spot shown, no VRM temps shown; it's basically bare bones and lacks a lot of detail.
Posted on Reply
#21
djuice
If you compare the non-DLSS results between the 3090 Ti and the 4090, it's only a 62% increase in FPS. This is with the 4090 having 55% more CUDA cores, ~40% higher clock speed, and 90 MB more L2 cache (96 MB vs 6 MB on the 3090 Ti). This does not bode well for the 4080 16 GB's results against the 3090 Ti in non-DLSS or RT-heavy workloads.
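As a rough illustration of the gap being described, treating cores times clock as a naive upper bound on throughput (real games rarely scale linearly, so this is only a sketch):

```python
# Naive scaling check: cores x clock as an upper bound on throughput gain.
cuda_core_gain = 1.55  # "55% more CUDA cores"
clock_gain = 1.40      # "~40% higher clock speed"
observed_gain = 1.62   # "62% increase in FPS" (non-DLSS)

theoretical_gain = cuda_core_gain * clock_gain
print(f"theoretical upper bound: {theoretical_gain:.2f}x")            # ~2.17x
print(f"observed: {observed_gain:.2f}x")
print(f"scaling efficiency: {observed_gain / theoretical_gain:.0%}")  # ~75%
```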
Posted on Reply
#22
W1zzard
If that is the actual temp on the final cards, then the fan speed/noise balance is terrible, and someone at NVIDIA will get fired after my review.
Posted on Reply
#23
Hyderz
W1zzard: If that is the actual temp on the final cards, then the fan speed/noise balance is terrible, and someone at NVIDIA will get fired after my review.
Looking forward to your review.
Posted on Reply
#24
xSneak
How does it still have such a high frame time at that fps?
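For reference, the arithmetic that likely answers this: at the reported frame-rates, per-frame render time is far below 72-75 ms, so that figure reads as end-to-end latency rather than frame-time:

```python
# Frame-time vs latency: render time per frame at the reported frame-rates.
for fps in (59, 119):
    print(f"{fps} FPS -> {1000 / fps:.1f} ms per frame")
# 59 FPS -> 16.9 ms; 119 FPS -> 8.4 ms. The 72-75 ms figure is therefore
# end-to-end (input-to-display) latency, not the time to render one frame.
```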
Posted on Reply
#25
AlwaysHope
All this discussion for 1 card in 1 game engine... :rolleyes:
Geeeezzzz....
Posted on Reply