
NVIDIA RTX 4090 Boosts Up to 2.8 GHz at Stock Playing Cyberpunk 2077, Temperatures around 55 °C

I was hoping to see more hybrid AIO 2-slot cards, to counteract this trend of cards needing more and more slots.
 
They've borrowed Intel's chiller from the 28-core 5 GHz demo :D
 
Could be hitting a CPU bottleneck at 170+ fps in CP2077; that's probably why it uses less power. Seems logical.
So I'm not feeling the wow factor of the 4090. I'm running a 3080 12 GB and a 5900X at 3440x1440, no RT, DLSS 2 Auto, and I'm getting 124 fps on average at 350 W. On my LG G-Sync-compatible monitor the gameplay is flawless, IMHO.

I may upgrade to a 4080 16 GB down the road, but more than likely my next upgrade will be an RX 7800/7900 card, as I'm betting the power draw will be similar to my current card's, with rasterisation performance somewhere between the RTX 3000 and RTX 4000 series. I've never used RT anyway, because the visual candy isn't all that noticeable while playing and isn't worth the performance hit, and I imagine the RT performance of the new RDNA3 cards will match or beat the RTX 3000 series.
 

Attachments

  • CP2077_3080-12gb_DLSS-Auto2.jpg
$1,600 (let's be honest, it's more like $2,000 in the real world), the RTX name, 3rd-gen RT cores, and it can't reach 60 fps at 1440p, let alone 4K.
A waste of silicon. I thought that by the next generation of GPUs ray tracing would stop being a gimmick, but it seems we are still far away.
It won't stop being a gimmick. RT is driving the sales, and now NVIDIA has reached a record $1,600 price for the top card in the segment. I remember the 1080 Ti cost $600. A $1k price jump for one card within five years? Yes, it has been five years since the 1080 Ti was released, or am I making things up? Anyway, Huang can shove those cards up his, to be fair, with the RT promises and DLSS, when the new DLSS 3 doesn't work on older-gen cards that still cost so damn much. How does support for DLSS 2 look now? Is there going to be continued support for it? I doubt developers will spend time implementing both DLSS 2 and DLSS 3 for individual lines of cards. Heck, the 5000 series will probably get an exclusive DLSS 4.
That is an insult and a spit straight in gamers' faces.
 
I think you also have to blame the tech media (which for the most part are a bunch of philistines and fanboys). I remember one outlet praising ray tracing in Control while the damn mug had two polygons.
 
It's 55 °C because the heatsink is nearly 4 slots, lol. Soon the case will be just a giant heatsink, kind of like a laptop.

No, it's 55 °C because the GPU isn't fully loaded; this is also why you see peak boost clocks on earlier generations. At 100% load, suddenly you'll be dropping boost bins left and right.

Yawnfest... we're looking at CP2077 at 1440p...
 
Bah... my 6500 XT runs at 2.95 GHz all day and night, and at 55-60 °C, with a cooler that's half the size of the one on the 4090. :pimp:

But seriously, when will the gaming community learn that clock speed alone doesn't mean anything?
 
I also see low GPU temps when I've just started up a game...
 
Could be hitting a CPU bottleneck at 170+ fps in CP2077; that's probably why it uses less power. Seems logical.
One of the advantages of DLSS 3 frame generation is that the CPU is not involved in creating the generated frames. So in this case the CPU is only rendering 85 fps on average.

That is why DLSS 3 looks like it will be a godsend for notebooks, which have lower-clocked CPUs and power limits on both the CPU and GPU.
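A quick back-of-the-envelope sketch in Python of that point (a simplified model with illustrative numbers, not NVIDIA's actual pipeline): if one interpolated frame is inserted per rendered frame, the CPU only has to simulate half of what ends up on screen.

```python
# Simplified model of DLSS 3 frame generation (illustrative only, not
# NVIDIA's actual pipeline): one GPU-interpolated frame per rendered frame,
# so the CPU only drives half of the displayed frames.

def displayed_fps(cpu_rendered_fps: float, frame_generation: bool) -> float:
    """Frames shown on screen; the CPU only produces the rendered half."""
    return cpu_rendered_fps * 2 if frame_generation else cpu_rendered_fps

print(displayed_fps(85, frame_generation=False))  # 85 fps, CPU-limited
print(displayed_fps(85, frame_generation=True))   # ~170 fps displayed
```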
 
It's 4 slots because the industry has to deal with ridiculous expectations from consumers and deliver a new generation every two years, which NVIDIA miraculously does, all while innovating on multiple fronts (DLSS, NVENC). If you're unhappy, there's always a 2-slot 3000-series card, until you buy a proper case :).
Is this trolling or bait? :confused:
 
Playing at 1440p with an RTX 4090, LOL.
 
So the quality setting for DLSS 3 determines the power use. Nice.

You can have low fps with high power or high fps with low power.

Reminds me of this:
What do you prefer: to be healthy and rich, or poor and ill?
DLSS 3 is not all roses and sunshine. You will get better fps, but latency will not improve. Say you get 30 fps without DLSS 3 (33.3 ms latency): no matter how many frames you generate outside of the game engine (60 or 3,000 fps), you will still have that 33.3 ms input/world-update latency, because the game engine doesn't know about those frames. It's a mixed bag, and IMO not something worth buying the 4000 series for, because you only get half the coin of the promised performance (throughput, but not latency).
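To put rough numbers on that argument (a minimal sketch with assumed values, not measurements): input latency tracks the engine's real frame time, so generated frames raise the displayed fps but leave the simulation latency where it was.

```python
# Minimal sketch of the throughput-vs-latency point (assumed values, not
# measurements): generated frames don't advance the game simulation.

def engine_latency_ms(engine_fps: float) -> float:
    """Time between two real engine updates; input is sampled at this rate."""
    return 1000.0 / engine_fps

def displayed_fps(engine_fps: float, generated_per_real: int) -> float:
    """Throughput including interpolated frames."""
    return engine_fps * (1 + generated_per_real)

base = 30.0
print(engine_latency_ms(base))  # 33.3 ms, with or without frame generation
print(displayed_fps(base, 1))   # 60 fps displayed, same 33.3 ms latency
```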
 
Monster AIB card with an Optimus waterblock. I think I'll need a bigger case.
 
DLSS 3 is not all roses and sunshine. You will get better fps, but latency will not improve. Say you get 30 fps without DLSS 3 (33.3 ms latency): no matter how many frames you generate outside of the game engine (60 or 3,000 fps), you will still have that 33.3 ms input/world-update latency, because the game engine doesn't know about those frames. It's a mixed bag, and IMO not something worth buying the 4000 series for, because you only get half the coin of the promised performance (throughput, but not latency).
I think DLSS 3 will be amazing for in-game cutscenes, as developers would be able to crank up the details without tanking the frame rate too much. Maybe also in slow-paced games, like point-and-click titles.

But for any fast action game, I don't think it will be worth it. It would lead to a strange feel, and since you don't know your base fps (the one the simulation runs at), you won't know that you need more frames.

A game that feels like that is No Man's Sky: in some good games I can handle 50-70 fps and be fine with it, whereas in a game like NMS I set it up to get over 120 fps and it still feels janky.
 
It's 4 slots because the industry has to deal with ridiculous expectations from consumers and deliver a new generation every two years, which NVIDIA miraculously does, all while innovating on multiple fronts (DLSS, NVENC). If you're unhappy, there's always a 2-slot 3000-series card, until you buy a proper case :).
- Not use a :kookoo: amount of electricity
- At least offer better price/performance than the previous generation
- Don't be a dickbag to the dwindling pool of companies that seem to somewhat care about consumers (RIP EVGA)
- And gawdayam, don't cost $1,000 for an xx80 and $1,500 for anything.
Are those really that unreasonable?
 
Monster AIB card with an Optimus waterblock. I think I'll need a bigger case.
If AIBs stick to the reference PCB, you can go ITX, hehe.
Check out the Alphacool 4090 article.
 
On topic: I was really surprised when I saw those temps, and questioned why every AIB design is so massive. Something is odd there.
 
If these numbers are true, the 4090 is 67% faster than a 3090 Ti, apples to apples.

NVIDIA claimed 2x-4x.

This is why we don't trust them.
NVIDIA talks raw horsepower. That doesn't translate directly into fps, the same way more horsepower in a car doesn't directly translate into a higher top speed.
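For what it's worth, the 67% figure is just the ratio of the two frame rates; a trivial sketch (the fps values are placeholders, not review data):

```python
# Speedup from two frame rates (placeholder numbers, not review data).

def speedup_percent(new_fps: float, old_fps: float) -> float:
    return (new_fps / old_fps - 1) * 100

# e.g. ~170 fps vs ~102 fps works out to roughly 67%,
# well short of a claimed 2x-4x (which would be +100% to +300%).
print(round(speedup_percent(170, 102)))  # 67
```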
 
It's 55 °C because the heatsink is nearly 4 slots, lol. Soon the case will be just a giant heatsink, kind of like a laptop.
Heh, with cards like the 4090 it's already reached the point where you need a case that supports at least 7-10 fans for proper airflow. IMO, anything less than that and you just end up with too much heat building up inside the case.
 
If that is the actual temp on the final cards, then the fan speed/noise balance is terrible and someone at NVIDIA will get fired after my review
Fired for what exactly? Having a slightly louder card in exchange for a much cooler one? Seems like a good tradeoff to me. Most people game while the card is heating up, so GPU noise is drowned out.

I use open-back headphones to play games. GPU fan noise isn't really a big deal.
 