
NVIDIA RTX 4090 "Ada" Scores Over 19000 in Time Spy Extreme, 66% Faster Than RTX 3090 Ti

So my 3080 Ti overclocked gets 10,800 points; an overclocked 4090 maybe 20,000. A nice improvement for more or less the same price (unless the 4090 lands at $2,000/€2,000 o_O).

Although price/performance on anything above the 4080 will be bad, like on the 3000 series, so maybe I'll stick to the 4080 for this generation, as the 3080 Ti was a kick in the balls even at MSRP.
 
Kopite7kimmi wants everyone to believe the 4090 will somehow be >2x faster than the 3090 in gaming with only 50% more CUDA cores. Correct me if I'm wrong, but never in the last five generations of GPUs have we had a >2x gaming performance jump. Why would this generation be any different? Especially when hitting a literal power wall, as rumours suggest. If this synthetic score is at all indicative of final performance, then once again we will see around a 40-60% gaming performance gain and expectations will finally be subverted. The same applies to AMD, of course.
 
3090 is 28 B transistors, 4090 is 59 B with a significantly higher clock; expect a 2x performance boost in 4K.
It's not the same CUDA layout either.

Could be 16384 FP32 + 8192 INT32.
10752 is 5376 FP32 + 5376 FP32-or-INT32.

In most cases 10752 = 8064 FP32 + 2688 INT32.

So it's double the FP32 CUDA at a 50% higher clock, which makes it 3x faster on paper. The memory controller holds it back.
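The "3x on paper" figure above is just cores times clock. A minimal sketch of that back-of-the-envelope scaling, assuming the rumored doubled FP32 count and a hypothetical 50% clock bump (neither figure is confirmed):

```python
# Theoretical FP32 throughput scales with cores * clock; real gains are
# usually lower because memory bandwidth does not scale with it.
def relative_throughput(cores, clock):
    """Relative theoretical FP32 throughput (arbitrary units)."""
    return cores * clock

ga102 = relative_throughput(cores=10752, clock=1.0)      # 3090 Ti baseline
ad102 = relative_throughput(cores=2 * 10752, clock=1.5)  # doubled FP32, +50% clock

print(ad102 / ga102)  # 3.0 on paper
```

This is why the Time Spy Extreme result (+66%) sits well below the paper math: shader throughput alone does not set the score.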
 
What's the VRAM, though? The 3000 series can already handle the performance I need, so I'd only be interested in more VRAM. And what's the power consumption? :p
 
Still waiting for GTX 970 levels of price/perf.
Will keep on waiting, I guess...

>250 W reference GPUs should carry an extra anti-green-product tax:
1% tax for each 1 W over 250 W.
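Taken literally, that proposal is a simple formula. A sketch of it, with purely illustrative numbers (the price and TDP below are made up, not real product figures):

```python
# Hypothetical "anti-green" tax: 1% of the price per watt over a 250 W cap.
def green_tax(price, tdp_watts, cap=250):
    over = max(0, tdp_watts - cap)   # watts above the cap are taxed
    return price * over / 100.0      # 1% of price per excess watt

# A made-up $1999 card at a rumored-class 450 W TDP: 200 W over -> 200% tax.
print(green_tax(price=1999, tdp_watts=450))  # 3998.0
```

The tax doubling the price of a 450 W card shows how punitive the "1% per watt" rate would be in practice.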
 
Let's see how it fares in real-world testing. Synthetics only tell half the story.
 
3090 is 28 B transistors, 4090 is 59 B with a significantly higher clock; expect a 2x performance boost in 4K.
It's not the same CUDA layout either.

Could be 16384 FP32 + 8192 INT32.
10752 is 5376 FP32 + 5376 FP32-or-INT32.

In most cases 10752 = 8064 FP32 + 2688 INT32.

So it's double the FP32 CUDA at a 50% higher clock, which makes it 3x faster on paper. The memory controller holds it back.
A large portion of the new transistors will be dedicated to the larger L2 cache. There might be fewer cores than expected, but the effective memory bandwidth will increase even if they stay with the same memory.
 
Is it just me, or does that pic look like a component from a nuclear power station control center ?
 
66% performance boost at only 100% more power, oh boy! Just throw a few extra solar panels on your roof to power it.
 
But will it run Minecraft fully ray-traced at 144 Hz @ 1440p? Or more importantly, make me better at Terraria? Cheesing every boss with minecart tracks is a crutch I'm quite proud of, lol.
 
TDP is what worries me; the 3090 told me I'll never own one, so I shudder to think what the 4090 pulls. When will AMD and Nvidia see the writing on the wall?
There's nothing to see. The x090 is for max performance and semi-professional use. Whoever wants 250 W max should buy an x070; gaming enthusiasts, an x080 and an 800 to 1000 W power supply. There are enough (water) cooling solutions.
 
The 4090's price will also cause 66% faster bankruptcy.
 
That's a lot of performance; you could almost say that card can finally run 8K.
 
39 posts, probably 30 adding nothing but crap to the topic

You're a weird individual on this website, man; this is the second time I've seen you make a comment like this in a short span of time.
 
As long as this new generation undervolts as well as the 3000 series, I'll be happy. Trading 3-5% performance for 60-100 watts less power draw and much less heat is a win in my book.
Yes, AMD cards undervolt really well, but IMHO the reason is that for the 5000 series they pushed the chip to its limit to keep up with what Ngreedia had. As stated before, my undervolted 5700 runs at under 130 watts while playing games/editing. The difference in games was about 5 FPS on average. The average core temp was 64-66 °C and the hottest 78 °C, while the room was 40 °C. Yeah, my room gets really hot in the summer, as the sun hits that side of the house fully. If I did not undervolt, the card would run at 72-84 °C, and IMHO that's just too hot to run any card, regardless of it being in spec.

Again my rig has been posted on what it is and how I run it on only 3 fans to keep things cool.

I'm waiting to see the newest video cards from AMD, but if prices continue to drop I just might buy a 6800 or a 6900 XT. I'll skip the 6800 XT because it runs at 300 watts, the same as the 6900 XT, while the 6800 runs at 250 watts. If so, I'll undervolt those cards as well. I do think the 6800 is a sleeper card with good overall performance for what it is.

I've got 80 watts to play with to keep to my concept of how I run my rig: cool, efficient, and with good overall performance.

But if the 7000 series gives me 30%+ more performance at the same wattage as the 6000 series, I'd go for that option too. As long as AMD doesn't give me the shaft on pricing, it's definitely worth a look.
 
With average screen resolutions going up (I guess) and games getting harder to run, we need to get used to this and not expect a GPU that runs your games at max settings in 4K to use 250 W max. As long as I can cool it, idgaf how many watts it uses, really, and no doubt I'd have a 3090 Ti if I had the cash.

All you guys with 4K high-refresh screens: do you really think GPU power use is going to stop going up? If you want to run your games at that res and refresh rate, get used to it or get a lower-res screen.

Strongly disagree here: "we" do not "need" to get used to less progress for ever-increasing prices, energy consumption and heat generation. You are of course entitled to your own opinion, but if anything, "we" should get back to not rewarding less innovation gen over gen, in my opinion.
 
Strongly disagree here: "we" do not "need" to get used to less progress for ever-increasing prices, energy consumption and heat generation. You are of course entitled to your own opinion, but if anything, "we" should get back to not rewarding less innovation gen over gen, in my opinion.

Well again, IMO it is not going to change. People complain but still buy the products, so there's no inducement for them to stop doing it.
 
Everyone whining about TDP... nobody said you HAVE to buy the highest-end card. You can buy an upper-mid-range TDP card, enjoy massive performance-per-watt improvements, and not turn your room into a furnace.

You can also undervolt and get 95% of the performance for about 70% of the draw.

You also don't need a 200 W CPU. You can get more efficient i3s and i5s that draw little power, like in the good old days. Stick an i7 sticker on the case and 99% of people wouldn't be able to tell the difference.
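The undervolting claim above implies a sizable efficiency win. A quick sketch of the arithmetic, using the post's own rough figures (95% performance at 70% draw; these are anecdotal, not measured values):

```python
# Performance per watt, relative to stock, for an undervolted card.
def perf_per_watt_gain(perf_frac, power_frac):
    """Ratio of undervolted perf/W to stock perf/W."""
    return perf_frac / power_frac

gain = perf_per_watt_gain(perf_frac=0.95, power_frac=0.70)
print(round(gain, 2))  # ~1.36x better performance per watt
```

So a 5% performance sacrifice buys roughly a third more performance per watt under those assumptions, which is why undervolting is such a popular trade.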
 
There's nothing to see.
You're missing the point, bud: GPU makers need to find a way of moving forward without pumping ever-increasing amounts of power into their products at a time when most folks are trying to save the planet. Even a 3070/80 uses far too much power and sheds wasted heat, because it's easy for them to build them that way; overclockers have used this method since the first chips. Now, this is how I feel; I'm not dissing anyone who uses said GPUs, I'm just pointing out that AMD/Nvidia etc. don't see, or don't want to see, what's happening. As I write, London is burning.
 