Tuesday, July 19th 2022

NVIDIA RTX 4090 "Ada" Scores Over 19000 in Time Spy Extreme, 66% Faster Than RTX 3090 Ti

NVIDIA's next-generation GeForce RTX 4090 "Ada" flagship graphics card allegedly scores over 19,000 points in the 3DMark Time Spy Extreme synthetic benchmark, according to kopite7kimi, a reliable source for NVIDIA leaks. This would put its score around 66 percent above that of the current flagship, the RTX 3090 Ti. The RTX 4090 is expected to be based on the 5 nm AD102 silicon, with a rumored CUDA core count of 16,384. Higher IPC from the new architecture, coupled with higher clock speeds and power limits, could be contributing to this feat. Time Spy Extreme is a traditional DirectX 12 raster-only benchmark with no ray-traced elements. The Ada graphics architecture is expected to reduce the "cost" of ray tracing (versus raster-only rendering), although we've yet to see leaks of its RTX performance.
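For context, the headline figure is easy to sanity-check. A quick sketch, assuming a baseline Time Spy Extreme GPU score of roughly 11,400 for the RTX 3090 Ti (our assumption; neither the leak nor the article states the baseline):

```python
# Sanity check of the claimed uplift; the 3090 Ti baseline is an assumed
# typical Time Spy Extreme GPU score, not a figure from the leak itself.
ada_score = 19_000      # leaked RTX 4090 score
ampere_score = 11_400   # assumed RTX 3090 Ti baseline

uplift = ada_score / ampere_score - 1
print(f"Implied uplift: {uplift:.0%}")  # prints "Implied uplift: 67%"
```

That lands in the mid-60s, consistent with the claimed 66 percent.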
Sources: kopite7kimi (Twitter), VideoCardz

96 Comments on NVIDIA RTX 4090 "Ada" Scores Over 19000 in Time Spy Extreme, 66% Faster Than RTX 3090 Ti

#26
Dimitriman
Kopite7kimi wants everyone to believe the 4090 will somehow be >2x faster than the 3090 in gaming with only 50% more CUDA cores. Correct me if I am wrong, but never in the last 5 generations of GPUs did we have a >2x gaming performance bump. Why would this generation be any different? Especially when hitting a literal power wall, like rumours suggest. If this synthetic score is at all indicative of final performance, then once again we will see around a 40-60% gaming performance gain and expectations will finally be subverted. The same applies to AMD, of course.
Posted on Reply
#27
ppn
3090 is 28 B transistors, 4090 is 59 B and significantly higher clock, expect 2x performance boost in 4K.
not the same CUDA.

Could be 16384 FP32 + 8192 INT32.
10752 is 5376 FP32 + 5376 FP32-or-INT32

In most cases 10752 = 8064 FP32 + 2688 INT32

So it's double the FP32 CUDA at a 50% higher clock, which makes it 3x faster. The memory controller holds it back.
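ppn's arithmetic can be sketched out. The clocks below are illustrative assumptions (a ~1.86 GHz boost for GA102, a rumored ~2.75 GHz for AD102), and the 8,064-core figure is the mixed-workload case ppn describes, where 2,688 of GA102's shared units are busy with INT32:

```python
# Back-of-the-envelope FP32 throughput following ppn's reasoning.
# Clock speeds are illustrative assumptions, not confirmed specs.
def fp32_tflops(cores: int, clock_ghz: float) -> float:
    return cores * 2 * clock_ghz / 1000  # 2 FLOPs per core per clock (FMA)

ad102 = fp32_tflops(16_384, 2.75)        # rumored RTX 4090 config
ga102_paper = fp32_tflops(10_752, 1.86)  # RTX 3090 Ti, all FP32 paths busy
ga102_mixed = fp32_tflops(8_064, 1.86)   # ppn's mixed case: 2688 units on INT32

print(f"vs paper spec: {ad102 / ga102_paper:.2f}x")  # ~2.25x
print(f"vs mixed load: {ad102 / ga102_mixed:.2f}x")  # ~3.00x
```

Under these assumed clocks, the ~3x figure holds only against the mixed-workload baseline; against GA102's paper spec the peak-throughput gap is closer to 2.25x.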
Posted on Reply
#28
Dimitriman
ppn3090 is 28 B transistors, 4090 is 59 B and significantly higher clock, expect 2x performance boost in 4K.
I will be happy if you are correct. But I am not holding my breath for that.
Posted on Reply
#29
chrcoluk
What's the VRAM though? The 3000 series can already handle the performance I need, so I'd only be interested in higher VRAM. And what's the power consumption? :p
Posted on Reply
#30
Dirt Chip
Still waiting for GTX 970 levels of price/perf.
I'll keep on waiting, I guess...

>250W reference GPUs should carry an extra anti-green-product tax.
1% tax for each 1W over 250W.
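Dirt Chip's proposed surcharge is simple to express as code. A toy sketch of the commenter's formula only; the function name and the 450 W / $1,600 example figures are made up for illustration:

```python
def green_tax(board_power_w: float, msrp: float,
              threshold_w: float = 250.0, rate_per_w: float = 0.01) -> float:
    """Hypothetical surcharge: 1% of MSRP for each watt over 250 W."""
    excess_w = max(0.0, board_power_w - threshold_w)
    return msrp * rate_per_w * excess_w

# A hypothetical 450 W card at a $1,600 MSRP would owe 200% of MSRP:
print(green_tax(450, 1600))  # prints 3200.0
```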
Posted on Reply
#31
Tsukiyomi91
Let's see how it fares in real-world testing. Synthetics only tell half the story.
Posted on Reply
#32
Punkenjoy
ppn3090 is 28 B transistors, 4090 is 59 B and significantly higher clock, expect 2x performance boost in 4K.
not the same CUDA.

Could be 16384 FP32 + 8192 INT32.
10752 is 5376 FP32 + 5376 FP32-or-INT32

In most cases 10752 = 8064 FP32 + 2688 INT32

So it's double the FP32 CUDA at a 50% higher clock, which makes it 3x faster. The memory controller holds it back.
A large portion of the new transistors will be dedicated to the larger L2 cache. There might be fewer cores than expected, but the effective memory bandwidth will be increased even if they stay with the same memory.
Posted on Reply
#33
PapaTaipei
Dirt ChipStill waiting for GTX 970 levels of price/perf.
I'll keep on waiting, I guess...

>250W reference GPUs should carry an extra anti-green-product tax.
1% tax for each 1W over 250W.
I don't think making the customers pay is good.
Posted on Reply
#34
R0H1T
ppnexpect 2x performance boost in 4K.
Definitely not 2x, maybe 1.5~1.8x realistically speaking.
Posted on Reply
#35
bonehead123
Is it just me, or does that pic look like a component from a nuclear power station control center ?
Posted on Reply
#36
CyberCT
As long as this new generation undervolts as well as the 3000 series, I'll be happy. Trading 3-5% performance for 60 - 100 watts less power draw and much less heat is a win in my book.
Posted on Reply
#37
Zareek
66% performance boost at only 100% more power, oh boy! Just throw a few extra solar panels on your roof to power it.
Posted on Reply
#38
Fleurious
But will it run Minecraft full raytraced at 144hz@1440p? Or more importantly, make me better at Terraria. Cheesing every boss with minecart tracks is a crutch i’m quite proud of lol.
Posted on Reply
#39
Xaled
Tek-CheckDoes 66% faster in this test mean ~35% faster in gaming?
I'm not sure, but certainly it would be at least 66% more expensive
Posted on Reply
#40
Sisyphus
xtreemchaosTDP is what worries me. The 3090 told me I'll never own one, so I shudder to think what the 4090 pulls. When will AMD and Nvidia see the writing on the wall?
There's nothing to see. The x090 is for max performance and semi-professional use. Whoever wants 250 W max should buy an x070; gaming enthusiasts, an x080 and an 800 to 1000 W power supply. There are enough (water)cooling solutions.
Posted on Reply
#41
Bubster
The 4090's price will also cause 66% faster bankruptcy.
Posted on Reply
#42
Bomby569
That's a lot of performance; you could almost say the card can finally run 8K.
Posted on Reply
#43
ZoneDymo
Tigger39 posts, probably 30 adding nothing but crap to the topic
You are a weird individual on this website, man. This is the second time I've seen you make a comment like this in a short amount of time.
Posted on Reply
#44
Icon Charlie
CyberCTAs long as this new generation undervolts as well as the 3000 series, I'll be happy. Trading 3-5% performance for 60 - 100 watts less power draw and much less heat is a win in my book.
Yes, AMD undervolts really well, but IMHO the reason is that for the 5000 series they pushed the chip to its limit to keep up with Ngreedia. As stated before, my undervolted 5700 runs at under 130 watts while playing games/editing, and the difference in games is about 5 fps on average. The average core temp was 64-66°C; the hottest was 78°C while the room was 40°C. Yeah, my room gets really hot in the summer, as the sun hits that side of the house fully. If I didn't undervolt the card, it would run at 72-84°C, and IMHO that's just too hot to run any card, regardless of it being in spec.

Again, my rig has been posted, showing what it is and how I run it on only 3 fans to keep things cool.

I'm waiting to see the newest video cards available from AMD, but if prices continue to drop I just might buy a 6800 or a 6900 XT. I'll skip the 6800 XT because it runs at 300 watts, the same as the 6900 XT, while the 6800 runs at 250 watts. If so, I'll undervolt those cards as well. I do think the 6800 is a sleeper card with good overall performance for what it is.

I've got 80 watts to play with to keep my rig cool and running efficiently with good overall performance.

But if the 7000 series gives me 30%+ more performance overall at the same wattage as the 6000 series, then I'd go for that option too. As long as AMD doesn't give me the shaft on pricing, it's definitely worth a look.
Posted on Reply
#45
Testsubject01
TiggerWith the avg screen res going up (i guess) and games getting harder to run, we need to get used to this and not expect a GPU that will run your games at max settings in 4k to use 250w max. As long as i could cool it, idgaf how many watts it uses really and no doubt i would have a 3090ti if i had the cash.

All you guys with 4k high refresh screens, you really think GPU power use is going to stop going up. If you want to run your games at that res, high refresh, get used to it or get a lower res screen.
Strongly disagree here: “We” do not “need” to get used to less progress for ever-increasing prices, energy consumption and heat generation. You are of course entitled to your own opinion, but if anything, “we” should get back to not rewarding less innovation gen over gen, in my opinion.
Posted on Reply
#46
Unregistered
Testsubject01Strongly disagree here: “We” do not “need” to get used to less progress for ever-increasing prices, energy consumption and heat generation. You are of course entitled to your own opinion, but If anything, “We” should get back to not rewarding less innovation gen over gen, in my opinion.
Well again, IMO it is not going to change. People complain but still buy the products, so there's no inducement for them to stop.
Posted on Reply
#47
Dirt Chip
PapaTaipeiI don't think making the customers pay is good.
If you choose to buy it, pay for it ;)
Posted on Reply
#48
TheinsanegamerN
Everyone whining about TDP… nobody said you HAVE to buy the highest-end card. You can buy an upper-mid-range TDP card, enjoy massive perf/watt improvements, and not turn your room into a furnace.

You can also undervolt and get 95% of the performance for about 70% of the power draw.

You also don't need a 200 W CPU. You can get more efficient i3s and i5s that draw little power, like the good old days. Stick an i7 sticker on the case and 99% of people wouldn't be able to tell the difference.
Posted on Reply
#49
xtreemchaos
SisyphusThere's nothing to see.
You're missing the point, bud. GPU makers need to find a way of moving forward without pumping ever-increasing power into their products, at a time when most folks are trying to save the planet. Even a 3070/80 uses far too much power and leaves wasted heat, because it's easy for them to make them that way; overclockers have used this method since the first chips. Now, this is just how I feel. I'm not dissing anyone who uses said GPUs, I'm just pointing out that AMD/Nvidia etc. don't see, or don't want to see, what's happening. As I write, London is burning.
Posted on Reply
#50
TheoneandonlyMrK
TiggerWith the avg screen res going up (i guess) and games getting harder to run, we need to get used to this and not expect a GPU that will run your games at max settings in 4k to use 250w max. As long as i could cool it, idgaf how many watts it uses really and no doubt i would have a 3090ti if i had the cash.

All you guys with 4k high refresh screens, you really think GPU power use is going to stop going up. If you want to run your games at that res, high refresh, get used to it or get a lower res screen.
Having lived with such a room heater, enjoy it y'all, I'll not bother.

Yet I agree to a point; however, in 5 years, using north of 500/600 watts to get this much performance will be idiotic, and that's a fact.
Posted on Reply