Tuesday, July 19th 2022

NVIDIA RTX 4090 "Ada" Scores Over 19000 in Time Spy Extreme, 66% Faster Than RTX 3090 Ti

NVIDIA's next-generation GeForce RTX 4090 "Ada" flagship graphics card allegedly scores over 19000 points in the 3DMark Time Spy Extreme synthetic benchmark, according to kopite7kimi, a reliable source of NVIDIA leaks. This would put its score around 66 percent above that of the current RTX 3090 Ti flagship. The RTX 4090 is expected to be based on the 5 nm AD102 silicon, with a rumored CUDA core count of 16,384. The higher IPC of the new architecture, coupled with higher clock speeds and power limits, could be contributing to this feat. Time Spy Extreme is a traditional DirectX 12 raster-only benchmark, with no ray-traced elements. The Ada graphics architecture is expected to reduce the "cost" of ray tracing (versus raster-only rendering), although we're yet to see leaks of RTX performance.
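For context, here is a quick back-of-the-envelope check of what the leak implies, as a minimal Python sketch; the 3090 Ti baseline below is derived from the claimed 66 percent uplift, not an officially published score:

```python
# Implied 3DMark Time Spy Extreme scores from the leak.
# Assumption: "66% faster" is measured against the RTX 3090 Ti's score.

rtx_4090_score = 19_000   # leaked figure (a lower bound: "over 19000")
claimed_uplift = 0.66     # 66 percent faster than the RTX 3090 Ti

# Work backwards to the baseline score the claim implies.
implied_3090_ti_score = rtx_4090_score / (1 + claimed_uplift)
print(f"Implied RTX 3090 Ti score: ~{implied_3090_ti_score:,.0f}")  # ~11,446
```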
Sources: kopite7kimi (Twitter), VideoCardz

96 Comments on NVIDIA RTX 4090 "Ada" Scores Over 19000 in Time Spy Extreme, 66% Faster Than RTX 3090 Ti

#1
wolf
Performance Enthusiast
Another day, another rumor... 66% faster than the 3090 Ti and 82% faster than the 3090 in one synthetic benchmark... not nearly enough to go on yet.
#2
ratirt
This performance kinda aligns with the increased number of cores: 10,752 for the 3090 Ti vs. 16,384 for the 4090.
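As a rough sanity check of that alignment, the core-count ratio alone falls short of the claimed uplift, so clocks and IPC would have to make up the rest; a minimal sketch, with the residual factor purely illustrative:

```python
# Compare the raw CUDA core ratio to the claimed benchmark uplift.
cores_3090_ti = 10_752
cores_4090 = 16_384    # rumored AD102 configuration

core_ratio = cores_4090 / cores_3090_ti   # ~1.52x from core count alone
claimed_ratio = 1.66                      # leaked Time Spy Extreme uplift

# Whatever the cores don't explain must come from clocks, IPC, and scaling.
residual = claimed_ratio / core_ratio     # ~1.09x
print(f"Cores alone: {core_ratio:.2f}x; residual from clocks/IPC: {residual:.2f}x")
```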
#4
Flanker
That explains the recent heatwaves
#5
Tek-Check
Does 66% faster in this test mean ~35% faster in gaming?
#6
Unregistered
although we're yet to see leaks of RTX performance.
Isn't the 4090 an RTX card? Seems nVidia is very good at marketing.
Most important is performance per watt; this card should offer this performance while consuming similar or lower power than the 3090 to be acceptable, otherwise it will follow in the footsteps of Ampere and Turing in terms of mediocrity.
#7
MarsM4N
ratirt: This performance kinda aligns with the increased number of cores: 10,752 for the 3090 Ti vs. 16,384 for the 4090.
10.752m transistors (100%) & 16.384m transistors (=152%) + advancements from the new silicon
Well, it sounds plausible. :) But ...

On the other side:

1080 Ti (11,800M transistors) vs. 2080 Ti (18,600M transistors) = 19.0% faster (58% more transistors)
2080 Ti (18,600M transistors) vs. 3080 Ti (28,300M transistors) = 24.4% faster (52% more transistors)

So my bet is that the real-world performance increase will be around 25%.
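Taking the poster's figures at face value, one can compute how much performance each generation gained per unit of transistor growth; a minimal sketch, where the percentage uplifts are the poster's own estimates, not benchmarked numbers:

```python
# Performance gained vs. transistors added, using the figures quoted above.
# Transistor counts are in millions; perf uplifts are the poster's estimates.
generations = [
    ("1080 Ti -> 2080 Ti", 11_800, 18_600, 0.190),
    ("2080 Ti -> 3080 Ti", 18_600, 28_300, 0.244),
]

for label, t_old, t_new, perf_gain in generations:
    transistor_gain = t_new / t_old - 1   # 58% and 52% more transistors
    ratio = perf_gain / transistor_gain   # fraction of transistor growth realized
    print(f"{label}: {transistor_gain:.0%} more transistors, "
          f"{perf_gain:.1%} faster (ratio {ratio:.2f})")
```

On this trend, performance grows at roughly a third to a half of the transistor growth, which is how the poster lands at roughly 25%.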
#8
john_
Xex360: Isn't the 4090 an RTX card? Seems nVidia is very good at marketing.
Most important is performance per watt; this card should offer this performance while consuming similar or lower power than the 3090 to be acceptable, otherwise it will follow in the footsteps of Ampere and Turing in terms of mediocrity.
We might get 3090 performance at lower power consumption from the 4070, probably?
#9
Dirt Chip
The move from Samsung's foundry back to TSMC will give the 4XXX series an extra performance boost.
That is beside the 8 nm to 5 nm lithography advancement.
Samsung's 8 nm node was holding Ampere back; Ada might max out the architecture's potential with TSMC.
#10
64K
Tek-Check: Does 66% faster in this test mean ~35% faster in gaming?
That's why we need hands-on gaming reviews. Hopefully a 4090 will land here for review. The 4080 and 4070 probably won't be out until the end of the year, or maybe early next year, depending on how quickly the glut of Ampere cards subsides.
#11
Unregistered
Sold, can't wait to see TPU members post their 4090s :laugh:
#12
Icon Charlie
MarsM4N: 10.752m transistors (100%) & 16.384m transistors (=152%) + advancements from the new silicon
Well, it sounds plausible. :) But ...

On the other side:

1080 Ti (11,800M transistors) vs. 2080 Ti (18,600M transistors) = 19.0% faster (58% more transistors)
2080 Ti (18,600M transistors) vs. 3080 Ti (28,300M transistors) = 24.4% faster (52% more transistors)

So my bet is that the real-world performance increase will be around 25%.
I'm going to agree with this assessment. Too much marketing speak going on. I think it's going to be between 15 and 30% in all general applications. But of course, as usual, there will be programs NGREEDIA will cherry-pick, with their cherry-picked graphics card tweaked out for just that situation.

Heh, marketing. Don't you just love it? :)
#13
xtreemchaos
TDP is what worries me; the 3090 told me I'll never own one, so I shudder to think what the 4090 pulls. When will AMD and Nvidia see the writing on the wall?
#14
ratirt
MarsM4N: 10.752m transistors (100%) & 16.384m transistors (=152%) + advancements from the new silicon
Well, it sounds plausible. :) But ...

On the other side:

1080 Ti (11,800M transistors) vs. 2080 Ti (18,600M transistors) = 19.0% faster (58% more transistors)
2080 Ti (18,600M transistors) vs. 3080 Ti (28,300M transistors) = 24.4% faster (52% more transistors)

So my bet is that the real-world performance increase will be around 25%.
I think you are mixing up transistors with cores, which are not 1-to-1.
You have 10,752 cores in a 3090 Ti, but 28.3 billion transistors.
xtreemchaos: TDP is what worries me; the 3090 told me I'll never own one, so I shudder to think what the 4090 pulls. When will AMD and Nvidia see the writing on the wall?
Same here. The power consumption is getting out of hand for me.
When will they see? I guess when their revenue and unit sales decline.
#15
xtreemchaos
ratirt: Same here
I've got a 2080 which pulls around 250 W flat out, and I feel bad about that, to tell the truth.
#16
ratirt
xtreemchaos: I've got a 2080 which pulls around 250 W flat out, and I feel bad about that, to tell the truth.
My preference is a bit different, but it has also changed. I needed to buy a GPU due to some circumstances, and I ended up with a 6900 XT, but my target was different; I had to buy what was available. In terms of power, it does not go above 300 W, but that was still too much for me. I wanted something at around 220 W, so I had to compromise on power usage. My next GPU will not go above 300 W for sure, but I will definitely try to aim below 250 W.
#17
Unregistered
With the average screen resolution going up (I guess) and games getting harder to run, we need to get used to this and not expect a GPU that runs your games at max settings in 4K to use 250 W max. As long as I can cool it, I don't really care how many watts it uses, and no doubt I would have a 3090 Ti if I had the cash.

All you guys with 4K high-refresh screens: do you really think GPU power use is going to stop going up? If you want to run your games at that resolution and refresh rate, get used to it or get a lower-res screen.
#18
PapaTaipei
Tek-Check: Does 66% faster in this test mean ~35% faster in gaming?
Most probably it's 66% faster at 8K resolution, so maybe 15-25% faster at 1080p.
#19
ratirt
Tigger: With the average screen resolution going up (I guess) and games getting harder to run, we need to get used to this and not expect a GPU that runs your games at max settings in 4K to use 250 W max. As long as I can cool it, I don't really care how many watts it uses, and no doubt I would have a 3090 Ti if I had the cash.

All you guys with 4K high-refresh screens: do you really think GPU power use is going to stop going up? If you want to run your games at that resolution and refresh rate, get used to it or get a lower-res screen.
I'm a 4K user, and power should stay at the same level while offering better performance. That's my take. I disagree with the notion of "more power for more performance" to be able to stay at 4K, as long as the performance/watt is OK.
#20
xtreemchaos
Agreed, this "more power" trend is not sustainable. As for 4K, I have a 1080 FE in my workshop that runs 4K lovely at mid-to-high settings in games.
#21
Unregistered
xtreemchaos: Agreed, this "more power" trend is not sustainable. As for 4K, I have a 1080 FE in my workshop that runs 4K lovely at mid-to-high settings in games.
I need to try my 1080 Ti on the 4K TV.
#23
Unregistered
xtreemchaos: Your 1080 Ti will run lovely at 4K, bud.
Might give it a try later
#24
Chrispy_
I'm looking forward to seeing how many watts a midrange 4000-series GPU with, say, ~6,000 CUDA cores uses on TSMC 5 nm.

I'm not interested in the behemoths that don't fit in standard ATX cases or work with the perfectly good ~800 W PSUs that so many people own. Those flagship models always completely ignore efficiency and cost-effectiveness, almost to the level of obscenity.

I'll probably buy whatever falls in the 200-250 W range and call it a day. As long as it has 12 GB of VRAM or more and runs every single AAA game at 1440p, I'm going to be perfectly happy to ignore sillier, faster, hungrier, hotter, more expensive cards.
#25
igralec84
So my 3080 Ti overclocked gets 10,800 points and the 4090 overclocked maybe 20,000; a nice improvement for more or less the same price (unless the 4090 lands at $2,000/€2,000 o_O).

Price/performance on anything above the 4080 will probably be bad, like on the 3000 series, so maybe I'll stick with the 4080 for this generation, as the 3080 Ti was a kick in the balls even at MSRP.