Tuesday, July 19th 2022

NVIDIA RTX 4090 "Ada" Scores Over 19000 in Time Spy Extreme, 66% Faster Than RTX 3090 Ti

NVIDIA's next-generation GeForce RTX 4090 "Ada" flagship graphics card allegedly scores over 19000 points in the 3DMark Time Spy Extreme synthetic benchmark, according to kopite7kimi, a reliable source of NVIDIA leaks. This would put its score around 66 percent above that of the current RTX 3090 Ti flagship. The RTX 4090 is expected to be based on the 5 nm AD102 silicon, with a rumored CUDA core count of 16,384. The higher IPC of the new architecture, coupled with higher clock speeds and power limits, could be contributing to this feat. Time Spy Extreme is a traditional DirectX 12 raster-only benchmark, with no ray-traced elements. The Ada graphics architecture is expected to reduce the "cost" of ray tracing (versus raster-only rendering), although we have yet to see leaks of RTX performance.
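As a rough sanity check, a 66 percent lead over the RTX 3090 Ti implies a baseline score in the mid-11000s, which is in the ballpark of typical RTX 3090 Ti Time Spy Extreme results. A minimal sketch of the arithmetic (only the 19,000-point figure comes from the leak; the rest is derived):

```python
# Back out the implied RTX 3090 Ti score from the leaked figures.
rtx_4090_score = 19_000   # leaked Time Spy Extreme score (approximate)
claimed_uplift = 0.66     # "66% faster than RTX 3090 Ti"

implied_3090_ti = rtx_4090_score / (1 + claimed_uplift)
print(f"Implied RTX 3090 Ti score: {implied_3090_ti:.0f}")  # ~11446
```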
Sources: kopite7kimi (Twitter), VideoCardz
Add your own comment

96 Comments on NVIDIA RTX 4090 "Ada" Scores Over 19000 in Time Spy Extreme, 66% Faster Than RTX 3090 Ti

#51
noname00
MarsM4N1080ti (11.800m transistors) vs. 2080ti (18.600m transistors) = 19,0% faster (58% more transistors)
2080ti (18.600m transistors) vs. 3080ti (28.300m transistors) = 24,4% faster (52% more transistors)

So in reality performance increase will be around 25%, my bet.
According to the TPU founders editions review of the 2080ti and 3080ti, at 4k, the 2080ti is 38.9% faster than the 1080ti, and the 3080ti is 47% faster than the 2080ti.

www.techpowerup.com/review/nvidia-geforce-rtx-2080-ti-founders-edition/33.html
www.techpowerup.com/review/nvidia-geforce-rtx-3080-ti-founders-edition/28.html

I have no idea where you got those percentages from.
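For reference, TPU's relative-performance charts can be converted into "percent faster" figures: if the older card sits at rel% of the newer card's performance, the newer card is (100/rel − 1) faster. A small sketch of that conversion, where the ~72% and ~68% relative values are back-derived from the percentages cited above rather than read off the charts directly:

```python
# Convert a TPU-style "relative performance" value into a percent-faster figure.
# If card A shows at rel% of card B's performance, B is (100/rel - 1) faster.
def percent_faster(relative_pct: float) -> float:
    return (100.0 / relative_pct - 1.0) * 100.0

# Approximate relative values implied by the cited percentages:
# 1080 Ti at ~72% of a 2080 Ti, 2080 Ti at ~68% of a 3080 Ti (at 4K).
print(f"{percent_faster(72):.1f}% faster")  # ~38.9%
print(f"{percent_faster(68):.1f}% faster")  # ~47.1%
```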
Posted on Reply
#52
dicobalt
I don't understand why people run these silly benchmark programs. Run some games or apps; nobody cares about esoteric, meaningless benchmarks.
Posted on Reply
#53
ZoneDymo
dicobaltI don't understand why people run these silly benchmark programs. Run some games or apps; nobody cares about esoteric, meaningless benchmarks.
Well, it's a leak, right? I think they do what they can and still get away with it.
Posted on Reply
#54
ModEl4
Despite the +66% score in Time Spy, I'll stick with my old prediction that full AD102 is going to be at least 2X the GA102 in 4K raster (fastest OC full AD102 at least 2X vs. fastest OC 3090 Ti).
Like I said in the past, a 2X uplift means it will be CPU-limited even at 4K (it may need a testbed change from the 5800X to Zen 4 V-cache, coming this year according to rumors, to show its true colors).
Another thing I said in the past is that Ada Lovelace may ship with two TBP options, making comparisons with Ampere models more difficult. We will see.
Posted on Reply
#55
erocker
*
dicobaltI don't understand why people run these silly benchmark programs. Run some games or apps; nobody cares about esoteric, meaningless benchmarks.
3DMark results are pretty consistent and can be extrapolated to estimate performance in games and other applications.
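The extrapolation amounts to simple linear scaling from a synthetic score to an expected frame rate. A hedged sketch, where every number is a placeholder for illustration rather than a measurement:

```python
# Naive linear extrapolation from a synthetic benchmark score to a game
# frame rate. Assumes pure GPU limitation and no architectural quirks.
def estimate_fps(known_fps: float, known_score: float, new_score: float) -> float:
    return known_fps * (new_score / known_score)

# e.g. a game at 60 fps on a card scoring ~11,400 would be estimated at
# ~100 fps on a card scoring 19,000 (best case, no CPU bottleneck).
print(f"{estimate_fps(60, 11_400, 19_000):.0f} fps")  # ~100
```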
Posted on Reply
#56
ppn
At least 2X should be expected, because those transistors can't just go to waste. Titan Xp to 3090 Ti is 2.4X the performance for 2.4X the transistor count, even with all the RT hardware included to hold it back. This time there's no new RT tax added, it's already there, so we are looking at almost the same kind of jump as seen between the 1080 Ti and 3080 Ti. Hard to believe.
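Taking the transistor counts quoted earlier in the thread at face value (11.8 billion for GP102-class silicon, 28.3 billion for GA102-class), the ratio does land close to the 2.4X claimed. A quick sketch of the check:

```python
# Transistor-ratio sanity check using counts quoted earlier in the thread.
gp102_transistors = 11.8e9   # 1080 Ti / Titan Xp class silicon
ga102_transistors = 28.3e9   # 3080 Ti / 3090 Ti class silicon

ratio = ga102_transistors / gp102_transistors
print(f"Transistor ratio: {ratio:.2f}x")  # ~2.40x
```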
Posted on Reply
#57
Sisyphus
xtreemchaosYou're missing the point, bud. GPU makers need to find a way of moving forward without pumping ever-increasing amounts of power into their products at a time when most folks are trying to save the planet. Even a 3070/80 uses far too much power and leaves wasted heat, because it's easy for them to make them that way; overclockers have used this method since the first chips. Now, this is how I feel. I'm not dissing anyone who uses said GPUs, I'm just pointing out that AMD/NVIDIA etc. don't see, or don't want to see, what's happening. As I write, London is burning.
GPU makers need to earn money. High-end gaming is a waste of resources anyway. Most folks don't try to save the planet, only a percentage of the western world, about 10% of the world's population. Industry migrates from the west to Asia; that's all the green agenda achieves. Lower wealth in the west then leads to bad emergency infrastructure. That's what is happening in London. Of course the responsible politicians blame the weather.
To the topic: buy a GPU that fits your needs and enjoy your games; there are many that give you a chance to save the world.
PS: I am using an undervolted, slightly underclocked RTX 2070 at around 120 W.
Posted on Reply
#58
HeadRusch1
This is one of those amazing stats on paper that makes you drool, but then you go back and realize your games probably already run at 60 fps... so unless you're hunting 4K at 240 Hz or trying to run Cyberpunk above 1440p at 60 fps, this card is just utter overkill for any game out today that dips into the past... it's a niche product requiring a significant uptick in power consumption and heat output. Alaskans, your card has arrived!

I know, this is an 'old argument', who needs more than 256 KB of RAM, right? But the thing is... the software market has dried up. Where is the killer app release that will showcase the raw power of these cards? It's not there. It's like trying to sell people on ray tracing by showcasing Minecraft running at 35 fps. It's just... out of step with software development.

Then when we get amazing ray tracing patches, only one in every ten games showcases a marked improvement in visuals... the rest can largely be categorized as "I guess it sorta looks a little better, if you're telling me it should"... and your framerate is cut in half. So then you have to render the game at resolutions we thought were amazing in 2009 and use black magic and math to scale up, which works to a certain degree. But then... if we can already do that, why bother with brute-forcing further?

It's the endless teeter-totter of software versus hardware development.

Can it run Crysis? Who'd want to, these days, when it's just a port of the 360 version, remastered...
Posted on Reply
#59
xtreemchaos
SisyphusGPU makers need to earn money
You are not getting the point, bud. I'd take a guess that you're not too old and you don't have kids. I am old and have kids and grandkids, and I want them to have a good life. I see no point in carrying this on anymore; we don't have anything in common. Good day.
Posted on Reply
#60
mechtech
That heatsink alone looks like it's worth $300 lol
Posted on Reply
#61
skates
Lots of words
Mean nothing new
Wait until Launch
Then go Poo.
TiggerMight give it a try later
I used to game on my 1080Ti on 4K ever since launch and have had no issues, maybe one game which didn't have 4K resolution, but nothing major. Yes, give it a try.
Posted on Reply
#62
LuxZg
HeadRusch1This is one of those amazing stats on paper that makes you drool, but then you go back and realize your games probably already run at 60fps....so unless you're hunting 4k at 244hz or ...
I know, this is an 'old argument', who needs more than 256kb of RAM, right? But the thing is.......the software market has dried up. Where is the killer app ...
Then when we get amazing Ray Tracing patches, only one in every 10 games showcases a marked improvement in visuals....
... Can it run Crysis? Who'd want to, these days, when its just a port of the 360 version remastered........
This comment is so true.

Here's a rant from an "old guy". I've been playing PC games since original Tetris and Prince of Persia days. So much so, I ended up working in IT when I grew up. I was also always salivating looking at high end hardware, but never really had money for it back then. Then as I actually got older I've learned a few things...

1) Yes, hardware overtook software years ago. First you were always hunting next gen just to run the OS properly (damn Windows ;D ); nowadays, except for games and a few niche/pro apps, EVERYTHING runs fine even on entry-level hardware. And well-made games fly on midrange hardware.
2) Hardware also overtook human abilities, e.g. 6" phones with 8K screens or whatever; "Retina" showed it years ago. Then, as you get past your prime, that 8K 144 Hz becomes just a gimmick, because my eyes aren't what they used to be, and my reflexes are far from what they used to be.
3) You realize that good games don't need eye candy and twitch reflexes. Recently Valheim showed that to the world: very mediocre graphics, very simple gameplay, multiplayer optional. But so addictive! Then I looked back at the games I spent the most time with, and it was the likes of Civ III, or shooters like CS that weren't about graphics or fps, just about good co-op and simple fun. Guess what, those are still top games today.
4) Playing fewer of the latest games saves lots of money! Be 2-3 years behind, and you get Gold/Legendary editions and bundles that include the DLCs, and you can play on "last gen" hardware, plus you get all the patching and optimizations.

So yeah, you buy high end to play games on release, and then you get a failure like Cyberpunk that is only now really ready. Think of all the money you'd save by targeting that game today instead of at release.

Sure if you're rolling in cash and have 20h a day just to play, then go for it, I would too :D
Posted on Reply
#63
Palladium
IMO the current breed of AAA games is so trite and boring that I'm not even interested in watching them on free streaming.
Posted on Reply
#64
Dr. Dro
LuxZgSure if you're rolling in cash and have 20h a day just to play, then go for it, I would too :D
This is my reasoning to spend a bit more generously on my rig, I spend quite a lot of time on it :)

But other than that, I agree!
PalladiumIMO the current breed of AAA games are so trite and boring to the point I'm not even interested at watching them on free streaming.
That reminds me, Ubisoft games have been trite and boring for as long as I can remember, yet they still sell so well :oops:

I just hope Square Enix gives Yoko Taro his big bag of money for another NieR game soon
Posted on Reply
#65
Blueberries
I'm predicting roughly 3070ti performance from the 4060 if this is true.

That would make it priced in line with Intel's Alchemist GPUs if MSRP stays in line with Ampere.
Posted on Reply
#66
PapaTaipei
LuxZgThis comment is so true. ... Sure if you're rolling in cash and have 20h a day just to play, then go for it, I would too :D
High end is needed only for a few competitive, very fast FPS games (Quake titles) where a very high framerate is essential. Don't forget 500 Hz displays are coming, and refresh rates might go higher in the future; we're probably very close to hitting LCD limits.
Posted on Reply
#67
Zareek
PapaTaipeiHigh end is needed only for a few competitive, very fast FPS games (Quake titles) where a very high framerate is essential. Don't forget 500 Hz displays are coming, and refresh rates might go higher in the future; we're probably very close to hitting LCD limits.
500 Hz? Forget about LCD limitations, how about human ones? I have a feeling most people aren't really capable of seeing a difference beyond 120 fps. We don't see in fps per se, but at some point we can no longer tell the difference from real life. I know LTT did a blind test with its staff and more or less proved that higher FPS pays off in twitch titles, but I'm guessing even then there are limitations, or at the very least diminishing returns.
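The diminishing returns are easy to see in frame times: each step up in refresh rate shaves fewer milliseconds off the frame interval than the last. A small illustration:

```python
# Frame time (ms) at a given refresh rate, and the saving from each step up.
rates = [60, 120, 240, 500]
frame_times = [1000.0 / r for r in rates]  # ms per frame

for (r1, t1), (r2, t2) in zip(
    zip(rates, frame_times), zip(rates[1:], frame_times[1:])
):
    print(f"{r1}->{r2} Hz: saves {t1 - t2:.2f} ms per frame")
# 60->120 Hz: saves 8.33 ms per frame
# 120->240 Hz: saves 4.17 ms per frame
# 240->500 Hz: saves 2.17 ms per frame
```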
Posted on Reply
#68
ratirt
Dr. DroThat reminds me, Ubisoft games have been trite and boring for as long as I can remember, yet they still sell so well :oops:

I just hope Square Enix gives Yoko Taro his big bag of money for another NieR game soon
I like Rayman :) Cool game.
Posted on Reply
#69
ZoneDymo
Dr. DroThat reminds me, Ubisoft games have been trite and boring for as long as I can remember, yet they still sell so well :oops:

I just hope Square Enix gives Yoko Taro his big bag of money for another NieR game soon
Actually things are going quite poorly for Ubisoft atm
Posted on Reply
#70
TheinsanegamerN
xtreemchaosYou're missing the point, bud. GPU makers need to find a way of moving forward without pumping ever-increasing amounts of power into their products at a time when most folks are trying to save the planet. Even a 3070/80 uses far too much power and leaves wasted heat, because it's easy for them to make them that way; overclockers have used this method since the first chips. Now, this is how I feel. I'm not dissing anyone who uses said GPUs, I'm just pointing out that AMD/NVIDIA etc. don't see, or don't want to see, what's happening. As I write, London is burning.
If the power draw from a 4090 is bad for planet saving don’t ever look at EV power draw, a single Tesla may shock you into a coma.
Posted on Reply
#72
Totally
....buuuut consumes 2x power and generates 3x more heat. Yeah, yeah, 66% sounds awesome yeah.
Posted on Reply
#73
ratirt
TheinsanegamerNIf the power draw from a 4090 is bad for planet saving don’t ever look at EV power draw, a single Tesla may shock you into a coma.
A Tesla is a car, a big machine that you can drive for miles on its battery. This is a graphics card: a chip with RAM on a PCB, and that is it.
It's like comparing a bicycle to a car.
Posted on Reply
#74
PapaTaipei
ratirtA Tesla is a car, a big machine that you can drive for miles on its battery. This is a graphics card: a chip with RAM on a PCB, and that is it.
It's like comparing a bicycle to a car.
What you're not getting from what he said is that there are things that use A LOT more power. Besides, EVs are extremely polluting.
Posted on Reply
#75
ratirt
PapaTaipeiWhat you're not getting from what he said is that there are things that use A LOT more power. Besides, EVs are extremely polluting.
I know what he said and I understand the premise, but I disagree with justifying the 4090's power usage by saying a Tesla (a freaking car) uses more power, which is basically what he did.
By that notion you could say a refrigerator uses more as well, or a washing machine; so what? Should we stop using those, or make graphics cards use 1,000 watts since other appliances use that much? There are boundaries here, which you have probably missed or have no respect for. I can't believe this is even a discussion, or a comparison of some sort.
Posted on Reply