Tuesday, July 19th 2022
NVIDIA RTX 4090 "Ada" Scores Over 19000 in Time Spy Extreme, 66% Faster Than RTX 3090 Ti
NVIDIA's next-generation GeForce RTX 4090 "Ada" flagship graphics card allegedly scores over 19000 points in the 3DMark Time Spy Extreme synthetic benchmark, according to kopite7kimi, a reliable source of NVIDIA leaks. This would put its score around 66 percent above that of the current RTX 3090 Ti flagship. The RTX 4090 is expected to be based on the 5 nm AD102 silicon, with a rumored CUDA core count of 16,384. The higher IPC of the new architecture, coupled with higher clock speeds and power limits, could be contributing to this feat. Time Spy Extreme is a traditional DirectX 12 raster-only benchmark, with no ray-traced elements. The Ada graphics architecture is expected to reduce the "cost" of ray tracing (versus raster-only rendering), although we're yet to see leaks of ray tracing performance.
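For context, here is a rough back-of-envelope check of what these figures imply, a sketch using only the numbers quoted above; the resulting RTX 3090 Ti score is a derived estimate, not a published 3DMark result.

```python
# Back-of-envelope check using only the leaked figures quoted above.
# The implied RTX 3090 Ti score is a derived estimate, not an official result.
rtx_4090_score = 19_000        # leaked Time Spy Extreme score (lower bound)
claimed_uplift = 0.66          # "around 66 percent above" the RTX 3090 Ti

implied_3090_ti_score = rtx_4090_score / (1 + claimed_uplift)
print(f"Implied RTX 3090 Ti score: ~{implied_3090_ti_score:,.0f}")  # ~11,446
```

That implied figure is roughly in the ballpark of typical RTX 3090 Ti results in this test, which is why the leak reads as plausible rather than wildly optimistic.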
Sources:
kopite7kimi (Twitter), VideoCardz
96 Comments
www.techpowerup.com/review/nvidia-geforce-rtx-2080-ti-founders-edition/33.html
www.techpowerup.com/review/nvidia-geforce-rtx-3080-ti-founders-edition/28.html
I have no idea where you got those percentages from.
Like I said in the past, a 2X uplift means it will be CPU limited even at 4K. It may need a testbed change from the 5800X to Zen 4 V-Cache (coming this year, according to rumors) to show its true colors.
Another thing I said in the past is that Ada Lovelace may ship with two TBP options, making comparisons with Ampere models more difficult; we will see.
To the topic: buy a GPU that fits your needs and enjoy your games; there are plenty that give you a chance to save the world.
PS: I am using an undervolted, slightly underclocked RTX 2070 at around 120 W.
I know, this is an 'old argument': who needs more than 256 KB of RAM, right? But the thing is, the software market has dried up. Where is the killer app release that will showcase the raw power of these cards? It's not there. It's like trying to sell people on ray tracing by showcasing MINECRAFT running at 35 fps. It's just out of step with software development.
Then when we do get amazing ray tracing patches, only one in every ten games shows a marked improvement in visuals; the rest can largely be categorized as "I guess yeah it sorta looks better, a little, if you're telling me it should." And your framerate is cut in half, so you have to render the game at resolutions we thought were amazing in 2009 and then use black magic and math to scale it back up. That works to a certain degree, but if we can already do that, why bother with brute-forcing further?
It's the endless teeter-totter of software versus hardware development.
Can it run Crysis? Who'd want to these days, when it's just a port of the 360 version, remastered...
Means nothing new
Wait until launch
Then go poo.
I've been gaming on my 1080 Ti at 4K ever since launch and have had no issues, apart from maybe one game that didn't offer a 4K resolution, but nothing major. Yes, give it a try.
Here's a rant from an "old guy". I've been playing PC games since the original Tetris and Prince of Persia days, so much so that I ended up working in IT when I grew up. I was also always salivating over high-end hardware, but never really had money for it back then. Then, as I actually got older, I learned a few things...
1) Yes, hardware overtook software years ago. You used to be hunting next-gen hardware just to run the OS properly (damn Windows ;D ); nowadays, except for games and a few niche/pro apps, EVERYTHING runs fine even on entry-level hardware. And well-made games fly on midrange hardware.
2) Hardware also overtook human abilities, e.g. 6" phones with 8K screens or whatever; "Retina" showed that years ago. And once you're no longer in your prime, 8K 144 Hz becomes just a gimmick, because my eyes aren't what they used to be, and neither are my reflexes.
3) You realize that good games don't need eye candy and twitch reflexes. Valheim recently showed that to the world: very mediocre graphics, very simple gameplay, multiplayer optional, but so addictive! Then I looked back and thought about the games I spent the most time with, and it was stuff like Civ III, or shooters like CS, which weren't about graphics or fps, just about good co-op and simple fun. Guess what, those are still top games today.
4) Playing fewer of the latest games saves lots of money! Be 2-3 years behind and you get Gold/Legendary editions and bundles that include the DLCs, you can play on "last gen" hardware, and you get all the patching and optimizations.
So yeah, buy high end to play games on release and you get failures like Cyberpunk, which is only now really ready. Think of all the money saved by targeting that game today instead of at release.
Sure, if you're rolling in cash and have 20 hours a day just to play, then go for it; I would too :D
But other than that, I agree! That reminds me, Ubisoft games have been trite and boring for as long as I can remember, yet they still sell so well :oops:
I just hope Square Enix gives Yoko Taro his big bag of money for another NieR game soon
That would make it priced in line with Intel's Alchemist GPUs if MSRP stays in line with Ampere.
It's like comparing a bicycle to a car.
By that notion you could say a refrigerator or a washing machine uses more as well; so what? Should we stop using those, or make graphics cards use 1,000 W because other appliances use that much? There are boundaries here which you have probably missed or have no respect for. I can't believe this is even a discussion, or a comparison of any sort.