Tuesday, July 19th 2022
NVIDIA RTX 4090 "Ada" Scores Over 19000 in Time Spy Extreme, 66% Faster Than RTX 3090 Ti
NVIDIA's next-generation GeForce RTX 4090 "Ada" flagship graphics card allegedly scores over 19000 points in the 3DMark Time Spy Extreme synthetic benchmark, according to kopite7kimi, a reliable source with NVIDIA leaks. This would put its score around 66 percent above that of the current RTX 3090 Ti flagship. The RTX 4090 is expected to be based on the 5 nm AD102 silicon, with a rumored CUDA core count of 16,384. The higher IPC of the new architecture, coupled with higher clock speeds and power limits, could be contributing to this feat. Time Spy Extreme is a traditional DirectX 12 raster-only benchmark, with no ray-traced elements. The Ada graphics architecture is expected to reduce the "cost" of ray tracing (versus raster-only rendering), although we've yet to see leaks of RTX performance.
Sources:
kopite7kimi (Twitter), VideoCardz
At least they could have said how they mine :cool:
What I'd love to see on w7 x64? A Linux video driver in a Win 7 wrapper...
www.statista.com/statistics/993868/worldwide-windows-operating-system-market-share/
At gs.statcounter.com Win7 had 13.03% as of May 2022, that had dropped to 10.96% in June 2022 which is the same share as Win11...
gs.statcounter.com/os-version-market-share/windows/desktop/worldwide/
That's hardly 99.99%... More like 84% or so.
If you still insist on using Windows 7, being fully aware of the fact that it is insecure, obsolete, unsupported and unmaintained, buy a GTX 10 series or older card and stick to old driver branches such as r470 that still run on that OS. You don't need anything newer or anything faster, the operating system does not support any technology provided by an RTX 20 series or newer GPU whatsoever.
This is next-gen hardware for next-gen computers, it wasn't at all intended for anyone running a vintage, EOL operating system. 13 years later, the world has moved on, and it will not stop to pander to your own selfish desires simply because you don't like change.
Edit: I see a lot of hate for people buying these things and enjoying them. So what if my rig uses 600W+ when I game? I don't travel, I don't drink, I don't go camping and burn trees... this is my one vice, and while it is wasteful I make concessions for the environment in every other part of my life. How does the saying/passage go again? "Let he who is without sin cast the first stone."
So if the 4090 is such overkill (for the average user), one can be just fine with a 4050-level GPU for day-to-day content creation and some medium-light gaming.
It's somewhat like the new PCIe Gen 4/5/6 standards, which aren't all that relevant at x16 but make a big difference at x4 (or even x1 in some situations).
Just need to wait for prices to get back to reality (and today's MSRP isn't it)...
Synced FPS and refresh rate really helps, and constantly pushing the display's max refresh rate to avoid screen tearing is no longer necessary.
Just undervolt for the best reduced heat / reduced voltage / slight performance % decrease to your liking and call it a day.
I have a Kill-A-Watt meter and just watched the wattage reading during the same benchmark tests on each build. Obviously I cannot measure and do not know how bad the transient spikes are. I have a 650w gold PSU in my 3070ti build and a platinum 750w PSU in my 3080 12GB build ... so this might skew the 3080 build's numbers a bit in its favor, but ...
The build with the 3070 Ti undervolted to 0.875 V GPU core (@1860 MHz constant) draws roughly the same amount of power at the wall as the 3080 12GB build undervolted to 0.8 V GPU core (@1725 MHz constant) ... while the 3080 12GB build is 26% - 29% faster in these benchmarks, and quieter too. To me, that's a win. Just pair with a VRR display.
The wattage at the wall stayed under 400 W about 95% of the time, with only a few brief spikes into 400 W territory.
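For illustration, the efficiency comparison above can be sketched with a few numbers: the roughly-equal wall draw and the ~26-29% performance delta come from the figures quoted, while the benchmark score and the 380 W wall figure are hypothetical placeholders.

```python
# Rough performance-per-watt comparison between two undervolted builds.
# The benchmark score and 380 W wall draw are hypothetical; the ~27%
# performance delta follows the quoted 26-29% range.
def perf_per_watt(score: float, watts: float) -> float:
    """Benchmark score divided by wall power draw."""
    return score / watts

score_3070ti = 10_000                   # hypothetical benchmark score
score_3080_12gb = score_3070ti * 1.27   # ~27% faster, per the quoted range
wall_watts = 380                        # both builds draw about the same at the wall

ppw_3070ti = perf_per_watt(score_3070ti, wall_watts)
ppw_3080 = perf_per_watt(score_3080_12gb, wall_watts)

# At equal wall power, perf/W improves by the same ~27% as raw performance.
improvement = ppw_3080 / ppw_3070ti - 1
print(f"perf/W improvement: {improvement:.0%}")
```

In other words, at matched wall power the faster card's entire performance lead translates directly into efficiency, which is why the 3080 12GB build comes out ahead despite similar consumption.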
I'll buy a 4XXX series card for my basement PC once I see the performance numbers, power consumption, and price ... and undervolt it too.
And one might be limited to 1 or 2 video cards per power supply...