Tuesday, July 19th 2022

NVIDIA RTX 4090 "Ada" Scores Over 19000 in Time Spy Extreme, 66% Faster Than RTX 3090 Ti

NVIDIA's next-generation GeForce RTX 4090 "Ada" flagship graphics card allegedly scores over 19000 points in the 3DMark Time Spy Extreme synthetic benchmark, according to kopite7kimi, a reliable source for NVIDIA leaks. This would put its score around 66 percent above that of the current RTX 3090 Ti flagship. The RTX 4090 is expected to be based on the 5 nm AD102 silicon, with a rumored CUDA core count of 16,384. The higher IPC from the new architecture, coupled with higher clock speeds and power limits, could be contributing to this feat. Time Spy Extreme is a traditional DirectX 12 raster-only benchmark, with no ray-traced elements. The Ada graphics architecture is expected to reduce the "cost" of ray tracing (versus raster-only rendering), although we're yet to see leaks of its ray tracing performance.
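For context, a stock RTX 3090 Ti typically lands in the region of 11,000-11,500 points in this test, so a roughly 66 percent uplift would indeed work out to about 19,000 (11,400 × 1.66 ≈ 18,900); treat these figures as a rough back-of-envelope estimate rather than leaked numbers.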
Sources: kopite7kimi (Twitter), VideoCardz

96 Comments on NVIDIA RTX 4090 "Ada" Scores Over 19000 in Time Spy Extreme, 66% Faster Than RTX 3090 Ti

#76
PapaTaipei
ratirtI know what he said and I do understand the premise, but I disagree with justifying the 3090's power usage by saying a Tesla (a freaking car) uses more power, which is basically what he did.
By that notion you could say a refrigerator or a washing machine uses more as well, so what? Should we stop using those, or make graphics cards use 1000 W since other appliances use that much? There are boundaries here which you have probably missed or have no respect for. I can't believe this is even a discussion or a comparison of some sort.
I have no opinion on whether or not we should have regulation for that. But I do know there ALREADY ARE regulations for how much power a vacuum cleaner or a refrigerator can use, at least in the EU.
Posted on Reply
#77
ratirt
PapaTaipeiI have no opinion on whether or not we should have regulation for that. But I do know there ALREADY ARE regulations for how much power a vacuum cleaner or a refrigerator can use, at least in the EU.
Obviously, and I don't see those regulations as a reason to suggest we should have a regulation for graphics cards. You are missing the point I'm trying to make here. Justifying high power usage for a graphics card by comparing it to an electric car that uses way more is ridiculous. The size and usage of the two devices vary so much that it's hard to comprehend someone making such a comparison to justify anything. You can argue whether cars, fridges, or any other devices should use that much power, but that's a different subject.
Posted on Reply
#78
ThrashZone
Hi,
At least they could have said how they mine :cool:
Posted on Reply
#79
ratirt
ThrashZoneHi,
At least they could have said how they mine :cool:
I think you know how ;)
Posted on Reply
#80
zoom314
Well, a 40x0 card sounds nice and all, but it's kinda useless without a driver. Not everyone has upgraded to Win10, or soon Win11 and then 12. First Microshaft abandons the OS, then Nvidia follows suit; of course, Win10 users will soon be joining 7, 8, and 8.1. The highest card supported by Win7 x64 is the 3090 Ti under 473.30. Linux now has open-source drivers; I'm using 472.12, which powers my 1660 Ti's.

What I'd love to see on w7 x64? A Linux video driver in a Win 7 wrapper...
Posted on Reply
#81
Dr. Dro
zoom314Well, a 40x0 card sounds nice and all, but it's kinda useless without a driver. Not everyone has upgraded to Win10, or soon Win11 and then 12. First Microshaft abandons the OS, then Nvidia follows suit; of course, Win10 users will soon be joining 7, 8, and 8.1. The highest card supported by Win7 x64 is the 3090 Ti under 473.30. Linux now has open-source drivers; I'm using 472.12, which powers my 1660 Ti's.

What I'd love to see on w7 x64? A Linux video driver in a Win 7 wrapper...
Do you realize that this is very much your problem and that 99.99% of people aren't holdouts on 13-year-old operating systems?
Posted on Reply
#82
zoom314
Dr. DroDo you realize that this is very much your problem and that 99.99% of people aren't holdouts on 13-year-old operating systems?
At statista.com, 12.91% still used Win7 as of Nov 2021...
www.statista.com/statistics/993868/worldwide-windows-operating-system-market-share/
At gs.statcounter.com, Win7 had 13.03% as of May 2022; that had dropped to 10.96% by June 2022, which is the same share as Win11...
gs.statcounter.com/os-version-market-share/windows/desktop/worldwide/

That's hardly 99.99%... More like 84% or so.
Posted on Reply
#83
Dr. Dro
zoom314At statista.com, 12.91% still used Win7 as of Nov 2021...
www.statista.com/statistics/993868/worldwide-windows-operating-system-market-share/
At gs.statcounter.com, Win7 had 13.03% as of May 2022; that had dropped to 10.96% by June 2022, which is the same share as Win11...
gs.statcounter.com/os-version-market-share/windows/desktop/worldwide/

That's hardly 99.99%... More like 84% or so.
What a bore. Windows 7 diehards always use the exact same argument in bad faith; you just forgot to account for the obvious. These are by and large old machines installed in businesses, libraries and other places with budgeting limitations that do not allow them to upgrade, or machines that must remain on Windows 7 due to a very specific requirement. In the former case, any good security counsel would suggest such systems be switched to Linux; in the latter, they should long since have been taken offline. You're already two and a half years past the EOL date, which was announced half a decade before that.

If you still insist on using Windows 7, being fully aware of the fact that it is insecure, obsolete, unsupported and unmaintained, buy a GTX 10 series or older card and stick to old driver branches such as r470 that still run on that OS. You don't need anything newer or anything faster; the operating system does not support any technology provided by an RTX 20 series or newer GPU whatsoever.

This is next-gen hardware for next-gen computers, it wasn't at all intended for anyone running a vintage, EOL operating system. 13 years later, the world has moved on, and it will not stop to pander to your own selfish desires simply because you don't like change.
Posted on Reply
#84
Darller
I'll buy it. MSFS 2020 is no joke, and neither is running games at native resolution on the reverb G2. I'll take all the GPU I can get.

Edit: I see a lot of hate for people buying these things and enjoying them. So what if my rig uses 600W+ when I game? I don't travel, I don't drink, I don't go camping and burn trees... this is my one vice, and while it is wasteful I make concessions for the environment in every other part of my life. How does the saying/passage go again? "Let he who is without sin cast the first stone."
Posted on Reply
#85
Dirt Chip
I like those kinds of behemoth GPUs, not because I'm buying one but because they pull the rest of the pack along with them (pack = lower-tier GPUs).
So when the 4090 is so overly overkill (for the normal guy), one can be just fine with a 4050-level GPU for day-to-day content creation and some medium-light gaming.
It is somewhat like all the new PCI-E Gen 4/5/6... which aren't relevant as x16, but do a lot as x4 (or even x1 in some situations).
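(As a rough back-of-envelope, each PCI-E generation doubles per-lane bandwidth, so Gen 5 x4 delivers roughly the same ~16 GB/s per direction as Gen 3 x16.)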

Just need to wait for the price to get back to reality (and today's MSRP isn't it)...
Posted on Reply
#86
Dr. Dro
DarllerI'll buy it. MSFS 2020 is no joke, and neither is running games at native resolution on the reverb G2. I'll take all the GPU I can get.

Edit: I see a lot of hate for people buying these things and enjoying them. So what if my rig uses 600W+ when I game? I don't travel, I don't drink, I don't go camping and burn trees... this is my one vice, and while it is wasteful I make concessions for the environment in every other part of my life. How does the saying/passage go again? "Let he who is without sin cast the first stone."
I want one, but AMD's competitor. Full-configuration Navi 31 sounds like a great gaming card to retire my soon-to-be two-year-old 3090.
Posted on Reply
#87
CyberCT
The other thing to consider is variable refresh rates. These days, there are plenty of Gsync and Freesync monitors out there. Heck, even plenty of Freesync TVs are now out there with HDMI 2.1. I have a 2021 Samsung Neo Gsync TV (why the heck don't they advertise it as Gsync? They admit it is Gsync but don't even say it in their marketing materials or website). It's connected to my 3080 12G GPU and I play games w/ tweaked settings, if necessary, to allow frame rates between 70 - 120fps. Honestly, with the very smooth frame transition with no screen tearing, it's hard to notice a fps difference in that range. Totally worth the sale price of last year's model.

The sync'd FPS and refresh rate really help, and there's no longer any need to constantly push the display's max refresh rate to avoid screen tearing.

Just undervolt for whatever balance of reduced heat, reduced voltage, and slight performance decrease suits you, and call it a day.
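
For anyone wanting to script something similar, here's a minimal sketch, assuming the nvidia-ml-py (pynvml) bindings are installed. It isn't a true voltage/frequency-curve undervolt (that's typically done in a tool like MSI Afterburner); it just caps board power and pins the core clock via NVML as a rough stand-in, and it generally needs admin/root rights.

# Rough sketch, assuming the nvidia-ml-py (pynvml) package is available.
# Caps board power and pins clocks via NVML instead of editing the V/F curve.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# Default power limit is reported in milliwatts.
default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(gpu)
print(f"default power limit: {default_mw / 1000:.0f} W")

# Cap board power to ~70% of the default limit (requires elevated privileges).
pynvml.nvmlDeviceSetPowerManagementLimit(gpu, int(default_mw * 0.7))

# Pin the GPU core clock to a fixed value, e.g. 1725 MHz (min == max locks it).
pynvml.nvmlDeviceSetGpuLockedClocks(gpu, 1725, 1725)

pynvml.nvmlShutdown()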
Posted on Reply
#89
zoom314
There won't be an RTX 6800... ;)
Posted on Reply
#91
Darller
Dr. DroI want one, but AMD's competitor. Full-configuration Navi 31 sounds like a great gaming card to retire my soon-to-be two-year-old 3090.
I'm kinda stuck with Nvidia because my main monitor is GSync; all three of my previous primary screens have been. My side monitors are FreeSync and their variable refresh performance is inferior, though that probably says more about the panels' limited range (42-75 Hz) than about FreeSync itself. That said, I haven't found ANY FreeSync monitor's VRR performance to reach GSync levels of smoothness or consistency. That was true of the first ones I used, and it remains true with the panels I purchased this spring.
CyberCTIt's connected to my 3080 12G GPU and I play games w/ tweaked settings, if necessary, to allow frame rates between 70 - 120fps. Honestly, with the very smooth frame transition with no screen tearing, it's hard to notice a fps difference in that range.
I completely agree, though my sensitivity range is more like 82 and above. I still notice a very slight stutter effect below 82Hz, and anything below 70Hz is borderline unplayable now. This likely explains my aversion to the FreeSync displays I've used.
Posted on Reply
#92
CyberCT
Over the last month I did some benchmarks on both my newer builds (both 11900K, limited to boost to 4.8 GHz at 1.19 or 1.2 Vcore max). One has a 3070 Ti, one has a 3080 12GB. Both have an M.2 SSD and a SATA SSD. The 3080 build has an additional 3 fans.

I have a Kill-A-Watt meter and just watched the wattage reading during the same benchmark tests on each build. Obviously I cannot measure and do not know how bad the transient spikes are. I have a 650w gold PSU in my 3070ti build and a platinum 750w PSU in my 3080 12GB build ... so this might skew the 3080 build's numbers a bit in its favor, but ...

The build with the 3070 Ti undervolted to 0.875 V GPU core (@1860 MHz constant) draws roughly the same amount of power at the wall as the 3080 12GB build undervolted to 0.8 V GPU core (@1725 MHz constant) ... while the 3080 12GB build is 26% - 29% faster in these benchmarks and quieter too. To me, that's a win. Just pair it with a VRR display.

The wattage at the wall stayed under 400 W about 95% of the time, other than a few spikes into 400 W territory.

I'll buy a 4XXX series card for my basement PC once I see the performance numbers, power consumption, and price ... and undervolt it too.
Posted on Reply
#94
zoom314
kiss4lunaI'm buying the 4080
Of course, after the 3090 Ti you'll need a compatible PSU... from what I've read, at least.

And one might be limited to 1 or 2 video cards per power supply...
Posted on Reply
#95
pavle
GFFX 4090 66% over 3090Ti? That's more than half-way decent. One third is reserved for overclockers. :)
Posted on Reply