Wednesday, August 22nd 2018

NVIDIA Releases First Internal Performance Benchmarks for RTX 2080 Graphics Card

NVIDIA today released their first official performance numbers for their new generation of GeForce products - particularly, the RTX 2080. The RTX 20 series of graphics cards, according to the company, offers around 50% performance improvements (on average) from architectural improvements alone, on a per-core basis. This number is then built upon with the added performance of the new Tensor cores, which allows the RTX 2080 to extend its performance advantage over the last-generation GTX 1080 to up to 2x - when using the new DLSS technology. PUBG, Shadow of the Tomb Raider, and Final Fantasy XV see around 75 percent or greater performance improvements when using this tech.

NVIDIA is also touting the newfound ability to run games at 4K resolution at over 60 FPS, making the RTX 2080 the card to get if that's your preferred resolution (especially if paired with one of those dazzling OLED TVs...). Of course, image quality (IQ) settings aren't revealed in the slides, so there's an important piece of the puzzle still missing. But considering NVIDIA's performance claims, and comparing against the performance achievable on last-generation hardware, it's fair to say that these FPS scores refer to the high or highest IQ settings for each game.

107 Comments on NVIDIA Releases First Internal Performance Benchmarks for RTX 2080 Graphics Card

#27
phill
Unimpressed with the price, that's enough to really put me off.. Maybe it's due to mining that the prices are so high, or is it because they've gone from GTX to RTX?? Maybe the bosses who thought that up needed a bigger bonus or something..

Either way, I'll stick with my first word of the post, unimpressed....
Posted on Reply
#29
therealmeep
Durvelle27300W+ under load I bet throw in overclocking and you might scratch 350-400W
Puts it about in line with a decent 1080 Ti, I'd be willing to bet that the 2080 Ti just edges out the Titan Xp Service Pack 3 or whatever we're on at this point. Pre-launch, it seems like RTX is setting up for failure and becoming cards people say aren't gaming oriented. I personally am incredibly interested in the fact that the cards come with Tensor cores in them, and ignoring the ray tracing crap, I think the 2070 and 2080 may be go-to cards for AI hobbyists and devs in the future.
Posted on Reply
#30
Naito
Let's wait for some figures regarding 2080 vs 1080 without all the Tensor/RT features. This more 'raw' comparison is what will really show the performance of the new generation, not the cherry-picked FPS scores which show off the tech in the new cards. Looking at the FLOPS of the 20 series, they're not far beyond the 10 series, so even with the boosted memory bandwidth, we may see only 20% to 30% gains over the outgoing generation.

GTX 1080 8227.8 GFLOPS vs RTX 2080 8920.3 GFLOPS at base clocks.
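
(Those figures line up with the usual CUDA cores × 2 FLOPs per clock × base clock estimate - a quick Python sketch below, using the published core counts and base clocks, just to show where the numbers come from.)

```python
# Rough FP32 throughput: CUDA cores x 2 FLOPs per clock (FMA) x base clock.
# Core counts and base clocks are the published specs, used only for illustration.
def fp32_gflops(cuda_cores, base_clock_mhz):
    return cuda_cores * 2 * base_clock_mhz / 1000.0

print(f"GTX 1080: {fp32_gflops(2560, 1607):.1f} GFLOPS")  # ~8227.8
print(f"RTX 2080: {fp32_gflops(2944, 1515):.1f} GFLOPS")  # ~8920.3
```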
Posted on Reply
#31
Fluffmeister
Prima.VeraWth is DLSS ????????????
It stands for Deep Learning Super Sampling, it takes the idea of a simple graph and tries to understand why some people don't get it.

Joking aside, it's a new AA technique that uses those shiny new tensor cores to create a better image, how it pans out in practice remains to be seen.
Posted on Reply
#32
Prima.Vera
FluffmeisterIt stands for Deep Learning Super Sampling, it takes the idea of a simple graph and tries to understand why some people don't get it.

Joking aside, it's a new AA technique that uses those shiny new tensor cores to create a better image, how it pans out in practice remains to be seen.
Thanks for the update. I'm still confused on how you gain that huge performance increase by enabling that??
Posted on Reply
#33
btarunr
Editor & Senior Moderator
Not looking good. Focus on the dark green bars (minus DLSS, why do you need AA at 4K?). Consider that not a lot of these are current AAA titles, and that you're comparing a 215W TDP chip to a 180W TDP chip (GTX 1080), or 19% higher TDP right off the bat. Certainly doesn't warrant these prices.
Posted on Reply
#35
btarunr
Editor & Senior Moderator
BTW, why would anyone need any kind of AA on 4K displays unless they're >40-inch TVs?
Posted on Reply
#36
Parn
btarunrNot looking good. Focus on the dark green bars (minus DLSS). Consider that not a lot of these are current AAA titles, and that you're comparing a 215W TDP chip to a 180W TDP chip (GTX 1080), or 19% higher TDP right off the bat. Certainly doesn't warrant these prices.
Exactly my thought.

Cherry-picked titles with the shiny DLSS bar on top of the actual dark green bar to divert public attention... 19% higher TDP, 40-50% higher MSRP for an average 30% (dark green bar only) performance increase, not worth it.
Prima.VeraThanks for the update. I'm still confused on how you gain that huge performance increase by enabling that??
As far as I understand it (very basic), DLSS (a type of AA) involves heavy matrix operations, and these Tensor cores are specialised in running them as fast as possible.
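
If it helps to picture it, the primitive those Tensor cores accelerate is a small fused multiply-accumulate on matrix tiles (FP16 inputs, FP32 accumulation). A toy numpy sketch of that operation - not actual DLSS code, just the kind of math involved:

```python
import numpy as np

# D = A x B + C on a small 4x4 tile: FP16 inputs, FP32 accumulation.
# This mimics the kind of operation a Tensor core performs in hardware.
A = np.random.rand(4, 4).astype(np.float16)
B = np.random.rand(4, 4).astype(np.float16)
C = np.zeros((4, 4), dtype=np.float32)

D = A.astype(np.float32) @ B.astype(np.float32) + C
print(D)
```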
Posted on Reply
#37
hat
Enthusiast
btarunrBTW, why would anyone need any kind of AA on 4K displays unless they're >40-inch TVs?
No clue. I haven't used AA in a long while. Not at 1080p on a 22" screen, not at 1440x900 on this 19" tv, not at 1680x1050 on a 20" screen. In fact the only time I've thought something could use some AA is when playing PS2 games... on an actual PS2. In PCSX2, just upping the resolution (even 3x native, which is close to 1920x1080) looks really good. Some of the lines on the actual PS2 though kinda look like, well, this

Posted on Reply
#38
ViperXTR
Prima.VeraWth is DLSS ????????????
Imagine a hardware part dedicated to super sampling (I think this was being imagined many years ago, like free 4x AA with no performance hit), but like they say, it has yet to be seen in action, and how much it affects the 2080 Ti, 2080 and 2070.

Them Tensor cores are deep learning AI processors, and they try to predict the ideal smooth image per frame using their knowledge and algorithm (probably set up by the devs of the game), so it's like they're trying to fill in the pixels and eliminate jaggies based on what they understand to look smooth.
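
Something like this, conceptually - a toy stand-in where a dumb 2x pixel-repeat plays the role of the trained network (real DLSS would predict the missing detail instead of just copying pixels):

```python
import numpy as np

def toy_upscale(frame):
    # Repeat each pixel 2x2. A trained DLSS-style network would instead
    # infer the extra detail per frame from what it learned in training.
    return frame.repeat(2, axis=0).repeat(2, axis=1)

low_res = np.random.rand(540, 960, 3).astype(np.float32)  # hypothetical 960x540 render
high_res = toy_upscale(low_res)                            # -> 1920x1080
print(high_res.shape)  # (1080, 1920, 3)
```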
Posted on Reply
#39
Prima.Vera
At higher resolutions, SMAA injected with SweetFX is the best AA tech ever.
Posted on Reply
#40
rtwjunkie
PC Gaming Enthusiast
The PR releases and the crap slides combined remind me of AMD publicity, complete with everyone’s reactions! :laugh:
Posted on Reply
#41
krykry
These "benchmarking results" look very shady to me. There's no mention of what settings are used, there's no mention what the baseline on the graphs is... and the titles themselves look incredibly dodgy to me. They're either "HDR-tested" whereas we know Nvidia GPUs can have trouble with HDR and performance losses, are games or applications that aren't out yet, or have unclear testing methodology (like dx11 or dx12 was used for hitman?), Ark Survival is on the graph on the left, but we don't have FPS numbers for the game which we could compare with existing 10xx line benchmarks. We don't have PUBG numbers either. Not a single reliable information.

I have no doubt it'll be better, since I doubt the performance regressed, but these "internal benchmarks" are all fishy to me.
Posted on Reply
#42
ViperXTR
All internal benchmarks are fishy, just wait a month before them reviewers get their hands on it, or maybe some leaks
Posted on Reply
#43
Prima.Vera
Considering the cards' prices, if the 2070 is not faster than, or at least at the same perf level as, the 1080 Ti, I will consider those cards to be a complete failure from nVidia.
Posted on Reply
#44
xorbe
Prima.VeraThanks for the update. I'm still confused on how you gain that huge performance increase by enabling that??
I don't see that anyone took a stab at answering that. If I understand correctly, the benchmarks were run with some unspecified level of AA, and that AA was then replaced by DLSS running on the "other" cores, freeing up previously used resources to render more frames.

So, the benchmarks were run at probably a very high/expensive level of AA + 4K, something you wouldn't normally ask a 1080FE to do ...
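
Back-of-the-envelope with completely made-up numbers, just to show the mechanism:

```python
# Hypothetical per-frame costs in milliseconds (made up for illustration):
base_render_ms = 20.0  # shader time per frame without AA
heavy_aa_ms    = 8.0   # extra shader cost of whatever AA the slides used
dlss_cost_ms   = 0.5   # residual cost once the AA work moves to the Tensor cores

fps_old = 1000 / (base_render_ms + heavy_aa_ms)   # ~35.7 FPS
fps_new = 1000 / (base_render_ms + dlss_cost_ms)  # ~48.8 FPS
print(f"{fps_old:.1f} -> {fps_new:.1f} FPS (+{(fps_new / fps_old - 1) * 100:.0f}%)")
```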
Posted on Reply
#45
ViperXTR
Prima.VeraConsidering the card's prices, if the 2070 is not faster or at least same perf levels as the 1080Ti, I will consider those cards to be a complete failure from nVidia.
Yeah, x70 cards are normally faster than the previous high end in raw performance, but they are saying that the 2070 is faster than the ol' Titan in RTX or other metrics, rather than raw performance
Posted on Reply
#46
TheGuruStud
xorbeI don't see that anyone took a stab at answering that. If I understand correctly, the benchmarks were run with some unspecified level of AA, and that AA was then replaced by DLSS running on the "other" cores, freeing up previously used resources to render more frames.

So, the benchmarks were run at probably a very high/expensive level of AA + 4K, something you wouldn't normally ask a 1080FE to do ...
8x MSAA then blur fest for 20 series, sounds about right for nvidia marketing
Posted on Reply
#47
Fatalfury
should have released these slides on Day 1 itself...
there wouldn't have been so much negativity/uncertainty then...
Posted on Reply
#48
Minus Infinity
Why are they comparing the 2080 to the 1080, when the 2080 is essentially the 1080 Ti replacement and the 2080 Ti is the Titan replacement? They should be comparing the 2070 to the 1080. Sure, it'll still be better, but nowhere near as much as they are claiming. Cynical attempt to fool people with absurdly priced cards with features most people don't need. Who the hell needs Tensor cores for gaming? That should be a Ti and Quadro feature only. Even the ray tracing is a gimmick at this stage and should also be exclusive to the Ti. The lower models could have been much smaller chips without RTX and Tensor cores, and much cheaper. GTX 2080 and 2070 to directly replace the 1080 and 1070, and RTX 2080 Ti replaces the 1080 Ti and Titan. AMD can make a killing if they get the 7nm update pricing and performance right.
Posted on Reply
#49
Naito
ViperXTRThem tensor cores are deep learning AI processors and it tries to predict the ideal smooth image, per frame using its knowledge and algorithm (probably setup by the devs of the game), so its like its trying to fill up the pixels and eliminate jaggies based on what it understands what it looks smooth
From what I understood from the reveal, Nvidia will process each game in their labs for the deep learning/AI training. The resulting DLSS profile for each title will then be delivered through driver updates. I could be wrong, though.
Posted on Reply
#50
ToxicTaZ
2020
Nvidia Ampere is 7nm (3000 Series) Second Generation Ray Tracing.

Vs Intel Arctic Sound
Vs AMD Navi

Unfortunately no competition 2018 and 2019

Vega 20 at best won't match 2080 Ti performance by any means.

Nvidia has a market monopoly till 2020.
Posted on Reply