Wednesday, October 5th 2022

NVIDIA Shares RTX 4090 Overwatch 2 Performance Numbers

Rather than another leak of performance numbers for NVIDIA's upcoming RTX 4090 and RTX 4080 cards, this time the company itself has shared some Overwatch 2 performance numbers for the upcoming cards. According to NVIDIA, Blizzard had to raise the framerate cap in Overwatch 2 to 600 FPS, as the new NVIDIA cards were simply too fast for the previous 400 FPS cap. NVIDIA also claims that with Reflex enabled, system latency is reduced by up to 60 percent.

As for the performance numbers, the RTX 4090 managed 507 FPS at 1440p in Overwatch 2 at the Ultra graphics quality setting, using an Intel Core i9-12900K CPU. The RTX 4080 16 GB manages 368 FPS, with the RTX 4080 12 GB coming in at 296 FPS. The comparison to the previous-generation cards seems a bit unfair, as the highest-end comparison we get is an unspecified RTX 3080, which managed to push 249 FPS, followed by an RTX 3070 at 195 FPS and an RTX 3060 at 122 FPS. NVIDIA recommends an RTX 4080 16 GB if you want to play Overwatch 2 at 1440p 360 FPS+, and an RTX 3080 Ti at 1080p 360 FPS+.
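For context on what those framerates mean in frame-time terms, here is a quick conversion; a minimal sketch that uses only the vendor-supplied figures quoted above:

```python
# Convert NVIDIA's quoted Overwatch 2 framerates into per-frame render times.
# All FPS figures are the vendor-supplied numbers from the chart above.

quoted_fps = {
    "RTX 4090": 507,
    "RTX 4080 16 GB": 368,
    "RTX 4080 12 GB": 296,
    "RTX 3080": 249,
    "RTX 3070": 195,
    "RTX 3060": 122,
}

for card, fps in quoted_fps.items():
    print(f"{card:<15} {fps:>3} FPS = {1000 / fps:.2f} ms per frame")

# The RTX 4090's 507 FPS works out to roughly 2 ms per frame,
# versus about 4 ms for the quoted RTX 3080 result.
```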
Source: NVIDIA

98 Comments on NVIDIA Shares RTX 4090 Overwatch 2 Performance Numbers

#51
ratirt
the54thvoid: It's an easy enough study to conduct. Gather a whole bunch of participants subjected to 100, 200, 300, 400, 500Hz images without prior knowledge of what FPS they are viewing. Gather results of subjective experience and you have a population sample of sensitivity to high refresh imagery.
I've seen a similar study conducted by one of the YT channels. Most of the participants were considered gamers, and they could only guess at what refresh rate they had been playing; the range was from 30 to 144. It was also said in the article that even if you can see the difference, it will most likely do nothing for your playing ability.
Posted on Reply
#52
N/A
The inclusion of the 3080 Ti would indicate that the 4080 12 GB is the 4070. But the 192-bit bus really makes it a 4060, so the 4050 becomes the 4060 instead. It scares me to think it will cost what I had prepared for the 4070; Jensen is one hell of a player.
Posted on Reply
#53
TheLostSwede
News Editor
N/A: The inclusion of the 3080 Ti would indicate that the 4080 12 GB is the 4070. But the 192-bit bus really makes it a 4060, so the 4050 becomes the 4060 instead. It scares me to think it will cost what I had prepared for the 4070; Jensen is one hell of a player.
Nvidia didn't specify which 3080 they used, so it could be a 10 GB card for all we know.
Posted on Reply
#54
watzupken
I won't count on one game's results to conclude how "good" the RTX 4000 series will be. There is no doubt that we should see a good bump in performance, but we also know that performance doesn't scale proportionately with increased CUDA cores and/or clock speed. A 60 to 70% bump in performance is likely the best-case scenario for a generational improvement. Those 2 to 4x performance improvements are just fluff, and only a result of DLSS 3. If the numbers are so good, why not start with an apples-to-apples comparison of rasterization performance before going into the fluff?
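As a rough illustration of why spec-sheet scaling overstates real gains, here is a back-of-the-envelope sketch; the core counts and boost clocks are commonly cited spec figures used as assumptions, not measurements:

```python
# Back-of-the-envelope: theoretical shader throughput vs. a realistic uplift.
# Core counts and boost clocks are commonly cited spec figures (assumptions).

cards = {
    "RTX 3090": {"cuda_cores": 10496, "boost_ghz": 1.70},
    "RTX 4090": {"cuda_cores": 16384, "boost_ghz": 2.52},
}

def naive_throughput(card: dict) -> float:
    """FP32 throughput proxy: cores x clock (ignores IPC, memory, occupancy)."""
    return card["cuda_cores"] * card["boost_ghz"]

ratio = naive_throughput(cards["RTX 4090"]) / naive_throughput(cards["RTX 3090"])
print(f"Naive cores-x-clock ratio: {ratio:.2f}x")  # ~2.3x on paper

# A realistic generational uplift in rasterized games (per the comment above)
# is more like 1.6-1.7x, because real performance is also bound by memory
# bandwidth, caches, CPU limits and how well the wider GPU is kept busy.
```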
Posted on Reply
#56
Chomiq
sickofnormies: So many uninformed people just jumping on the bandwagon and spreading fake news. Please fact-check yourself before posting.

Overwatch and Overwatch 2 do not support DLSS. All of those fps gains are rasterized, without AI image upscaling.

www.nvidia.com/en-us/geforce/news/nvidia-rtx-games-engines-apps/
By Andrew Burnes on May 31, 2021
Posted on Reply
#57
sickofnormies
Why don't you log in to the free-to-play game and find me the DLSS option?
Posted on Reply
#58
Colddecked
Dirt Chip: Wake me up when we reach 1000fps.

It's just a name, drop it.
Actually, it's a further devaluing of the xx70 series, so more people get less performance. But let's hope their stock price recovers!
Posted on Reply
#60
Icon Charlie
Chrispy_: Has anyone done a serious, controlled study on whether framerates above, say, 200FPS really make a difference? I've seen a few videos where esports titles like CS:GO were group tested at 60, 144, 240Hz and observed in slo-mo - and yes, 60Hz was measurably worse but the difference between 144 and 240fps is negligible enough to call margin of error and human variance.

LTT's one ft Shroud was perhaps the most controlled and in-depth one I've seen and if an e-sports world champion barely sees any improvement between 144 and 240Hz, what hope do the rest of us have of capitalising on higher framerates?! Clearly 144Hz is already at the point of diminishing returns for Shroud, so perhaps the sweet spot is merely 100fps or 120fps? It's likely no coincidence that FPS higher than the server tickrate achieves nothing of value.

I'm curious if there's any real benefit to running north of 200fps whatsoever. At 500fps we're talking about a peak improvement of 3ms, a mean improvement of 1.5ms and this is taken into consideration alongside the following other variables:
  1. 8.3ms server tick interval
  2. 10-25ms median ping/jitter to the game server via your ISP
  3. 2ms input sampling latency
  4. 2-5ms pixel response time
  5. 80-120ms human reflex time
    or
    20-50ms limitations of human premeditated timing accuracy (ie, what's your ability to stop a stopwatch at exactly 10.00 seconds, rather than 9.98 or 10.03 seconds, even though you know EXACTLY when to click?)
Whilst it's true that some of that is hidden behind client prediction, it's still 42ms of unavoidable latency. Throwing several thousand dollars at hardware to support 500+ FPS does seem to me like a fool's errand just to gain a mean improvement of 1.5ms in the best possible case scenario, and worst case you've spent all that for 1.5ms out of 160ms.
Nice comment.

You also have to add:
What kind of monitor is used, AND
What size of monitor is used.

I did a comparison playing Overwatch in a 27-inch window and a 32-inch window on my monitor at 1440p @ 165 Hz with an AMD 5700. The difference in size was noticeable for the posted data, but actual playing performance? In my case, no, not really.

Since you mentioned tick rate: OW tried to pull a fast one when it came out in 2016, when the game was being run on its legacy servers at a tick rate of 24. A loud riot ensued, and I believe it went up to 120 because of all the complaints.

Finally, NGreedia can talk all about these fantastic numbers, but if the game and its netcode cannot make use of them, they will not be used.
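To put the frame-time side of that quoted latency budget into numbers, here is a minimal sketch; the 200 FPS baseline, 500 FPS target and 120 Hz tick rate are assumptions taken from the quoted post, not measurements:

```python
# Frame-time and latency-budget arithmetic behind the quoted figures.

def frame_time_ms(fps: float) -> float:
    """Time between rendered frames, in milliseconds."""
    return 1000.0 / fps

baseline_fps, target_fps = 200, 500
peak_gain = frame_time_ms(baseline_fps) - frame_time_ms(target_fps)  # 5.0 - 2.0 = 3.0 ms
mean_gain = peak_gain / 2                                            # ~1.5 ms on average

# Best-case "unavoidable" budget from the quoted list (minimum of each range):
# 120 Hz tick + 10 ms ping + 2 ms input + 2 ms panel + 20 ms timing accuracy
best_case_ms = 1000.0 / 120 + 10 + 2 + 2 + 20

print(f"Peak frame-time gain: {peak_gain:.1f} ms, mean gain: {mean_gain:.1f} ms")
print(f"Best-case fixed latency: {best_case_ms:.1f} ms")  # ~42 ms, as quoted
```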
Posted on Reply
#61
..0
bobsled: What’s the bet NVIDIA has DLSS enabled and this isn’t native 2560x1440 rendering?
That can't be the case, as the latency would then be doubled.
Posted on Reply
#62
Dirt Chip
The 4090 is only for those who think they can see the difference between 500 and 240 fps.
If you're not one of those people, the 4090 is not for you.
Please don't buy it and leave it for those who can.
Posted on Reply
#63
neatfeatguy
I don't have many games in my library that utilize DLSS. DLSS and FSR don't interest me; they're not the parts of reviews I look at. I can't wait for the rasterization performance comparisons. You can't really expect people to be excited about DLSS or FSR when not all games utilize them; right now they're niche software - think Nvidia PhysX - until they're more widely adopted.

Nvidia must be shitting bricks if they're heavily focusing on sharing competitive gaming performance results to make the numbers look good. Right now, all these two graphs show me is that the 3070 and 3080 (10 GB? 12 GB? Both?) give remarkable performance in Overwatch 2; even the 3060 handles the game well on maxed-out settings at 1440p.

I don't play these competitive games, and I don't need freakishly high fps. Clearly I'm not their target market for this display of performance, and I understand that, but I don't see how they're doing anything positive with their PR here, with this kind of "leak" or shared information.
Posted on Reply
#64
Dirt Chip
Bwaze: And half a grand. ;-)
Money is of no importance, only 500fps is.
Posted on Reply
#65
Legacy-ZA
Wait a second... they can actually log in?
Posted on Reply
#66
Ruru
S.T.A.R.S.
I'll soon test how the 1080 Ti handles it. I hope it's not that much more demanding than the original OW; that ran like a charm on a 980 @ 1500 MHz back in the day.
Posted on Reply
#67
Chrispy_
sickofnormies: So many uninformed people just jumping on the bandwagon and spreading fake news. Please fact-check yourself before posting.

Overwatch and Overwatch 2 do not support DLSS.
Can you please quote the exact posts that you consider "so many uninformed people"?

In the 55 posts before yours, DLSS is only mentioned by three people and two of them are talking about DLSS as the marketing focus of the 40-series as a whole, and clearly not referring to these 4090 Overwatch 2 results. That one person:
bobsled: What’s the bet NVIDIA has DLSS enabled and this isn’t native 2560x1440 rendering?
is making a cynical, accusatory remark in the form of a question. One single person questioning the legitimacy of an unverified benchmark isn't what I'd call "so many people" or "spreading fake news". Whilst it's unlikely that Nvidia are using a not-released-to-the-public internal build for DLSS development, his cynicism is at least warranted, because it would not be the first (or even the tenth) time that Nvidia has been caught cheating in benchmarks and fudging their numbers.

For your first post on TPU, it really helps not to lead with accusations and then insult the community; that's never going to be a good entrance, however you try to defend it.

Welcome to TPU and try to be better.
Posted on Reply
#68
TheinsanegamerN
If the tick rate of the servers is only 120 then anything over 120 FPS should be useless, no?
Posted on Reply
#69
Prima.Vera
Why tf do you need 600fps for a shitty game?
Posted on Reply
#70
Hxx
Chrispy_: Has anyone done a serious, controlled study on whether framerates above, say, 200FPS really make a difference? I've seen a few videos where esports titles like CS:GO were group tested at 60, 144, 240Hz and observed in slo-mo - and yes, 60Hz was measurably worse but the difference between 144 and 240fps is negligible enough to call margin of error and human variance.

LTT's one ft Shroud was perhaps the most controlled and in-depth one I've seen and if an e-sports world champion barely sees any improvement between 144 and 240Hz, what hope do the rest of us have of capitalising on higher framerates?! Clearly 144Hz is already at the point of diminishing returns for Shroud, so perhaps the sweet spot is merely 100fps or 120fps? It's likely no coincidence that FPS higher than the server tickrate achieves nothing of value.

I'm curious if there's any real benefit to running north of 200fps whatsoever. At 500fps we're talking about a peak improvement of 3ms, a mean improvement of 1.5ms and this is taken into consideration alongside the following other variables:
  1. 8.3ms server tick interval
  2. 10-25ms median ping/jitter to the game server via your ISP
  3. 2ms input sampling latency
  4. 2-5ms pixel response time
  5. 80-120ms human reflex time
    or
    20-50ms limitations of human premeditated timing accuracy (ie, what's your ability to stop a stopwatch at exactly 10.00 seconds, rather than 9.98 or 10.03 seconds, even though you know EXACTLY when to click?)
Whilst it's true that some of that is hidden behind client prediction, it's still 42ms of unavoidable latency. Throwing several thousand dollars at hardware to support 500+ FPS does seem to me like a fool's errand just to gain a mean improvement of 1.5ms in the best possible case scenario, and worst case you've spent all that for 1.5ms out of 160ms.
I can only speak from my limited experience, but I do notice a difference in gameplay in games like COD between 144 fps and 220-240 fps (using an RTX 3080 and a Samsung G7 1440p display). Gameplay seems smoother and slightly faster. Again, it's my subjective experience, but I've been lowering settings on purpose to hit that 200+ mark.
Posted on Reply
#71
DeathtoGnomes
TheDeeGee: lolololol, people still expecting 400% increase every generation.
I don't!


I expect 399%. :D
Posted on Reply
#72
Chrispy_
Hxx: I can only speak from my limited experience, but I do notice a difference in gameplay in games like COD between 144 fps and 220-240 fps (using an RTX 3080 and a Samsung G7 1440p display). Gameplay seems smoother and slightly faster. Again, it's my subjective experience, but I've been lowering settings on purpose to hit that 200+ mark.
I can definitely tell the difference between 120 and 165 Hz. I have never used a display faster than 240 Hz, but even side by side (well, okay, desk in front of me vs. desk behind me) I cannot tell the difference between 165 and 240 Hz.

Interestingly, I don't notice any increase in immediacy of input between 120 and 240Hz. Maybe it depends on the game engine itself, and I don't play COD.
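A quick way to see why each jump feels smaller than the last: the frame-time saving per step shrinks as the refresh rate climbs. A minimal sketch, with the refresh-rate steps chosen purely for illustration:

```python
# Frame-time saved per refresh-rate step. Even the huge 240 -> 500 Hz jump
# buys less frame time than the modest 120 -> 165 Hz step.

steps_hz = [60, 120, 165, 240, 500]  # illustrative steps only

for low, high in zip(steps_hz, steps_hz[1:]):
    delta_ms = 1000.0 / low - 1000.0 / high
    print(f"{low:>3} -> {high:>3} Hz: frame time shrinks by {delta_ms:.2f} ms")

# 60 -> 120 Hz: 8.33 ms, 120 -> 165 Hz: 2.27 ms,
# 165 -> 240 Hz: 1.89 ms, 240 -> 500 Hz: 2.17 ms
```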
Posted on Reply
#73
exodus1337
Get ready, guys: somehow Nvidia's stats will blow your mind yet again. A full 4x more fps at 100% more price with a 10% gain. It makes perfect sense, just trust them.

Posted on Reply
#74
spnidel
TheDeeGee: lolololol, people still expecting 400% increase every generation.
nvidia's own marketing said the 4080 is "2-4x faster than the 3080 ti", so yes, lololol
good job being a corporate cocksucker :)
Posted on Reply
#75
Steevo
I’m gonna guess all the extra hardware is really good at something and that something recently crashed. I will wait to see the reviews.
Posted on Reply