Wednesday, October 5th 2022
NVIDIA Shares RTX 4090 Overwatch 2 Performance Numbers
Rather than letting another leak speak for its upcoming RTX 4090 and 4080 cards, NVIDIA has shared some official performance numbers for them in Overwatch 2. According to NVIDIA, Blizzard had to raise the framerate cap in Overwatch 2 from 400 to 600 FPS, as the new cards were simply too fast for the previous cap. NVIDIA also claims that with Reflex enabled, system latency is reduced by up to 60 percent.
As for the performance numbers, the RTX 4090 managed 507 FPS at 1440p in Overwatch 2 at the Ultra graphics quality setting, using an Intel Core i9-12900K CPU. The RTX 4080 16 GB manages 368 FPS, with the RTX 4080 12 GB coming in at 296 FPS. The comparison to the previous-generation cards seems a bit unfair, as the highest-end comparison we get is an unspecified RTX 3080, which pushed 249 FPS, followed by an RTX 3070 at 195 FPS and an RTX 3060 at 122 FPS. NVIDIA recommends an RTX 4080 16 GB if you want to play Overwatch 2 at 1440p 360 FPS+, and an RTX 3080 Ti at 1080p 360 FPS+.
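To put those claimed figures in perspective, here is a quick back-of-the-envelope sketch in Python, using only the FPS numbers NVIDIA quotes above; the ratios are illustrative, not independent benchmarks:

```python
# NVIDIA's quoted Overwatch 2 figures (1440p, Ultra, Core i9-12900K).
# Vendor-supplied numbers, not independently verified.
fps = {
    "RTX 4090": 507,
    "RTX 4080 16 GB": 368,
    "RTX 4080 12 GB": 296,
    "RTX 3080": 249,
    "RTX 3070": 195,
    "RTX 3060": 122,
}

baseline = fps["RTX 3080"]  # highest-end previous-gen card in the chart
for card, value in fps.items():
    print(f"{card}: {value} FPS ({value / baseline:.2f}x the RTX 3080)")
```

On NVIDIA's own numbers, that works out to roughly a 2x uplift for the RTX 4090 over the unspecified RTX 3080.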
Source: NVIDIA
98 Comments on NVIDIA Shares RTX 4090 Overwatch 2 Performance Numbers
The range was from 30 to 144. Also, as has been said in the article, even if you see the difference, it will most likely do nothing for your playing abilities.
Overwatch and Overwatch 2 do not support DLSS. All of those FPS gains are pure rasterization, without AI image upscaling.
www.nvidia.com/en-us/geforce/news/nvidia-rtx-games-engines-apps/
NV: "4080 is the new 4070"
Intel: "Arc"
All is fine, nothing to see here, move along.
You also have to add:
What kind of monitor is used, AND
What size of monitor is used.
I did a comparison playing Overwatch in a 27-inch window and a 32-inch window on my monitor at 1440p @ 165 Hz, with an AMD 5700. The difference in size was noticeable for posting data, but actual playing performance? In my case, no, not really.
Since you mentioned tick rate: Overwatch tried to pull a fast one when it came out in 2016, when the game was being run on its legacy servers at a tick rate of 24. A loud riot ensued, and I believe it went up to 120 because of all of the complaints.
Finally, NGreedia can talk all about these fantastic numbers, but if the game and the netcode for that game cannot use them, they will not be used.
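For anyone wondering why tick rate matters more than raw FPS here, a minimal sketch of the arithmetic, assuming only that a server running at N ticks per second sends one state update every 1000/N milliseconds:

```python
# Server tick rate -> interval between state updates.
# A client rendering 500+ FPS still only receives new game state
# as often as the server sends it.
for tick_rate in (24, 60, 120):
    interval_ms = 1000 / tick_rate
    print(f"{tick_rate} ticks/s -> one update every {interval_ms:.1f} ms")

# 24 ticks/s  -> ~41.7 ms between updates
# 120 ticks/s -> ~8.3 ms between updates
```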
If you're not one of those people, the 4090 is not for you.
Please don't buy it and leave it for those who can.
Nvidia must be shitting bricks if they're heavily focusing on sharing gaming performance results for competitive gaming to make the numbers look good. Right now, all these two graphs are showing me is that the 3070 and 3080 (10 GB...? 12 GB...? Both?) give remarkable performance in Overwatch 2; even the 3060 handles the game well on maxed-out settings at 1440p.
I don't play these competitive games, and I don't need freakishly high fps to play games. Clearly I'm not their target market for this display of performance, and I understand that, but I don't see how they're doing anything positive with their PR here, with this kind of "leak" or shared information.
In the 55 posts before yours, DLSS is only mentioned by three people, and two of them are talking about DLSS as the marketing focus of the 40-series as a whole, clearly not referring to these 4090 Overwatch 2 results. The one remaining person is making a cynical, accusatory remark in the form of a question. One single person questioning the legitimacy of an unverified benchmark isn't what I'd call "so many people" or "spreading fake news". While it's unlikely that Nvidia is using a not-released-to-the-public internal build for DLSS development, his cynicism is at least warranted, because it would not be the first (or even tenth) time that Nvidia has been caught cheating in benchmarks and fudging its numbers.
As your first post on TPU, it really helps not to lead with accusations and then insult the community; that's never going to be a good entrance, however you try to defend it.
Welcome to TPU and try to be better.
I expect 399%. :D
Interestingly, I don't notice any increase in immediacy of input between 120 and 240 Hz. Maybe it depends on the game engine itself, and I don't play COD.
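The frame-time arithmetic supports that impression: doubling the refresh rate halves the interval between refreshes, so the absolute gain keeps shrinking. A quick sketch, assuming an ideal display that completes one refresh per cycle:

```python
# Refresh rate -> time per refresh (ideal display, one frame per cycle).
# 60 -> 120 Hz saves ~8.3 ms per frame; 120 -> 240 Hz saves only ~4.2 ms.
for hz in (60, 120, 240):
    print(f"{hz} Hz: {1000 / hz:.2f} ms per refresh")
```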
good job being a corporate cocksucker :)