Wednesday, October 5th 2022

NVIDIA Shares RTX 4090 Overwatch 2 Performance Numbers
Rather than another leak of performance numbers for NVIDIA's upcoming RTX 4090 and 4080 cards, the company has shared some performance numbers of their upcoming cards in Overwatch 2. According to NVIDIA, Blizzard had to increase the framerate cap in Overwatch 2 to 600 FPS, as the new cards from NVIDIA were simply too fast for the previous 400 FPS framerate cap. NVIDIA also claims that with Reflex enabled, system latency will be reduced by up to 60 percent.
As for the performance numbers, the RTX 4090 managed 507 FPS at 1440p in Overwatch 2, using an Intel Core i9-12900K CPU, at the Ultra graphics quality setting. The RTX 4080 16 GB manages 368 FPS, with the RTX 4080 12 GB coming in at 296 FPS. The comparison to the previous-generation cards seems a bit unfair, as the highest-end comparison we get is an unspecified RTX 3080 that managed to push 249 FPS, followed by an RTX 3070 at 195 FPS and an RTX 3060 at 122 FPS. NVIDIA recommends an RTX 4080 16 GB if you want to play Overwatch 2 at 1440p 360 FPS+, and an RTX 3080 Ti at 1080p 360 FPS+.
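Taking NVIDIA's first-party figures at face value, the relative uplift over the unspecified RTX 3080 baseline can be computed directly from the FPS numbers quoted above (a quick sanity check, not an independent benchmark):

```python
# Relative uplift vs. NVIDIA's quoted RTX 3080 figure, using only
# the first-party FPS numbers from the article above.
figures = {
    "RTX 4090": 507,
    "RTX 4080 16 GB": 368,
    "RTX 4080 12 GB": 296,
    "RTX 3070": 195,
    "RTX 3060": 122,
}
baseline = 249  # NVIDIA's (unspecified) RTX 3080 result

for card, fps in figures.items():
    print(f"{card}: {fps / baseline:.2f}x the RTX 3080")
```

By this math the RTX 4090 lands at roughly twice the quoted RTX 3080 result, and the RTX 4080 16 GB at just under 1.5x.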
Source: NVIDIA
98 Comments on NVIDIA Shares RTX 4090 Overwatch 2 Performance Numbers
Nvidia have managed to create a lineup that makes the crazy pricing of the xx90 tier look good. Higher FPS is still beneficial because the server tickrate is not synchronized to when your GPU outputs a frame. In other words, there is a gap between when a frame is generated on your end and when the server ticks. The higher your framerate, the smaller that gap will be.
Higher FPS is beneficial but that benefit is smaller the higher you go.
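The tick-gap argument above can be sketched numerically. This assumes an unsynchronized server tick that samples the client's most recently completed frame, which makes the average "staleness" of that frame about half a frame time; real netcode is more complex, so treat this as an illustration only:

```python
# Rough model of the frame-to-tick gap described above: if server ticks
# land at random points within a frame interval, the newest completed
# frame is on average half a frame time old when sampled.

def avg_frame_age_ms(fps: float) -> float:
    """Average age (ms) of the newest frame at an unsynchronized tick."""
    frame_time_ms = 1000.0 / fps
    return frame_time_ms / 2.0

for fps in (60, 144, 240, 500):
    print(f"{fps:>3} FPS -> average gap ~{avg_frame_age_ms(fps):.2f} ms")
```

The gap shrinks from ~8.3 ms at 60 FPS to ~1 ms at 500 FPS, which is why each additional step up in framerate buys less and less.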
In 2017 the street price of an x80 (with a notably more powerful x80 Ti above it, the 1080 Ti) was about 450~520 EUR. Now we're getting a handicapped product with the same 'just a name' at twice the amount. This is proof that Nvidia is stalling in progress gen-to-gen, and bit off more than they can chew with RT. Are you paying for their fucked up strategy? I sure as hell am not. I called this very thing in every RT topic since Huang announced it: this is going to be bad for us. And here we are now, three generations of RT, still very little to show for it, but everything sensible has escalated into absolute horror: TDP, price, and size. And that's ON TOP of the bandaid to fix abysmal FPS numbers called DLSS. And here's the kicker: it's not even turning out to be great news for Nvidia in terms of share price.
Their share price is down to a third of its peak now, with no signs of it turning around.
So sure, I'll drop it now :) No Ada anytime soon, it's no biggie, I still don't have the slightest feeling I'm missing out on anything useful.
Screw this tech party. The PC master race is dead and Nvidia is killing it. At least, that's my opinion.
In the end by going console you are literally saying 'I go for content' more so than buying a 1600 dollar GPU, that's for sure.
The name isn't the performance.
The performance is the performance, and as always: no good, no money. So simple.
I'm with a GTX 970 and will use it for the time to come.
....stuck with the 2080Ti, I was hoping to skip the 3000.
Maybe I'll wait for the 4080Ti.
The RDNA3 GPUs are said to use chiplets.
If they keep the same 6800 XT/6900 XT raster performance and double or triple the RT numbers, I'm sold on AMD.
Nvidia is 100% taking advantage of early adopters and this small window they have before AMD launches and while the 30 series sits on shelves. More models will follow, and price adjustments should follow too; it all depends on how greedy AMD also wants to be, and on that front, sadly, I don't expect much either.
For example, I can't see a difference when FPS is above 120, but I can feel the input delay difference between FPS in the 100s and FPS in the 200s in PUBG.
Here is how NVIDIA Reflex vs. AMD Anti-Lag stack up in some e-sports titles, including Overwatch.
Yes, the difference between 144 and 240 is smaller than between 60 and 144. That much is obvious. I've seen LTT's testing, but it's fundamentally flawed. What you really need to do to figure out if going from 144 to 240 makes a difference is to play for a week on a 240 Hz monitor, and then go back to 144. Then your 144 will look like a snail. Playing on a 144 and then going to a 240, yeah, you are not going to notice much difference, but the moment you go back to your 144 after a week, oh my gawd.
And I'm saying this from personal experience: I've played Apex Legends exclusively on a 240 Hz monitor for exactly a week. After going back to my 120, yeah, it seemed terrible.
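The diminishing returns the posters describe fall out of simple frame-time arithmetic (a rough sketch of display timing only, not a measurement of total input latency):

```python
# Frame-time savings from each refresh-rate step: the jump from
# 60 -> 144 Hz saves far more milliseconds per frame than 144 -> 240 Hz.

def frame_time_ms(hz: float) -> float:
    """Time between frames (ms) at a given refresh rate."""
    return 1000.0 / hz

print(f"60 -> 144 Hz saves {frame_time_ms(60) - frame_time_ms(144):.2f} ms/frame")
print(f"144 -> 240 Hz saves {frame_time_ms(144) - frame_time_ms(240):.2f} ms/frame")
```

Going from 60 to 144 Hz shaves roughly 9.7 ms off each frame interval, while 144 to 240 Hz shaves only about 2.8 ms, which is consistent with the step-up being harder to notice than the step back down.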
Apart from that: if you look up Overwatch 2 performance videos, not even the 3090 Ti manages the numbers NVIDIA is using in its comparison for the 3080. As always, first-party numbers need to be taken with a hefty grain of salt.
Nvidia marketing is a joke.
Just for some reference, my two Diamond Monster Voodoo II's in SLI could do over 600 FPS at 1280x1024, and that was in 1999.
After this mega monster, I hope Nvidia learns that bigger is not always better.
Activision Blizzard shrewdly chose a minimum system requirement that is fairly inclusive. The game does run on older and relatively modest powered consoles such as PS4, Xbox One and Nintendo Switch.
These older devices have a very significant user base. Switch alone has sold over 111 million units. Welcoming previous generation consoles provides a larger pool of players unlike Valorant (another 5v5) which is Windows only.
Naturally this low bar to entry means that PC players can use relatively modest systems like many notebooks.
Unlike its predecessor, Overwatch 2 is free to play and makes its revenue via in-game purchases such as cosmetics and battle passes. Having a large number of active and engaged players is crucial for sustained revenue generation.
From a PC standpoint, it's more important to Activision Blizzard that the game runs well on a GTX 1060, 1050 Ti and 1050 (respectively #1, #4 and #9 graphics cards in the latest Steam Hardware Survey) not the RTX 4090.