
NVIDIA Shares RTX 4090 Overwatch 2 Performance Numbers

lol, they pretended the 3090 (Ti) didn't exist?
 
If the 4090 is good then why didn't they show the results at 4K resolution?
This.

If you're buying a 4090 and playing at 1440p, you're doing it wrong...

Weird, it's from Nvidia, but their marketing PR sucks.
 
Saying how fast your card can go in a game like Overwatch is like boasting how fast your caravan can go on a downhill slope.

Can't wait for reviews with real world performance.
 
So the 4090 got a PR fluff piece and that's news.

The interesting thing to me is the two 4080s, one of which is pure arse relative to its namesake.

20% is not a small amount, so one better be 20% cheaper than the other or I'm done with any interest in 4###.
 
Yeah, well I'd like to see Cyberpunk hit that.

How much do you think one of those bad boys would cost too?
Only for E-sports titles...

More than most of us are going to be willing to spend on a 1080p monitor?
 
This is the first time in 25 years that I'm happy to say,

Fuck NVIDIA.

The pricing is atrocious. The naming is misleading if not outright deceitful.

Also fuck DLSS 3.0. The tech is great but the way they use it to inflate performance figures? Absolutely scammy.

I want the GeForce 40 series sales to tank hard. This is a bloody rip-off.

What's the worst part about this situation? AMD will release RDNA 3.0 cards with similar raster/ray tracing performance, make them just 5-10% cheaper, and thus both companies will enjoy insane margins. No competition whatsoever, only a fucking duopoly.
 
Has anyone done a serious, controlled study on whether framerates above, say, 200FPS really make a difference? I've seen a few videos where esports titles like CS:GO were group tested at 60, 144, 240Hz and observed in slo-mo - and yes, 60Hz was measurably worse but the difference between 144 and 240fps is negligible enough to call margin of error and human variance.

LTT's one ft Shroud was perhaps the most controlled and in-depth one I've seen and if an e-sports world champion barely sees any improvement between 144 and 240Hz, what hope do the rest of us have of capitalising on higher framerates?! Clearly 144Hz is already at the point of diminishing returns for Shroud, so perhaps the sweet spot is merely 100fps or 120fps? It's likely no coincidence that FPS higher than the server tickrate achieves nothing of value.

I'm curious if there's any real benefit to running north of 200fps whatsoever. At 500fps we're talking about a peak improvement of 3ms and a mean improvement of 1.5ms, and that has to be weighed against the following other variables:

  1. 8.3ms server tick interval
  2. 10-25ms median ping/jitter to the game server via your ISP
  3. 2ms input sampling latency
  4. 2-5ms pixel response time
  5. 80-120ms human reflex time
    or
    20-50ms limitations of human premeditated timing accuracy (ie, what's your ability to stop a stopwatch at exactly 10.00 seconds, rather than 9.98 or 10.03 seconds, even though you know EXACTLY when to click?)
Whilst it's true that some of that is hidden behind client prediction, it's still 42ms of unavoidable latency. Throwing several thousand dollars at hardware to support 500+ FPS does seem to me like a fool's errand just to gain a mean improvement of 1.5ms in the best possible case scenario, and worst case you've spent all that for 1.5ms out of 160ms.
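
Here's a rough back-of-envelope sketch of that maths in Python. The 200fps baseline and the choice of each variable's minimum for the "best case" are my own assumptions for illustration; the rest of the numbers come from the list above.

Code:
# Back-of-envelope sketch: how much is the frame-interval saving from chasing
# very high FPS worth next to the rest of the latency chain listed above?
# Assumptions (mine): a 200 fps baseline and best-case (minimum) values elsewhere.

def frame_time_ms(fps):
    """Frame interval in milliseconds at a given framerate."""
    return 1000.0 / fps

baseline_fps, target_fps = 200, 500

peak_saving = frame_time_ms(baseline_fps) - frame_time_ms(target_fps)  # 5.0 - 2.0 = 3.0 ms
mean_saving = peak_saving / 2                                           # ~1.5 ms on average

# Best-case (minimum) values of the other latency sources from the list above, in ms
other_latency_ms = {
    "server tick interval": 8.3,
    "ping/jitter to server": 10.0,
    "input sampling": 2.0,
    "pixel response": 2.0,
    "premeditated timing accuracy": 20.0,
}
unavoidable = sum(other_latency_ms.values())  # ~42 ms in the best case

print(f"peak saving 200 -> 500 fps: {peak_saving:.1f} ms")
print(f"mean saving 200 -> 500 fps: {mean_saving:.1f} ms")
print(f"rest of the chain (best case): {unavoidable:.1f} ms")
print(f"relative gain: {mean_saving / (unavoidable + mean_saving):.1%}")  # roughly 3%

Even in the best case the saving is a low single-digit percentage of the total chain, which is the point of the post above.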
 
So the 4090 got a PR fluff piece and that's news.

The interesting thing to me is the two 4080s, one of which is pure arse relative to its namesake.

20% is not a small amount, so one better be 20% cheaper than the other or I'm done with any interest in 4###.
I think the 12GB's price should be in the ballpark of a 3070's, which it obviously will not be.
It would seem like NV is resorting to one last chance to cash in before it's all over. Weird.
 
Has anyone done a serious, controlled study on whether framerates above, say, 200FPS really make a difference? I've seen a few videos where esports titles like CS:GO were group tested at 60, 144, 240Hz and observed in slo-mo - and yes, 60Hz was measurably worse but the difference between 144 and 240fps is negligible enough to call margin of error and human variance.

LTT's one ft Shroud was perhaps the most controlled and in-depth one I've seen and if an e-sports world champion barely sees any improvement between 144 and 240Hz, what hope do the rest of us have of capitalising on higher framerates?! Clearly 144Hz is already at the point of diminishing returns for Shroud, so perhaps the sweet spot is merely 100fps or 120fps? It's likely no coincidence that FPS higher than the server tickrate achieves nothing of value.

I'm curious if there's any real benefit to running north of 200fps whatsoever. At 500fps we're talking about a peak improvement of 3ms and a mean improvement of 1.5ms, and that has to be weighed against the following other variables:

  1. 8.3ms server tick interval
  2. 10-25ms median ping/jitter to the game server via your ISP
  3. 2ms input sampling latency
  4. 2-5ms pixel response time
  5. 80-120ms human reflex time
    or
    20-50ms limitations of human premeditated timing accuracy (ie, what's your ability to stop a stopwatch at exactly 10.00 seconds, rather than 9.98 or 10.03 seconds, even though you know EXACTLY when to click?)
Whilst it's true that some of that is hidden behind client prediction, it's still 42ms of unavoidable latency. Throwing several thousand dollars at hardware to support 500+ FPS does seem to me like a fool's errand just to gain a mean improvement of 1.5ms in the best possible case scenario, and worst case you've spent all that for 1.5ms out of 160ms.
I think the same. Over 240fps your eyes will not see the difference, and between 120 and 240 it's marginal.

they are showing off in a game where the graphics suck
 
I think the same. Over 240fps your eyes will not see the difference, and between 120 and 240 it's marginal.
I can bet there will be people who will tell you they see a difference between 240Hz and 500Hz for sure. It seems nowadays (it's been happening for some time now) people just want to be considered different or special or better. Freakin' sickness.
 
I can bet there will be people who will tell you they see a difference between 240Hz and 500Hz for sure. It seems nowadays (it's been happening for some time now) people just want to be considered different or special or better. Freakin' sickness.
13 ms per picture is apparently what's required for us to understand what we've seen.

PC Gamer wrote a piece on FPS a few years ago as well.
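
As a rough sanity check (my own arithmetic, not a figure from the PC Gamer piece), that per-picture time converts to a framerate like so:

Code:
# Convert the "13 ms per picture" comprehension figure into a framerate.
# Assumption (mine): treat it as the minimum time a frame must stay on screen to register.
per_picture_ms = 13
print(1000 / per_picture_ms)  # ~77 fps; above that, each frame is shown for less than 13 ms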
 
A 24% performance increase between the 4080 16GB and 12GB, versus a 28% performance increase between the 3080 and 3070. This makes me even more convinced that the 4080 12GB was meant to be the 4070.
False, the gap between 4090 and 16GB 4080 shows that the 16GB should be a 4070 and the 12GB should be a 4060.
 
The 4080 12GB is only 47 fps faster than the 3080, lolololol.
lolololol, people still expecting 400% increase every generation.
 
False, the gap between 4090 and 16GB 4080 shows that the 16GB should be a 4070 and the 12GB should be a 4060.
Finally , someone gets it :)
 
Price to performance?
 
Man, OW2 is such a shit e-sport. I wonder if anyone would invest in being a pro OW2 player, LOL.
 
False, the gap between 4090 and 16GB 4080 shows that the 16GB should be a 4070 and the 12GB should be a 4060.
Agreed.

Nvidia has historically aimed for a 25-30% performance gap between tiers. This has held approximately true going back as far as the Maxwell architecture, about 8 years of consistent product segmentation from Nvidia.
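
A quick sanity check on that, using the percentage gaps quoted earlier in the thread. The ~27% per-tier spacing is an assumption based on that historical 25-30% range, not an official Nvidia figure.

Code:
import math

# Sketch: if tiers are spaced roughly 25-30% apart, estimate how many product tiers
# a given performance ratio spans. The 27% midpoint is an assumption for illustration.

def tiers_apart(perf_ratio, per_tier_gap=0.27):
    """Number of tiers implied by a performance ratio (e.g. 1.24 = 24% faster)."""
    return math.log(perf_ratio) / math.log(1.0 + per_tier_gap)

print(tiers_apart(1.24))  # 4080 16GB vs 4080 12GB -> ~0.9, i.e. roughly one full tier apart
print(tiers_apart(1.28))  # 3080 vs 3070           -> ~1.0, the historical one-tier gap

By that spacing, the 4080 12GB sits about where a 70-class card historically has relative to the 16GB model.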
 
Can't wait to hear GN's take on this with their "Rainbow Six Siege becomes completely unplayable below 615.3 fps average" meme :D
 
13 ms per picture is apparently what's required for us to understand what we've seen.

PC Gamer wrote a piece on FPS a few years ago as well.
Nothing has been mentioned about any profound, visible difference from a 500Hz refresh rate. People able to perceive even a 200Hz refresh rate are rather scarce.
But I can bet people will claim to see the difference at 500Hz from their first day of purchase. I think that is a bit of a stretch on their side.
 
lolololol, people still expecting 400% increase every generation.
Yeah, total madness.

Except Huang declared 2-4x the performance, so they're getting it from the horse's ass, I mean mouth. :D
 