
NVIDIA Shares RTX 4090 Overwatch 2 Performance Numbers

Joined
Nov 4, 2005
Messages
11,870 (1.73/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400 MHz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
I’m gonna guess all the extra hardware is really good at something and that something recently crashed. I will wait to see the reviews.
 
Joined
Jul 13, 2016
Messages
3,079 (1.04/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
The performance gap between the 4090 and the "4080" 12GB / 4080 16GB really shows how little Nvidia is giving people below the flagship this time.

Nvidia have managed to create a lineup that makes the crazy pricing of the xx90 tier look good.

If the tick rate of the servers is only 120 then anything over 120 FPS should be useless, no?

Higher FPS is still beneficial because the server tick rate is not synchronized with when your GPU outputs a frame. In other words, there is a gap between when a frame is generated on your end and when the server ticks. The higher your framerate, the smaller that gap will be.

Higher FPS is beneficial but that benefit is smaller the higher you go.
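The point about the shrinking gap can be illustrated with a simplifying model of my own (not Blizzard's actual netcode): if frame completion is unsynchronized with the 120 Hz tick, a tick lands at a uniformly random point inside a frame interval, so the newest client frame a tick can sample is on average half a frame old.

```python
# Assumption (not from the thread): frames and 120 Hz server ticks are
# unsynchronized, so a tick lands at a uniformly random point inside a
# frame interval. The newest client frame a tick can sample is therefore,
# on average, half a frame old.
def avg_staleness_ms(fps: float) -> float:
    # Average age of the newest frame when a tick occurs, in milliseconds.
    return 1000.0 / (2.0 * fps)

for fps in (60, 120, 240, 480):
    print(f"{fps:>3} FPS -> avg frame age at a tick: {avg_staleness_ms(fps):.2f} ms")
# 60 FPS -> 8.33 ms, 120 -> 4.17 ms, 240 -> 2.08 ms, 480 -> 1.04 ms
```

This also shows the diminishing returns: doubling FPS always halves the average staleness, but the absolute saving keeps shrinking.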
 
Joined
Sep 17, 2014
Messages
21,735 (6.00/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Wake me up when we reach 1000fps.


It's just a name, drop it.
It's a name and a tier/price point, buddy. Wake up. The fact that the new x80 does so little gen-to-gen is the reason they're respinning their old Ampere stack below the x80 for a while, for example. It's the reason we're overpaying for two-year-old performance. Meanwhile, the die underneath is just way below expectations for an x80-tier product, while the price is beyond reasonable for that tier. And every time, it's the memory that's subpar. In Ampere it was capacity, now it's bus width/bandwidth, and that's the most effective way to reinforce planned obsolescence. We can talk all day about magical new cache alleviating memory requirements, but in the end, if you start pushing that memory, it just won't suffice and you'll experience subpar performance. History repeats.

In 2017 the street price of an x80 (with a notably more powerful x80 Ti above it, the 1080 Ti) was about 450~520 EUR. Now we're getting a handicapped product carrying the same 'just a name' at twice the amount. This is proof that Nvidia is stalling gen-to-gen and bit off more than it could chew with RT. Are you paying for their fucked-up strategy? I sure as hell am not. I called this very thing in every RT topic since Huang announced it: this is going to be bad for us. And here we are now, three generations of RT in, still very little to show for it, while everything sensible has escalated into absolute horror: TDP, price, and size. And that's ON TOP of the bandaid called DLSS to fix abysmal FPS numbers. And here's the kicker: it's not even turning out to be great news for Nvidia in terms of share price.

Their share price is down to about a third of its peak now, with no sign of turning around.



So sure, I'll drop it now :) No Ada anytime soon; it's no biggie, I still don't have the slightest feeling I'm missing out on anything useful.
 
Last edited:

the54thvoid

Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
12,740 (2.38/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsung 960 Pro M.2 512GB
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure Power M12 850W Gold (ATX3.0)
Software W10
It's a name and a tier/price point, buddy. Wake up. The fact that the new x80 does so little gen-to-gen is the reason they're respinning their old Ampere stack below the x80 for a while, for example. It's the reason we're overpaying for two-year-old performance. Meanwhile, the die underneath is just way below expectations for an x80-tier product, while the price is beyond reasonable for that tier. And every time, it's the memory that's subpar. In Ampere it was capacity, now it's bus width/bandwidth.

In 2017 the street price of an x80 (with a notably more powerful x80 Ti above it, the 1080 Ti) was about 450~520 EUR. Now we're getting a handicapped product carrying the same 'just a name' at twice the amount. This is proof that Nvidia is stalling gen-to-gen and bit off more than it could chew with RT. Are you paying for their fucked-up strategy? I sure as hell am not.

So sure, I'll drop it now :) No Ada anytime soon; it's no biggie, I still don't have the slightest feeling I'm missing out on anything useful.

I don't always agree with you, but in this case we have Nvidia trying to justify a fucking 4-slot GPU at $1600 (or whatever) with crazy power consumption, and they're selling performance on a bunch of software implementations like DLSS 3 and other voodoo. No. When my 2080 Ti dies, I'll go AMD, or PS5.

Screw this tech party. The PC master race is dead and Nvidia is killing it. At least, that's my opinion.
 
Joined
Jul 10, 2015
Messages
752 (0.23/day)
Location
Sokovia
System Name Alienation from family
Processor i7 7700k
Motherboard Hero VIII
Cooling Macho revB
Memory 16gb Hyperx
Video Card(s) Asus 1080ti Strix OC
Storage 960evo 500gb
Display(s) AOC 4k
Case Define R2 XL
Power Supply Be f*ing Quiet 600W M Gold
Mouse NoName
Keyboard NoNameless HP
Software You have nothing on me
Benchmark Scores Personal record 100m sprint: 60m
And where are the 3090 and 3090 Ti?

Edit: oh! Even the 3080 Ti isn't included.
Do you want to put W1zzard out of business? There's a reason we're waiting for legit reviews.
 
Joined
Sep 17, 2014
Messages
21,735 (6.00/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
I don't always agree with you, but in this case we have Nvidia trying to justify a fucking 4-slot GPU at $1600 (or whatever) with crazy power consumption, and they're selling performance on a bunch of software implementations like DLSS 3 and other voodoo. No. When my 2080 Ti dies, I'll go AMD, or PS5.

Screw this tech party. The PC master race is dead and Nvidia is killing it. At least, that's my opinion.

PCMR is just reaching its logical pinnacle, IMHO, and we should stop trying to reinvent things that are just fine. The only reason that happens is disgusting shareholders and commercial considerations. None of it is making gaming more fun or better (it's more likely to do the opposite, looking at monetization in games). At some point, things are just good. I still miss that kind of focus on simply making cool game concepts, franchises, etc., and that also reflects on the RT push: it's just a graphics effect, not integral to gameplay. That's the stuff indie developers do, not focusing on graphics. PCMR isn't dead, but it needs to readjust its focus.

And let's place that perspective in the current time too: climate is an issue, and going bigger and more power-hungry is completely counterintuitive, and soon an economic fallacy, because the bill to fix it is so much higher (and growing faster). Nvidia is seriously heading for an iceberg here, and that one ain't melting.

In the end, by going console you are saying 'I go for content' far more than by buying a $1,600 GPU, that's for sure.
 
Last edited:
Joined
Jul 15, 2020
Messages
995 (0.67/day)
System Name Dirt Sheep | Silent Sheep
Processor i5-2400 | 13900K (-0.025mV offset)
Motherboard Asus P8H67-M LE | Gigabyte AERO Z690-G, bios F26 with "Instant 6 GHz" on
Cooling Scythe Katana Type 1 | Noctua NH-U12A chromax.black
Memory G-skill 2*8GB DDR3 | Corsair Vengeance 4*32GB DDR5 5200 MHz C40 @4000MHz
Video Card(s) Gigabyte 970GTX Mini | NV 1080TI FE (cap at 85%, 800mV)
Storage 2*SN850 1TB, 230S 4TB, 840EVO 128GB, WD green 2TB HDD, IronWolf 6TB, 2*HC550 18TB in RAID1
Display(s) LG 21` FHD W2261VP | Lenovo 27` 4K Qreator 27
Case Thermaltake V3 Black | Define 7 Solid, stock 3*14 fans + 2*12 front & bottom + out 1*8 (on expansion slot)
Audio Device(s) Beyerdynamic DT 990 (or the screen speakers when I'm too lazy)
Power Supply Enermax Pro82+ 525W | Corsair RM650x (2021)
Mouse Logitech Master 3
Keyboard Roccat Isku FX
VR HMD Nop.
Software WIN 10 | WIN 11
Benchmark Scores CB23 SC: i5-2400=641 | i9-13900k=2325-2281 MC: i5-2400=i9 13900k SC | i9-13900k=37240-35500
It's a name and a tier/price point, buddy. Wake up. The fact that the new x80 does so little gen-to-gen is the reason they're respinning their old Ampere stack below the x80 for a while, for example. It's the reason we're overpaying for two-year-old performance. Meanwhile, the die underneath is just way below expectations for an x80-tier product, while the price is beyond reasonable for that tier. And every time, it's the memory that's subpar. In Ampere it was capacity, now it's bus width/bandwidth, and that's the most effective way to reinforce planned obsolescence. We can talk all day about magical new cache alleviating memory requirements, but in the end, if you start pushing that memory, it just won't suffice and you'll experience subpar performance. History repeats.

In 2017 the street price of an x80 (with a notably more powerful x80 Ti above it, the 1080 Ti) was about 450~520 EUR. Now we're getting a handicapped product carrying the same 'just a name' at twice the amount. This is proof that Nvidia is stalling gen-to-gen and bit off more than it could chew with RT. Are you paying for their fucked-up strategy? I sure as hell am not. I called this very thing in every RT topic since Huang announced it: this is going to be bad for us. And here we are now, three generations of RT in, still very little to show for it, while everything sensible has escalated into absolute horror: TDP, price, and size. And that's ON TOP of the bandaid called DLSS to fix abysmal FPS numbers. And here's the kicker: it's not even turning out to be great news for Nvidia in terms of share price.

Their share price is down to about a third of its peak now, with no sign of turning around.


So sure, I'll drop it now :) No Ada anytime soon; it's no biggie, I still don't have the slightest feeling I'm missing out on anything useful.
So you're saying you get less for more? The product isn't to your taste for some reason?
The name isn't the performance.
The performance is the performance, and as always: no good, no money. Simple as that.
I'm on a GTX 970 and will use it for the time to come.
 
Joined
Sep 17, 2014
Messages
21,735 (6.00/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
So you're saying you get less for more? The product isn't to your taste for some reason?
The name isn't the performance.
The performance is the performance, and as always: no good, no money. Simple as that.
I'm on a GTX 970 and will use it for the time to come.
I'm looking at the market and how it develops, that's the angle of my post, and it also influences my buying decision.
 
Joined
Jun 19, 2019
Messages
208 (0.11/day)
Nvidia is totally disgusting; they didn't give us video cards at retail price for two years, just for miners. Now they want to sell a 4060 Ti under the 4080 name with a super big price tag. I already knew back in 2001 that they were big bastards, when they killed off 3dfx.
 
Joined
Apr 14, 2022
Messages
696 (0.81/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600Mhz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Asus XG35VQ
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Software Windows 11 64bit
It's funny that the 4090 is the only card of the lineup that has some value.
...Stuck with the 2080 Ti, I was hoping to skip the 3000 series.
Maybe I'll wait for the 4080 Ti.

The RDNA3 GPUs are said to use chiplets.
If they keep the same 6800XT/6900XT raster performance and double or triple the RT numbers, I'm sold on AMD.
 
Joined
Nov 30, 2021
Messages
132 (0.13/day)
Location
USA
System Name Star Killer
Processor Intel 13700K
Motherboard ASUS RO STRIX Z790-H
Cooling Corsair 360mm H150 LCD Radiator
Memory 64GB Corsair Vengence DDR5 5600mhz
Video Card(s) MSI RTX 3080 12GB Gaming Trio
Storage 1TB Samsung 980 x 1 | 1TB Crucial Gen 4 SSD x 1 | 2TB Samsung 990 Pro x 1
Display(s) 32inch ASUS ROG STRIX 1440p 170hz WQHD x 1, 24inch ASUS 165hz 1080p x 1
Case Lian Li O11D White
Audio Device(s) Creative T100 Speakers , Razer Blackshark V2 Pro wireless
Power Supply EVGA 1000watt G6 Gold
Mouse Razer Viper V2 Wireless with dock
Keyboard ASUS ROG AZOTH
Software Windows 11 pro
Because no 500+ FPS.


Because Nvidia wants you to buy a 500 Hz display?
I absolutely love monitor marketing when they show one specific FPS vs. another and blur the shit out of one of them to make the other look better.
 
Joined
Jan 14, 2019
Messages
10,975 (5.38/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
The most pointless numbers of the year. Why don't they share Half-Life 2 or Quake 3: Arena performance numbers while they're at it?
 
Joined
Jan 29, 2021
Messages
1,809 (1.40/day)
Location
Alaska USA
Looking forward to the real reviews. The 4080 16GB could be a winner along with the eventual 4080 Ti (20GB?). Pair one of those cards up with a Raptor Lake build for FPS gaming at 1440P and you're good to go.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,045 (1.27/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
False: the gap between the 4090 and the 16GB 4080 shows that the 16GB should be a 4070 and the 12GB should be a 4060.
To an extent I barely care what they're called; I care what they're priced at relative to the performance on offer, and that is absolute bollocks for the 4080 series. The 4090 seems better, but it's still ludicrous.

Nvidia is 100% taking advantage of early adopters and this small window it has before AMD launches and while the 30 series sits on shelves. More models will follow, and price adjustments should too; it all depends how greedy AMD wants to be, and on that front, sadly, I don't expect much either.
 
Joined
Nov 13, 2007
Messages
10,442 (1.71/day)
Location
Austin Texas
Processor 13700KF Undervolted @ 5.4, 4.8Ghz Ring 190W PL1
Motherboard MSI 690-I PRO
Cooling Thermalright Peerless Assassin 120 w/ Arctic P12 Fans
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2x 2TB WDC SN850, 1TB Samsung 960 Pro
Display(s) Alienware 32" 4k 240hz OLED
Case SLIGER S620
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard RoyalAxe
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
If the 4090 is good then why didn't they show the results at 4K resolution?
I don't think there are 360 Hz Reflex monitors at that resolution yet. The current 4K panels are slow af, with the exception of the Samsung G8, which is a generally subpar VA monitor with scanline issues and QC problems for $1200.
 
Joined
Nov 13, 2021
Messages
211 (0.21/day)
There's a 24% performance increase between the 4080 16GB and 12GB, versus 28% between the 3080 and 3070. This makes me even more convinced that the 4080 12GB was meant to be the 4070.

The JayzTwoCents video talks about that exact subject.
 
Joined
Nov 11, 2016
Messages
3,317 (1.17/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razer DeathAdder V3
Keyboard Razer Huntsman V3 Pro TKL
Software win11
Yawn. No one here realizes that the huge FPS, coupled with Nvidia Reflex, offers insanely low input delay (which matters more for e-sports than the FPS itself).

For example, I can't see a difference when FPS is above 120, but I can feel the input-delay difference between FPS in the 100s vs. in the 200s in PUBG.

Here is how Nvidia Reflex vs. AMD Anti-Lag stack up in some e-sports titles, including Overwatch
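The claim that more FPS cuts input delay even beyond what you can see can be sketched with a toy latency model. This is my own simplification for illustration, not Nvidia's published Reflex pipeline; the stage weights (input poll, render queue depth, scanout) are assumptions:

```python
# Toy end-to-end latency model (an assumption-laden sketch, NOT Nvidia's
# measured Reflex pipeline). Assumed stages: half an input-poll interval
# (polling tied to frame rate), a render queue of whole frames, one frame
# of render time, and half a refresh interval of scanout delay.
def avg_latency_ms(fps: float, hz: float, queued_frames: float = 1.0) -> float:
    frame_ms = 1000.0 / fps      # time to produce one frame
    scanout_ms = 1000.0 / hz     # display refresh interval
    return 0.5 * frame_ms + queued_frames * frame_ms + frame_ms + 0.5 * scanout_ms

# GPU-bound at ~100 vs ~200 FPS on a 240 Hz display:
print(f"{avg_latency_ms(100, 240):.1f} ms")  # 27.1 ms
print(f"{avg_latency_ms(200, 240):.1f} ms")  # 14.6 ms
```

Every stage that scales with frame time shrinks as FPS rises, which is why the feel improves even when the display can't show the extra frames. (Reducing the render queue, roughly what Reflex does, cuts the `queued_frames` term instead.)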
 
Joined
Jun 14, 2020
Messages
3,275 (2.15/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Has anyone done a serious, controlled study on whether framerates above, say, 200 FPS really make a difference? I've seen a few videos where esports titles like CS:GO were group-tested at 60, 144 and 240 Hz and observed in slow motion, and yes, 60 Hz was measurably worse, but the difference between 144 and 240 FPS is negligible enough to call margin of error and human variance.

LTT's test featuring Shroud was perhaps the most controlled and in-depth one I've seen, and if an e-sports world champion barely sees any improvement between 144 and 240 Hz, what hope do the rest of us have of capitalising on higher framerates?! Clearly 144 Hz is already at the point of diminishing returns for Shroud, so perhaps the sweet spot is merely 100 or 120 FPS? It's likely no coincidence that FPS higher than the server tickrate achieves nothing of value.
Let me start off by saying I game on a 120 Hz monitor.

Yes, the difference between 144 and 240 is smaller than between 60 and 144. That much is obvious. I've seen LTT's testing, but it's fundamentally flawed. What you really need to do to figure out whether going from 144 to 240 makes a difference is to play for a week on a 240 Hz display, then go back to 144. Then your 144 will feel like a snail. Playing on 144 and then going to 240, yeah, you're not going to notice much difference, but the moment you go back to your 144 after a week, oh my gawd.

And I'm saying this from personal experience: I played Apex Legends exclusively on a 240 Hz display for exactly a week. After going back to my 120, yeah, it felt terrible.
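The diminishing returns both posters describe are plain arithmetic: each step up in refresh rate shaves off less frame time than the last. A quick sketch, assuming perfectly even frame pacing:

```python
# Frame interval in ms at each refresh rate, and the saving per step.
# Assumption: ideal, perfectly fixed frame times (real pacing varies).
rates = [60, 144, 240, 360]
intervals = {hz: 1000.0 / hz for hz in rates}  # ms per frame
for lo, hi in zip(rates, rates[1:]):
    print(f"{lo} -> {hi} Hz saves {intervals[lo] - intervals[hi]:.2f} ms per frame")
# 60 -> 144 Hz saves 9.72 ms, 144 -> 240 Hz saves 2.78 ms, 240 -> 360 Hz saves 1.39 ms
```

So the 60-to-144 jump buys roughly 3.5x more frame-time reduction than 144-to-240, which is consistent with the first jump being obvious and the second being subtle.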
 
Joined
Jun 20, 2022
Messages
302 (0.38/day)
Location
Germany
System Name Galaxy Tab S8+
Processor Snapdragon 8 gen 1 SOC
Cooling passive
Memory 8 GB
Storage 256 GB + 512 GB SD
Display(s) 2.800 x 1.752 Super AMOLED
Power Supply 10.090 mAh
Software Android 12
So this answers the question of why DLSS 3.0 exists and is RTX 4000-only, as well as why RTX 3000 prices are not coming down further: the RTX 4080 12GB would be a shelf warmer at its current MSRP. Looks like TSMC 4nm is quite expensive, even if yields are as good as they're telling us.

Apart from that: if you look up Overwatch 2 performance videos, not even the 3090 Ti manages the numbers Nvidia is using for the 3080 in its comparison. As always, first-party numbers need to be taken with a hefty grain of salt.
 
Joined
Jun 11, 2017
Messages
236 (0.09/day)
Location
Montreal Canada
Ooo wow, 1440p. You know most 1080p cards can do 1440p no problem. Try 2160p or 4320p, then I'd be impressed.

Nvidia marketing is a joke.

Just for some reference: my two Diamond Monster Voodoo II's in SLI could do over 600 FPS at 1280x1024, and that was in 1999.

After this mega monster, I hope Nvidia learns bigger is not always better.
 
Joined
Jun 2, 2017
Messages
8,477 (3.22/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Isn't this game 5v5 with PS4 graphics?
 
Joined
Jun 21, 2021
Messages
2,989 (2.59/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
Software macOS Sonoma 14.5 (with latest patches)
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
Isn't this game 5v5 with PS4 graphics?

Yes.

Activision Blizzard shrewdly chose minimum system requirements that are fairly inclusive. The game runs on older, relatively modest consoles such as the PS4, Xbox One and Nintendo Switch.

These older devices have a very significant user base; the Switch alone has sold over 111 million units. Welcoming previous-generation consoles provides a larger pool of players, unlike Valorant (another 5v5), which is Windows-only.

Naturally this low bar to entry means that PC players can use relatively modest systems like many notebooks.

Unlike its predecessor, Overwatch 2 is free to play and makes its revenue via in-game purchases such as cosmetics and battle passes. Having a large number of active and engaged players is crucial for sustained revenue generation.

From a PC standpoint, it's more important to Activision Blizzard that the game runs well on a GTX 1060, 1050 Ti and 1050 (respectively #1, #4 and #9 graphics cards in the latest Steam Hardware Survey) not the RTX 4090.
 
Last edited: