
Godfall Benchmark Test & Performance Analysis

bug

Joined
May 22, 2015
Messages
13,645 (3.99/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
I see the main "problem" with the Ampere architecture in games being that each sub-unit in an SM consists of two blocks: one proper block with only FP32 units, and a second with a concurrent mix of INT32 and FP32 ops. I assume that without better shader compiling, games use only the first block (half of the total shader units). So in some cases the 2080 Ti, with a proper 4352 CUDA cores, can easily outperform the 3070 with its proper 2944 CUDA cores (5888/2). Godfall is a good example, as it uses tons of shader effects in its materials; it was probably cooked too fast during development, and the coding quality shows it.
Applications like LuxMark, 3DMark, etc. are better optimized for GPU utilization; there we can see a massive, almost theoretical performance boost (Turing vs. Ampere).
Nvidia is aware of all this, and that is why the MSRPs of its Ampere teraflop monsters are no higher than Turing's.
While you are correct, this is not specific to Ampere in any way. Every generation makes architectural choices, and for each of them you can devise a workload that performs poorly on that specific architecture.
The thing is, when Nvidia or AMD make these choices, they rely on their relationships with game developers, and they try to help current common usage patterns, not hinder them. That's why reviews show newer cards always being faster than what came before ;)
 
Joined
Nov 16, 2020
Messages
43 (0.03/day)
Processor i7-11700F, undervolt 3.6GHz 0.96V
Motherboard ASUS TUF GAMING B560-PLUS WIFI
Cooling Cooler Master Hyper 212 Black Edition, 1x12cm case FAN
Memory 2x16GB DDR4 3200MHz CL16, Kingston FURY (KF432C16BBK2/32)
Video Card(s) GeForce RTX 2060 SUPER, ASUS DUAL O8G EVO V2, 70%, +120MHz core
Storage Crucial MX500 250GB, Crucial MX500 500GB, Seagate Barracuda 2.5" 2TB
Display(s) DELL P2417H
Case Fractal Design Focus G Black
Power Supply 550W, 80+ Gold, SilentiumPC Supremo M2 SPC140 rev 1.2
Mouse E-BLUE Silenz
Keyboard Genius KB-110X
Shouldn't it be trivial in such cases to run the 2nd block in just one mode (int or fp)?
I am not an expert or engineer, but I am completely satisfied with how Ampere is designed and how it works :). To me, the second block is something like Hyper-Threading in Intel CPUs (which gives us 28-30% more performance without significant architectural changes).
I assume it was the better and smarter decision to give the second block mixed functionality, because graphics in games keeps moving toward pure ray tracing, so it is a good compromise.
 
Joined
Jan 8, 2017
Messages
9,332 (3.30/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
I see the main "problem" with the Ampere architecture in games being that each sub-unit in an SM consists of two blocks: one proper block with only FP32 units, and a second with a concurrent mix of INT32 and FP32 ops. I assume that without better shader compiling, games use only the first block (half of the total shader units). So in some cases the 2080 Ti, with a proper 4352 CUDA cores, can easily outperform the 3070 with its proper 2944 CUDA cores (5888/2). Godfall is a good example, as it uses tons of shader effects in its materials; it was probably cooked too fast during development, and the coding quality shows it.
Applications like LuxMark, 3DMark, etc. are better optimized for GPU utilization; there we can see a massive, almost theoretical performance boost (Turing vs. Ampere).
Nvidia is aware of all this, and that is why the MSRPs of its Ampere teraflop monsters are no higher than Turing's.

Welcome to the forums, best first post I've seen here in a long time :)

Shouldn't it be trivial in such cases to run the 2nd block in just one mode (int or fp)?

It doesn't quite work like that, because it is exclusively up to the SM; the compiler just generates the instructions as they are. There is a scheduler in each SM which decides which warps (out of hundreds of in-flight threads) get executed in a clock cycle. The thing is that INT and FP operations are interleaved, as there is usually a need to calculate some addresses before a bunch of FP instructions can take place; however, most instructions are still going to be FP.

The most likely pattern for an Ampere SM is 64 INT + 64 FP operations in a clock cycle, followed by a bunch of 128-FP-operation clock cycles, and so on. This is still better than a Turing SM, because in a clock cycle in which 64 INT operations need to take place, the Ampere SM can still execute up to 64 FP operations on top.
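The pattern described above can be put into a toy throughput model. This is a deliberate simplification I'm assuming for illustration (perfect scheduling, no memory stalls, one set of 64+64 lanes per SM), not how either architecture is actually documented to behave:

```python
# Toy model: cycles needed to retire a mix of FP32 and INT32 ops.
#   "turing": 64 FP-only lanes + 64 INT-only lanes per clock
#   "ampere": 64 FP-only lanes + 64 lanes that can do either FP or INT
def cycles_needed(fp_ops, int_ops, arch):
    cycles = 0
    while fp_ops > 0 or int_ops > 0:
        if arch == "turing":
            fp_ops = max(0, fp_ops - 64)    # FP lanes
            int_ops = max(0, int_ops - 64)  # INT lanes (idle if no INT work)
        else:  # "ampere"
            if int_ops > 0:
                int_ops = max(0, int_ops - 64)  # flexible block takes INT
                fp_ops = max(0, fp_ops - 64)    # FP-only block takes FP
            else:
                fp_ops = max(0, fp_ops - 128)   # both blocks run FP
        cycles += 1
    return cycles

# A shader-like mix: 768 FP ops with 256 INT ops (address math) in flight
print(cycles_needed(768, 256, "turing"))  # 12 cycles
print(cycles_needed(768, 256, "ampere"))  # 8 cycles (128 FP/clk once INT drains)
```

With zero INT work the gap widens to a clean 2x, which is roughly what the compute benchmarks mentioned earlier (LuxMark, 3DMark) get closer to than games do.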
 

bug

Joined
May 22, 2015
Messages
13,645 (3.99/day)
I am not an expert or engineer, but I am completely satisfied with how Ampere is designed and how it works :). To me, the second block is something like Hyper-Threading in Intel CPUs (which gives us 28-30% more performance without significant architectural changes).
I assume it was the better and smarter decision to give the second block mixed functionality, because graphics in games keeps moving toward pure ray tracing, so it is a good compromise.
Yeah, that's probably the driver's job, figuring out how to schedule execution in order to use the resources optimally.

And welcome to TPU, btw.
 
Joined
Jul 15, 2015
Messages
46 (0.01/day)
I never got the appeal of looter shooters; it's like playing an MMO, but single-player. I guess it's the progression system that gets people hooked.
 
Joined
Jul 18, 2016
Messages
354 (0.12/day)
Location
Indonesia
System Name Nero Mini
Processor AMD Ryzen 7 5800X 4.7GHz-4.9GHz
Motherboard Gigabyte X570i Aorus Pro Wifi
Cooling Noctua NH-D15S+3x Noctua IPPC 3K
Memory Team Dark 3800MHz CL16 2x16GB 55ns
Video Card(s) Palit RTX 2060 Super JS Shunt Mod 2130MHz/1925MHz + 2x Noctua 120mm IPPC 3K
Storage Adata XPG Gammix S50 1TB
Display(s) LG 27UD68W
Case Lian-Li TU-150
Power Supply Corsair SF750 Platinum
Software Windows 10 Pro
The GTX 970 3.5GB thing was way overblown by the vast majority of people, who don't understand how this sort of stuff works.

You're kidding, right? Nvidia lost a lawsuit over this. The whole issue came from Nvidia severing a memory controller link and sharing it with the one beside it for the last 512MB of VRAM, causing horrible performance issues whenever that last 512MB was used. It would have been much better if Nvidia had just gone with 3.5GB, but then the specs would say 224-bit, which doesn't look as good as 256-bit.
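The bus-width point maps directly onto peak-bandwidth arithmetic. A quick sketch, using the GTX 970's stock 7 Gbps effective GDDR5 data rate (the per-partition figures here simply follow from the bus widths, they're not from the post above):

```python
# Peak GDDR5 bandwidth = (bus width in bits / 8 bits per byte) * data rate in Gbps
def bandwidth_gbs(bus_width_bits, data_rate_gbps=7.0):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(256))  # 224.0 GB/s - the advertised full 256-bit bus
print(bandwidth_gbs(224))  # 196.0 GB/s - the fast 3.5GB partition (7 controllers)
print(bandwidth_gbs(32))   #  28.0 GB/s - the last 512MB over one shared link
```

So whenever the game spilled into that last 512MB, it was being served at roughly 1/7th the bandwidth of the rest of the card, which is why the stutter was so visible.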
 

Coraptor

New Member
Joined
Nov 16, 2020
Messages
3 (0.00/day)
Who did the testing, and how did you do it? Because those results make zero sense and go against any kind of logic. Have you double- and triple-checked those results? Verified them on a different system? Reached out to the game developer and GPU manufacturers for public statements? Because that is what you should do when benchmarks are wildly outside expected behavior.

And no, it is not a VRAM issue. If it were, then the 3070 (8GB) would not pull ahead of the 2080 Ti (11GB) at 4K, where VRAM demand is highest, when those two cards perform pretty much the same across most games. And drivers are also a questionable explanation at best, because the 3080 and 3090 literally use the same chip and the same GDDR6X memory. Unless memory usage is well above 10GB, there should not be a giant fps advantage for the 3090.

It looks like a user/reviewer error or some system/driver corruption somewhere while testing those cards. I don't want to accuse you guys of failing at benchmarking, but for the sake of trustworthiness you should not just put those illogical results out there and say "this is how it is"; investigate the issue. In the best case, you hold those benchmarks back until you get a response from the developers/AMD/Nvidia saying "this is expected performance because x" or "we are aware of the issue and are working on a fix".

You should strive for maximal trustworthiness and not become a second UserBenchmark putting out bs numbers.
 
Joined
Jul 9, 2015
Messages
3,413 (1.01/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Any idea why the 3080 -> 3090 delta is so high in this game? Maybe vram?
That 3090 performance jump over 3080 is insane compared to typical games. What's the cause? VRAM?
This benchmark doesn't look right: the RTX 2080 Ti is about 3 fps ahead of the RTX 3080, and then the RTX 3090, a card that is 10% faster than the 3080, has a 40% performance leap over it??!


12GB of VRAM used, so, yeah.

That is also how DF crippled Turing in its "early preview" of Ampere, so that Jensen Huang's lies would look a bit less rampant; it was Doom that time, but "doesn't fit into VRAM" again.

[AMD did an Nvidia to this game!]

It is a console exclusive, and consoles have only 16GB of RAM for everything; 2GB of that is actually reserved for the OS, and 10GB of it is actually faster than the remaining 6GB (on the Xbox Series X).
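For reference, the budget implied by those numbers (the 2GB OS reservation is the figure quoted above; the pool bandwidths are the published Series X specs, added here from memory):

```python
# Xbox Series X memory layout as described above
total_gb       = 16
os_reserved_gb = 2    # figure quoted in the post
fast_pool_gb   = 10   # "GPU-optimal" pool (560 GB/s)
slow_pool_gb   = 6    # standard pool (336 GB/s), minus the OS slice

game_budget_gb = total_gb - os_reserved_gb
print(game_budget_gb)  # 14 GB for the whole game, CPU + GPU data combined
```

A port that sizes its GPU assets to fill the fast pool lands near 10GB, so ~12GB of use at 4K on a 24GB card, with allocation headroom, is not a shocking number.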
 

bug

Joined
May 22, 2015
Messages
13,645 (3.99/day)
Joined
Dec 31, 2009
Messages
19,371 (3.59/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Maybe he's talking about another game?

 
  • Like
Reactions: bug

bug

Joined
May 22, 2015
Messages
13,645 (3.99/day)

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,939 (2.36/day)
Location
Louisiana
Processor Core i9-9900k
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax ETS-T50 Black CPU cooler
Memory 32GB (2x16) Mushkin Redline DDR-4 3200
Video Card(s) ASUS RTX 4070 Ti Super OC 16GB
Storage 1x 1TB MX500 (OS); 2x 6TB WD Black; 1x 2TB MX500; 1x 1TB BX500 SSD; 1x 6TB WD Blue storage (eSATA)
Display(s) Infievo 27" 165Hz @ 2560 x 1440
Case Fractal Design Define R4 Black -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic Focus GX-1000 Gold
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
Who did the testing, and how did you do it? Because those results make zero sense and go against any kind of logic. Have you double- and triple-checked those results? Verified them on a different system? Reached out to the game developer and GPU manufacturers for public statements? Because that is what you should do when benchmarks are wildly outside expected behavior.

And no, it is not a VRAM issue. If it were, then the 3070 (8GB) would not pull ahead of the 2080 Ti (11GB) at 4K, where VRAM demand is highest, when those two cards perform pretty much the same across most games. And drivers are also a questionable explanation at best, because the 3080 and 3090 literally use the same chip and the same GDDR6X memory. Unless memory usage is well above 10GB, there should not be a giant fps advantage for the 3090.

It looks like a user/reviewer error or some system/driver corruption somewhere while testing those cards. I don't want to accuse you guys of failing at benchmarking, but for the sake of trustworthiness you should not just put those illogical results out there and say "this is how it is"; investigate the issue. In the best case, you hold those benchmarks back until you get a response from the developers/AMD/Nvidia saying "this is expected performance because x" or "we are aware of the issue and are working on a fix".

You should strive for maximal trustworthiness and not become a second UserBenchmark putting out bs numbers.
Really? Because W1zzard is just "some guy" who hasn't been reviewing and reliably benchmarking for many years, and isn't well-respected? LOL.

Oh, and welcome to TPU. :wtf:
 

Coraptor

New Member
Joined
Nov 16, 2020
Messages
3 (0.00/day)
Really? Because W1zzard is just "some guy" who hasn't been reviewing and reliably benchmarking for many years, and isn't well-respected? LOL.

Oh, and welcome to TPU. :wtf:
Hardware Unboxed just released their Godfall benchmarks, and those follow the expected performance-scaling behavior across multiple GPUs; they are not subject to the illogical behavior seen in this review. If you compare all the cards in both reviews, you can see that most of them actually perform pretty similarly, but some (like the 3080, for example) are just way worse in this TPU benchmark. The last driver update was on November 9th, so both reviews should have used the latest drivers. This leads me to believe that something went wrong in the TPU review; I'm like 99% sure there was an error, such as an issue when installing drivers. It can happen, and it should be caught before publishing the data, but the most important part is fixing it now. Whenever something is not performing as expected, you should be cautious and validate it.
 
Last edited:

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,594 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
I retested RTX 3080 and the results are totally different, and now in line with expectations. Charts have been updated. Not sure what happened.

Will retest 3070 and 3090, too

edit: 3070 was wrong, too. 3090, 2080 Ti, 5700 XT are fine
 
Last edited:

Coraptor

New Member
Joined
Nov 16, 2020
Messages
3 (0.00/day)
I retested RTX 3080 and the results are totally different, and now in line with expectations. Charts have been updated. Not sure what happened.

Will retest 3070 and 3090, too
Thank you!
Also, props to you for actually retesting and not being like "I've done this hundreds of times, there is no way my results could be inaccurate!". I still think it should have been caught pre-publication, but mistakes can happen, and acknowledging and fixing them is even more important. It might not even have been an actual user error: software can be finicky, and sometimes drivers get corrupted or partially fail their installation because Windows felt like it, or decided to run some "random, totally ultra-important task" during a benchmark.

There is so much talk about hardware these days, in the middle of two giant GPU series launches. With so much questionable or straight-up wrong information in those discussions, it is just so important that review and benchmarking sites, which will undoubtedly be linked as sources hundreds of times during those debates, strive for maximum accuracy. I am glad to see that we are doing just that. :clap:
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,594 (3.70/day)
I still think it should have been caught pre-publication
Agreed, and I did fail here indeed, because I saw the oddities and didn't double-check them. Maybe it's because I was so unimpressed with the game that I just wanted to get it over with
 
Joined
Dec 22, 2011
Messages
3,890 (0.83/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
Looks like those blaming VRAM should just blame the W1zzmeister instead, no matter how you spin it.
 
Last edited:
Joined
Jul 10, 2015
Messages
754 (0.22/day)
Location
Sokovia
System Name Alienation from family
Processor i7 7700k
Motherboard Hero VIII
Cooling Macho revB
Memory 16gb Hyperx
Video Card(s) Asus 1080ti Strix OC
Storage 960evo 500gb
Display(s) AOC 4k
Case Define R2 XL
Power Supply Be f*ing Quiet 600W M Gold
Mouse NoName
Keyboard NoNameless HP
Software You have nothing on me
Benchmark Scores Personal record 100m sprint: 60m
There's a very clear correlation between compute and gaming performance here; AMD made a good choice separating into RDNA and CDNA, as GCN failed at gaming. What bothers me about Ampere is the massive deficit in pixel fillrate against RDNA2. I know it's not 2003 anymore, but fillrate matters.
 
Joined
Sep 17, 2014
Messages
22,042 (6.01/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Joined
Aug 20, 2007
Messages
21,283 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 905p Optane 960GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
Absolutely, but I can't anymore, because he fixed it already

And admitted he was wrong, which is so not 2020-like behavior. It's refreshing.
 

bug

Joined
May 22, 2015
Messages
13,645 (3.99/day)
And admitted he was wrong, which is so not 2020-like behavior. It's refreshing.
Was he wrong, though? I was under the impression he did what he always does, but the setup had a glitch. It was all normal upon retesting.
 

HTC

Joined
Apr 1, 2008
Messages
4,656 (0.77/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 5800X3D
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Pulse 6600 8 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 20.04.6 LTS
I retested RTX 3080 and the results are totally different, and now in line with expectations. Charts have been updated. Not sure what happened.

Will retest 3070 and 3090, too

edit: 3070 was wrong, too. 3090, 2080 Ti, 5700 XT are fine

In this and future situations where a review is updated, for whatever reason, it may be a good idea to add something like " - Updated XX / YY / ZZZZ" to the title.

This way, a visitor becomes aware that the contents of the review have changed since the date provided.

EDIT

Also, since the review is now updated, it's essentially "a new review", so you could add it to that new feature you introduced with that Lexar SSD review, even if "in updated form".
 
Last edited:
  • Like
Reactions: bug
Joined
Aug 20, 2007
Messages
21,283 (3.40/day)
Was he wrong, though? I was under the impression he did what he always does, but the setup had a glitch. It was all normal upon retesting.

That's still wrong, just not his fault.
 
  • Like
Reactions: bug
Joined
Nov 26, 2020
Messages
106 (0.08/day)
Location
Germany
System Name Meeeh
Processor 8700K at 5.2 GHz
Memory 32 GB 3600/CL15
Video Card(s) Asus RTX 3080 TUF OC @ +175 MHz
Storage 1TB Samsung 970 Evo Plus
Display(s) 1440p, 165 Hz, IPS
The game gets horrible reviews, lol; some reviewers say they would rather watch paint dry.

Funny to see the 3070 smash the 2080 Ti at 4K, tho; this will happen in pretty much all new games going forward

This means the game does not break the 8GB requirement at 4K like the article is saying; it's only on the 3090 because 24GB means higher allocation
You should have been telling us the VRAM usage at 4K using an 8GB card

Minimum fps would have been useful too

Funny that this game can't do 21:9 at all; a friend of mine uninstalled it after 20 mins because of the black bars :laugh:
 
Last edited: