
Godfall Benchmark Test & Performance Analysis

Joined
Sep 20, 2018
Messages
1,451 (0.65/day)
This benchmark doesn't look right: the RTX 2080 Ti is about 3 FPS ahead of the RTX 3080, and then the RTX 3090, a card that is normally ~10% faster than the 3080, has a 40% performance leap over it?!

I'm pressing X to doubt on this one.
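The arithmetic being called out can be sanity-checked in a few lines; the FPS values below are placeholders chosen to match the complaint, not numbers from the review:

```python
# Sanity check on the deltas described in the post. The FPS numbers are
# illustrative placeholders, not figures from the actual review chart.
fps = {"RTX 2080 Ti": 63.0, "RTX 3080": 60.0, "RTX 3090": 84.0}

def delta(a, b):
    """Percentage lead of card a over card b."""
    return (fps[a] / fps[b] - 1) * 100

print(f"2080 Ti vs 3080: {delta('RTX 2080 Ti', 'RTX 3080'):+.0f}%")
print(f"3090 vs 3080:    {delta('RTX 3090', 'RTX 3080'):+.0f}%")
# A 3090 is usually only ~10% ahead of a 3080, so a ~40% gap is the
# anomaly being questioned here.
```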
 
Joined
Jun 27, 2019
Messages
2,105 (1.07/day)
Location
Hungary
System Name I don't name my systems.
Processor i5-12600KF 'stock power limits/-115mV undervolt+contact frame'
Motherboard Asus Prime B660-PLUS D4
Cooling ID-Cooling SE 224 XT ARGB V3 'CPU', 4x Be Quiet! Light Wings + 2x Arctic P12 black case fans.
Memory 4x8GB G.SKILL Ripjaws V DDR4 3200MHz
Video Card(s) Asus TuF V2 RTX 3060 Ti @1920 MHz Core/@950mV Undervolt
Storage 4 TB WD Red, 1 TB Silicon Power A55 Sata, 1 TB Kingston A2000 NVMe, 256 GB Adata Spectrix s40g NVMe
Display(s) 29" 2560x1080 75Hz / LG 29WK600-W
Case Be Quiet! Pure Base 500 FX Black
Audio Device(s) Onboard + Hama uRage SoundZ 900+USB DAC
Power Supply Seasonic CORE GM 500W 80+ Gold
Mouse Canyon Puncher GM-20
Keyboard SPC Gear GK630K Tournament 'Kailh Brown'
Software Windows 10 Pro
That's too bad, I was kind of interested in this game ('no, I'm not a kid') because I like looter games such as the mentioned Borderlands.

But I'd rather not try this on my 4GB 570 at ultrawide res. I guess I'll check it out sometime next year with a new GPU, once the game is discounted/patched.
 
Joined
Apr 18, 2013
Messages
1,260 (0.30/day)
Location
Artem S. Tashkinov
I've seen screenshots, and IMO the graphics in this game are subpar and the performance optimizations are just not there.

It's shiny and bright - I'll give it that.
 
Joined
Feb 19, 2009
Messages
1,161 (0.20/day)
Location
I live in Norway
Processor R9 5800x3d | R7 3900X | 4800H | 2x Xeon gold 6142
Motherboard Asrock X570M | AB350M Pro 4 | Asus Tuf A15
Cooling Air | Air | duh laptop
Memory 64gb G.skill SniperX @3600 CL16 | 128gb | 32GB | 192gb
Video Card(s) RTX 4080 |Quadro P5000 | RTX2060M
Storage Many drives
Display(s) AW3423dwf.
Case Jonsbo D41
Power Supply Corsair RM850x
Mouse g502 Lightspeed
Keyboard G913 tkl
Software win11, proxmox
The RTX 3090 has about 20% more SMs (and all the accompanying resources) than the 3080, and about 23% more memory bandwidth.
It must be VRAM limitations at play here, because you just don't get better-than-linear scaling from adding more SMs...
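For reference, the spec deltas quoted above, computed from the commonly published reference-card numbers (treat them as assumptions if your sources differ):

```python
# Spec ratios behind the 20% / 23% figures in the post, using the
# commonly published reference specs for both cards.
specs = {
    "RTX 3080": {"sms": 68, "bandwidth_gbps": 760.3},
    "RTX 3090": {"sms": 82, "bandwidth_gbps": 936.2},
}

sm_gain = specs["RTX 3090"]["sms"] / specs["RTX 3080"]["sms"] - 1
bw_gain = (specs["RTX 3090"]["bandwidth_gbps"]
           / specs["RTX 3080"]["bandwidth_gbps"] - 1)

print(f"SM advantage:        {sm_gain:.1%}")   # ~20.6%
print(f"Bandwidth advantage: {bw_gain:.1%}")   # ~23.1%
# A ~40% FPS gap therefore cannot come from compute or bandwidth alone;
# the 24 GB vs 10 GB VRAM difference is the remaining suspect.
```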
 
Joined
Jul 18, 2016
Messages
354 (0.12/day)
Location
Indonesia
System Name Nero Mini
Processor AMD Ryzen 7 5800X 4.7GHz-4.9GHz
Motherboard Gigabyte X570i Aorus Pro Wifi
Cooling Noctua NH-D15S+3x Noctua IPPC 3K
Memory Team Dark 3800MHz CL16 2x16GB 55ns
Video Card(s) Palit RTX 2060 Super JS Shunt Mod 2130MHz/1925MHz + 2x Noctua 120mm IPPC 3K
Storage Adata XPG Gammix S50 1TB
Display(s) LG 27UD68W
Case Lian-Li TU-150
Power Supply Corsair SF750 Platinum
Software Windows 10 Pro
Weird how the 3080-to-3090 delta is so high once a game uses over 8GB... when the 3080 has 10GB. I'm getting GTX 970 3.5GB vibes here...

Can someone who has a 3080 test whether benchmarks suddenly see a huge drop in memory bandwidth and performance after more than 8GB of VRAM is used? Either that, or the game is too reliant on VRAM bandwidth.
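One way to run the check being requested is to poll VRAM usage while the benchmark runs. A minimal sketch, assuming `nvidia-smi` is available on PATH; the canned CSV row at the bottom is a made-up example so the parsing is visible without a GPU:

```python
import subprocess

def parse_nvidia_smi_line(line: str) -> dict:
    """Parse one CSV row produced by:
    nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader,nounits
    """
    used, total = (int(x) for x in line.split(","))
    return {"used_mib": used, "total_mib": total, "pct": 100 * used / total}

def sample_vram() -> dict:
    """Query the first GPU. Requires the NVIDIA driver and nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()[0]
    return parse_nvidia_smi_line(out)

# Canned example row (a hypothetical 8.4 GB in use on a 10 GB RTX 3080):
print(parse_nvidia_smi_line("8600, 10240"))
```

Logging `sample_vram()` once a second next to the in-game FPS counter would show whether frame rate dips line up with usage crossing a capacity threshold.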
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,978 (2.35/day)
Location
Louisiana
Processor Core i9-9900k
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax ETS-T50 Black CPU cooler
Memory 32GB (2x16) Mushkin Redline DDR-4 3200
Video Card(s) ASUS RTX 4070 Ti Super OC 16GB
Storage 1x 1TB MX500 (OS); 2x 6TB WD Black; 1x 2TB MX500; 1x 1TB BX500 SSD; 1x 6TB WD Blue storage (eSATA)
Display(s) Infievo 27" 165Hz @ 2560 x 1440
Case Fractal Design Define R4 Black -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic Focus GX-1000 Gold
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
I think PCIe Gen 3 is the problem behind TPU's unusual results, like the 3080 and 3070 landing behind the 2080 Ti at 1080p.
Nah, I don’t think we are at the “end of times” for PCIe 3.0 yet.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
13,006 (2.51/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | Asus 24" IPS (portrait mode)
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Corsair K65 Plus 75% Wireless - USB Mode
Software Windows 11 Pro 64-Bit
Weird how the 3080-to-3090 delta is so high once a game uses over 8GB... when the 3080 has 10GB. I'm getting GTX 970 3.5GB vibes here...

Can someone who has a 3080 test whether benchmarks suddenly see a huge drop in memory bandwidth and performance after more than 8GB of VRAM is used? Either that, or the game is too reliant on VRAM bandwidth.

The GTX 970 3.5GB thing was way overblown by the vast majority of people who don't understand how this sort of thing works.
 
Joined
Sep 17, 2014
Messages
22,317 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Any idea why the 3080 -> 3090 delta is so high in this game? Maybe VRAM?

It seems a bit like the 3080 is either notoriously shit in this specific game, or it was forgotten when releasing the driver. It even ends up below a 2080 Ti at 1080p; that just can't be right.

Or it's right and it's truly a shit GPU :D But thát shit? Impossibru!

Also... I will reiterate: 10GB is already looking like a very tight balance. On the first PS5 'native' launch title...

And in much the same vein, it seems the 2080 Ti is more of a 4K card than the 3070 will ever be... because of VRAM caps. It's Nvidia doing as Nvidia does. You never get the full deal when they kick things down the stack. I'm certainly going to wait and see where these 4K VRAM caps land in the near future before I jump on another 8GB GPU.


BTW... it seems we can forget that 12GB VRAM required 'rumor' / 'leak' / 'Tuber nonsense' / (insert whatever you like) for this game.

The GTX 970 3.5GB thing was way overblown by the vast majority of people who don't understand how this sort of thing works.

And yet Nvidia lost a case over it and had to pay out. What value you attribute to that is truly irrelevant; the complaints were real, and so were the problems. The 970 didn't perform as well as a true 4GB version of it should have. Driver trickery was needed to keep smart allocation within the first 3.5GB (I remember Far Cry 3... holy shit, what a mess), so effectively you really did miss out on half a gig. In addition, the card was that much less useful in SLI, where frametime variance was quite a lot worse than on dual 980s, for example.

None of this is overblown; it's just being nuanced down to nothingness by those who feel whatever way they feel about it. But the reality doesn't change: you got lied to, and you turn a blind eye to it by saying 'it's not so bad'. Effectively, that tells a company 'I'm fine with this'. Not really the message a responsible consumer would want to convey - if you care a little bit about your next GPU purchase.

And lo and behold... Pascal had all-symmetrical VRAM buses, and Turing had much the same. Ampere similarly 'gets around bandwidth inconsistency' between different capacities of VRAM. This is no coincidence.
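For anyone who wasn't around for it, the widely reported GTX 970 memory-segment figures work out like this (a sketch using the numbers from the 2015 disclosure; treat them as approximate):

```python
# The GTX 970's segmented memory in numbers. Per the 2015 disclosure,
# 7 of the 8 32-bit memory channels serve the fast 3.5 GB segment and
# the last channel serves the remaining 0.5 GB on its own.
ADVERTISED_GBPS = 224
CHANNELS = 8
PER_CHANNEL_GBPS = ADVERTISED_GBPS / CHANNELS   # 28 GB/s per channel

fast_bw = 7 * PER_CHANNEL_GBPS                  # bandwidth of the 3.5 GB segment
slow_bw = 1 * PER_CHANNEL_GBPS                  # bandwidth of the 0.5 GB segment

print(f"Fast 3.5 GB segment: {fast_bw:.0f} GB/s")
print(f"Slow 0.5 GB segment: {slow_bw:.0f} GB/s "
      f"({slow_bw / ADVERTISED_GBPS:.0%} of the advertised {ADVERTISED_GBPS} GB/s)")
# Any allocation spilling into the last 0.5 GB ran at 1/8th of the
# advertised bandwidth, which is why the driver tried so hard to keep
# allocations inside the first 3.5 GB.
```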
 
Joined
Jun 25, 2018
Messages
14 (0.01/day)
Very poor performance and mediocre graphics...


For example, the old Ryse: Son of Rome (2013) has a better level of graphics at the same performance, but in 4K.
 
Joined
Sep 17, 2014
Messages
22,317 (6.03/day)
Very poor performance and mediocre graphics...


For example, the old Ryse: Son of Rome (2013) has a better level of graphics at the same performance, but in 4K.

I think it looks and runs fine given the GPU you use; what did you expect? Free RT and 120 FPS at max settings on yesteryear's midrange?

Also, Ryse is as on-rails as it gets, not really a good example I'd say. Godfall isn't thát much different, but it still has many more assets in play.
 
Joined
Jun 25, 2018
Messages
14 (0.01/day)
I think it looks and runs fine given the GPU you use; what did you expect? Free RT and 120 FPS at max settings on yesteryear's midrange?

Also, Ryse is as on-rails as it gets, not really a good example I'd say.
In this game that GPU performs at the level of a ~2070 Super / 1080 Ti; look at the TPU benchmarks. But what does RT have to do with it? This game doesn't have RT yet; it will come later, after a patch. For now, reflections = SSR.
 
Joined
Sep 17, 2014
Messages
22,317 (6.03/day)
In this game that GPU performs at the level of a ~2070 Super / 1080 Ti; look at the TPU benchmarks. But what does RT have to do with it? This game doesn't have RT yet; it will come later, after a patch. For now, reflections = SSR.

It caps out faster, but for your GPU it seems to me performance is where it should be. Even without RT.
 
Joined
Sep 17, 2014
Messages
22,317 (6.03/day)
Graphics level the same, performance three times worse. Ok, understood...

Well... look at the results of other GPUs in the stack. Most of the way up to the 1080 Ti the usual ordering seems maintained; then Turing GPUs with 8GB and more come up quite well (more L2 cache), and Ampere tops the chart (even more changes to memory handling, better suited to the new console generation). My analysis here is that the 3090 tops the chart so hard (even at 1080p, with a lower VRAM requirement) because the current crop of GPUs misses some magic trick for doing stuff in VRAM that the 3090 simply doesn't need, because it has so much.

If you then compare 1080p to 1440p, Vega and a crop of similar GPUs fall off quite hard. More tax on VRAM? Additionally, this would also explain why the 2080 Ti with its 11GB can reach past the 3080, which has 10 and overall seems a bad performer here.

Quite possibly these results will change quite a bit with new drivers?
 
Joined
Jan 8, 2017
Messages
9,395 (3.29/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
No matter how you spin it, VRAM seems like the only explanation for this unusual scaling.
 
Joined
Jul 20, 2016
Messages
56 (0.02/day)
And just when I thought 4K gaming was here. Looks like the new cards can't keep up with the ever-declining quality of console ports. Give it a couple of years and the 3080 will be good for 1080p only...
 

bug

Joined
May 22, 2015
Messages
13,718 (3.97/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Do you plan to test this game in future CPU reviews? I was pretty interested because it recommends a 3600X, which I think is the highest of any game. But now, looking at these results, I don't think it would be that useful. It seems the GPU will always be the bottleneck.
It's (yet another) UE4 game, so probably not.
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,978 (2.35/day)
And just when I thought 4K gaming was here. Looks like the new cards can't keep up with the ever-declining quality of console ports. Give it a couple of years and the 3080 will be good for 1080p only...
Because 4K is an ever-moving goalpost. That’s why in a few years you’ll want the next model, the 4080, not the 3080.
 
Joined
Sep 2, 2020
Messages
1,491 (0.98/day)
System Name Chip
Processor Amd 5600X
Motherboard MSI B450M Mortar Max
Cooling Hyper 212
Memory 2x 16g ddr4 3200mz
Video Card(s) RX 6700
Storage 5.5 tb hd 220 g ssd
Display(s) Normal monitor
Case something cheap
VR HMD Vive
What settings is this game running at in the benchmark?
Would I be able to get my 580 8GB pushing 60 FPS at 1080p?
 
Joined
Dec 31, 2009
Messages
19,371 (3.57/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
What settings is this game running at in the benchmark?
Would I be able to get my 580 8GB pushing 60 FPS at 1080p?
There is a page that shows the settings he ran the benchmark at. The 580 was tested and showed 30 FPS at 1080p. Did you read the article?
 
Joined
Sep 2, 2020
Messages
1,491 (0.98/day)
System Name Chip
Processor Amd 5600X
Motherboard MSI B450M Mortar Max
Cooling Hyper 212
Memory 2x 16g ddr4 3200mz
Video Card(s) RX 6700
Storage 5.5 tb hd 220 g ssd
Display(s) Normal monitor
Case something cheap
VR HMD Vive
There is a page that shows the settings he ran the benchmark at. The 580 was tested and showed 30 FPS at 1080p. Did you read the article?

Yes. I'm asking if lowering the settings will allow me to get 60.
 
Joined
Nov 16, 2020
Messages
43 (0.03/day)
Processor i7-11700F, undervolt 3.6GHz 0.96V
Motherboard ASUS TUF GAMING B560-PLUS WIFI
Cooling Cooler Master Hyper 212 Black Edition, 1x12cm case FAN
Memory 2x16GB DDR4 3200MHz CL16, Kingston FURY (KF432C16BBK2/32)
Video Card(s) GeForce RTX 2060 SUPER, ASUS DUAL O8G EVO V2, 70%, +120MHz core
Storage Crucial MX500 250GB, Crucial MX500 500GB, Seagate Barracuda 2.5" 2TB
Display(s) DELL P2417H
Case Fractal Design Focus G Black
Power Supply 550W, 80+ Gold, SilentiumPC Supremo M2 SPC140 rev 1.2
Mouse E-BLUE Silenz
Keyboard Genius KB-110X
I see the main "problem" with the Ampere architecture in games as this: each sub-unit in an SM consists of two blocks, one proper block with only FP32 units and a second with a concurrent mix of INT32 and FP32 ops. I assume that without better shader compiling, games use only the first block (half of the total shader units). So in some cases the 2080 Ti with its proper 4352 CUDA cores can easily outperform the 3070 with its proper 2944 CUDA cores (5888/2). Godfall is a good example, as it uses tons of shader effects in materials... it was probably cooked too fast during development, and we see not-so-good quality of coding.
Applications like LuxMark, 3DMark, etc. are better optimized for GPU utilization; there we can see a massive performance boost, almost the theoretical one (Turing vs Ampere).
Nvidia is aware of all this, and therefore the MSRPs of the Ampere TeraFlops monsters are not higher than Turing GPUs'.
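A back-of-the-envelope throughput model makes the point above concrete. The datapath widths are the published per-SM figures (Turing: 64 FP32 + 64 INT32 lanes; Ampere: 64 dedicated FP32 + 64 shared FP32/INT32 lanes); the instruction mixes are illustrative assumptions:

```python
# Per-SM, per-clock FP32 throughput model for a stream in which
# int_frac of the instructions are INT32.

def ampere_fp32_rate(int_frac: float) -> float:
    """Ampere SM: INT32 can only use the 64 shared lanes, FP32 can use
    all 128, so issue is limited either by total width or by the
    INT32-capable lanes."""
    cycles_per_op = max(1 / 128, int_frac / 64)
    return (1 - int_frac) / cycles_per_op

def turing_fp32_rate(int_frac: float) -> float:
    """Turing SM: INT32 runs on its own 64-lane block, so FP32
    throughput stays at 64/clock for any mix under 50% INT32."""
    cycles_per_op = max((1 - int_frac) / 64, int_frac / 64)
    return (1 - int_frac) / cycles_per_op

for mix in (0.0, 0.25, 0.35):
    print(f"INT32 mix {mix:.0%}: Ampere {ampere_fp32_rate(mix):.1f} "
          f"vs Turing {turing_fp32_rate(mix):.1f} FP32/clock per SM")
# At 0% INT32 the Ampere SM hits its headline 128 FP32/clock, but at an
# illustrative 25-35% INT32 mix it drops to ~96-83/clock - only some
# 30-50% ahead of Turing's 64, not 2x, which matches what games show.
```

This also suggests an answer to the scheduling question: the shared block already issues whichever type is pending per cycle, so the cost is inherent to sharing lanes rather than to compilation alone.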
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,707 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
I see the main "problem" with the Ampere architecture in games as this: each sub-unit in an SM consists of two blocks, one proper block with only FP32 units and a second with a concurrent mix of INT32 and FP32 ops. I assume that without better shader compiling, games use only the first block (half of the total shader units). So in some cases the 2080 Ti with its proper 4352 CUDA cores can easily outperform the 3070 with its proper 2944 CUDA cores (5888/2). Godfall is a good example, as it uses tons of shader effects in materials... it was probably cooked too fast during development, and we see not-so-good quality of coding.
Applications like LuxMark, 3DMark, etc. are better optimized for GPU utilization; there we can see a massive performance boost, almost the theoretical one (Turing vs Ampere).
Nvidia is aware of all this, and therefore the MSRPs of the Ampere TeraFlops monsters are not higher than Turing GPUs'.
Welcome to the forums, best first post I've seen here in a long time :)

Shouldn't it be trivial in such cases to run the second block in just one mode (INT or FP)?
 