
New GameTech GPU benchmark. Share your results! (STEAM page live now)

Zerion

New Member
Joined
Aug 13, 2024
Messages
6 (0.06/day)
Hi @freeagent @yzonker @agent_x007 @stahlhart and thank you!

About the horizontal resolution, you don't need to worry; it's by design. The vertical resolution is the important one. The horizontal one is rendered to maintain the aspect ratio, and the rest is filled with black bars, without affecting the performance result.

About the connection issues: they were due to a slow response from the server I use to check the real-time global date (http://worldtimeapi.org/). It's solved now; I left a longer time gap to get a valid server reply and added a second, alternative server check. I don't know how such an important server can be so slow these days.

Thanks again!

Hi Miguel, thanks for creating the benchmark.

I registered an account here just to inform you that I encountered the same "connection" issue on my first launch. Was the server issue really resolved?
 

Ruru

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
12,697 (2.91/day)
Location
Jyväskylä, Finland
System Name 4K-gaming
Processor AMD Ryzen 7 5800X @ PBO +200 -20CO
Motherboard Asus ROG Crosshair VII Hero
Cooling Arctic Freezer 50, EKWB Vector TUF
Memory 32GB Kingston HyperX Fury DDR4-3466
Video Card(s) Asus GeForce RTX 3080 TUF OC 10GB
Storage 3.3TB of SSDs + 3TB USB3.0 HDDs
Display(s) 27" 4K120 IPS + 32" 4K60 IPS + 24" 1080p60
Case Corsair 4000D Airflow White
Audio Device(s) Asus TUF H3 Wireless / Corsair HS35
Power Supply EVGA Supernova G2 750W
Mouse Logitech MX518 + Asus ROG Strix Edge Nordic
Keyboard Roccat Vulcan 121 AIMO
VR HMD Oculus Rift CV1
Software Windows 11 Pro
Benchmark Scores It runs Crysis
Raster 4K with RTX 3080

1723598795597.png


Raster 1080p with RX 6700 XT

1723599412574.png
 
Joined
Sep 29, 2020
Messages
83 (0.05/day)
Thank you all!

@PierreJG I will add it to my to-do list, but with the lowest priority. Anyway, I'm quite sure it won't be feasible, as Shader Model 6 is required due to the ultra-high-poly models and in order to make Nanite work. Without Nanite it would have to use high-poly LODs, so it would run heavily anyway, or even worse. As this was designed with current gaming PCs in mind, and to last some years with future ones too, it would probably need to be fully reworked to run smoothly on very weak PCs. For that case, I think a more basic benchmark would make more sense. But I will take a look into it once the higher-priority things are completed.

@stahlhart, what kind of graphics corruption did you see? Blurry textures, maybe because VRAM was a little exhausted? Do you remember whether the result was the same as when it ran fine? (This usually shouldn't affect performance.) I have also noticed the not-responding issue, but only on my laptop, never on the desktop. I think it only happens when I try to saturate the laptop by pressing additional keys right after launching the exe (like the Windows key). Anyway, this may just be an Unreal issue that I hope they resolve/improve. This build uses Unreal 5.3 (5.4 was skipped). The final version of this benchmark is scheduled for UE 5.5, which I hope will be more stable, with many internal Unreal bugs fixed (some reported by myself and approved) and some interesting additions (like Path Tracing adaptive sampling and stochastic shadows).

@Zerion I have revisited it and it seems it was already fine, just slow servers (worldtimeapi.org and timeapi.io). I have reinforced the code and left a longer time gap (7.5 seconds per server check before returning a timeout). It should work now, unless both servers go totally down. This expiration-date 'strategy' will probably be removed when it's released on Steam.
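A rough sketch of how such a dual-server check with a per-server timeout can look (hypothetical Python, not the benchmark's actual C++/Unreal code; the endpoint paths are the public ones for the two services named above):

```python
# Try each time server in order; a slow or down server falls through to the next.
import json
import urllib.request

TIME_SERVERS = [
    "http://worldtimeapi.org/api/timezone/Etc/UTC",            # primary
    "https://timeapi.io/api/Time/current/zone?timeZone=UTC",   # alternative
]

def first_reply(fetchers):
    """Return the result of the first callable that succeeds, or None if all fail."""
    for fetch in fetchers:
        try:
            return fetch()
        except OSError:
            continue  # timeout or unreachable: try the next server
    return None

def fetch_global_date(timeout=7.5):
    """Query each time server in turn, allowing 7.5 s per server before giving up."""
    def make_fetcher(url):
        def fetch():
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                return json.load(resp)
        return fetch
    return first_reply([make_fetcher(u) for u in TIME_SERVERS])
```

Only if both servers fail does `fetch_global_date` return None, matching the "unless both servers go totally down" failure mode.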


Thanks all!
 
Joined
Aug 9, 2024
Messages
99 (0.94/day)
Location
Michigan, United States
Processor Intel i7-13700K
Motherboard Asus ROG Strix Z690-E Gaming
Cooling NZXT Kraken Elite 360
Memory G.Skill Trident Z DDR5-6400
Video Card(s) MSI RTX 4090 Suprim X Liquid
Storage Western Digital SN850X 4Tb x 4
Case Corsair 5000D Airflow
Audio Device(s) Creative AE-5 Plus
Power Supply Corsair HX-1200
Software Windows 11 Pro 23H2
@stahlhart, what kind of graphics corruption did you see? Blurry textures, maybe because VRAM was a little exhausted? Do you remember whether the result was the same as when it ran fine? (This usually shouldn't affect performance.) I have also noticed the not-responding issue, but only on my laptop, never on the desktop. I think it only happens when I try to saturate the laptop by pressing additional keys right after launching the exe (like the Windows key). Anyway, this may just be an Unreal issue that I hope they resolve/improve. This build uses Unreal 5.3 (5.4 was skipped). The final version of this benchmark is scheduled for UE 5.5, which I hope will be more stable, with many internal Unreal bugs fixed (some reported by myself and approved) and some interesting additions (like Path Tracing adaptive sampling and stochastic shadows).

If I remember correctly the performance was consistent, but there was just some flickering in the textures as it ran. It was just that one time, and I haven't seen it since. It was about the same as far as the black screens went -- happened a few times, and then stopped. The benchmark has been running fine since. If it acts up again, I'll keep you posted.
 

Zerion

New Member
Joined
Aug 13, 2024
Messages
6 (0.06/day)
Thank you all!

@PierreJG I will add it to my to-do list, but with the lowest priority. Anyway, I'm quite sure it won't be feasible, as Shader Model 6 is required due to the ultra-high-poly models and in order to make Nanite work. Without Nanite it would have to use high-poly LODs, so it would run heavily anyway, or even worse. As this was designed with current gaming PCs in mind, and to last some years with future ones too, it would probably need to be fully reworked to run smoothly on very weak PCs. For that case, I think a more basic benchmark would make more sense. But I will take a look into it once the higher-priority things are completed.

@stahlhart, what kind of graphics corruption did you see? Blurry textures, maybe because VRAM was a little exhausted? Do you remember whether the result was the same as when it ran fine? (This usually shouldn't affect performance.) I have also noticed the not-responding issue, but only on my laptop, never on the desktop. I think it only happens when I try to saturate the laptop by pressing additional keys right after launching the exe (like the Windows key). Anyway, this may just be an Unreal issue that I hope they resolve/improve. This build uses Unreal 5.3 (5.4 was skipped). The final version of this benchmark is scheduled for UE 5.5, which I hope will be more stable, with many internal Unreal bugs fixed (some reported by myself and approved) and some interesting additions (like Path Tracing adaptive sampling and stochastic shadows).

@Zerion I have revisited it and it seems it was already fine, just slow servers (worldtimeapi.org and timeapi.io). I have reinforced the code and left a longer time gap (7.5 seconds per server check before returning a timeout). It should work now, unless both servers go totally down. This expiration-date 'strategy' will probably be removed when it's released on Steam.


Thanks all!
Thanks, it is working now.
 
Joined
Aug 23, 2017
Messages
113 (0.04/day)
System Name DELL 3630
Processor I7 8700K
Memory 32 gig
Video Card(s) 4070
Path tracing.


I couldn't change to the other standard resolutions: 1440p was 2700x1440 instead of 2560x1440, and 4K was 4096x2160 instead of 3840x2160.
 
Joined
Sep 29, 2020
Messages
83 (0.05/day)
Thank you all three.

Interesting, @stahlhart. I think it may have been something related to virtual textures (textures streamed on demand, a fairly new feature in Unreal). Thanks!

@AvrageGamr great score, like a 3090! About the resolution: Path Tracing is always rendered at 1080p, as resolution is not important for that test, and it helps by needing less VRAM, making it compatible with more GPUs with lower VRAM. For the realtime tests, the horizontal resolution is automatically calculated to preserve your screen's aspect ratio instead of deforming the image.
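As a sketch of that rule (my own illustration in Python, not the benchmark's actual code): the preset fixes the vertical resolution, and the horizontal one is derived from the screen's aspect ratio.

```python
def bench_resolution(target_height, screen_w, screen_h):
    # Keep the physical screen's aspect ratio at the preset's vertical
    # resolution, so the image is never deformed.
    width = round(target_height * screen_w / screen_h)
    return (width, target_height)
```

This reproduces the resolutions reported in the thread: a 16:9 screen gets 2560x1440 at the 1440p preset, while a 2560x1080 ultrawide gets 5120x2160 at the 2160p preset.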
 

Zerion

New Member
Joined
Aug 13, 2024
Messages
6 (0.06/day)
1440p RT and PT from 7900XT

PT result for AMD seems a bit low
 

Attachments

  • 2024-8-16_21-39-35.png (210.9 KB)
  • 2024-8-16_21-44-49.png (119.8 KB)
Joined
Aug 23, 2017
Messages
113 (0.04/day)
System Name DELL 3630
Processor I7 8700K
Memory 32 gig
Video Card(s) 4070
Thank you all three.

Interesting, @stahlhart. I think it may have been something related to virtual textures (textures streamed on demand, a fairly new feature in Unreal). Thanks!

@AvrageGamr great score, like a 3090! About the resolution: Path Tracing is always rendered at 1080p, as resolution is not important for that test, and it helps by needing less VRAM, making it compatible with more GPUs with lower VRAM. For the realtime tests, the horizontal resolution is automatically calculated to preserve your screen's aspect ratio instead of deforming the image.
I mean the other benchmark for fps.
 
Joined
Sep 29, 2020
Messages
83 (0.05/day)
Hi!

Do you have two different screens, one "standard" (16:9 or 4:3) and another one ultrawide?

If yes, please, could you make a fresh run with each screen, unplugging the rest of video outs, and selecting the same settings in the bench?

I want to be sure performance is the same, independently of the screen's aspect ratio.

Thank you very much!
 
Joined
Aug 23, 2017
Messages
113 (0.04/day)
System Name DELL 3630
Processor I7 8700K
Memory 32 gig
Video Card(s) 4070
Hi!

Do you have two different screens, one "standard" (16:9 or 4:3) and another one ultrawide?

If yes, please, could you make a fresh run with each screen, unplugging the rest of video outs, and selecting the same settings in the bench?

I want to be sure performance is the same, independently of the screen's aspect ratio.

Thank you very much!
Hi. If you're referring to me: I use a 40-inch 4K TV, nothing else, and it doesn't have a 4096x2160 resolution, just 3840x2160. The two screenshots show the resolution.


Edit: my TV accepts a 4096x2160 signal, but it's cut off a little on the ends.
 

johnspack

Here For Good!
Joined
Oct 6, 2007
Messages
6,035 (0.96/day)
Location
Nelson B.C. Canada
System Name System2 Blacknet , System1 Blacknet2
Processor System2 Threadripper 1920x, System1 2699 v3
Motherboard System2 Asrock Fatality x399 Professional Gaming, System1 Asus X99-A
Cooling System2 Noctua NH-U14 TR4-SP3 Dual 140mm fans, System1 AIO
Memory System2 64GBS DDR4 3000, System1 32gbs DDR4 2400
Video Card(s) System2 GTX 980Ti System1 GTX 970
Storage System2 4x SSDs + NVme= 2.250TB 2xStorage Drives=8TB System1 3x SSDs=2TB
Display(s) 1x27" 1440 display 1x 24" 1080 display
Case System2 Some Nzxt case with soundproofing...
Audio Device(s) Asus Xonar U7 MKII
Power Supply System2 EVGA 750 Watt, System1 XFX XTR 750 Watt
Mouse Logitech G900 Chaos Spectrum
Keyboard Ducky
Software Archlinux, Manjaro, Win11 Ent 24h2
Benchmark Scores It's linux baby!
Might as well show you what an ancient system does. My GPU temps actually stayed low; I think my VRAM did me in, there was more left in the gas tank....
2024-9-3_4-12-5.png
 
Joined
Sep 29, 2020
Messages
83 (0.05/day)
Hi. If you're referring to me: I use a 40-inch 4K TV, nothing else, and it doesn't have a 4096x2160 resolution, just 3840x2160. The two screenshots show the resolution.

Edit: my TV accepts a 4096x2160 signal, but it's cut off a little on the ends.
Sure! To you, but also to anyone with two different monitors. Thank you for your report and for the edit clarification (I was thinking something was being read wrong, but I see it was correctly detecting your screen's actual max resolution, then).

To anyone with two different physical screens, too: please, I want to compare the final performance of the benchmark when running on an ultrawide monitor vs. running on a 16:9 or even 4:3 monitor on the same PC. (For example, I want to confirm the performance on a 2560x1080 monitor is the same as on a 1920x1080 monitor.)

@johnspack the VRAM killed your score, for sure! I'm sure your card could go higher, but for this benchmark a minimum of 8GB is recommended.

Thank you all!
 
Joined
Apr 28, 2011
Messages
309 (0.06/day)
System Name VENTURI
Processor 2x AMD 7773x Epyc (128/256 cores)
Motherboard Gigabyte MZ72-HB0 Dual socket motherboard
Cooling Air, noctua, heatsinks, silent/low noise
Memory 1.TB 2 LRDIMM ECC REG
Video Card(s) 2x 4090 FE RTX
Storage Raid 0 Micron 9300 Max (15.4TB each / 77TB array - overprovisioned to 64TB) & 8TB OS nvme
Display(s) Asus ProArt PAU32UCG-K
Case TT miniITX P1 (SFF)
Audio Device(s) harmon Kardon speakers / apple
Power Supply 2050w 2050r
Mouse Mad Catz pro X
Keyboard KeyChron Q6 Pro
Software MS 2022 Data Center Server, Ubuntu
Benchmark Scores Gravity mark 144,742 (high score)
Joined
Sep 29, 2020
Messages
83 (0.05/day)

What an impressive machine, @venturi! What do you use it for?



BTW: Updated to version 0.98, with minor cosmetic changes, text changes and typos fixed. Featuring changes to the character's hair, which I wasn't happy with:

Unreal has serious problems when rendering hair strands under certain circumstances, like being lit only by indirect lighting (Lumen GI), as is the case here. It has been a little challenging (trial and error), but this is the best result achievable with the current state of Unreal. Quite decent anyway, in my opinion:
HighresScreenshot00006_.jpg


Expiration date updated to 15th October, as development is a little paused now because of other projects. I hope Epic releases UE 5.5 during October, so I can update the bench and add the new forest level to it. And maybe other exciting news!

Regards!
 
Joined
Aug 9, 2024
Messages
99 (0.94/day)
Location
Michigan, United States
Processor Intel i7-13700K
Motherboard Asus ROG Strix Z690-E Gaming
Cooling NZXT Kraken Elite 360
Memory G.Skill Trident Z DDR5-6400
Video Card(s) MSI RTX 4090 Suprim X Liquid
Storage Western Digital SN850X 4Tb x 4
Case Corsair 5000D Airflow
Audio Device(s) Creative AE-5 Plus
Power Supply Corsair HX-1200
Software Windows 11 Pro 23H2
Looks like it also has some problems rendering humans.
 
Joined
Apr 28, 2011
Messages
309 (0.06/day)
System Name VENTURI
Processor 2x AMD 7773x Epyc (128/256 cores)
Motherboard Gigabyte MZ72-HB0 Dual socket motherboard
Cooling Air, noctua, heatsinks, silent/low noise
Memory 1.TB 2 LRDIMM ECC REG
Video Card(s) 2x 4090 FE RTX
Storage Raid 0 Micron 9300 Max (15.4TB each / 77TB array - overprovisioned to 64TB) & 8TB OS nvme
Display(s) Asus ProArt PAU32UCG-K
Case TT miniITX P1 (SFF)
Audio Device(s) harmon Kardon speakers / apple
Power Supply 2050w 2050r
Mouse Mad Catz pro X
Keyboard KeyChron Q6 Pro
Software MS 2022 Data Center Server, Ubuntu
Benchmark Scores Gravity mark 144,742 (high score)
What an impressive machine, @venturi! What do you use it for?



BTW: Updated to version 0.98, with minor cosmetic changes, text changes and typos fixed. Featuring changes to the character's hair, which I wasn't happy with:

Unreal has serious problems when rendering hair strands under certain circumstances, like being lit only by indirect lighting (Lumen GI), as is the case here. It has been a little challenging (trial and error), but this is the best result achievable with the current state of Unreal. Quite decent anyway, in my opinion:
View attachment 363377

Expiration date updated to 15th October, as development is a little paused now because of other projects. I hope Epic releases UE 5.5 during October, so I can update the bench and add the new forest level to it. And maybe other exciting news!

Regards!
Thank you. I use it for the normal stuff like other folks, as a secondary duty: gaming, email, spreadsheets, surfing the web, etc.

Primary work is in CNNs, R-CNNs, algorithms, and AI training. It's my home PC.

Request:

Enable the multi-GPU feature set of the Unreal engine for heterogeneous GPUs, please.

So far, only Tellusim's GravityMark on the market can do this across all platforms. It would be nice to have mGPU for Unreal in a benchmark; there are also pro benchmarks for renderers, but this would show the capabilities of Unreal.
 
Joined
Sep 20, 2021
Messages
441 (0.38/day)
Processor Ryzen 7 9700x
Motherboard Asrock B650E PG Riptide WiFi
Cooling Underfloor CPU cooling
Memory 2x32GB 6200MT/s
Video Card(s) 4080 SUPER Noctua OC Edition
Storage Kingston Fury Renegade 1TB, Seagate Exos 12TB
Display(s) MSI Optix MAG301RF 2560x1080@200Hz
Case Phanteks Enthoo Pro
Power Supply NZXT C850 850W Gold
Mouse Bloody W95 Max Naraka
9700x
7900 XT
2560x1080p

Interesting how the benchmark allocates the monitor resolution, so 4K for me is 5120x2160 :D
And in the last scene with the body, the VRAM jumped to ~19951MB and it started lagging, so I guess there's not enough VRAM, but the benchmark shows 18.29GB, which isn't right (I repeated it several times).
No problem with the lower resolution of 3414x1440 (??); it works fine.
 

Attachments

  • GameTech_1440p_Lumen.png (6.3 MB)
  • GameTech_1440p_RT.png (6.8 MB)
  • GameTech_2160p_Lumen.jpg (2 MB)
  • GameTech_2160p_RT.jpg (2.3 MB)
  • GameTech_Render.png (2.4 MB)
Joined
May 7, 2023
Messages
642 (1.14/day)
Processor Ryzen 5700x
Motherboard Gigabyte Auros Elite AX V2
Cooling Thermalright Peerless Assassin SE White
Memory TeamGroup T-Force Delta RGB 32GB 3600Mhz
Video Card(s) PowerColor Red Dragon Rx 6800
Storage Fanxiang S660 1TB, Fanxiang S500 Pro 1TB, BraveEagle 240GB SSD, 2TB Seagate HDD
Case Corsair 4000D White
Power Supply Corsair RM750x SHIFT
Hi all!

I have created a new benchmark I hope you like and enjoy! I'm quite proud of the result, even if it's still in its beta phase.

GameTech is a benchmark based on Unreal 5 that aims to measure performance and verify stability on today's and tomorrow's modern PCs. Utilizing the most cutting-edge and innovative gaming technology, it leverages Lumen, Nanite, Virtual Textures, Virtual Shadowmaps, Metahumans... to present an environment that is as realistic, demanding, and optimized as possible, while prioritizing visual quality.

View attachment 353829


What's Lumen? A new global illumination system implemented in Unreal Engine 5 that, preferably using Ray Tracing, allows for dynamically and realistically lighting entire scenes, generating diffuse and occlusion shadows. This achieves a quality similar to what was previously obtained by baking the lighting, but instantly, albeit at a much higher cost.

What's Nanite? A new mesh rendering system that allows for displaying meshes with millions of polygons without heavily overloading the scene, though it has a high base cost. The meshes only show the necessary polygons on screen based on the pixel surface they occupy, allowing for "infinite" completely smooth and partial transitions, even along a single mesh.

There are several benchmarking modes:

  • Without Ray Tracing (Raster): Will use "Lumen Software" if the PC is not compatible with Ray Tracing, or if the user explicitly selects it. Included simply so that older or less powerful cards (within reasonable limits) can still be accommodated.
  • Ray Tracing: Will use "Lumen Hardware." The default standard gaming mode.
  • Path Tracing: Offline rendering of the highest quality. Will use a fixed resolution of Full HD so that VRAM size is less of a burden and focuses more on measuring performance itself. Additionally, this standardizes the result for all PCs.

(Very) minimum requirements for Lumen Software:
  • Windows OS
  • Internet connection
  • 16GB RAM
  • GPU compatible with SM6 and DirectX12
  • GPU with 6GB of VRAM
  • GPU equivalent to GTX 1660 (1080p @ 20fps)

Minimum requirements for Lumen Hardware and Path Tracing:
  • Windows OS
  • Internet connection
  • 16GB RAM
  • GPU compatible with SM6 and DirectX12
  • GPU with 8GB of VRAM
  • GPU equivalent to RTX 2060

Recommended requirements for Lumen Hardware and Path Tracing:
  • Windows OS
  • Internet connection
  • 16GB RAM
  • GPU compatible with SM6 and DirectX12
  • GPU with 10GB of VRAM
  • GPU equivalent to RTX 3060

DOWNLOAD :
C&C are welcome! I can't wait to see your results!
Does the benchmark start with RT enabled in the main menu? I ask because my RX 6800 always crashes with RT enabled in games (don't ask me why; I've looked into lots of solutions), and whenever I start the BM I get a Fatal Error popup saying the game has crashed and will close. I've looked for an .ini file to see if I could toggle RT off on startup, but I was unable to find anything.
 
Joined
Sep 29, 2020
Messages
83 (0.05/day)
Looks like it also has some problems rendering humans.
Because of the two humans in the screenshot? :p

Thank you. I use it for the normal stuff like other folks, as a secondary duty: gaming, email, spreadsheets, surfing the web, etc.

Primary work is in CNNs, R-CNNs, algorithms, and AI training. It's my home PC.

Request:

Enable the multi-GPU feature set of the Unreal engine for heterogeneous GPUs, please.

So far, only Tellusim's GravityMark on the market can do this across all platforms. It would be nice to have mGPU for Unreal in a benchmark; there are also pro benchmarks for renderers, but this would show the capabilities of Unreal.
I saw your photos... beautiful!

I will look into it, but it seems it's only possible to enable multi-GPU support in Unreal for Path Tracing (offline) rendering. Anyway, I'm sure Epic will include it one day, and I will update this too.

9700x
7900 XT
2560x1080p

Interesting the way the benchmark allocates the monitor resolution, so 4k for me is 5120x2160 :D
And in the last scene with the body, the VRAM jumped to ~19951MB and started lagging, so I guess there's not enough VRAM, but the benchmark shows 18.29GB, which isn't true (I repeated it several times).
No problem with lower resolution 3414x1440 (??), works fine.
What is your max native screen resolution? The screenshots look fine, as your own uploads keep that aspect ratio and the images are not deformed. Could you check the dimensions of the automatic screenshots taken by the benchmark and saved in the Saved Results folder?

About the human character, I don't understand that VRAM consumption, sorry. I tested on my 1440p monitor at 2160p, and consumption was as expected, around 9GB. The readings, by the way, are taken directly from the DirectX API; that was the best way I could find to do it in Unreal using C++. But it's the VRAM used by the application (your readings may be for the whole PC). Would you have the chance to bench your GPU in a different PC, if the issue persists?

To be followed up, anyway. If anything changes, please tell me!

Monitoring during the run with Windows Task Manager and GPU-Z: WTM showed a max of around 9.4GB + 0.3GB shared. GPU-Z showed a max of 9.7GB (maybe counting the shared memory too). The initial VRAM footprint was around 0.8GB. That's 9.4GB of dedicated VRAM minus 0.8GB = 8.6GB approx (I was rounding the numbers). The benchmark itself registered a maximum of 8.7GB.
2024-9-14_22-54-21.png
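The arithmetic above, as a tiny helper (illustrative only; the function name is mine):

```python
def app_vram_gb(total_during_run_gb, baseline_before_launch_gb):
    # Whole-GPU tools (Task Manager, GPU-Z, Afterburner) count everything
    # resident in VRAM, including the desktop and other processes; subtracting
    # the footprint measured just before launch approximates the benchmark's
    # own usage, which is what the in-bench counter reports.
    return round(total_during_run_gb - baseline_before_launch_gb, 1)
```

With the numbers from this run, `app_vram_gb(9.4, 0.8)` gives 8.6, close to the 8.7GB the benchmark registered.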


Does the benchmark start with RT enabled in the main menu? I ask because my RX 6800 always crashes with RT enabled in games (don't ask me why; I've looked into lots of solutions), and whenever I start the BM I get a Fatal Error popup saying the game has crashed and will close. I've looked for an .ini file to see if I could toggle RT off on startup, but I was unable to find anything.
Hi Marcus!

Unfortunately, to be able to enable/disable RT on the fly, Unreal is configured by default to allow RT. So even if I disable RT in the first frame, your PC will "detect" it anyway. However, I will try to get creative and work that trick out for you, probably during October, if you can remind me then.


Thank you all!!
 
Joined
Jan 5, 2006
Messages
18,584 (2.69/day)
System Name AlderLake
Processor Intel i7 12700K P-Cores @ 5Ghz
Motherboard Gigabyte Z690 Aorus Master
Cooling Noctua NH-U12A 2 fans + Thermal Grizzly Kryonaut Extreme + 5 case fans
Memory 32GB DDR5 Corsair Dominator Platinum RGB 6000MT/s CL36
Video Card(s) MSI RTX 2070 Super Gaming X Trio
Storage Samsung 980 Pro 1TB + 970 Evo 500GB + 850 Pro 512GB + 860 Evo 1TB x2
Display(s) 23.8" Dell S2417DG 165Hz G-Sync 1440p
Case Be quiet! Silent Base 600 - Window
Audio Device(s) Panasonic SA-PMX94 / Realtek onboard + B&O speaker system / Harman Kardon Go + Play / Logitech G533
Power Supply Seasonic Focus Plus Gold 750W
Mouse Logitech MX Anywhere 2 Laser wireless
Keyboard RAPOO E9270P Black 5GHz wireless
Software Windows 11
Benchmark Scores Cinebench R23 (Single Core) 1936 @ stock Cinebench R23 (Multi Core) 23006 @ stock
Joined
Sep 20, 2021
Messages
441 (0.38/day)
Processor Ryzen 7 9700x
Motherboard Asrock B650E PG Riptide WiFi
Cooling Underfloor CPU cooling
Memory 2x32GB 6200MT/s
Video Card(s) 4080 SUPER Noctua OC Edition
Storage Kingston Fury Renegade 1TB, Seagate Exos 12TB
Display(s) MSI Optix MAG301RF 2560x1080@200Hz
Case Phanteks Enthoo Pro
Power Supply NZXT C850 850W Gold
Mouse Bloody W95 Max Naraka
What is your max native screen resolution? The screenshots look fine, as your own uploads keep that aspect ratio and the images are not deformed. Could you check the dimensions of the automatic screenshots taken by the benchmark and saved in the Saved Results folder?

About the human character, I don't understand that VRAM consumption, sorry. I tested on my 1440p monitor at 2160p, and consumption was as expected, around 9GB. The readings, by the way, are taken directly from the DirectX API; that was the best way I could find to do it in Unreal using C++. But it's the VRAM used by the application (your readings may be for the whole PC). Would you have the chance to bench your GPU in a different PC, if the issue persists?

To be followed up, anyway. If anything changes, please tell me!

Monitoring during the run with Windows Task Manager and GPU-Z: WTM showed a max of around 9.4GB + 0.3GB shared. GPU-Z showed a max of 9.7GB (maybe counting the shared memory too). The initial VRAM footprint was around 0.8GB. That's 9.4GB of dedicated VRAM minus 0.8GB = 8.6GB approx (I was rounding the numbers). The benchmark itself registered a maximum of 8.7GB.
View attachment 363445
My monitor is UltraWide 2560x1080p.
Here's the result you're asking for.
I can't test it on any other PC.

Here are the resolutions on my monitor (including the virtual ones); it's quite normal for the benchmark to choose 5120x2160 - that's my "2160" :D
I will try to add a custom one like your 3840x2160 and test.
1726348690104.png


So with a custom resolution of 3840x2160 everything is fine, but if you look, MSI Afterburner reports almost 1GB more VRAM usage than the benchmark (they're probably looking at different parameters); HWiNFO reports the same, and so does the Adrenalin driver. So I'm not sure what your benchmark shows.

And I'm adding the actual result for 4K.

I might suggest giving the user the choice from a list of resolutions, because as in this UW-monitor situation, he has to add a new one and "lie" to the benchmark through the monitor about which one he wants to use.
 

Attachments

  • 2024-9-14_21-3-16.png (446.8 KB)
  • GameTech_2160p_VRAM.jpg (1.9 MB)
  • GameTech_4k_RT.png (9.7 MB)
Joined
Sep 29, 2020
Messages
83 (0.05/day)
Thank you @AVATARAT !

Oh hell, I'm facing again a math issue I thought I had already solved, but the calculations were wrong. My goal is to achieve screen equity, independently of the screen's aspect ratio, so a 16:9 screen can be compared directly to a 21:9 screen, getting the same score if the bench settings and PC specs are the same. The easy way is to just force a resolution onto the screen, but that would deform the image (adapting 1920x1080 to 2560x1080, for example), and I want to avoid that; that's the goal too. I will try to update it soon for those "special" resolutions. (Suggestions are always welcome, of course! How do other benchmarks handle this? I definitely need to buy an ultrawide just for testing.)

But I also can't add every resolution to the list, because I want as few as possible, to keep it quite "normalized". Tons of options would make results incomparable with each other.

About the VRAM: those tools are showing your whole PC's consumption, if I'm not wrong. Have you checked your consumed VRAM with everything closed, immediately before executing the game? (That amount should be subtracted from the total VRAM shown in those tools during the benchmark.)

Thanks again!
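One way to reach that equity goal, in line with the black-bars design mentioned earlier in the thread, is to render a fixed-aspect area and fill the remainder of the screen with bars; this is just a sketch of the idea, not the benchmark's code:

```python
def render_rect(screen_w, screen_h, content_aspect=16 / 9):
    # Fit a fixed-aspect render area inside the screen. The unused border is
    # filled with black bars, so the rendered pixel count (and therefore the
    # workload) is identical on any monitor.
    if screen_w / screen_h >= content_aspect:
        height = screen_h                          # wider screen: pillarbox
        width = round(height * content_aspect)
    else:
        width = screen_w                           # taller screen: letterbox
        height = round(width / content_aspect)
    return (width, height)
```

For example, a 2560x1080 ultrawide would render the same 1920x1080 area as a plain 16:9 1080p monitor, with black bars on the sides.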
 
Joined
Sep 20, 2021
Messages
441 (0.38/day)
Processor Ryzen 7 9700x
Motherboard Asrock B650E PG Riptide WiFi
Cooling Underfloor CPU cooling
Memory 2x32GB 6200MT/s
Video Card(s) 4080 SUPER Noctua OC Edition
Storage Kingston Fury Renegade 1TB, Seagate Exos 12TB
Display(s) MSI Optix MAG301RF 2560x1080@200Hz
Case Phanteks Enthoo Pro
Power Supply NZXT C850 850W Gold
Mouse Bloody W95 Max Naraka
Yes, you're right: something "eats" VRAM by default, or it's not completely freed after applications are closed.

1726354621065.png


You could lock the resolutions to 16:9 format only, or build the list from the connected monitor.

When I force it into 21:9 4K, this is how it looks.
And that's normal :)
 

Attachments

  • GameTech_4k_screen.jpg (252.7 KB)