I've been playing with the integrated Performance Budget Tool, found in the graphics menu. On first launch, the game benchmarks your graphics card and processor, presents the results as your total GPU and CPU "budgets", and (supposedly) sets the optimal detail level. Every graphics setting is assigned two values reflecting its impact on GPU and CPU performance. Here's a detailed breakdown of the costs from their minimum to maximum values (a speculative sketch of how such an allocator might work follows the table):
| Setting | Minimum | GPU cost | CPU cost | Maximum | GPU cost | CPU cost |
| --- | --- | --- | --- | --- | --- | --- |
| Texture quality | low | 30 | 10 | ultra | 50 | 10 |
| Visual effects quality | low | 10 | 10 | ultra | 30 | 10 |
| Shadow quality | low | 30 | 10 | ultra | 50 | 10 |
| Post processing quality | low | 20 | 10 | ultra | 40 | 30 |
| Volumetric fog resolution | low | 20 | 10 | ultra | 60 | 30 |
| Global illumination quality | low | 30 | 20 | ultra | 90 | 20 |
| Reflection quality | low | 40 | 0 | high | 60 | 0 |
| Anisotropic filtering | off | 0 | 0 | 16x | 100 | 100 |
| Ambient occlusion quality | low | 10 | 10 | ultra | 50 | 30 |
| Atmosphere quality | low | 10 | 10 | ultra | 50 | 10 |
| Cinematics depth of field quality | low | 20 | 10 | ultra | 60 | 30 |
| Foliage quality | low | 10 | 10 | ultra | 10 | 10 |
| Mesh quality | low | 30 | 10 | ultra | 90 | 10 |
| Cinematics motion blur quality | low | 30 | 10 | ultra | 50 | 10 |
| Particle quality | low | 40 | 10 | ultra | 60 | 10 |
| Shadow mesh quality | low | 10 | 20 | ultra | 70 | 20 |
| Shadow resolution quality | low | 20 | 20 | ultra | 20 | 40 |
| Subsurface scattering quality | low | 20 | 10 | ultra | 80 | 10 |
| **Total cost** | | **380** | **190** | | **1020** | **390** |
There are also two toggles -- Light shafts and Local exposure -- without assigned numerical costs.
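Out of curiosity, here's a minimal sketch of how an allocator like this could work, assuming a greedy strategy. The costs are the in-game values from the table above, but the game's real logic is undocumented, so the upgrade heuristic and every name below are my own invention:

```python
# Speculative sketch of a budget-based auto-configurator. The costs are the
# in-game values from the table above; the greedy strategy and all names
# here are my own assumptions, not the game's actual logic.

# name: (gpu_min, cpu_min, gpu_max, cpu_max)
SETTINGS = {
    "Texture quality": (30, 10, 50, 10),
    "Visual effects quality": (10, 10, 30, 10),
    "Shadow quality": (30, 10, 50, 10),
    "Post processing quality": (20, 10, 40, 30),
    "Volumetric fog resolution": (20, 10, 60, 30),
    "Global illumination quality": (30, 20, 90, 20),
    "Reflection quality": (40, 0, 60, 0),
    "Anisotropic filtering": (0, 0, 100, 100),
    "Ambient occlusion quality": (10, 10, 50, 30),
    "Atmosphere quality": (10, 10, 50, 10),
    "Cinematics depth of field quality": (20, 10, 60, 30),
    "Foliage quality": (10, 10, 10, 10),
    "Mesh quality": (30, 10, 90, 10),
    "Cinematics motion blur quality": (30, 10, 50, 10),
    "Particle quality": (40, 10, 60, 10),
    "Shadow mesh quality": (10, 20, 70, 20),
    "Shadow resolution quality": (20, 20, 20, 40),
    "Subsurface scattering quality": (20, 10, 80, 10),
}

def auto_configure(gpu_budget: int, cpu_budget: int) -> dict[str, str]:
    """Start every setting at minimum, then upgrade the cheapest ones to
    maximum while both budgets allow it."""
    choice = dict.fromkeys(SETTINGS, "min")
    gpu_used = sum(c[0] for c in SETTINGS.values())  # 380 at all-minimum
    cpu_used = sum(c[1] for c in SETTINGS.values())  # 190 at all-minimum
    # Cheapest combined upgrade first -- a stand-in for "benefit per point".
    for name, (g0, c0, g1, c1) in sorted(
        SETTINGS.items(),
        key=lambda kv: (kv[1][2] - kv[1][0]) + (kv[1][3] - kv[1][1]),
    ):
        if gpu_used + g1 - g0 <= gpu_budget and cpu_used + c1 - c0 <= cpu_budget:
            choice[name] = "max"
            gpu_used += g1 - g0
            cpu_used += c1 - c0
    return choice

# Example: the budgets the tool reportedly assigns to a 3080 (700)
# paired with my 5800X3D (230).
for name, level in auto_configure(700, 230).items():
    print(f"{name}: {level}")
```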
While the whole idea sounds very practical, some of the values the developers chose for these costs are puzzling. Global illumination and mesh quality certainly burden the GPU, but why would anisotropic filtering, of all things, be the single most demanding setting? The tool also seems to misjudge the total CPU budget. According to the developers (the resolutions below assume upscaling in quality mode):
[Attachment 310398: developers' chart of recommended budgets per resolution]
My 5800X3D was evaluated at 230, whereas the 13900K gets 290 -- both of which seem way off. The Performance Budget Tool could eventually become really helpful, but it needs further polishing. The game would also benefit greatly from a textual description of each setting and its visual benefit/tradeoff; that would help less experienced gamers make more informed quality-versus-performance choices.
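For perspective, here's a quick back-of-the-envelope check against the CPU column of the cost table (my own arithmetic, not anything the tool reports):

```python
# CPU cost totals from the table above vs. the budgets the tool assigned.
CPU_COST_MIN, CPU_COST_MAX = 190, 390
for cpu, budget in [("5800X3D", 230), ("13900K", 290)]:
    headroom = budget - CPU_COST_MIN
    shortfall = CPU_COST_MAX - budget
    print(f"{cpu}: {headroom} points over the all-minimum cost, "
          f"{shortfall} points short of maxing every CPU-side setting")
```

Even the 13900K's budget leaves just 100 CPU points of headroom over the minimums -- exactly the CPU cost of 16x anisotropic filtering alone.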
And just for reference, these are the total GPU budgets the tool assigns to various cards:
| GPU | Total budget |
| --- | --- |
| 7900XTX | 1750 |
| 4090 | 1450 |
| 4080 | 1100 |
| 3080Ti | 800 |
| 4070 | 700 |
| 3080 | 700 |
| 3060Ti | 500 |
| 2060 | 300 |
| 1650 | 100 |
| Steam Deck | 60 |
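One more oddity falls straight out of that list when compared with the cost table's own minimum total (again, just my arithmetic, not something the tool reports): several of those budgets can't even cover every setting at its lowest value.

```python
# Compare each reported GPU budget against the 380-point cost of running
# every setting at its minimum (the "total cost" row above).
GPU_COST_MIN = 380
BUDGETS = {
    "7900XTX": 1750, "4090": 1450, "4080": 1100, "3080Ti": 800,
    "4070": 700, "3080": 700, "3060Ti": 500, "2060": 300,
    "1650": 100, "Steam Deck": 60,
}
for gpu, budget in BUDGETS.items():
    if budget < GPU_COST_MIN:
        print(f"{gpu}: budget {budget} is below the {GPU_COST_MIN}-point floor")
# Flags the 2060, 1650 and Steam Deck -- presumably the costs also scale
# with resolution, or those cards simply can't stay in budget at all.
```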