
GPU Test System Update May 2019

Personal opinion: I don't much care about my FPS in Civilization games in general. As long as I can play it, FPS matters very little.

So I'd vote to remove Civ 6.
 
What about framerate consistency and latency, the most important aspect of a gaming card making a game "feel smooth"?
I think it's long overdue.

Yep. GN's frametime plots are killer. Bust out FCAT, @W1zz.
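For context, the consistency metrics those plots are built on are just simple statistics over a per-frame log. Here's a minimal sketch in C++ (the sample data and percentile convention are made up for illustration; this is not GN's or TPU's actual tooling):

```cpp
// Minimal sketch: frametime consistency metrics from a per-frame log (ms).
// Sample data is invented; a real capture would come from FCAT/PresentMon-style tooling.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <numeric>
#include <vector>

int main() {
    std::vector<double> frame_ms = {16.6, 16.8, 17.1, 16.5, 33.4,
                                    16.7, 16.6, 16.9, 17.0, 16.8};

    // Average FPS: total frames over total time.
    double avg_ms = std::accumulate(frame_ms.begin(), frame_ms.end(), 0.0)
                    / frame_ms.size();

    // 99th-percentile frametime: the slowest ~1% of frames, which is what
    // actually determines whether a game "feels smooth".
    std::vector<double> sorted = frame_ms;
    std::sort(sorted.begin(), sorted.end());
    std::size_t idx =
        static_cast<std::size_t>(std::ceil(0.99 * sorted.size())) - 1;
    double p99_ms = sorted[idx];

    std::printf("average: %.1f FPS\n", 1000.0 / avg_ms);
    std::printf("99th percentile: %.1f ms -> %.1f FPS (1%% low)\n",
                p99_ms, 1000.0 / p99_ms);
    return 0;
}
```

The point: the single 33 ms hitch barely moves average FPS, but it dominates the 99th-percentile figure, which is why frametime plots catch stutter that plain FPS averages hide.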

Witcher 3 is on CDPR's own REDEngine 3. Cyberpunk 2077 will be on REDEngine 4 and will likely take Witcher 3's place in benchmarking.
Deus Ex: Mankind Divided is on the Dawn Engine, which is a modified Glacier Engine II, the same one that powers the Hitman games, currently Hitman 2.

While we are at it, why not:
Ace Combat 7 - Unreal Engine 4
Anno 1800 - Anno Engine (?)
Assassin's Creed Odyssey - AnvilNext 2.0
Battlefield V - Frostbite 4
Civilization VI - Firaxis Engine (?)
Darksiders 3 - Unreal Engine 4
Devil May Cry 5 - RE Engine
Divinity Original Sin 2 - Divinity Engine 2
F1 2018 - EGO Engine 3.0
Far Cry 5 - Dunia 2
Hitman 2 - Glacier Engine II
Metro Exodus - 4A Engine
Monster Hunter World - MT Framework 2.x
Rage 2 - Apex Engine
Rainbow Six Siege - AnvilNext 2.0
Sekiro - PhyreEngine (?)
Shadow of the Tomb Raider - Foundation Engine
Shadow of War - LithTech Firebird
Strange Brigade - Asura Engine
Witcher 3 - REDEngine 3
Wolfenstein 2 - idTech 6

Good adds. The more games the better; otherwise you can just pick your games and get the result you want...

Why don't you add World War Z? Swarm Engine supports DX11 and Vulkan.

Because:
[attached benchmark chart: WWZ.png]


Especially in DX12 / Vulkan titles, thanks to Turing having proper support for async compute and the like.

So TU no longer requires context switching for concurrent graphics & compute loads?

Moreover, Turing has a geometry performance advantage over Pascal as well.

Didn't know TU had >6 PolyMorph engines & substantially higher clocks than GP. We haven't seen mesh shader performance beyond demos, but it does seem a tad better than Vega's NGG/primitive shader implementation, at least. ;)
 
Just in time to redo with Zen 2
 
So TU no longer requires context switching for concurrent graphics & compute loads?
Correct, Turing can finally run float and integer operations concurrently.
 
Why don't you add World War Z? Swarm Engine supports DX11 and Vulkan.
This is the trick with selecting games.

Is there a good reason to add World War Z to the list of games being tested? It is a single game that leans very heavily towards AMD GPUs, which makes it an outlier. The Swarm Engine seems to be a new version of the Saber3D engine that is not used in any other significant games. The only one that comes to mind is Quake Champions, where parts of it are used in conjunction with idTech 6.

TPU's list of tested games has always had outliers, but there have been valid reasons for including them.
- Despite leaning towards Nvidia cards, Anno is a fairly unique game and engine, and a long-running series.
- Civilization, again, is a unique game and engine, and a long-running series with a definite large fanbase.
- GTA V is a significant game in its own right.
- Hitman: Absolution, Deus Ex: Mankind Divided, then Hitman and Hitman 2 are basically on the same engine, which also went for DX12 early on.
- Sniper Elite 4 had (and still has) one of the best DX12 implementations, one that benefits all GPUs across the board over DX11. Strange Brigade is on an improved version of the same engine.
- Unreal Engine is a touchy subject here, as it tends to favor Nvidia cards by default, more so when the developer has not worked on optimizing the game. This can be clearly seen in the now-removed Dragon Quest XI and in Darksiders, which is still on the list. Personally, I would have kept Hellblade in its place, as it is (slightly) better optimized as well as a more remarkable game. However, Unreal Engine is important to have in the lineup, and preferably with a couple of games, because it is a widespread and popular engine used by a lot of games.

On the other hand, as someone pointed out earlier, there are some popular engines missing from the list, mostly because their games have not been significant enough recently. CryEngine, for example: both Prey and Kingdom Come: Deliverance are on CryEngine 4, but neither has been significant enough to keep on the list. Unity has games, but no big AAA releases or anything that would resemble a graphical powerhouse. I have my favourite Unity games, as do many people, but there is not one I can think of that is a significant enough game and would provide useful performance results at the same time.
 
However, Unreal Engine is important to have in the lineup, and preferably with a couple of games, because it is a widespread and popular engine used by a lot of games.
That, and it's the same reason we don't have any CryEngine titles in our bench anymore; it seems nobody uses it for their games.
 
Looking at the recent CryEngine games, the only one that would make sense for benchmarking is Hunt: Showdown. This should be on CryEngine V but is still technically in early access.

Edit:
By making sense I mean that the game is popular, although a bit niche; it looks good and needs a reasonably beefy PC to keep FPS high. No idea whether Hunt prefers one side or the other in terms of GPUs, but if things are done right, CryEngine should be pretty neutral.
 
@W1zzard why did you drop Wildlands?
Was just getting old, same engine as AC:O
Before someone points out that Rainbow Six Siege is also on the same engine and is older: all this is true, but R6 Siege keeps being updated, is popular, and is one of very few competitive FPS games that can use as much hardware as you can throw at it. This is also one of the few games where I would seriously argue that FPS differences in the 100-200 range, and perhaps above, do matter.
 
but R6 Siege keeps being updated, is popular, and is one of very few competitive FPS games that can use as much hardware as you can throw at it. This is also one of the few games where I would seriously argue that FPS differences in the 100-200 range, and perhaps above, do matter.
Exactly why it's included. I was also thinking about adding DOTA 2 for this rebench, but decided against it; it would make the bench selection too eSports-heavy, I think.
 
I actually have a Vega 64 (1650 MHz @ 1.05 V) and it's a pretty fine graphics card. I don't quite understand why most manufacturers overvolt it like hell, since it can run at very low voltages with very nice efficiency.

The only logical explanation would be that not all chips are stable at the lower voltage.
Manufacturers don't overvolt it like hell; they simply follow AMD's specs, at least for reference designs.
 
Looking at the recent CryEngine games, the only one that would make sense for benchmarking is Hunt: Showdown. This should be on CryEngine V but is still technically in early access.

Edit:
By making sense I mean that the game is popular, although a bit niche; it looks good and needs a reasonably beefy PC to keep FPS high. No idea whether Hunt prefers one side or the other in terms of GPUs, but if things are done right, CryEngine should be pretty neutral.

Plus, it's a game made by Crytek themselves.
 
So, you think the ability to pack both int & fp is "Async-Compute"?
This explains the details much better than I can.
https://www.techpowerup.com/reviews/NVIDIA/GeForce_RTX_2080_Ti_Founders_Edition/2.html
What has changed, however, is that the Streaming Multiprocessor (SM), the indivisible sub-unit of the GPU, now packs CUDA cores, RT cores, and Tensor cores, orchestrated by a new Warp Scheduler that supports concurrent INT and FP32 ops, which should improve the GPU's asynchronous compute performance.


https://techgage.com/article/tracing-the-groundwork-of-nvidias-turing-architecture/
Asynchronous Compute
In what will be a big boon to a number of games, and less so to others, is a restructuring in how Turing handles asynchronous compute. NVIDIA has historically had issues with mixed workloads, long before Pascal, and it was a sore point even when it was introduced with Kepler and somewhat addressed with Maxwell.


When having to switch between graphics processing and compute, such as with physics calculations or various shaders, the GPU would have to switch modes, and this incurred quite a substantial performance loss due to the latency in the switch. Pascal made this switch almost seamless, but it was still there.

Turing, on the other hand, now finally allows concurrent execution of both integer and floating point math. Better still, they can now share the same cache. On paper, this results in a 50% boost in performance per CUDA core in mixed workloads. How this translates into real-world performance will be very different, but we expect to see quite a few games gain tangible improvements from this alone.

Games like Ashes of the Singularity will respond well to the changes, as well as games with mixed workloads, notably those that make use of DX12, but this is something we won’t see the full effect of until later. Generally though, a lot of games will see some kind of uptick in performance just from this change alone.
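To make the INT/FP point concrete, here is a rough sketch of the kind of mixed inner loop those articles describe (plain C++ standing in for shader code; the function and numbers are invented for illustration). The integer address arithmetic and the FP32 math are independent streams of work, so a separate INT32 pipe, as described for Turing, lets them issue side by side instead of competing for the same slots. If I recall correctly, NVIDIA's Turing material cites roughly 36 integer instructions per 100 FP instructions in typical gaming shaders.

```cpp
// Illustrative shader-style inner loop (plain C++ as a stand-in for GPU
// code; names and numbers are invented). Integer and FP32 work interleave:
// pre-Turing, both competed for the same issue slots on the SM.
#include <cstdio>
#include <vector>

float filter_row(const float* texels, int base, int stride, int taps) {
    float acc = 0.0f;
    for (int i = 0; i < taps; ++i) {
        int offset = base + i * stride;         // INT32 work: address arithmetic
        acc += texels[offset] * (1.0f / taps);  // FP32 work: the shading math
    }
    return acc;
}

int main() {
    std::vector<float> texels(64, 1.0f);  // dummy texture row
    std::printf("%.3f\n", filter_row(texels.data(), 0, 2, 8));
    return 0;
}
```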
 
@W1zzard are you still using the DX12 renderer on Strange Brigade? After all the updates, I'm kind of curious how graphics cards stack up with Vulkan these days on that title. E.g., BabelTech's NVIDIA driver update review shows Vulkan now running circles around DirectX (though I'm sure the DX11 label is a typo, since the game does not support it).
 
are you still using the DX12 renderer on Strange Brigade?
Yes. Given how similar both APIs are, I wouldn't expect any meaningful difference.
 
Yes. Given how similar both APIs are, I wouldn't expect any meaningful difference.
Performance in Strange Brigade, in terms of the Vulkan/DX12 difference, has swung a little to one side or the other over time (well, over patches and driver versions), but in the big picture, the current state of the game has DX12 performing slightly better than Vulkan across the board.
 
If Deus Ex Mankind Divided was dropped because it was "too old" even though it was DX12, I'd say the same for Civ VI.
Only because there is a newer game on basically the same engine.

Witcher 3 is an epic game, but it's too old now, released in 2015.
It still taxes a number of GPUs, especially at 2560x1440. Plus, as W1z said earlier, it's a very popular review page for people.
 
If Deus Ex Mankind Divided was dropped because it was "too old" even though it was DX12, I'd say the same for Civ VI.
Civ 6 got a big update just a few months ago
 
Counterpoint: I had a Vega 64. It had serious trouble reaching the spec 1546 MHz boost clock. When overclocked and undervolted, I still could not get 1600 MHz out of it no matter what I did. It was not the power limit, and higher voltages simply did not help. And that was at 300+ W power consumption. It was a blower card, but for testing I ran the fan at 100%, which did take care of temperatures. It was nice and efficient at low voltages, but it was never fast at the same time. I suppose there would be a way to get it a little higher today with what we have learned, but that knowledge took months, close to a year, to mature.

This seems to be why Vegas are overvolted: the variability is huge. For every nice one that can do 1700 MHz under water with extreme tuning, there is a dud like mine apparently was.

Yeah, I understand that. Every chip is different. Mine can run a lot higher than that; it would just need better cooling. Also, it's guaranteed to run at 1650 MHz, which is probably why I had no issues. It actually ran slightly faster out of the box, but I didn't like the temps, so I went onwards with tweaking.

It has a lot of limits: temperature, power consumption, how the VRM is handled. And it's not just about raising voltages, but about raising or lowering the right ones. Kinda tricky, I must say. Probably not even worth the time I spent on it, given I don't actually use it. :D
 
@W1zzard for Power Consumption (Average and Gaming), Metro Last Light isn't part of your current gaming list. Wouldn't using a game from the list, like Metro Exodus, be more representative? You could also use it to gauge DX11, DX12 & RTX power differences if warranted in the future.
 
@W1zzard for Power Consumption (Average and Gaming), Metro Last Light isn't part of your current gaming list. Wouldn't using a game from the list, like Metro Exodus, be more representative? You could also use it to gauge DX11, DX12 & RTX power differences if warranted in the future.
I second that. Metro Exodus is indeed a big power-draw game.
 
@W1zzard for Power Consumption (Average and Gaming), Metro Last Light isn't part of your current gaming list. Wouldn't using a game from the list, like Metro Exodus, be more representative? You could also use it to gauge DX11, DX12 & RTX power differences if warranted in the future.
The great thing about Metro Last Light is that it has varying power consumption, not just full max all the time, so it also acts as a test to see how well GPUs operate at slightly below full load. That's why we have avg/max.
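To illustrate the avg/max distinction, a quick sketch (C++, with invented wattage samples; not the actual logging setup) of why a varying-load game separates the two numbers:

```cpp
// Sketch: average vs. peak board power from sampled readings (watts).
// Sample data is invented; a real rig would log from a power meter.
#include <algorithm>
#include <cstdio>
#include <numeric>
#include <vector>

int main() {
    std::vector<double> watts = {180, 210, 245, 150, 230, 160, 250, 200};

    double avg = std::accumulate(watts.begin(), watts.end(), 0.0) / watts.size();
    double max = *std::max_element(watts.begin(), watts.end());

    // A game whose load varies scene to scene pulls the average
    // well below the peak, so both numbers carry information.
    std::printf("average: %.0f W, maximum: %.0f W\n", avg, max);
    return 0;
}
```

A game that pins the GPU at 100% all the time would report the two numbers nearly equal; the gap between them is what shows how a card behaves at slightly below full load.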
 
I'm a firm believer in quality over quantity.

I would 200%, without a doubt, prefer cutting the number of tests/games in half just to have a Zen 2 system on board.
 