
Battlefield V Tides of War GeForce RTX DirectX Raytracing

Interesting. I played through the exact same place and had pretty much the same FPS (2-3 fps lower) with an RTX 2080. 1440p, all maxed.

Funny. It seems like it's clearly still a work in progress; if you look at the TPU numbers for Medium vs Low I get that same impression, and sometimes the FPS gaps between quality settings are almost negligible too.
 
Sub-60 FPS is also a problem in every isometric-perspective game because it creates very noticeable screen tear (or high input lag, not a great thing to have in any ARPG, for example). So that also touches every MOBA right there; in fact, most competitive online gaming and even non-casual offline single-player gaming.

TPU should really be posting minimums as standard to be fair.

But yes, for casual gaming, that 2080 will do fine in RTRT. I wonder how many casual gamers spend $600+ on a GPU ;)
There are many games that can be enjoyed at 60 average FPS. Adventures like the Witcher series, Tomb Raider...

As for minimum fps as the standard, I can agree with you that it would put Nvidia in the worst possible light here. But nobody does that.
And there's a reason for that: you can't really tell whether one instance of the game dropping to 5 fps is better or worse than the game dropping 20 times to 20 fps.

You'd either need to see the fps graph over time (like HardOCP does), or 99th-percentile scores.
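For anyone wondering how a 99th-percentile (or "1% low") figure falls out of raw captures, here's a minimal Python sketch; the frametime samples are invented purely for illustration:

```python
# Sketch: derive average fps and a "1% low" figure from frametimes (ms).
# The sample data is made up: ~60 fps on average, plus one 200 ms spike.
frametimes_ms = [13.9] * 95 + [33.3] * 4 + [200.0]

avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))

# Sort worst-first; the slowest 1% of frames is what a plain average hides.
worst_first = sorted(frametimes_ms, reverse=True)
p99_frametime = worst_first[max(0, round(len(worst_first) * 0.01) - 1)]
one_percent_low_fps = 1000.0 / p99_frametime

print(f"average: {avg_fps:.1f} fps, 1% low: {one_percent_low_fps:.1f} fps")
# -> average: ~60.5 fps, 1% low: 5.0 fps. The single 5 fps drop barely
#    moves the average, which is exactly the problem with averages.
```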
 
Funny. It seems like it's clearly still a work in progress; if you look at the TPU numbers for Medium vs Low I get that same impression, and sometimes the FPS gaps between quality settings are almost negligible too.
I agree with the work-in-progress part. There is an upside to this, though - it does look like there are ways to squeeze more performance out of it. :)
 
You know this, but in the context of saying 'the averages are comfortably above 60' when the guy before you spoke of 1440p/90fps performance, I think we're losing sight of the real use case on the high end: at 1440p, even the 2080 Ti will occasionally drop a frame or two below 60 FPS, yet it really is the only card that offers 'playable' performance.

When you see 1440p/60 averages such as with the 2080, that is certainly not a '1440p60 RT card'. It can get by, it's 'capable', but far from desirable. What you see in videos is that the minimums there drop to 45-49 more often than not. For immersive, slow single-player gaming, sure. For everything else? Meh.


1:10 onwards (and single player, it seems, which should not be ignored)
That's 4K UHD... not 2560x1440.

SP is the only reliable way to test in the first place. Hard to forget about that when it's the best (most reliable) way to test. There isn't an integrated benchmark.
 
That's 4K UHD... not 2560x1440.

SP is the only reliable way to test in the first place. Hard to forget about that when it's the best (most reliable) way to test. There isn't an integrated benchmark.

No, they go through 1080p, 1440p, and 4K. And regardless of SP being the most reliable way to test, it is also the most favorable environment for good FPS, not an indicator of the worst-case scenario, which happens in almost everything multiplayer. Besides, it's primarily an MP game :D
 
I see now. Yeah, dips to 57 fps... whoa. Unplayable. How could they? :)

Thanks for the video.:)

Edit: MP testing/gaming experience varies by user, settings, map, and number of people. It's hard to pinpoint things in MP.
 
No, they go through 1080p, 1440p, and 4K. And regardless of SP being the most reliable way to test, it is also the most favorable environment for good FPS, not an indicator of the worst-case scenario, which happens in almost everything multiplayer. Besides, it's primarily an MP game :D
I disagree with that. Battlefields are tested in multiplayer for CPU performance or CPU limitations, not GPU performance. Single-player maps still seem to have more effects enabled, which puts pretty much the same pressure on the GPU as multiplayer maps do.

On the benchmarking side of things, multiplayer adds a whole lot of variables that you are not able to control. FPS can and will be limited by the CPU at times, players and action differ, etc. And even if you get an empty-ish server and try to recreate the same scenario, it is consistent but no longer representative of real gameplay.

Basically, for GPU-centric testing like these DXR effects, testing reproducible, fixed scenarios in single player is definitely the way to go.
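To illustrate the reproducibility point, here's a rough Python sketch that compares average fps across repeated captures of the same fixed pass. It assumes PresentMon-style CSVs with an MsBetweenPresents column, and the run1.csv..run3.csv file names are hypothetical; adjust both for whatever capture tool you actually use.

```python
# Sketch: gauge run-to-run consistency of a fixed single-player benchmark
# pass. Assumes PresentMon-style CSV captures with an "MsBetweenPresents"
# column; the file names are placeholders.
import csv
import statistics

def average_fps(path: str) -> float:
    with open(path, newline="") as f:
        frametimes = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]
    return 1000.0 / statistics.mean(frametimes)

runs = [average_fps(f"run{i}.csv") for i in range(1, 4)]  # three identical passes
spread = (max(runs) - min(runs)) / statistics.mean(runs) * 100
print(f"average fps per run: {[round(r, 1) for r in runs]}, spread: {spread:.1f}%")

# A spread of a couple percent is what makes fixed SP passes usable for
# GPU comparisons; MP runs can't deliver that consistency.
```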
 
Was going to say... we know this already. Have you seen minimums tested for this yet, out of curiosity? I haven't.

Competitive gamers won't be using this tech now... no way.

Well, Digital Foundry talks about frametimes with the newest patch. FPS drops were to the low 50s in the worst-case scenery on the MP Rotterdam map. Interestingly enough, the CPU bottlenecks fps more than the GPU with DXR on. One will get more fps with a locked Intel Coffee Lake CPU than with first-gen Ryzen (13:38 on the same vid).

Edit: Oh, the timestamp did not work on the embedded vid.
 
I disagree with that. Battlefields are tested in multiplayer for CPU performance or CPU limitations, not GPU performance. Single-player maps still seem to have more effects enabled, which puts pretty much the same pressure on the GPU as multiplayer maps do.

On the benchmarking side of things, multiplayer adds a whole lot of variables that you are not able to control. FPS can and will be limited by the CPU at times, players and action differ, etc. And even if you get an empty-ish server and try to recreate the same scenario, it is consistent but no longer representative of real gameplay.

Basically, for GPU-centric testing like these DXR effects, testing reproducible, fixed scenarios in single player is definitely the way to go.

Don't get me wrong here. I'm not contesting that SP is the way to test, but it's good to keep in mind that this is primarily an MP game AND the only game with functional DXR at this point. It would be unwise to base ourselves on SP performance numbers to determine how playable this game really is with RTX On. Another point is the amount of work required to get to this performance. If it's time-consuming, it will be that much harder to implement elsewhere without crippling performance.
 
I don't know about the others, but I'm having a blast reading this thread. It's like a bunch of people looking at a Model T and pointing out it's not a Ford GT yet. *popcorn*
Admittedly, a very expensive Model T.

Don't get me wrong here. I'm not contesting that SP is the way to test, but it's good to keep in mind that this is primarily an MP game AND the only game with functional DXR at this point. It would be unwise to base ourselves on SP performance numbers to determine how playable this game really is with RTX On. Another point is the amount of work required to get to this performance. If it's time-consuming, it will be that much harder to implement elsewhere without crippling performance.
Well, last I checked, people playing MP seriously were lowering the details as much as possible, looking for 100+ fps. If they still do that, this whole conversation is largely moot.
 
I don't know about the others, but I'm having a blast reading this thread. It's like a bunch of people looking at a Model T and pointing out it's not a Ford GT yet. *popcorn*
Admittedly, a very expensive Model T.


Well, last I checked, people playing MP seriously were lowering the details as much as possible, looking for 100+ fps. If they still do that, this whole conversation is largely moot.

Isn't that what enthusiasts do, regardless of the hobby? :p As for MP, there are also people who turn down their resolution a notch to get there, and hey, look, 1080p/100fps is actually coming within reach. For the adoption of a new tech it'd be pretty neat to actually use it in the primary game mode of the game, no? We can also just fill every topic with five YouTube links and call it a day...
 
I don't know about the others, but I'm having a blast reading this thread. It's like a bunch of people looking at a Model T and pointing out it's not a Ford GT yet. *popcorn*
Admittedly, a very expensive Model T.


Well, last I checked, people playing MP seriously were lowering the details as much as possible, looking for 100+ fps. If they still do that, this whole conversation is largely moot.

You actually bring up an interesting point. Can you get RTX on to run over 100 FPS by lowering other image quality settings a lot?
 
What about power consumption when RTX is on and when it's off? Any diff?
 
What about power consumption when RTX is on and when it's off? Any diff?
I cannot see any difference with RTX on or off. Clocks, temps, power, CPU usage - all are within the same range.
 
What about power consumption when RTX is on and when it's off? Any diff?
The cards are limited by board power in both scenarios, so power draw should be the same.
 
Yes, they are. But if something were noticeably different, it should show somewhere. With power and temp limits, it should show in clocks, and it does not.
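If anyone wants to verify this at home, a quick way is to log power and clocks once per second with nvidia-smi while toggling DXR in-game. A minimal sketch; the query fields are standard nvidia-smi options, and Python here is just a thin wrapper:

```python
# Sketch: sample power draw and SM clock every second via nvidia-smi.
# Run this in a second terminal while flipping RTX on/off in the game.
import subprocess

subprocess.run([
    "nvidia-smi",
    "--query-gpu=timestamp,power.draw,clocks.sm,temperature.gpu",
    "--format=csv",
    "-l", "1",  # repeat every second until Ctrl+C
])
# If the card is pinned at its board power limit either way, power.draw
# stays flat and any real difference has to show up in clocks.sm instead.
```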
 
I wonder where the bottleneck is in RT performance scaling. The RTX 2080 gains 23% from dropping RT quality at 1440p, while the 2070 gains 15% and the 2080 Ti only gains 9%. It's certainly on the 2080 Ti's side, because at 1440p with RTX Medium it only manages 77 fps while the RTX 2080 can already deliver 74 (a roughly 4% difference). RTX Ultra shows the gap rise to 23%. Weird.
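For reference, those percentages are just relative fps deltas. A quick sanity check on the two 1440p RTX Medium numbers (a sketch; the fps values are the figures quoted above):

```python
# Sketch: relative gap between two cards, gain = (a / b - 1) * 100.
fps_2080ti = 77  # 1440p, RTX Medium (figure from the post)
fps_2080 = 74    # same settings

gap_pct = (fps_2080ti / fps_2080 - 1) * 100
print(f"2080 Ti leads the 2080 by {gap_pct:.1f}% at RTX Medium")  # ~4.1%
```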
 