
Battlefield V Tides of War GeForce RTX DirectX Raytracing

Yeah, so during that year and a half no one could dare to say AMD was on top, because "hey wait! Nvidia is still yet to release their card". Impeccable logic!
Of course you are allowed to say it, but we are talking history here, not present day.
For that matter, no one considered the 1080 Ti a winner till Vega 64 showed its lackluster performance; people kept saying wait for Vega, wait for Vega! So you see, deep down everyone reacts to a cycle of release and counter-release, not to who gets the card out first.
 
This is a stupid argument and doesn't belong in this thread. Why don't we settle it that both companies suck but for different reasons? Win/Win. End of argument. Move on.
 
So finally you can get 47-48 fps with Ray Tracing high-ultra with a $1200 card in one of the best optimized AAA games. And they achieved it by reducing Ray Tracing effects. Great! :D

Much to the chagrin of the NVIDIA haters, DXR has turned out not to be the dud/failure it was portrayed as. Real-time ray tracing has been a pipe dream up to this moment, yet for some reason people fail to appreciate the monumental work that NVIDIA has poured into the technology to make it possible. I'm not buying any of the available RTX cards, but that's because I don't live in an economically developed country and these cards are just too expensive for me. I will surely consider the RTX 3060 once it becomes available; meanwhile I'm quite content with the GTX 1060. While Europeans/Americans (I guess most of TPU visitors) enjoy an income of over $2500 a month, I earn literal pennies in comparison, so I'm not sure where this "NVIDIA is ripping everyone off" comes from.

@W1zzard

You say that image quality is roughly the same between the various presets, but you've surely missed the difference in gun rendering.

Check DXR quality first. And yes, ray tracing is a failure, as only one game supports it yet, and it's the most optimized of all AAA games.
 
Looks like working together they've delivered a large performance uplift for a very small IQ difference, I say well done. For 1st gen RTRT this is good news for everyone.

Personally I wouldn't buy an RTX card JUST for RTRT right now, but I'm glad someone is doing it.
 
Check DXR quality first. And yes, ray tracing is a failure, as only one game supports it yet, and it's the most optimized of all AAA games.
You mean a completely new DX12 feature that came out barely a couple months ago? :D
Battlefield 5 is not the most optimized. In fact, its DX12 is as problematic as ever in Battlefield games.
 
The sooner people understand that nVidia took GPUs designed specifically for datacenter workloads and only scrambled to try to figure out how to market them to gamers as an after-thought, the sooner you chumps who ponied up big $$ for their first generation junk can stop hoping for the promised land of nGreedia. Some intrepid soul at nVidia headquarters during a strategy meeting probably said: "I know! If we write a code wrapper, we can use the tensor cores to do primitive, slow real-time ray-tracing, but we can tell all the suckers who will buy anything we shovel that we 'painstakingly crafted the GPU to do real-time ray-tracing'! They'll actually believe that we designed Turing as a gaming GPU! They're so gullible, they'll actually THANK us for dumping our lower binned Turing chips that we couldn't sell to datacenters, onto them, and at eye-watering prices, too! We've actually managed to move the price tag of 'premium' gaming cards up from $300 just a few years ago, all the way up to $1,200 for an x80 Ti card, and now there are even saps who'll pay $2,500 for the ever-so-slightly faster Titan card line-up we created to dump the very tiny number of GPUs that aren't quite data-center quality, but are just that extra smidgeon less flawed than the x80 Ti chips!!" And guess what. It worked, as usual.

You know what the irony is: you're actually right, and Nvidia still seems to manage to make it viable after all.

Go figure. This performance update shows that it has potential and that it's not in the land of the impossible for the next 5-10 years. It just needs one or two generational performance bumps to hit the midrange in full force and with decent FPS. That is viable, no matter how you twist it.

I've been a DXR / RTRT naysayer since day one but even I have to admit, this is starting to look promising. Even with reduced quality, there is definitely potential.

That said, a few cooked BF V maps aren't impressive to optimize for; it's a very limited and controlled setting. This DXR still requires specific implementation and dev time, and it still remains to be seen if it will gain traction. It's still a cost/benefit thing with a... well, not very profitable outlook considering the % of players capable of using it.

Yeah, so during that year and a half no one could dare to say AMD was on top, because "hey wait! Nvidia is still yet to release their card". Impeccable logic!

Yeah, except the Fury X was also surpassed by the 980 Ti.
 
I've been a DXR / RTRT naysayer since day one but even I have to admit, this is starting to look promising. Even with reduced quality, there is definitely potential.

That said, a few cooked BF V maps aren't impressive to optimize for; it's a very limited and controlled setting. This DXR still requires specific implementation and dev time, and it still remains to be seen if it will gain traction. It's still a cost/benefit thing with a... well, not very profitable outlook considering the % of players capable of using it.

The road to RTRT is going to be long and rocky. Still, the end game is promising enough to make it worth the effort.
For years to come, a problem will be that RT hardware will not be everywhere, so devs won't be able to develop titles with only RT in mind. In a way, this is just like cars: EVs are great, but hybrids not so much.
From what I understand, the process of getting an image through RT is simpler than the rasterization way (e.g. you don't need a pass for AO, a pass for reflections, and such). Better-looking images are just icing on the cake ;)

What can I say, games have been looking pretty much the same since DX10. That's going to change.
 
From what I understand, the process of getting an image through RT is simpler than the rasterization way (e.g. you don't need a pass for AO, a pass for reflections, and such).
Hybrid RTRT is going to be used for the foreseeable future. Full RTRT remains too taxing.
Currently, DXR can be used to replace or augment parts of rasterization - reflections, AO, and shadows are the usual suspects - and this should give a more "correct" result and make things easier for the art department, and possibly for the technical side as well. Each of these is generally done in multiple passes even now.
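To illustrate the idea of hybrid rendering, here's a toy sketch of a frame built as a list of passes, where DXR variants substitute for their screen-space counterparts. All pass names here are illustrative inventions, not from DICE's (or any real) engine:

```python
# Toy sketch of a hybrid RTRT pipeline: geometry is still rasterized,
# while DXR passes can replace individual screen-space techniques.
# Pass names are made up for illustration only.

def build_frame_passes(dxr_reflections=False, dxr_ao=False, dxr_shadows=False):
    passes = ["gbuffer"]  # rasterized geometry pass stays in hybrid RTRT
    passes.append("rt_shadows" if dxr_shadows else "shadow_maps")
    passes.append("rt_ao" if dxr_ao else "ssao")
    passes.append("rt_reflections" if dxr_reflections else "ssr")
    passes += ["lighting", "post_process"]
    return passes

print(build_frame_passes())                      # pure raster path
print(build_frame_passes(dxr_reflections=True))  # BF V-style: only reflections traced
```

This is why BF V could ship with ray-traced reflections alone: each effect is an independent slot in the pipeline, so a DXR pass can be swapped in without restructuring the whole renderer.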

That said, a few cooked BF V maps aren't impressive to optimize for; it's a very limited and controlled setting. This DXR still requires specific implementation and dev time, and it still remains to be seen if it will gain traction. It's still a cost/benefit thing with a... well, not very profitable outlook considering the % of players capable of using it.
This is not where the problems for RTX, and hybrid RTRT in general, lie. While DXR is standard and Vulkan extensions are alive and kicking, the requirements today include DX12 or Vulkan. There really are not many games or engines that have a good enough DX12 or Vulkan implementation along with the technical expertise or willingness to implement something on the cutting edge.

DICE's DX12 implementation is still as shaky as it has always been. There are minor upsides to it but major downsides, stuttering and lower fps than DX11 across the board being the main ones.
I am hoping whoever is working on adding DXR effects (shadows) to Shadow of the Tomb Raider does a good job, as that game is the best DX12 implementation in a game engine we have so far.
Sniper Elite 4 is the second game that has an excellent DX12 implementation.
Hitman - which eventually had a good DX12 implementation - has given up on the DX12 renderer in Hitman 2.
That's it for the big hitters. The current crop of games listed for DXR/RTX support are smaller games and in-house engines.
On the Vulkan side, things are grim. Shadow of the Tomb Raider might do something with hybrid RTRT. id Software's track record also suggests they may try something.

BF5 is neither limited nor controlled. DXR would be much more effective in a more limited and controlled setting. Some atmospheric single-player game, probably something slow like an adventure, is likely to be the DXR killer app for now. There are a few announced; we will see.
 
check the performance numbers again.
Check them yourself, genius; it's running comfortably above 60 fps with Ultra DXR.

rtx-2080-ti-1440.png
 
80 is close... while it isn't the 90 he said, all are still notably above 60 FPS.
 
So, if that's how we want to measure things. I guess 49 is the new 60?

Actually that is a great idea! Now you can consider the 2080 to be 1440@60 as well.
 
So, if that's how we want to measure things. I guess 49 is the new 60?

Actually that is a great idea! Now you can consider the 2080 to be 1440@60 as well.
Any chance you are looking at the "old patch" numbers?

The green bars are the data for the new update
 
So, if that's how we want to measure things. I guess 49 is the new 60?

Actually that is a great idea! Now you can consider the 2080 to be 1440@60 as well.
Low is closer to 90 than to off, which is 40 FPS higher. My point was simply that it was playable at 80 or 90 FPS (anything over 60 FPS).

That said, yes, a 2080 is a 1440p@60 RT card... from off to high/ultra! See the GREEN bars o_O. That is, unless you don't call 59.9 'near' 60 FPS?

1440.jpg
 
LOL, apparently I can't read. That's what Friday morning with a BS schedule does. My apologies EarthDog and thanks W1zzard.
 
Low is closer to 90 than to off, which is 40 FPS higher. My point was simply that it was playable at 80 or 90 FPS (anything over 60 FPS).

That said, yes, a 2080 is a 1440p@60 RT card... from off to high/ultra! See the GREEN bars o_O. That is, unless you don't call 59.9 'near' 60 FPS?

View attachment 112111

With the minor detail that these are still average framerates, not minimums. A 60 average is not something I'd settle for, especially not in a shooter. A 90-100 fps average, on the other hand, would result in 65-75 FPS minimums, which is nice for a smooth experience. So it's really only the 2080 Ti that offers playable RT performance at 1440p; let's not fool ourselves here.
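To make the averages-vs-minimums point concrete, here's a small sketch (with invented frame times, not real BF V data) showing how an average in the mid-60s can hide dips far below 60:

```python
# Average FPS vs. 1% low, computed from per-frame times in milliseconds.
# The frame-time data below is made up purely for illustration.

def fps_stats(frame_times_ms):
    # Average FPS: total frames over total time
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    # 1% low: average FPS over the slowest 1% of frames
    worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
    one_pct_low_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, one_pct_low_fps

# 90 frames at ~71 fps, 10 slow frames at ~33 fps
times = [14.0] * 90 + [30.0] * 10
avg, low = fps_stats(times)
print(f"avg {avg:.1f} fps, 1% low {low:.1f} fps")  # avg 64.1 fps, 1% low 33.3 fps
```

A "64 fps average" card here stutters down to 33 fps on its worst frames, which is exactly why average bars alone can make a borderline DXR configuration look playable.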
 
The last BFV patch seems to have considerably evened out the FPS variability with DXR enabled.
The game itself has highly variable FPS depending on map and situation, but before this patch, enabling DXR used to occasionally drop FPS a lot; now the variability is quite small.

I have not run benchmarks, but I do have an FPS-over-time graph running on the second monitor.
In a couple of the maps I tried with an RTX 2080 at 1440p, FPS does sometimes drop below 60, with occasional dips even under 50, but it runs between 60 and 70 the majority of the time.

I agree on the part that BFV continues to be a bad showcase for DXR. Not for any technical reasons, but simply because a fast(ish)-paced shooter is a pretty bad place to introduce a visual effect with a major FPS impact :)
 
With the minor detail that these are still average framerates, not minimums. A 60 average is not something I'd settle for, especially not in a shooter. A 90-100 fps average, on the other hand, would result in 65-75 FPS minimums, which is nice for a smooth experience. So it's really only the 2080 Ti that offers playable RT performance at 1440p; let's not fool ourselves here.
And by that, you mean a plain 2080 will do just fine for everything but fast paced fps (or whatever other genres may be that fast paced) ;)
But yes, that's the ugly truth about average fps.
 
And by that, you mean a plain 2080 will do just fine for everything but fast paced fps (or whatever other genres may be that fast paced) ;)
But yes, that's the ugly truth about average fps.

Sub-60 FPS is also a problem in every isometric-perspective game because it creates very noticeable screen tearing (or high input lag, not a great thing to have in any ARPG, for example). That also touches every MOBA right there; in fact, most competitive online gaming and even non-casual offline single-player gaming.

TPU should really be posting minimums as standard to be fair.

But yes, for casual gaming, that 2080 will do fine in RTRT. I wonder how many casual gamers spend 600+ on a GPU ;)
 
Was going to say... we know this already. Have you seen minimums tested for this yet, out of curiosity? I haven't.

Competitive gamers won't be using this tech now... no way.
 
Was going to say... we know this already. Have you seen minimums tested for this yet, out of curiosity? I haven't.

Competitive gamers won't be using this tech now... no way.

You know this, but in the context of saying 'the averages are comfortably above 60' when the guy before you spoke of 1440p/90fps performance, I think we're losing sight of the real use case on the high end: at 1440p, even the 2080 Ti will occasionally drop a frame or two below 60 FPS, but it really is the only card that offers 'playable' performance.

When you see 1440p/60 averages, such as with the 2080, that is certainly not a '1440p60 RT card'. It can get by, it's 'capable', but far from desirable. What you see in videos is that the minimums there drop to 45-49 more often than not. For immersive, slow single-player gaming, sure. For everything else? Meh.


1:10 onwards (and single player, it seems, which should not be ignored)
 
1:10 onwards (and single player, it seems, which should not be ignored)
Interesting. I played through the exact same place and had pretty much the same FPS (2-3 fps lower) with an RTX 2080, 1440p all maxed.

I would suspect some limitation other than just the GPU there. From what I've experienced so far, DXR does not do worse in multiplayer compared to the single-player bits. Single-player maps/scenarios seem to have "cinematic" parts in them, and multiplayer simply has more action, so it all kind of evens out.
 