
Battlefield V with GeForce RTX DirectX Raytracing

@W1zzard I asked @btarunr this in the original news post thread, but it doesn't seem to have made it through the noise, so I'll try again:

Can you please do some power/clock speed monitoring while running RTX loads? Given that all RTX cards use 100% of their power targets doing rasterized rendering, I'm wondering just how much of this performance drop is due to power throttling of the CUDA cores to allow the RT cores to work, and, as a consequence, how much power the RT cores require for their current performance levels. This would be an invaluable insight, and far more interesting than pure performance numbers alone.

You wouldn't even have to do this across cards - just do a simple A/B comparison with RTX on and off on a single card at a single detail level.
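
In the meantime, for anyone who wants to eyeball this at home, here's a rough sketch of the kind of logging I mean (assuming nvidia-smi is on the PATH; the file names, polling interval, and duration are just placeholders, not anything TPU actually uses):

```python
# Poll nvidia-smi once per second and dump power draw, SM clock, and GPU
# utilization to a CSV, so an RTX-on run of a scene can be compared against
# an RTX-off run of the same scene afterwards.
import csv
import subprocess
import time

QUERY = "timestamp,power.draw,clocks.sm,utilization.gpu"

def log_gpu(outfile: str, seconds: int = 120) -> None:
    """Log GPU readings once per second for the given duration."""
    with open(outfile, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(QUERY.split(","))
        end = time.time() + seconds
        while time.time() < end:
            line = subprocess.run(
                ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
                capture_output=True, text=True, check=True,
            ).stdout.strip()
            writer.writerow([field.strip() for field in line.split(",")])
            time.sleep(1)

# Usage: run once with DXR on and once with DXR off on the same scene, e.g.
#   log_gpu("dxr_on.csv") and log_gpu("dxr_off.csv"),
# then compare average power.draw and clocks.sm between the two files.
```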
 
Just upgraded to 1809 and bought BFV specifically to test out this new feature (RTX). It's impressive, as it adds a lot to scene realism. I game at 1080p with an RTX 2080, so I was able to set RTX to Ultra. I didn't see any noticeable framerate decrease.

I find it interesting how some people agree and some disagree with this new tech. Each side has its arguments, but those who disagree in particular put a lot of frustration into describing their disagreement.

I can't say it's worth the money, since it's only some eye candy added on top, and as someone else mentioned, in a fast-paced shooter you wouldn't be able to tell the difference.

All in all, I disagree with the price and how it was introduced, but I'm also for the evolution of technology. Is it going in the right direction? Only time (and passionate engineers and scientists) will tell. Personally, I like it and I hope it will be used in more titles, where it could really make a difference.
 
Just upgraded to 1809 and bought BFV specifically to test out this new feature (RTX). It's impressive, as it adds a lot to scene realism. I game at 1080p with an RTX 2080, so I was able to set RTX to Ultra. I didn't see any noticeable framerate decrease.

Unless you are on a 30Hz panel with V-Sync on, I highly doubt that...
 
Unless you are on a 30Hz panel with V-Sync on, I highly doubt that...
Yeah, dropping from 138 to 58 fps ought to be noticeable no matter your display. Even on a 60Hz display you'd notice a clear drop in fluidity and smoothness. I should know; I play Rocket League on a 60Hz display, and locked to 120 fps it's far smoother than at 60 fps, even if half of those frames are discarded.
 
New tech is always great. Personally, I'm glad I held off this round, but I'll definitely enjoy it once they get the kinks out.

I get that it's a technological marvel and whatnot, but from a gaming point of view, and after watching a boatload of "on vs off" videos - it's just another eye candy element - and a rather subtle one at that.

If I owned one of these cards and I was playing this game at 1440p, I would leave it in the 'off' position.
 
@W1zzard I asked @btarunr this in the original news post thread, but it doesn't seem to have made it through the noise, so I'll try again:

Can you please do some power/clock speed monitoring while running RTX loads? Given that all RTX cards use 100% of their power targets doing rasterized rendering, I'm wondering just how much of this performance drop is due to power throttling of the CUDA cores to allow the RT cores to work, and, as a consequence, how much power the RT cores require for their current performance levels. This would be an invaluable insight, and far more interesting than pure performance numbers alone.

You wouldn't even have to do this across cards - just do a simple A/B comparison with RTX on and off on a single card at a single detail level.

I was interested in knowing that too. So far, this is the closest answer:

TechSpot said:
It is interesting to note across these tests that we are being RT core limited here. The higher the resolution, the higher the performance hit using the Ultra DXR mode, to the point where playing at 4K is more than 4x faster with DXR off. This also plays out when we spot checked power consumption: the cards were running at consistently lower power with DXR on, because the regular CUDA cores are being underutilized at such a low framerate.
 
If I was playing this game at 1440p, I would leave it in the 'off' position.
So do I, but... not with an RTX 20XX... rather with a Vega 64, given how it's dropping to what my 1070 actually cost (around $520 at the time), how a 2070 is also overpriced for no reason ($650-790 for me), and how little difference there is in performance (RTX off, of course) while the price difference is around $100 to $150...



Ah, a month and a half to wait... harsh end of the year for me :laugh:
 
The sad thing is the amount of technology this takes... yet humans generally overlook shadows and lighting. We don't wander around the real world paying attention to light interactions and shadows; if anything, we actively ignore those things so we can focus on what we're doing.

When a game is below the FPS level that I want, shadow detail and occlusion effects are the first things that get turned down. I feel like, to most people, they just don't make a huge difference - as long as they're there in semi-realistic detail, we pay about as much attention to them in game as we do in the real world: as little as humanly possible.
 
I don't get it. Why is the performance taking a hit at all? Either those RT cores are making all the other cores sit idle two-thirds of the time and we need 3x more of them to make up for the lack of RT power, or they are only doing part of the RTRT job and offloading the rest to the CUDA cores. This is not good.
It turns out that DLSS at 4K is actually 1440p with upscaling, and they call that a 35% performance increase when it is actually a drop, because going from 4K to 1440p should yield about a 100% performance increase. It is not doing DLSS for free by offloading all the work to the tensor cores; it is cheating. Anyway, I'm using a 2070 instead of a 1080 with RTX off and DLSS off for the time being, lol.
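
For the resolution point, a quick back-of-the-envelope pixel count (assuming, very roughly, that performance scales with pixels rendered, which it only does approximately):

```python
# Pixel-count comparison behind the "4K DLSS is really 1440p upscaled" argument.
# Assumes performance scales roughly with pixels shaded, which is only approximate.
pixels_4k = 3840 * 2160      # 8,294,400
pixels_1440p = 2560 * 1440   # 3,686,400

print(f"4K renders {pixels_4k / pixels_1440p:.2f}x the pixels of 1440p")  # ~2.25x
# So a card rendering internally at 1440p has far more headroom than the
# quoted ~35% uplift suggests, which is the complaint above.
```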
 
I just bought an "upgrade" card.. it just happened to have ray tracing.. a single 2080 Ti. Coming from a pair of 1070 cards in SLI, I only had a couple of options: a single 2080 Ti or a pair of 1080 Ti cards in SLI.

No way on this planet did I buy anything because it had ray-tracing abilities.. I don't know how I fit into the general scheme of things as regards ray tracing, but for me any upgrade option would have cost a lot, ray tracing or not..

My next GPU upgrade will also cost a lot.. it will be another 2080 Ti to match the one I already have..

trog
 
No trog, you don't. When 7nm hits the market, you sell the 2080 Ti, because it will be much slower than the next-gen ~~70, just like the 1070 vs the 980 Ti, and the 970 beating the first Kepler TITAN.

A single RTX 3070 will make the RTX TITAN 4608-core 12GB edition look like a toy; then you buy a single 3080 Ti.
 
If I'd pay $1,300 for a GPU, I'd expect flawless RTX gaming at 4K, not a slideshow. Maybe an RTX 4080 Ti can deliver.
By then you will have to pay $4,445 for the RTX 4080 Ti, assuming the MSRP increases by ~85% each generation, as it did when the RTX 2080 Ti was released...
GTX 1080 Ti MSRP: $699
RTX 2080 Ti MSRP: $1,299
RTX 3080 Ti MSRP: $2,405 (projected)
RTX 4080 Ti MSRP: $4,445 (projected)

I bet it will be sold out before it's even released.
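
For anyone wondering where those projections come from, here's a quick sketch of the compounding, assuming the flat ~85% per-generation increase above actually holds (which it obviously may not):

```python
# Seed with the real 1080 Ti / 2080 Ti launch MSRPs and compound a flat 85%
# per generation. Purely illustrative -- nobody knows future pricing.
launch_msrp = {"GTX 1080 Ti": 699, "RTX 2080 Ti": 1299}  # USD, actual
growth = 1.85  # assumed per-generation multiplier (~ the 699 -> 1299 jump)

price = launch_msrp["RTX 2080 Ti"]
for card in ("RTX 3080 Ti", "RTX 4080 Ti"):
    price *= growth
    print(f"{card}: projected MSRP ~${price:,.0f}")
# RTX 3080 Ti: projected MSRP ~$2,403
# RTX 4080 Ti: projected MSRP ~$4,446
```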
 
The sad thing is the amount of technology this takes... yet humans generally overlook shadows and lighting. We don't wander around the real world paying attention to light interactions and shadows; if anything, we actively ignore those things so we can focus on what we're doing.

When a game is below the FPS level that I want, shadow detail and occlusion effects are the first things that get turned down. I feel like, to most people, they just don't make a huge difference - as long as they're there in semi-realistic detail, we pay about as much attention to them in game as we do in the real world: as little as humanly possible.

That's it; this is why RT on these cards right now is, quite frankly, pathetic. It halves performance, all for the sum total of... better reflections. And here's the kicker: you won't notice them as you stomp around in multiplayer. Worse, you'll very likely turn it off!

Hence Nvidia trying to charge a premium for RT is pathetic too.
 
No trog, you don't. When 7nm hits the market, you sell the 2080 Ti, because it will be much slower than the next-gen ~~70, just like the 1070 vs the 980 Ti, and the 970 beating the first Kepler TITAN.
Wow, still peddling your BS, huh? Your example of the 1070 beating the 980Ti was a one-off, a singular event.

Two, you don't ever advise someone to keep buying down from whatever tier they are at. What happens is that in 2026 they end up with an RTX 5030 entry-level card that will of course decimate any of today's games, but can only play that year's new games as a slideshow.
 
Any word yet on the scene/method TPU tested? I feel more annoying than normal asking for a third time... but how many days should one wait for a simple answer? Why does this feel like it is some kind of secret?
 
Any word yet on the scene/method TPU tested? I feel more annoying than normal asking for a third time... but how many days should one wait for a simple answer? Why does this feel like it is some kind of secret?
That’s actually a good request. People can compare their own results, as well as account for differences or similarities with other review sites.
 
we weren't aware DICE added another setting called "DXR reflections quality," which by default was set to "Ultra" (other settings include Low, Medium, and High).

Do I get it right that the settings lower than Ultra were not tested?
So maybe "DXR Reflections Quality: Medium" looks OK with a 60% smaller performance penalty?
 
Lol, the regular CUDA cores are underutilized and power consumption falls with RT on because there aren't enough RT cores in the chip. They need 4 RT cores per SM of 64 CUDA cores, not just 1 RT core per SM as it currently is. Well, that was a bad move, Nvidia.
 
That’s actually a good request. People can compare their own results, as well as account for differences or similarities with other review sites.
Right. This isn't a canned benchmark, so nearly everyone is doing it differently, I would imagine.

I used the @wiz mention twice... no response, though he posted in threads afterward. I know he is extremely busy, which is why I don't want to ping him again... yet I, and I'm sure many others, would still like an answer.

Part of the reason I am asking is that in single-player mode there are three campaigns, and in my testing the first two (Under No Flag and Nordlys) aren't very hard on the card with RT enabled. In fact, with Ultra settings and RT enabled, I pulled over 60 FPS in them with an RTX 2070. The third campaign (Tirailleur), in the forest with water on the ground, KILLS the card (around 30 FPS). So I am wondering how he got those numbers. It LOOKS like it is an average of the three??? I don't know if...

A. My testing is off...
B. How this testing is done in the first place... all 3 scenes and an average?


The other thing is, I can't even play the goddamn game now. I swapped GPUs and it doesn't work. I double-click the icon, bfv.exe starts in Task Manager, gets to around 216 MB, and quits. I was on chat with EA for over an hour yesterday and they escalated the issue. I can't even friggin' play the game now........ weak sauce. I recall W1z mentioning something about swapping parts and limits, but I don't have a message or anything, and one would think ONE of the three EA reps I chatted with would have picked that out, as I intentionally mentioned it in each chat so they were aware.

EDIT: Put the 2070 back in and it works... WTF?!!!

we weren't aware DICE added another setting called "DXR reflections quality," which by default was set to "Ultra" (other settings include Low, Medium, and High).

Do I get it right that the settings lower than Ultra were not tested?
So maybe "DXR Reflections Quality: Medium" looks OK with a 60% smaller performance penalty?
If you look at the results, you will see Low/Medium/High. Pretty sure that is the RT setting tested there.

I don't know; so much confuses me (maybe it's only me) about how this was actually tested here at TPU...
 
Nice review! Cool tech, won't pay extra for it though.
On the game side of things, immersion in those screenshots is instantly ruined by realistic reflections that make all the models look bad.
 
In non-RTRT gaming you get that. Going from the 1080 to the 2080, I got an instant 40-50% boost in performance across my existing library of games. Not everyone is buying the Ti model. I spent much less than $1,000 for my 2080 and offset that cost with the sale of my 1080. Keep things in the proper perspective and you'll see the big picture.

RTRT is brand new and it will continue to advance and evolve. In the meantime, non-RTRT gaming is getting big boosts in performance.

Ok then, don't.

Yes they do..

Thanks for the not-so-subtle insult. What I am happy with is what I mentioned just above: the big boost the 2080 gives to all existing games. I'm also happy to be an early adopter for this run of GPUs because I understand in reasonable detail how RTRT works and what it has to offer the future of gaming.

Your opinion. Not everyone agrees. The benchmarks in this very review do not support that statement.

Your 2080 "big boost" was already on the shelves for two years, they called it a 1080ti and it was cheaper. It seems your 'big picture' needs to get a bit bigger than it is.

You have it back to front.. the so-called high prices are all about shareholders and stock prices.. the company has no obligation to its customers, only in the sense that it needs them as cash cows.. if they get it wrong and lose the cash cows, they have f-cked up.. time will tell on that one..

trog

Spot on! That is why many cash cows are now up in arms against RTX. See, you do get it; it just takes a while.
 
Your 2080 "big boost" was already on the shelves for two years, they called it a 1080ti and it was cheaper. It seems your 'big picture' needs to get a bit bigger than it is.
Maybe, but it didn't have RTRT. And that is something I'm excited about and looking forward to!
 
And that is something I'm excited about and looking forward to!

I'm sure you are; you bought a 2080, after all. :rolleyes:

Your militant praise is staggering. After entire pages you're still here replying to everyone with the same standard response: that RTX is great and we aren't enlightened enough to appreciate this astonishing feat that Nvidia has brought to us.

Whatever floats your boat, but you're wasting your time; literally no one believes you. Not that your words surprise me: few would have the boldness to admit that the rather expensive product they bought isn't as stellar as they initially thought. Your determination to defend your purchase to the bitter end is admirable, though.
 