
Battlefield V with GeForce RTX DirectX Raytracing

Do we live in the same Universe?
Clearly you simply don't understand real-time ray-tracing. And yes, 30FPS is acceptable. There are a ton of really great games that have a 30FPS limit built into them. They're still playable and still fun. You are displaying an elitist attitude. Kick that negativity down a notch or two and open your mind to the possibilities.
 
Clearly you simply don't understand real-time ray-tracing. And yes, 30FPS is acceptable. There are a ton of really great games that have a 30FPS limit built into them. They're still playable and still fun. You are displaying an elitist attitude. Kick that negativity down a notch or two and open your mind to the possibilities.

Patently false for a large number of people. If I wanted a migraine and to be extremely frustrated, I'd play at 30 fps. You'd also need adaptive sync to even see anything through all the tearing and stuttering.
 
Is this RT core always active, and not power gated?
That's a great piece of speculation, and it may end up being exactly how first-gen RT works and why it hogs power. At the announcement, Jensen said something about having to fuse/integrate these new cores into the existing raster pipeline in order to achieve the speedup that was marketed in his presentation. The alternative, of course, would be to use the new cores in a separate (tracing) pass, but then you'd need large and fast buffers to hold the intermediate work. That adds complexity and latency, much like having RT on an add-on board.
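To put a rough number on "large and fast buffers", here's a back-of-the-envelope sketch in Python. The 32-byte per-ray hit record is purely an assumption for illustration; real implementations will differ.

```python
# Rough estimate of the intermediate buffer a separate ray-tracing pass
# would need to hand its results back to the raster pipeline.
# The 32-byte hit record (position, normal, material id...) is assumed.
width, height = 2560, 1440   # 1440p render target
rays_per_pixel = 1           # one reflection ray per pixel
bytes_per_record = 32        # assumed size of one ray hit record

buffer_bytes = width * height * rays_per_pixel * bytes_per_record
print(f"{buffer_bytes / 2**20:.0f} MiB per bounce")  # ~112 MiB
```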

Just think about how NVIDIA improved Simultaneous Multi-Projection in Pascal to 16 viewports in one pass, up from Maxwell's 6, in order to accelerate VR rendering. Add enough registers to the geometry transform stage and throughput increases via SIMD-like instructions that push the same scene geometry through multiple view matrices in the same pass. It is basically pipeline injection, because the entire render pass hasn't completed yet. While latency is reduced compared to the alternative, it is harder to gate power.
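A loose illustration of that SIMD-like multi-view transform, sketched with NumPy (the vertex count and identity view matrices are placeholders; the hardware obviously isn't running Python):

```python
import numpy as np

# Simultaneous Multi-Projection, conceptually: the same scene geometry is
# pushed through many view matrices in one batched pass instead of being
# re-submitted once per viewport.
num_views = 16                             # Pascal SMP: up to 16 viewports per pass
verts = np.random.rand(10_000, 4)          # homogeneous vertices (x, y, z, w), placeholder data
views = np.stack([np.eye(4)] * num_views)  # stand-in per-viewport view matrices

# One batched multiply: every vertex through every view matrix without
# re-reading the geometry. Result shape: (num_views, num_verts, 4).
transformed = np.einsum('vij,nj->vni', views, verts)
print(transformed.shape)  # (16, 10000, 4)
```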
 
Clearly you simply don't understand real-time ray-tracing. And yes, 30FPS is acceptable. There are a ton of really great games that have a 30FPS limit built into them. They're still playable and still fun. You are displaying an elitist attitude. Kick that negativity down a notch or two and open your mind to the possibilities.

$1,300 is a premium price, so I have every right to expect elite performance from the product.
 
Give it up, lex, I'm not gonna argue with you too. The numbers speak for themselves. THAT is what I'm standing by. If you're happy with 1060 performance for $1,200, then I guess Nvidia has done its job, you're pretty easily satisfied, and you're the type of customer they were counting on. Yes, NOW I'm going after an owner, you. Wanna be a target? Fine. RTRT has basically been proven to be not ready for the mainstream; as I've said, the numbers prove it. That would be the equivalent of me using my Vega at 720p to get playable FPS. Sorry, that's ridiculous. They marketed this card on this feature and it can't deliver without MAJOR compromises, and this is as basic as RT gets. Tomb Raider is having even more trouble with performance, so don't expect this to get any better. The hardware just isn't good enough. Turing 2 will hopefully correct that, or else RT is going to remain in the professional space a few years longer.
The only trolling here was Rhino's. I've stuck to the facts, not your fantasy of RT for all. We'll get there, but Turing is not going to be the one to make it happen. If we take RTRT out of the mix, Turing is still overpriced, but it's a very capable card.

Clearly you simply don't understand real-time ray-tracing. And yes, 30FPS is acceptable. There are a ton of really great games that have a 30FPS limit built into them. They're still playable and still fun. You are displaying an elitist attitude. Kick that negativity down a notch or two and open your mind to the possibilities.
Oh give me a break, lex!! 30FPS in an open-world MP FPS? I clearly do understand it, and it's too taxing for the available hardware. You're displaying a severe lack of sense. NOBODY but a bandwagoner like you would defend such poor performance, trying to peddle 30FPS as acceptable for a game like this. It's not acceptable on my Vega any more than it's acceptable on a 2080. Accuse me of trolling, yet you expect people to swallow that BS? Stockholm Syndrome already?
 
Turing 2 had better come in a 7nm flavor so they can squeeze more RT cores onto the die, because the increase is badly needed. Power requirements may not go up drastically thanks to the shrink, but heat density will be a major challenge and will almost require water cooling to sustain current clocks. Taking power and thermal limitations into consideration, Turing 1 is quite compromised, just not sold as such.
 
Oh give me a break, lex!! 30FPS in an open-world MP FPS?

Here's the real kicker: people on Xbox One are getting better performance than whoever chooses to enable RTX. XBOX ONE.

The PC master race has transcended many boundaries, it seems.
 
Here's the real kicker: people on Xbox One are getting better performance than whoever chooses to enable RTX. XBOX ONE.

The PC master race has transcended many boundaries, it seems.
Wow now that’s actually sad..
 
$1,300 is a premium price, so I have every right to expect elite performance from the product.
In non-RTRT gaming you get that. Going from the 1080 to 2080 I got an instant 40-50% boost in performance in my existing library of games. Not everyone is buying the Ti model. I spent much less than $1,000 for my 2080 and offset that cost with the sale of my 1080. Keep things in the proper perspective and you'll see the big picture.

RTRT is brand new and it will continue to advance and evolve. In the meantime, non-RTRT gaming is getting big boosts in performance.
Give it up, lex, I'm not gonna argue with you too.
Ok then, don't.
The numbers speak for themselves. THAT is what I'm standing by.
Yes, they do.
If you're happy with 1060 performance for $1,200, then I guess Nvidia has done its job, you're pretty easily satisfied, and you're the type of customer they were counting on. Yes, NOW I'm going after an owner, you.
Thanks for the not-so-subtle insult. What I am happy with is what I mentioned just above: the big boost the 2080 gives to all existing games. I'm also happy to be an early adopter for this run of GPUs, because I understand in reasonable detail how RTRT works and what it has to offer the future of gaming.
RTRT has basically been proven to be not ready for the mainstream
Your opinion. Not everyone agrees. The benchmarks in this very review do not support that statement.
 
PCWorld was doing a streaming first-impressions look at BF5 RTX, and not every reflective surface is reflecting objects or effects.
Was just watching that. And was going to post it. Noticed a few things not being fully rendered.

What I'm seeing there is a fully playable game experience, with settings on Ultra at mostly above 80FPS.
 
Was just watching that. And was going to post it. Noticed a few things not being fully rendered.

What I'm seeing there is a fully playable game experience, with settings on Ultra at mostly above 80FPS.

He is also avoiding any interaction with players to see the reflections. Unless you like playing a multiplayer game by yourself, looking at your own reflection. Then it's the perfect experience.

The earlier video I posted had much more interaction and the FPS was much worse, and again, this is a single RTX ray-tracing feature (just reflections). Who knows how much more of a performance hit will be added if any other effects are introduced.
 
I would like to think this will stop these cards from selling, but alas, that is probably wishful thinking. Until AMD get their act together, Nvidia are just sat there, arms folded, saying "Well, if you want a top-end fast GPU, what else you gonna get, eh?! Now give us ya money, you muppets!"
 
Your opinion. Not everyone agrees. The benchmarks in this very review do not support that statement.
Oh lex, please show me where I'm wrong. I know you love this bone and can't put it down. Where are the acceptable performance numbers? The ones that show RTRT in a positive light? No, sorry, 30FPS is not positive. Being forced to play at 1080p to get acceptable FPS is not positive. This game isn't a walking simulator. There is no way to justify a 58% loss in performance just by turning DXR on. That's the absolute definition of diminishing returns.

This is PhysX 2.0: big, shiny, bad performance.
 
In non-RTRT gaming you get that. Going from the 1080 to 2080 I got an instant 40-50% boost in performance in my existing library of games.

You shouldn't be comparing the 1080 vs. the 2080, as NVIDIA's MSRP for the 1080 was $549 and the 2080 is $799. The real comparison is 1080 vs. 2070 (5-10% increase in perf) and 1080 Ti vs. 2080 (0-4% increase in perf)… The RTX 2080 Ti can be compared with the Titan V (8% faster on average). And btw, 1080 Ti vs. 2080 Ti = +24% at 1440p and +30% at 4K, not a 40-50% boost. So you're paying $600-700 more for a ~27% increase in FPS.
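If it helps, here's a minimal sketch of that value math in Python. The $1,299 figure for the 2080 Ti is an assumption back-solved from the "$600-700 more" claim; all the percentages are the poster's numbers.

```python
# Price vs. performance from the figures above. The 2080 Ti price is an
# assumed street price (~$600 over a $699 1080 Ti); the uplift is the
# poster's claimed +30% at 4K.
price_1080ti = 699     # GTX 1080 Ti MSRP
price_2080ti = 1299    # assumed RTX 2080 Ti street price
fps_gain_4k = 0.30     # claimed +30% FPS at 4K

price_increase = (price_2080ti - price_1080ti) / price_1080ti
perf_per_dollar = (1 + fps_gain_4k) / (price_2080ti / price_1080ti)

print(f"+{price_increase:.0%} price for +{fps_gain_4k:.0%} FPS")  # +86% price, +30% FPS
print(f"relative perf/$: {perf_per_dollar:.2f}x the 1080 Ti")     # ~0.70x, i.e. worse value
```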

 
He is also avoiding any interaction with players to see the reflections. Unless you like playing a multiplayer game by yourself, looking at your own reflection. Then it's the perfect experience.

The earlier video I posted had much more interaction and the FPS was much worse, and again, this is a single RTX ray-tracing feature (just reflections). Who knows how much more of a performance hit will be added if any other effects are introduced.
He's demonstrating the effects on offer. I think it's cool looking.
 
There is no way to justify a 58% loss in performance just by turning DXR on.

Actually, at 1080p, if you disable DXR you gain a whopping 2.5x performance improvement. And mind you, that's in a fairly CPU-bound scenario.
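For what it's worth, those two figures are nearly the same measurement expressed two ways; a quick sanity check (the 58% and 2.5x are the posters' numbers):

```python
# A 58% FPS loss when enabling DXR is the same thing as a ~2.4x gain
# when disabling it, which lines up with the quoted "2.5x".
loss = 0.58                       # FPS drops 58% with DXR on
retained = 1 - loss               # 42% of the frame rate remains
speedup_when_off = 1 / retained   # 1 / 0.42 ≈ 2.38
print(f"DXR off is {speedup_when_off:.2f}x faster than DXR on")
```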
 
I'll compare what the heck I want, thanks. I owned a 1080 and replaced it with a 2080 in the same machine. That's my comparison basis.
I won't deny that A: RTRT is pretty neat, but we're at least a generation away from it being practical, and B: if we take RTRT out of the equation, it's a great card.
Just don't try to pretend this is acceptable in its current form; it's purely proof of concept.
 
What I've found in testing this game is that the first two SP campaigns are pretty easy as far as RT goes. In fact, the GIGA 2070 we used ran the first two at over 60 FPS using an 8700K @ 4.7 GHz - where we run our benchmarks (67 and 64 FPS respectively). With RT off, that doubled to ~110-155 FPS. It is the last campaign, next to the locked one, that really cripples the cards. The water on the ground that is RT'd, and whatever else... kills it. In that test, this card averaged 29 FPS. With RT off it pulled 48... so there is something up in that level, outside of RT, that really puts a hurting on GPUs REGARDLESS of RT being enabled in-game.


Outside of that, I'm not jumping in the ball pit with the kids... I'll get sick. It's like a friggin' pre-school in this place, LOL!

Anyway, moving on to the 2080... then the 2080 Ti.


EDIT: I sure would love for W1z (or someone who knows) to share which scenes were benchmarked for this and the previous review...
 
New tech = Bleeding edge.

I just can't see spending so much cash on something that is still in the process of teething. I'll give it some time to mature a little. (a lot)

The problems with some of these 2080 Ti cards piss me off, because prices on 1080 Ti cards have stopped dropping. (not good)
 
Needs another two generations at least before RT will take off, if ever. So GTX 4080. This performance is sh**e.
Yeah, I'm quite interested to see what AMD plans to do with it. I'd think I've got a lot of idle compute units that would love to do the math, but I'm sure performance would be just as bad. I'm also curious whether, with SLI enabled, it can leverage both cards. That would be the only real viable setup with a decent balance of performance.
 
I'll compare what the heck I want, thanks. I owned a 1080 and replaced it with a 2080 in the same machine. That's my comparison basis.
Lex is actually right on this. Price doesn't matter in a comparison until you're doing price vs. performance. A direct comparison between the 2080 and the 1080 is completely justified; they occupy the same place within each generation.
 