
Battlefield V Tides of War GeForce RTX DirectX Raytracing

They are still not fixed; the developer used SSR to render them until the original bug is fixed (the lighting bug is not fixed either). Later, when they implement the system that mixes SSR with ray tracing, they will receive the ray tracing treatment.

In essence, you were WRONG: the patch improved IQ, and didn't downgrade it in the slightest.
Save your breath, the red camp will grasp at any straw if it keeps them from admitting AMD is being left behind. Even if you somehow get them to admit that since everybody is still figuring DXR out, rendering is expected to be on a roller-coaster for a while, they'll find something else to complain about. You're basically just feeding trolls.

PS For bonus points, ask one of them when they last raged over IQ in a game. Any game.
 
So there's still a long way to go. Not worth buying an RTX card; instead, wait for 2025 when consoles adopt ray tracing.
 
So there's still a long way to go. Not worth buying an RTX card; instead, wait for 2025 when consoles adopt ray tracing.
Straw man argument. Nobody in their right mind buys a Turing RTX card for DXR effects. The RTX 2070 is the same price as a GTX 1080 and a bit faster. The RTX 2080 is the same price as a GTX 1080 Ti with the same performance.
I get the notion that RTX cards are expensive, but when buying a new card today there is no good reason to prefer a Pascal card over a Turing one, DXR or not.
 
Yes, it does look better than the first attempt, but you will still need a $750 card to fully enjoy this new technology.
Also, where are the other RT games?
 
Yes, it does look better than the first attempt, but you will still need a $750 card to fully enjoy this new technology.
Also, where are the other RT games?
Isn't this the cart before the horse?

How and why would developers take the time, money, and effort to code for it when there are no GPUs on the market (and therefore no users) that could run it? How quick would adoption be if it worked that way?

Games are coming out in time.... only time will tell how quickly this catches on... or doesn't. Plenty have made that gamble before. Look at AMD and their APIs for example... those don't seem to be panning out and saturating the market...
 
So there's still a long way to go. Not worth buying an RTX card; instead, wait for 2025 when consoles adopt ray tracing.
The same as tessellation, Shader Model 3.0 / DX9.0c, MSAA, DX11, Vulkan:

it takes a long time and many gens of GPUs to become standard.
 
Yes, it does look better than the first attempt, but you will still need a $750 card to fully enjoy this new technology.
Also, where are the other RT games?
You're not going to enjoy DXR for the next year or two. But these cards at least let you run some tech demos to get an idea. As a bonus, you get to show those off to your friends who can't run them :D
 
And let's not kid ourselves here, AMD has always been behind Nvidia. Since the days of the HD 2900, AMD has never held the single-GPU crown once.

You are the only one kidding yourself here and it's hilarious. You should also try and fact check the nonsense you write.

Nobody in their right mind buys a Turing RTX card for DXR effects.

Yet here we are, four pages of people arguing over whether DXR is worth anything or not. Certainly someone cares.
 
You are the only one kidding yourself here and it's hilarious. You should also try and fact check the nonsense you write.

Shower misinformation with facts. Sure beats patronizing and insulting members. ;)
 
The HD 5870? Do you remember? Nvidia took 8 months to reply with its famous GTX 480, which was very good for winter days.
So between 2007 and 2018, AMD has bested Nvidia twice. Nvidia must have lost a lot of sleep over this.
Don't get me wrong, I gave a lot of praise to ATI, but that was back in the Radeon 8000 days. Between then and now, Nvidia has always been the obvious choice for me. Well, part of that has been Linux support which AMD has almost fixed. But now they have the Linux support and no card I'd want to buy :(
 
Shower misinformation with facts. Sure beats patronizing and insulting members. ;)

The internet is right there for anyone to fact check. If you don't, it's on you. I only care to point that out.
 
So between 2007 and 2018, AMD has bested Nvidia twice. Nvidia must have lost a lot of sleep over this.
Don't get me wrong, I gave a lot of praise to ATI, but that was back in the Radeon 8000 days. Between then and now, Nvidia has always been the obvious choice for me. Well, part of that has been Linux support, which AMD has almost fixed. But now they have the Linux support and no card I'd want to buy :(

Of course Nvidia has been the better option, I can't deny that; I've had Nvidia too for some gens now. That's obvious. But I was just stating that he was wrong. Sadly AMD can't compete on the high end anymore vs Nvidia. Even GTX 1080 levels won't be high end for long. They still have the 1080 Ti, 2080 and 2080 Ti.
 
The internet is right there for anyone to fact check. If you don't, it's on you. I only care to point that out.
Right now, I'm imagining you're a defense attorney.
"Your honor, the prosecution is wrong!"
"Why would you say that?"
"The truth is out there, you just have to look."
 
When the intentions are blatant so are my words.
There's something to be said for taking the high road... be part of the solution here, not the problem. Being blatant really isn't the issue. It's clear you wanted to lambaste the guy. Just saying, chill out and be cool instead of breaking the rules and acting like that towards people. :)

Nobody is perfect... I'm not... just saying though. :)
 
The same as tessellation, Shader Model 3.0 / DX9.0c, MSAA, DX11, Vulkan:

it takes a long time and many gens of GPUs to become standard.

There is nothing wrong with pre-Vulkan/DX12. DXR tech is still too young, so it's not worth buying an RTX card for it, and DXR requires enormous resources that aren't there yet. I'm sorry if any Nvidia fan is not happy with my post.
 
The HD 5870? Do you remember? Nvidia took 8 months to reply with its famous GTX 480, which was very good for winter days.
You declare winners within one release cycle, not on the basis of who releases a card first. So yeah, the HD 5870 was faster than the GTX 280, but that was the previous gen; then Nvidia responded in the end with the superior-performing GTX 480. So within the same gen, the 480 won.
 
What are you smoking? Low/Medium vs. High/Ultra look totally different!

Seriously, wth is going on with all the big sites? Everyone just rushes to publish quick benchmarks, but no one bothers to check whether the IQ is identical or whether they sacrificed quality for performance.
These things, especially with such big boosts, always need a before/after comparison.

Agreed. Go watch Digital Foundry on YouTube. They do ACTUAL work on image quality in games.

TPU: "No differences between Ultra and Low." HAHAHA, makes me laugh when, in their own screenshots, something that takes up 20% of the shot looks completely different.
 
Vegetation was ignored by the ray tracer since day one.
[attached screenshot]

Is that from TechSpot ?

TechSpot - Revisiting Battlefield V Ray Tracing Performance


However there is a visual quality downgrade that I spotted in this area, this new patch has introduced some artifacting to the way ray traced reflections are handled. Previously, everything that was supposed to be reflected, was reflected, so you got these nice, clean and for the most part supremely accurate reflections that blew away screen space reflections for visual quality. However with the latest patch, some screen space reflection-like artifacts have crept back into ray traced surfaces.

If you look closely at the water surface, at times an object will move across the reflected area like an AI character or a falling leaf, and for a brief moment you’ll spot the classic screen space reflection-like streak caused by that object obstructing the reflection’s path. My guess here is that DICE have chosen to more aggressively cull rays from objects not in view, which has improved performance significantly, but it’s at the cost of the occasional artifact where something that should still be in view is getting culled erroneously.
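The TechSpot quote above is essentially guessing at a visibility-culling tradeoff: rays are skipped for objects the culler thinks don't matter, and reflections of those objects then vanish or streak. A minimal sketch of that tradeoff (purely illustrative; `SceneObject` and `resolve_reflection` are made-up names, not DICE's or DXR's actual code):

```python
from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    on_screen: bool      # visible in the primary (camera) view
    in_reflection: bool  # should appear in a reflective surface

def resolve_reflection(obj: SceneObject, aggressive_cull: bool) -> str:
    """Pick the reflection source for one object.

    'ray_traced' - full RT reflection (correct)
    'artifact'   - the object was culled from the RT pass, and the
                   screen-space fallback can't see it either, so its
                   reflection wrongly drops out or streaks
    """
    if not obj.in_reflection:
        return "not_reflected"
    if aggressive_cull and not obj.on_screen:
        # Culled from the RT pass; SSR only has on-screen pixels to
        # sample, so an off-screen object's reflection is lost.
        return "artifact"
    return "ray_traced"

# A falling leaf that has left the camera frame but should still be
# visible in the water's reflection:
leaf = SceneObject("falling_leaf", on_screen=False, in_reflection=True)
print(resolve_reflection(leaf, aggressive_cull=False))  # ray_traced
print(resolve_reflection(leaf, aggressive_cull=True))   # artifact
```

The point of the toy model: the aggressive-culling flag buys performance by tracing fewer rays, but anything reflected-yet-off-screen falls through to the artifact case, which matches the SSR-like streaks described above.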
 
It's still an "it works on my computer" thing to me as far as I'm concerned, except substitute "computer" with "Nvidia-endorsed game".
 
You declare winners within one release cycle, not on the basis of who releases a card first. So yeah, the HD 5870 was faster than the GTX 280, but that was the previous gen; then Nvidia responded in the end with the superior-performing GTX 480. So within the same gen, the 480 won.

Yeah, even if they took 8 months to release the card, right? That's the same as if AMD, around May 2019, had a card that beat the RTX 2080 Ti, while Nvidia had had the best GPU for 8 months or so. 8 months is a lot of time, and I remember people going for the HD 5870 in that time because AMD beat Nvidia.
 
Is that from TechSpot ?
Yeah.

It's because TechSpot are a bunch of idiots; previously, neither vegetation nor leaves were reflected in RTX, but DICE made them reflected with SSR on top of RTX. So they improved the total reflection quality.
Yeah, even if they took 8 months to release the card
Yeah, even if it took them a year and a half (like Vega), it still counts as one cycle. So the 1080 Ti beat the Vega 64, not the Fury X.
 
Yeah.

It's because TechSpot are a bunch of idiots; previously, neither vegetation nor leaves were reflected in RTX, but DICE made them reflected with SSR on top of RTX. So they improved the total reflection quality.

Yeah, even if it took them a year and a half (like Vega), it still counts as one cycle. So the 1080 Ti beat the Vega 64, not the Fury X.

Yeah, so during that year and a half no one could dare say AMD was on top, because "hey, wait! Nvidia is still yet to release their card." Impeccable logic!
 