Wednesday, September 5th 2018
DICE to Dial Back Ray-tracing Eye-candy in Battlefield V to Favor Performance
EA-DICE, in an interview with Tom's Hardware, shared some juicy under-the-hood details about the PC release of "Battlefield V." The most prominent of these is that the commercial release of the game will slightly dial back the ray-tracing eye-candy we saw at NVIDIA's GeForce RTX launch-event demo. DICE is rather conservative about its implementation of ray-tracing, and assures players that toning it down won't make a tangible difference to the game's production design, and will certainly not affect gameplay (e.g., you won't be at a competitive disadvantage just because a squeaky-clean car in the middle of a warzone doesn't reflect an enemy sniper's glint accurately).
"What I think that we will do is take a pass on the levels and see if there is something that sticks out," said Christian Holmquist, technical director at DICE. "Because the materials are not tweaked for ray tracing, but sometimes they may show off something that's too strong or something that was not directly intended. But otherwise we won't change the levels-they'll be as they are. And then we might need to change some parameters in the ray tracing engine itself to maybe tone something down a little bit," he added. Throughout the game's levels and maps, DICE identified objects and elements that could hit framerates hard when ray-tracing is enabled, and "dialed-down" ray-tracing for those assets. For example, a wall located in some level (probably a glass mosaic wall), hit performance too hard, and the developers had to tone down its level of detail.At this time, only GeForce RTX series users have access to the ray-tracing features in Battlefield V, and can turn them off to improve performance. There are no DXR fallbacks for people with other graphics cards (GeForce GTX or Radeon). "…we only talk with DXR. Because we have been running only NVIDIA hardware, we know that we have optimized for that hardware. We're also using certain features in the compiler with intrinsics, so there is a dependency. That can be resolved as we get hardware from another potential manufacturer. But as we tune for a specific piece of hardware, dependencies do start to go in, and we'd need another piece of hardware in order to re-tune." DICE appears to be open to AMD sending hardware with its own DXR feature-set implementation, so it could add it to Battlefield V at a later stage. The RTX features themselves will only make it via a day-zero patch when the game releases in October, and won't feature in tomorrow's open-beta. There's also no support for NVIDIA SLI. The interview also reveals that Battlefield V has been optimized for processors with up to 6 cores and 12 threads.
Source: Tom's Hardware
"What I think that we will do is take a pass on the levels and see if there is something that sticks out," said Christian Holmquist, technical director at DICE. "Because the materials are not tweaked for ray tracing, but sometimes they may show off something that's too strong or something that was not directly intended. But otherwise we won't change the levels-they'll be as they are. And then we might need to change some parameters in the ray tracing engine itself to maybe tone something down a little bit," he added. Throughout the game's levels and maps, DICE identified objects and elements that could hit framerates hard when ray-tracing is enabled, and "dialed-down" ray-tracing for those assets. For example, a wall located in some level (probably a glass mosaic wall), hit performance too hard, and the developers had to tone down its level of detail.At this time, only GeForce RTX series users have access to the ray-tracing features in Battlefield V, and can turn them off to improve performance. There are no DXR fallbacks for people with other graphics cards (GeForce GTX or Radeon). "…we only talk with DXR. Because we have been running only NVIDIA hardware, we know that we have optimized for that hardware. We're also using certain features in the compiler with intrinsics, so there is a dependency. That can be resolved as we get hardware from another potential manufacturer. But as we tune for a specific piece of hardware, dependencies do start to go in, and we'd need another piece of hardware in order to re-tune." DICE appears to be open to AMD sending hardware with its own DXR feature-set implementation, so it could add it to Battlefield V at a later stage. The RTX features themselves will only make it via a day-zero patch when the game releases in October, and won't feature in tomorrow's open-beta. There's also no support for NVIDIA SLI. The interview also reveals that Battlefield V has been optimized for processors with up to 6 cores and 12 threads.
Comments
Quake Wars: software.intel.com/en-us/articles/quake-wars-gets-ray-traced
Quake 2: amietia.com/q2pt.html
2004: 40 Xeon CPUs doing RTRT in Quake3 at 512x512px @ 20fps - called a great achievement (I remember this one from the press).
2009: 4 Xeon CPUs doing RTRT in ET:QW at 720p @ 20-35fps - fantastic, finally usable without HPC.
2018: a single GPU doing RTRT in BF V at 4K @ 20-30fps - awful, pointless.
And even if you play ET on highest settings and BF on lowest, the difference in details (and minimum requirements) is just enormous.
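To put those three milestones on one scale: counting only primary rays (one per pixel) and taking midpoints of the quoted frame-rate ranges, the throughput gap is enormous. A quick illustrative calculation follows; it ignores bounce, shadow, and reflection rays, so it actually understates the real difference:

```cpp
// Back-of-the-envelope primary-ray throughput for the three milestones above.
// Frame rates are midpoints of the quoted ranges; purely illustrative.
#include <cstdio>

int main()
{
    struct Milestone { const char* name; long w, h; double fps; };
    const Milestone m[] = {
        { "2004: Quake 3, 40 Xeons",  512,  512, 20.0 },
        { "2009: ET:QW, 4 Xeons",    1280,  720, 27.5 },  // midpoint of 20-35
        { "2018: BF V, single GPU",  3840, 2160, 25.0 },  // midpoint of 20-30
    };
    for (const Milestone& s : m)
        std::printf("%-26s %6.1f Mrays/s (primary only)\n",
                    s.name, s.w * s.h * s.fps / 1e6);
    return 0;
}
```

Even by this crude measure (roughly 5.2 vs. 25.3 vs. 207.4 Mrays/s), a single 2018 GPU pushes about 40 times the primary-ray throughput of the 2004 forty-Xeon cluster, while tracing a vastly more complex scene.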
Are RTX cards too early? Not for those that are fine with 1080p. Everyone else can wait until RTX cards match their requirements. What we have is a PoC, but also a consumer product that gaming studios can work with.
It really doesn't matter when RTRT gets included in GPUs: whenever it happens, it will mean a significant performance drop from the level we've become used to by that point.
"Amazingly, this happened in 2004, a time when most people rejected the concept of real-time ray tracing. " - nothing changed.
But really, ray tracing for lighting has been used in movies for cinematic effects for at least a couple of decades now, just not in real-time. Peter Jackson was quite the tech investor when he started on the Lord of the Rings trilogy, and those movies have been out forever. :p Anyway, with that said, bringing ray-tracing into the gaming arena has been worked on extensively by all players in the CPU/GPU field, and the idea that such effects could potentially cripple a system isn't new either, as you've pointed out. So devs need to very carefully balance effects versus performance not just for GPUs "in general", but now quite specifically, and that's a lot of work. We should be commending the effort that is being put into this tech now. If anyone thinks that DICE delayed a month because of RTX alone, they are greatly mistaken... it's been part of the engine since day one. The only thing that has changed, maybe, is the specs of the cards that support it, giving them a clear line in the sand to optimize to. But given the huge size of this franchise, they should have had such data for a long time, and if they haven't, well, that's hilarious to me as well.
Actually, RT is more like the tasks we usually run on CPUs. Each ray (or group of parallel rays) is traced in a single thread: it moves through different matter, it bounces and diffracts. It's a complicated serial problem, and running it on GPGPU means a massive number of cycles.
CPU rendering workstations (high-core-count Xeons) are still doing pretty well against the best GPUs despite having 50 or 100 times fewer cores. :-)
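To illustrate that per-ray serial dependency: in even the simplest tracer, bounce N+1 cannot start until the hit for bounce N is known. A minimal self-contained sketch follows, where a single unit sphere stands in for a scene and the "shading" is a made-up toy contribution:

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 {
    float x, y, z;
    Vec3 operator+(const Vec3& b) const { return {x + b.x, y + b.y, z + b.z}; }
    Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
    float dot(const Vec3& b) const { return x * b.x + y * b.y + z * b.z; }
};

struct Ray { Vec3 o, d; };  // origin, unit-length direction

// Closest hit against a unit sphere at the origin; t < 0 means "miss".
float hitSphere(const Ray& r)
{
    float b = 2.0f * r.o.dot(r.d);
    float c = r.o.dot(r.o) - 1.0f;
    float disc = b * b - 4.0f * c;
    if (disc < 0.0f) return -1.0f;
    return (-b - std::sqrt(disc)) / 2.0f;
}

// The key point: this loop is a dependency chain. Segment N+1's origin and
// direction come from segment N's hit, so bounces execute strictly in order.
float traceRay(Ray ray, int maxDepth)
{
    float energy = 0.0f;
    for (int depth = 0; depth < maxDepth; ++depth) {
        float t = hitSphere(ray);
        if (t < 0.0f) break;                // ray escaped the scene
        Vec3 p = ray.o + ray.d * t;         // hit point
        Vec3 n = p;                         // unit sphere: normal == hit point
        energy += 0.5f / float(depth + 1);  // toy shading contribution
        // Mirror bounce: the next segment cannot be computed before this hit.
        ray = { p + n * 1e-4f, ray.d + n * (-2.0f * ray.d.dot(n)) };
    }
    return energy;
}

int main()
{
    Ray primary{{0.0f, 0.0f, -3.0f}, {0.0f, 0.0f, 1.0f}};
    std::printf("accumulated energy: %f\n", traceRay(primary, 4));
    return 0;
}
```

GPUs win back parallelism by tracing millions of such independent rays side by side; the catch is that neighboring rays diverge after a bounce or two, which is exactly what makes RTRT costly on wide SIMD hardware.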
As Cadeveca points out, Nvidia is reinventing the wheel, and we're not convinced. Caustic was buried over ten years ago, AFAIK, and they had working ASICs that they said could scale well.
Nvidia bought and buried them, and now, ten years on-ish: look, a wheel.
Nope, sorry, it was Imagination Technologies that bought Caustic (I am stressed at work ATM, brain ache incoming).
And they sell add-in cards, apparently.
Still, no thank you, Nvidia.
Make no mistake, I can't wait for the day when an entire scene is ray-traced in real-time at 60+ fps at any resolution, even 4K or more. From there on, we'll basically have reached peak realism; the only things left to build on will be the number of rays and bounces used, plus more work on textures and models. There won't have to be a "creative" part to achieving effects; they'll just happen naturally. But we're still far away from there.
Imagine current ray-tracing maturity being somewhere on the level of the first pixel shaders, when everyone was obsessed with them and basically only used them to make water...
Sorry to say this, but it seems like many of you just heard about RTRT for the first time... Of course the idea isn't new. But the consumer-friendly implementation is a first, and should remain unmatched for a while. That's the great part.
RTRT itself is nothing special... at least for me.
Again: it seems like some people here just had their first contact with this idea. But they quickly found info that someone has done it already 10 years ago, so it's clearly nothing special.
I guess on this forum the only exciting thing about games is more fps. Booooring.
I will likely get one somehow at some point just to try, but I insist on high resolution and image quality, so it's not for me, I know this.
Still, I'm not saying it's rubbish, just that it'll be rubbish to me and many others.
And RTRT isn't new; you definitely are not the only one who knows what's what here. You talk a lot, but that doesn't mean it's worth reading.
And Nvidia invented a computationally cheaper algorithm, and the hardware to run a less-than-equal version of ray tracing (it is not fully ray-traced), yet it still heavily impacts performance. I'm not paying over the odds for that.
By the sound of it, a 2080 Ti should play pretty much any present game at 4K@60 without RTX, or some games with RTX on at 1080p@60.
Those two resolutions work just fine on my monitor, but do you really believe I, or any other 4K owner, ever go down to 1080p?
Also because of this: a 2080/2080 Ti owner is going to connect it to a pretty decent monitor, and they're not going to be happy playing the latest AAA games at 1080p while swapping back to 4K for everything else. So I expect dev adoption to die off, even with big NV money about.
I don't game on the PC anymore; the 1050 wasn't bought for gaming anyway. Well, we've already established that my knowledge is limited by what I read and yours comes from owning. I do hope you'll get an RTX card at some point and your rendering knowledge will explode. It's worth it. :-) You have literally no idea what you're talking about. :-D
1) There's no such thing as "fully raytraced".
2) Nvidia's RT approaches are among the most advanced available today. Sure, RTRT is nowhere near professional renders or even movies (which Nvidia is famous for, BTW), but RT cores speed up serious rendering as well. It's a huge achievement. And it works in tandem with tensor cores. It's just a showcase of technological supremacy.
Whether these first RTX cards are successful or not is another story; we'll see some sales statistics after the launch. IMO the preorders look promising.
Nvidia can now build professional rendering machines around this new tech, so the R&D will pay off anyway.
Is your money where your mouth is?
Or
Is your mouth where your money's earned?
You're like a written, ever-changing Nvidia advert with ultimate BS appeal.