Wednesday, September 5th 2018
DICE to Dial Back Ray-tracing Eye-candy in Battlefield V to Favor Performance
EA-DICE, in an interview with Tom's Hardware, shared some juicy under-the-hood details about the PC release of "Battlefield V." The most prominent of these is that the commercial release of the game will slightly dial back the ray-tracing eye-candy we saw in the demo at NVIDIA's GeForce RTX launch event. DICE is being rather conservative with its implementation of ray-tracing, and assures players that scaling it back won't make a tangible difference to the game's production design, and will certainly not affect gameplay (e.g., you won't be at a competitive disadvantage just because a squeaky-clean car in the middle of a warzone doesn't reflect an enemy sniper's glint accurately).
"What I think that we will do is take a pass on the levels and see if there is something that sticks out," said Christian Holmquist, technical director at DICE. "Because the materials are not tweaked for ray tracing, but sometimes they may show off something that's too strong or something that was not directly intended. But otherwise we won't change the levels-they'll be as they are. And then we might need to change some parameters in the ray tracing engine itself to maybe tone something down a little bit," he added. Throughout the game's levels and maps, DICE identified objects and elements that could hit framerates hard when ray-tracing is enabled, and "dialed-down" ray-tracing for those assets. For example, a wall located in some level (probably a glass mosaic wall), hit performance too hard, and the developers had to tone down its level of detail.At this time, only GeForce RTX series users have access to the ray-tracing features in Battlefield V, and can turn them off to improve performance. There are no DXR fallbacks for people with other graphics cards (GeForce GTX or Radeon). "…we only talk with DXR. Because we have been running only NVIDIA hardware, we know that we have optimized for that hardware. We're also using certain features in the compiler with intrinsics, so there is a dependency. That can be resolved as we get hardware from another potential manufacturer. But as we tune for a specific piece of hardware, dependencies do start to go in, and we'd need another piece of hardware in order to re-tune." DICE appears to be open to AMD sending hardware with its own DXR feature-set implementation, so it could add it to Battlefield V at a later stage. The RTX features themselves will only make it via a day-zero patch when the game releases in October, and won't feature in tomorrow's open-beta. There's also no support for NVIDIA SLI. The interview also reveals that Battlefield V has been optimized for processors with up to 6 cores and 12 threads.
Source:
Tom's Hardware
"What I think that we will do is take a pass on the levels and see if there is something that sticks out," said Christian Holmquist, technical director at DICE. "Because the materials are not tweaked for ray tracing, but sometimes they may show off something that's too strong or something that was not directly intended. But otherwise we won't change the levels-they'll be as they are. And then we might need to change some parameters in the ray tracing engine itself to maybe tone something down a little bit," he added. Throughout the game's levels and maps, DICE identified objects and elements that could hit framerates hard when ray-tracing is enabled, and "dialed-down" ray-tracing for those assets. For example, a wall located in some level (probably a glass mosaic wall), hit performance too hard, and the developers had to tone down its level of detail.At this time, only GeForce RTX series users have access to the ray-tracing features in Battlefield V, and can turn them off to improve performance. There are no DXR fallbacks for people with other graphics cards (GeForce GTX or Radeon). "…we only talk with DXR. Because we have been running only NVIDIA hardware, we know that we have optimized for that hardware. We're also using certain features in the compiler with intrinsics, so there is a dependency. That can be resolved as we get hardware from another potential manufacturer. But as we tune for a specific piece of hardware, dependencies do start to go in, and we'd need another piece of hardware in order to re-tune." DICE appears to be open to AMD sending hardware with its own DXR feature-set implementation, so it could add it to Battlefield V at a later stage. The RTX features themselves will only make it via a day-zero patch when the game releases in October, and won't feature in tomorrow's open-beta. There's also no support for NVIDIA SLI. The interview also reveals that Battlefield V has been optimized for processors with up to 6 cores and 12 threads.
62 Comments on DICE to Dial Back Ray-tracing Eye-candy in Battlefield V to Favor Performance
Hypothetical, but what would you do with that overpriced monitor?
Regarding BF V, the developers themselves have said they still have a lot of work to do on it, mainly optimization.
Just as an example, note how you said "lower resolution, lower details"? The BF V demo ran its RT reflections at full resolution, while the traditional methods ran at half resolution.
Answer: damage control. People and the tech press noted and posted that BF V performance with RT was ABYSMAL, so the obvious answer is the same one we see when people complain about shitty console ports: we get press releases saying "we're working hard on your PC port and its optimization". The next press release contains something along the lines of "we always listen to the community". It's standard PR fare these days. Wake up already...
It's all so obvious; you really have to be oblivious to how these things work not to see it. Nvidia has dug itself a hole and is just waiting for gullible noobs to fall into it. And they're doing it. @notb and yourself are leading the charge, it seems. "But it's faster and you only need a Titan to get 30 fps!" lmao - never mind the fact that what you get in return is downgraded so far that you were better off using traditional render methods. :roll::roll::roll:
home.otoy.com/render/brigade/
If you listen carefully to some of the 40-50 minute presentations by OTOY, they were able to get about 12 megarays or so out of a 2-watt chip, probably on a bigger node than the current 14 nm++. And don't forget the old Larrabee demos with ray-traced Quake, etc. I hope you will do some additional research yourself.
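To put rough numbers on that claim, here is a quick back-of-the-envelope scaling of the figures cited in the post; the 250 W power budget is my own added assumption (roughly a high-end GPU's board power, not something from the post), and the linear scaling is obviously a simplification.

#include <cstdio>

int main()
{
    // Figures cited in the post (approximate).
    const double otoy_megarays = 12.0;  // ~12 megarays/s claimed for the OTOY demo chip
    const double otoy_watts    = 2.0;   // ~2 W power draw

    // Assumed comparison point (not from the post): a ~250 W high-end GPU power budget.
    const double gpu_watts = 250.0;

    const double mrays_per_watt   = otoy_megarays / otoy_watts;     // ~6 Mrays/W
    const double scaled_megarays  = mrays_per_watt * gpu_watts;     // ~1500 Mrays/s if it scaled linearly

    std::printf("OTOY chip: ~%.1f Mrays/W\n", mrays_per_watt);
    std::printf("Naively scaled to %.0f W: ~%.0f Mrays/s\n", gpu_watts, scaled_megarays);
    return 0;
}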
That might be; let's hope they put mirrors, glass, or water everywhere, eh.