Thursday, November 21st 2019

NVIDIA Develops Tile-based Multi-GPU Rendering Technique Called CFR

NVIDIA remains invested in multi-GPU development, specifically SLI over NVLink, and has developed a new multi-GPU rendering technique that appears to be inspired by tile-based rendering. Implemented at the single-GPU level, tile-based rendering has been one of NVIDIA's secret sauces for improving performance since its "Maxwell" family of GPUs. 3DCenter.org discovered that NVIDIA is working on a multi-GPU variant of the approach, called CFR, which could be short for "checkerboard frame rendering" or "checkered frame rendering." The method is already quietly deployed in current NVIDIA drivers, although it is not documented for developers to implement.

In CFR, the frame is divided into tiny square tiles, like a checkerboard. Odd-numbered tiles are rendered by one GPU, and even-numbered ones by the other. Unlike AFR (alternate frame rendering), in which each GPU's dedicated memory holds a copy of all the resources needed to render the frame, methods like CFR and SFR (split frame rendering) optimize resource allocation. CFR also purportedly exhibits less micro-stutter than AFR. 3DCenter also detailed the features and requirements of CFR. To begin with, the method is only compatible with DirectX (including DirectX 12, 11, and 10), not OpenGL or Vulkan. For now it's "Turing" exclusive, since NVLink is required (probably because its bandwidth is needed to virtualize the tile buffer). Tools like NVIDIA Profile Inspector allow you to force CFR on, provided the other hardware and API requirements are met. It still has many compatibility problems, and remains practically undocumented by NVIDIA.
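Since CFR is undocumented, the exact tile size and assignment scheme are unknown. A minimal sketch of the checkerboard assignment the article describes might look like the following, assuming square tiles and a simple row/column-parity split between two GPUs (the function name, tile size, and scheme are illustrative assumptions, not NVIDIA's implementation):

```python
def assign_tiles(frame_width, frame_height, tile_size, num_gpus=2):
    """Map each square tile of a frame to a GPU in a checkerboard pattern.

    Illustrative only: CFR's real tile size and assignment are undocumented.
    """
    # Number of tile columns and rows, rounding up for partial edge tiles.
    cols = (frame_width + tile_size - 1) // tile_size
    rows = (frame_height + tile_size - 1) // tile_size
    assignment = {}
    for row in range(rows):
        for col in range(cols):
            # Alternate GPUs along both axes so each GPU renders
            # interleaved tiles, rather than contiguous halves (SFR)
            # or whole alternating frames (AFR).
            assignment[(row, col)] = (row + col) % num_gpus
    return assignment

# A 4K frame split into hypothetical 256-pixel tiles:
tiles = assign_tiles(3840, 2160, tile_size=256)
# Horizontally and vertically adjacent tiles land on different GPUs.
assert tiles[(0, 0)] != tiles[(0, 1)]
assert tiles[(0, 0)] != tiles[(1, 0)]
```

Because each GPU's tiles are spread evenly across the whole frame, the per-frame workload stays roughly balanced even when scene complexity is concentrated in one region, which would defeat a simple top/bottom SFR split.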
Source: 3DCenter.org

33 Comments on NVIDIA Develops Tile-based Multi-GPU Rendering Technique Called CFR

#26
Unregistered
3rold: Here's a crazy idea: why not work with M$/AMD to optimize DX12/Vulkan? Hell, Vulkan has an open-source SDK; it doesn't even need special cooperation with anyone.
Also, back when DX12 was launched there was a lot of hype about how well it would perform with multi-GPU setups using async technologies (independent chips & manufacturers): wccftech.com/dx12-nvidia-amd-asynchronous-multigpu/
Seems like everyone forgot about it...
The devs did a really nice job with DX12 mGPU in Tomb Raider & Rise of the Tomb Raider (what a beautiful game) - I haven't played the 3rd one yet though. I was quite impressed with how well it ran. Not even a hint of stuttering for me in 4k / 60fps.
#27
eidairaman1
The Exiled Airman
Razrback16: The devs did a really nice job with DX12 mGPU in Tomb Raider & Rise of the Tomb Raider (what a beautiful game) - I haven't played the 3rd one yet though. I was quite impressed with how well it ran. Not even a hint of stuttering for me in 4k / 60fps.
Yup, but it is still up to game devs to support it. Most don't bother.
#28
Unregistered
eidairaman1: Yup, but it is still up to game devs to support it. Most don't bother.
Ya, with the ones who don't bother, I likewise don't bother paying for their games. I enjoy rewarding devs when they do a nice job, though, and not only buy their games, but usually will recommend them and write good reviews for them on Steam & Metacritic to try to help them out.
#29
Anymal
What about RT and this method? SLI for classic rasterization is hit and miss, but maybe it turns out very compelling for RT.
#31
R-T-B
Razrback16: Ya, with the ones who don't bother, I likewise don't bother paying for their games.
As a dev, explicit mGPU is a major PITA to support. Some smaller devs may never be able to fully support it. It's not an answer to just boycott games that don't use it, as that will really limit you to like 10% of games tops for eternity.

Not a fun reality, I realize, but mGPU in DX12 was not an answer, more a cop-out that passed the issue to devs.
#32
Anymal
Btarun, why is the segment about CFR in RT deleted from the original post?
#33
Unregistered
R-T-B: As a dev, explicit mGPU is a major PITA to support. Some smaller devs may never be able to fully support it. It's not an answer to just boycott games that don't use it, as that will really limit you to like 10% of games tops for eternity.

Not a fun reality, I realize, but mGPU in DX12 was not an answer, more a cop-out that passed the issue to devs.
Oh I still play them, don't worry, no missing out here. At the end of the day, IMO at least, it's a feature that is of great help to customers and should be supported on AAA titles. The big companies out there paying their execs millions can afford to implement mGPU support. I'll never forget how well Rise of the Tomb Raider ran and how gorgeous it looked in 4k / 60fps with everything cranked. I was happy to pay for that game and equally glad to write glowing reviews on both Steam & Metacritic for them.