
NVIDIA Develops Tile-based Multi-GPU Rendering Technique Called CFR

Joined
Nov 4, 2005
Messages
11,982 (1.72/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400 MHz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
AA would probably be one of the post-processing passes done at the end of rendering a frame.
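In a checkerboard/tile split like CFR, that would mean each GPU renders only the tiles it owns, the tiles get composited into one full-resolution frame, and only then does a post-process AA pass run on the complete image (it samples neighbouring pixels that may have been rendered by the other GPU). A minimal sketch of that ordering, where every helper is a hypothetical stand-in rather than anything NVIDIA has described:

#include <cstdio>

// Hypothetical stand-ins for the real rendering work.
struct Tile { int x, y; };

static void render_tile_on_gpu(int gpu, const Tile& t) {
    std::printf("GPU %d renders tile (%d,%d)\n", gpu, t.x, t.y);
}
static void composite_tiles_into_frame() {
    std::printf("Composite all tiles into one full-resolution frame\n");
}
static void post_process_aa_full_frame() {
    // AA needs the whole image (including pixels from the other GPU's tiles),
    // so it runs after compositing rather than per tile on each GPU.
    std::printf("Run post-process AA on the composited frame\n");
}

int main() {
    const int gpu_count = 2, tiles_x = 4, tiles_y = 4;
    for (int y = 0; y < tiles_y; ++y)
        for (int x = 0; x < tiles_x; ++x) {
            const Tile t{x, y};
            render_tile_on_gpu((x + y) % gpu_count, t);  // checkerboard ownership
        }
    composite_tiles_into_frame();
    post_process_aa_full_frame();  // AA last, as a post-process
}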

You can't get away with shared memory like that. You are still going to need a sizable part of the assets accessible by both/all GPUs. Any memory far away from the GPU is evil, and even a fast interconnect like NVLink won't replace local memory. GPUs are very bandwidth-constrained, so sharing memory access through something like Zen 2's IO die is not likely to work on GPUs at this time. With a big HBM cache for each GPU, maybe, but that is effectively still each GPU having its own VRAM :)
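Some rough numbers make that point concrete. Treat these bandwidth figures as ballpark assumptions only (roughly a Turing-class card's GDDR6, an NVLink bridge, and PCIe 3.0 x16):

#include <cstdio>

int main() {
    // Assumed, ballpark bandwidths in GB/s: local GDDR6 on a 2080 Ti-class
    // card, a Turing NVLink bridge (per direction), and PCIe 3.0 x16.
    const double local_vram = 616.0;
    const double nvlink     = 50.0;
    const double pcie3_x16  = 16.0;

    const double asset_gb = 0.25;  // a 256 MB chunk of textures/geometry

    std::printf("local VRAM : %5.2f ms\n", asset_gb / local_vram * 1000.0);
    std::printf("NVLink     : %5.2f ms\n", asset_gb / nvlink     * 1000.0);
    std::printf("PCIe 3 x16 : %5.2f ms\n", asset_gb / pcie3_x16  * 1000.0);
    // Roughly 0.4 ms vs 5 ms vs 16 ms for the same data. Anything a GPU has
    // to pull from the other card's memory every frame eats a big slice of a
    // 16.7 ms (60 fps) frame budget, which is the point being made above.
}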

Chiplet design has been the end goal for a while, and all the GPU makers have been trying their hand at it. So far, unsuccessfully. As @Apocalypsee already noted, even tiled distribution of work is not new.
I know NVLink won't replace memory; it's merely the protocol for inter-die communication.


I am saying the IO die could handle memory interleaving between two sets of 6 GB of VRAM and assign shared and dedicated memory and resources. It's the same sort of memory management already in use, but with the ability to share resources across multiple dies, which would also make them a good shared workstation card and allow hardware management of user and resource allocation.
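A toy sketch of the kind of mapping an IO die could do, purely hypothetical and not based on any announced product: physical addresses are striped across the two 6 GB pools at a fixed granularity, with the top of the combined space reserved as a shared-asset region:

#include <cstdint>
#include <cstdio>

// Hypothetical IO-die style address map: two 6 GB pools, striped at 4 KB
// granularity, with the top 1 GB of the combined space treated as shared.
constexpr uint64_t POOL_SIZE   = 6ull << 30;               // 6 GB per pool
constexpr uint64_t TOTAL_SIZE  = 2 * POOL_SIZE;            // 12 GB combined
constexpr uint64_t STRIPE      = 4096;                     // interleave granularity
constexpr uint64_t SHARED_BASE = TOTAL_SIZE - (1ull << 30);

struct Mapping { int pool; uint64_t offset; bool shared; };

Mapping map_address(uint64_t addr) {
    const uint64_t stripe_idx = addr / STRIPE;
    const int      pool       = static_cast<int>(stripe_idx & 1);  // alternate pools
    const uint64_t offset     = (stripe_idx / 2) * STRIPE + addr % STRIPE;
    return { pool, offset, addr >= SHARED_BASE };
}

int main() {
    const uint64_t test[] = { 0, 4096, 8192, SHARED_BASE + 512 };
    for (uint64_t addr : test) {
        const Mapping m = map_address(addr);
        std::printf("addr %12llu -> pool %d, offset %12llu, %s\n",
                    (unsigned long long)addr, m.pool,
                    (unsigned long long)m.offset,
                    m.shared ? "shared" : "dedicated");
    }
}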
 

Deleted member 177333

Guest
Here's a crazy idea: why not work with M$/AMD to optimize DX12/Vulkan? Hell, Vulkan has an open-source SDK; it doesn't even need special cooperation with anyone.
Also, back when DX12 launched there was a lot of hype about how well it would perform with multi-GPU setups using async technologies (independent chips & manufacturers): https://wccftech.com/dx12-nvidia-amd-asynchronous-multigpu/
Seems like everyone forgot about it...

The devs did a really nice job with DX12 mGPU in Tomb Raider & Rise of the Tomb Raider (what a beautiful game) - I haven't played the third one yet, though. I was quite impressed with how well it ran. Not even a hint of stuttering for me at 4K / 60 FPS.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
42,115 (6.63/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
The devs did a really nice job with DX12 mGPU in Tomb Raider & Rise of the Tomb Raider (what a beautiful game) - I haven't played the third one yet, though. I was quite impressed with how well it ran. Not even a hint of stuttering for me at 4K / 60 FPS.

Yup, but it is still up to game devs to support it. Most don't bother.
 

Deleted member 177333

Guest
Yup, but it is still up to game devs to support it. Most don't bother.

Ya, with the ones who don't bother, I likewise don't bother paying for their games. I enjoy rewarding devs when they do a nice job, though, and I not only buy their games but usually recommend them and write good reviews on Steam & Metacritic to try to help them out.
 
Joined
Jul 10, 2015
Messages
754 (0.22/day)
Location
Sokovia
System Name Alienation from family
Processor i7 7700k
Motherboard Hero VIII
Cooling Macho revB
Memory 16gb Hyperx
Video Card(s) Asus 1080ti Strix OC
Storage 960evo 500gb
Display(s) AOC 4k
Case Define R2 XL
Power Supply Be f*ing Quiet 600W M Gold
Mouse NoName
Keyboard NoNameless HP
Software You have nothing on me
Benchmark Scores Personal record 100m sprint: 60m
What about RT and this method? SLI for classic rasterization is hit and miss, but maybe it turns out very compelling for RT.
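Screen-space splits do look friendlier for RT on paper: primary rays only depend on the pixels of the tile they start from, so a checkerboard split divides the dispatch work almost evenly without app-side tuning (the catch is that each GPU still needs the full scene/BVH resident locally, which circles back to the memory discussion above). A toy calculation, with the tile size and checkerboard pattern made up for the example:

#include <cstdio>

int main() {
    // Toy example: 3840x2160 frame, hypothetical 64x64 checkerboard tiles,
    // two GPUs, one primary ray per pixel.
    const int width = 3840, height = 2160, tile = 64, gpus = 2;
    const int tiles_x = (width  + tile - 1) / tile;
    const int tiles_y = (height + tile - 1) / tile;   // last row is partial

    long long rays[2] = {0, 0};
    for (int ty = 0; ty < tiles_y; ++ty)
        for (int tx = 0; tx < tiles_x; ++tx) {
            const int owner = (tx + ty) % gpus;       // checkerboard ownership
            const int w = (tx == tiles_x - 1) ? width  - tx * tile : tile;
            const int h = (ty == tiles_y - 1) ? height - ty * tile : tile;
            rays[owner] += (long long)w * h;          // primary rays in this tile
        }

    std::printf("GPU0 primary rays: %lld\nGPU1 primary rays: %lld\n",
                rays[0], rays[1]);
    // The split lands at essentially 50/50 with no per-title tuning, which is
    // why a tile/checkerboard scheme might suit RT better than AFR-style SLI,
    // though secondary rays and the BVH still have to live on both cards.
}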
 
Joined
Jan 27, 2015
Messages
1,065 (0.30/day)
System Name loon v4.0
Processor i7-11700K
Motherboard asus Z590TUF+wifi
Cooling Custom Loop
Memory ballistix 3600 cl16
Video Card(s) eVga 3060 xc
Storage WD sn570 1tb(nvme) SanDisk ultra 2tb(sata)
Display(s) cheap 1080&4K 60hz
Case Roswell Stryker
Power Supply eVGA supernova 750 G6
Mouse eats cheese
Keyboard warrior!
Benchmark Scores https://www.3dmark.com/spy/21765182 https://www.3dmark.com/pr/1114767
Joined
Aug 20, 2007
Messages
21,453 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 905p Optane 960GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
Ya, with the ones who don't bother, I likewise don't bother paying for their games.

As a dev, explicit mGPU is a major PITA to support. Some smaller devs may never be able to fully support it. It's not an answer to just boycott games that don't use it, as that will really limit you to like 10% of games, tops, for eternity.

Not a fun reality, I realize, but mGPU in DX12 was not an answer, more of a cop-out that passed the issue on to devs.
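For what it's worth, the "explicit" part is visible right at the API level: even in the simpler linked-node case (the SLI/NVLink-style setup; mixed-vendor multi-adapter uses separate devices and is more work still), the app sees one device with multiple nodes and has to create queues, heaps and resources per node itself. A trimmed sketch, not production code, with error handling omitted:

// Windows only; link against d3d12.lib. Error handling omitted for brevity.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    // With linked adapters (SLI/NVLink) this reports more than one node;
    // on a single card it is just 1.
    const UINT nodes = device->GetNodeCount();
    std::printf("GPU nodes on this device: %u\n", nodes);

    // Nothing is distributed automatically: the app creates queues (and
    // resources, heaps, command lists...) per node via NodeMask and decides
    // itself what each GPU renders, which is why the workload lands on the
    // game developer for every title.
    for (UINT node = 0; node < nodes; ++node) {
        D3D12_COMMAND_QUEUE_DESC desc = {};
        desc.Type     = D3D12_COMMAND_LIST_TYPE_DIRECT;
        desc.Flags    = D3D12_COMMAND_QUEUE_FLAG_NONE;
        desc.NodeMask = 1u << node;                 // target one specific GPU
        ComPtr<ID3D12CommandQueue> queue;
        device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue));
    }
}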
 
Joined
Jul 10, 2015
Messages
754 (0.22/day)
Location
Sokovia
System Name Alienation from family
Processor i7 7700k
Motherboard Hero VIII
Cooling Macho revB
Memory 16gb Hyperx
Video Card(s) Asus 1080ti Strix OC
Storage 960evo 500gb
Display(s) AOC 4k
Case Define R2 XL
Power Supply Be f*ing Quiet 600W M Gold
Mouse NoName
Keyboard NoNameless HP
Software You have nothing on me
Benchmark Scores Personal record 100m sprint: 60m
Btarun, why is the segment about CFR in RT deleted from the original post?
 

Deleted member 177333

Guest
As a dev, explicit mGPU is a major PITA to support. Some smaller devs may never be able to fully support it. It's not an answer to just boycott games that don't use it, as that will really limit you to like 10% of games, tops, for eternity.

Not a fun reality, I realize, but mGPU in DX12 was not an answer, more of a cop-out that passed the issue on to devs.

Oh, I still play them, don't worry, no missing out here. At the end of the day, IMO at least, it's a feature that is of great help to customers and should be supported in AAA titles. The big companies out there paying their execs millions can afford to implement mGPU support. I'll never forget how well Rise of the Tomb Raider ran and how gorgeous it looked at 4K / 60 FPS with everything cranked. I was happy to pay for that game and equally glad to write glowing reviews on both Steam & Metacritic for them.
 