
Microsoft Refines DirectX 12 Multi-GPU with Simple Abstraction Layer

btarunr
Editor & Senior Moderator
Staff member
Microsoft is sparing no effort in promoting DirectX 12 native multi-GPU as the go-to multi-GPU solution for game developers, obsoleting proprietary technologies such as SLI and CrossFire. The company recently announced that it is making it easier for game developers to code their games to take advantage of multiple GPUs, with far less coding than is required today. This involves a new hardware abstraction layer that simplifies the process of pooling multiple GPUs in a system, letting developers bypass the Explicit Multi-Adapter (EMA) mode of graphics cards.

This is the first major step by Microsoft since its announcement that DirectX 12, in theory, supports true mixed multi-adapter configurations. The company stated that it will release the new abstraction layer as part of a comprehensive framework on its GitHub repository, along with two sample projects: one that takes advantage of the new multi-GPU tech, and one that does not. Exposure to this code should significantly flatten developers' learning curve and give them a template for implementing multi-GPU in their DirectX 12 projects with minimal effort. With this, Microsoft is supporting game developers in implementing API-native multi-GPU, even as GPU manufacturers have stated that while their GPUs will support EMA, the onus will be on game developers to keep their games optimized.
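For context, the simplest multi-GPU distribution scheme such an abstraction layer could automate is alternate-frame rendering (AFR), where successive frames alternate between cards. The sketch below is a conceptual model only; the struct and function names are hypothetical and none of them come from the actual DirectX 12 API (real EMA code would create an ID3D12Device per enumerated IDXGIAdapter).

```cpp
#include <cstddef>
#include <string>
#include <vector>

// Hypothetical model of an adapter pool. In real DirectX 12 explicit
// multi-adapter code, each entry would wrap a device created from a
// distinct hardware adapter; here it is just a name and a counter.
struct Adapter {
    std::string name;
    std::size_t framesRendered = 0;
};

// Alternate-frame rendering: frame N is assigned to adapter N % count.
// This round-robin policy is roughly what SLI/CrossFire profiles did
// behind the driver, and what an abstraction layer could automate.
std::size_t AssignFrame(std::vector<Adapter>& pool, std::size_t frameIndex) {
    std::size_t idx = frameIndex % pool.size();
    pool[idx].framesRendered++;
    return idx;
}
```

With two adapters, even-numbered frames land on the first card and odd-numbered frames on the second, which is why AFR scaling is so sensitive to frame pacing between the cards.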

 
Does this mean multicard GPU configurations will now become something ordinary as games will support it natively without the stupid profiles and also with better scaling? Because that would be nice and maybe the first time I'll have 2 cards in my system because of it...
 
The answer is in the article already... native multi-GPU support will be offered. All of this is nice, but it depends on the game developers whether they implement the API or not...
 
This could explain why AMD is betting on multi-GPU at a time when people are abandoning it, and why NVIDIA is minimizing its efforts to support more than two GPUs in SLI. They both know what Microsoft is doing with DirectX 12.

Also, we can thank Vulkan, or Microsoft wouldn't care this much about bringing advantages to DirectX 12. Working on making multi-GPU easier in DirectX 12 can help convince developers not to ignore this feature.
 
Does this mean multicard GPU configurations will now become something ordinary as games will support it natively without the stupid profiles and also with better scaling? Because that would be nice and maybe the first time I'll have 2 cards in my system because of it...

IKR.. It only took them what, ten years? lol.. When did SLI start? I think it was with the 8-series GPUs..
I had two of them, 8800's. That's like ten years ago.. lol..
I've always had problems with SLI, and I've always had an SLI system since the 8800 series:
micro stutters, crashes, incompatibility with games, artifacts, weird shadows, not working right with 3D Vision... the list goes on and on.
But finally a good reason to go SLI again.
 
Wow youngster, you have much to learn.

NVIDIA took over 3dfx in late 2000, which gave them loads of patents and 3dfx's Scan-Line Interleave technology.

Please don't call me youngster.. lol. I'm 39 and I take offense at condescending tones.
I said it's been like 10 years, and it's 2016.. Math time! Ten years ago it was 2006..
If it came out in 2004, which it did, I was off by two years.. Sue me..
It started with the 6 series, not the 8 series..

https://en.wikipedia.org/wiki/Scalable_Link_Interface

Also...

3dfx SLI is not NVIDIA SLI, which usually works via AFR... completely different things, youngster. They're even called different things: for 3dfx it's Scan-Line Interleave; for NVIDIA it's Scalable Link Interface.
 
Actually, it was 3dfx that invented pairing multiple graphics cards. But they were so far ahead of their time that they had to hack things together with bulky external cables and external power supplies, because internal slots didn't provide enough power and there was no standardized external power like we have now with 6-pin/8-pin connectors. They were also the first company to use multiple GPUs on a single PCB; their entire ecosystem consisted of stacked VSA-100 GPUs. Did I mention they also basically invented Mantle/DX12 as well? Glide was what Mantle and DX12 became later.

Sometimes I wish 3dfx hadn't gone under and we'd have epic battles between three big graphics vendors. I mean, just imagine what 3dfx could do with DX12 that they couldn't back in their day. They were over 15 years ahead of their time, and the only thing stopping them was the technology itself; it just wasn't ready for their radical ideas.
 

Aye...the things we miss out on because of bad management.
 
I tried digging for more information on this and there's literally nothing beyond what is linked (which is barely more than nothing). It's all very "coming soon."


In before UWP only.
 
I still need to see this in action more than anything. I just have this thought in the back of my head that this is not going to work as well as Microsoft says it will (and that's not even mentioning that game devs still have to support and implement it).
 
Young one.
SLI was first introduced in 1998 with 3dfx's Voodoo2 card. Even back then, SLI over the Glide API offered beautiful scaling, providing almost a 90% performance increase over a single card.
And yes, NVIDIA "bought" (stole) the SLI tech from 3dfx (yes, it's almost the same tech; the acronyms just mean different things).

Don't forget, Voodoo2 was limited to 800x600 resolution. You could only get 1024x768 with SLI.
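That resolution math falls out of how Scan-Line Interleave worked: each card rendered every other scanline of the frame, halving the fill-rate load per card. A toy C++ sketch of the idea (illustrative only; nothing here is real 3dfx driver code):

```cpp
#include <cstddef>

// Scan-Line Interleave in miniature: with two cards, card 0 draws the
// even scanlines and card 1 the odd ones. Each card only fills half
// the lines, which is how two 800x600-class cards could jointly
// produce a 1024x768 output.
std::size_t CardForScanline(std::size_t line, std::size_t cardCount) {
    return line % cardCount;
}
```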
 
Those were the good ol' times of PC gaming!
I remember seeing Quake 2 at 800x600 with the miniGL API, after playing it for a couple of hours at 320x240 in software mode... MIND BLOWN!!
 
Come to think of it my first GPU was Creative RIVA TNT2. It offered great performance for its time. I wonder why Creative stopped their RIVA line.
 
Wish people would ONE DAY understand such a simple TRICK from a company like "M'soft".
Read carefully: "it will let developers bypass the Explicit Multi-Adapter (EMA) mode of graphics cards".
This means that once a profile is made under this, it will only work under DX12.
"So-called b**tard M'soft wants people to rely on their OS forever."

They can never do something just for the sake of development.
 
Come to think of it my first GPU was Creative RIVA TNT2. It offered great performance for its time. I wonder why Creative stopped their RIVA line.
Hey, that was my first GPU too!!! :D
On topic: at last, I can't wait for DX12 to take effect..
 
Riva was the model name of the GPU itself, established by NVIDIA. :)
 
Yup, NVIDIA Riva. I think I still have one floating around here. To be honest, they weren't very good. Then again, they all kind of sucked at the time. It is a DirectX 6 card; things started getting exciting at DirectX 7 (Radeon and GeForce debuted).
 
indeed! i will never forget the first time i tried a GeForce3 Ti200!!
 
Thing is, most of us already have multiple GPUs in the system (all Intels and all AMD APUs + discrete)... and some also have extra GPUs in storage (maybe an old unsold card, etc.).

If they could make this work transparently and smoothly, distribute the load according to each GPU's power, and on top of that keep 99th-percentile frame times low, then that would be awesome... however, I have the feeling it is still too early and too complex as of now.

It's one thing to make it work in a game as a proof of concept; it's another to provide a good experience and scalability for most games.
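Distributing the load according to each GPU's power boils down to a weighted split of the frame. A minimal C++ sketch, assuming the relative performance weights are already known (in practice, measuring them and keeping frame pacing smooth is the hard part); the function name is hypothetical:

```cpp
#include <cstddef>
#include <vector>

// Split `totalRows` of a frame among GPUs in proportion to their
// relative performance weights (e.g. an iGPU at 1.0, a discrete card
// at 4.0). Assumes at least one GPU; leftover rows from integer
// rounding are absorbed by the last GPU.
std::vector<std::size_t> SplitRowsByWeight(std::size_t totalRows,
                                           const std::vector<double>& weights) {
    double totalWeight = 0.0;
    for (double w : weights) totalWeight += w;

    std::vector<std::size_t> rows(weights.size(), 0);
    std::size_t assigned = 0;
    for (std::size_t i = 0; i + 1 < weights.size(); ++i) {
        rows[i] = static_cast<std::size_t>(totalRows * weights[i] / totalWeight);
        assigned += rows[i];
    }
    rows.back() = totalRows - assigned;  // remainder absorbs rounding error
    return rows;
}
```

So a 1080p frame shared between an iGPU and a discrete card rated 1:4 would hand roughly a fifth of the scanlines to the iGPU; a static split like this is exactly what breaks down when per-frame load varies, hence the 99th-percentile frame-time worry.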

indeed! i will never forget the first time i tried a GeForce3 Ti200!!

I also had one of those. I was a student back then and bought it second hand, and OMG, it was flying in Quake 3 :)
 
Yup, NVIDIA Riva. I think I still have one floating around here. To be honest, they weren't very good. Then again, they all kind of sucked at the time. It is a DirectX 6 card; things started getting exciting at DirectX 7 (Radeon and GeForce debuted).

TNT and TNT2 competed favorably with 3dfx's Voodoo line of cards. They were the jump start to the NVIDIA juggernaut.
 
I also had one of those, I was a student back then and bought it second hand and OMG it was flying in Quake 3 :)

A friend lent it to me to try. I was in high school then, so no money at all.. I tried it with Legacy of Kain and of course Q3.. :D Good times.. It felt like a leap compared to the Riva..
 
I'm a game developer. These essentially sound like templates to show how to use multi-GPU in DX12, which is normally a complicated process. I'm appreciative, and were I to make a game capable of utilizing a lot of horsepower I'd definitely be encouraged to take a look.
 
3dfx SLI is not NVIDIA SLI, which usually works via AFR... completely different things, youngster. They're even called different things: for 3dfx it's Scan-Line Interleave; for NVIDIA it's Scalable Link Interface.
Young one.
SLI was first introduced in 1998 with 3dfx's Voodoo2 card. Even back then, SLI over the Glide API offered beautiful scaling, providing almost a 90% performance increase over a single card.
And yes, NVIDIA "bought" (stole) the SLI tech from 3dfx (yes, it's almost the same tech; the acronyms just mean different things).

Wth is with you people and the condescending tones? Don't call me "young one", "kid", "youngster", or anything else condescending, please..
I just asked the guy before you not to call me "youngster", so I know you're trolling..
I'm a 39-year-old building inspector, not a "young one"....
We already clarified the points you're making, by the way: NVIDIA's version of SLI started with the 6 series in 2004. Bought, stolen, whatever, jitterbug.
 
You have my apologies. No insult intended. Just teasing you a little. :)
 