
Ashes of the Singularity DirectX 12 Mixed GPU Performance

Joined
Sep 19, 2015
Messages
21 (0.01/day)
Processor Core I7 6700k
Motherboard ASRock Z170 Extreme4
Memory Corsair 2x4GB DDR4 2800
Video Card(s) MSI Fury X
Storage Samsung EVO 500GB
Power Supply Rosewill 650W 80+ Gold Modular

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.94/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
[hyper ventilation begins, before even reading the article]

Edit: Having read the article, yeah - it's exciting stuff. As mentioned, if we can adjust the rendering split between the cards (60/40, 30/70, etc.) then we're in for a golden era of PC gaming.
Even something as simple as putting AA onto GPU 2 would be enough.

I see this happening more at the engine level (look at how popular the Unity engine has become for indie games), so that a dozen games could suddenly benefit at once.
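For the curious, the building block behind all of this is DX12's explicit multi-adapter mode, where an engine simply enumerates every GPU in the system and drives each one through its own device - vendor doesn't matter. A rough, illustrative sketch of what that enumeration could look like (the helper function name is made up, and this is not how Oxide's Nitrous engine necessarily structures it):

```cpp
// Minimal sketch: enumerate all DXGI adapters and create an independent
// D3D12 device on each one (unlinked explicit multi-adapter). How work is
// split between the devices (60/40, AA on GPU 2, etc.) is entirely up to
// the engine. Error handling trimmed for brevity.
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12Device>> CreateDevicesOnAllAdapters()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory2(0, IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software rasterizer

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device); // AMD, NVIDIA and Intel adapters all land here alike
    }
    return devices;
}
```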
 

hat

Enthusiast
Joined
Nov 20, 2006
Messages
21,745 (3.30/day)
Location
Ohio
System Name Starlifter :: Dragonfly
Processor i7 2600k 4.4GHz :: i5 10400
Motherboard ASUS P8P67 Pro :: ASUS Prime H570-Plus
Cooling Cryorig M9 :: Stock
Memory 4x4GB DDR3 2133 :: 2x8GB DDR4 2400
Video Card(s) PNY GTX1070 :: Integrated UHD 630
Storage Crucial MX500 1TB, 2x1TB Seagate RAID 0 :: Mushkin Enhanced 60GB SSD, 3x4TB Seagate HDD RAID5
Display(s) Onn 165hz 1080p :: Acer 1080p
Case Antec SOHO 1030B :: Old White Full Tower
Audio Device(s) Creative X-Fi Titanium Fatal1ty Pro - Bose Companion 2 Series III :: None
Power Supply FSP Hydro GE 550w :: EVGA Supernova 550
Software Windows 10 Pro - Plex Server on Dragonfly
Benchmark Scores >9000
It seems we might as well follow the same rules as before. A single 980Ti is faster than a 980Ti paired with the R9 380. In fact, a single 980Ti is even faster than the 980Ti paired with a GTX960, so it's not a vendor specific or even architecture specific issue.
 

nem

Joined
Oct 22, 2013
Messages
165 (0.04/day)
Location
Cyberdyne CPU Sky Net
Nvidia's asynchronous compute...
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
It clearly shows the % gains in FPS from DX11 to DX12 for each card.

And to add something some of us discussed when someone asked about buying a new GPU to keep long term: look at the link below and check how much more VRAM DX12 asks for, then rethink the 970 3.5GB vs 390 8GB question...

http://www.overclock3d.net/reviews/..._beta_phase_2_directx_12_performance_review/5

You mean, look at how much VRAM this single beta DirectX 12 engine uses. Making a blanket statement about the entirety of DX12 based on a single beta engine shows astounding ignorance, which you then confirmed with your "OMG GTX 970 HAZ 3.5GB" comment.
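Worth noting: rather than extrapolating from one beta title, Windows 10 lets a program query each adapter's local VRAM budget and current usage directly through DXGI. A minimal sketch, assuming a DXGI 1.4-capable system (the function name is illustrative only):

```cpp
// Rough sketch: report how much local (on-card) memory the OS currently
// budgets for this process on each adapter, and how much is actually used.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

void PrintVramUsage()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory2(0, IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        ComPtr<IDXGIAdapter3> adapter3;
        if (FAILED(adapter.As(&adapter3)))
            continue; // adapter doesn't expose the DXGI 1.4 memory query

        DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
        adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);
        std::printf("Adapter %u: budget %llu MB, current usage %llu MB\n", i,
                    (unsigned long long)(info.Budget >> 20),
                    (unsigned long long)(info.CurrentUsage >> 20));
    }
}
```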
 
Joined
Apr 1, 2014
Messages
503 (0.13/day)
System Name Personal Rig
Processor Intel i5 3570K
Motherboard Asus P8Z77-V
Cooling Noctua NH-U12P Push/Pull
Memory 8GB 1600Mhz Vengeance
Video Card(s) Intel HD4000
Storage Seagate 1TB & 180GB Intel 330
Display(s) AOC I2360P
Case Enermax Vostok
Audio Device(s) Onboard realtek
Power Supply Corsair TX650
Mouse Microsoft OEM 2.0
Keyboard Logitech Internet Pro White
Software Legal ;)
Benchmark Scores Very big
Would this work with something like an HD 520 and a 930M?

Yeah, I know this is a crap laptop solution, but every additional fps matters :D:D:D
 

bug

Joined
May 22, 2015
Messages
13,781 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
It seems we might as well follow the same rules as before. A single 980Ti is faster than a 980Ti paired with the R9 380. In fact, a single 980Ti is even faster than the 980Ti paired with a GTX960, so it's not a vendor specific or even architecture specific issue.

Check out AnandTech. They conclude this title is actually CPU limited and the CPU literally cannot do the work of keeping both GPUs working.

Edit: The title is also in beta, so there's no telling whether the end result will be the same. Maybe there's a lot of debugging code in there, maybe additional optimizations are still to land.
 
Joined
Sep 7, 2011
Messages
2,785 (0.58/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
As W1zzard says though, it's up to the devs to code it to make it work, and how many will bother?
Sweet bugger all. Given the current state of QA testing of games I really can't see many devoting resources to a feature that won't be a primary selling point for games - and you can forget about Nvidia/AMD sponsorship to make it happen. Nvidia is happy with its ecosystem, and the last thing AMD needs is to be seen endorsing Nvidia features like PhysX and the rest of GameWorks. Mixed-GPU looks like one of those promising ideas that will probably be fraught with issues until something else takes its place and it gets buried in a shallow grave - much the same as Lucidlogix's Hydra.
AoS is a known quantity so far, but in the world of consoles I'm not so sure it'll catch on. As for the card differences, we know AMD has the better DX12 async architecture, so the results aren't too surprising. But once again, we still need more DX12 game benchmarks. I have zero interest in AoS as a purchase, so other titles' implementations of DX12 features will have varying results and be of more interest to me. It's very annoying that Deus Ex: Mankind Divided was pushed back, as that was going to incorporate DX12, was it not? Plus it was an AMD-sponsored title, so it would have been a great platform for AMD/RTG to shout about their achievements.
I think AotS is a best-case scenario for AMD and a worst-case scenario for Nvidia. The Nitrous engine was developed on and with GCN and Mantle in mind for Star Swarm, so I would reserve judgement until we see a few more game engines and dev implementations before drawing a conclusion.
Still, by the time DX12 and Vulkan actually mean something for gameplay I'll more than likely be on the next generation of cards from one vendor (or possibly both, if mixed-GPU actually takes off), so wake me when it is actually relevant.
 

hat

Enthusiast
Joined
Nov 20, 2006
Messages
21,745 (3.30/day)
Location
Ohio
System Name Starlifter :: Dragonfly
Processor i7 2600k 4.4GHz :: i5 10400
Motherboard ASUS P8P67 Pro :: ASUS Prime H570-Plus
Cooling Cryorig M9 :: Stock
Memory 4x4GB DDR3 2133 :: 2x8GB DDR4 2400
Video Card(s) PNY GTX1070 :: Integrated UHD 630
Storage Crucial MX500 1TB, 2x1TB Seagate RAID 0 :: Mushkin Enhanced 60GB SSD, 3x4TB Seagate HDD RAID5
Display(s) Onn 165hz 1080p :: Acer 1080p
Case Antec SOHO 1030B :: Old White Full Tower
Audio Device(s) Creative X-Fi Titanium Fatal1ty Pro - Bose Companion 2 Series III :: None
Power Supply FSP Hydro GE 550w :: EVGA Supernova 550
Software Windows 10 Pro - Plex Server on Dragonfly
Benchmark Scores >9000
Check out AnandTech. They conclude this title is actually CPU limited and the CPU literally cannot do the work of keeping both GPUs working.

Edit: The title is also in beta, so there's no telling whether the end result will be the same. Maybe there's a lot of debugging code in there, maybe additional optimizations are still to land.

If that were the case, then why would we see such significant drops, and why would we see improvements with other setups? If it were CPU limited that badly, I'd expect to see minimal changes. The title still being in beta remains a valid point though; I suppose that remains to be seen in the final product.
 

bug

Joined
May 22, 2015
Messages
13,781 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
If that were the case, then why would we see such significant drops, and why would we see improvements with other setups? If it were CPU limited that badly, I'd expect to see minimal changes. The title still being in beta remains a valid point though; I suppose that remains to be seen in the final product.

I haven't had the time to read their review properly (just glanced over while @work), but the explanation is somewhere on this page: http://anandtech.com/show/10067/ashes-of-the-singularity-revisited-beta/4
 
Joined
Jun 14, 2010
Messages
632 (0.12/day)
Location
City 217
Processor AMD Phenom II X4 925
Motherboard Asus M4A78LT-M
Cooling Ice Hammer IH-4***
Memory 2x4GB DDR3 Corsair
Video Card(s) Asus HD7870 2GB
Storage 500GB SATAII Samsung | 500GB SATAII Seagate
Display(s) 23" LG 23EA63V-P
Case Thermaltake V3 Black Edition
Audio Device(s) VIA VT1708S
Power Supply Corsair TX650W
Software Windows 10 x64
Makes you wonder though - will Nvidia cock-block this blending feature? They aren't keen on sharing, especially now that AMD is weak.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.94/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Makes you wonder though - will Nvidia cock-block this blending feature? They aren't keen on sharing, especially now that AMD is weak.


They'll more than likely work on a new DX12 version of SLI, possibly one that makes the game engine think it's a single GPU but allows *them* to customise what parts are rendered on which GPU (forcing AA to GPU 2, PhysX to GPU 3, etc.).
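That "engine thinks it's a single GPU" idea roughly matches what DX12 already exposes as linked-node adapters: an SLI/CrossFire group can appear as one device with multiple nodes, and work can be pinned to a specific physical GPU via node masks. A hedged sketch of the concept (the function name is illustrative only, not a driver feature Nvidia has announced):

```cpp
// Sketch of DX12's linked-node mode: a bridged GPU pair appears as one
// ID3D12Device with N nodes. Work is pinned to a physical GPU by setting
// the NodeMask (bit 0 = GPU 0, bit 1 = GPU 1, ...).
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

ComPtr<ID3D12CommandQueue> CreateQueueOnSecondGpu(ID3D12Device* device)
{
    if (device->GetNodeCount() < 2)
        return nullptr; // no linked second GPU present

    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type     = D3D12_COMMAND_LIST_TYPE_DIRECT;
    desc.NodeMask = 1u << 1;   // execute on node 1, i.e. the second physical GPU

    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue));
    return queue; // command lists submitted here (e.g. an AA pass) run on GPU 2
}
```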
 
Joined
Sep 17, 2014
Messages
22,468 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
AMD's long-term strategic choices are FINALLY coming to fruition with the changes in DX12. Async and the early investments in Mantle may very well earn their money back, and I, along with (I'm sure) many others, have long thought otherwise.

I very, very much like the fact that AMD's cards are now overtaking their Nvidia counterparts. There is finally a performance gap at several price points that Nvidia can no longer 'fix' through GameWorks optimizations and just sending engineers around to devs to 'work on code'. This is exactly the way in which AMD can overtake Nvidia in the long run: not by code-specific adjustments, but by hardware-level tech that is well suited to a new era in gaming. Having Nvidia play catch-up is good, very good for the market, and the fact that an underdog can do this shows how much there is still to win in terms of efficiency, performance and a healthy marketplace.

Go AMD. For the first time in years, you've got me interested beyond a few marketing slides. Put this performance to work in practical solutions and games, and they may very well be back in the game. I really like seeing Fury cards becoming worth the money; before this it was way too easy to think HBM had no real purpose. However, it all depends so much on how well they manage to port this performance boost to games outside Ashes.

@Mussels That seems extremely Nvidia-like for a solution that both keeps their SLI contracts intact and at the same time gives them a feature to 'market'. If they do this, the current 780ti is the last Nvidia card for me ;)
 
Joined
Dec 31, 2009
Messages
19,371 (3.56/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
And when Pascal releases that (supposedly) fixes/improves the async performance.....................

I don't think HBM has much to do with it... it's still not 'needed' except for 4K res and VR... both of which hold a nearly non-existent market share at this time.

I have to say, I think NVIDIA came into the party at the right time with their HBM2 on Pascal.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.94/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
And when pascal releases that (supposedly) fixes/improves the async performance.....................

We get new conspiracy theories, duh.

I'm just glad that my $150 280X, turned into a 290 via warranty, is going to be even more awesome in DX12. I got my money's worth :D
 
Joined
Sep 17, 2014
Messages
22,468 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
And when pascal releases that (supposedly) fixes/improves the async performance.....................

Supposedly. Hopefully. Maybe.

The last Pascal demo I saw was actually running on Maxwell cards. I think Nvidia's PowerPoint slides are miles ahead of reality and they may very well run into some trouble. Will Nvidia 'do an AMD'? I've got popcorn at the ready :)
 
Joined
Jun 14, 2010
Messages
632 (0.12/day)
Location
City 217
Processor AMD Phenom II X4 925
Motherboard Asus M4A78LT-M
Cooling Ice Hammer IH-4***
Memory 2x4GB DDR3 Corsair
Video Card(s) Asus HD7870 2GB
Storage 500GB SATAII Samsung | 500GB SATAII Seagate
Display(s) 23" LG 23EA63V-P
Case Thermaltake V3 Black Edition
Audio Device(s) VIA VT1708S
Power Supply Corsair TX650W
Software Windows 10 x64
they'll more than likely work on a new DX12 version of SLI, possibly that makes the game engine think its a single GPU but allows *them* to customise what parts are rendered on what GPU. (forcing AA to GPU2, physX to GPU3, etc)
Is it not possible to do that now? I thought people were using one weak Nvidia GPU for PhysX and another one for the hard work for some time now.
 
Joined
Sep 17, 2014
Messages
22,468 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Is it not possible to do that now? I thought people were using one weak Nvidia GPU for PhysX and another one for the hard work for some time now.

That is not in SLI, that is just assigning PhysX to whatever you want, like CPU or GPU of choice.

But yes, as far as implementation goes, I would much rather see Nvidia do that for other things like post-processing, AA and whatnot, to make it SLI-independent. I mean, they can already run their AA across SLI and run PhysX on a component of choice; now they just need to marry the two and remove the SLI requirement. Doesn't seem like a stretch to me.
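For what it's worth, DX12's unlinked explicit multi-adapter path already provides the plumbing for that kind of SLI-free hand-off: two completely independent devices can synchronise through a shared fence while one renders the frame and the other handles post-processing/AA. A minimal, illustrative sketch under those assumptions (the helper name is made up; cross-adapter resource copies and error handling are omitted):

```cpp
// Sketch: share a fence between two independent D3D12 devices so a secondary
// GPU (post-processing/AA) can wait for the primary GPU to finish the frame.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

void CreateCrossAdapterFence(ID3D12Device* primary, ID3D12Device* secondary,
                             ComPtr<ID3D12Fence>& fenceOnPrimary,
                             ComPtr<ID3D12Fence>& fenceOnSecondary)
{
    // Create a shareable fence on the primary device...
    primary->CreateFence(0, D3D12_FENCE_FLAG_SHARED | D3D12_FENCE_FLAG_SHARED_CROSS_ADAPTER,
                         IID_PPV_ARGS(&fenceOnPrimary));

    // ...export it as an NT handle and open it on the secondary device.
    HANDLE handle = nullptr;
    primary->CreateSharedHandle(fenceOnPrimary.Get(), nullptr, GENERIC_ALL, nullptr, &handle);
    secondary->OpenSharedHandle(handle, IID_PPV_ARGS(&fenceOnSecondary));
    CloseHandle(handle);

    // Usage per frame:
    //   primaryQueue->Signal(fenceOnPrimary.Get(), frameIndex);
    //   secondaryQueue->Wait(fenceOnSecondary.Get(), frameIndex);
}
```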
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.94/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Sasmsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Is it not possible to do that now? I thought people were using one weak Nvidia GPU for PhysX and another one for the hard work for some time now.

To a super limited extent where the PhysX GPU can't be used for any other task, yes. I was suggesting they may expand on that model, since it 'works' for them already.
 
Joined
Apr 8, 2009
Messages
3,016 (0.53/day)
Location
vermont
System Name The wifes worst enemy
Processor i5-9600k
Motherboard Asrock z390 phantom gaming 4
Cooling water
Memory 16gb G.skill ripjaw DDR4 2400 4X4GB 15-15-15-35-2T
Video Card(s) Asrock 5600xt phantom gaming 6gb 14gb/s
Storage crucial M500 120GB SSD, Pny 256GB SSD, seagate 750GB, seagate 2TB HDD, WD blue 1TB 2.5" HDD
Display(s) 27 inch samsung @ 1080p but capable of much more ;)
Case Corsair AIR 540 Cube Mid tower
Audio Device(s) onboard
Power Supply EVGA GQ1000W MODULAR
Mouse generic for now
Keyboard generic for now
Software gotta love steam, origin etc etc
Benchmark Scores http://hwbot.org/user/philbrown_23/
Whoa, whoa, wait just a minute. Does this mean I can run my GTX 560 4GB and my GTX 760 4GB together? And ATI and NVIDIA cards, wtf? Wow, things have come a long way in my two-year absence.
 
Joined
Apr 30, 2011
Messages
2,703 (0.54/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
You mean, look at how much VRAM this single beta DirectX 12 engine uses. Making a blanket statement about the entirety of DX12 based on a single beta engine shows astounding ignorance, which you then confirmed with your "OMG GTX 970 HAZ 3.5GB" comment.
So the ignorant one is the person who posts facts, and not the one who makes predictions opposing the facts, which clearly show the tendency of DX12 features to increase VRAM demand? OK, keep trolling instead of finding proof, as it doesn't exist (at least for now)...
 
Joined
Jan 27, 2015
Messages
1,065 (0.30/day)
System Name loon v4.0
Processor i7-11700K
Motherboard asus Z590TUF+wifi
Cooling Custom Loop
Memory ballistix 3600 cl16
Video Card(s) eVga 3060 xc
Storage WD sn570 1tb(nvme) SanDisk ultra 2tb(sata)
Display(s) cheap 1080&4K 60hz
Case Roswell Stryker
Power Supply eVGA supernova 750 G6
Mouse eats cheese
Keyboard warrior!
Benchmark Scores https://www.3dmark.com/spy/21765182 https://www.3dmark.com/pr/1114767
And when Pascal releases that (supposedly) fixes/improves the async performance....................

Not too sure about that; I didn't hear a word about Pascal's async until the AotS benches hit the internet and exposed NV's weakness.
 

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
13,058 (2.39/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsumg 960 Pro M.2 512Gb
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure POwer M12 850w Gold (ATX3.0)
Software W10
Not too sure about that; I didn't hear a word about Pascal's async until the AotS benches hit the internet and exposed NV's weakness.

More specifically, NV haven't revealed the full architecture of Pascal as it's under NDA - same as Arctic Islands. The assumption is that, knowing the move to DX12 would bring a more basic API, and having seen AMD utilise Mantle to reasonable effect, NV aren't exactly going to have rested on their laurels. With Maxwell, the drive was clearly to pare back CUDA compute (which is great at parallelism) in order to focus on power efficiency and faster clocks. That gave the 980 Ti enormous headway in DX11, which is still the current and ruling API. Set against Nvidia's DX11-focused Maxwell, banking on GCN's DX12 advantage hasn't been a fantastic move by AMD so far: the latest figures show that despite Fiji parts being readily available, they are not selling as well as Maxwell parts.

http://hexus.net/business/news/comp...t-share-expected-hit-new-low-current-quarter/

I have no idea how Pascal will fare against Polaris. Perhaps Polaris (or whatever the arch is called) will have enough tweaks to finally and resoundingly become the gold standard in gfx architecture. Maybe Pascal will be a Maxwell Volta bastard son and simply hold on till Volta arrives proper?

What is for sure is that this single DX12 bench isn't any revelation. If async isn't a dev's priority (for whatever reason), then GCN loses its edge. If Nvidia buy into some AAA titles before Pascal is out (with its assumed parallelism), they'll be sure to 'persuade' devs to put less focus on async.

Roll on Summer - another round of gfx wars :cool:
 
Joined
Sep 7, 2011
Messages
2,785 (0.58/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
What is for sure is that this single DX12 bench isn't any revelation. If async isn't a dev's priority (for whatever reason), then GCN loses its edge.
DX12 redux:
AotS pre-alpha gets released, AMD rulez, AMD predicted to rule the known world even though the engine is tailored to GCN and isn't slated for wide uptake by game studios
Fable Legends (using the much more widely used UE4 engine) beta DX12 benchmark arrives and adds some perspective to the subject
A few months on, AotS does the rounds again and the AMD cheerleaders and doomsayers are back doing their async/web version of the Nuremberg Rally.

It's almost as though there is an epidemic of attention deficit disorder.

If Nvidia buy into some AAA titles before Pascal is out (with assumed parallelism) they'll be sure to 'persuade' lower focus on Async.
If it is seen as a weak area for the architecture then most assuredly. If those AAA titles are built on the UE4 engine, it would almost be a certainty that they would do exactly as AMD/Oxide have done with AotS, and make the game settings at the highest level (the marketing/tech-site bench level) heavy with DX12 transparency and custom blending features, since AMD's current architectures require software emulation for concerted use of conservative rasterization/ROVs.
I have no idea how Pascal will fare against Polaris. Perhaps Polaris (or whatever the arch is called) will have enough tweaks to finally and resoundingly become the gold standard in gfx architecture.
You aren't the only one with no idea - you can count virtually everyone else in on that particular list. History says that both Nvidia and AMD/ATI have had comparable performance down their product stacks for the best part of twenty years. With the exception of a particularly well executed G80 and a not particularly well executed R600 at the dawn of the unified shader architecture era, it has been largely give and take depending on IHV feature emphasis, even when the companies have used different manufacturing partners (such as ATI using TSMC's 130nm low-k process and Nvidia using IBM's 130nm FSG (fluorosilicate glass) process). I really don't see that trend changing in the space of a single GPU generation. TSMC is already shipping commercial 16nm FF+ products, and Samsung/GloFo are ramping 14nm LPP, so aside from wafer start availability, the manufacturing side of the equation shouldn't be an issue either.
 
Joined
Jan 28, 2012
Messages
468 (0.10/day)
Location
Lithuania
Processor Intel Core i5 4670K @ 4.8 GHz
Motherboard AsRock Z87 Extreme 4
Cooling Lepa NeoIllusion RGB CPU cooler
Memory 2*4GB Patriot G2 Series RAM
Video Card(s) MSI Radeon R9 380 4GB
Storage Transcend SSD 740 256GB + WD Caviar Blue 1TB
Display(s) Samsung SA 300 24" Full HD
Case NZXT Phantom 530 + Bitfenix Recon fan controller
Audio Device(s) Creative SB0770 X-Fi Xtreme Gamer
Power Supply PC Power and Cooling Silencer MkIII 750W 80+ Gold
Mouse Logitech G502
Keyboard Steelseries Apex RAW
Benchmark Scores IT WORKS
If I understood correctly, that would be available in all DX12 games? So this technology would allow using two Nvidia graphics cards on SLI-uncertified motherboards without DifferentSLI/HyperSLI mods (which don't give a 100% guarantee that everything works fine)?
 