Friday, February 24th 2017

AMD's X370 Only Chipset to Support NVIDIA's SLI

Only AMD's top-of-the-line X370 chipset will support NVIDIA's competing SLI technology. AMD's next-in-line B350 eschews SLI support but retains CrossFire compatibility, while the low-end A320 chipset will offer no support for any such multi-GPU technologies. While this may seem like a move by AMD to purposely gimp NVIDIA products on its platforms, it stands to reason that even enthusiasts tend to stay away from multi-GPU solutions and their associated problems. Besides, AMD will surely want to avoid handing NVIDIA any more funds than the company already has by paying the "SLI tax" on every chipset it ships. By limiting SLI support to its highest-end chipset, AMD shaves some licensing expenses, whilst keeping SLI support for those who are, in truth, more likely to use it: power users, who will certainly spare no expense in springing for an X370-based platform.

As of now, some details remain unclear in the overall feature-set and compatibility differences between AMD's upcoming AM4 chipsets, but it would seem that only AMD's X370 chipset manages to leverage the full 20 PCIe lanes (18 if you run two SATA connections) delivered by AMD's Ryzen CPUs. This looks like a way for AMD to impose a "motherboard tax" on users, limiting the number of PCIe lanes available on lower-end motherboards and thus nudging buyers to take the next step up to X370. Apparently, PCIe lanes are not a differentiating factor between AMD chipsets (with X370, B350 and A320 all offering 4 native lanes), only their ability to access (or not) Ryzen's own 20.

It shouldn't be long before all of this is adequately cleared up, though.
Source: Computerbase.de

69 Comments on AMD's X370 Only Chipset to Support NVIDIA's SLI

#26
Brusfantomet
SLI being capped at the top end is not that surprising,

BUT that double asterisk, overclocking on X300, B300 and A300, THAT is serious news for the budget-minded out there. The R5 1300 is a 4-core, 8-thread Ryzen CPU at 175 USD, and all Ryzen chips are unlocked; that could be a budget user's dream. If the OC headroom on Ryzen is good, this could herald a 175 USD AMD part going toe to toe with the i7-7700K.
#27
Camm
alucasaNot worried about sli since I don't use them. But overall port # is a little worrying. Having only 2 USB3 unless it's top of line?

I generally don't buy top of line mobo for builds.
There are 6: 4 from the CPU, 2 from the chipset.
#28
Steevo
The Witcher 3 shows the 980 Ti with a loss of 9 FPS. The 480 was released after the 1070, so the loss in performance came after the 1070 had become the performance/price leader for NVIDIA, a spot the 980 Ti used to fill. The GTA V performance is due to Rockstar adding optimizations more than anything.

#29
cdawall
where the hell are my stars
SteevoThe Witcher 3 shows the 980Ti with a loss of 9FPS, the 480 was released after the 1070 so the loss in performance came after the 1070 was the performance/price leader for Nvida, a spot the 980Ti used to fill. The GTA5 performance is due to Rockstar adding optimizations more than anything.

Which individual driver? I am basing this off of the 376.33 WHQL-based review w1z dropped Feb 13, 2017, for comparison.

This specific review

The same numbers are shown in this review (same driver): mostly positive, with the occasional drop in a game, but game engines update, so the same truth for GTA V exists in both directions depending on the update (Fallout 4 with the texture pack, for example).

375.70 appears to be mostly positive as well? link

Here, this is as far back as I can find with matching games: 365, 375, 376. 375 to 376 is no change; 365 to the later drivers is positive. So again, where does the FUD that drivers hurt NVIDIA performance on earlier cards come from?

#30
G33k2Fr34k
Only those low IQ gaimers spend hundreds of dollars on graphics cards to play these modern shitty games at "high" resolutions and graphics settings. The vast majority of people are casual gamers who play competitive games like MOBAs or CS:GO.
#31
cdawall
where the hell are my stars
G33k2Fr34kOnly those low IQ gaimers spend hundreds of dollars on graphics cards to play these modern shitty games at "high" resolutions and graphics settings. The vast majorty of people are casual gamers who play competitive games like mobas or CS: GO.
Not very many people "casually" play MOBAs; that shit consumes them and they have no life outside of it. There are definitely more players who stick to the MOBA genre; however, they wouldn't continue making new games if they didn't sell.
#32
bug
Even Nvidia has restricted SLI to high-end with Pascal, so I really don't see a problem here. If anything, it's lucky AMD provides any SLI support at all, since they have to license it first afaik.

On another note, what happened to "DX12 and Vulkan mGPU rendering will make SLI/Crossfire obsolete"?
#33
Grings
You can't clock the crap out of a 1070; NVIDIA is way ahead of you there. They are locked down to only clock so far, regardless of PCB quality (to stop them going as fast as 1080s).
#34
Camm
bugEven Nvidia has restricted SLI to high-end with Pascal, so I really don't see a problem here. If anything, it's lucky AMD provides any SLI support at all, since they have to license it first afaik.

On another note, what happened to "DX12 and Vulkan mGPU rendering will make SLI/Crossfire obsolete"?
It exists, but because it's the developer's responsibility to implement, it's rare. AFAIK, AotS and the latest Deus Ex game support it.
#35
eidairaman1
The Exiled Airman
G33k2Fr34kOnly those low IQ gaimers spend hundreds of dollars on graphics cards to play these modern shitty games at "high" resolutions and graphics settings. The vast majorty of people are casual gamers who play competitive games like mobas or CS: GO.
It's spelled gamers, not gaimers.
#36
xorbe
If you're buying 2x Titan or 2x 1080, then springing for the top chipset is of little concern. *shrug*
#37
Casecutter
Big whoop. Anybody that's into a new build and either has two older enthusiast NVIDIA cards or GTX 1070(s) or better should have the money to plug into a top-shelf mobo. It's not like you can use two new GTX 1060s anyway! This is more the fault of NVIDIA disabling SLI than of AMD support on a mainstream platform; cry me a river! :cry:
Just saw xorbe above. ;)
#38
Fierce Guppy
xorbeIf you're buying 2x Titan or 2x 1080, then springing for the top chipset is of little concern. *shrug*
That's exactly right. mGPU users have larger budgets. They'd just get a mobo with the X370 chipset.
#40
Sandbo
petepeteSpeak for yourself, I go to my cousins and watch his 980's in SLI destroy games like Witcher, Doom, BF1, BF4, WoW, GTA, Fallout 4, Shadow Warrior, Siege, Overwatch.. Like basically every AAA games besides Resident evil runs flawlessly on his 5820k rig and it sure as hell beats 1 1080.

Games that run SLI destroys any single-gpu solution which is why after I go Ryzen I might pop in another 980 Ti :p

tl;dr - Sli support has grown massively,, almost all AAA games support it .. 980 Tis will blow the 1080 Ti straight out of the water.
My five cents: I gave up using RX 480 CF.

Not sure how SLI will fare, but the intrinsic problem is frame pacing, and that will ALWAYS introduce stuttering no matter how it is optimized, or whether you have FreeSync or G-Sync.

It will never be as smooth as using a single card. My opinion is that, if your game is now playing at 30 fps, getting SLI or CF to reach 50-60 fps will surely help.

If you now have 45-50 fps, going with dual cards to get 80-90 fps might end up giving you less fluent gameplay.
#41
TheLaughingMan
The only thing I want to know is: can I use the Ryzen Master OC tool to set up a "game mode" for the CPU to switch to when I launch a full-screen app? And if I can, will downclocking some of the cores to overclock others give me a higher OC on those cores? Or straight up disabling the extra cores if I know a certain game can only use 4 cores? That way I can have my 3.8 GHz 16-thread beast for multi-threaded stuff and switch to a 4C/8T at 4.5 GHz for most games.
#42
ADHDGAMING
Umm, hadn't they announced previously that only the top-tier board would support multi-GPU? I mean, just guessing, but I think that would include CF as well as SLI. I guess there have to be a few scattered around who missed it. It was one of the more talked-about things, I suppose.
#43
Dippyskoodlez
xorbeIf you're buying 2x Titan or 2x 1080, then springing for the top chipset is of little concern. *shrug*
This is spot on. Cutting the SLI licensing is irrelevant to midrange boards.

It also appears most people in this thread have a terrible understanding of the position SLI is currently in. With the increase in frame latency by traditional means (AFR/AFR2), SLI's primary reason for use right now is to achieve playable framerates at 4K+. At 1440p and 1080p, SLI is entirely meaningless at the top end because performance is so ludicrous currently. Triple SLI is no longer viable in any current games. Once again, low-res options run at far too high a framerate to benefit given the latency increase, the SLI bridge is not capable of the bandwidth at 4K+, and the third card is just dead weight beyond 3-monitor surround.

Two-way SLI with the 9xx series will actually provide substantially better performance at 4K using the HB/PCB bridge NVIDIA makes than 3-way SLI, in every single game I tried/play. The best scaling I found was perhaps FFXIV, but two cards were still the preferred option and I have since sold my third card.

I will say two-way SLI is still supported extremely well currently, though. Most games either already have sufficient profiles or can be fixed at launch to operate, including RE7. DX12 support and functionality via SFR is still effectively nonexistent for users unless you're using SLI-VR.
#44
m1dg3t
eidairaman1Could very well change with revision boards if there is enough demand, but think about it sli and crossfire have been around since 2004- it has been almost 13 years and the gpu makers still haven't gotten it right and most game devs pretty much ignore it.
Didn't multi-GPU start in the late '90s? I think it's been closer to 20 years now???
eidairaman1It's spelled gamers, not gaimers.
HaHaHa gheymers! :rolleyes: LoL
#45
Dimi
So I heard that there won't be any X370 Micro ATX boards? Is this true? People are saying that for Micro ATX and ITX there will be the X300 chipset, and if this is true, then I won't be having any of it.

Those specs are very, very weak.
#46
Fluffmeister
I wonder how much NVIDIA gets from each X370 mobo sold; a few bucks here and there is always nice, especially from a rival.
#47
Dippyskoodlez
FluffmeisterI wonder how much nVidia get from each X370 mobo sold, a few bucks here and there is always nice, especially from a rival.
and a market that they're not actually in!
#48
Hood
alucasaNot worried about sli since I don't use them. But overall port # is a little worrying. Having only 2 USB3 unless it's top of line?

I generally don't buy top of line mobo for builds.
It's the same with Intel: X370 is the top chipset, but plenty of cheaper boards will use it, not just flagship expensive boards.
#50
Captain_Tom
This is nothing new.

I have seen plenty of motherboards (and owned one) that support CrossFire but not SLI. It's just how NVIDIA looks at things vs. how AMD does. One wants uniformity, and one wants total freedom.