Monday, July 4th 2016

NVIDIA GeForce GTX 1060 Doesn't Support SLI? Reference PCB Difficult to Mod

Here are some more technical pictures of the NVIDIA GeForce GTX 1060 reference-design board, which reveal quite a few features of the card. The biggest revelation is that the card completely lacks SLI bridge fingers. We wonder if NVIDIA has innovated a bridge-less SLI for this card, although we find that unlikely given the amount of effort the company has put into marketing the SLI HB bridge, and the reason SLI needs a bridge in the first place. Meanwhile, the Radeon RX 480 supports 4-way CrossFireX.

Next up, the PCB is shorter than the card itself; NVIDIA's unique new reference cooler makes the card about 50% longer than its PCB. NVIDIA listened to feedback about shorter PCBs pushing power connectors towards the middle of cards, and came up with a unique design in which the card's sole 6-pin PCIe power connector is located where you want it (towards the end), with internal high-current wires soldered to the PCB. Neato? Think again. What if you want to change the cooler, or maybe use a water-block? Prepare to deal with six insulated wires sticking out of the PCB and running into that PCIe power receptacle. The rear PCB shot also seems to confirm the 192-bit memory bus, given that some memory chip pads are blanked out, lacking the SMT components the memory chips would need.
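As a back-of-the-envelope check of that inference, a 192-bit GDDR5 bus maps to six 32-bit chips, which is why unpopulated pads point to a narrower bus. The sketch below assumes an 8 Gbps effective data rate, which is our assumption, not a confirmed spec:

```python
# Hypothetical sanity check: how many GDDR5 chips a 192-bit bus implies,
# and the peak bandwidth it would yield. The 8 Gbps effective data rate
# is an assumption, not a confirmed GTX 1060 spec.
BUS_WIDTH_BITS = 192     # inferred from the rear PCB shot
CHIP_WIDTH_BITS = 32     # each GDDR5 chip provides a 32-bit channel

chips = BUS_WIDTH_BITS // CHIP_WIDTH_BITS
print(f"{chips} memory chips populate a {BUS_WIDTH_BITS}-bit bus")  # 6

effective_rate_gbps = 8  # assumed effective data rate per pin
bandwidth_gb_s = BUS_WIDTH_BITS * effective_rate_gbps / 8  # bits -> bytes
print(f"peak memory bandwidth: {bandwidth_gb_s:.0f} GB/s")  # 192 GB/s
```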
Source: PurePC.pl

83 Comments on NVIDIA GeForce GTX 1060 Doesn't Support SLI? Reference PCB Difficult to Mod

#51
dj-electric
Favorite things to do online:

1. tell Nvidia how much they suck and put limits on software and hardware
2. hype AMD products
3. get disappointed by AMD products
4. let Nvidia keep leading the market

Rinse and repeat.
#52
$ReaPeR$
TheinsanegamerN: So not enough for dual 1060s, but enough for dual 1080s? What sense does that make? If that were the case, the 1070 and 1080 wouldn't have SLI.

3.0 x8 is enough for dual GPU; it's just not enough for triple GPUs. We've known this for a while.

The rumor is only the 3GB model can't SLI; the 6GB model can. Nvidia is probably trying to avoid any VRAM outrage like with the 970. SLI 1060s would be terribly held back by 3GB of VRAM, and you know people would be complaining that Nvidia screwed up again. So they simply removed the option altogether to prevent the future complaining and baseless claims of gimping on the forums.
Interesting thought, though I don't think they give a f@ck about opinions on any forum; people complain while still buying their cards. They just don't want 1070/1080 sales to get hurt by this thing.
ensabrenoir: .....why? Who would deprive themselves of an extra $30 to $50? Custom SLI bridges are like jewelry for GPUs......you know, like poodles with diamond collars, nail polish, and hair bows....

Gotta show your GPUs that you really love them....
LOL, people love their overpriced bling..
thesmokingman: Maybe, maybe not. What is obvious, though, is that they are really sucking the value out of their value segment.
What value? Value left Nvidia after the 8800 series. The $300 segment never had any value, and 90% of people don't buy that high anyway, not that they need it..
#53
bug
TheGuruStud: No kidding. I bet those POS just installed more spies at AMD (probably trying to rip off engineering, too).

They got caught once and nothing happened; why not keep doing it?
Well, you're wrong. It's just that with so few competitors in the market and such similar technologies in use, it's impossible not to know what the other side is doing.
#54
Totally
FordGT90Concept: I don't get why they can't just send that data through PCI Express. We know PCI Express 3.0 x16 has twice the bandwidth any one graphics card needs. CrossFire proves an external connector is not necessary.
There's probably a patent held by AMD that precludes them from doing so; that's the most logical explanation.
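For context on the bandwidth claim being quoted, here is a rough sketch of peak one-direction PCIe 3.0 throughput (theoretical figures after 128b/130b line encoding; real-world numbers are lower):

```python
# Peak one-direction PCIe 3.0 bandwidth: 8 GT/s per lane, 128b/130b encoding.
# These are theoretical ceilings, not measured throughput.
GT_PER_S = 8.0
ENCODING = 128 / 130  # usable fraction after line encoding

def pcie3_bandwidth_gb_s(lanes: int) -> float:
    """Peak one-direction bandwidth in GB/s for a PCIe 3.0 link."""
    return GT_PER_S * ENCODING * lanes / 8  # divide by 8 bits per byte

print(f"x16: {pcie3_bandwidth_gb_s(16):.2f} GB/s")  # ~15.75
print(f"x8:  {pcie3_bandwidth_gb_s(8):.2f} GB/s")   # ~7.88, the dual-GPU case
```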
#55
ShurikN
Dj-ElectriC: Favorite things to do online:

1. tell Nvidia how much they suck and put limits on software and hardware
2. Buy Nvidia anyway, because AMD drivers "suck"
3. Hope that AMD will regain market share, bc we all need competition to bring the prices down.
Fixed it for ya ;)
#56
Hood
ensabrenoir: ....if this card is really cheap yet packs a slightly greater punch than a 980.......from a business standpoint I don't blame them for omitting SLI fingers. What reason would someone have to buy a 1070 or potentially a 1080 if two cheap 1060s could match/surpass it?
What reason? Because a single card is always preferable to two lesser cards (when both options have similar performance and price). Because you'll create a lot less heat and noise with a single decent card, and use a lot less electricity. Because you'll have fewer driver issues or game optimization issues with a single card. Because for all your trouble, you might save a whole $100 by going with two cheap cards. And last but not least, people who actually prefer two cheap cards over one good card may have issues with self-esteem (bragging rights are more important than actual performance).

I'm not saying that this describes your reason for using SLI, but you DID ask that question, as if a person would have to be crazy to NOT want two cards. Well, call me crazy, then, because I never wanted more than one card. In fact, if you think about it a while, it's ridiculous to be required to buy an entire other EXPENSIVE subsystem just to run your monitor at decent frame rates in games. We only accept this because it's always been done this way.

With tech that's available now, Intel could build GTX 960-level graphics processors into all their CPUs and decimate the whole AIC GPU market. That level would satisfy 99% of all users, and probably 95% of all "gamers" (read the stats if you don't believe this). The cost (and failure rate) would be much lower. Posers and braggarts would just have to find another way to inspire envy or resentment in their equally shallow friends. Perhaps some more LEDs would make it seem faster...
#57
Jurassic1024
"What if you want to change the cooler, or maybe use a water-block?"

LMAO! Is this a GTX 1060 or a Fury X we're talking about?
#58
The N
Well, it seems the RX 480 will lead the way. However, if the 1060 6GB version has it, then it's still a good deal. NVIDIA is doing this because they know people would go SLI to get performance equal to a 1080 at lower cost, so they are trying to protect 1080 sales.

I think after AMD revealed the 480's price/performance, the trend changed. People are now thinking low cost with high-end performance, and it is possible in current times.
#59
cdawall
where the hell are my stars
I could definitely see the 1060 6GB beating the 480 across the board and costing $300. Something along the lines of a 5-or-so-percent performance gain with 30-40% power savings, and that awesome cost jump.
#60
The N
cdawall: I could definitely see the 1060 6GB beating the 480 across the board and costing $300. Something along the lines of a 5-or-so-percent performance gain with 30-40% power savings, and that awesome cost jump.
Also, I think if the 1060 does pass the 480, it will be by no more than 5%. Besides, we can rely on NVIDIA's power-saving features, as they have delivered before, unlike AMD.
#61
cdawall
where the hell are my stars
The N: Also, I think if the 1060 does pass the 480, it will be by no more than 5%. Besides, we can rely on NVIDIA's power-saving features, as they have delivered before, unlike AMD.
Power-saving features? The entire design is better on power. That is not a feature, it is a process. Also remember, it was not too long ago that the shoe was on the other foot: the 5xx0/6xx0 series from AMD sipped power while the Fermi cards from Nvidia gulped it. This all goes in cycles, and right now Nvidia has released an efficient, powerful product. I severely doubt the 1060 will be any different on the current nodes.

Just remember this whenever you tout how Nvidia has power saving and AMD does not.

The single Fermi GPU traded blows with a pair of 4870s...

...and then lost to the more power-efficient PAIR of GPUs on the 5970.
#62
chlamchowder
Hood: Because a single card is always preferable to two lesser cards (when both options have similar performance and price). Because you'll create a lot less heat and noise with a single decent card, and use a lot less electricity. Because you'll have fewer driver issues or game optimization issues with a single card.
I'll add that SLI/CF duplicates game data across both cards, so usable VRAM for gaming doesn't increase. You might have enough GPU power with two cards to turn up settings or use higher resolutions, but end up unable to do so because you don't have enough VRAM.
Hood: With tech that's available now, Intel could build GTX 960-level graphics processors into all their CPUs and decimate the whole AIC GPU market.
Unlikely. The problem is memory bandwidth. The GTX 960 has 112 GB/s of bandwidth. A Z170 Skylake chip using dual-channel DDR4-2133 has only 34 GB/s of bandwidth, and an X99 CPU with quad-channel DDR4-2133 only achieves 68 GB/s. To make things worse, that bandwidth would be shared competitively with the CPU cores.
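Those figures follow from the standard peak-bandwidth arithmetic; a sketch reproducing the numbers above (64-bit DDR4 channels, theoretical peaks):

```python
# Peak DDR4 bandwidth: transfers/s x 8 bytes per 64-bit channel x channels.
# Theoretical ceilings, reproducing the figures quoted above.
def ddr4_bandwidth_gb_s(mt_per_s: int, channels: int) -> float:
    return mt_per_s * 8 * channels / 1000

print(f"dual-channel DDR4-2133: {ddr4_bandwidth_gb_s(2133, 2):.1f} GB/s")  # ~34.1
print(f"quad-channel DDR4-2133: {ddr4_bandwidth_gb_s(2133, 4):.1f} GB/s")  # ~68.3

# GTX 960 for comparison: 128-bit bus at 7 Gbps effective GDDR5.
print(f"GTX 960: {128 * 7 / 8:.0f} GB/s")  # 112
```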
#63
The N
Those figures from the old days are worth noting for how the cycle of power efficiency goes along with performance. NVIDIA has improved their lineup by introducing much more power-efficient designs.

Yes, NVIDIA seems to take the last best-known efficiency to a whole new level; what NVIDIA delivers is contrary to AMD. We can see the redesign from Kepler, then the whole new platform of Maxwell, which then led to Pascal; they have consistently improved on power. Finally, we have the 1070 delivering enthusiast performance with power consumption even lower than the RX 480.
#64
cdawall
where the hell are my stars
The N: Those figures from the old days are worth noting for how the cycle of power efficiency goes along with performance. NVIDIA has improved their lineup by introducing much more power-efficient designs.

Yes, NVIDIA seems to take the last best-known efficiency to a whole new level; what NVIDIA delivers is contrary to AMD. We can see the redesign from Kepler, then the whole new platform of Maxwell, which then led to Pascal; they have consistently improved on power. Finally, we have the 1070 delivering enthusiast performance with power consumption even lower than the RX 480.
Why are you comparing the 1070 to the 480? It directly follows up the 970, increasing performance from it (to 980 level) with a slight power consumption drop. They are good cards; groundbreaking, I think not, and as I said, this is all cyclical. When Nvidia finally adds true DX12 support, I feel like we will see a power jump. Both companies have seen great power improvements, both have shown that performance is getting better, and neither company is perfect.
#65
The N
cdawall: Why are you comparing the 1070 to the 480? It directly follows up the 970, increasing performance from it (to 980 level) with a slight power consumption drop. They are good cards; groundbreaking, I think not, and as I said, this is all cyclical. When Nvidia finally adds true DX12 support, I feel like we will see a power jump. Both companies have seen great power improvements, both have shown that performance is getting better, and neither company is perfect.
I am comparing the power consumption. The 1070 is a high-end card, competing with the 980 Ti (as NVIDIA's own); on performance, the 480 is around 34% behind the 1070. The 480 still draws a little more power, despite being 6-pin. That is my only point.
#67
Caring1
PP Mguire: o_O Idk what's going on anymore.
Take the Green pill this year, followed by the Red pill next year; if you don't feel better, take one of each and use them simultaneously.
#68
The N
Caring1: Take the Green pill this year, followed by the Red pill next year; if you don't feel better, take one of each and use them simultaneously.
rofl nice one dude.

On a serious note, yes, Green is still going to have an edge over Red till Vega arrives. After that, next year will probably fly a Red flag.
#69
cdawall
where the hell are my stars
The N: I am comparing the power consumption. The 1070 is a high-end card, competing with the 980 Ti (as NVIDIA's own); on performance, the 480 is around 34% behind the 1070. The 480 still draws a little more power, despite being 6-pin. That is my only point.
Right, so the 480 is not targeted to compete with the 1070. Worry about AMD power consumption when the RX 490 comes out.
#70
matar
Please, NVIDIA, is this a joke, or is this the new July Fools'?
I hope NVIDIA will have an SLI option via PCIe, just like AMD. Anyway, this is possible since they dropped 3-way and 4-way SLI on high-end cards, so hopefully lower-end cards that had only one SLI finger will no longer require an SLI bridge and will instead use PCIe for it.
#71
Captain_Tom
john_: No SLI. Well, Nvidia is limiting SLI as the years go by. 10 years ago you could do SLI with little cards like the 9500 GT.

Anyway.... so....
The GTX 1080 is faster than two GTX 980s.
One GTX 1060 is as fast, if not faster, than a GTX 980, so two GTX 1060s are equal to a GTX 1080 (at least where SLI is working).

If a GTX 1060 cost $300, NVidia wouldn't care. But if it cost $250, many would prefer a duo of 1060s over an overpriced, difficult-to-find, and more expensive GTX 1080.

I think the 1060 is coming at $250, at least the 3GB version, if the 6GB does come with SLI. If it does, I would say $300 for the 6GB version. If we are looking at a 6GB version with no SLI, then the 6GB could come at $250.
The 1080 is faster than 980 SLI, not twice as strong as a 980, so it is like 60-70% stronger. Now cut that performance in half and you get around 970 framerates.

I'm sorry, but based on the specs the 1060 looks to be almost exactly as strong as the 480 while having less VRAM and costing more money. Simple as that.
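Spelling out the arithmetic in that reply (every number here is the commenter's assumption, not a benchmark):

```python
# The comment's reasoning, taken at face value. All ratios are the
# commenter's assumptions, not measured results.
gtx_980 = 1.00
gtx_1080 = gtx_980 * 1.65            # "60-70% stronger" - using the midpoint
single_1060_estimate = gtx_1080 / 2  # "cut that performance in half"
print(f"GTX 1060 estimate: {single_1060_estimate:.2f}x a GTX 980")  # ~0.83x, near a 970
```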
#72
xorbe
FordGT90Concept: I don't get why they can't just send that data through PCI Express. We know PCI Express 3.0 x16 has twice the bandwidth any one graphics card needs. CrossFire proves an external connector is not necessary.
Latency ... but does it matter? Beats me!
#73
Captain_Tom
matar: Please, NVIDIA, is this a joke, or is this the new July Fools'?
I hope NVIDIA will have an SLI option via PCIe, just like AMD. Anyway, this is possible since they dropped 3-way and 4-way SLI on high-end cards, so hopefully lower-end cards that had only one SLI finger will no longer require an SLI bridge and will instead use PCIe for it.
No, they are doubling down on their archaic SLI-bridge system. IMO Nvidia is just going about multi-GPU the completely wrong way. Removing 3/4-way SLI just means they are OK with spending countless man-hours optimizing for one game at a time.

AMD realizes that they need to make CF as simple to implement as possible, so that devs can take over the responsibility and scale up to 4 GPUs on EVERY AMD card...
#74
Captain_Tom
The N: rofl nice one dude.

On a serious note, yes, Green is still going to have an edge over Red till Vega arrives. After that, next year will probably fly a Red flag.
Define "edge". The performance crown? Maybe, but not really, considering the Pro Duo is still the strongest card out.

AMD seems to finally be winning sales, which is what they really need.
#75
semitope
NC37: Well, the 480 is pretty subpar even when it has CrossFire, but for a 60-series card to not have SLI... that could be a deal breaker for some.

nVidia seems to love finding ways to devalue the 60 series. If cutting its memory bus wasn't enough, now it loses SLI capabilities.

Does send a bit of a message to midrange and under... if you don't have a 1070 or better, you have squat.

Now's the time for AMD to pull out the 8-pin 480 that clocks like they hyped the 480 up to be.
Pretty subpar relative to what? Clocks like they hyped it up to be? Where was this hype? Last I saw, everybody was mad AMD wasn't giving more information; now all of a sudden they hyped too much...