# HD4350, a PCI-E x1 Slot and a Dremel (Tons of Pics Warning).



## Yukikaze (Mar 21, 2009)

First, a public service announcement:

WARNING:
There is almost no room for error, and you need a very steady hand with a Dremel (or more professional tooling) to get this done. If someone attempts this and kills their card, I take no responsibility. Be careful when taking power tools to your video cards.

Don't do this to a piece of hardware you'll miss if something goes wrong. Killing it is real easy.

And now we return to our scheduled report:

With the possible acquisition of a new TV, and wanting a third monitor in any case, the need arose for a second video card in my system to drive that display. Since I did not want to take up the second PCI-E x16 slot in my system, and since PCI video cards are relatively expensive (and nearly impossible to find here locally), I decided to mod a Sapphire HD4350 (available everywhere here and el cheapo) to use a PCI-E x1 slot. I did not want to cut the slot on the motherboard (to make it open-ended), since the motherboard cost me plenty and is still under warranty, whereas the card was cheap.

In the process, I decided to also benchmark the HD4350 running at full bandwidth, at PCI-E 2.0 x1, and at plain PCI-E x1. Considering it is a weakling of a card, I thought the bandwidth might not matter, but as we shall see shortly, I was wrong. No issue for me, since I only use it to drive two more monitors, but interesting to see nonetheless.

The testing system is my main rig in my system specs tab (the leftmost one in the list), and the benchmarks run were 3DMark06, 3DMark Vantage and Crysis at 1680x1050 on Low settings.

Here's the victim's box:






And here is the victim from the front:





And from the rear:





And how small it is next to my HD4870X2:





To check whether I was going to cut off the right number of connectors, I first tested the card with some of the gold fingers taped over. The card looked like this with most of the connectors covered:





And here's the back of it:





Dismemberment !





And here it is in the computer for testing:





And here it is sitting above my HD4870X2 (Temps on both are alright this way):





Success !





Okay, I promised some benchmarks, so here they are:
Crysis was benchmarked by running through the early game (Contact) up to the GPS jammer on the beach (including taking it out and watching the fireworks) and recording the FPS with FRAPS. 3DMV and 3DM06 were run on their defaults. The card was OC'ed to the maximum CCC Overdrive would allow, just for the heck of it.

First, the card running at half of its full bandwidth: it comes up as PCI-E 2.0 x8 in GPU-Z, since that's what the slots on the mATX DFI P45 are wired as, but this shouldn't have any effect on its performance - even far more powerful cards do not mind PCI-E 2.0 x8 bandwidth.

Crysis:
Min: 7
Avg: 16.559
Max: 24

3DMark06:





3DMark Vantage:





Now, the card in the PCI-E 2.0 x16 slot during the pre-cut testing, taped over to expose only one PCI-E lane. The resulting bandwidth is a single PCI-E 2.0 lane.

Crysis:
Min: 7
Avg: 15.009
Max: 23

3DMark06:





3DMark Vantage:





Finally, the chopped-down card in the PCI-E x1 slot.

Crysis:
Min: 6
Avg: 13.633
Max: 21

3DMark06:





3DMark Vantage:





As you can see, getting chopped down to a single PCI-E lane had an effect on the card's performance, with the biggest drop occurring in 3DMark06.
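To put rough numbers on that drop, here's a quick Python sketch computing the percentage hit to the average Crysis FPS from the figures above (the configuration labels are mine):

```python
# Relative drop in average Crysis FPS for each link configuration,
# using the averages reported in this post.
baseline = 16.559  # PCI-E 2.0 x8, the full bandwidth available on this board
results = {
    "PCI-E 2.0 x1 (taped)": 15.009,
    "PCI-E 1.x x1 (cut card in x1 slot)": 13.633,
}

for config, avg in results.items():
    drop = 100.0 * (baseline - avg) / baseline
    print(f"{config}: {avg:.3f} FPS, {drop:.1f}% below baseline")
```

That works out to roughly a 9% hit for a single 2.0 lane and roughly an 18% hit for a single 1.x lane.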

I hope this is informative and of use to someone. I had tons of fun doing it, and was a little surprised I did not kill the card in the process.


----------



## exon1 (Mar 21, 2009)

what the...


----------



## Error 404 (Mar 21, 2009)

Nice work, I wonder if I could do this with an 8400 GS and have me a physics card!
By removing the majority of the PCI-E slot, does that reduce the electrical power available to the card, or is that supplied through the second tab of PCI-E lanes?


----------



## KainXS (Mar 22, 2009)

very nice, props man


----------



## alexp999 (Mar 22, 2009)

Wouldnt it have been better to cut the mobo slot or something?


----------



## Yukikaze (Mar 22, 2009)

alexp999 said:


> Wouldnt it have been better to cut the mobo slot or something?



The video card costs about a fifth of the price of the board, so nope, cutting the mobo was not the better option. I did not want to void the warranty.


----------



## alexp999 (Mar 22, 2009)

Yukikaze said:


> The video card costs about a fifth of the price of the board, so nope, cutting the mobo was not the better option. I did not want to void the warranty.



Fair enough

I still cant work out why there is a difference between taped pins and cut off pins tho.


----------



## Yukikaze (Mar 22, 2009)

alexp999 said:


> Fair enough
> 
> I still cant work out why there is a difference between taped pins and cut off pins tho.



The PCI-E x16 slots on my mobo are PCI-E 2.0, while AFAIK, the PCI-E x1 slot is PCI-E 1.x, hence the difference in performance depending on where the card sits.
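The raw per-lane numbers back this up: PCI-E 1.x signals at 2.5 GT/s and 2.0 at 5.0 GT/s, both with 8b/10b line coding, so a 2.0 lane carries twice the data. A minimal sketch of the arithmetic:

```python
def lane_bandwidth_mb_s(gen: int) -> float:
    """Approximate usable bandwidth of one PCI-E lane in MB/s per direction.

    PCI-E 1.x signals at 2.5 GT/s and 2.0 at 5.0 GT/s; both use 8b/10b
    encoding, so 10 bits on the wire carry 8 bits of data.
    """
    gigatransfers = {1: 2.5, 2: 5.0}[gen]
    bits_per_second = gigatransfers * 1e9 * 8 / 10  # strip 8b/10b overhead
    return bits_per_second / 8 / 1e6                # bits -> bytes -> MB

print(lane_bandwidth_mb_s(1))  # 250.0 MB/s for a 1.x lane
print(lane_bandwidth_mb_s(2))  # 500.0 MB/s for a 2.0 lane
```

So the taped card in the 2.0 slot had about 500 MB/s to work with, and the cut card in the x1 slot only about 250 MB/s.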


----------



## 1Kurgan1 (Mar 22, 2009)

Thats pretty darn cool! I didn't even know that PCI-E 2.0 cards would work in a 1X slot. I assume this is only for lower end cards?


----------



## KainXS (Mar 22, 2009)

this would work on any pci-e card i guess but running this on any card over a 4650 might be near murder lol

still i might try this when i get a dedicated physx card for my older pc


----------



## 1Kurgan1 (Mar 22, 2009)

Do this to my 4870x2, lol!


----------



## Xazax (Mar 23, 2009)

Why didnt you just buy the 9500GT x1 PCI-E that just came out lawlz


----------



## newtekie1 (Mar 23, 2009)

Error 404 said:


> Nice work, I wonder if I could do this with an 8400 GS and have me a physics card!
> By removing the majority of the PCI-E slot, does that reduce the electrical power available to the card, or is that supplied through the second tab of PCI-E lanes?



All the power to the card is supplied by the small separate set of pins on the connector.  Every card gets the same amount of power regardless of how many lanes it has.



Xazax said:


> Why didnt you just buy the 9500GT x1 PCI-E that just came out lawlz



It sounds like he was looking for a local solution that was as cheap as possible.  The 9500GT would have cost more to buy.  Plus, he is running Vista, which means ATi+nVidia wouldn't work.


----------



## Yukikaze (Mar 23, 2009)

newtekie1 said:


> It sounds like he was looking for a local solution that was as cheap as possible.  The 9500GT would have cost more to buy.  Plus, he is running Vista, which means ATi+nVidia wouldn't work.



Correct on both. PCI-E x1 cards are non-existent over here (Except for Quadros which cost more than my mobo), and the only PCI video cards here are expensive as heck, and also nVidia, so it wouldn't work for me anyway.


----------



## Mussels (Mar 23, 2009)

i've done this before, but what i did was cut the slot out, not the card!

As is obvious from the pics, 1x slots are only 1.0, not 2.0


----------



## EnglishLion (Mar 23, 2009)

Yukikaze said:


> The PCI-E x16 slots on my mobo are PCI-E 2.0, while AFAIK, the PCI-E x1 slot is PCI-E 1.x, hence the difference in performance depending on where the card sits.



So the cut-down card in the x16 slot should reach about the same benchmark scores as the taped-over card, then?


----------



## Mussels (Mar 23, 2009)

EnglishLion said:


> So the cut down card in the x16 slot should reach ~the same benchmark scores as the taped over card then?



yeah thats the point i was trying to make a few posts ago.

Rarely is anything other than the one (or two) main PCI-E slots PCI-E 2.0; they're nearly always 1.0/1.1 slots.


----------



## BrooksyX (Mar 23, 2009)

Hmm interesting stuff. Thanks for sharing. I have an old ati x300 that I wouldn't mind trying this on. It is just sitting with the rest of my PC junk. Only time I ever use it is if I have a bad bios flash on another video card. But I guess I could set it up to a 1x card just for the fun of it.


----------



## Kweku (Mar 26, 2009)

Nice bra, good work man, got the same card but will never do that to it, wanna overclock it to see just how far it can go. Will share the results up here.

ONE


----------



## NeotonicDragon3 (Mar 27, 2009)

didnt know this was possible
my Dell Dimension E310 that is currently an issh
only has PCI-E-X1 and PCI slots lol


----------



## Hybridchemistry (Mar 30, 2009)

lolwut?
Sweet job though, but I would have cut the board before I cut the card... may still be under warranty, but there is way less to go wrong.


----------



## crazy pyro (Apr 4, 2009)

And what if he killed the mobo some other way? The card cost a fifth of what his mobo did, he says, so there's no problem.


----------



## Yukikaze (Apr 4, 2009)

EnglishLion said:


> So the cut down card in the x16 slot should reach ~the same benchmark scores as the taped over card then?



Yep.


----------



## Baum (May 8, 2009)

Thanks for the try. Does that mean you can use this cut-down technique on Xeon/server boards, where the majority don't have any x16 slot, just x1?

that would be neat gonna try that some day


----------



## crtecha (May 8, 2009)

Im soo confused.......Im glad it worked out for you though


----------



## Yukikaze (May 8, 2009)

Baum said:


> Thanks for the try. Does that mean you can use this cut-down technique on Xeon/server boards, where the majority don't have any x16 slot, just x1?
> 
> that would be neat gonna try that some day



Yes, you can. But for what aim ?


----------



## wolf2009 (May 8, 2009)

WOW, didn't know this was possible. haven't seen anything like this b4


----------



## BrooksyX (May 8, 2009)

I am very tempted to do this with a card so I can fit something decent in my mini-itx rig I am building. It only has 1 pci-e 1x slot. I am thinking like either an 8500gt or 9400gt maybe. Anything higher would be a waste due to bandwidth being so low.


----------



## Yukikaze (May 8, 2009)

BrooksyX said:


> I am very tempted to do this with a card so I can fit something decent in my mini-itx rig I am building. It only has 1 pci-e 1x slot. I am thinking like either an 8500gt or 9400gt maybe. Anything higher would be a waste due to bandwidth being so low.



The HD4350 is a good card to do it to. From what I saw in benches it is better than the 9400GT and it is priced well enough to attempt this on it (Heck, here it costs less than anything which can be commonly found, except the 8400GS). Anything above it would be quite severely affected. What's the specs on the rest of your system ?


----------



## BrooksyX (May 8, 2009)

Yukikaze said:


> The HD4350 is a good card to do it to. From what I saw in benches it is better than the 9400GT and it is priced well enough to attempt this on it (Heck, here it costs less than anything which can be commonly found, except the 8400GS). Anything above it would be quite severely affected. What's the specs on the rest of your system ?



Well, I would like to go with an nvidia card because the motherboard I am using has an nvidia chipset. It's for this rig here:
http://forums.techpowerup.com/showthread.php?t=92937 

The specs are:
Zotac 630i mini-itx
Intel e5200
2gb of ddr2 667 ram
250w psu (16amps on 12v rail)

If I do do this it probably won't be for a couple of months, but I'll keep my eye out for a cheap 8400gs (or better) on craigslist/ebay/here.


----------



## Yukikaze (May 8, 2009)

BrooksyX said:


> Well, I would like to go with an nvidia card because the motherboard I am using has an nvidia chipset. It's for this rig here:
> http://forums.techpowerup.com/showthread.php?t=92937
> 
> The specs are:
> ...



Ah, okay. Depending on what you want from it, the 8400GS might be a good choice, but the 9400GT will be slightly better even sawed off.

Heh, I think I just coined the term: "Sawed-Off Graphics Card".


----------



## JrRacinFan (May 8, 2009)

I LOVE this Yukikaze. Now if anyone has a 4350 to donate to see if crossfire works . . . 

I just realized I couldve done this with my pair of 2600 Pros for crossfire on my DFI Dark.


----------



## MilkyWay (May 8, 2009)

ive seen this a few times on tpu but not as good as this

nice job give yourself a pat on the back! lol


----------



## Yukikaze (May 8, 2009)

Thanks guys. I might do it to an 8400GS next week. I have two rigs with three monitors. Currently the other rig has an onboard GPU driving the third monitor, but with the last of my i7 parts arriving, the computer shuffling this will bring means that my other setup will have a 9600GT and an 8600GT for PhysX. I have no idea whether a card dedicated to PhysX can also be used for extra monitor output, but if not - the 8400GS is the next victim (then I'll have a 9600GT, 8600GT and 8400GS all in the same rig ).


----------



## W1zzard (May 8, 2009)

amazing work. i didnt know this was possible


----------



## Geofrancis (May 18, 2009)

does anyone know if this will work with any pci-e cards, not just graphics cards? im looking at some raid controllers that this mod would be ideal for.


----------



## JrRacinFan (May 18, 2009)

Geofrancis said:


> does anyone know if this will work with any pci-e cards, not just graphics cards? im looking at some raid controllers that this mod would be ideal for.



I don't see why not.


----------



## Yukikaze (May 18, 2009)

JrRacinFan said:


> I don't see why not.



Same here - I was (mis-)using a feature of PCI-E, not of the card. The lanes are independent of each other.

Just take your RAID controller card, tape over the pins to turn it into an x1 card, test it, and if it works, you're good to go.

Just be careful, okay ?


----------



## Geofrancis (May 18, 2009)

would someone be willing to try this? because im looking online for a raid controller but i dont want to spend a fortune on it only for it not to work.


----------



## KainXS (May 18, 2009)

this works,

I am really thinking of doing this to a 8500GT though

If you want another way to do it, you can cut the notch at the end of the pcie slot so any size card can be installed (I have done this and it works)





but if the card isn't secure it could be a problem, and if the card doesn't support pci-e 1X it won't work either way.

BTW thats not my pic


----------



## Geofrancis (May 18, 2009)

i think he modded the card because it is much cheaper than the motherboard. what would you rather destroy, a cheap graphics card or an expensive motherboard?


----------



## lemonadesoda (May 18, 2009)

Nice table of PCI-e x16 pinouts: http://pinouts.ru/Slots/pci_express_pinout.shtml

Yukikaze, why did you slice off the whole finger length except for lane 1? Wouldn't it be better to dremel out just the single lane required to "jump" the plastic end of the slot on the mainboard, leaving a 15-lane GPU? Goodness knows if a 15-lane GPU would work.  Or a wider notch, leaving a 12-lane GPU?  While you don't need it now, it might come in useful at a later date.

REQUEST: Could someone tape their GPU and try a 12-lane (not 16, not 8) setup and see if it works?  If it does, could you try taping just ONE lane, any lane, and see if it works? I can't do the experiment because I'm on AGP.

If taping just one lane works, then there is no need to slice the whole finger off, just "notch" where the socket ends.  That would be great for turning a 16-lane card into a 15-lane one that is compatible with an x8 socket, or an x4 socket (depending on where you put the notch).

In theory you could put 4 notches in the GPU, to have a 12-lane GPU compatible with x1, x4, x8 and x16 slots!


----------



## Yukikaze (May 19, 2009)

lemonadesoda said:


> Nice table of PCI-e x16 pinouts: http://pinouts.ru/Slots/pci_express_pinout.shtml
> 
> Yukikaze, why did you slice off the whole finger length except for lane 1? Wouldnt it be better to dremel out the single lane required to "jump" the plastic slot on the mainboard, leaving a 15 lane GPU? Goodness knows if a 15 lane GPU would work.  Or a wider slot, leaving a 12 lane GPU?  While you dont need it now, it might come in useful at a later date.
> 
> ...



I am not quite sure this idea would work. AFAIK, PCI-E works with 1, 2, 4, 8 or 16 lanes and picks the largest of these numbers to run at, so a card with 12 lanes would run at x8. As for making the card's lanes non-sequential - might be worthwhile to look into.
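A toy sketch of that width-selection rule as described above (the actual PCI-E spec also defines rarer widths such as x12 and x32, so treat this as the common-case behaviour, not gospel):

```python
# Link widths a card will normally train at, per the rule described above.
STANDARD_WIDTHS = (16, 8, 4, 2, 1)

def negotiated_width(working_lanes: int) -> int:
    """Largest standard width not exceeding the number of usable lanes."""
    for width in STANDARD_WIDTHS:
        if working_lanes >= width:
            return width
    raise ValueError("no working lanes, link cannot train")

print(negotiated_width(12))  # 8  -> a hypothetical 12-lane card falls back to x8
print(negotiated_width(1))   # 1  -> the mod in this thread
```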


----------



## sruli (Jun 25, 2009)

.


----------



## ShadowFold (Jun 25, 2009)

sruli said:


> http://www.hisdigital.com/un/product2-444.shtml



This post deeply confuses me.


----------



## a_ump (Jun 25, 2009)

sruli said:


> http://www.hisdigital.com/un/product2-444.shtml





ShadowFold said:


> This post deeply confuses me.



lol yea i don't understand its purpose either


----------



## mlee49 (Jun 25, 2009)

He was just trying to help U2KOnline 

I'm gonna tape over my 8600GT and see if this works.  Does SLI work with a PCI 1x and a 16x lane?


----------



## aCid888* (Jun 25, 2009)

ShadowFold said:


> This post deeply confuses me.



I don't see how... he linked to a PCI card. I would assume he is trying to point out that if the OP had gotten that, he wouldn't need to cut anything. 


If Yuki had gotten a PCI card, there wouldn't have been any fun to be had cutting a good card up.


----------



## Yukikaze (Jun 25, 2009)

aCid888* said:


> I dont see how...he linked you to a PCI card.....I would assume he is trying to point out that if the OP would have got that he wouldn't need to cut anything.
> 
> 
> If the Yuki did get a PCI card there wouldn't be any fun to be had by cutting a good card up.



I could've gotten an overpriced PCI card (since no ATI PCI cards are available locally and even the NV ones are close to the $100 mark over here), then paid for overseas shipping, or I could pay a couple of dimes here and have fun in the process. Plus, IIRC, the card he linked to wasn't even out when this was done.

Besides, on my old P45, which is mATX, when the HD4870X2 was in the lower slot, the PCI slot was blocked. Putting it in the top slot meant that the air the CPU cooler was sucking in was going next to the HD4870X2, driving temps up.


----------



## JrRacinFan (Jun 25, 2009)

mlee49 said:


> He was just trying to help U2KOnline
> 
> I'm gonna tape over my 8600GT and see if this works.  Does SLI work with a PCI 1x and a 16x lane?



It would work as long as you have an SLI-capable chipset.


----------



## Yukikaze (Jun 25, 2009)

JrRacinFan said:


> It would work as long as you have a SLI capable chipset.



That might be interesting to test, actually.....


----------



## mlee49 (Jun 25, 2009)

mlee49 said:


> I'm gonna tape over my 8600GT and see if this works.



Making good on my word:







benchies for bandwidth to follow


----------



## Yukikaze (Jun 25, 2009)

mlee49 said:


> Making good on my word:
> benchies for bandwidth to follow





Castrated video cards -> We need to start a club


----------



## Mussels (Jun 25, 2009)

"people who get good things and make them suck club" 




(i'm all for this club, it has practical applications)


----------



## Yukikaze (Jun 25, 2009)

Mussels said:


> "people who get good things and make them suck club"
> 
> 
> 
> ...



It doesn't suck ! It gets twice the FPS of whachamacallit's PCI 8400GS, which means I can run Crysis at 50FPS maxed out on an Atom 230 ! 

Seriously, though, good is relative. If anyone does this to an HD4870, he's a certified nut, but somewhere along the lines of the 8500GT and below ? That sure has applications.


----------



## Mussels (Jun 25, 2009)

oh yeah, i mean, whats to stop me using a PCI-E 1x slot with an 8500 for Physx, for example? 10 minutes with a hacksaw, thats what.


----------



## mlee49 (Jun 25, 2009)

3D06:
1x






16x


----------



## Yukikaze (Jun 26, 2009)

Thanks, mlee, can you try and run a game or two with it taped down to PCI-E x1 ? Crysis should be interesting to see.


----------



## mlee49 (Jun 26, 2009)

Yuk,

Dont have crysis but I could do Aquamark or another 3DMark run.  05 would be good since that was the era of the 8600.



lemonadesoda said:


> Nice table of PCI-e x16 pinouts: http://pinouts.ru/Slots/pci_express_pinout.shtml
> Yukikaze, why did you slice off the whole finger length except for lane 1? Wouldnt it be better to dremel out the single lane required to "jump" the plastic slot on the mainboard, leaving a 15 lane GPU? Goodness knows if a 15 lane GPU would work.  Or a wider slot, leaving a 12 lane GPU?  While you dont need it now, it might come in useful at a later date.
> 
> *REQUEST: Could someone tape their GPU and try a 12 lane (not 16, not 8) setup and see if it works.  If it does, could you try taping just ONE lane, and lane, and see if it works? I cant do the experiement because I'm on AGP.*
> ...




So are you wanting to try just the 12th pin taped over?  Can you use a pic of the gfx card and show me what you want? I'd be willing to try it


----------



## Mussels (Jun 26, 2009)

oh, dont use 3dmarks. they barely take a hit from the lower PCI-E lanes, while games take far more of a hit. I learned that myself with my crossfire setup (and from someone pointing it out to me).


----------



## mlee49 (Jun 26, 2009)

So Street Fighter 4 benchmark ok?  Last Remnant benchmark?

SF4 16x:





1x:





LR 16x:






LR1x:







For both:


----------



## mlee49 (Jun 27, 2009)

benchmarks updated! 


I want to run this as a Physics card next!  I'll try that and get back to ya's!

HA!  I need a Physx benchmark ASAP!:











MONEY!

EDIT: * Running at 1x!!*











With PhysX: 2162
Without PhysX 1368
Difference: 794
% Improvement: *58%*


----------



## Easo (Jun 27, 2009)

Wtf.... :d


----------



## SonDa5 (Jun 27, 2009)

Geofrancis said:


> i think he modded the card because it is much cheaper than the motherboard. what would you rather destroy a cheap graphics card or an expensive motherboard.





Interesting mod. I think it was done out of desperation. 

Far out!


I would never do this. I wonder how long the card will continue to live. I'd be worried about an electrical short destroying all the hardware attached to the MB.


----------



## Geofrancis (Jun 28, 2009)

SonDa5 said:


> Interesting mod. I think it was done out of desperation.
> 
> Far out!
> 
> ...



You are not modifying the power pins, which are in the first part of the pci-e slot. The second part carries the pci-e data lanes, which are grouped as 1x, 4x, 8x and 16x. All this mod does is limit the number of pci-e lanes the card can use, from 16x down to 1x; it should also work for cutting down to 4x or 8x lanes.
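For reference, here's a small sketch of where those lane groups end on the edge connector, assuming the standard pin numbering from the pinouts.ru table linked earlier in the thread (the boundary numbers are my own reading of that table - double-check against it before cutting anything):

```python
# Per-side pin position where each lane group ends on a standard PCI-E
# edge connector: the power/ground/SMBus tab is pins 1-11, the key notch
# sits between 11 and 12, and the data lanes start at pin 12.
LAST_PIN = {1: 18, 4: 32, 8: 49, 16: 82}  # assumed from the pinout table

def pins_to_mask(target_width: int) -> range:
    """Pin positions (per side) to tape over to force a narrower link."""
    return range(LAST_PIN[target_width] + 1, LAST_PIN[16] + 1)

masked = pins_to_mask(1)
print(len(masked))  # 64 pin positions per side taped to drop x16 down to x1
```

This also matches the OP's "leave only the first 7 data connectors showing" rule of thumb, since pins 12 through 18 are seven positions.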


----------



## thebeephaha (Jun 28, 2009)

I modded an 8600GT to 8x for a server motherboard. Worked great.


----------



## Taz100420 (Jul 5, 2009)

Would this work on a 9500GT? I have it sittin around and I use my 8600GTS over the 9500GT. And would it work to use the 8600GTS in the X16 spot and 9500GT in the X1 for Physics?


----------



## Yukikaze (Jul 5, 2009)

Taz100420 said:


> Would this work on a 9500GT? I have it sittin around and I use my 8600GTS over the 9500GT. And would it work to use the 8600GTS in the X16 spot and 9500GT in the X1 for Physics?



It *should* work with any PCI-E card. But killing/castrating a 9500GT is a bit of a shame, IMHO.


----------



## Taz100420 (Jul 5, 2009)

Yukikaze said:


> It *should* work with any PCI-E card. But killing/castrating a 9500GT is a bit of a shame, IMHO.



Oh I wont cut the card lol Id just notch my X1 slot
I basically got the 9500GT for free


----------



## Yukikaze (Jul 5, 2009)

Taz100420 said:


> Oh I wont cut the card lol Id just notch my X1 slot
> I basically got the 9500GT for free



That should work.


----------



## Taz100420 (Jul 5, 2009)

Yukikaze said:


> That should work.



Ok Ill do some modding in a while and see what results I get.


----------



## dr emulator (madmax) (Jul 28, 2009)

lemonadesoda said:


> Nice table of PCI-e x16 pinouts: http://pinouts.ru/Slots/pci_express_pinout.shtml
> 
> Yukikaze, why did you slice off the whole finger length except for lane 1? Wouldnt it be better to dremel out the single lane required to "jump" the plastic slot on the mainboard, leaving a 15 lane GPU? Goodness knows if a 15 lane GPU would work.  Or a wider slot, leaving a 12 lane GPU?  While you dont need it now, it might come in useful at a later date.
> 
> ...

















a big thanks for the link to the pci express info, i love having pinouts for anything to do with pc's - thanks again


----------



## cobular (Jul 30, 2009)

*They do this!*

Just thought I'd let you all know there is a motherboard out there filling the ITX need for fitting a graphics card without supporting a full x16 link, using something similar to the "notching" approach. It has a physically full-sized PCIe x16 slot, but it is only pinned out as x4. It's the J&W MINIX 780G-SP128MB; if you find a top-down pic (like the one you can hopefully see below) you can see the pins stop after x4.


----------



## Mussels (Jul 30, 2009)

the minix is old news by now, but still one hell of a HTPC mobo.


----------



## Mr.Amateur (Aug 3, 2009)

jesus christ, card modding?  you guys just blew my mind.  Applause everywhere!  Can't wait to see what will happen next


----------



## EnergyFX (Aug 7, 2009)

this is nuts!

you're nuts!


----------



## Yukikaze (Aug 7, 2009)

EnergyFX said:


> this is nuts!
> 
> you're nuts!



I'll take that as a compliment


----------



## lxfguits (Sep 21, 2009)

I'm curious whether you could run a quad SLI/CrossFireX setup like this, and if it would actually be worth doing with some older 256-bit cards.


----------



## Mussels (Sep 22, 2009)

lxfguits said:


> I'm curious about if you could run a quad SLI/CrossFireX setup like this and if it would actually be worth doing with some older 256bit cards.



cant be done. the 1x slots are normally from the southbridge, and you cant mix NB and SB PCI-E lanes for crossfire or SLI


----------



## lxfguits (Sep 22, 2009)

Ahh right, kinda forgot about that one...


----------



## newtekie1 (Sep 22, 2009)

Mussels said:


> cant be done. the 1x slots are normally from the southbridge, and you cant mix NB and SB PCI-E lanes for crossfire or SLI



Then how did the 965P and P35 manage to support crossfire?


----------



## Mussels (Sep 23, 2009)

newtekie1 said:


> Then how did the 965P and P35 manage to support crossfire?



P35 didnt mix, i know that much. it was just 16x/4x from the NB


----------



## newtekie1 (Sep 28, 2009)

Mussels said:


> P35 didnt mix, i know that much. it was just 16x/4x from the NB



It was x16 from the northbridge, and x4 from the south.  The northbridge only had 16 total lanes to use, so it would be impossible to do x16/x4 entirely from the P35 northbridge.

Unless I'm reading the diagram and specs for P35 wrong:


----------



## Mussels (Sep 29, 2009)

you may well be right.

i assumed it wasnt, but hell - i ran crossfire off my P35 with no issues.

i guess i failed my own saying "assumption is the mother of all f$%kups"


----------



## Patriot (Oct 27, 2009)

hey I am interested in giving this a shot... how do I find out how many pins to leave?
not too cheap an error to make, but you could kill two cards and still come out ahead.

I have a coworker that is using a server and could use a gfx card to bump up his monitor to full res... but x1 slots are all that are free.

granted he could switch systems with a bit of pain... but I figured I could surprise him with this... I have a few old quatros I could try it on as well 

(I don't know how I found this thread but I am glad I did... I read yalls news daily and am highly active on OCW forums)


----------



## lemonadesoda (Oct 27, 2009)

mlee49 said:


> So are you wanting to try just the 12th pin taped over?  Can you use a pic of the gfx card and show me what you want? I'd be willing to try it



Mlee,

Go to this link: http://pinouts.ru/Slots/pci_express_pinout.shtml

Tape over channel 15.  (Last 5 pins). Then you have an x15 card. Boot it up. See if it works as an x8, x12, x15 or x16.

Second test. Tape over only channel 2. Then you also have an x15 card. Boot and see if it works as an x1, x4, x8, x15 or x16.

Proof of concept. Please share results.


----------



## Yukikaze (Oct 27, 2009)

Patriot said:


> hey I am interested in giving this a shot... how do I find out how many pins to leave?
> not a too cheap an error but you still make to dead cards and be ahead.
> 
> I have a coworker that is using a server and could use a gfx card to bump up his monitor to full res... but x1 slots are all that are free.
> ...



Tape the pins over and leave only the first 7 showing to test before chopping. You can see this in the pics in the first post.


----------



## mlee49 (Oct 28, 2009)

lemonadesoda said:


> Mlee,
> 
> Go to this link: http://pinouts.ru/Slots/pci_express_pinout.shtml
> 
> ...



Sorry I cant do the tests at the moment, I sold the card and shipped it out.


----------



## suraswami (Nov 11, 2009)

This is sick.  I am going to CF 2 x 2600XTs and cut an 8400GS to put in a 1x slot for physx and see how it performs 

Does a 6200 or 7200 series card support Physx or only 8xxx and above support?


----------



## Mussels (Nov 12, 2009)

suraswami said:


> This is sick.  I am going CF 2 x 2600XT and cut a 8400GS, put in a 1x slot for physx and see how it performs
> 
> Does a 6200 or 7200 series card support Physx or only 8xxx and above support?



8series and up, needs at least 64 shaders and 256MB of ram


----------



## suraswami (Nov 12, 2009)

Mussels said:


> 8series and up, needs at least 64 shaders and 256MB of ram



so will a 8400gs 256mb work?  I am getting one for real cheap.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814141068&Tpk=biostar 8400gs


----------



## Mussels (Nov 12, 2009)

suraswami said:


> so will a 8400gs 256mb work?  I am getting one for real cheap.
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814141068&Tpk=biostar 8400gs



i think the minimum people recommend is an 8500


----------



## SaperPL (Nov 21, 2009)

I'm willing to do this mod on a spare old X1300PRO, but I've run into a problem with my mobo. When I tape over even just the last five pins on the card, I get a boot error at POST and the card doesn't work. Is this mod supposed to work only with PCI-E 2.0 cards and slots? What if I only have PCI-E 1.1? I wanted to do this mod to test some things, like multiseat configs on my system, and the possibility of accelerating CADs like SolidWorks on normal gaming cards via a secondary unconnected card - same idea as accelerating PhysX on systems with powerful Radeons by adding a cheap GeForce. Can anyone help me figure out what I'm doing wrong?


----------



## r9 (Nov 21, 2009)




----------



## Yukikaze (Nov 21, 2009)

SaperPL said:


> I'm willing to do this mod on a spare old X1300PRO, but I've run into a problem with my mobo. When I tape over even just the last five pins on the card, I get a boot error at POST and the card doesn't work. Is this mod supposed to work only with PCI-E 2.0 cards and slots? What if I only have PCI-E 1.1? I wanted to do this mod to test some things, like multiseat configs on my system, and the possibility of accelerating CADs like SolidWorks on normal gaming cards via a secondary unconnected card - same idea as accelerating PhysX on systems with powerful Radeons by adding a cheap GeForce. Can anyone help me figure out what I'm doing wrong?



Have you tried taping over all pins except for the power ones and the 7 first data ones, exactly as I did ? 

I am not quite sure how it would respond if you tape off some random number of connectors.


----------



## SaperPL (Nov 21, 2009)

I did that on my first attempt and it didn't work, then I checked how it would behave with only the last lane disabled, and my system still doesn't boot.


----------



## Yukikaze (Nov 21, 2009)

SaperPL said:


> I did that on my first attempt and it didn't work, then I checked how it would behave with only the last lane disabled, and my system still doesn't boot.



I think it has nothing to do with PCI-E 1.1 vs. PCI-E 2.0, but rather with some self-check the X1300Pro performs. It probably detects the lane failure and refuses to work, even though that goes against what I know of the PCI-E spec (a device must train to the maximum number of working lanes, but must still work with even a single lane operational). It might also be some issue with your motherboard not liking the idea for some reason. My Abit I-N73HD refused to boot with the chopped-off card in the PCI-E x1 slot, for example, whereas both my DFIs (P45 and X58) boot with it in any PCI-E slot and I get a video signal.
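On Linux, the kernel exposes the link widths it read from each device's PCIe Link Capabilities and Link Status registers through sysfs, which is a cheap way to check whether a card actually trained down to x1 as the spec describes. A minimal sketch, assuming the standard `/sys/bus/pci/devices` layout (the helper name `link_widths` is mine, not from any tool mentioned in the thread):

```python
#!/usr/bin/env python3
"""Report each PCIe device's maximum vs. negotiated link width (Linux).

Reads the sysfs attributes max_link_width / current_link_width, which the
kernel fills in from the device's Link Capabilities and Link Status
registers after link training.
"""
from pathlib import Path


def link_widths(sysfs_root="/sys/bus/pci/devices"):
    """Yield (device_address, max_width, current_width) tuples."""
    for dev in sorted(Path(sysfs_root).iterdir()):
        cap = dev / "max_link_width"
        cur = dev / "current_link_width"
        if cap.exists() and cur.exists():
            # int() tolerates the trailing newline sysfs attributes carry.
            yield (dev.name, int(cap.read_text()), int(cur.read_text()))


if __name__ == "__main__":
    root = Path("/sys/bus/pci/devices")
    if root.exists():
        for addr, cap, cur in link_widths():
            note = "  <-- trained below capability" if cur < cap else ""
            print(f"{addr}: x{cap} capable, running at x{cur}{note}")
```

A chopped x16 card sitting in an x1 slot should show up as "x16 capable, running at x1" if the link trained correctly.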


----------



## SaperPL (Nov 21, 2009)

The same thing happens when I tape off my Quadro FX370 - it has to be something with my mobo, but I dunno how to unlock it.


----------



## Yukikaze (Nov 21, 2009)

SaperPL said:


> The same thing happens when I tape off my Quadro FX370 - it has to be something with my mobo, but I dunno how to unlock it.



Definitely something with the mobo, though. What mobo is that?

Can you force the PCI-E x16 slot to run in PCI-E x1 mode via the BIOS?


----------



## SaperPL (Nov 21, 2009)

It's an ASUS K8N-DL - two dual-core Opterons, nForce4 Professional 2200, two PCI slots, one x16 slot, one x1 slot (Sandra shows SLI for this mobo, so the x1 could probably have been a full-length slot, but being a server/workstation board they probably didn't think of that when designing it). In the BIOS, for PCI-E I can only manipulate spread spectrum, payload size from 128 to 4096 bytes, and the init-display-first option (PCI/PCI-E to choose from), and none of the above does the trick for this mod.


----------



## Yukikaze (Nov 21, 2009)

Well, I have no idea what might be causing it. I haven't tried this on a large amount of motherboards, so I really don't know how I can help you out.


----------



## Taz100420 (Nov 22, 2009)

Yeah, it seems my Abit AN-52 doesn't like a vid card in the x1 slot either. The card's fan turns on, but nothing pops up in the Nvidia control panel or device manager. And if I try to boot from just the x1 slot, I get the no-graphics-card beeps lol


----------



## SaperPL (Nov 22, 2009)

I'll try installing the card in the PCI-E x1 slot on a cheap mobo with nForce4 (I'll cut through the end of the slot) next weekend at home. Until then, let's suppose that nForce4 doesn't support PCI-E x1 VGAs.


----------



## lemonadesoda (Nov 22, 2009)

Hats off! to a great thread, explanation, and method.

But why not just buy one of these?







A PCIe riser card. Less than $10. No risk. Original card stays intact.


----------



## Yukikaze (Nov 22, 2009)

lemonadesoda said:


> Hats off! to a great thread, explanation, and method.
> 
> But why not just buy one of these?
> 
> ...



A riser means the card will not fit in a standard slot, since it will be higher.

Other than that, no real reason not to buy that.

Well, there's the chance you wanna do it to some old card you don't mind killing and don't want to spend money on anything else (or cannot obtain one), but that is a rather weak argument.


----------



## lemonadesoda (Nov 22, 2009)

LOL. Of course it will fit, unless it is a slimline case. Look inside your case and you will see about 3-5cm of headroom in an ATX case, and more in a tower or server case. The 2cm used by the riser will not cause a problem. You just might need a longer screw to hold the card in place. And it certainly isn't a problem in this example, because THIS THREAD USES A LOW-PROFILE CARD!!!

Oh, and for anyone wanting to practice different channel setups, get a bunch of these, and cut them to x4, x8 etc. and have some fun


----------



## Yukikaze (Nov 22, 2009)

lemonadesoda said:


> LOL. Of course it will fit, unless it is a slimline case. Look inside your case and you will see about 3-5cm of headroom in an ATX case, and more in a tower or server case. The 2cm used by the riser will not cause a problem. ESPECIALLY SINCE THIS THREAD USES A LOW-PROFILE CARD!!!
> 
> Oh, and for anyone wanting to practice different channel setups, get a bunch of these, and cut them to x4, x8 etc. and have some fun
> 
> http://img.techpowerup.org/091122/Capture186.jpg



You miss the point. 

It will fit the case, but you won't be able to apply the screw (or the tool-less clip) to secure it in place, and if you do it to a non-LP card, the top video output (say, DVI) may no longer fit, or may be blocked by the part of the case to which expansion cards are attached.


----------



## SaperPL (Nov 22, 2009)

It's a good choice, but in my country's stores such stuff costs a horrible amount, over $60, and it's hard to even find.


----------



## lemonadesoda (Nov 22, 2009)

No, I didn't miss the point. You did. The point is: *There are options*.

What works for you _might not be the "right" solution for others_. I'm not forcing you to adopt the riser approach. But it is probably the easiest and most risk-free approach, and totally suitable for some other people, depending on their setup. And it's a great way to test if the card/mainboard is compatible with the hack. There are many stories in this thread of it not working for some... and in the process, they don't know if they had a bad chop or an incompatible mainboard. This lets them test... and even KEEP it if it works for them.


----------



## Yukikaze (Nov 22, 2009)

lemonadesoda said:


> No, I didn't miss the point. You did. The point is: *There are options*.



Options which are not viable in pretty much any standard case setup are not really options, do you not agree?

Anyway, this argument is pointless.


----------



## Yukikaze (Nov 22, 2009)

lemonadesoda said:


> No, I didn't miss the point. You did. The point is: *There are options*.
> 
> What works for you _might not be the "right" solution for others_. I'm not forcing you to adopt the riser approach.



Eh, in my first reply to you I said the only reason not to use it is that the card will not fit in a standard case afterwards. You tried to explain why it would, and I showed you why it will not.

I never said others should not use it; you came up with that one yourself.

Like I said, pointless.


----------



## SaperPL (Nov 26, 2009)

I just cut off the end of the PCI-E x1 slot on my nf4 4x mobo and the same thing happens - the card powers up but there's no signal. Additionally, Windows doesn't recognise any new devices. Dunno if there's anything else to try.


----------



## lemonadesoda (Nov 26, 2009)

What a shame you didn't test it first with one of these:


----------



## SaperPL (Nov 26, 2009)

I don't get your point - this mobo isn't under warranty anymore and I don't give a **** about this PC - it has a Venice 3000+ and 1 gig of RAM on board. My main PC's mobo (with quad K8 cores and 4 gigs of RAM) is still untouched. I'm thinking about a quick mobo replacement at the end of the year for Gigabyte's GA-2CEWH. The point is that the CK804 doesn't support VGA cards in the secondary and tertiary PCI-E slots, or doesn't support x16 lane cutting, and if you want two cards you need to get a board with nForce4 SLI (which is really two x8-mode slots in SLI mode) or a dual-southbridge mobo like the nf4 pro 2200 + nf4 pro 2050 on the GA-2CEWH or NFPIK8AA, or any other like the Tyan Thunders. As I said before, I don't wanna waste cash on something like this if I'm not sure it's gonna work.


----------



## chalk210 (Jun 18, 2010)

This is a beautiful thread. Megakudos to the OP. It looks like the cards this has been attempted on have all been models that also have a stock x1 alternative - has anybody tried it on a card that has never been marketed with an x1 equivalent? The leftover pins will be the same, but I wonder if there could be limitations imposed by the drivers, etc.?


----------



## Geofrancis (Jun 18, 2010)

You could get something like this to check if it works before you cut anything:
http://cgi.ebay.co.uk/StarTech-PCI-...ting_DesktopComponents_RL&hash=item439f68b5c3


----------



## chalk210 (Jun 18, 2010)

Yeah, I'm going to start with something similar to that. I'm going to get a 1x riser, which you can get for a song (under $5 post paid), try to cut an opening in it, and just let the rest of the card hang out. That's just to try it out. I usually leave my case all apart anyway.


----------



## Yukikaze (Jun 18, 2010)

chalk210 said:


> This is a beautiful thread. Megakudos to the OP.  It looks like the cards that variations have been attempted have been models that have a stock x1 alternative-- has anybody tried it on a card that has not been marketed with an x1 equivalent? The left over Pins will be the same, but I wonder if there could be limitations imposed from the drivers, etc?



I have not tried it on video cards, but I do it to all sorts of NICs and NIC prototypes at work. Since the prototypes are FPGA-based, they are large and bulky and do not fit in a case, so they use PCIe extenders anyway. I have often used PCIe x1 extenders with x2, x4 and x8 cards and it NEVER failed to work, because a connection with fewer lanes is part of the PCIe standard.
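For anyone wanting to confirm the negotiated width on such a setup, `lspci -vv` prints the device's Link Capabilities (LnkCap) and Link Status (LnkSta) lines, and a trained-down card shows a smaller width in LnkSta than in LnkCap. A rough parser sketch, under the assumption that the output follows the usual `Width xN` formatting (the `trained_width` helper and the sample text are mine; real `lspci` output varies slightly between versions):

```python
import re

# Matches "LnkCap: ... Width x16" and "LnkSta: ... Width x1" style lines.
LNK_RE = re.compile(r"(LnkCap|LnkSta):.*?Width x(\d+)")


def trained_width(lspci_vv_text):
    """Return (capable_width, current_width) parsed from `lspci -vv` output
    for a single device, or None for a field that is absent."""
    widths = {}
    for m in LNK_RE.finditer(lspci_vv_text):
        widths[m.group(1)] = int(m.group(2))
    return widths.get("LnkCap"), widths.get("LnkSta")


# Hypothetical output for a chopped x16 card running in an x1 slot:
sample = """\
01:00.0 VGA compatible controller: ATI Radeon HD 4350
        LnkCap: Port #0, Speed 5GT/s, Width x16, ASPM L0s L1
        LnkSta: Speed 2.5GT/s, Width x1, TrErr- Train-
"""
print(trained_width(sample))  # (16, 1): an x16-capable card trained at x1
```

If LnkSta reports x0 or the device is missing entirely, the link never trained, which matches the "fan spins but no signal" failures reported earlier in the thread.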


----------



## newtekie1 (Jun 18, 2010)

There are _some_ graphics card and motherboard combinations that won't allow the card to run at x1. I know W1z couldn't get his GTX480 to boot at x1. Also, when Tom's Hardware did their PCI-E scaling article, they had to go through several motherboards and BIOSes before they found one that would let the cards work at x1. Though both times it was with high-end cards (for the time) that I wouldn't want to be running at x1 anyway.


----------



## chalk210 (Jun 18, 2010)

I wonder how a geforce 9500 would do in a 1x lane.


----------



## chalk210 (Jun 29, 2010)

Okay, gang, I've decided to give it a go.

For the first step, I got this 1x riser for a couple of bucks that I melted on one end to hopefully receive a 16x card and work right. 

I've ordered a used Radeon 4350 x16 for $15USD (will take 2 weeks to ship to me, part of the low cost of the deal)


I've rendered a spare bracket to account for additional height if necessary. $0, spare parts. I've got plenty of space in the other directions.


Total cost under $20 compared to stock 1x versions of the card that are selling for $90-$120USD.



If that doesn't work, I may consider dremeling it in the manner of the original poster.

Why?

Because we can


----------



## chalk210 (Sep 15, 2010)

Sorry, if this is an inappropriate thread resurrection, but in case there are still people curious, I have now performed this operation on both a radeon 4350 and now a geforce 9500.

Proceed at your own risk.

I prefer the results overall with the slightly more expensive 9500, but the 4350 was not without a few merits especially in physical dimensions.

Also, I would like to announce that I think I've coined a word for this procedure. I don't think anybody has thought of it yet but apologies if I'm not the first.... *"underlaning"* So, I have successfully underlaned both a x16 radeon 4350 and a x16 geforce 9500 gt in a x1 test system. 

Absolute kudos for the OP for giving me the courage to give it a try. I had a lot of fun.


----------



## Yukikaze (Sep 15, 2010)

chalk210 said:


> Sorry, if this is an inappropriate thread resurrection, but in case there are still people curious, I have now performed this operation on both a radeon 4350 and now a geforce 9500.
> 
> Proceed at your own risk.
> 
> ...



Cool to know I've sent more people on the way towards graphic-card-sawing-insanity 

Also very happy to hear the cards survived!


----------



## muchgooder (Mar 9, 2011)

Sorry for resurrecting an old thread (again).  

I've got a 440sc and I've been using it as a media streamer (to my ps3) for a while now.  I'd like to get a projector in my home theater and turn this server into a true htpc. 

Will the 3450 be able to play 3d movies without problem?  I've read mixed reviews and I wasn't quite sure how to interpret the screen shots in this thread.  

If not the 3450, is there a better card that I can snip to fit in this case?  They are fairly cheap on ebay and I found out the hard way how expensive these motherboards are.    I did pick up a riser card on ebay already so that I can test before I snip.  

Any advice would be appreciated!


----------



## Mussels (Mar 10, 2011)

I don't think the 3450 can do 3D movies; I think you need a 6K-series card for that.

I'm pretty sure there are physically small 6K cards out and about.


----------



## muchgooder (Mar 10, 2011)

Thanks for the reply.  It looks like only these nVidia cards will connect to a projector at this time:

http://www.nvidia.com/object/3d-vision-requirements.html

Any idea if I would be able to install this card into my 440?

http://www.nvidia.com/object/product-geforce-gt-430-us.html

Once again, I appreciate any advice that anyone might have as I am a noob when it comes to these cards.  I have two 440's and I would hate to have to go build another pc just to watch 3d movies on my projector.

(for those that do not know, projectors are "3d ready" - they need middleware to convert from devices such as a ps3)


----------

