# internal network speed!



## freaksavior (Mar 22, 2012)

What is your INTERNAL network speed between machines? 

Both of my servers are using gigabit NICs; one is a PCI card, the other is onboard. I understand it has something to do with overall system performance, but right now I am getting 15.4 MB/s, which is only slightly faster than what the onboard 100 Mbps NIC gave me. 

The theoretical speed is 125 MB/s and I would love to see that, but why the heck am I getting a tenth of that?
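For context, the 125 MB/s figure and a more realistic ceiling work out like this (back-of-the-envelope Python; the header sizes are the standard Ethernet/IP/TCP ones, nothing measured):

```python
# Back-of-the-envelope math for the "125 MB/s" gigabit figure
# (nominal numbers, not a measurement).
line_rate_mbps = 1000              # 1 Gb/s link
raw_mb_per_s = line_rate_mbps / 8  # 125 MB/s: the raw ceiling everyone quotes

# Real transfers carry protocol overhead on every frame:
mtu = 1500
tcp_payload = mtu - 20 - 20         # minus IP and TCP headers -> 1460 bytes
wire_bytes = mtu + 14 + 4 + 8 + 12  # + Ethernet header, FCS, preamble, gap -> 1538
practical_mb_per_s = raw_mb_per_s * tcp_payload / wire_bytes

print(f"raw ceiling:     {raw_mb_per_s:.0f} MB/s")
print(f"after overhead: ~{practical_mb_per_s:.0f} MB/s")           # ~119 MB/s
print(f"15.4 MB/s is only {15.4 / raw_mb_per_s:.0%} of the link")  # ~12%
```

So ~118-119 MB/s is the best case on the wire; anything far below that is the disks, the switch, or the cabling.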


----------



## Easy Rhino (Mar 22, 2012)

typical speeds for 1-drive SATA setups across a network. which is why it is hilarious that all of the NAS manufacturers market their 1-gig network ports when you will never, ever get close to that.


----------



## Munki (Mar 22, 2012)

Easy Rhino said:


> typical speeds for 1-drive SATA setups across a network. which is why it is hilarious that all of the NAS manufacturers market their 1-gig network ports when you will never, ever get close to that.



That's exactly it. Stupid hard drives!


----------



## Maban (Mar 22, 2012)

I get capped at 115MB/s PC to PC with ATTO. Why are NAS systems so slow?


----------



## dir_d (Mar 22, 2012)

Maban said:


> I get capped at 115MB/s PC to PC with ATTO. Why are NAS systems so slow?



Are you using a switch? Home switches can't hit that speed sustained.


----------



## Maban (Mar 22, 2012)

dir_d said:


> Are you using a switch? Home switches can't hit that speed sustained.



It's direct, one of the benefits of dual NICs.


----------



## Munki (Mar 22, 2012)

Maban said:


> It's direct, one of the benefits of dual NICs.



I am not sure about the other machine, but you have to consider you have an SSD in your PC.


----------



## dir_d (Mar 22, 2012)

Maban said:


> It's direct, one of the benefits of dual NICs.



Yes, that's his problem. To get decent speed he would need an enterprise switch with the throughput he needs. Then he would probably have to partition the server disks on the outer edge of the platter to simulate short-stroking, to get the wanted speeds consistently.


----------



## Maban (Mar 22, 2012)

Munki said:


> I am not sure about the other machine, but you have to consider you have an SSD in your PC.



I tried with a RAM drive, an SSD, 2x 500GB RAID 0, and a nearly full 500GB drive (16GB left at the very end of the drive). Even the nearly full drive got me 85 MB/s. The RAID array got 115 MB/s write but only ~85 MB/s read.


----------



## bencrutz (Mar 22, 2012)

freaksavior said:


> What is your INTERNAL network speed between machines?
> 
> Both of my servers are using gigabit NICs; one is a PCI card, the other is onboard. I understand it has something to do with overall system performance, but right now I am getting 15.4 MB/s, which is only slightly faster than what the onboard 100 Mbps NIC gave me.
> 
> The theoretical speed is 125 MB/s and I would love to see that, but why the heck am I getting a tenth of that?


yeah, that's kinda low

what kind of server is that? what LAN cables are you running? how long are the cable runs?

for a typical file server/proxy server it usually peaks at 40-ish MB/s (non-RAID) and 70-ish MB/s (RAID) and tops out at around 100 MB/s (fast PCIe SSD) - with good gigabit switches and cables


----------



## Solaris17 (Mar 22, 2012)

because you're not running SSDs or RAIDed Cheetahs.


----------



## bencrutz (Mar 22, 2012)

Maban said:


> It's direct, one of the benefits of dual NICs.



have you tested teaming/bonding on them? and if you have, how much bandwidth could you squeeze out of them?
just curious


----------



## Maban (Mar 22, 2012)

bencrutz said:


> have you tested teaming/bonding on them? and if you have, how much bandwidth could you squeeze out of them?
> just curious



If I remember and if I can figure out how to do it, I will try that out tomorrow.


----------



## freaksavior (Mar 22, 2012)

Cables are Cat5e and are run through the house. I have had upwards of 60+ MB/s sustained on a single-drive write over the gigabit LAN, but I'm not getting that between these two machines.

I am thinking it's too much overhead from data flying across my network between all the devices on it. I have probably around 20 devices in total, wired and wireless, using the network at the same time. 

I was mostly curious what others got, but my question ended up turning into how I can make it faster.


----------



## Solaris17 (Mar 22, 2012)

freaksavior said:


> Cables are Cat5e and are run through the house. I have had upwards of 60+ MB/s sustained on a single-drive write over the gigabit LAN, but I'm not getting that between these two machines.
> 
> I am thinking it's too much overhead from data flying across my network between all the devices on it. I have probably around 20 devices in total, wired and wireless, using the network at the same time.
> 
> I was mostly curious what others got, but my question *ended up turning into how I can make it faster*



nothing new there. I'd say welcome to TPU, but you and I already learned these ropes.


----------



## bencrutz (Mar 22, 2012)

Maban said:


> If I remember and if I can figure out how to do it, I will try that out tomorrow.


thanks, looking forward to the result 



freaksavior said:


> Cables are Cat5e and are run through the house. I have had upwards of 60+ MB/s sustained on a single-drive write over the gigabit LAN, but I'm not getting that between these two machines.
> 
> I am thinking it's too much overhead from data flying across my network between all the devices on it. I have probably around 20 devices in total, wired and wireless, using the network at the same time.
> 
> I was mostly curious what others got, but my question ended up turning into how I can make it faster.


you could unplug the other devices from the network and test the bandwidth between the servers,
and if the bandwidth does not improve, try new ethernet cables, just to make sure it's not the cable that's degrading. you might also want to check the ethernet switch, or simply switch to other port(s)


----------



## freaksavior (Mar 22, 2012)

Solaris17 said:


> nothing new there. I'd say welcome to TPU, but you and I already learned these ropes.



heh yeah, I know the ropes  



bencrutz said:


> thanks, looking forward to the result
> 
> 
> you could unplug the other devices from the network and test the bandwidth between the servers,
> and if the bandwidth does not improve, try new Ethernet cables, just to make sure it's not the cable that's degrading. you might also want to check the Ethernet switch, or simply switch to other port(s)



Going to try it out tomorrow actually, plugging straight from server to server.


----------



## freaksavior (Mar 23, 2012)

Server to BackupServer

Server to server local

BackupServer to BackupServer


----------



## bencrutz (Mar 23, 2012)

have you checked the link state, just to make sure it is running in gigabit mode? also you might want to test it with a new, shorter LAN cable, and/or with a PCIe gigabit ethernet add-in board if you happen to have one lying around


----------



## freaksavior (Mar 23, 2012)

Yeah, they are all set to run 1000 Mbps full duplex. Shorter cables are not an option; it's wired through my house.

Tomorrow (later today) I am going to put them together and do a crossover. Hope that narrows the shit down.


----------



## Easy Rhino (Mar 23, 2012)

just typical speeds man. nothing to worry about.


----------



## Maban (Mar 24, 2012)

I set up teaming on both systems; it showed 2 Gb, but it would absolutely not get above gigabit during the transfer. I tried all types of teaming modes too.


I really didn't mean to sidetrack this thread but umm yeah...


----------



## Jetster (Mar 24, 2012)

I just bought a Linksys E4200, so I tested the speed from my desktop to my HTPC, platter to platter. A 10.5 GB movie took 86 seconds; 142 MB/s was the reported speed.


----------



## Aquinus (Mar 24, 2012)

For everyone using gigabit, I hope you're using CAT6 cables if you really want full gigabit speeds. CAT5 will do gigabit and is usually fine at short distances, but over longer stretches your performance can degrade quickly. You may want to test using a RAM disk and make sure that caching is disabled. Windows loves to cache.
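A quick way to take the disks and the file cache out of the picture entirely is a memory-to-memory socket test (iperf/iperf3 is the proper tool for this; the Python below is just a sketch of the idea, run over loopback):

```python
# Memory-to-memory throughput test: no disks, no file cache involved.
# (iperf/iperf3 is the proper tool; this just demonstrates the idea locally.)
import socket, threading, time

CHUNK = b"\0" * 65536
SECONDS = 2.0

def sink(srv, result):
    # Accept one connection and count every byte until the sender closes.
    conn, _ = srv.accept()
    total = 0
    while True:
        data = conn.recv(65536)
        if not data:
            break
        total += len(data)
    conn.close()
    result["bytes"] = total

srv = socket.socket()
srv.bind(("127.0.0.1", 0))   # any free port
srv.listen(1)
result = {}
t = threading.Thread(target=sink, args=(srv, result))
t.start()

cli = socket.create_connection(srv.getsockname())
start = time.time()
while time.time() - start < SECONDS:
    cli.sendall(CHUNK)
cli.close()
t.join()
srv.close()
elapsed = time.time() - start

print(f"~{result['bytes'] / elapsed / 1e6:.0f} MB/s over loopback")
```

Point the client at the other machine's IP instead of loopback and you get a pure network number; if that hits ~110+ MB/s but file copies don't, the drives are the bottleneck.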


----------



## freaksavior (Mar 24, 2012)

Easy Rhino said:


> just typical speeds man. nothing to worry about.



Yeah, I can see that now since I was able to get a sustained 40+ MB/s directly connected. I still think it should be faster, but 40 MB/s is fine by me right now.


----------



## bencrutz (Mar 25, 2012)

Maban said:


> I set up teaming on both systems; it showed 2 Gb, but it would absolutely not get above gigabit during the transfer. I tried all types of teaming modes too.
> 
> 
> I really didn't mean to sidetrack this thread but umm yeah...


bummer
any idea why? driver or OS, perhaps?
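or it could just be how aggregation works: standard teaming/bonding modes hash each flow onto a single physical link so packets stay in order, which pins one file copy to one NIC no matter how many are teamed. a toy sketch of the principle (the hash function and fields here are illustrative, not any particular driver's):

```python
# Toy model of per-flow hashing in link aggregation (LACP/static teaming).
# The hash and field choice are illustrative, not any specific driver's,
# but the principle is real: one flow always maps to one physical link.
import hashlib

links = ["nic0", "nic1"]

def pick_link(src_ip, dst_ip, src_port, dst_port):
    # Deterministic hash of the flow tuple selects the outgoing NIC.
    key = f"{src_ip}:{src_port}->{dst_ip}:{dst_port}".encode()
    return links[hashlib.sha256(key).digest()[0] % len(links)]

# A single big file copy is one flow, so it rides one NIC the whole time:
flow = ("192.168.1.10", "192.168.1.20", 49500, 445)
print(pick_link(*flow))  # same link on every call

# Many distinct flows (different source ports) spread across both links:
chosen = {pick_link("192.168.1.10", "192.168.1.20", p, 445)
          for p in range(49000, 49100)}
print(sorted(chosen))
```

so teamed NICs help when many clients hit the server at once, not for a single transfer.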



freaksavior said:


> Yeah, I can see that now since I was able to get a sustained 40+ MB/s directly connected. I still think it should be faster, but 40 MB/s is fine by me right now.


well, that's more like it 
btw, Ubiquiti ToughCable is pretty reliable, but a bit expensive. i've used it for outdoor installations (30-40m towers, up to 50m cable length without signal degradation)


----------



## newtekie1 (Mar 25, 2012)

dir_d said:


> Are you using a switch? Home switches can't hit that speed sustained.



They certainly can; I have a cheap Rosewill switch that sustains 90-100 MB/s easily.



Aquinus said:


> For everyone using gigabit, I hope you're using CAT6 cables if you really want full gigabit speeds. CAT5 will do gigabit and is usually fine at short distances, but over longer stretches your performance can degrade quickly. You may want to test using a RAM disk and make sure that caching is disabled. Windows loves to cache.



Cat5*e* will handle gigabit up to the official distance limit (100m, ~328ft); that is what it is rated for and that is what it will handle without degradation of the signal. CAT6 was developed with 10 Gb/s networks in mind; it isn't necessary for gigabit at all.



freaksavior said:


> What is your INTERNAL network speed between machines?



To answer your question directly:

That is: 2x 5900RPM 1.5TB RAID0 --10ft Cat5e--> Rosewill 8-port Gb Switch --~100ft Cat5e--> Linksys 5-port Gb Switch --6ft Cat5e--> 3x 7200RPM 1.5TB RAID5

And I still think this is being limited by the hard drives, not the network.


----------



## Steevo (Mar 25, 2012)

Backplane bandwidth.


----------

