Anyone using either infiniband or 10Gbps Ethernet at home?

NeverDie

Senior Member
Moving large ISOs to a server over gigabit Ethernet to run as VMs.  It's not terrible, but it takes a while.  Roughly a minute for the Windows 10 Technical Preview.
 
Here's the upload of Windows 10 Technical Preview:
 
[Attachment: isoupload.png]
 
 
That depends on where the bottleneck is.  If you aren't saturating the link, then the bottleneck is not your network connection.  Many file transfer protocols (e.g. CIFS, FTP) would have a hard time even reaching 1 Gbps, and even the ones that can (NFS) usually require tuning to get there.
 
It is highly doubtful you would see any benefit from 10 Gbps...
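The point about protocols struggling to reach line rate has a hard ceiling behind it: even a perfect TCP stream can't use the full 1 Gbps, because Ethernet, IP, and TCP headers eat part of every frame. A rough back-of-the-envelope sketch (assuming standard 1500-byte MTU frames and ignoring CIFS/NFS chattiness entirely, which only makes things worse):

```python
# Best-case TCP goodput on gigabit Ethernet, ignoring protocol round trips.
# All sizes in bytes; assumes standard 1500-byte MTU, no TCP options.

LINK_BPS = 1_000_000_000          # 1 Gbps line rate

MTU            = 1500             # IP packet size
ETH_OVERHEAD   = 14 + 4 + 8 + 12  # Ethernet header + FCS, preamble, inter-frame gap
IP_TCP_HEADERS = 20 + 20          # IPv4 + TCP headers

payload_per_frame = MTU - IP_TCP_HEADERS        # 1460 bytes of file data
wire_per_frame    = MTU + ETH_OVERHEAD          # 1538 bytes on the wire

efficiency   = payload_per_frame / wire_per_frame   # ~0.949
goodput_MBps = LINK_BPS * efficiency / 8 / 1e6

print(f"Best-case goodput: {goodput_MBps:.0f} MB/s")  # ~119 MB/s
```

So ~119 MB/s is the theoretical best on gigabit; anything a protocol loses to round trips comes out of that.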
 
wuench said:
That depends on where the bottleneck is.  If you aren't saturating the link, then the bottleneck is not your network connection.  Many file transfer protocols (e.g. CIFS, FTP) would have a hard time even reaching 1 Gbps, and even the ones that can (NFS) usually require tuning to get there.
 
It is highly doubtful you would see any benefit from 10 Gbps...
 
I tried attaching a graph to show that it did saturate at 1 Gbps, but I guess it didn't convey that clearly enough.  Not sure why you think I wouldn't see any benefit from 10 Gbps, though, as I even gave an example scenario where faster speed would have been beneficial.
 
But unless I'm missing something, that didn't saturate 1 Gbps - you were at 160 MB, which is merely a fraction of that.  The graph shows near the top of the chart only because it autoscales to keep the max value at the top of the chart.  If you were running 10 Mbps, then the top of your chart would be 8-9 Mbps, and *that* would appear fully saturated.
 
I've hit 1 Gbps fully saturated on Netgear ProSafe switches - it's doable.
 
Think about it, though - high-dollar enterprise SANs might hit 10 Gbps equipment, but the industry has been booting off SANs for 10 years on 2 Gbps aggregate - nothing at your house comes close to needing that.
 
That said - if you have the money to piss away, nobody would argue that having 10 Gbps in your house isn't awesome! Just extreme overkill.
 
Sorry, I stand corrected; it looks like you are saturating it.  My assumption, without the graph, was that you weren't, as usually the protocol gives out first.
 
You might look into trunking multiple gigabit links (assuming your equipment and the file protocol you are using support it with multiple connections) before you go to the extreme of 10 Gbps.  But yes, 10 Gbps is an option.
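The "multiple connections" caveat on trunking is the important part. With LACP-style bonding, each flow is hashed onto one member link, so a single big copy still tops out at one link's speed. A toy model (illustrative numbers, assuming a perfectly even hash; real hashes can collide onto the same link):

```python
# Why LACP/trunking only helps multi-connection workloads: each TCP flow
# lands on a single member link, so one big CIFS copy still sees 1 Gbps.

LINK_GBPS = 1.0   # each bonded member is gigabit

def throughput_gbps(n_links: int, n_flows: int) -> float:
    """Best-case aggregate throughput for n_flows spread over n_links,
    assuming a perfectly even flow hash (real hashes can collide)."""
    used_links = min(n_links, n_flows)
    return used_links * LINK_GBPS

print(throughput_gbps(4, 1))   # single ISO copy: 1.0 Gbps, bonding doesn't help
print(throughput_gbps(4, 8))   # eight parallel transfers: 4.0 Gbps aggregate
```

So for the one-big-ISO use case in this thread, bonding alone wouldn't speed things up unless the transfer runs as multiple parallel streams.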
 
NeverDie said:
Moving large ISOs to a server over gigabit Ethernet to run as VMs.  It's not terrible, but it takes a while.  Roughly a minute for the Windows 10 Technical Preview.
 
Personally I think it's a great learning exercise. 
 
Purchasing new hardware just to shave off a minute or two of wait time seems like a poor trade of time, effort, and money, though.
 
Note that this is just my opinion - it's your gear, and you can do whatever you want with it these days at very reasonable cost (relatively speaking).
 
If you have a hot-swap drive cage in the server, you can always take the drive you put the ISOs on and move it to your VM server, or install external eSATA ports on your server and copy them that way.  The copying wait time would be a bit less.  
 
Here I also use a little Zalman ZM-VE200 virtual portable drive (USB 480 Mbps / eSATA 3 Gbps transfers).  It is faster and smaller than my portable USB Blu-ray combo.  It has a USB port, an eSATA port, and built-in ISO image emulation.  I purchased it based on a review right here on CT a few years back that Dan (Electron) wrote.  I put a 1 TB notebook drive in it.  It's very small and portable.
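For a sense of how those interface speeds compare for this use case, here is a rough copy-time estimate for a ~4 GiB ISO. The image size and the 80% efficiency factor are assumptions for illustration; real-world throughput (especially USB 2.0) is usually lower still:

```python
# Rough copy-time comparison for a ~4 GiB ISO over the interfaces above.
# Rates are signalling rates with an assumed ~80% effective efficiency.

ISO_BYTES  = 4 * 1024**3   # ~4 GiB image (illustrative size)
EFFICIENCY = 0.8           # assumed protocol/encoding overhead factor

interfaces_bps = {
    "USB 2.0 (480 Mbps)": 480e6,
    "Gigabit Ethernet":   1e9,
    "eSATA (3 Gbps)":     3e9,
}

times = {}
for name, bps in interfaces_bps.items():
    times[name] = ISO_BYTES * 8 / (bps * EFFICIENCY)   # seconds
    print(f"{name:20s} ~{times[name]:5.0f} s")
```

Under these assumptions eSATA moves the image in roughly a third of the gigabit Ethernet time, which is why direct-attaching the drive shaves the wait.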
 
Waste of money or not, Netgear claims it's doable over Cat6 or even Cat5e cabling:  http://www.netgear.com/landing/10gigabit.aspx
In addition, Netgear says Cat6 can do 10G-ethernet up to 55 meters, which I should think would be plenty far for any single run in a home.
~$800 for an 8-port switch.  
 
Seems that if you had a wired connection to your server, you wouldn't need a hard drive at all, and you wouldn't miss having one either.
 
Also, with 802.11ac you can do up to 7 Gbps, if you have the right gear and the circumstances are right.  
 
This may be leading edge, but it doesn't sound like science fiction.
 
I am not saying that 10G-ethernet / gigabit 802.11ac is not doable, or that it's science fiction.  It isn't leading-edge technology. 
 
I've been playing with gigabit 802.11ac for over a year now.  It works only OK.  I am not impressed with it right now.
 
10G-ethernet is getting more price-reasonable these days, and it is something to look at now if you need it for residential network use.  
 
I am just saying that if you want to move these files or build your VMs faster today, connect a drive with the ISO images to your VM box rather than purchasing a 10G-ethernet switch and NICs for your VM creation setup.
 
NeverDie said:
Moving large ISOs to a server over gigabit Ethernet to run as VMs.  It's not terrible, but it takes a while.
 
No offense, but 10G is not leading edge anymore.  We were doing 40G years ago and 100G not too long ago; the only problem there is that the optics are still too expensive and there aren't many makers yet.
 
As for distance, you can find tons of 120 km optics for up to 40G right now.
 
Frunple said:
160 MBytes/s is 1.28 Gbits/s, so yeah, it did max it out.
That's what I get for posting late and not looking closer...
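The unit mix-up above is easy to make: the Windows transfer graph reports megaBYTES per second, not megabits. The conversion shows why ~160 MB/s means the gigabit link (or at least the counter) is pegged:

```python
# Converting the observed rate: megaBYTES/s vs megabits/s.
# 1 Gbps can carry at most 125 MB/s of raw bits, so a reading of
# ~160 MB/s means the link (or the counter) is maxed out.

mb_per_s = 160                 # observed transfer rate, MB/s
gbps = mb_per_s * 8 / 1000     # same rate expressed in Gbps

print(f"{mb_per_s} MB/s = {gbps:.2f} Gbps")          # 1.28 Gbps
print(f"1 Gbps = {1000 / 8:.0f} MB/s theoretical")   # 125 MB/s
```

(A reading slightly above the 125 MB/s theoretical maximum usually just means the OS counter is averaging over bursts or counting cached writes.)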
 
If the speed is worth shelling out ~$1500 to connect two machines, then by all means, go for it!  Of course it'd be fun to do.  That said, if it were me, I'd just do link aggregation with 2-4 cables and server-grade NICs and settle there.  Once you get past a certain speed on the network side of things, you'll run into other bottlenecks, I'd imagine - though I honestly don't know the limits of each component, so I don't know at which point you'd start having trouble... just think of bus bandwidth, drive read/write speeds, etc.
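That "other bottlenecks" intuition can be made concrete: end-to-end throughput is capped by the slowest stage in the path. A quick sketch with assumed, illustrative numbers for one hypothetical home setup:

```python
# End-to-end throughput is limited by the slowest stage. Numbers below are
# illustrative assumptions for one hypothetical setup, not measurements.

stages_MBps = {
    "10GbE network":   1250,   # 10 Gbps line rate
    "PCIe 2.0 x4 HBA": 2000,   # rough usable bandwidth
    "single SATA HDD":  150,   # typical sequential read for a spinner
}

bottleneck = min(stages_MBps, key=stages_MBps.get)
print(f"Effective ceiling: ~{stages_MBps[bottleneck]} MB/s ({bottleneck})")
```

In this example a single spinning disk caps the whole transfer at ~150 MB/s, so 10GbE would barely improve on saturated gigabit without faster storage (SSDs or striped arrays) on both ends.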
 