Top: a 40Gb-per-second network for lightning-quick file transfers
The latest 960 Pro SSDs achieve read speeds of around 3.2 gigabytes per second. At that rate, a 40Gb link lets me transfer files over the network almost as quickly as a local copy. Another consideration is sharing a RAMDISK across the network.
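A quick back-of-the-envelope sketch of that claim, assuming nominal line rates of roughly 1.25 GB/s for 10GbE and 5 GB/s for a 40Gb link (real-world throughput will be somewhat lower after protocol overhead):

```python
# Rough bottleneck check: can the network keep up with a fast NVMe SSD?
# The speeds below are nominal line rates, not measured throughput.
GB = 1e9  # decimal gigabytes, as drive vendors quote them

ssd_read = 3.2 * GB           # sequential read of a 960 Pro class drive
links = {
    "1GbE":    1e9 / 8,       # ~0.125 GB/s
    "10GbE":   10e9 / 8,      # ~1.25 GB/s
    "40Gb IB": 40e9 / 8,      # ~5 GB/s
}

for name, bw in links.items():
    bottleneck = "network" if bw < ssd_read else "SSD"
    print(f"{name}: {bw / GB:.2f} GB/s -> bottleneck is the {bottleneck}")
```

Only at 40Gb does the drive, rather than the wire, become the limiting factor.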
Given the limited capacity of the 960 Pro SSDs, 40Gb networking offers minimal benefit. It's like building an enormous pipeline to move only a small amount of water.
Wow, guys, think outside the box for a moment. TODAY: if you have a single fast SSD, network transfer is limited by a 10GbE card. The same goes for a RAID with more than 1.2 GB/s read (about 8 drives in RAID 0). This is a solution that costs about as much as 10GbE but is significantly faster. Please don't spread your 640K mentality.
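To put rough numbers on that, here is a small sketch assuming about 150 MB/s of sequential read per drive and roughly 1.2 GB/s usable on 10GbE; both figures are ballpark assumptions, not benchmarks:

```python
# How many drives in RAID 0 before the array outruns a 10GbE link?
# per_drive and link_limit are rough assumptions, not measured results.
per_drive = 0.15e9      # ~150 MB/s sequential read per drive
link_limit = 1.2e9      # ~1.2 GB/s usable on 10GbE after overhead

for drives in range(1, 13):
    array_read = drives * per_drive
    status = "saturates 10GbE" if array_read > link_limit else "fits in 10GbE"
    print(f"{drives:2d} drives: {array_read / 1e9:.2f} GB/s -> {status}")
```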
It works well for transferring files between PCs, but beyond that you run into the usual issues with Infiniband setups. You'll need dedicated Infiniband switches and an Ethernet gateway, either a Linux server with the right adapter or a switch that can bridge it. Even then it doesn't solve real-world needs: apart from moving files, no everyday computing task will benefit from this bandwidth, which raises the question of how often bulk transfers between computers are really justified when only simple file copies gain anything. My setup is different: I store everything on my server on multiple 6.5GB SSDs, exposed as a VHDX file on an SMB3 share over X540-T1 10Gb NICs. I rarely exceed 10Gb in normal use, except for disk benchmarks or RAM disk copies. With this configuration my whole network benefits from the 10Gb connection, and the parts were cheap on eBay. It's not only cheap, it's also more adaptable.
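If you want to check whether you actually exceed 10Gb in normal use, a minimal sketch like this (with placeholder paths, not my actual share) times a large copy to the share and reports the effective rate; the result also reflects the source disk and SMB caching, so treat it as a rough indicator rather than a benchmark:

```python
# Minimal sketch: time a large file copy to a network share to estimate
# the throughput you actually see in normal use. Paths are placeholders.
import os
import shutil
import time

src = r"C:\temp\testfile.bin"          # hypothetical local test file
dst = r"\\server\share\testfile.bin"   # hypothetical SMB3 share path

size = os.path.getsize(src)
start = time.perf_counter()
shutil.copyfile(src, dst)
elapsed = time.perf_counter() - start

gbit_per_s = size * 8 / elapsed / 1e9
print(f"Copied {size / 1e9:.1f} GB in {elapsed:.1f} s -> {gbit_per_s:.2f} Gbit/s")
```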
Just because your capacity isn't being used doesn't mean everyone else's isn't. Consider the latest movies in 4K resolution: even with HEVC compression, files can reach around 30GB. Titles like Quantum Break or Gears of War weigh in at over 80GB. Moving that much data over a standard 1Gbit connection is extremely slow, and shuttling it between machines is impractical over traditional RJ45 gear. The most cost-effective options appear to be Thunderbolt or Infiniband, both offering 40Gbit transfers for less than $100, and used Infiniband switches are more affordable than 10GBase-T ones.
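For a sense of scale, here is the idealized math, assuming the link is the only bottleneck and ignoring protocol overhead:

```python
# How long do a 30 GB movie and an 80 GB game take at different link speeds?
# Idealized: assumes the link itself is the only bottleneck.
files = {"4K HEVC movie (30 GB)": 30e9, "80 GB game install": 80e9}
links = {"1Gbit": 1e9, "10Gbit": 10e9, "40Gbit": 40e9}

for fname, size in files.items():
    for lname, rate in links.items():
        minutes = size * 8 / rate / 60
        print(f"{fname} over {lname}: {minutes:.1f} min")
```

Over 1Gbit the 80GB game takes well over ten minutes of pure transfer time; over 40Gbit it drops to well under a minute.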
You're still just moving a lot of data, right? Most folks have one high-end gaming rig, so where would they even use 10GbE or 40GbE? Keep the data on the server and stream it instead: 4K video doesn't need anywhere near 1Gbps, and loading games uses even less. What are your points A and B, and why are you transferring everything between them? There are more efficient approaches that don't need extra hardware or add complexity. This might sound a bit elitist, but I'm one of the few here who could actually benefit from that bandwidth at home, without needing big games or movies: I run four dual-socket 1366 servers and an IBM x3500 M4 in my home lab, mostly on SSD storage. Just because you're moving huge amounts of data doesn't mean everyone else is, or needs to.

Edit: I'm not saying stop, or that you shouldn't have tried it. If you've done it and found it useful, that's great. My point is that this isn't perfect: there are downsides, and not everyone can use it. Also, there are affordable used 10GbE NICs on eBay at similar prices, without all the hassles of Infiniband. And 10GBase-T isn't the only option; you could use 10GbE SFP+ switches for some devices and cheaper 1Gb ports for the rest.
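As a sanity check on the streaming point, using the ~30GB figure from the post above and assuming roughly a two-hour runtime (an assumption for illustration; real releases vary):

```python
# Average bitrate of a ~30 GB, ~2 hour 4K HEVC movie vs. a gigabit link.
# The runtime is an assumed figure for illustration only.
movie_bytes = 30e9
runtime_s = 2 * 3600

avg_bitrate = movie_bytes * 8 / runtime_s        # bits per second
print(f"Average bitrate: {avg_bitrate / 1e6:.0f} Mbit/s")
print(f"Share of a 1Gbit link: {avg_bitrate / 1e9:.1%}")
```

That works out to roughly 33 Mbit/s, a small fraction of a gigabit link, which is why streaming from the server doesn't need 10GbE.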