Windows 8.1 SMB3 multichannel 1350MB/s

Dennis Wood

Senior Member
After "Azazel" posted some information in this thread ( http://forums.smallnetbuilder.com/showthread.php?t=15062) regarding SMB3 multichannel performance, my curiosity was piqued. After a bit of reading, it was immediately apparent that something super cool is hiding under the hood of windows 8.1 and 2012 server.

Server Message Block (SMB) is an application-layer network protocol that provides shared access to files, printers and serial ports. It has been around for a while: Vista and Windows 7 brought SMB2, and SMB3 is now part of Windows 8.1 and Server 2012. There are a lot of performance enhancements in SMB3.

You may or may not know that link aggregation (teaming multiple NICs together) does nothing to speed up a single connection, and it requires an LACP-capable switch, NIC teaming drivers, etc. For most home users it's really a waste of time.

However, with Windows 8.1 serving a RAID array as a NAS, you can simply plug two (or more) ports from a dual-port LAN card into your switch, do the same with two (or more) cables from a workstation, and instantly double your bandwidth with zero configuration. It just works, courtesy of SMB3 multichannel.
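If you want to verify multichannel actually kicked in, Windows 8.1 ships SMB cmdlets that will show it. Here's a rough Python wrapper around them (the cmdlet names are the built-in ones, but treat the script itself as a sketch, and run it while a copy to the share is in flight):

# Rough check that SMB3 multichannel is active on a Windows 8.1 / Server 2012 client.
# Shells out to the built-in SMB PowerShell cmdlets; run it while a file copy to the
# share is in progress, otherwise the connection table may be empty.
import subprocess

def run_ps(command):
    """Run a PowerShell command and return its text output."""
    result = subprocess.run(
        ["powershell", "-NoProfile", "-Command", command],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

# Client NICs that SMB considers usable for multichannel (link speed, RSS capability).
print(run_ps("Get-SmbClientNetworkInterface | Format-Table -AutoSize"))

# Active multichannel connections: one row per client/server NIC pair carrying traffic.
print(run_ps("Get-SmbMultichannelConnection | Format-Table -AutoSize"))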

So how does a 1360 MB/s transfer between two Windows workstations sound? This is pretty crazy performance, and in this example it is being limited by the ramdrive software. I've tried IMDISK and now SoftPerfect RAM Disk and observed a 200 MB/s difference between them. If you look at the Ethernet 5 and 6 graph, you can see that each 10GbE port is only at about 60% saturation, so ~2300 MB/s is likely doable with a fast enough system. Mine isn't! For an incredibly cost-effective shared media server for RAW video and photos (my goal in this project), this is off-the-shelf hardware. I ran the same test with dual Gigabit cards and got an instant 235 MB/s result with zero configuration on the switch or NIC drivers. In other words, if you have dual-LAN workstations, simply upgrading to Windows 8.1 can double your network bandwidth with no hardware investment and no configuration.

[Attached image: multichannel_smb3.png]
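For anyone wanting to reproduce a ballpark number like the 235 MB/s above without ramdrive software, a simple timed write to the share will do it. This is only a sketch: the UNC path and size below are placeholders, and with spinning disks on the far end you'll mostly be measuring the array rather than the network.

# Rough SMB throughput test: stream a large file of zeros to a share and time it.
# \\server\share is a placeholder UNC path; bump SIZE_GB past the server's RAM if
# you want a sustained number rather than a cached burst.
import os, time

DEST = r"\\server\share\smb_throughput_test.bin"    # placeholder share path
SIZE_GB = 8
CHUNK = 4 * 1024 * 1024                             # 4 MB per write
chunk = b"\0" * CHUNK

start = time.time()
with open(DEST, "wb") as f:
    written = 0
    while written < SIZE_GB * 1024**3:
        f.write(chunk)
        written += CHUNK
elapsed = time.time() - start
os.remove(DEST)

print(f"Wrote {written / 1024**2:.0f} MB in {elapsed:.1f} s "
      f"-> {written / 1024**2 / elapsed:.0f} MB/s")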
 
Glad you discovered it and are having fun!

I just wish I had access to 10GbE...sigh...

Well... actually, I guess I wish I had faster storage; mine already can't keep up with my dual GbE links, with the server RAID array at 45% utilization and my desktop RAID array at 70% utilization.

I need new disks :(
 
I guess the question is... do you actually need that bandwidth?

My home office setup will continue on with Gigabit only; the 10G tests are being run more or less for my team at Cinevate. What we do there is a combination of SolidWorks (already using a network workflow), studio photography (primarily product photos, but all RAW, and quite a few of them) and the many videos we produce in-house at www.vimeo.com/cinevate

Once the 10G setup is implemented, the R&D, video and marketing workstations will have shared storage and therefore very easy access when required, primarily for collaborative purposes. There are quite a few instances where this means significant efficiencies for my team. If you consider that pretty much every product launch, our last Kickstarter campaign and social media efforts all involve media that use Solidworks models, graphic elements, product photography and video components, the advantage of the entire team having access from their desktops makes a lot of sense. In fact, our marketing/web guru will be doing a crash course in Adobe Premiere Pro CC, so he will be directly accessing the timeline to work his brand of graphical coolness. I'm pretty interested in observing how it all plays out.
 
Oh, no, no, not at all. I don't even need 2xGbE, but it's nice to have.

I guess it's where I hyperfixate.

It just seems silly to have to wait for a 2GB movie to transfer, let alone 10-20GB of movies, never mind if I have to rebuild my server at some point. UGH.

Though, I guess to counterbalance, I really want the price to come down, in large part because I am just not willing to spend all that much on it. $400 range maybe. It could be somewhat more in a couple of years, it could be less. Just depends on financial obligations at the time.

It's a hobby more than anything.

Though actually, it did just give me an idea for when I upgrade the RAID arrays in my systems in a few months. Since those new arrays should be faster, and since we are talking about transferring a couple of TERABYTES of data, which takes a rather long time, I think I might run my 100 ft patch cable between the server and my desktop and re-enable the onboard NICs so I can try running 3xGbE just to retransfer the data between the machines as I pull each old array and put in the new one.

If it can cut a roughly 3hr process down to 2hrs for each "leg" of the rebuild, I wouldn't mind, even if I am not actively sitting in front of the machines waiting for it to finish.
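Back-of-the-envelope, assuming roughly 110 MB/s usable per GbE link and arrays that can keep up (the real bottleneck, most likely), the math looks about like this; the 2.5 TB figure is just a stand-in for "a couple of terabytes":

# Rough transfer-time estimate for moving a RAID array's contents over N GbE links.
# Assumes ~110 MB/s usable per link; in practice the arrays will set the ceiling.
PER_LINK_MB_S = 110
DATA_TB = 2.5                                     # placeholder for "a couple of TB"

for links in (1, 2, 3):
    rate = links * PER_LINK_MB_S                  # aggregate MB/s
    hours = DATA_TB * 1024**2 / rate / 3600       # TB -> MB, then seconds -> hours
    print(f"{links} x GbE: ~{rate} MB/s, ~{hours:.1f} h for {DATA_TB} TB")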
 
Two PCIe Intel 4-port server cards from flea bay, some cable, and you're at 500 MB/s...
 
Yeah, it's just running the cabling and having the switch capacity. I just finished an area of my basement for an office and only ran two Cat5e runs for my desktop. I could maybe, possibly fish the wires back out with string attached to do a pull-back for more, but it might be chancy.

No way I am tearing back into my newly finished walls to do it.

I just have my fingers crossed that the 5e runs aren't too long, and don't pick up too much EMI, to support 10GbE down the road. I've seen Cat5e with 10GbE running fine on it, but I don't think I've ever actually witnessed an attempt over a length of more than about 20-25 ft.

Supposedly Cat5e can handle 10GbE up to 42 m? 45 m? My runs are around 20 m (max) between switch and jacks, plus another 1.5 m from the jacks to my desktop. No stranded patch cables; it's all solid copper conductor in-wall and from jack to NIC (I roll all my own cables), everything with Cat 6 connectors (keystone jacks and RJ-45 plugs). So fingers crossed that in a few years the wiring will support 10GbE.

That said, with the renovation of my house's family room, which will eventually be converted into an office in several years when my kids are older, I plan on running Cat 6 then (debating 6a), in large part because I only have about 80 ft of 5e left in my 500 ft box and I am probably going to need at least twice that for the runs I am putting in. So I'll save the 5e for making patch cables and just get some 6/6a for that project.

On switch capacity, I have a TP-Link SG2216, which is nice, but I only have 2 spare ports right now and am looking at needing to connect at least another 3 in the short term and probably 5-8 long term. I also have an 8-port TRENDnet dumb GbE switch I can use to temporarily increase capacity. I am thinking possibly the 24-port model of the same TP-Link switch in the nearer term, as that should give me all the port space I need, even for far-future renovations on the house. It also lets me run GBIC modules for fiber, as I have an outbuilding I want to connect up, wired, in a couple of years, and I don't want to run copper underground outside.

Depending on where things end up, though, and what pricing looks like in a few years, I just might investigate a couple of quad-port Intel GbE NICs to connect the two machines, at least if/when I have a RAID array that can support those speeds.

I wonder how well Intel NICs work and how robust SMB 3.0 is... could I connect two of the ports through the switch and two of the ports as a direct crossover between the machines? I am kind of intrigued now. I might have to just connect one NIC through the switch, run the 50 ft patch cable to connect the other NIC directly to the server, and test it out to see what happens.
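One quick way to see which paths SMB actually uses in that mixed switch/crossover setup is to watch per-NIC counters while a big copy runs. A rough sketch (it assumes the third-party psutil package is installed; the interface names are whatever Windows reports):

# Sample per-NIC throughput once a second during a transfer to see which links
# SMB multichannel is actually spreading traffic across.
# Requires: pip install psutil
import time
import psutil

prev = psutil.net_io_counters(pernic=True)
while True:
    time.sleep(1)
    now = psutil.net_io_counters(pernic=True)
    busy = []
    for nic, counters in now.items():
        tx = (counters.bytes_sent - prev[nic].bytes_sent) / 1024**2
        rx = (counters.bytes_recv - prev[nic].bytes_recv) / 1024**2
        if tx + rx > 1:                           # only show NICs doing real work
            busy.append(f"{nic}: {tx:.0f} MB/s out, {rx:.0f} MB/s in")
    print(" | ".join(busy) or "idle")
    prev = now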
 
I ran just a few simple tests (more coming as we test in our building when things go live) with 10G over Cat5e cables. With two workstations connected to the switch using 25 ft patch cables (terminated/tested by yours truly) there was zero degradation.
 
Thanks for the test!

Yeah, I am not really expecting any, but having others confirm that they haven't run into issues with Cat5e and 10GbE always makes me feel better, since I think I've seen in person exactly one test of 10GbE over a roughly 20 ft 5e cable, and it worked just fine. I've read of a couple of other occurrences on the internet, and lots of people saying the sky will fall if you try it (without actually having tried 10GbE over 5e)... and supposedly it is rated to >40 m.
 
I've been trying to post concrete results on stuff like 10G over Cat5e, as I've read precisely zero measured test results on the web :)
 
