10GB Connection Performance tweaking

nbat58

Occasional Visitor
Does anyone know of a guide on how to tweak the Intel X540-T2 for performance? I currently have two of the same NICs, one in my Windows 10 PC and one in my OmniOS NAS, but I only seem to be getting about 120 MB/s, where I would expect to see 300 MB/s or more with my disks and hardware.

I currently have the settings below changed on both machines:

Jumbo frame at 9000
Flow control off

Then on the Windows PC:
Interrupt Moderation = Disabled
Receive Buffers = 4096
Transmit Buffers = 16384

Then on the NAS machine:
max_buf=4097152 tcp
send_buf=2048576 tcp
recv_buf=2048576 tcp

The cable run between the machines is 3 meters at most.
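For reference, on OmniOS/illumos those TCP properties are normally applied with ipadm. A minimal sketch using the exact values quoted above (run as root; the change is system-wide, not per NIC):

# apply the TCP buffer tunables listed above on the OmniOS box
ipadm set-prop -p max_buf=4097152 tcp
ipadm set-prop -p send_buf=2048576 tcp
ipadm set-prop -p recv_buf=2048576 tcp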
 
Turn flow control and interrupt moderation back on, quadruple your receive buffers, or disable jumbo frames (jumbo frames are not really necessary).
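If it helps, those Windows-side settings can also be changed from PowerShell rather than Device Manager. A sketch only: the adapter name "Ethernet 2" is a placeholder, and the exact DisplayName/DisplayValue strings vary between Intel driver versions, so list what your driver actually exposes first:

# list the advanced properties the driver exposes, with their valid values
Get-NetAdapterAdvancedProperty -Name "Ethernet 2"

# typical Intel 10GbE property names; adjust to match the output above
Set-NetAdapterAdvancedProperty -Name "Ethernet 2" -DisplayName "Flow Control" -DisplayValue "Rx & Tx Enabled"
Set-NetAdapterAdvancedProperty -Name "Ethernet 2" -DisplayName "Interrupt Moderation" -DisplayValue "Enabled"
Set-NetAdapterAdvancedProperty -Name "Ethernet 2" -DisplayName "Jumbo Packet" -DisplayValue "Disabled"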

The thing with buffers is that they don't really have to be very large. A buffer only needs to hold a small portion of the bandwidth, about 1% or less (but big enough to hold a couple of the largest packets). It is better to drop a packet because the buffer is full, or to have a flow-control mechanism tell the sender to stop, than to keep queuing data, because protocols like TCP rely on drops, flow control and retransmits to tune themselves.
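As a rough worked example of that rule of thumb, reading "1% of the bandwidth" as 1% of one second's worth of traffic on a 10 Gb/s link with 9000-byte jumbo frames (plain shell arithmetic):

# 10 Gb/s expressed in bytes per second
echo $((10 * 1000 * 1000 * 1000 / 8))        # 1250000000 B/s, about 1.25 GB/s
# 1% of one second's worth of traffic
echo $((10 * 1000 * 1000 * 1000 / 8 / 100))  # 12500000 B, about 12.5 MB
# enough room for a couple of the largest (jumbo) frames
echo $((2 * 9000))                           # 18000 B, about 18 KB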

It's also 10Gb, not 10GB; networking is measured in bits, not bytes. An older laptop hard drive may operate at 80 MB/s, with newer ones going up to 150 MB/s, and newer desktop drives run at about 200 MB/s. Don't use the hard drive for testing this. I suggest using a RAM drive or a proper bandwidth tester, because otherwise you won't be able to find out what's wrong when the storage device is the bottleneck.

In my case I got 90% of the rated AC link speed by using a bandwidth tester, along with a lot of packet drops; in practice you can only expect around 50-60% of the rated link speed as usable Wi-Fi AC bandwidth. For 10Gb/s I've gotten close to that, but I don't use 10Gb Ethernet, I use SFP+ direct attach. There is a lot of overhead once an application is involved, which is why you should use a proper bandwidth tester. I'll let @sfx2000 recommend a few.
 
Forgot to say that the two NICs are connected directly. I have used iperf3, and below is the test result:

Windows PowerShell
Copyright (C) Microsoft Corporation. All rights reserved.

PS C:\Users\N> ./iperf3.exe -c 10.10.10.10
Connecting to host 10.10.10.10, port 5201
[ 4] local 10.10.10.20 port 60949 connected to 10.10.10.10 port 5201
[ ID] Interval Transfer Bandwidth
[ 4] 0.00-1.00 sec 800 MBytes 6.71 Gbits/sec
[ 4] 1.00-2.00 sec 826 MBytes 6.93 Gbits/sec
[ 4] 2.00-3.00 sec 824 MBytes 6.91 Gbits/sec
[ 4] 3.00-4.00 sec 820 MBytes 6.88 Gbits/sec
[ 4] 4.00-5.00 sec 826 MBytes 6.93 Gbits/sec
[ 4] 5.00-6.00 sec 816 MBytes 6.84 Gbits/sec
[ 4] 6.00-7.00 sec 827 MBytes 6.94 Gbits/sec
[ 4] 7.00-8.00 sec 825 MBytes 6.92 Gbits/sec
[ 4] 8.00-9.00 sec 828 MBytes 6.94 Gbits/sec
[ 4] 9.00-10.00 sec 815 MBytes 6.84 Gbits/sec
- - - - - - - - - - - - - - - - - - - - - - - - -
[ ID] Interval Transfer Bandwidth
[ 4] 0.00-10.00 sec 8.02 GBytes 6.89 Gbits/sec sender
[ 4] 0.00-10.00 sec 8.02 GBytes 6.89 Gbits/sec receiver

iperf Done.
PS C:\Users\N> ./iperf3.exe -s
-----------------------------------------------------------
Server listening on 5201
-----------------------------------------------------------
Accepted connection from 10.10.10.10, port 62169
[ 5] local 10.10.10.20 port 5201 connected to 10.10.10.10 port 35209
[ 7] local 10.10.10.20 port 5201 connected to 10.10.10.10 port 50193
[ 9] local 10.10.10.20 port 5201 connected to 10.10.10.10 port 48770
[ 11] local 10.10.10.20 port 5201 connected to 10.10.10.10 port 58289
[ ID] Interval Transfer Bandwidth
[ 5] 0.00-1.00 sec 192 MBytes 1.61 Gbits/sec
[ 7] 0.00-1.00 sec 192 MBytes 1.61 Gbits/sec
[ 9] 0.00-1.00 sec 193 MBytes 1.62 Gbits/sec
[ 11] 0.00-1.00 sec 191 MBytes 1.61 Gbits/sec
[SUM] 0.00-1.00 sec 767 MBytes 6.44 Gbits/sec
- - - - - - - - - - - - - - - - - - - - - - - - -
[ 5] 1.00-2.00 sec 204 MBytes 1.72 Gbits/sec
[ 7] 1.00-2.00 sec 201 MBytes 1.69 Gbits/sec
[ 9] 1.00-2.00 sec 202 MBytes 1.69 Gbits/sec
[ 11] 1.00-2.00 sec 200 MBytes 1.68 Gbits/sec
[SUM] 1.00-2.00 sec 808 MBytes 6.77 Gbits/sec
- - - - - - - - - - - - - - - - - - - - - - - - -
[ 5] 2.00-3.00 sec 208 MBytes 1.75 Gbits/sec
[ 7] 2.00-3.00 sec 209 MBytes 1.75 Gbits/sec
[ 9] 2.00-3.00 sec 207 MBytes 1.74 Gbits/sec
[ 11] 2.00-3.00 sec 206 MBytes 1.73 Gbits/sec
[SUM] 2.00-3.00 sec 831 MBytes 6.97 Gbits/sec
- - - - - - - - - - - - - - - - - - - - - - - - -
[ 5] 3.00-4.00 sec 207 MBytes 1.73 Gbits/sec
[ 7] 3.00-4.00 sec 210 MBytes 1.76 Gbits/sec
[ 9] 3.00-4.00 sec 207 MBytes 1.73 Gbits/sec
[ 11] 3.00-4.00 sec 209 MBytes 1.75 Gbits/sec
[SUM] 3.00-4.00 sec 833 MBytes 6.99 Gbits/sec
- - - - - - - - - - - - - - - - - - - - - - - - -
[ 5] 4.00-5.00 sec 208 MBytes 1.75 Gbits/sec
[ 7] 4.00-5.00 sec 207 MBytes 1.74 Gbits/sec
[ 9] 4.00-5.00 sec 207 MBytes 1.73 Gbits/sec
[ 11] 4.00-5.00 sec 207 MBytes 1.74 Gbits/sec
[SUM] 4.00-5.00 sec 829 MBytes 6.96 Gbits/sec
- - - - - - - - - - - - - - - - - - - - - - - - -
[ 5] 5.00-6.00 sec 196 MBytes 1.64 Gbits/sec
[ 7] 5.00-6.00 sec 196 MBytes 1.65 Gbits/sec
[ 9] 5.00-6.00 sec 197 MBytes 1.65 Gbits/sec
[ 11] 5.00-6.00 sec 196 MBytes 1.65 Gbits/sec
[SUM] 5.00-6.00 sec 785 MBytes 6.59 Gbits/sec
- - - - - - - - - - - - - - - - - - - - - - - - -
[ 5] 6.00-7.00 sec 212 MBytes 1.78 Gbits/sec
[ 7] 6.00-7.00 sec 205 MBytes 1.72 Gbits/sec
[ 9] 6.00-7.00 sec 211 MBytes 1.77 Gbits/sec
[ 11] 6.00-7.00 sec 208 MBytes 1.75 Gbits/sec
[SUM] 6.00-7.00 sec 837 MBytes 7.02 Gbits/sec
- - - - - - - - - - - - - - - - - - - - - - - - -
[ 5] 7.00-8.00 sec 210 MBytes 1.76 Gbits/sec
[ 7] 7.00-8.00 sec 210 MBytes 1.76 Gbits/sec
[ 9] 7.00-8.00 sec 211 MBytes 1.77 Gbits/sec
[ 11] 7.00-8.00 sec 210 MBytes 1.76 Gbits/sec
[SUM] 7.00-8.00 sec 840 MBytes 7.05 Gbits/sec
- - - - - - - - - - - - - - - - - - - - - - - - -
[ 5] 8.00-9.00 sec 203 MBytes 1.70 Gbits/sec
[ 7] 8.00-9.00 sec 212 MBytes 1.78 Gbits/sec
[ 9] 8.00-9.00 sec 203 MBytes 1.71 Gbits/sec
[ 11] 8.00-9.00 sec 211 MBytes 1.77 Gbits/sec
[SUM] 8.00-9.00 sec 829 MBytes 6.95 Gbits/sec
- - - - - - - - - - - - - - - - - - - - - - - - -
[ 5] 9.00-10.00 sec 211 MBytes 1.77 Gbits/sec
[ 7] 9.00-10.00 sec 211 MBytes 1.77 Gbits/sec
[ 9] 9.00-10.00 sec 209 MBytes 1.75 Gbits/sec
[ 11] 9.00-10.00 sec 205 MBytes 1.72 Gbits/sec
[SUM] 9.00-10.00 sec 836 MBytes 7.01 Gbits/sec
- - - - - - - - - - - - - - - - - - - - - - - - -
[ 5] 10.00-10.00 sec 0.00 Bytes 0.00 bits/sec
[ 7] 10.00-10.00 sec 0.00 Bytes 0.00 bits/sec
[ 9] 10.00-10.00 sec 0.00 Bytes 0.00 bits/sec
[ 11] 10.00-10.00 sec 0.00 Bytes 0.00 bits/sec
[SUM] 10.00-10.00 sec 0.00 Bytes 0.00 bits/sec
- - - - - - - - - - - - - - - - - - - - - - - - -
[ ID] Interval Transfer Bandwidth
[ 5] 0.00-10.00 sec 0.00 Bytes 0.00 bits/sec sender
[ 5] 0.00-10.00 sec 2.00 GBytes 1.72 Gbits/sec receiver
[ 7] 0.00-10.00 sec 0.00 Bytes 0.00 bits/sec sender
[ 7] 0.00-10.00 sec 2.01 GBytes 1.72 Gbits/sec receiver
[ 9] 0.00-10.00 sec 0.00 Bytes 0.00 bits/sec sender
[ 9] 0.00-10.00 sec 2.00 GBytes 1.72 Gbits/sec receiver
[ 11] 0.00-10.00 sec 0.00 Bytes 0.00 bits/sec sender
[ 11] 0.00-10.00 sec 2.00 GBytes 1.71 Gbits/sec receiver
[SUM] 0.00-10.00 sec 0.00 Bytes 0.00 bits/sec sender
[SUM] 0.00-10.00 sec 8.00 GBytes 6.87 Gbits/sec receiver
-----------------------------------------------------------
Server listening on 5201
 
One thing you can do is make sure things like DMA are enabled in the BIOS and apply the NIC settings I mentioned. Jumbo frames don't really matter if you consider the speed of the drives, and DMA allows the NIC to access the drive directly.
 
The DMA setting, I guess, would be under the PCIe slot settings where the NIC is installed? I will give it a go this weekend and see if there are any improvements.

In the meantime, hopefully I will get more suggestions from someone with a similar setup using the Intel X540-T2.
 
Go to Gea's site (napp-it).

He has a complete tuning manual with the commands, and in newer versions of napp-it it can be run from the GUI.

Sent from a mobile device, so typos are to be expected.
 
I do use napp-it, and the tuning settings above were suggested by Gea and others in another forum, without success.

I guess I need to go through Gea's tuning manual with a fine-tooth comb and try different settings until I hit the right spot.
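One way to keep that methodical is to record the current values before each change so it can be rolled back. A sketch on the OmniOS side (ipadm is the standard illumos tool for these TCP properties):

# show current, default and possible values for the buffer tunables
ipadm show-prop -p max_buf,send_buf,recv_buf tcp
# reset a property to its default if a change makes things worse
ipadm reset-prop -p max_buf tcp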
 
There are DMA settings for various things in the BIOS.
PCIe is like a network switch: devices can communicate directly without going through RAM or the CPU, but NVIDIA restricts this feature to their Tesla GPUs. Not even their Quadro cards can do it.
 
Look for other bottlenecks. If the client computer has a SATA drive and you're doing an SMB copy, you're only going to be as fast as the slowest link and the associated overhead...

160-190 MB/s as reported by Windows is pretty good for a client with a SATA drive over 10Gb Ethernet...

ZFS has a fair amount of overhead on the NAS side, and the NAS has to account for that as well as the TCP layer and then the Samba configuration itself.
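If the NAS is actually serving SMB through Samba (rather than the illumos kernel SMB server that napp-it can also manage), a few smb.conf options are commonly tried for large sequential copies. A sketch only, not a recommendation; the defaults are often fine, so benchmark before and after each change:

# [global] section of smb.conf
use sendfile = yes
server multi channel support = yes
aio read size = 16384
aio write size = 16384
socket options = TCP_NODELAY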
 
