Channel Interference - What's more important: direct interference or adjacent channels?


LastLine77

New Around Here
Hey everyone,

I've read a number of stickies before posting, but I did not see any clarification on this. I live in Manhattan, and as such there is a ton of interference. I use Wifi Analyzer on Android, which shows a graph of curved peaks by channel (I'm concerned with 2.4GHz). I have selected channel 8, since it clearly stands out as the dead-zone winner, with no interference near the peak. However, many people are on neighboring channels and there is a lot of overlap at the base of the chart.

My router on "auto" keeps selecting channel 6, which has the most people on it. While my signal is the strongest of the bunch, am I better off staying on channel 6, competing with ostensibly "one set" of neighbors' routers... or am I better off going to channel 8, where I may be competing with "two sets" of neighbors' routers?

I apologize if this is unclear, but it's the best way I can think of explaining.
 
Adjacent channels typically interfere more than other networks on the same channel... mostly because neighboring APs on overlapping channels cannot decode each other's management frames, so they cannot check whether the channel is free before transmitting...

This is also the key reason why it is unwise to run wide channels in 2.4GHz...
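To put numbers on the overlap, here is a toy Python sketch. It assumes the classic 22MHz-wide channel mask with centers 5MHz apart (channel 1 = 2412MHz), as in the usual 2.4GHz channel diagrams; that width is an assumption, not something from this thread:

```python
# Toy sketch: which 2.4 GHz channels overlap?
# Assumes the classic 22 MHz-wide channel mask with centers 5 MHz apart
# (channel 1 = 2412 MHz).

CHANNEL_WIDTH_MHZ = 22
SPACING_MHZ = 5

def center_mhz(channel: int) -> int:
    """Center frequency of 2.4 GHz channels 1-13."""
    return 2407 + SPACING_MHZ * channel

def overlaps(ch_a: int, ch_b: int) -> bool:
    """Two channels overlap when their centers are closer than one channel width."""
    return abs(center_mhz(ch_a) - center_mhz(ch_b)) < CHANNEL_WIDTH_MHZ

# A neighbor on channel 5, 6 or 7 versus a router on channel 8:
for neighbor in (5, 6, 7):
    print(f"channel {neighbor} vs channel 8: overlap = {overlaps(neighbor, 8)}")
print(f"channel 1 vs channel 6: overlap = {overlaps(1, 6)}")  # -> False
```

Under that mask, any two 2.4GHz channels fewer than five apart overlap, which is why only 1, 6 and 11 are mutually clear - and why a router on channel 8 still hears neighbors on channels 5, 6 and 7.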
 
Thanks!

It sounds like I am being misled by the graphical nature of the app. I am currently running my 2.4GHz at 20MHz for the exact reason you cited.

When I envision channels, should I picture a square block of channels being occupied rather than something akin to a bell curve? I.e., if a neighboring router selected channel 5, is its signal on channel 6 JUST AS disruptive as on channel 7 to my router on channel 8 (which also hits channels 6 and 7)?

If I may link to images...this is what the app looks like (not mine specifically):

http://www.google.com/imgres?imgurl...kVOjCBcnesASvsoD4Cw&tbm=isch&ved=0CDcQMygKMAo

But I'm wondering if I should envision this instead:

http://www.dd-wrt.com/wiki/index.php/Image:2.4GHz_channels.png
 
Channels are not discrete. They can and do affect adjacent channels depending on the proximity of the routers and the environment itself.

A bell curve is the best approximation of what they look like.

About a year or more ago I was using inSSIDer to determine the best channels for myself and my customers. What I learned, after countless hours of testing different configurations and then actually using the network, was that what inSSIDer seemed to be suggesting and the real-world results almost never matched.

My criterion, and the criterion I teach my customers to trust, is throughput.

In the 2.4GHz band I consistently get the best throughput on channel 11, and in the 5GHz band the best channel is usually 48 - regardless of any neighboring SSIDs on the same channels.

Many times in the past I have seen channels 6 and 11 on the 2.4GHz band heavily used and channel 1 showing wide open. Switching the router to that channel might show the highest signal of all the choices, but invariably it also gave a very inconsistent connection: the throughput graphs were anything but steady, and with simple browsing some pages would jump onto the screen while others lagged for a second or so.

Changing the router to channel 11 might show an indicated connection rate of less than maximum, but it also gave the most consistent and fastest throughput of all the options available.

Since then, I have stopped using analyzers and simply test every (control) channel available on each band to determine which is optimal. While the theory is nice and lets us predict what should happen, nothing beats actual test results, with the winner judged on consistency and throughput.
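As a rough harness for that kind of sweep, here is a sketch. The assumptions are mine, not from this thread: iperf3 is installed on the wireless client, an iperf3 server (`iperf3 -s`) runs on a wired LAN machine, SERVER below is a placeholder address, and the channel change itself is done by hand in the router UI between runs:

```python
# Rough harness for "test every control channel": run a timed iperf3
# transfer on each candidate channel and compare the measured throughput.
import json
import subprocess

SERVER = "192.168.1.10"     # hypothetical wired iperf3 server
CHANNELS = [1, 6, 11]       # 2.4 GHz control channels to sweep

results = {}
for ch in CHANNELS:
    input(f"Set the router to channel {ch}, let clients reconnect, then press Enter...")
    run = subprocess.run(
        ["iperf3", "-c", SERVER, "-t", "10", "-J"],  # 10-second run, JSON output
        capture_output=True, text=True, check=True,
    )
    bps = json.loads(run.stdout)["end"]["sum_received"]["bits_per_second"]
    results[ch] = bps / 1e6
    print(f"channel {ch}: {results[ch]:.0f} Mbit/s")

print("best channel by throughput:", max(results, key=results.get))
```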


The 2.4GHz band's control channels (1, 6 and 11) are the only ones that have worked reliably for my customers over extended periods. Using the in-between channels may help at that specific point in time, but when the neighboring networks get busy, throughput degrades significantly for everyone concerned.

The 5GHz band's higher channels are supposed to be better at distance, but in many of my customers' office layouts the lower channels are just as effective, if not more so, and also have the benefit of greater throughput and consistency.


I cannot assume that your wireless environment is similar to what I have seen consistently over the last few years, but what I can suggest is not to shy away from trying a 'known' bad channel just because of a free wifi analyzer app.

The point of a network is to enjoy the highest and most consistent throughput it can offer, not to set it up to a theoretically perfect state that ultimately doesn't live up to its potential. :)
 
Thanks for this detailed post. I was slowly but surely coming to the same conclusion myself as I pored over mountains of info online with no clear real-world answer.

When I get home tonight I will do as you suggest and try some speed tests on the control channels during peak times.

I've been having random wi-fi disconnects with my new RT-N66U. I just installed Merlin last night and am trying to troubleshoot everything before my 30-day Amazon window closes. I've done everything except assign new and distinct SSIDs to the 2.4GHz and 5GHz bands.

I have a feeling I will be returning this router, as I searched and found similar posts with no answer... except the one guy who sent his in and was told it was unrepairable.
 
You're welcome.

Make sure to do testing from a wired client to a wireless client using a large (1GB or so) file to test downloads with.

And also the same sized file from a wireless client to a wired client to test upload speeds with.

In addition to the Ookla speedtests, of course. :)
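A minimal Python sketch of that timed copy - the share path and file name below are placeholders, not anything from this thread:

```python
# Sketch of the timed large-file copy suggested above. SRC and DST are
# placeholders: point SRC at a ~1 GB file shared by the wired machine and
# DST at a local path on the wireless client; swap them to test upload.
import shutil
import time
from pathlib import Path

SRC = Path(r"\\wired-pc\share\testfile.bin")  # hypothetical SMB share path
DST = Path("testfile.bin")

start = time.perf_counter()
shutil.copyfile(SRC, DST)
elapsed = time.perf_counter() - start

size_mb = DST.stat().st_size / 1e6
print(f"{size_mb:.0f} MB in {elapsed:.1f} s = {size_mb * 8 / elapsed:.0f} Mbit/s")
```

Run it once in each direction (swap SRC and DST) to get both the download and upload numbers.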
 
Random wireless disconnects...

Make sure your client device's list of previously used SSIDs contains none of the other in-range networks, even weak ones. Pay particular attention to open SSIDs - ones with no encryption.

If your client loses the signal briefly, or the signal stays weak too long, the client can scan for and try to join a different SSID for which it has the password, or a no-password SSID - on any channel.
Or it may roam within your own WiFi, if you have multiple APs and the AP nearest the client device is weak.
 

Since you don't use analyzers... how would you or I test for the best 5GHz channel? What do you use to measure throughput? Copying a file from a wireless device to another device on the network? Speedtest.net?
 
Copying big files across your LAN is a good test, provided the PC acting as the server is fast enough.

Speedtest.net is very good IF you choose the correct test host and use it consistently - and know that during the Internet's busy hours, speeds may at times degrade.

The issue is whether there's a bandwidth-hog neighbor within 3 channels of yours. There's no quick/easy way to know that with survey tools.
 
Or sometimes even with simple testing. Test at the times you actually care about wireless performance. If you test at 10pm, your neighbor might generally be asleep; but if you care more about performance at 8pm, it could turn out that they run giant backups from their laptop to their NAS every day from 7-9pm, hammering your channel.
 
As mentioned above, I use a large file (1 to 5GB) between a wired client and a wireless client on the WiFi band I'm testing. Lately these clients have all had SSDs installed, so drive performance is not an issue.

I also test from the Ookla speedtest site using the customer's ISP as the test host. The steadiness (or not) of these download graphs is a good indication of how responsive browsing will be.
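If you want the same steadiness view without a browser, here is a sketch that samples download throughput once per second - the URL is a placeholder for any large test file hosted by (or close to) your ISP:

```python
# Sketch: watch download *steadiness*, not just the average, in the spirit
# of the Ookla graphs mentioned above.
import time
import urllib.request

URL = "http://speedtest.example.net/1GB.bin"  # hypothetical test file
CHUNK = 1 << 20                               # read 1 MiB at a time

with urllib.request.urlopen(URL) as resp:
    window_bytes, window_start = 0, time.perf_counter()
    while chunk := resp.read(CHUNK):
        window_bytes += len(chunk)
        now = time.perf_counter()
        if now - window_start >= 1.0:         # print one sample per second
            rate = window_bytes * 8 / (now - window_start) / 1e6
            print(f"{rate:6.1f} Mbit/s")
            window_bytes, window_start = 0, now
```

A steady printout suggests a clean channel; big swings suggest contention.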

These tests are carried out at least twice, an hour apart, on the best channel found. For some customers I have also kept testing every day for a week at the same time, plus one or two additional runs when the results seemed good but the customer still reported slowdowns at various times during the workday.

As the customers quickly realize, they may not have the best speeds the theory predicts they should - but they do have the best speeds possible with the equipment and environment they're in.

If/when they do want to buy new equipment, it is easy to show them the improvement, if any, with the same procedure and, of course, by comparing the numbers to their previous runs.
 
If, say, WiFi shows a 56Mbps connection speed, then theory says you'll get 60% or so of that as net yield in IP speed tests, ignoring all other causes of speed reduction.

What goes into that 60% that WiFi marketing never mentions?
Things like
  • WiFi is half-duplex, unlike Ethernet, which is (normally) full duplex
  • Every WiFi frame sent has to get a response "ACK" message from the receiving end. This takes time = overhead
  • When a WiFi frame (sub-packet if you will) has an uncorrectable error, the sender has to time-out waiting for the ACK, and this is dead-air time = lower throughput
  • WiFi does "listen before transmitting" (CSMA/CA) to reduce transmission collisions.
  • WiFi uses forward error correction (FEC) which means a percentage of bits sent at some Mbps rate are not data bits - they're FEC coding bits. Without errors, you can say these bits use up bandwidth
  • And a few more. Wireless is hard.
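Putting that rule of thumb into numbers - the 0.60 here is the rough estimate from this post, not an exact constant:

```python
# Quick arithmetic for the rule of thumb above: net IP throughput is very
# roughly 60% of the indicated link rate once the listed overheads (half
# duplex, ACKs, retries, CSMA/CA, FEC) are paid for.
link_rate_mbps = 56     # the "connection speed" the OS reports
efficiency = 0.60       # rough estimate, not an exact constant

print(f"expected IP throughput ~ {link_rate_mbps * efficiency:.0f} Mbit/s")
# -> ~34 Mbit/s from a 56 Mbit/s link, before any other causes of slowdown
```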
 
Well, actually "it depends" on what the theory is for what you can get.

802.11n is a FRIGGEN mess. There are so many "optional" bits that you don't know what you might possibly get. 802.11ac is really nice in that it clears out a fair amount of the optional stuff, making a lot of it mandatory or dropping optional support entirely.

Under 802.11n without ANY extensions/optional bits, the theoretical maximum is roughly a 78% net yield of the modulation rate for data payload, assuming a very good wireless environment and a clean client and base station.

That is basically the minimum overhead that forward error correction, ACK packets, beacon frames, etc. take up out of your "connection" speed.

This is under ideal circumstances. Net yield will drop as you get further from the access point, not just the modulation rate, even under very good conditions. This is because of some of the other items mentioned: waiting out timeouts on poorly received packets, retransmits, etc.

I've seen a 78% net yield (228Mbps on a 300Mbps connection) with a very good client close to a very good base station. I think the best net yield I've seen "far" from a base station, even when there was NO perceivable interference, was around 60% or so. For example, at a modulation rate of 120Mbps the max I see is around 75Mbps. At a modulation rate of 48Mbps the max I see is around 28Mbps. At a modulation rate of 11Mbps the max I see is around 6Mbps. Closer in though, at a modulation rate of 240Mbps, I see around 168Mbps, which is spot on 70% yield.
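For what it's worth, the yields implied by those figures line up; the pairs below are taken straight from the numbers above:

```python
# Yields implied by the figures above: (modulation rate, measured
# throughput) pairs as reported in this post.
observations = [(300, 228), (240, 168), (120, 75), (48, 28), (11, 6)]

for link, measured in observations:
    print(f"{link:3d} Mbit/s link -> {measured:3d} Mbit/s = {measured / link:.0%} yield")
# Yield falls from ~76% near the base station toward ~55-60% at range.
```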
 
Yes, with an excellent signal-to-noise ratio the modulation mode can use fewer error-correcting bits (a higher coding rate). With fewer non-data bits per frame, the net yield improves.

How good each vendor's/chip's MAC-layer design is matters too, but there is no way to really know. I'd like to think that Broadcom and one or two other chip makers do it optimally, whereas some slop shops just toss out something that appears to sort of work for case 1.

None of this is visible to the retail buyer.
 
Not sure if it is SoC or radio/amplifier differences or firmware or other board components, but there are certainly noticeable differences in performance/net yield between different manufacturers, and there seem to be some differences between SoCs/radios too.

My guess is it all adds up. A really great SoC and radio married to a crappy board design with poor RF shielding and a crappy trace/component layout, combined with bad firmware, is likely to end up with really cruddy performance. A crappy SoC and radio married to an awesome board design, traces, great firmware and antennas/placement might actually be a fairly decent performer.
 
Yes - a great WiFi chipset and MAC firmware used with a badly designed amplifier producing a distorted waveform = higher error rate = lower IP-layer yield.
 
