
Bye Bye 40 MHz Mode in 2.4 GHz


As beisser reaffirmed, the 20 MHz coexistence fallback in 2.4 GHz is in the 802.11-2012 spec. A "40 MHz only" mode in 2.4 GHz is also against the spec.

If anyone is interested in hardcore knowledge behind wireless, I can only suggest this:

http://www.wirelesstrainingsolution...-cwna-certfied-wireless-network-administrator

After that you will know what you are talking about, I promise.

@tim I hope you don't mind that link. Wireless is always a topic where people complain without having the basic knowledge of how the technology works.

I recently did the training above and it really helps you understand wireless technology. It may be interesting for some here.
 
Note that the IEEE does NOT perform compliance tests or issue certifications of compliance.

The industry alliance that trademarked "WiFi" (the Wi-Fi Alliance) has a certification process. Manufacturers supposedly follow the WiFi Alliance's process honestly and submit their findings, as conducted by or accepted by a testing organization recognized by the WiFi Alliance.

This whole process is, in reality, an honor system, and the intense marketing competition among vendors, especially in some places in Asia, causes the rules to be bent, if you will. The WiFi Alliance also lacks strong policing of its own policy: it has a conflict of interest, since the Alliance is funded by the manufacturers' dues, and the sheer scale of the marketplace doesn't help.
 
You better know everything.

Fee: $2,095.00

Needed it for my job, so the company paid :)

But it really teaches you everything there is to know about wireless. Most people think they know how wireless works, while in reality they don't.
I was totally surprised at the amount of incorrect knowledge and assumptions I had amassed over the years.

Of course this is not something just anyone can do (because of the price tag), but those who need to "work" with the technology can certainly use the deeper knowledge behind it.
 
WNDR4000

I was wondering if I'd see an article about this forced fallback to 20 MHz even when the router is manually configured to operate at 40 MHz on the 2.4 GHz band.

I started noticing this back in January of this year with the WNDR4000. I've got my own "neighboring" wireless network (for additional coverage and a dedicated 802.11g AP so it doesn't slow down the N clients), and the WNDR4000 definitely enforces its good neighbor policy and falls right back to 20 MHz when it detects a strong neighboring network.

I did a little testing to see exactly how and when this kicked in. The unit, when powered up, does initially operate @ 40 MHz, but within a minute or two it will fall back to 20 MHz if a decent neighboring signal is detected (a weak enough neighboring network won't cause the fallback). Another interesting bit I found: if an N client connects to the router at 40 MHz immediately, before it switches over to 20 MHz, the unit appeared to continue operating @ 40 MHz until that client disconnected.
 
Somehow I always thought using 40 MHz BW on 2.4 GHz was illegal. Maybe
it's being enforced now? I have a habit of not using it on the 2.4 GHz radio.
 
Thanks for the info on the NETGEAR, Zero. To clarify for other readers, NETGEAR routers don't have a 40 MHz only mode. The "Up to 300 / 450 Mbps" modes are auto 20/40.
 
Somehow I always thought using 40 MHz BW on 2.4 GHz was illegal.
Not a question of legality. The issue is spec compliance.

As noted in the article, and in earlier posts in this thread, the described behavior has been in the 802.11 spec since the release of 802.11n in late 2009.
 
Well, if there is a neighbor's AP, your 40 MHz mode won't work very well anyway, as that AP WILL cause interference, frame drops and whatnot. What you will see is a rapid decline in performance and increasing packet loss. So sticking to 40 MHz mode won't do you any good in that scenario.



Not in my case! With 'illegal' 40 MHz bandwidth forced on and Slingbox streaming wirelessly over 2.4 GHz for 6+ hours, only 1 dropped packet disproves this! :D

Broadcom's wireless utility even shows the interference as being high/medium:
[screenshot: congestion.jpg]
 
That shows IP packets lost, not 802.11 frames.

You won't be able to see frame loss unless you either have a WiFi protocol analyzer or your AP/router supports displaying such stats.

A benchmark would also show this.

A jperf benchmark through a good 300 Mbps WiFi connection should get you somewhere around 13-15 MB/sec (at least 10+). Anything less and you are seeing the effects of frame drops and retransmits.

What your utility tells you is one thing; what you really get may be something entirely different :)
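For anyone wondering where the 13-15 MB/sec figure comes from, here is a rough back-of-the-envelope sketch (my own numbers, assuming real-world goodput is around 35-40% of the PHY link rate; that efficiency figure is an assumption, not something from the spec):

Code:
# Rough goodput estimate for an 802.11n link (illustrative only).
# Assumption: usable TCP throughput is ~35-40% of the advertised PHY rate
# once preambles, ACKs, contention and retransmits eat their share.
link_rate_mbps = 300.0                    # PHY rate reported by the driver
for efficiency in (0.35, 0.40):
    goodput_mbps = link_rate_mbps * efficiency
    goodput_mbytes = goodput_mbps / 8.0   # megabits/s -> megabytes/s
    print(f"{efficiency:.0%} efficiency: {goodput_mbps:.0f} Mbps = {goodput_mbytes:.1f} MB/sec")

That works out to roughly 105-120 Mbps, i.e. about 13-15 MB/sec, which is where the figure above comes from.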
 
Not in my case! With 'illegal' 40 MHz bandwidth forced on and Slingbox streaming wirelessly over 2.4 GHz for 6+ hours, only 1 dropped packet disproves this!
Not "illegal" per the FCC (they make the regulations for the US, not IEEE or WiFi Alliance).

Dropped packet is IP layer. Lotta things can cause that - most often, the internet, not WiFi. As said above, the measure of WiFi is an 802.11 analyzer that tallies up MAC ACK timeouts w/retransmissions, and uncorrectable frames (FER = frame error rate). FER is well hidden from consumers.

Good thing, too, because crummy hardware abounds that transmits distorted waveforms (bad FER as the signal leaves the transmitter, before the impairments of propagating any distance). Some (DD-WRT) users try to coerce the hardware to higher power than it was designed for at a given modulation rate (bit rate). The higher the rate, the lower the RMS power in the channel. It's the laws of physics plus the economics of the final amplifier in WiFi products, which must be very, very linear at the highest OFDM bit rates. That's called the "OFDM back-off", meaning the transmitter RMS power has to be reduced to avoid distortion at the higher bit rates. The back-off is around 5 dB (6 dB would be quarter power). Avoiding that back-off is costly in hardware in a competitive market, so usually it's not done. The lower rates in 802.11 are not OFDM, so there is no back-off and higher power specs apply.
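To put numbers on those back-off figures (just the dB arithmetic, nothing vendor-specific):

Code:
# Convert a transmit power back-off in dB to the fraction of full RMS power.
def backoff_fraction(db):
    return 10 ** (-db / 10.0)
for db in (3, 5, 6):
    print(f"{db} dB back-off -> {backoff_fraction(db):.0%} of full power")
# 5 dB is roughly 32% of full power; 6 dB is about 25%, i.e. quarter power.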
 
Very good post. :)

For those that can't afford the CWNA course, you can also read http://www.amazon.com/dp/0470438908/?tag=snbforums-20 which will help you understand a few things like 802.11 frame retransmits and frame corruption. It is a 700+ page book though, so I wouldn't consider this "light" reading :)

Understanding how wireless works at layers 1 and 2 will help you understand why it behaves the way it does and why bullying your neighbors by forcing 40 MHz on 2.4 GHz isn't going to help anyone.

So what if it says a link rate of 300 Mbps? You will get so many frame errors that the effective rate will plummet (hence my suggestion to do an actual iperf/jperf benchmark) :)
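A crude way to picture why the effective rate plummets (my own toy model, which ignores rate adaptation and simply assumes every corrupted frame has to be retransmitted once):

Code:
# Toy model: retransmitting corrupted frames wastes airtime, so useful
# throughput scales roughly with (1 - frame error rate).
clean_goodput_mbps = 110.0                 # what a clean 300 Mbps link might deliver
for fer in (0.0, 0.1, 0.3, 0.5):
    effective = clean_goodput_mbps * (1 - fer)
    print(f"FER {fer:.0%}: about {effective:.0f} Mbps of useful throughput")

The real penalty is usually worse, because high frame error rates also push the radios down to slower modulation rates.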
 
That's internet speed, so that speed test is useless. 75 megabits is roughly half of what you can see on a good 300 Mbps WiFi connection on the LAN.

So if all you do is test the internet, you will probably never notice issues. Your wireless could be dropping/retransmitting frames like crazy and you probably wouldn't notice it, because your internet is considerably slower than your wireless.

Test jperf between two PCs: one connected by cable and one connected via 300 Mbps wireless (a rough do-it-yourself sketch follows below). If you see 130+ Mbps of stable throughput I would consider that good; if you get less than 100 you know your wireless has issues.

And finding out what exactly is causing the issues would require a lot of effort, but it would most likely (80%+) be related to your forced 40 MHz plus outside interference.

There is a reason why falling back to 20 MHz is required if you want to actually be 802.11 compliant.

40 MHz on 2.4 GHz will basically only work well if you are inside a shielded bunker or in a very rural area where no other wireless exists :)
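If you don't want to set up jperf, here is a bare-bones stand-in for the LAN test described above (my own sketch, not jperf: plain TCP and a fixed 10-second run; save it as, say, lanperf.py, run "python lanperf.py serve" on the wired PC and "python lanperf.py send <wired-pc-ip>" on the wireless one):

Code:
# Minimal LAN throughput test (a rough stand-in for jperf, nothing fancy).
import socket, sys, time
PORT = 5201            # arbitrary port; change it if something else is using it
CHUNK = 64 * 1024      # 64 KB writes
DURATION = 10          # seconds to transmit
def serve():
    srv = socket.create_server(("", PORT))
    conn, _addr = srv.accept()
    total, start = 0, time.time()
    while True:
        data = conn.recv(CHUNK)
        if not data:
            break              # sender closed the connection
        total += len(data)
    secs = time.time() - start
    print(f"received {total / 1e6:.1f} MB in {secs:.1f} s = {total * 8 / secs / 1e6:.1f} Mbps")
def send(host):
    sock = socket.create_connection((host, PORT))
    payload = b"\x00" * CHUNK
    total, start = 0, time.time()
    while time.time() - start < DURATION:
        sock.sendall(payload)
        total += len(payload)
    sock.close()
    print(f"sent {total / 1e6:.1f} MB in {DURATION} s = {total * 8 / DURATION / 1e6:.1f} Mbps")
if __name__ == "__main__":
    serve() if sys.argv[1] == "serve" else send(sys.argv[2])

It won't be as accurate as jperf, but it is enough to see whether you are anywhere near the 130+ Mbps figure above.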
 
I tried the 40 MHz only mode for 2.4 GHz on my RT-N66U with the latest stock firmware. I'm in an apartment building, so interference is plentiful. At 40 MHz, the transfer rates are extremely inconsistent, plummeting to as low as 5 Mbps in my testing. It does seem easier for my Logitech Revue to stream HD video at 40 MHz on 2.4 GHz. I sometimes get a forced "low bandwidth" switch to SD at 20 MHz despite my 30 Mbps down, 5 Mbps up cable connection. I never see that at 40 MHz. Browsing is where the inconsistencies are the most pronounced. There's either a little stutter when the page is loading or it stalls completely on my tablet.

It makes sense to use 20 MHz in my situation (or 5 GHz where I can). Using 40 MHz on 2.4 GHz gives you more bandwidth, but opens you up to that much more interference because you're using more spectrum. In an apartment building saturated with AT&T and TWC 2.4 GHz routers, not to mention other appliances, it's better to get as much stuff as you can onto 5 GHz. I was shocked to see that my 5 GHz network was the only one inSSIDer detected around me.

Last thought: besides the conspiracy to eliminate 40 MHz on the 2.4 GHz frequency, what happened to external antennas on routers? I'm sure that antenna technology has progressed some, but they've become a rarity. Did OEMs mostly do away with them simply because the range wasn't needed or did antenna technology improve that much? It sort of makes sense to lessen the range of a router as a way to limit interference (once newer routers are adopted).
 
Last thought: besides the conspiracy to eliminate 40 MHz on the 2.4 GHz frequency,
To repeat information presented here: 40 MHz operation in 2.4 GHz, with the coexistence mechanism, is an 802.11-2012 spec requirement.

what happened to external antennas on routers? I'm sure that antenna technology has progressed some, but they've become a rarity. Did OEMs mostly do away with them simply because the range wasn't needed or did antenna technology improve that much? It sort of makes sense to lessen the range of a router as a way to limit interference (once newer routers are adopted).
Three reasons: Cost, aesthetics and performance.

Dual-band three stream routers require three antennas per radio. Single band antennas are better than dual-band so that would be six antennas. That would have very low WAF.

Internal antennas are less costly than external, especially when they can be integrated on the main board.

Finally, there is no performance tradeoff. Antenna technology has indeed progressed. The plastic case is transparent to RF. The main disadvantage is losing the ability to play with independent antenna positioning. This has become less important with 802.11n, where signal multipath actually is a good thing (it improves receiver gain).
 
Antenna technology has indeed progressed. The plastic case is transparent to RF. The main disadvantage is losing the ability to play with independent antenna positioning. This has become less important with 802.11n, where signal multipath actually is a good thing (it improves receiver gain).
The jargon: "beneficial multipath" versus "destructive multipath". For many years, the latter has been converted to the former using (way back) "adaptive equalizers", which essentially create delay windows in time to recombine the later-arriving signal power with the earlier. The later ones arrive late due to the longer path that the multipath takes. Ye olde analog TV ghost picture, from a mountain, airplane, etc., is classic multipath.

DOCSIS cable modems have to deal with multipath due to reflections from imperfectly terminated coax and amplifiers in the coax plant. So they use adaptive equalizers, where "adaptive" means they can change with varying conditions. These are non-OFDM ("single carrier") signals.

In OFDM signals, as in popular 802.11, the same concepts apply for combining signals after making them essentially time-coincident despite the delay times (multiple delay paths: tree leaves, icy roads, some building materials, etc.). MIMO also adds space/time diversity, the subject of many graduate theses, and the credit goes to the likes of the NSA from decades back for inventing it. It was used in the '60s for tropospheric scatter/over-the-horizon communications.
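To make the "adaptive equalizer" idea a bit more concrete, here is a toy numpy sketch (my own illustration with a made-up two-path channel; it has nothing to do with any real DOCSIS or 802.11 implementation):

Code:
# Toy LMS adaptive equalizer: a delayed echo (multipath) corrupts BPSK symbols,
# and a short FIR filter adapts its taps to undo the echo.
import numpy as np
rng = np.random.default_rng(0)
symbols = rng.choice([-1.0, 1.0], size=5000)             # transmitted BPSK symbols
channel = np.array([1.0, 0.6])                           # direct ray plus an echo one sample later
received = np.convolve(symbols, channel)[:len(symbols)]
received += 0.05 * rng.standard_normal(len(received))    # a little noise
num_taps, mu = 8, 0.01                                    # equalizer length and adaptation step size
taps = np.zeros(num_taps)
sq_err = []
for n in range(num_taps, len(received)):
    window = received[n - num_taps + 1:n + 1][::-1]       # newest sample first
    estimate = taps @ window
    error = symbols[n] - estimate                         # training: the true symbol is known
    taps += mu * error * window                           # LMS tap update
    sq_err.append(error ** 2)
print("mean squared error, first 500 symbols:", round(float(np.mean(sq_err[:500])), 3))
print("mean squared error, last 500 symbols: ", round(float(np.mean(sq_err[-500:])), 3))

The error drops as the taps converge, which is the "recombining the later-arriving signal power" described above, done in a few lines of DSP rather than with extra antennas.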

The enabler wasn't antennas per se, but low cost high speed digital signal processors.

It's amazing that today's economies have taken this to, what, 25% of $125 manufacturing cost?
 
I thought 20/40 coexistence only kicked in if legacy a/b/g networks (20 MHz only) encroached into the channel range, and not other 20/40 N devices? Is that a wrong assumption?

Also, if a 40 MHz Only mode exists in the router, would that fail WiFi Alliance certification, or is that an IEEE standard requirement only?
 
I thought 20/40 coexistence only kicked in if legacy a/b/g networks (20 MHz only) encroached into the channel range, and not other 20/40 N devices? Is that a wrong assumption?
That's incorrect. Any network that meets the coexistence interference criteria should prevent 40 MHz mode operation.

Important to note that networks that are right on the primary or secondary channel do not trigger coexistence fallback. So if you have an AP in Auto 20/40 mode with the primary channel on 1 and the secondary on 5, and a neighboring network is operating on either of those channels, 40 MHz operation will still be allowed.
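To make that rule concrete, here is a much-simplified sketch of it in code (my own illustration of the behavior described in this post only; it is not the full 802.11 OBSS scanning logic and it ignores channel 14):

Code:
# 2.4 GHz channel centers are 5 MHz apart starting at 2412 MHz (channel 1).
def channel_center_mhz(ch):
    return 2407 + 5 * ch

def allows_40mhz(primary, secondary, neighbor_channels):
    """True if 40 MHz operation would be kept under the simplified rule above."""
    low = min(channel_center_mhz(primary), channel_center_mhz(secondary)) - 10
    high = max(channel_center_mhz(primary), channel_center_mhz(secondary)) + 10
    for nb in neighbor_channels:
        if nb in (primary, secondary):
            continue                     # aligned neighbors do not force a fallback
        nb_low = channel_center_mhz(nb) - 10
        nb_high = channel_center_mhz(nb) + 10
        if nb_low < high and nb_high > low:
            return False                 # overlapping, misaligned neighbor: drop to 20 MHz
    return True

print(allows_40mhz(1, 5, [6]))      # False: channel 6 overlaps but is not aligned
print(allows_40mhz(1, 5, [1, 5]))   # True: neighbors sit right on the primary/secondary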

Also, if a 40 MHz Only mode exists in the router, would that fail WiFi Alliance certification, or is that an IEEE standard requirement only?
An AP can have that mode for marketing reasons. But if the AP does not obey 20/40 coexistence, i.e. does not properly fall back to 20 MHz mode, it is in violation of both the 802.11 standard and WiFi Alliance certification requirements.
 
