N/AC/AX Mixed Wireless Mode, “Maximizes performance”?

Yeah. For example, if I send beacon frames at higher-than-minimum rate and my neighbor's webcam can't read that, how am I negatively affecting my neighbor? Seems to me that if anything, I'm affecting my neighbor positively by chewing up less airtime.

It doesn't work that way...

It's complicated, and we've tried to explain it - just leave things alone, and your network will be better for it.
 
Yeah. For example, if I send beacon frames at higher-than-minimum rate and my neighbor's webcam can't read that, how am I negatively affecting my neighbor? Seems to me that if anything, I'm affecting my neighbor positively by chewing up less airtime.

Well, in theory your neighbor's device will detect that as noise rather than being able to negotiate and share; however, negotiating and sharing with A and G weren't that great anyway. It isn't really the minimum/supported rates advertised (since they're the same for all 5GHz technologies) but the fact that you aren't advertising A/G/N (or whatever) capability.
 
802.11a wasn't very common back in the day - mostly found in the enterprise space (I have an old Cisco 802.11a PC Card)...

We did roll it out where I worked back in the early 2000s. It was short lived and expensive. I still have my A/B and A/B/G cards, not entirely sure why. I think both are IBM. I actually have a few of the original Lucent Orinoco cards with the external antenna. And an old 2Wire router/DSL modem that accepted that PCMCIA card and had a little notch to run the optional external antenna.

Hey, it was pretty high tech and impressive back in the day.
 
We did roll it out where I worked back in the early 2000s. It was short lived and expensive. I still have my A/B and A/B/G cards, not entirely sure why. I think both are IBM. I actually have a few of the original Lucent Orinoco cards with the external antenna. And an old 2Wire router/DSL modem that accepted that PCMCIA card and had a little notch to run the optional external antenna.

Back in the day, this was standard issue - if you were allowed on the corp WLAN - I was a bit of a floater as I spent about 80 percent of my day outside of my direct office in meetings, whatever...


Oddly enough - I still have that card -- ;)

Later on - I had to put in a specific request to get an A/B/G card for my corp laptop as I had industry and customer meetings to cover...
 
It doesn't work that way...

How does it work then? I read @drinkingbird's explanations upthread, which made sense to me, but there didn't seem to be any reason there to keep legacy settings enabled unless one of my own devices required that.

It's complicated, and we've tried to explain it - just leave things alone, and your network will be better for it.

I find that attitude patronizing and more than a little offensive. I hold a bachelor's in EE and a doctorate in CS. If you can't explain things in a way that I can understand, I'm pretty sure you've lost just about everybody else on these forums.
 
Back in the day, this was standard issue - if you were allowed on the corp WLAN - I was a bit of a floater as I spent about 80 percent of my day outside of my direct office in meetings, whatever...


Oddly enough - I still have that card -- ;)

Later on - I had to put in a specific request to get an A/B/G card for my corp laptop as I had industry and customer meetings to cover...

Yeah we used the Aironet APs, even had one at home for a while, but not the client cards. The Orinocos were the standard issue for B and the IBM (since that was our laptop supplier) for the A/B then A/B/G ones.

My first wireless setup was in my college apartment on campus in either 99 or 2000. Intel ran this killer deal on an 802.11B AP and PCMCIA card as part of an early adopter program. It was still a few hundred bucks from what I recall. I had wifi before my campus and company did, and with practically no 2.4ghz interference, could get on it from some of the classrooms facing the apartments.
 
How does it work then?

I'm by no means an expert on the depths of wifi technology but my understanding has always been anything an AP or client can't "understand" will get interpreted as noise, and airtime sharing won't happen, etc. So I guess being neighborly would mean similar to not enabling 40mhz on 2.4ghz, as not only are you using up all of the spectrum but any device not capable of 40 will just be inundated with noise.

However on 5ghz like I mentioned, they all use the same rates and my (possibly incorrect) understanding is all 5ghz devices will be able to "see" each other and coexist, they just won't attempt to join something that isn't broadcasting the capability (in this case A or N300 etc).

In reality, I guess it is moot with 5ghz, it won't travel inside a neighbors house (or if it does it will be so weak it won't matter). Personally I like to disable the 6M rate on 5G, I even had 12 disabled for a while, but in reality it likely is not accomplishing much, if anything, and the vast majority of users probably would not notice any difference.
 
Back in the day, this was standard issue - if you were allowed on the corp WLAN - I was a bit of a floater as I spent about 80 percent of my day outside of my direct office in meetings, whatever...


Oddly enough - I still have that card -- ;)

Later on - I had to put in a specific request to get an A/B/G card for my corp laptop as I had industry and customer meetings to cover...

I just went and opened the closet of forgotten technology (inside the spare bedroom of "obsolete but potentially usable for something obscure" technology) to pull out the old 2wire/Orinoco setup. Then I saw just how horrific the closet of forgotten technology is and slowly closed the door again. I don't remember becoming a pack rat.....

Turns out there is a Cisco A/B/G/N card in there. Unfortunately it is Cisco Linksys (no idea when or why I got that). Also a Linksys (non Cisco) 2.4ghz G card, which is even more embarrassing and confusing.
 
I'm by no means an expert on the depths of wifi technology but my understanding has always been anything an AP or client can't "understand" will get interpreted as noise, and airtime sharing won't happen, etc.

But the airtime is eaten up whether or not receiving devices can read the packet, no?

Personally I like to disable the 6M rate on 5G, I even had 12 disabled for a while, but in reality it likely is not accomplishing much, if anything, and the vast majority of users probably would not notice any difference.

With you there. I've got the 2.4GHz radio disabled entirely on my APs --- I have no devices that need it, and there are way too many competing SSIDs in my vicinity. Better for me and for my neighbors to just not use the band. I've also got the multicast rate on my 5GHz radios set to 12Mbps, not 6. That works for all the clients I have, and I've still not heard a reason why I shouldn't bump it up to reduce the airtime consumed by beacon packets.
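Quick back-of-the-envelope on that beacon-airtime point, with an assumed 300-byte beacon and a deliberately simplified timing model (preamble plus payload time; ignores OFDM symbol rounding and contention overhead, so treat the numbers as rough):

```python
# Rough beacon airtime at different basic rates.
# Assumptions: 300-byte beacon frame; 192 us DSSS long preamble at 1 Mbps
# vs. a fixed 20 us OFDM preamble for the 6/12 Mbps rates.

BEACON_BYTES = 300  # assumed typical beacon frame size

PREAMBLE_US = {1.0: 192.0, 6.0: 20.0, 12.0: 20.0}

def beacon_airtime_us(rate_mbps: float) -> float:
    """Airtime of one beacon at the given basic rate, in microseconds."""
    return PREAMBLE_US[rate_mbps] + BEACON_BYTES * 8 / rate_mbps

for rate in (1.0, 6.0, 12.0):
    print(f"{rate:4.1f} Mbps -> {beacon_airtime_us(rate):6.0f} us per beacon")
```

So roughly 2592 us at 1 Mbps vs. 420 us at 6 and 220 us at 12. With one beacon per SSID every ~102 ms, halving the per-beacon time is real but small - which squares with the "fiddling at the margins" framing below.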
 
But the airtime is eaten up whether or not receiving devices can read the packet, no?

Yes but instead of just cutting in line it in theory will wait its turn. Noise vs. negotiation. However I don't know if that applies to this discussion at all.

With you there. I've got the 2.4GHz radio disabled entirely on my APs --- I have no devices that need it, and there are way too many competing SSIDs in my vicinity. Better for me and for my neighbors to just not use the band. I've also got the multicast rate on my 5GHz radios set to 12Mbps, not 6. That works for all the clients I have, and I've still not heard a reason why I shouldn't bump it up to reduce the airtime consumed by beacon packets.

Unfortunately the Asus doesn't give you any way in the GUI to disable rates on 5GHz (other than multicast), but using a script you can shut 6 off, which will also disable the 6M multicast rate and bump your beacons up to 12M. I don't think bumping the multicast rate alone is enough to decrease beacon noise; AFAIK they are still at 6M in that case. But on 5GHz, not a huge deal.

I do need 2.4GHz but have mine set to 5.5 minimum. I'd bump it to 6 and disable 11, but if I get a brand-new Blu-ray that needs to download an encryption key, that old draft-N player has to be able to connect. Though I don't think they've added new encryption to standard Blu-rays in years now; everyone is focused on 4K. As long as the DSSS rates are disabled I'm happy; CCK can stay for now.
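For reference, on Broadcom-based Asus firmware this kind of rate tweak is usually done with the `wl` utility from a startup script. A hypothetical sketch - the interface name here is a guess and the exact rateset varies by model/firmware, so check `wl rateset` output on your own unit first:

```
# Hypothetical sketch for a Broadcom-based Asus router (e.g. from a
# post-mount script). eth6 is a guess for the 5 GHz interface.
# The trailing "b" marks a rate as Basic; omitting 6 and 9 entirely
# leaves 12 Mbps as the lowest basic rate for beacons/management frames.
wl -i eth6 down
wl -i eth6 rateset 12b 18 24b 36 48 54
wl -i eth6 up
```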
 
Sounds to me like a reason to disable

When enabled, your router will see it as Wi-Fi communication and not noise - time-sharing instead of talking over. Since the devices we discuss are extremely rare, Auto does the job well and no change is needed.
 
I hold a bachelor's in EE and a doctorate in CS. If you can't explain things in a way that I can understand, I'm pretty sure you've lost just about everybody else on these forums.

Well, if you have a PhD, then you know how to do your research, you don't need me to explain it to you...
 
When enabled, your router will see it as Wi-Fi communication and not noise - time-sharing instead of talking over.

I'm not following --- the router will see what as communication? Clients don't send beacon packets. My mental model of this is that an old client will sit there waiting for a beacon it can understand and is authorized to connect to. A beacon it can understand doesn't seem markedly more useful than one that it can't, if it's for somebody else's SSID.

Since the devices we discuss are extremely rare, Auto does the job well and no change is needed.

Sure, we're just talking about fiddling at the margins here. But you seem to be claiming that tweaks like a faster beacon rate make things actively worse, and I don't follow the reasoning.
 
I'm not following --- the router will see what as communication? Clients don't send beacon packets.
I think I grasp what's being suggested (and I have /no/ college degree). The router and (other) client will thus see /communication/ as communication rather than just so much background noise. If the router (and/or client) then politely sits quiet, there will certainly be degradation to the router's network - possibly beyond that from just so much extra noise, right? And, hahaha, who cares about being disruptive to another's archaic device anyway? They should update it...
 
I do have my radios set not to respond to A and B on my Cisco wireless APs; I think that's the default. I have several APs with the power turned down, using separate channels. As far as I know I'm not seeing any issues. My 2.4GHz radios are down 50% on power, since 2.4GHz penetrates my exterior brick walls more than my 5GHz radios do. I am no expert in radios.
 
the router will see what as communication?

Close-proximity Wi-Fi communication between other routers and their connected clients. Your router shares airtime with other routers as well; the Wi-Fi spectrum is shared. If your router doesn't support or recognize this communication, it will be seen as noise, like a microwave oven, and it may attempt to talk to its own clients over it. There is a common road or highway analogy for Wi-Fi: the road is shared between high- and low-speed vehicles, cyclists, and pedestrians. Imagine you always drive and never walk or ride a bicycle. What do you do - drive over the cyclists and pedestrians because you don't need to see them anymore?
 
Close-proximity Wi-Fi communication between other routers and their connected clients. Your router shares airtime with other routers as well; the Wi-Fi spectrum is shared. If your router doesn't support or recognize this communication, it will be seen as noise, like a microwave oven, and it may attempt to talk to its own clients over it.

I get that this could happen if my neighbor's router fails to recognize beacons that mine is sending, but now you are talking about ancient APs not ancient clients. That's not a scenario I think I need to worry about. Every SSID my wifi scanner can see is at least 802.11ac capable.

Also, I don't entirely buy that beacons sent at higher speed will look like noise-that-its-okay-to-send-over to other devices. If that's the case, there would be a lot more problems with high-speed data packets being interfered with than with beacon frames being interfered with. I confess not knowing the details of how devices decide they're clear to send, but ISTM mixed-generation WLANs wouldn't work at all if older devices thought that high-speed packets were just noise.
 
I get that this could happen if my neighbor's router fails to recognize beacons that mine is sending, but now you are talking about ancient APs not ancient clients. That's not a scenario I think I need to worry about. Every SSID my wifi scanner can see is at least 802.11ac capable.

A/N/AC/AX - they're forward/backward compatible, so little to no impact to leave the AP configured for all four of the protocols in 5GHz...

Nice thing with 6E (6GHz) is that it's greenfield, and there is no legacy support required there - so it's a lot less complicated.

The challenge in mixed mode is 11B and the DSSS modes - G/N/AX are fine, but 11B has constraints around the preambles and basic rates. There's also the issue with ERP and protection modes there, along with client protections that only apply to 802.11n in 2.4GHz.

OpenWRT actually changed their defaults for 2.4 to disable the legacy modes a couple of years ago (one can re-enable them if needed).
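For anyone curious, that OpenWrt knob lives in /etc/config/wireless; a sketch (radio name, band, and channel here are device-specific placeholders, and option names are as of recent releases):

```
# /etc/config/wireless (excerpt) - radio0 is device-specific
config wifi-device 'radio0'
        option type 'mac80211'
        option band '2g'
        option channel '6'
        # 0 = don't advertise the 802.11b DSSS/CCK rates (the newer default);
        # set to 1 to bring them back for a legacy-only client
        option legacy_rates '0'
```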

One of the more interesting things I've seen recently is that the carrier-provided gateway/routers configure the 2.4GHz radios as G/N/AX, disabling the legacy mode. Those same gateways configure the 5GHz radios as A/N/AC/AX - as we've discussed, there's no impact to supporting 11A clients.

As others have mentioned earlier in this thread - it's hard to see why one would use 11B on a regular basis. Maybe it's for an older printer, or to support older game consoles (there were a few that were 11B only; Nintendo had a few) - and there's the retrocomputing thing, where 11B might be all that's available.

For those cases - I would just dedicate a separate AP with its own SSID and encryption scheme. However, I wouldn't even consider WEP, and WPA is a manageable risk, but not entirely risk free (many 11B clients can't do WPA2/AES, but most can do WPA/TKIP).

Anyways - I've done a lot of work with ath9k, which is a QC-Atheros 802.11n chipset family - and there, the driver and WiFi NIC actually track the association ID and uplink/downlink rates, so mixed mode works fine - and when disabling the 11B legacy rates, it sets the basic rates accordingly (6/12/24). Remember, when one is looking at PCAPs, the (B) doesn't mean 11B, it means the Basic Rates.

The Basic Rates determine the data rates for the Beacon Frames, along with all management frames for the BSS - and 11A/N/AC/AX default to the same rate sets. This is so that all clients can decode them (the AP designer has no idea what mix of clients is on any given WLAN) - so there, the 802.11 specs actually say to support them all...
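As a concrete illustration of that split, here's how it looks in hostapd syntax (rates are in units of 100 kbps; these particular values are just the common OFDM set with 6/12/24 marked Basic, not taken from any specific build):

```
# hostapd.conf excerpt - rates in units of 100 kbps
# Advertise the full OFDM set; mark 6/12/24 Mbps as Basic,
# i.e. the rates used for beacons and other management frames.
supported_rates=60 90 120 180 240 360 480 540
basic_rates=60 120 240
```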

Going back to the spec - 802.11B and other legacy modes are deprecated, so support there is optional. One still has to support 11G/ERP and 11N/HT modes, along with 11AX/HE if the chipset supports it and is configured for HT and HE modes in 2.4 (note that AC/VHT doesn't apply; however, 11AX/HE mode allows the 2.4 radio to use the VHT fields).

So I hope this clears it up a bit - as some of the forum members may know, I'm a recovering member of the IEEE 802 working groups, both for 11 and 16 (16 was WiMax), as well as one of the folks that contribute from time to time on OpenWRT...

So @tgl - my answers earlier may have sounded a bit flippant and patronizing; if that came across as a personal slight, I do apologize. It is one of those things - there are better things to worry about than this. If you want to dive in and do more research, 802.11 is out there, and so is the ath9k source code and chipset documentation - some of it is samizdat, but it is out there if you would like to dig into things further...
 
I'm not following --- the router will see what as communication? Clients don't send beacon packets. My mental model of this is that an old client will sit there waiting for a beacon it can understand and is authorized to connect to. A beacon it can understand doesn't seem markedly more useful than one that it can't, if it's for somebody else's SSID.


Sure, we're just talking about fiddling at the margins here. But you seem to be claiming that tweaks like a faster beacon rate make things actively worse, and I don't follow the reasoning.

Modern wifi uses time slices and all clients participate and wait their turn. If they simply interpret the neighboring AP (or wifi device) as noise, they just talk over other devices and don't play nice.

Now, whether an A client that can't join a 5GHz network interprets that network as noise would require some research; it should still see it as a network, just not one it is capable of joining. But devices are supposed to coexist with all nearby networks, not just the ones they can join.

It isn't just APs sending beacons, all clients must participate in order for everything to play nice together.
 
