Hi,
I've seen many times, with different routers, that a higher channel gives a better signal at the device end. My idea was that the antenna is tuned better at the higher channel frequencies (better SWR, giving better ERP). Even some operating manuals mention this and encourage using the higher channels.
Some of it is antenna design. The higher the frequency, the higher the signal gain of the antenna, within reason: dual-band antennas typically have 2-3dBi more gain in the 5GHz band than in 2.4GHz. So a 5dBi dual-band antenna is actually, if the maker is advertising it/doing it right, a 5dBi antenna in 2.4GHz but a 7-8dBi antenna in 5GHz.
Now, the differences are minor, but there is a bit of a spread between channels in terms of frequency, so a higher 2.4GHz channel might well net a tenth or two of a dB better gain. It's more pronounced in the 5GHz band, where the low channels sit around 5.2GHz and the high ones around 5.8GHz...plus the U-NII-3/5.8GHz range was (until recently) allowed higher broadcast power than the U-NII-1 5.2GHz range.
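A quick back-of-envelope check of that "tenth or two" figure (my sketch, not from the post: it assumes a fixed-size antenna whose gain scales roughly as 20*log10(frequency), the same frequency factor that appears in the Friis equation, plus the standard 802.11 channel-to-center-frequency mapping):

```python
import math

# Standard 802.11 mapping from channel number to 20MHz channel
# center frequency in MHz (2.4GHz band: ch 1-13; 5GHz band: ch 36-165).
def channel_to_mhz(band: str, ch: int) -> int:
    if band == "2.4GHz":
        return 2407 + 5 * ch
    if band == "5GHz":
        return 5000 + 5 * ch
    raise ValueError(f"unknown band: {band}")

# Rough gain difference for an antenna of fixed physical size: gain
# tracks frequency squared, i.e. 20*log10(f_hi/f_lo) in dB.
def gain_delta_db(f_lo_mhz: float, f_hi_mhz: float) -> float:
    return 20 * math.log10(f_hi_mhz / f_lo_mhz)

ch1, ch11 = channel_to_mhz("2.4GHz", 1), channel_to_mhz("2.4GHz", 11)
print(ch1, ch11, f"{gain_delta_db(ch1, ch11):.2f} dB")      # 2412 2462 0.18 dB

ch36, ch165 = channel_to_mhz("5GHz", 36), channel_to_mhz("5GHz", 165)
print(ch36, ch165, f"{gain_delta_db(ch36, ch165):.2f} dB")  # 5180 5825 1.02 dB
```

That lands right on the "tenth or two of a dB" for channel 1 vs 11 in 2.4GHz; the low-to-high 5GHz spread works out to roughly 1dB, before any regulatory Tx power differences between the U-NII sub-bands.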
In my testing, I see no difference between channels in 2.4GHz (other than the occasional borked client/AP which HATES a specific channel, which I have seen; I have effectively no 2.4GHz interference). In 5GHz, I have found that setting channel 149 is slightly better than channel 161/165, at least in 11n 40MHz mode; I haven't done much 11ac testing. The 149-165 range is also slightly better than the low U-NII-1 5GHz range (again, on an N600 router that certainly doesn't have the new FCC broadcast power limits), but performance is pretty similar except right up at max range.
In very, very limited testing I see no difference between the low and high 5GHz channels with my AC1750 router, but it may have the newer FCC broadcast power limits for U-NII-1. Dunno. It is new enough that it's possible; or its Tx power could already be low enough not to matter much, or explicit beamforming could be evening things out, or I just haven't done enough testing.