
Increasing cellular reception with passive antenna


mto

Occasional Visitor
I wish to increase cellular reception with a passive antenna.

I noticed that some areas in my home have cellular reception below -100 dBm, which is extremely low.

There's a table which specifies the dBm levels and their meaning:
https://blog.surecall.com/enter-field-test-mode-on-iphone-or-android-phones/
(see the table near the bottom of the page).
I'll copy the table here:
dBm Range        Signal Quality
0 to -60         Perfect Signal
-61 to -75       Very Strong Signal
-76 to -90       Strong Signal
-91 to -100      Spotty Signal
-101 to -110     Very Poor Signal
-111 to -200     No Signal
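
A minimal Python sketch of the table's mapping, for illustration (the function names classify_dbm and dbm_to_mw are made up here; the thresholds are taken straight from the table):

# Map a dBm reading to the quality labels in the table above,
# and convert dBm to milliwatts (P_mW = 10 ** (dBm / 10)).

def classify_dbm(dbm: float) -> str:
    """Return the signal-quality label for a received power in dBm."""
    if dbm >= -60:
        return "Perfect Signal"
    if dbm >= -75:
        return "Very Strong Signal"
    if dbm >= -90:
        return "Strong Signal"
    if dbm >= -100:
        return "Spotty Signal"
    if dbm >= -110:
        return "Very Poor Signal"
    return "No Signal"

def dbm_to_mw(dbm: float) -> float:
    """Convert a power level in dBm to milliwatts."""
    return 10 ** (dbm / 10)

print(classify_dbm(-100))        # Spotty Signal
print(f"{dbm_to_mw(-100):.1e}")  # 1.0e-10 mW, i.e. 0.1 picowatt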

I prefer to use a passive antenna for this, instead of the common approach, which is a booster/repeater.

The cellular frequencies are below 2 GHz, and even below 1 GHz in my country.
Therefore, an RG6 cable can carry these signals with minimal attenuation.
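
To put a rough number on that: coax loss scales linearly with length and is usually quoted per 100 m (or 100 ft) at a given frequency. A quick Python sketch, assuming a typical RG6 figure of about 20 dB per 100 m at 1 GHz (an assumed value; check the datasheet for the actual cable and frequency):

# Rough cable-loss estimate. ATTEN_DB_PER_100M is an assumed typical figure
# for RG6 at 1 GHz (~20 dB per 100 m, i.e. ~6 dB per 100 ft); real cables vary.

ATTEN_DB_PER_100M = 20.0  # assumed RG6 attenuation at 1 GHz

def cable_loss_db(length_m: float, atten_db_per_100m: float = ATTEN_DB_PER_100M) -> float:
    """Total attenuation of a coax run: loss scales linearly with length."""
    return atten_db_per_100m * length_m / 100.0

# A 20 m run at 1 GHz would lose about 4 dB.
print(f"{cable_loss_db(20):.1f} dB")  # 4.0 dB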

I was thinking:

I connected a directional (Yagi) antenna to an RG6 cable and, at the other end, connected an omni antenna.
The Yagi antenna was pointed towards a cellular tower.
The omni antenna was placed inside one of the rooms in the house.

Sadly, that didn't work.
Does anyone know why?

How do I make a passive antenna work to get better cellular reception?
 
Edit and ask your question in the first post here; an admin can move it as necessary. :)

(Use the 'Report' button on your first post to ask them to.) :)
 
A passive antenna increases the signal path and therefore the losses, so a passive antenna doesn't work that way. If the whole run is short (cable, antenna, everything) and the antenna is more sensitive, then it could help. Another alternative is to simply wrap some cable around the device; I've tried it before and it did help a bit.

I have previously taken a massive directional antenna from an outdoor AP and attached it to the back of a wireless card, and the results were worse because the card simply did not have the power to drive that antenna.

If you want to use a passive antenna, make sure the device supports it and can supply the power required as well. Just remember: the more power you add, the more noise you get and the lower your bandwidth.
 
A passive antenna increases the signal path and therefore the losses, so a passive antenna doesn't work that way. If the whole run is short (cable, antenna, everything) and the antenna is more sensitive, then it could help.
Yeah, though the losses are minimal: for my cable length, the cable itself should introduce a loss of about 4 dB at 1 GHz and room temperature.

I think I understand the reason: impedance mismatch. The omni antenna should be matched to 75 Ω, i.e. the characteristic impedance of the RG6 cable.
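
For what it's worth, mismatch loss can be estimated from the reflection coefficient. A small Python sketch, assuming a nominally 50 Ω antenna on 75 Ω RG6 (typical but assumed values, purely resistive impedances for simplicity):

import math

def mismatch_loss_db(z_load: float, z_line: float) -> float:
    """Power lost to reflection when a load impedance terminates a line.

    Gamma = (Z_L - Z_0) / (Z_L + Z_0); loss = -10*log10(1 - |Gamma|^2).
    """
    gamma = abs(z_load - z_line) / (z_load + z_line)
    return -10 * math.log10(1 - gamma ** 2)

# A 50-ohm antenna on 75-ohm RG6: Gamma = 0.2, only ~0.18 dB of mismatch loss.
print(f"{mismatch_loss_db(50, 75):.2f} dB")

At those values the mismatch loss alone comes to only about 0.18 dB, so it may not account for the whole failure.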

If you want to use a passive antenna, make sure the device supports it and can supply the power required as well. Just remember: the more power you add, the more noise you get and the lower your bandwidth.
Well, it's just a cellular device, which transmits power according to the standards.

My attempt was to carry the cellular signal over a much lower-loss path, meaning through a cable.
 
Actually, cables have very high losses when it comes to mobile signals. The energy in the cable is so weak that it isn't like cable TV/internet, where a powerful transmitter feeds a limited path for the energy to travel. With conductors, weak signals can fare worse in copper or other highly conductive cables because of how easily they pick up other things, and because the signal has to travel from the antenna, through the cable, and then to the receiver, which is a long path. That's why transmitter and antenna are matched, with the right power designed for the right antenna, and the rules differ for different types of antennas: some configurations allow stronger receivers, but omni has the weakest allowed transmitter/receiver.

Weak signals don't survive in good conductors, precisely because of how good they are as conductors. Circuit board traces are narrow and don't use a particularly good conductor.

Laptops tend to have worse signal than external Wi-Fi cards because on the inside they have antenna cables, while an external card has barely any cable or path for the signal to travel after the antenna. It's not so much that the antenna is inside, because a plastic case won't impede the signal. Although a better antenna can make a difference, the signal path and interference are two very important factors you have to worry about.

Check your SNR with and without the antenna. It's far better for you to wrap copper around your device than to connect an antenna directly.
 
I just remembered: the reason you'd use a passive antenna is if the SNR is bad but the signal strength is good. A passive antenna won't help much if the signal is too weak, but most of the time it's the SNR that matters more than the signal strength. Some modems let you pick a channel (frequency), which has different effects: lower frequencies are much better for range but have less bandwidth. That's not always the case if the modulation is different, though.
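
That range-versus-bandwidth trade-off can be made concrete with the Shannon capacity bound, C = B * log2(1 + SNR). A quick Python sketch (the bandwidths and SNR values below are illustrative, not tied to any particular cellular band):

import math

def shannon_capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
    """Upper bound on channel capacity: C = B * log2(1 + SNR_linear)."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# A narrow channel with good SNR can beat a wide channel with poor SNR:
print(f"{shannon_capacity_bps(5e6, 20) / 1e6:.1f} Mbps")  # 5 MHz at 20 dB SNR -> ~33.3 Mbps
print(f"{shannon_capacity_bps(20e6, 0) / 1e6:.1f} Mbps")  # 20 MHz at 0 dB SNR -> 20.0 Mbps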
 
Actually, cables have very high losses when it comes to mobile signals. The energy in the cable is so weak that it isn't like cable TV/internet, where a powerful transmitter feeds a limited path for the energy to travel. With conductors, weak signals can fare worse in copper or other highly conductive cables because of how easily they pick up other things, and because the signal has to travel from the antenna, through the cable, and then to the receiver, which is a long path. That's why transmitter and antenna are matched, with the right power designed for the right antenna, and the rules differ for different types of antennas: some configurations allow stronger receivers, but omni has the weakest allowed transmitter/receiver.

Weak signals don't survive in good conductors, precisely because of how good they are as conductors. Circuit board traces are narrow and don't use a particularly good conductor.

Laptops tend to have worse signal than external Wi-Fi cards because on the inside they have antenna cables, while an external card has barely any cable or path for the signal to travel after the antenna. It's not so much that the antenna is inside, because a plastic case won't impede the signal. Although a better antenna can make a difference, the signal path and interference are two very important factors you have to worry about.
I understand: it's the interference from other signals that makes the cable a bad choice in my case.
Though, assuming my cellular device uses 800 MHz, which is reserved for cellular transmission only, why would I get interference in this frequency band (e.g. 700-900 MHz; except for white noise, of course)?
Even if the signal is very weak, the interference should leave the SNR almost the same as before the cable, because nobody else is transmitting on these frequencies.
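
One detail worth noting here: even in a perfectly quiet band, a passive loss in front of the receiver still costs SNR, because the receiver's own noise floor is fixed while the signal arrives attenuated. By the Friis cascade formula, a matched attenuator of L dB raises the system noise figure by the full L dB. A Python sketch (the 6 dB receiver noise figure is a made-up example):

import math

def db_to_linear(db: float) -> float:
    return 10 ** (db / 10)

def cascaded_nf_db(loss_db: float, rx_nf_db: float) -> float:
    """Friis formula for a passive (matched) attenuator followed by a receiver.

    Stage 1: cable with gain 1/L and noise factor L (linear).
    F_total = F1 + (F2 - 1) / G1 = L + (F_rx - 1) * L = L * F_rx.
    """
    loss = db_to_linear(loss_db)   # F1 = L, G1 = 1/L
    f_rx = db_to_linear(rx_nf_db)
    return 10 * math.log10(loss * f_rx)

# 4 dB of cable loss in front of a (hypothetical) 6 dB-NF receiver:
print(f"{cascaded_nf_db(4, 6):.1f} dB")  # 10.0 dB -- the SNR budget drops by the full 4 dB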

Check your SNR with and without the antenna. It's far better for you to wrap copper around your device than to connect an antenna directly.
It sounds like a good experiment, but I'd rather avoid carrying a copper wire with me while conversing o_O:D

I just remembered: the reason you'd use a passive antenna is if the SNR is bad but the signal strength is good. A passive antenna won't help much if the signal is too weak, but most of the time it's the SNR that matters more than the signal strength. Some modems let you pick a channel (frequency), which has different effects: lower frequencies are much better for range but have less bandwidth. That's not always the case if the modulation is different, though.

That contradicts my reasoning: if the SNR is bad, then "nothing" would help, no?
Unless you're using a directional antenna which blocks interference; then the SNR improves.
Or unless you're using a smart method which combines the signals from multiple antennas and recovers the proper signal despite the low SNR.


It sounds like I have no other option but to use a cellular repeater.
 
Laptops with new compact antenna designs that get great signal do something very, very different. The problem is that the cable also acts as an antenna. You can have an antenna that is specially designed for one frequency, good at receiving it and bad at receiving other frequencies, but it's in the cable where things start to go wrong. Since the cable is used to pass the signal, it will pick up other things (other frequencies, EMF, etc.) because the cable is also a good antenna for other things.

As for the frequencies you mentioned, I literally did wrap cables around a 3G modem once in class and got some internet where there was none (the signal was too weak). It didn't help much, but it did make a difference.

If the signal is too weak, the best option is a repeater, which you need to place closer to the tower where it can get a signal, and then, through the use of a directional antenna, direct it towards your house. A good passive antenna in an area where you already have a signal helps the signal strength a bit and improves bandwidth, but if the signal is too weak you are going to need a repeater.
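
To see why placement matters, free-space path loss grows with both distance and frequency: FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44, so halving the distance to the tower buys about 6 dB. A Python sketch (the distances below are made-up examples):

import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss: FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Halving the distance to the tower buys ~6 dB, regardless of band:
print(f"{fspl_db(4, 800):.1f} dB")  # ~102.5 dB at 4 km, 800 MHz
print(f"{fspl_db(2, 800):.1f} dB")  # ~96.5 dB at 2 km, 800 MHz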

Just remember that anything conductive also acts as an antenna, which includes the antenna cables. You can also test by wrapping the cable in aluminium foil (essentially what shielded Ethernet does to protect against EMF from power lines and the other sources you get in hospitals).
 
Have you thought about using Wi-Fi calling through your router or wireless AP? That's what I do, as my cell reception is poor in my old house. On a positive note, it does not count against the minutes on your plan.
 
