In designs where multiple antennas are required, what should be the minimum separation between the antennas to prevent mutual coupling and interference?
The answer is “it depends,” but I can give you some definite guidelines:
-
All of these comments apply to receiving antennas. If one of the antennas is a transmitting antenna, then the spacing MUST be much greater.
-
If the antennas operate in the same band, then they should be at least a quarter wavelength apart, and a half wavelength or a full wavelength is better, especially if they face the same direction at the same height. This is especially true of directional antennas; for two identical isotropic sources, a quarter wavelength is ideal.
-
If the antennas operate in different bands, then a quarter wavelength is also sufficient to limit interference, again assuming they are not directional; more distance is still better.
-
For all of these distances it is undesirable to make the spacing an exact quarter-wave or half-wave multiple (except perhaps when you are aiming for diversity reception); make it slightly larger to avoid resonant effects. Think five to ten percent extra to be safe.
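The rule of thumb above is easy to put into numbers. Here is a minimal sketch (the function name, the 7.5% margin, and the 2.4 GHz example are my own illustrative choices, picking the middle of the five-to-ten-percent range; it assumes free-space wavelength):

```python
# Rule-of-thumb minimum antenna separation: a quarter wavelength
# plus a small margin so the spacing is not an exact resonant fraction.
C = 299_792_458.0  # speed of light in vacuum, m/s

def min_separation(freq_hz: float, fraction: float = 0.25,
                   margin: float = 0.075) -> float:
    """Return a suggested minimum spacing in meters: `fraction` of a
    free-space wavelength, padded by `margin` (7.5% here, the middle
    of the five-to-ten-percent safety range)."""
    wavelength = C / freq_hz
    return fraction * wavelength * (1.0 + margin)

# Example: a 2.4 GHz WiFi antenna. Wavelength is about 12.5 cm,
# so a padded quarter wavelength comes out to roughly 3.4 cm.
print(f"{min_separation(2.4e9) * 100:.1f} cm")
```

For a half-wavelength spacing, pass `fraction=0.5`; the margin applies the same way.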
Some illustrative examples. Picture two dish antennas pointed at each other: each would block almost everything from the other while absorbing any radiated signal. Back-to-back, there would be almost zero effect beyond what comes from simply having metal nearby acting like a big capacitor. Two Yagi antennas behave much the same, but if you mounted both on a mast and offset their heights by a quarter wavelength you would get very little interference, and they could actually enhance one another. Finally, if both were etched into a PCB and operated in different bands (say one was GPS and the other was WiFi), you would only need a quarter wavelength to avoid coupling due to capacitance, etc.
So the simplest, fundamental answer is one quarter wavelength, except for the special cases noted above.