
Air Traffic Safety Electronics International

THE LEADING JOURNAL IN GLOBAL CNS/ATM COVERAGE

VOLUME 2 NUMBER 2, 2024 | APRIL/JUNE 2024

Rain Degradation of Satellite Communication Signals

Satellite technology is increasingly becoming the technology of choice the world over for a wide variety of applications, including radio communications, satellite internet services, broadcasting, mapping, earth observation, astronomy and weather forecasting. For the aviation industry, the gradual but determined shift towards satellite technology as the enabler of globally homogeneous, harmonized and interoperable air traffic management and air navigation systems started in 1983, when the Council of the International Civil Aviation Organization (ICAO) established the Special Committee on Future Air Navigation Systems (FANS), charged with studying, identifying and assessing new concepts and technologies and making recommendations for the coordinated development of air navigation.

Since 1988, when the FANS committee produced a report underscoring the importance of satellite-enabled datalinks and digital communications, and since 1992, when the 31st Session of the ICAO General Assembly endorsed the FANS concept, there has been no let-up in the aviation industry’s forays into satellite technology. From telecommunications and navigation to air traffic surveillance and weather forecasting, the industry has kept up the pace in exploring technological opportunities for ensuring the continuing safety, security and efficiency of civil aviation operations.

The aviation industry, however, is a peculiar arena: it is perhaps the most scrutinized and most tightly regulated sector in the world, operating on a complex framework of rules, regulations, standards, recommended practices and techno-operational requirements. In communications, for instance, civil aviation operations accept nothing short of 99.9% service availability. Yet satellite communication links are known to be vulnerable to elements that can negatively impact the propagation of electromagnetic signals, resulting in service degradation or complete outage.

One element has to do with the characteristics of the electromagnetic waves propagated by a satellite communication system. These waves follow a straight, line-of-sight trajectory and are effectively incapable of penetrating solid objects. As such, satellite signals can be blocked by high-rise buildings, trees, terrain and other obstacles that come in the way of the Earth-space path. Such blockage is not a constant problem, which is one reason satellite communication is favoured for coverage over wide geographical areas. There are also concerns about susceptibility to interference from electrical devices and power lines, limitations on elevation angle, and path loss between Earth-space points as the propagating waves spread out, among other effects peculiar to the non-ionized atmosphere.

When it comes to the degradation of satellite communication signals, one factor of prime significance is the impairment occasioned by inclement atmospheric conditions such as rain, snow and ice crystals. This factor is also of great concern in terrestrial communication links. Rain can degrade satellite signals on both the Earth-space (uplink) and space-Earth (downlink) paths by disrupting or outright attenuating the propagating signals, depending on a number of factors, including the frequency band in use, the wavelength of the signal, the diameter of the raindrops, and the rain rate (a measure of rainfall intensity, typically expressed in millimetres per hour). Thick cloud build-ups or a distant storm can also disrupt satellite signals even when it is not raining. Snow and ice crystals can likewise cause noticeable impairments, although the degradation is generally not as severe as that caused by rain.

Rain impacts the communication link by attenuating the signals and shifting the phase of the waves. Attenuation, though, is negligible with snow or ice crystals because their molecules are tightly bound, unlike those of liquid water, which interact readily with the electromagnetic waves. So, while snow and ice crystals mainly cause phase shift, raindrops result in both attenuation and phase shift.

UNDERSTANDING RAIN 

Satellite communication links operate on specific frequency bands, and the choice of band is a very important factor in link design. The frequency band determines a number of techno-operational elements of a satellite communication network, such as the throughput, the width of the available bandwidth, the ground antenna size, and the amount of signal degradation that can result from precipitation. For instance, higher frequencies in the centimetre and millimetre bands translate into significantly increased throughput and access to wider bandwidths. Higher frequencies also mean significantly reduced antenna size. Signal degradation due to precipitation, however, increases with frequency: higher frequencies with shorter wavelengths can carry more data, but they are more susceptible to the degrading effects of precipitation.

The advantages of higher frequencies in terms of higher data transmission rates carry a caveat: the power level of the received signal must remain robust enough to preclude data losses. Lower frequencies with longer wavelengths cannot carry as much data, although they are less susceptible to rain fade. In other words, the extent of the impairment that satellite signals suffer from rainfall is a function of the operating frequency of the link. Other important determinants of the magnitude of rain-induced impairments are the rain intensity or rain rate and the raindrop size. The path loss that precipitation imposes on the propagating signals underscores the importance of factoring the Effective Isotropic Radiated Power (EIRP) of the system into the design of a satellite communication network.

Satellite frequency bands typically fall within the SHF (Super High Frequency) band, otherwise known as the centimetre band (1-10 cm), and the EHF (Extremely High Frequency) band, also known as the millimetre band (1-10 mm). Frequency bands within the SHF spectrum are typically in the 1-40 GHz range, incorporating the L, S, C, X, Ku and Ka bands. Bands in the EHF region are typically in the 30-80 GHz range and comprise the Q/V bands. Although standard communication satellite applications are normally restricted to the SHF bands, the growing requirement for applications that need higher data rates and much wider bandwidths is causing an increasing shift towards available opportunities in the millimetre-wave (mmWave) region.

In terms of the quality of received satellite signals, rain-induced impairment usually manifests in a number of ways. Firstly, it appears as signal attenuation. Secondly, it alters the polarization of the propagating electromagnetic waves. Thirdly, it increases the system noise temperature while decreasing the figure of merit (G/T).

SIGNAL ATTENUATION

The primary effect of rain degradation on electromagnetic waves propagated between Earth and space is signal attenuation, which results from the absorption and scattering of the propagating signals by raindrops. During intense rainfall, particularly at frequencies above 10 GHz, it may also be necessary to factor in attenuation resulting from absorption by atmospheric gases, the effect of which depends on frequency, elevation angle, water vapour density and altitude above sea level.1 The primary effect of the absorption of electromagnetic waves by raindrops is an increase in molecular energy with an equivalent loss of signal energy.

Rain fade or rain attenuation refers to the degradation of signal quality as a result of precipitation, and rain attenuation is effectively a measure of the extent of that degradation. It can be expressed as a power ratio (usually a number less than 1), representing the ratio of the power level of the signal received at an earth station during rain to the power level received in the absence of rain. Attenuation is usually expressed in decibels (dB), defined as 10 times the logarithm to base 10 of the clear-sky to in-rain power ratio (equivalently, minus 10 times the logarithm of the ratio just described).
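By way of illustration, the short Python sketch below converts an assumed pair of received power levels into an attenuation figure in decibels; the power values are purely illustrative.

import math

# Illustrative (assumed) received power levels at an earth station, in watts.
p_clear_sky = 1.0e-12    # received power with no rain
p_during_rain = 2.5e-13  # received power during rain

# Power ratio as defined above: in-rain power over clear-sky power (< 1).
power_ratio = p_during_rain / p_clear_sky

# Attenuation in dB: 10*log10 of the clear-sky to in-rain ratio,
# i.e. minus 10*log10 of the power ratio above.
attenuation_db = -10.0 * math.log10(power_ratio)
print(f"Rain attenuation: {attenuation_db:.1f} dB")  # about 6.0 dB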

Studies conducted in different geographical locations in recent times have found that signal attenuation due to rain fade increases monotonically with increases in rain intensity and frequency of propagation. It is also a function of the specific rain attenuation and the rain path length. Additionally, rain attenuation is not a fixed measure across the world as rain rate is neither constant throughout a year nor similar across the world.2 Also, the distributions of precipitation attenuation measured on the same path at the same frequency and polarization may vary year-to-year.1

Rain attenuation is also a function of the frequency and wavelength of the propagating waves. This relationship is defined by the equation:

                                                                                                       C = λƒ

where λ is the wavelength, C is the speed of light (3 × 10⁸ m/s) and ƒ is the frequency. Transposing this equation yields a second equation:

                                                                                                       λ = C/ƒ

showing that frequency and wavelength are inversely related: the higher the frequency, the shorter the wavelength.

The attenuation effect of rain increases as frequency increases and wavelength decreases. The interplay between the signal wavelength and the raindrop size explains this: the closer the wavelength of the propagating signal comes to the size of a raindrop, the higher the probability of absorption and thus the greater the attenuation of the signal. Rain intensity and raindrop size are therefore important determinants of the amount of signal attenuation.

The size of a raindrop varies with rainfall speed and rain intensity. A drizzle typically contains raindrops less than 0.5 millimetre in size. In tropical and equatorial climates with highly intense rainfall, a raindrop can measure close to 5 millimetres or more in diameter, although a raindrop of that size usually breaks up into smaller droplets before reaching the ground. Typically, however, and on average, a raindrop is about 2 millimetres (2,000 microns) in size.

Let’s consider some examples. At the C-band downlink frequency of 4 GHz, the wavelength is 75 millimetres. If we take the size of a typical raindrop to be 2 millimetres, the wavelength of the signal is much larger than the raindrop and the signal can pass through the rain with only negligible attenuation. At the X-band downlink frequency of 8 GHz, the wavelength is 37.5 millimetres, still much greater than the size of a raindrop, although the attenuation will be slightly higher than for C-band. At the Ku-band downlink frequency of 12 GHz, the wavelength is 25 millimetres, which is still not very close to the size of a raindrop. At Ka-band downlink frequencies of 20 and 26 GHz, the wavelengths are 15 millimetres and 11.5 millimetres respectively, and at Q-band downlink frequencies of 30 and 33 GHz, the wavelengths are 10 millimetres and 9.1 millimetres respectively. At the V-band downlink frequency of 40 GHz, the wavelength is just 7.5 millimetres, which is quite close to the size of a raindrop. It can be seen how vulnerable frequencies above 10 GHz, such as the Ku- and Ka-bands, are to rain fade. The situation is even more dire for frequencies in the millimetre bands, such as the Q/V bands.
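The wavelengths quoted above follow directly from λ = C/ƒ. A minimal Python sketch reproducing them (the frequencies are those listed in the text):

# Reproducing the wavelengths quoted above from lambda = C / f.
C = 3.0e8  # speed of light, m/s

downlink_frequencies_ghz = {
    "C-band": [4], "X-band": [8], "Ku-band": [12],
    "Ka-band": [20, 26], "Q-band": [30, 33], "V-band": [40],
}

for band, frequencies in downlink_frequencies_ghz.items():
    for f_ghz in frequencies:
        wavelength_mm = C / (f_ghz * 1e9) * 1000.0  # metres to millimetres
        print(f"{band:8s} {f_ghz:2d} GHz -> {wavelength_mm:5.1f} mm")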

There has been a burgeoning of research in recent times on the modelling of rain attenuation. For example, an attempt has been made to estimate rain attenuation at EHF bands in Bangladesh.2 The study, which deployed the International Telecommunication Union (ITU-R P.618-10 (10/2009)) model, found that attenuation due to rain in the Q/V band could reach up to 150 dB, with attenuation varying from 40 dB to 170 dB depending on the month. Another study3, aimed at estimating long-term rain attenuation at 10 GHz, 20 GHz and 30 GHz for satellite communication applications in South-South Nigeria using the ITU radio-wave propagation (ITU-R P.618-13) model, found that at high frequency, rain fade can cause signal attenuation as high as 96.13 dB at 30 GHz.

Of the numerous models available today for modelling rain attenuation, the International Telecommunication Union (ITU)/CCIR models are the most favoured because of their robustness and versatility. The accurate determination of specific attenuation is crucial for predicting total attenuation. The International Telecommunication Union4 recommends the following power-law relationship for determining specific attenuation:

                                                                                                        γR = kR^α

where γR is the specific attenuation in dB/km (dependent on rainfall rate and frequency) and R is the rain rate in mm/h. k and α are coefficients that are essentially frequency-dependent.
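As a rough illustration of the power law, the Python sketch below computes specific attenuation for an assumed coefficient pair; the k and α values are placeholders chosen for illustration, not the tabulated coefficients of Rec. ITU-R P.838.

# Specific attenuation from the power-law relationship gamma_R = k * R**alpha.
# The k and alpha values used here are illustrative placeholders only; the
# frequency- and polarization-dependent coefficients are tabulated in
# Rec. ITU-R P.838.
def specific_attenuation(rain_rate_mm_per_h, k, alpha):
    """Return specific attenuation in dB/km for a given rain rate in mm/h."""
    return k * rain_rate_mm_per_h ** alpha

# Example: an assumed coefficient pair and a heavy tropical rain rate.
k, alpha = 0.02, 1.2   # assumed values for illustration
rain_rate = 100.0      # mm/h
print(f"{specific_attenuation(rain_rate, k, alpha):.2f} dB/km")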

For the prediction of attenuation due to precipitation and clouds along a slant propagation path for frequencies up to 55 GHz, the International Telecommunication Union (Rec. ITU-R P.618-8)1 recommends a step-by-step general method predicated upon the following parameters (a simplified sketch follows the parameter list):

R0.01: point rainfall rate (mm/h) for the location for 0.01% of an average year

hs:     height (km) above mean sea level of the Earth station

θ:      elevation angle (degrees)

Ψ:     latitude of earth station (degrees)

ƒ:      frequency (GHz)

Re:   effective radius of the earth (8500 km)
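The Python sketch below illustrates the slant-path idea behind the method in deliberately simplified form: it combines the parameters above with an assumed rain height and the power-law specific attenuation, but omits the horizontal and vertical reduction/adjustment factors of the full Recommendation, so it is indicative only.

import math

# A deliberately simplified sketch of the slant-path calculation behind the
# ITU-R P.618 method, using the parameters listed above plus an assumed rain
# height h_R. It omits the horizontal and vertical reduction/adjustment
# factors of the full Recommendation and so tends to overestimate attenuation;
# consult Rec. ITU-R P.618 for the complete step-by-step procedure.
def simplified_rain_attenuation_db(r001_mm_h, h_s_km, theta_deg, k, alpha,
                                   h_r_km=5.0):
    """Rough rain attenuation (dB) exceeded for 0.01% of an average year."""
    theta = math.radians(theta_deg)
    # Slant path length through rain (for elevation angles of 5 degrees or more).
    slant_path_km = (h_r_km - h_s_km) / math.sin(theta)
    # Specific attenuation from the power law gamma_R = k * R^alpha (dB/km).
    gamma_r = k * r001_mm_h ** alpha
    return gamma_r * slant_path_km

# Illustrative inputs (all assumed): R0.01 = 42 mm/h, hs = 0.1 km, 40 degree
# elevation, and placeholder k/alpha coefficients for a Ku-band link.
print(f"{simplified_rain_attenuation_db(42.0, 0.1, 40.0, 0.02, 1.2):.1f} dB")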

SIGNAL DEPOLARIZATION

Polarization is a crucial factor when considering the propagation of electromagnetic waves, as any alteration of the intended polarization can affect the performance of a communication system. In layman’s language, polarization is simply the orientation in which an electromagnetic wave oscillates.

Technically speaking, polarization defines the geometrical orientation of the vibrations of an electromagnetic wave. The geometrical orientation refers to the orientation of the two components – E1 and E2 – of the electric field. Thus, polarization relates to the figure traced by the time-varying electric field in a plane normal to the direction of propagation. For this reason, polarization is possible only in transverse waves, where the electric field is perpendicular to the direction of propagation.

There are different types of polarization, and different polarizations do not behave in the same manner in a communication system. Polarization can be linear, circular or elliptical. Linear polarization can be either horizontal or vertical, with the radiation moving in a single plane relative to Earth’s surface. Circular polarization (which is the most appropriate for mitigating or avoiding depolarization) and elliptical polarization can be either right-hand circularly polarized (RHCP) or left-hand circularly polarized (LHCP), with the ‘handedness’ determined by the rotation of the wave vector in time along the direction of motion.

That said, one effect of rain on Earth-space propagation is to alter the sense of polarization of the received signal, with the depolarization loss becoming stronger as the frequency and the rain attenuation increase. This effect is often referred to as depolarization or hydrometeor depolarization, as it is induced by rain and ice crystals. When two independent electromagnetic waves interfere with each other because of a depolarizing effect along the propagation path, the term cross-polarization is used. There is also a variant known as multipath depolarization, which is a problem at frequency bands below 3 GHz and is a consequence of tropospheric or ionospheric variables, among other things.

Depolarization by rain and ice crystals arises from the behaviour of the hydrometeors themselves. Viscous forces acting on a falling raindrop not only alter its shape but also cause it to rotate. This buffeting of the raindrop in the atmosphere produces different propagation characteristics for different polarization states and alters the polarization of the received signal. The alteration causes power transfer from the intended polarization state to the unintended orthogonal polarization state5, resulting in both signal strength loss and interference.

 

INCREASED SYSTEM NOISE TEMPERATURE AND REDUCED G/T

In addition to causing signal attenuation and signal depolarization in satellite communication links, the effect of rain is to cause an increase in the downlink system noise temperature and a reduction of the figure of merit of the ground station receive antenna. Signal degradation results from the rain contributing to an increase of the sky temperature and the consequential increase of receive antenna noise. This degradation effect of rain on the downlink system noise temperature does not extend to the space-based satellite whose antenna is looking at the warm earth.

On the downlink, the clear sky system noise temperature, Ts, is defined by the relationship:

 

                                                                                              Ts = Ta + Tc

 

where Ta is the antenna noise temperature and Tc is the composite noise temperature of the receiving system. The system noise temperature is sometimes referred to as the effective input noise temperature of the receiving system. The composite noise temperature of the receiver incorporates all the components in the receiving system, including lines/feed, the downconverter and the low noise amplifier (LNA). The antenna noise temperature includes antenna losses and the clear sky noise (background microwave radiation).

Usually, absorption due to atmospheric phenomena causes an increase of the antenna noise temperature. The following equation is specified (Rec. ITU-R P.618-8)1 for estimating the atmospheric contribution to antenna noise at an earth station:

 

                                                                                                Ts = Tm (1 – 10^(–A/10))

 

where Ts is the noise temperature (K) as seen by the antenna, A is the path attenuation in decibels, and Tm is the effective temperature of the medium, which depends on the contribution of scattering to the attenuation, the antenna beamwidth, the physical extent of rain cells and clouds, and the vertical variation of the physical temperature of the ‘scatterers’.

The Figure of Merit defines the downlink performance of an earth station and is dependent on antenna noise temperature and system noise temperature. It is essentially the ratio of the antenna gain (G) to the system temperature (T) and is expressed as G/T. The system temperature T in the ratio is the same as the system noise temperature, Ts.
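The Python sketch below ties these pieces together: it applies the noise-temperature equation above to an assumed rain fade and shows the resulting drop in G/T. All numerical values (clear-sky noise temperature, antenna gain, rain fade, effective medium temperature Tm) are assumptions for illustration.

import math

# Sketch of how rain-induced sky noise raises the system noise temperature and
# erodes the figure of merit G/T. All numerical values are assumptions chosen
# purely for illustration.
def rain_noise_temperature_k(path_attenuation_db, t_m_kelvin):
    """Sky noise seen by the antenna, per Ts = Tm * (1 - 10^(-A/10))."""
    return t_m_kelvin * (1.0 - 10.0 ** (-path_attenuation_db / 10.0))

def figure_of_merit_db(antenna_gain_db, system_noise_temp_k):
    """Figure of merit G/T expressed in dB/K."""
    return antenna_gain_db - 10.0 * math.log10(system_noise_temp_k)

t_clear_sky = 150.0   # clear-sky system noise temperature, K (assumed)
gain_db = 40.0        # receive antenna gain, dBi (assumed)
rain_fade_db = 3.0    # downlink rain attenuation, dB (assumed)
t_medium = 275.0      # effective medium temperature Tm, K (assumed)

t_in_rain = t_clear_sky + rain_noise_temperature_k(rain_fade_db, t_medium)
print(f"Clear sky G/T: {figure_of_merit_db(gain_db, t_clear_sky):5.1f} dB/K")
print(f"In rain   G/T: {figure_of_merit_db(gain_db, t_in_rain):5.1f} dB/K")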

An increase in noise temperature can have an impact on Signal-to-Noise Ratio. The relationship between noise temperature, which describes how much noise is generated in the receiver, and noise power is defined in the equation:

 

                                                                                                            Pn = kTnB

 

where Pn is the noise power, k is Boltzmann’s constant (1.38 × 10⁻²³ J/K), Tn is the noise temperature (K) of the system, and B is the noise bandwidth (Hz).
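A minimal Python sketch of this relationship, using an assumed 36 MHz bandwidth and two noise temperatures to show how a rain-induced rise in noise temperature raises the receiver noise floor:

import math

# Noise power from Pn = k * Tn * B, with illustrative (assumed) values.
BOLTZMANN = 1.38e-23  # J/K

def noise_power_dbw(noise_temp_k, bandwidth_hz):
    """Thermal noise power in dBW for a given noise temperature and bandwidth."""
    return 10.0 * math.log10(BOLTZMANN * noise_temp_k * bandwidth_hz)

# An assumed 36 MHz bandwidth at two system noise temperatures; the higher
# noise floor means a lower signal-to-noise ratio for the same received power.
for t_n in (150.0, 290.0):
    print(f"Tn = {t_n:.0f} K -> Pn = {noise_power_dbw(t_n, 36.0e6):.1f} dBW")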

FADE COMPENSATION AND MITIGATION TECHNIQUES

Without compensation, the efficient use of frequencies above 10 GHz for satellite communications, particularly the Ka and Q/V bands, may not be feasible in terms of signal quality, coverage and energy level because of the effects of atmospheric precipitation. And although rain attenuation is much more noticeable at frequencies above 10 GHz, lower frequencies used in tropical and equatorial climates may suffer effects close to those seen at higher frequencies. The attenuation effects of rain fade can be mitigated through a number of techniques, but it may be impossible to completely compensate for the resultant signal loss, especially at the Ka and Q/V bands, without negotiating a trade-off.

As a first step, it may be useful to check the proper alignment of the ground-based antenna with the satellite. The application of a hydrophobic coating to the satellite dish surface has been suggested in some quarters as an effective way of repelling rainwater. The use of a larger receive antenna may also help to compensate for rain loss, although it should be borne in mind that antenna sizes typically reduce with increasing frequencies.

Three distinct techniques can be deployed to compensate for the loss of signal resulting from inclement atmospheric conditions. The first is the Uplink Power Control (UPC) technique, which involves boosting transmission power using UPC systems that monitor and detect signal strength degradation and automatically adjust the transmitted power level to compensate for any loss. This technique, however, is contingent upon the availability of sufficient power and may not be useful where the system’s power is limited. In certain cases, rain margins – which may vary for a given level of signal quality because rain rates vary over the year – are allocated to compensate for rain loss, especially at frequencies in excess of 10 GHz. In some instances, however, especially for applications operating at higher bands such as the Q/V band, the required margin in decibels, and hence the power level, may be too high, thus requiring other mitigation techniques.
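A toy Python sketch of the uplink power control idea described above; the reference and measured levels, and the headroom cap, are assumed values for illustration only.

# Toy uplink power control sketch: compare the measured signal level against
# its clear-sky reference and raise uplink power by the shortfall, capped by
# the available amplifier headroom. All values are assumed for illustration.
def uplink_power_adjustment_db(clear_sky_level_dbm, measured_level_dbm,
                               max_boost_db=6.0):
    shortfall_db = clear_sky_level_dbm - measured_level_dbm
    return min(max(shortfall_db, 0.0), max_boost_db)

print(uplink_power_adjustment_db(-70.0, -74.5))  # -> 4.5 dB of extra power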

The second technique is Adaptive Coding and Modulation (ACM). This involves automatically lowering the modulation of the space-Earth signal, reducing the bit rate below its nominal value, to compensate for the attenuation experienced by the communication link due to atmospheric interference. It also entails maintaining FEC (Forward Error Correction) coding at a specific bit rate and a lower carrier power. Typically, the ACM system is equipped with an Earth-space feedback communication channel and the capability to raise the modulation back to its normal capacity when atmospheric conditions improve.
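The Python sketch below illustrates the ACM principle with a hypothetical MODCOD table: as the reported link SNR falls under rain fade, a more robust (lower-rate) modulation and coding combination is selected. The table entries and thresholds are illustrative, not taken from any standard.

# Toy ACM selection sketch: pick the most efficient modulation/coding pair
# whose required SNR fits the currently reported link SNR, falling back to
# more robust settings as rain fade grows. The table is illustrative only.
MODCODS = [  # (name, required SNR in dB, relative spectral efficiency)
    ("16APSK 3/4", 10.0, 3.0),
    ("8PSK 2/3",    7.0, 2.0),
    ("QPSK 1/2",    2.0, 1.0),
]

def select_modcod(reported_snr_db):
    for name, required_snr_db, efficiency in MODCODS:
        if reported_snr_db >= required_snr_db:
            return name, efficiency
    return MODCODS[-1][0], MODCODS[-1][2]  # most robust fallback

print(select_modcod(8.5))  # milder conditions -> ("8PSK 2/3", 2.0)
print(select_modcod(3.0))  # heavy fade        -> ("QPSK 1/2", 1.0)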

The third option is site diversity, involving the interconnection – typically via a terrestrial link – of two or more geographically dispersed earth stations, such that when the signal at one station is being affected by precipitation, the diversity system can route traffic through whichever interlinked station is receiving the stronger signal and is not experiencing atmospheric interference. Two parameters have been defined as essential for characterizing diversity performance (ITU-R P.618-8, 04/2003)1: one, diversity gain (the difference in decibels between the single-site and diversity attenuation values for the same time percentage), and two, the diversity improvement factor (the ratio of the single-site time percentage to the diversity time percentage at the same attenuation level).
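The two measures can be illustrated with the short Python sketch below, using assumed single-site and diversity statistics.

# Sketch of the two site-diversity performance measures defined above, with
# assumed single-site and joint (diversity) statistics for illustration.
def diversity_gain_db(single_site_attenuation_db, diversity_attenuation_db):
    """Diversity gain: dB difference between single-site and diversity
    attenuation at the same time percentage."""
    return single_site_attenuation_db - diversity_attenuation_db

def diversity_improvement_factor(single_site_time_pct, diversity_time_pct):
    """Improvement factor: ratio of single-site to diversity time percentage
    at the same attenuation level."""
    return single_site_time_pct / diversity_time_pct

# Assumed figures: at 0.01% of the year a single site sees 12 dB of fade while
# the two-site configuration sees 5 dB; a 10 dB fade is exceeded for 0.02% of
# the time at one site but only 0.004% of the time with diversity.
print(f"Diversity gain: {diversity_gain_db(12.0, 5.0):.1f} dB")
print(f"Improvement factor: {diversity_improvement_factor(0.02, 0.004):.0f}")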

 

CONCLUDING REMARKS

Availability is a key factor when it comes to satellite communications. Bandwidth is also a critical service enabler, just as frequency spectrum remains a very valuable but scarce resource. The reality, however, is that frequency bands in the 4-8 GHz range, which account for the most widely used bands, are already saturated. This, together with the burgeoning appetite for wider bandwidths and increased throughput, is pushing communication industry stakeholders higher up the centimetre- and millimetre-band ladder.

As atmospheric interference remains the primary headache for telecommunication operations at higher frequency bands, industry stakeholders are unlikely to relent in their efforts to develop better mitigation techniques. That, of course, is good news for aviation as it deepens its affinity for satellite technology.

____________________________________________________

1 ITU-R P.618-8, “Propagation Data and Prediction Methods required for the Design of Earth-space Telecommunication Systems”. (2003).

2 S. Hossain, and A. Islam, “Estimation of Rain Attenuation at EHF Bands for Earth-to-Satellite Links in Bangladesh”, International Conference on Electrical, Computer and Communication Engineering (ECCE), February 16-18, Cox’s Bazar, Bangladesh (2017).

3 U. Ukommi, K. Ekanem, E. Ubom, and K. Udofia, “Evaluation of Rainfall Rates and Rain-induced Signal Attenuation for Satellite Communication in South-South Region of Nigeria”, Nigerian Journal of Technology, Vol 42, No. 4, pp. 472-477 (2023).

4 ITU-R P.838-2, “Specific Attenuation Model for Rain for Use in Prediction Methods”. (2003).

5 L.J. Ippolito, “Radiowave Propagation in Satellite Communications”, Van Nostrand Reinhold Company. (1986).