It has been nearly five years since the International Telecommunication Union (ITU) released ITU-R M.2083-0, which laid out a roadmap for IMT-2020 and a vision of 5G performance that would greatly eclipse that of IMT-Advanced and then-planned 4G technology.

In 2017, the 3rd Generation Partnership Project (3GPP) rushed to release a preliminary version of its 5G standard (Release 15) so telecommunications companies could begin developing hardware and deploying 5G infrastructure according to its guidelines. The result was the New Radio (NR) standard for 5G, which includes a non-standalone (NSA) millimeter-wave spectrum capability and additional sub-6 GHz cellular bands. Release 15 provided enhanced mobile broadband (eMBB) specifications, with future updates in Release 16 and Release 17 to further define massive machine-type communications (mMTC) and ultra-reliable low-latency communications (URLLC).

3GPP Release 15 provides standards that support higher capacity, greater network efficiency, higher peak data rates, higher user-experienced data rates, greater spectrum efficiency, and greater mobility. The result has been base-station transceiver (BST) and user equipment (UE) hardware that operates at higher frequencies, handles greater carrier aggregation (CA), supports more multiple-input multiple-output (MIMO) layers, offers greater bandwidth per channel, leverages more efficient access methods, and can use more complex modulation schemes.

Another result of 3GPP Release 15 was the release of conformance and performance testing recommendations and requirements (TR 38.810). Unlike previous generations of cellular technology, 5G NR standards introduced not only millimeter-wave frequency bands but the requirement for sub-6 GHz and millimeter-wave communications to operate simultaneously (5G NR NSA).

With the mainstreaming of beamforming/beamsteering, CA, and MIMO technology, BST and UE hardware is no longer a single radio with tuning/filtering circuitry and a mix of antennas or a wideband antenna. Telecommunications hardware has evolved into an array of antenna elements with transmit/receive modules (TRMs) driving each element or a sub-array of elements. More complex digital processing is also required to successfully implement beamforming, CA, and MIMO, which leads to highly integrated active antenna systems/advanced antenna systems (AAS) that now resemble the active electronically scanned array (AESA) technology used in military radar.

IMT-2020 set a high bar for future telecommunications performance that will likely require substantial changes to every aspect of the telecommunications infrastructure.

This article discusses 5G hardware RF test challenges and highlights several current developments in research and test component selection that move toward mitigating the challenges.

Millimeter-Wave Test Challenges

Until 3GPP Release 15, the highest frequency of cellular telecommunications reached only a few gigahertz. Though there are wireless networking standards (WiFi, IEEE 802.11) that operate between 5 GHz and 6 GHz, these systems were specifically designed for short-range (tens of meters) networking communications. With the introduction of FR2 millimeter-wave frequencies, cellular technology moved in a single iteration to a technology node with a very different makeup than that of prior commercial telecommunications technology.

Microwave/millimeter-wave hardware manufacturers and suppliers have catered primarily to applications in military/defense, aerospace, space, weather, and scientific research rather than telecom. These providers have typically produced small volumes of highly custom components designed to meet stringent criteria specific to the application being served. Designing products that overcome the intrinsic challenges of millimeter-wave operation requires niche design expertise, familiarity with the nuances of computer-aided design (CAD) tools in the millimeter-wave spectrum, substantial computing power for simulations, lossy interconnect, high-cost test and measurement equipment, extremely precise and repeatable manufacturing capability, and well-trained technicians capable of performing exacting inspections, quality control, and tuning procedures.

[Figure: ITU-T immunity and emissions test recommendations for 5G.]

Several physical phenomena become more significant at higher frequencies, and many features of RF components and interconnects scale with the operating wavelength. For instance, RF losses increase as a function of frequency, as the skin effect and frequency-dependent conductor and dielectric losses become more pronounced in the millimeter-wave spectrum. Also, transmission-line (coaxial cable and microstrip/stripline) dimensions must be proportionally smaller to accommodate shorter-wavelength signals, which further increases the loss of the signal path between and within RF components.
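As a rough illustration of why conductor loss grows with frequency, the sketch below computes the skin depth, the thickness of the surface layer that carries most of the current, from the standard relation delta = sqrt(2*rho/(omega*mu)). The copper resistivity value and the example frequencies are illustrative assumptions, not figures from the article.

```python
import math

def skin_depth_m(freq_hz: float, resistivity_ohm_m: float = 1.68e-8,
                 mu_r: float = 1.0) -> float:
    """Skin depth delta = sqrt(2*rho / (omega * mu)), in meters.

    Default resistivity is annealed copper at room temperature.
    """
    mu = mu_r * 4e-7 * math.pi      # permeability (H/m)
    omega = 2 * math.pi * freq_hz   # angular frequency (rad/s)
    return math.sqrt(2 * resistivity_ohm_m / (omega * mu))

# Current crowds into an ever-thinner surface layer as frequency rises,
# raising the effective resistance (and loss) of the same conductor.
print(f"{skin_depth_m(3.5e9) * 1e6:.2f} um at 3.5 GHz")  # ~1.10 um
print(f"{skin_depth_m(28e9) * 1e6:.2f} um at 28 GHz")    # ~0.39 um
```

The roughly threefold thinner conduction layer at 28 GHz is one reason millimeter-wave interconnect is so much lossier per unit length than its sub-6 GHz counterpart.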

Other considerations at millimeter-wave frequencies and in ultra-wide-bandwidth applications are group delay and phase delay. For dispersive media, such as waveguide and microstrip, the group delay and phase delay can be large enough to require special delay-mitigating components and other design and routing efforts to correct.

Though vector network analyzers (VNAs) are typically used to characterize group delay variations by measuring phase distortion, it may be advantageous in some setups to use spectrum analyzers (SAs) and signal generators (SGs). An SA/SG test setup for group delay may be simpler and offer faster measurement speed, which could benefit test scenarios with a large number of signal paths. Having separate control of an SG may also be helpful in some millimeter-wave test scenarios, as the SG output power can be raised to compensate for the greater transmission losses at these frequencies. Skew-matched coaxial assemblies may also help mitigate the time delay (skew) between signal paths.
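Regardless of the instrument used, group delay is derived the same way: tau_g = -d(phi)/d(omega), the negative derivative of transmission phase with respect to angular frequency. The sketch below estimates it numerically from sampled phase data (as a VNA would record for S21); the frequency range and the synthetic 2 ns delay-line DUT are illustrative assumptions.

```python
import numpy as np

def group_delay_s(freq_hz: np.ndarray, phase_rad: np.ndarray) -> np.ndarray:
    """Group delay tau_g = -d(phi)/d(omega), estimated numerically from
    unwrapped phase samples versus frequency (e.g. measured S21 phase)."""
    omega = 2 * np.pi * freq_hz
    return -np.gradient(np.unwrap(phase_rad), omega)

# Synthetic DUT: an ideal 2 ns delay line has linear phase phi = -omega*tau,
# so the recovered group delay should be flat at 2 ns across the band.
freq = np.linspace(24e9, 30e9, 601)
tau = 2e-9
phase = -2 * np.pi * freq * tau
gd = group_delay_s(freq, phase)
print(f"mean group delay: {gd.mean() * 1e9:.3f} ns")  # -> 2.000 ns
```

A dispersive medium such as waveguide would instead show group delay varying across the band, which is exactly the variation the article's delay-mitigating measures aim to correct.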

Characterization Testing

Characterizing RF devices, a need that is growing with the highly integrated components required for 5G applications, involves testing a device very precisely over an extremely wide range of frequencies. To generate an accurate model for use with electronic CAD (ECAD) software for design and simulation, a device/component usually must be tested to frequencies several times above its maximum operating frequency and well below its minimum frequency of operation.

The surge in demand for MMICs and systems-on-chip (SoCs)/systems-in-package (SiPs) for 5G is also raising the need for characterization testing to and beyond 100 GHz, which generally requires a variety of upconverters/downconverters, waveguide interconnects, and complex banded test setups. Broadband testing can be done to 110 GHz with precision 1-mm coaxial connectors, for which calibration standards and adapters are commercially available. Testing beyond 110 GHz requires iterative testing of devices/components over several waveguide bands and then de-embedding and stitching the data to create a model.

FR1 and FR2 Simultaneous Test Challenges

In addition to the challenges posed by millimeter-wave testing, testing multi-band (FR1 and FR2) systems may require testing sub-6 GHz and millimeter-wave frequencies and systems simultaneously. In the case of 5G NR NSA, where millimeter-wave data communications require sub-6 GHz control-plane signals to function, simultaneous multi-band testing is necessary. For electromagnetic compatibility (EMC) and conformance testing, multi-band testing is also needed to ensure that multi-band systems don't create undue interference, as harmonics from lower frequencies can reach into the millimeter-wave spectrum and vice versa.

Current coexistence and EMC standards do not necessarily account for sub-6 GHz devices creating interference in the millimeter-wave spectrum. The ITU has released recommendations (ITU-T Series K Supplement 10) on methods for handling immunity and emissions tests for 5G devices. Given the number of new frequency bands and operating modes (e.g., carrier aggregation), coexistence testing for 5G is significantly more complex than for prior cellular technologies. There is now a much larger number of harmonic and spurious byproducts that can result from co-site interference between sub-6 GHz and millimeter-wave 5G transmissions, and given the proximity of antennas in a compact AAS, the resulting interference products can be substantial compared to the low RX signal levels that result from high-path-loss millimeter-wave communications.

Beyond just FR1 and FR2 measurements, 5G UE will also contain a variety of other wireless technologies including Bluetooth, WiFi (2.4 GHz, 5 GHz, and possibly WiGig at 60 GHz), NFC, and potentially a variety of wireless charging technologies. The resulting test landscape for complex UE means that immunity and emissions testing will become significantly more nuanced.

Currently, there is no standardized test approach for 5G UE, and each individual wireless standard has its own compliance requirements. Future versions of these standards may include additions or changes to their compliance sections to account for coexistence with new 5G operation, and it is also likely that future 5G conformance standards will have to take other wireless standards into account for better interoperability of future 5G UE.

High-Density Testing of Active Antenna Systems

Several standards and conformance bodies, such as the FCC, CISPR, and ETSI, require that emissions amplitudes be measured in an antenna's far field when possible. The distance from the antenna at which the far-field region begins depends on frequency and, for antennas whose dimensions scale with wavelength, is proportional to the wavelength. Hence, for sub-6 GHz frequencies, the far-field region begins tens of centimeters to meters away from the antenna, while for millimeter-wave frequencies it may begin centimeters to millimeters away from the antenna.
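A common rule of thumb for where the far field begins is the Fraunhofer criterion, d = 2*D^2/lambda, where D is the largest antenna dimension. The sketch below applies it to two illustrative (assumed) cases: a single half-wave element at 28 GHz, and a 10 cm beamforming array at the same frequency, which is the distinction the next section draws for MIMO/beamforming arrays.

```python
import math

C = 299_792_458.0  # speed of light (m/s)

def far_field_start_m(aperture_m: float, freq_hz: float) -> float:
    """Conventional Fraunhofer far-field boundary: d = 2*D^2 / lambda."""
    wavelength = C / freq_hz
    return 2 * aperture_m**2 / wavelength

# Single half-wave element at 28 GHz (D ~ lambda/2 ~ 5.4 mm):
lam28 = C / 28e9
print(f"{far_field_start_m(lam28 / 2, 28e9) * 1e3:.1f} mm")  # ~5.4 mm
# A 10 cm beamforming array at 28 GHz pushes the boundary out to meters:
print(f"{far_field_start_m(0.10, 28e9):.2f} m")  # ~1.87 m
```

The element-level far field starts millimeters away, consistent with the article's claim, but the aggregate array aperture moves the boundary out by orders of magnitude, which is why array far-field predictions matter for choosing a measurement distance.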

This factor, combined with the much higher path loss at millimeter-wave frequencies, means that millimeter-wave signal levels may be too low to measure at distances corresponding to the far field of sub-6 GHz antennas. Higher-gain antennas and low-noise amplifiers may be used to increase the measured signal power from dual-band systems somewhat. However, higher-gain antennas tend to be extremely directional and require precise positioning, potentially reducing repeatability.
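The size of the path-loss gap can be estimated from the free-space path loss formula, FSPL = 20*log10(4*pi*d*f/c). The 3 m test distance and the 3.5 GHz / 28 GHz band pair below are illustrative assumptions chosen to represent a dual-band (FR1/FR2) measurement.

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss FSPL = 20*log10(4*pi*d*f/c), in dB."""
    c = 299_792_458.0
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# At the same 3 m test distance, 28 GHz suffers 20*log10(28/3.5) ~ 18 dB
# more free-space loss than 3.5 GHz, before any antenna gains are applied.
loss_sub6 = fspl_db(3.0, 3.5e9)
loss_mmw = fspl_db(3.0, 28e9)
print(f"{loss_sub6:.1f} dB vs {loss_mmw:.1f} dB "
      f"(delta {loss_mmw - loss_sub6:.1f} dB)")
```

An extra ~18 dB of loss at the same distance illustrates why millimeter-wave signals measured at a sub-6 GHz far-field distance may fall below the measurable floor without additional gain in the receive chain.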

In the United States, FCC Part 30 (Upper Microwave Flexible Use Service) dictates conformance standards including power, bandwidth, effective isotropic radiated power (EIRP), and emission limits based on power spectral density (PSD) for adjacent and spurious bands. An issue with EIRP measurements, as well as related measurements of total radiated power (TRP), is that the current methods assume the antenna radiates isotropically. This assumption creates inaccurate measurements for AAS that use MIMO and beamforming technology.
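For reference, EIRP in logarithmic units is simply conducted power plus antenna gain minus feed losses. The values below are hypothetical, chosen only to show how a beamformed array's gain dominates the budget.

```python
def eirp_dbm(tx_power_dbm: float, antenna_gain_dbi: float,
             feed_loss_db: float = 0.0) -> float:
    """EIRP (dBm) = conducted power + antenna gain - feed loss."""
    return tx_power_dbm + antenna_gain_dbi - feed_loss_db

# Hypothetical example: 23 dBm conducted into a 24 dBi beamformed
# array through 1 dB of feed loss.
print(eirp_dbm(23, 24, 1))  # -> 46 (dBm)
```

Because the array's gain is concentrated in a steerable beam rather than spread isotropically, a single boresight EIRP number says little about radiation in other directions, which is the inaccuracy the article describes for AAS measurements.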

Moreover, for millimeter-wave beamforming antennas, the effective far-field region of a MIMO/beamforming array differs from the far-field region of each individual antenna element. To measure MIMO/beamforming antennas in their far field, accurate predictions of the far-field region are necessary to select the correct measurement antenna and avoid unnecessary loss and error. Also, because these antennas are non-isotropic, measuring spherically around the antenna to determine EIRP and TRP may be necessary, or may at least provide more accurate radiated-power measurements. The quiet zone for measurement antennas at these frequencies is very small, which leads to the need for high-precision positioning systems and very long measurement times driven by both positioner movement and the measurements themselves.
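The spherical measurement just described amounts to averaging EIRP over the full sphere: TRP = (1/4*pi) * integral of EIRP(theta, phi)*sin(theta) dtheta dphi. The sketch below approximates that integral on a regular sample grid (the grid density and the isotropic test pattern are illustrative assumptions); every grid point corresponds to a positioner move, which is where the long measurement times come from.

```python
import numpy as np

def trp_w(eirp_w, theta: np.ndarray, phi: np.ndarray) -> float:
    """TRP as the sphere-average of EIRP(theta, phi):
    TRP = (1/4pi) * sum of EIRP * sin(theta) * dtheta * dphi
    over a regular (theta, phi) sample grid."""
    t, p = np.meshgrid(theta, phi, indexing="ij")
    dt = theta[1] - theta[0]
    dp = phi[1] - phi[0]
    return float(np.sum(eirp_w(t, p) * np.sin(t)) * dt * dp / (4 * np.pi))

theta = np.linspace(0, np.pi, 181)                     # polar samples
phi = np.linspace(0, 2 * np.pi, 360, endpoint=False)   # azimuth samples
# Sanity check: a constant 1 W EIRP everywhere (isotropic radiator)
# should average back to ~1 W of total radiated power.
iso = lambda t, p: np.ones_like(t)
print(f"{trp_w(iso, theta, phi):.3f} W")  # -> ~1.000 W
```

A real beamformed pattern would concentrate EIRP in a few grid points while the TRP stays modest, illustrating why EIRP and TRP diverge sharply for non-isotropic AAS.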

Testing Complex Assemblies, MMICs, and SoCs/SiPs

Given the small sizes of most RF components and devices and the need to reduce interconnect losses, telecom will likely look to integration and digitization to minimize the cost and complexity of 5G FR2 technology. The result would be complex assemblies built with integrated or compact filters, attenuators, amplifiers, oscillators, phase shifters, power combiners/splitters, interconnects, mixers, circulators/isolators, and other RF components and devices.

Many AAS manufacturers are likely to integrate as many of these components as possible using current or partially augmented low-cost, high-volume production methods such as silicon semiconductor fabrication. Hence, digitizing RF functions is becoming increasingly important and more commonly employed.

[Figure: The far field of MIMO/beamforming antenna arrays is different from traditional far-field calculations for a single antenna.]

Though higher levels of integration will lead to more compact, energy-efficient, and cost-effective telecommunications hardware, it will also result in more complex testing scenarios. It is likely that future 5G hardware will not have the ease and familiarity of coaxial or waveguide interconnect between test equipment and the device under test (DUT).

Many of today’s millimeter-wave components are assembled in packages with coaxial or waveguide interconnect that readily mates to the coaxial/waveguide ports of test equipment and calibration standards. Greater levels of integration mean that many RF components will be incorporated into a single complex assembly, or MMIC, where testing the components will need to be done using a custom test interface and/or probe testers at the chip level or wafer level. Much of this testing will also need to be multi-domain, where power, digital, analog, and RF testing occur simultaneously to determine device performance and functionality.

Even higher levels of integration, which are necessary for UE that fits popular form factors and operates for an adequate duration on modern rechargeable batteries, are resulting in the development of complex SoCs/SiPs. These are beginning to include a mix of domains and technologies in a single package that is often accessible for testing only on the wafer or through the final package leads. These SoCs/SiPs often operate with extremely high-speed digital signals, millimeter-wave signals, stringent power supplies, and highly sensitive analog signals.


Though much of the test landscape for current and future 5G UE and BST hardware is still nebulous, a few predictions can be made. The inclusion of millimeter-wave frequencies and the simultaneous testing of sub-6 GHz and millimeter-wave bands will lead to greater test complexity and a variety of design/prototyping and conformance test challenges. The use of CA, MIMO, and beamforming technologies will lead to a much higher number of antennas and radio ports that need to be tested and, in the case of conformance testing, will make over-the-air (OTA) testing the only viable option.

Greater integration of RF, digital, analog, and power components is also inevitable to reduce the size, weight, cost, and power consumption of complex AAS, which will ultimately transition much RF testing to probe stations. RF test equipment and systems will also need to evolve to better accommodate variable test requirements, a process that is already underway.

Lastly, telecom OEMs face the challenge of finding suppliers that can fulfill their many urgent requirements while avoiding the long lead times typical of RF and millimeter-wave component manufacturers, which could otherwise cause substantial delays and extended time-to-market for critical 5G hardware.

This article was written by Peter McNeil of Pasternack, Irvine, CA.


This article first appeared in the September 2021 issue of Aerospace & Defense Technology Magazine.
