Evolution from 2G to 5G
5G (5th generation) wireless network, or 5G network, is the next frontier in mobile telecommunication standards. Figure 1 above illustrates the evolution from 2G to 5G. 2G (2nd generation) wireless networks arrived in the early 1990s, offering basic phones with voice and limited data services. Next, 3G (3rd generation) networks were introduced, offering faster data transfer with maximum download speeds of up to 14.4 Mbps for HSPA and 168 Mbps for HSPA+. With 3G, video calling services were enabled.
Following 3G came 4G (4th generation) wireless systems. 4G initiated the era of high-speed Internet applications such as video streaming and online gaming, with maximum download speeds reaching 100 Mbps for mobile access and close to 1 Gbps for fixed wireless access. The growing penetration of smartphones worldwide has increased the demand for higher bandwidth, putting pressure on telecom operators to expand capacity. To meet this demand, the 3GPP standardization group finalized the first 5G specification release in June 2018.
Why is 5G needed?
Disruptive technologies such as autonomous vehicles and the Internet of Things (IoT) have given further impetus to 5G. For example, DHL's IoT tracking and monitoring system tracks everything from vehicle behavior to packages to environmental sensors in the warehouse. Data volumes and peak rates over the airwaves are therefore set to increase sharply. For these reasons, the mobile communication industry is building 5G networks as the successor to 4G.
In the last three years, 5G networks have gained traction through standardization efforts at the European Telecommunications Standards Institute (ETSI). Soon after, mobile operators in the European Union (EU) and the USA started real-world trials of 5G networks. With 5G, speeds of up to 10 Gbps can be achieved.
Some of the application areas where 5G plays a prominent role include smart homes, self-driving cars, and VR/AR applications.
How is 5G different from 4G?
The figure below illustrates the 4G LTE landscape with respect to frequency bands on a global scale. According to a GSMA Intelligence report, 352 mobile operators had commercially launched 4G LTE networks across 124 countries as of January 2015.
More recent data from the end of 2016 details global 4G LTE deployments and their mapped frequency bands. The key finding is that around one-third of 4G LTE deployments worldwide use the 1800 MHz frequency band, obtained by refarming 2G/3G spectrum. The 2.6 GHz and 800 MHz bands are the second and third most deployed frequency bands, respectively. These findings are illustrated in Figure 3 below.
5G spectrum needs to be made available in three important frequency ranges to deliver widespread coverage and support all use cases that demand very high availability, security, and reliability. The three ranges are sub-1 GHz (low band), 1–6 GHz (mid band), and above 6 GHz (high band). These bands are shown in Figure 4 below.
Low frequency band (< 1 GHz)
This low frequency band will support widespread coverage across urban, suburban, and rural areas, and is especially suited to supporting IoT services. The ITU recommends in its IMT-2020 (i.e., 5G) specification release that channels of at least 100 MHz per operator be supported in this sub-1 GHz band.
Mid frequency band (1–6 GHz)
The mid frequency band between 1 and 6 GHz provides a good mix of network coverage and capacity benefits. As seen in Figure 5 below, 5G services in this band fall in the 3.3–3.8 GHz range for most countries. The US additionally uses the 3.7–4.2 GHz range.
High frequency bands (> 6 GHz)
Bands above 6 GHz are needed to meet the ultra-high broadband speeds envisioned for 5G services. Currently, the 26 GHz and/or 28 GHz bands are set to be adopted internationally. The ITU specifies in its IMT-2020 radio interface requirements that channels of up to 1 GHz per operator should be supported in this high band spectrum range.
Figure 5 illustrates how 5G is being deployed, or targeted for deployment, at different frequencies by the countries leading 5G adoption (United States, European Union, Canada, UK, China).
Wireless Network Parameter Differences between 5G and 4G
Before discussing the parameter differences between 5G and 4G, let us first define these key parameters.
1. Latency in the air link (air latency):
Air latency is the delay caused by the physical distance between the user device and the cell tower. Factors such as signal strength, the number of devices in the cell sector, and noise influence the air latency.
2. Latency end-to-end (device to core):
End-to-end latency is the total delay between the user device and the gateway to the Internet in the core network. Network elements such as the RF unit in the base station, routers, switches, and gateway components all add to the end-to-end latency.
3. Connection density:
It is defined as the total number of devices per unit area (per square km) served by the base station while fulfilling a specified quality of service (QoS).
4. Spectral efficiency:
It is the peak data rate achieved per unit of bandwidth (bit/s per Hz) with the lowest transmission errors in a cellular network.
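To make these definitions concrete, here is a small sketch. The numeric inputs (cell radius, carrier bandwidth) are illustrative assumptions, not measured figures:

```python
# Illustrative calculations for the parameters defined above.
# The 1 km distance and 20 MHz carrier are assumptions for the example.

C = 299_792_458.0  # speed of light, m/s

def propagation_delay_ms(distance_m: float) -> float:
    """One-way air-link propagation delay in milliseconds."""
    return distance_m / C * 1000.0

def spectral_efficiency(peak_rate_bps: float, bandwidth_hz: float) -> float:
    """Peak data rate per unit bandwidth, in (bit/s)/Hz."""
    return peak_rate_bps / bandwidth_hz

# A device 1 km from the tower sees only ~0.0033 ms of pure propagation
# delay, so most air latency comes from processing and scheduling.
air_prop = propagation_delay_ms(1000)

# 300 Mbps peak over a 20 MHz LTE carrier works out to 15 (bit/s)/Hz.
se_lte = spectral_efficiency(300e6, 20e6)
```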
As shown in Table 1, 5G achieves superior performance over 4G LTE. For instance, latency values for 5G are 8 times lower than for 4G. The connection density in 5G reaches up to 1 million connected devices per square km, compared with 2,000 devices in 4G. Spectral efficiency is about 5 times higher, and peak throughput (20 Gbps versus 300 Mbps) is roughly 67 times higher. Energy efficiency in 5G also improves by more than 90% over 4G LTE.
| Parameter | 4G LTE | 5G |
| --- | --- | --- |
| Latency in the air link | 8 ms | < 1 ms (8 times better) |
| Latency end-to-end (device to core) | 10 ms | < 3 ms (3 times better) |
| Connection density (devices per square km) | 2,000 connected devices | 1 million connected devices (500 times higher) |
| System spectral efficiency | 15 (bit/s)/Hz/cell | 79 (bit/s)/Hz/cell (5 times higher) |
| Peak throughput (downlink) per connection | 300 Mbps | 20 Gbps (roughly 67 times higher) |
| Energy efficiency | baseline | > 90% improvement over LTE |
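The improvement factors in the table can be checked with simple arithmetic. Note that 20 Gbps over 300 Mbps works out to a factor of about 67:

```python
# Sanity-checking the improvement factors quoted in Table 1.
air_4g, air_5g = 8.0, 1.0                  # ms
density_4g, density_5g = 2_000, 1_000_000  # devices per square km
se_4g, se_5g = 15.0, 79.0                  # (bit/s)/Hz/cell
peak_4g, peak_5g = 300e6, 20e9             # bit/s

print(air_4g / air_5g)           # 8.0 -> "8 times better"
print(density_5g // density_4g)  # 500 -> "500 times higher"
print(round(se_5g / se_4g, 1))   # 5.3 -> "~5 times higher"
print(round(peak_5g / peak_4g))  # 67  -> peak throughput ratio
```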
What are the new technological innovations in 5G?
To achieve this superior performance over 4G LTE, the 5G standard includes a new physical layer, massive MIMO technology, and deployment in millimeter wave frequencies. In the following sections, we go through each of these technological innovations.
A new physical layer for 5G
The new physical layer for 5G prioritizes improving data rates and spectral efficiency, using 4G LTE as the baseline. To accomplish these objectives, the numbers of active antennas and antenna array elements must increase drastically. In addition, power, channel, and low-noise amplifier designs are being developed along with new modulation and coding schemes.
One of the issues the new physical layer must tackle is beamforming during millimeter wave (mmWave) RF signal processing. mmWave adaptability will be achieved by employing spatial channel models designed separately for sub-6 GHz (mid-band) and mmWave (above 28 GHz) frequencies.
Millimeter wave spectrum
Current 4G networks operate at frequencies below 3 GHz. With the demand for higher data rates (i.e., above 10 Gbps), there is an emerging need to adopt millimeter wave spectrum (above 6 GHz, up to 100 GHz). The current adoption of 5G in the sub-6 GHz range does not satisfy peak data rates above 10 Gbps, which has led the industry to target frequency bands in the millimeter wave spectrum to achieve very high peak data rates.
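A rough sketch shows why such wide channels are needed. The spectral-efficiency value here is an assumed illustrative figure, not a standards number:

```python
# Rough bandwidth needed to reach a 10 Gbps peak rate.
# The 10 (bit/s)/Hz spectral efficiency is an assumption for illustration.

def required_bandwidth_hz(target_rate_bps: float, spectral_eff: float) -> float:
    return target_rate_bps / spectral_eff

# At an assumed 10 (bit/s)/Hz, 10 Gbps needs 1 GHz of spectrum; contiguous
# blocks that wide are realistically available only in the mmWave bands.
bw = required_bandwidth_hz(10e9, 10.0)  # 1e9 Hz = 1 GHz
```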
Equipment vendors and mobile network operators are therefore running trials to assess the viability of higher frequency transmissions, both in the mid-band spectrum (3.4–5 GHz) and in the mmWave spectrum (28 GHz and 39 GHz).
5G deployment in mmWave spectrum needs massive MIMO antenna arrays with hundreds of antenna elements on base stations. The size of an antenna array shrinks in proportion to the wavelength, so arrays designed for mmWave transmission can be up to 100 times smaller than those used for microwave transmission.
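The scaling can be sketched from the wavelength alone. The 2.6 GHz and 28 GHz carriers below are example frequencies chosen to match the bands discussed in this article:

```python
# Antenna element spacing scales with wavelength (typically lambda/2),
# so arrays shrink as the carrier frequency rises.

C = 299_792_458.0  # speed of light, m/s

def wavelength_m(freq_hz: float) -> float:
    return C / freq_hz

lam_microwave = wavelength_m(2.6e9)  # ~0.115 m
lam_mmwave = wavelength_m(28e9)      # ~0.0107 m

# Linear dimensions shrink ~10x, so the array *area* shrinks ~116x,
# on the order of the "up to 100 times smaller" figure quoted above.
shrink = (lam_microwave / lam_mmwave) ** 2
```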
Massive MIMO
Multiple-Input Multiple-Output (MIMO) is a technique that increases the capacity of a radio link by using multiple transmit and receive antennas to exploit multipath propagation. Wireless standards such as IEEE 802.11n (Wi-Fi), HSPA+ (3G), WiMAX (4G), and LTE (4G) already integrate MIMO. To attain higher spectral efficiency, 5G utilizes massive MIMO; the improved spectral efficiency helps achieve target network capacities and enhances connectivity in 5G.
Basically, massive MIMO is a variant of multi-user MIMO in which the number of antennas at a base station is much higher than the number of user devices served by that base station. This large ratio of base station antennas to user devices yields quasi-orthogonal channel responses, which promise significant gains in spectral efficiency.
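The quasi-orthogonality effect can be demonstrated with a small simulation. This is a sketch with random Rayleigh-like (Gaussian) channel vectors, not a full channel model:

```python
# As the number of base-station antennas M grows, the channel vectors of
# two different users become nearly orthogonal: their normalized inner
# product shrinks roughly as 1/sqrt(M).

import math
import random

random.seed(0)

def normalized_inner_product(m: int) -> float:
    """|h1 . h2| / (|h1| |h2|) for two random M-antenna channel vectors."""
    h1 = [random.gauss(0, 1) for _ in range(m)]
    h2 = [random.gauss(0, 1) for _ in range(m)]
    dot = sum(a * b for a, b in zip(h1, h2))
    norm = math.sqrt(sum(a * a for a in h1)) * math.sqrt(sum(b * b for b in h2))
    return abs(dot) / norm

# Average correlation over 200 trials drops as M grows.
for m in (4, 64, 256):
    avg = sum(normalized_inner_product(m) for _ in range(200)) / 200
    print(m, round(avg, 3))
```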
With massive MIMO, RF designers face a challenge when scaling the number of antennas to hundreds: the simulation time of an antenna design does not scale gracefully and increases significantly with the array size.
Antenna coupling is another difficult issue to deal with during simulation. One possible solution is hybrid beamforming, which optimizes the number of RF chains (i.e., transmitters, receivers, and amplifiers). Massive MIMO beamforming also helps overcome the drastic propagation losses in mmWave frequency bands.
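A minimal sketch of the beamforming idea mentioned above, for a uniform linear array with half-wavelength element spacing (the 64-element size and steering angles are illustrative assumptions):

```python
# Beam steering with a uniform linear array (ULA): applying phase weights
# w_n = exp(-j * pi * n * sin(theta0)) points the beam toward angle theta0,
# assuming half-wavelength element spacing.

import cmath
import math

def steering_weights(n_elements: int, theta_deg: float) -> list:
    theta = math.radians(theta_deg)
    return [cmath.exp(-1j * math.pi * n * math.sin(theta))
            for n in range(n_elements)]

def array_gain(weights: list, theta_deg: float) -> float:
    """Normalized array response toward theta_deg (1.0 = full gain)."""
    theta = math.radians(theta_deg)
    response = sum(w.conjugate() * cmath.exp(-1j * math.pi * n * math.sin(theta))
                   for n, w in enumerate(weights))
    return abs(response) / len(weights)

# Steer a 64-element array toward 30 degrees: full gain there,
# strongly attenuated response off-beam.
w = steering_weights(64, 30.0)
print(array_gain(w, 30.0), array_gain(w, -10.0))
```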
What is the importance of 5G in 2019?
In the earlier sections, we discussed how 5G differs from 4G LTE in terms of frequency, wireless network parameters, and technological innovations. In this section, let’s delve further into the importance of 5G for the end-users.
What 5G means to the users:
There has been much buzz about 5G, with demos and presentations worldwide at top mobile and wireless conferences such as Mobile World Congress (MWC) in Barcelona. MWC was held in February 2019, and attendees saw the latest 5G technologies, including smartphones and gadgets with 5G connectivity. In the United States, 5G-based fixed wireless access is already a reality with AT&T and other operators.
Motorola in the USA will soon release its 5G-enabled smartphone with the Moto Mods feature. Moreover, there are ongoing trials by network operators and government-led 5G initiatives to test 5G viability for users in terms of data rates and low latency (~1 ms) for applications such as augmented reality and live HD video streaming.
For example, the South Korean government ran a 5G pilot network at the 2018 Winter Olympics with the aim of providing futuristic immersive experiences such as augmented-reality-based navigation. Some of the latest trials and commercial roll-outs are discussed below:
1. Vodafone’s 5G Network Trial at the Manchester Airport
Vodafone has performed a large-scale 5G test using the Qualcomm Snapdragon X50 5G modem. This test is part of a UK-wide 5G trial covering Birmingham, Bristol, Cardiff, Glasgow, Liverpool, London, and Manchester.
The device specifications of the Qualcomm X50 5G modem are as follows:
- Technology: 5G New Radio
- Spectrum: mmWave and sub-6 GHz
- mmWave specs: 800 MHz bandwidth, 8 carriers, 2×2 MIMO
- Modes: TDD, NSA (non-standalone)
- sub-6 GHz specs: 100 MHz bandwidth, 4×4 MIMO
- mmWave features: dual-layer polarization in downlink and uplink; beamforming, beam steering, and beam tracking
- 5G Peak Download Speed: 5 Gbps
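The listed figures can be combined into a back-of-the-envelope check. The implied spectral efficiency below is our own arithmetic, not an official Qualcomm figure:

```python
# Rough implied spectral efficiency of the X50 mmWave configuration:
# 5 Gbps peak over 8 carriers of 100 MHz each (800 MHz aggregate).

aggregate_bw_hz = 8 * 100e6  # 800 MHz total, as listed in the specs
peak_rate_bps = 5e9          # 5 Gbps peak download speed

implied_se = peak_rate_bps / aggregate_bw_hz  # 6.25 (bit/s)/Hz
```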
2. Commercial Roll-out of 5G in US
AT&T was the first telecom operator to launch a commercial mobile 5G network in licensed mmWave spectrum (39 GHz), on 21st December 2018. In the first phase, the roll-out of 5G hotspots covers 12 US cities, namely Atlanta, Georgia; Charlotte, N.C.; Dallas, Texas; Houston, Texas; Indianapolis; Jacksonville, Florida; Louisville, Kentucky; Oklahoma City; New Orleans; Raleigh, N.C.; San Antonio, Texas; and Waco, Texas.
The Netgear Nighthawk 5G mobile hotspot is the world's first mobile 5G hotspot operating at mmWave frequencies over AT&T's 5G+ network (AT&T brands its 5G network "5G+" because it operates at mmWave frequencies). AT&T reports that, 50 days after the 5G service launch, some of its small and medium-sized business customers, such as a motion-picture, TV, and digital company, have seen speeds of 200-300 Mbps and up to 400 Mbps. Moreover, AT&T plans to deploy a nationwide 5G network using sub-6 GHz spectrum by early 2020.
3. Denmark will have a 5G rollout in 2020
The 5G action plan published by the Danish Minister for Energy, Utility Supply, and Environment states that deployment will first use the 3.5 GHz band, followed by deployment on the 26 GHz frequency band. The selection of network operators for 5G trials is yet to be finalized.
4. T-Mobile 5G network deployment in the Hague, Netherlands
The T-Mobile 5G roll-out will begin this year, with large-scale 5G deployments by 2020. The Hague will be the first city in the Netherlands to have full 5G network coverage.
References
1. 5G – Connection Density – Massive IoT and So Much More
2. Key features and requirements of 5G/IMT-2020 networks (presentation)
3. Understanding 5G: Perspectives on future technological advancements in mobile, GSMA Intelligence
4. 5G Americas: LTE to 5G: Cellular and Broadband Innovation (presentation)
5. Setting a World Record in 5G Wireless Spectrum Efficiency With Massive MIMO
6. What is 5G and when will it arrive in the UK?
7. 5G Spectrum: GSMA Public Policy Position, November 2018
8. Spectrum for 4G and 5G, Qualcomm
9. Minimum requirements related to technical performance for IMT-2020 radio interface(s), International Telecommunication Union
10. Mapping 4G-LTE deployments by frequency bands, GSMA Intelligence (February 2017)