Downlink and Uplink Resource Allocation in LTE Networks

Johann Max Hofmann Magalhães (Federal Institute of Triângulo Mineiro, Brazil), Saulo Henrique da Mata (Federal University of Uberlândia, Brazil) and Paulo Roberto Guardieiro (Federal University of Uberlândia, Brazil)
DOI: 10.4018/978-1-4666-8732-5.ch009

Abstract

The design of a scheduling algorithm for LTE networks is a complex task, and it has proven to be one of the main challenges for LTE systems. Many issues must be addressed in order to obtain high spectral efficiency and to meet the applications' QoS requirements. In this context, this chapter presents a study of the resource allocation process in LTE networks. The study starts with an overview of the main concepts involved in LTE resource allocation, and then presents two new scheduling algorithm proposals, for downlink and uplink respectively. Simulations are used to compare the performance of these proposals with other schedulers widely known and explored in the literature.

Introduction

In recent years, mobile networks have experienced tremendous subscriber growth. This growth has demanded a continuous evolution of current mobile networks in order to meet the ever-increasing expectations of these users. Applications such as Voice over IP (VoIP), web browsing, video chat, and video streaming have posed new challenges to the design of mobile networks, due to their strict delay and bandwidth requirements.

In this context, the Long-Term Evolution (LTE) network has emerged as one of the most promising solutions to these new challenges. LTE is a packet-based mobile broadband network. It has been developed by the Third Generation Partnership Project (3GPP) and aims to deliver high throughput, low latency, and enhanced spectral efficiency with respect to previous 3G networks (Capozzi, Piro, Grieco, Boggia, & Camarda, 2013).

As stated before, one can find a plurality of different applications, each with specific delay and bandwidth requirements. These delay and bandwidth requirements can be mapped into different Quality of Service (QoS) classes. Therefore, it is of fundamental importance that the network ensures the QoS requirements of each of these applications. Generally, Call Admission Control (CAC) and resource allocation are the main mechanisms used to ensure these QoS requirements.

In the LTE system, the User Equipment (UE) gets access to the network through the base station, which is known as evolved NodeB (eNodeB or eNB). The eNodeB is responsible for the allocation of the network resources among the UEs attached to it.

In a real propagation environment, the air interface is characterized by fast fading variations, due to the multiple possible paths that the signal can travel until it reaches the receiver. Depending on the path, the signal components may recombine constructively or destructively at the receiver. The position and distance of the receiver relative to the transmitter also influence these fast fading variations. Moreover, high data rate transmission in a multipath environment leads to Inter-Symbol Interference (ISI) and, consequently, bit errors at the receiver (Cox, 2012).

From the past 3GPP mobile networks, one of the most important changes introduced by LTE is the shift from the use of Code Division Multiple Access (CDMA) to Orthogonal Frequency Division Multiple Access (OFDMA) (Aydin, Kwan, & Wu, 2013). OFDMA simplifies the design of channel equalizers and it is a powerful way to solve the ISI problem. Furthermore, OFDMA offers high spectral efficiency, scalability and flexibility of bandwidth allocation, since the resource allocation can occur in time and frequency domains.
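The time-frequency allocation enabled by OFDMA can be pictured as a grid of Physical Resource Blocks (PRBs) per subframe. The following is a minimal, illustrative sketch (not 3GPP-accurate in its details; the bandwidth figure and helper function are assumptions for the example):

```python
# Illustrative sketch of an OFDMA resource grid: allocation occurs in both
# time (subframes of 1 ms) and frequency (Physical Resource Blocks, PRBs).

NUM_PRBS = 25        # e.g. a 5 MHz LTE carrier provides 25 PRBs
NUM_SUBFRAMES = 10   # one 10 ms radio frame

# grid[t][f] holds the id of the UE assigned PRB f in subframe t
grid = [[None] * NUM_PRBS for _ in range(NUM_SUBFRAMES)]

def allocate(grid, subframe, prbs, ue_id):
    """Assign a set of PRBs in a given subframe to a single UE."""
    for f in prbs:
        grid[subframe][f] = ue_id

# UE 1 gets PRBs 0-4 and UE 2 gets PRBs 5-9, both in subframe 0:
allocate(grid, 0, range(0, 5), 1)
allocate(grid, 0, range(5, 10), 2)
```

Because the grid has both a time and a frequency axis, the scheduler can place each UE on the PRBs where its channel happens to be good, which is the flexibility the text above refers to.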

Downlink and uplink radio links have a time-variant nature, due to the fast fading phenomenon described above. Thus, the eNodeB must consider the current channel quality of the UEs in order to allocate resources effectively. The channel quality is obtained through the Channel Quality Indicator (CQI), which is reported by the UE to the eNodeB. From the CQI, the system can perform link adaptation using Adaptive Modulation and Coding (AMC) techniques, i.e. the system can choose a more robust Modulation and Coding Scheme (MCS) under adverse channel conditions, or a higher-order MCS under favorable conditions to improve spectral efficiency.
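The CQI-to-MCS step of link adaptation can be sketched as a table lookup. The entries below are a simplified subset of the 4-bit CQI table in 3GPP TS 36.213 (modulation and approximate code rate × 1024); the conservative selection policy is an assumption for illustration:

```python
# Hedged sketch of AMC link adaptation: map a reported CQI to an MCS.
# Simplified subset of the 3GPP TS 36.213 4-bit CQI table:
# (modulation, code rate x 1024).
CQI_TABLE = {
    1:  ("QPSK",  78),
    4:  ("QPSK",  308),
    7:  ("16QAM", 378),
    10: ("64QAM", 466),
    15: ("64QAM", 948),
}

def select_mcs(cqi):
    """Pick the most efficient table entry whose CQI index does not
    exceed the reported CQI (a conservative link-adaptation policy)."""
    candidates = [idx for idx in CQI_TABLE if idx <= cqi]
    if not candidates:
        return None  # channel too poor: fall back or do not schedule
    return CQI_TABLE[max(candidates)]

# A UE reporting CQI 9 is served with 16QAM; CQI 15 with high-rate 64QAM.
```

A lower reported CQI thus yields a more robust (lower-order, lower-rate) MCS, trading peak rate for reliability, exactly the adaptation described above.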

In this sense, channel-aware solutions are usually adopted in LTE resource allocation, since they are able to exploit channel quality variations by assigning higher priority to users experiencing better channel conditions. However, channel quality cannot be the only factor in the resource scheduling process. The scheduling algorithm must also consider, for example, the average throughput of the cell, the fairness index and, above all, the QoS requirements.
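A classic example of a channel-aware policy that also accounts for fairness is the Proportional Fair (PF) metric, widely explored in the scheduling literature (the function names and sample values below are assumptions for this sketch):

```python
# Illustrative sketch of the Proportional Fair (PF) scheduling metric:
# it favors UEs whose channel is momentarily good *relative to* the
# average throughput they have already received, balancing cell
# throughput against fairness.

def pf_metric(instant_rate, avg_throughput):
    """PF priority of a UE on one resource block."""
    return instant_rate / max(avg_throughput, 1e-9)  # guard against div-by-zero

def schedule_prb(instant_rates, avg_throughputs):
    """Assign one PRB to the UE with the highest PF metric."""
    metrics = {ue: pf_metric(r, avg_throughputs[ue])
               for ue, r in instant_rates.items()}
    return max(metrics, key=metrics.get)

# UE "a" has the better instantaneous channel (10 vs 6), but UE "b" has
# received far less throughput so far, so "b" wins this PRB:
rates = {"a": 10.0, "b": 6.0}   # achievable rates this subframe
avgs  = {"a": 8.0,  "b": 2.0}   # smoothed past throughput
winner = schedule_prb(rates, avgs)  # "b" (metric 3.0 vs 1.25)
```

Note that a pure PF metric still ignores QoS requirements such as delay budgets; the scheduler proposals discussed in this chapter extend channel-aware scheduling precisely in that direction.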

Finally, one can see that the design of a scheduling algorithm for LTE networks is a complex task. There are many issues to be addressed in order to obtain a high spectral efficiency and to meet the QoS requirements.

In this context, this chapter presents a study of the resource allocation process in LTE networks. This study starts with an overview of the main concepts involved in the LTE resource allocation. These concepts are important to understand the subsequent sections, which bring two new proposals of scheduling algorithms for downlink and uplink, respectively.

Key Terms in this Chapter

Meta-Heuristic: An approach used to find solutions to optimization problems. Meta-heuristic algorithms are not guaranteed to find the optimal solution every time they run; however, they may provide a sufficiently good solution to the problem.

3GPP: The 3rd Generation Partnership Project (3GPP) is a collaboration between groups of telecommunications associations, known as the Organizational Partners.

Penetration Loss: Indicates the fading of radio signals from an indoor terminal to a base station due to obstruction by a building.

Digital Filter: A filter that performs mathematical operations on a sampled, discrete-time signal to reduce or enhance certain aspects of that signal.

Multipath: Propagation in which radio signals travel by more than one route from transmitter to receiver and arrive at slightly different times, causing distortion in the received signal.

Path Loss: The reduction in power density (attenuation) of an electromagnetic wave as it propagates through space.

Fading: A variation in the strength of received radio signals due to variations in the conditions of the transmission medium.

Shadowing: The deviation of the power of the received electromagnetic signal from an average value. Caused by obstacles affecting the wave propagation. May vary with geographical position and/or radio frequency.
