Adaptive Cache Server Selection and Resource Allocation Strategy in Mobile Edge Computing


Michael Pendo John Mahenge, Edvin Jonathan Kitindi
DOI: 10.4018/IJICTHD.299412

Abstract

The enormous increase in data traffic generated by mobile devices poses challenges for both internet service providers (ISPs) and content service providers (CSPs). The objective of this paper is to propose a cost-efficient content delivery design that selects the best cache server on which to store frequently accessed content. The proposed strategy considers both caching and transmission costs. To balance these two costs, a weighted cost model based on the entropy weighting method (EWM) is proposed. Then, an adaptive cache server selection and resource allocation strategy based on deep reinforcement learning (DRL) is proposed to place the cache on the edge server closest to the end user. The proposed method reduces the cost of service delivery while meeting server storage capacity constraints and deadlines. Simulation experiments show that the proposed strategy effectively improves the cache-hit rate and reduces the cache-miss rate and content access costs.
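The entropy weighting method referred to above is a standard objective-weighting technique: criteria whose values vary more across the alternatives carry more information and receive larger weights. The sketch below is not the paper's implementation; it is a minimal illustration of EWM applied to a hypothetical cost matrix with two criteria (transmission cost and caching cost) over candidate edge servers.

```python
import math

def entropy_weights(costs):
    """Entropy weighting method (EWM): compute one weight per criterion
    (column) from an n-alternatives x m-criteria cost matrix."""
    n = len(costs)        # number of candidate servers
    m = len(costs[0])     # number of criteria (e.g. transmission, caching)
    # Normalize each criterion column to proportions p_ij.
    col_sums = [sum(row[j] for row in costs) for j in range(m)]
    p = [[row[j] / col_sums[j] for j in range(m)] for row in costs]
    # Entropy of each criterion; the term p*ln(p) is taken as 0 when p == 0.
    k = 1.0 / math.log(n)
    e = [-k * sum(p[i][j] * math.log(p[i][j]) for i in range(n) if p[i][j] > 0)
         for j in range(m)]
    # Degree of diversification 1 - e_j, normalized into weights.
    d = [1.0 - ej for ej in e]
    return [dj / sum(d) for dj in d]

def weighted_cost(costs):
    """Aggregate each server's criteria into a single weighted cost."""
    w = entropy_weights(costs)
    return [sum(w[j] * row[j] for j in range(len(w))) for row in costs]

# Hypothetical [transmission_cost, caching_cost] rows, one per edge server.
costs = [[4.0, 2.0], [3.0, 5.0], [6.0, 1.0]]
scores = weighted_cost(costs)
best = min(range(len(scores)), key=scores.__getitem__)  # cheapest server index
```

The paper combines this weighting step with a DRL-based selection policy; the `costs` values here are invented purely for illustration.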

Introduction

The rapid proliferation of smart mobile devices is significant in this era of big data, enabling access to delay-critical and resource-intensive mobile applications such as video-on-demand (Tran et al., 2017). While offering vast potential for anywhere, anytime accessibility, the large volume of data generated by mobile devices places a great burden on the core network, with data traffic expected to grow multi-fold in the future (Cisco, 2016; Jaleel et al., 2010). This enormous increase in data traffic poses challenges for both Internet Service Providers (ISPs) and Content Service Providers (CSPs). ISPs strive to provide quality services while minimizing operational expenses such as internet access costs. In the same vein, CSPs strive to enhance the quality of experience (QoE) for end users while achieving cost-efficient content delivery.

Cloud computing, as internet-based computing, has been considered important in providing quality services and handling big data processing (Skourlelopoulos et al., 2017). Consequently, large CSPs such as YouTube, Facebook, and Twitter store their content in massive cloud data centers. Advanced features of cloud computing, such as elastic on-demand resource assignment and virtually unlimited processing and storage resources, guarantee substantial capacity to deal with the huge amount of data emanating from mobile applications (Pompili et al., 2016). However, due to multi-hop communication between mobile devices and remote servers, legacy systems such as mobile cloud computing (MCC) still face performance challenges. In a traditional content delivery network (CDN), mobile devices form the frontend and CDN servers are deployed at the backend. Each mobile device is associated with a nearby base station (BS) or access point (AP) for internet access. Each content request received at the BS is forwarded through the core network to the CDN, which retrieves the requested content and responds to the requesting user. However, the overwhelming growth of resource-intensive applications with low-latency requirements challenges the traditional CDN with network overloading, high service utility cost, and inadequate service quality (Tran et al., 2017).

Recently, emerging computing paradigms such as MEC, which provide cloud computing facilities in the vicinity of mobile users, have been proposed (Hu et al., 2016). MEC is considered a significant computing paradigm for mitigating the immense pressure created by resource-intensive mobile applications (Tran et al., 2017). Meanwhile, mobile edge caching deployed at the BSs of the mobile network has been proposed as a novel and promising architecture that brings content into the proximity of content requesters (Wang et al., 2014). This architecture offers a substantial opportunity to achieve cost-efficient content delivery by caching the most frequently accessed content closer to users (Zhao et al., 2016). CSPs can therefore benefit from the Infrastructure-as-a-Service (IaaS) offered by MEC, which guarantees scalability, low service delivery cost, high performance, location awareness, and low delay, while mobile users benefit from the enhanced QoE achieved through content caching at the BS or AP. Moreover, MEC's cooperative capability offers a potential opportunity to improve QoE through cooperation between BSs and the central cloud (Tran et al., 2017). Despite these unique contributions of mobile edge caching, the limited cache storage capacity at the BS becomes a stumbling block to efficiently handling the enormous pressure triggered by latency-critical and resource-intensive mobile applications (Tran et al., 2017). Varying application and user preferences, the heterogeneity of MEC computing instances, and limited MEC resources such as bandwidth and power further intensify the cache server selection and resource allocation problem.
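The cache-hit and cache-miss rates discussed in this paper can be made concrete with a toy capacity-limited cache at a base station. The sketch below uses plain LRU eviction purely for illustration of why limited BS storage constrains the hit rate; the paper's actual strategy uses DRL-based placement, not LRU, and the content IDs are invented.

```python
from collections import OrderedDict

class EdgeCache:
    """Toy capacity-limited edge cache at a BS with LRU eviction.
    Illustrative only: the paper's strategy is DRL-based, not LRU."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()  # content_id -> cached flag, in LRU order
        self.hits = 0
        self.misses = 0

    def request(self, content_id):
        if content_id in self.store:
            self.hits += 1
            self.store.move_to_end(content_id)  # served from the edge
            return "hit"
        self.misses += 1                        # fetched via core network/CDN
        self.store[content_id] = True
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)      # evict least recently used
        return "miss"

    def hit_rate(self):
        total = self.hits + self.misses
        return self.hits / total if total else 0.0

# Small request trace against a cache that holds only 2 items.
cache = EdgeCache(capacity=2)
for cid in ["a", "b", "a", "c", "a", "b"]:
    cache.request(cid)
# With capacity 2, the trace yields 2 hits and 4 misses (hit rate 1/3).
```

Raising the capacity or caching the most frequently requested content first improves the hit rate, which is exactly the trade-off between caching cost and transmission cost that the proposed weighted cost model captures.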
