Resource Allocation Scheduling Algorithm Based on Incomplete Information Dynamic Game for Edge Computing

Bo Wang, Mingchu Li
Copyright: © 2021 | Pages: 24
DOI: 10.4018/IJWSR.2021040101

Abstract

With the advent of the 5G era, demands for features such as low latency and high concurrency are becoming increasingly significant. Sophisticated new network applications and services place stringent requirements on network transmission bandwidth, transmission latency, and user experience, confronting cloud computing with many technical challenges to its applicability. In response to cloud computing's shortcomings, edge computing has come into its own. However, many factors affect task offloading and resource allocation in the edge computing environment, such as task offloading latency, energy consumption, smart-device mobility, and end-user power. This paper proposes a dynamic multi-winner game model based on incomplete information to solve multi-user task offloading and edge resource allocation. First, based on bidding histories stored in edge data centers, a hidden Markov model predicts other end-users' bid prices at time t; based on these predicted prices, each end-user determines its own bid. The dynamic multi-winner game model then yields an offloading strategy that minimizes latency, energy consumption, and cost while maximizing end-user satisfaction at the edge data center. Finally, the authors design a resource allocation algorithm based on task priorities and types to implement resource allocation in edge data centers. To ensure the prediction model's accuracy, the authors also use the expectation-maximization algorithm to learn the model parameters. Comparative experiments show that the proposed model achieves better results in time delay, energy consumption, and cost.

Introduction

Cloud computing technology has dramatically promoted social development and produced substantial economic benefits. With the advent of the Internet of Things era, data obtained from terminal devices such as sensors, cameras, and smartphones has exploded. However, a significant amount of IoT data only needs local processing; if it is all transmitted back to a remote cloud computing center, it places tremendous pressure on network bandwidth and on the cloud data center. On the other hand, with the advent of the 5G era, demands for features such as low latency and high concurrency are becoming increasingly important. Sophisticated new network applications and services place stringent requirements on network transmission bandwidth, transmission latency, and user experience, confronting cloud computing with many technical challenges to its applicability. In response to cloud computing's shortcomings, edge computing (Satyanarayanan, 2017) has come into its own. Edge computing provides services between the cloud data center and end devices, offering a computing mode close to the user (Shi et al., 2016). In applications requiring high concurrency, low latency, and massive bandwidth, edge computing can play a role that traditional cloud computing cannot (Lopez et al., 2015). Using edge data center resources efficiently and rationally to reduce latency and energy consumption is therefore critical. Existing research covers two aspects: resource allocation in edge data centers and task offloading strategies. Resource allocation research asks how to efficiently allocate an edge data center's limited CPU, memory, storage, and other resources to meet more end-users' needs. Task offloading research asks how tasks should be offloaded between end-users and edge data centers in a multi-user edge computing environment to maximize overall benefit and ensure fairness.
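The offloading decision described above can be reduced to comparing the cost of local execution against the cost of uploading and executing at the edge. The following is a minimal sketch of that trade-off, assuming a weighted latency-plus-energy cost; all parameter names and values are illustrative, not taken from the paper.

```python
# Hypothetical sketch of the basic offloading trade-off: a task either runs
# locally or is offloaded to an edge data center, and the cheaper option
# (weighted sum of latency and device energy) is chosen.

def local_cost(cycles, cpu_freq, power, w_time=0.5, w_energy=0.5):
    """Cost of executing the task on the end device itself."""
    t = cycles / cpu_freq            # local execution time (s)
    e = power * t                    # energy consumed by the device (J)
    return w_time * t + w_energy * e

def offload_cost(data_bits, bandwidth, tx_power, cycles, edge_freq,
                 w_time=0.5, w_energy=0.5):
    """Cost of uploading the task and executing it at the edge."""
    t_tx = data_bits / bandwidth     # upload time (s)
    t_exec = cycles / edge_freq      # edge execution time (s)
    e = tx_power * t_tx              # device only pays transmission energy
    return w_time * (t_tx + t_exec) + w_energy * e

def should_offload(cycles, cpu_freq, power, data_bits, bandwidth,
                   tx_power, edge_freq):
    """True when offloading is the cheaper option for this task."""
    lc = local_cost(cycles, cpu_freq, power)
    oc = offload_cost(data_bits, bandwidth, tx_power, cycles, edge_freq)
    return oc < lc
```

With a fast edge CPU and a reasonable uplink, offloading wins; with a very slow uplink, the transmission time dominates and local execution is preferred.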
However, as many end-users join the IoT, the resource allocation and task offloading strategies of edge data centers face many challenges arising from end-device power, end-device mobility, energy consumption, and time delay. How to allocate an edge data center's limited service resources becomes an urgent problem. In practice, the edge data center, as an edge service provider, faces conflicts between energy consumption, cost, and user satisfaction when managing and allocating service resources; end-users, as consumers of edge services, want to pay a certain amount of money to complete their submitted tasks while ensuring high satisfaction and quality. Beyond existing research methods, game theory is a useful tool for resolving resource competition among multiple participants: the strategy adopted by each participant affects the strategies of the others, and each participant chooses the strategy that maximizes its own benefit. In this paper, the conflicts between energy consumption, cost, and user satisfaction are transformed into a competition between edge data centers and end-users, and an auction mechanism simulates the interaction between them, effectively resolving the conflict. This article's research therefore focuses on this conflict to discuss the pricing of service resources in edge data centers.
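The auction interaction sketched above can be illustrated with a generic multi-winner auction: end-users submit (demand, bid) pairs and the edge data center admits the highest bidders per resource unit until its capacity is exhausted. This is a simplified illustration under those assumptions, not the paper's exact mechanism.

```python
# Illustrative multi-winner auction: rank bidders by price per resource
# unit and allocate greedily within the edge data center's capacity.

def allocate_by_auction(bids, capacity):
    """bids: dict user -> (demand_units, bid_price).
    Returns dict of winning users and the units each receives."""
    # Rank users by the price they offer per resource unit, highest first.
    ranked = sorted(bids.items(),
                    key=lambda kv: kv[1][1] / kv[1][0], reverse=True)
    winners, remaining = {}, capacity
    for user, (demand, price) in ranked:
        if demand <= remaining:      # admit only if the full demand fits
            winners[user] = demand
            remaining -= demand
    return winners

bids = {"u1": (4, 20.0), "u2": (3, 9.0), "u3": (2, 8.0)}
print(allocate_by_auction(bids, capacity=6))   # u1 and u3 win; u2 does not
```

Here u1 (5.0/unit) and u3 (4.0/unit) outbid u2 (3.0/unit), so with 6 units of capacity the winners are u1 with 4 units and u3 with 2.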

This paper proposes a dynamic multi-winner game model based on incomplete information to solve multi-user task offloading and edge resource allocation. First, based on bidding histories stored in edge data centers, a hidden Markov model predicts other end-users' bid prices at time t; based on these predicted prices, each end-user determines its own bid. The dynamic multi-winner game model then yields an offloading strategy that minimizes latency, energy consumption, and cost while maximizing end-user satisfaction at the edge data center. Finally, the authors design a resource allocation algorithm based on task priorities and types to implement resource allocation in edge data centers. In addition, to ensure the prediction model's accuracy, the authors use the expectation-maximization algorithm to learn the model parameters. This paper makes the following contributions.
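The prediction step can be sketched with a discrete hidden Markov model: the forward algorithm filters the hidden state from the observed bid history, and one transition step ahead gives a distribution over the next bid level. The two hidden states and three bid levels below are invented for illustration; in the paper the parameters would be learned from stored histories with expectation-maximization (Baum-Welch), which is omitted here.

```python
# Sketch of HMM-based bid prediction. Hidden states might represent, say,
# "low-demand" and "high-demand" user regimes; observations are discretized
# bid levels 0 (low), 1 (mid), 2 (high). All probabilities are illustrative.

A  = [[0.7, 0.3],                    # state transition probabilities
      [0.4, 0.6]]
B  = [[0.6, 0.3, 0.1],               # P(bid level | hidden state)
      [0.1, 0.3, 0.6]]
pi = [0.5, 0.5]                      # initial state distribution

def predict_next_bid(obs):
    """Forward pass over an observed bid-level sequence, then a one-step
    prediction of the most likely next bid level."""
    alpha = [pi[s] * B[s][obs[0]] for s in range(2)]
    for o in obs[1:]:
        alpha = [B[s][o] * sum(alpha[q] * A[q][s] for q in range(2))
                 for s in range(2)]
    z = sum(alpha)
    filt = [a / z for a in alpha]                   # P(state_t | history)
    pred_state = [sum(filt[q] * A[q][s] for q in range(2))
                  for s in range(2)]                # P(state_{t+1} | history)
    pred_obs = [sum(pred_state[s] * B[s][o] for s in range(2))
                for o in range(3)]                  # P(next bid level)
    return max(range(3), key=lambda o: pred_obs[o])
```

A run of high bids leaves the filter in the high-demand regime, so the predicted next level stays high; a run of low bids predicts low.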
