Echo State Network-Based Content Prediction for Mobile Edge Caching Networks

Zengyu Cai (Zhengzhou University of Light Industry, China), Xi Chen (Zhengzhou University of Light Industry, China), Jianwei Zhang (Zhengzhou University of Light Industry, China), Liang Zhu (Zhengzhou University of Light Industry, China), and Xinhua Hu (Zhengzhou University of Light Industry, China)
DOI: 10.4018/IJITWE.317219


With the rapid development of internet communication and the wide application of intelligent terminals, moving the cache to the edge of the network is an effective way to shorten the delay of users accessing content. However, existing caching work lacks a comprehensive consideration of both users and content, resulting in a low cache hit ratio and low accuracy for the whole system. In this paper, the authors propose a collaborative caching model that considers both user request content and content prediction, so as to improve the caching performance of the whole network. First, the model clusters users with a clustering algorithm based on the Akaike information criterion. Then, combined with the clustering results, an echo state network is used as the machine learning framework to predict content. Finally, the cache contents are selected according to the prediction results and stored in the cache unit of the small base station. Simulation results show that, compared with existing caching algorithms, the proposed method achieves clear improvements in cache hit ratio, accuracy, and recall rate.
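The user-clustering step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes plain k-means scored by a spherical-Gaussian Akaike information criterion (AIC), and the two-dimensional user feature vectors, cluster-count range, and all parameters below are hypothetical placeholders for whatever request-history features the model actually uses.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical user feature vectors (e.g., request-history embeddings);
# three synthetic groups stand in for the real user data.
users = np.vstack([rng.normal(loc, 0.3, (50, 2))
                   for loc in ([0, 0], [3, 0], [0, 3])])

def kmeans(X, k, iters=50, restarts=5):
    """Plain Lloyd's k-means with random restarts; returns centers, labels."""
    best = None
    for _ in range(restarts):
        centers = X[rng.choice(len(X), k, replace=False)].copy()
        for _ in range(iters):
            labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
            for j in range(k):
                if (labels == j).any():
                    centers[j] = X[labels == j].mean(axis=0)
        rss = ((X - centers[labels]) ** 2).sum()
        if best is None or rss < best[0]:
            best = (rss, centers, labels)
    return best[1], best[2]

def aic(X, centers, labels):
    """AIC under a spherical-Gaussian model: -2 ln L + 2 * #parameters."""
    n, d = X.shape
    rss = ((X - centers[labels]) ** 2).sum()
    sigma2 = max(rss / (n * d), 1e-12)           # shared variance estimate
    log_lik = -0.5 * n * d * (np.log(2 * np.pi * sigma2) + 1)
    n_params = centers.size + 1                   # cluster means + variance
    return -2 * log_lik + 2 * n_params

# Pick the cluster count that minimizes AIC over a candidate range
scores = {k: aic(users, *kmeans(users, k)) for k in range(1, 7)}
best_k = min(scores, key=scores.get)
```

The point of the AIC score is that it trades goodness of fit against model size, so the cluster count is chosen by the data rather than fixed in advance.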
Article Preview


As vast amounts of information travel through the network, most internet traffic is related to content distribution. Therefore, meeting the low-latency transmission and high-throughput requirements of different types of traffic is an inevitable requirement for improving user experience and network computing performance (Du et al., 2021). To cope with rapidly growing network traffic and relieve pressure on the core network, caching has been studied as an effective tool to reduce latency by prestoring the most popular content in cache space. At the same time, with the wide application of artificial intelligence in daily life, research on related techniques has gradually received extensive attention. The echo state network is an important method in this field. As a nonlinear adaptive dynamic system with fast training, it has been successfully applied to the prediction of network traffic (Zhang et al., 2021).
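The "fast training" property of an echo state network comes from its structure: the input and reservoir weights are generated randomly and left fixed, and only a linear readout is fitted, typically by ridge regression. The sketch below illustrates this on a toy next-step prediction task; the reservoir size, leak rate, spectral radius, ridge strength, and the sine-wave stand-in for a demand series are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions and hyperparameters for illustration
n_inputs, n_reservoir = 1, 100
spectral_radius, leak_rate, ridge = 0.9, 0.3, 1e-6

# Fixed random weights; the reservoir matrix is rescaled so its largest
# eigenvalue magnitude equals the target spectral radius (echo state property)
W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs))
W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u_seq):
    """Drive the leaky-integrator reservoir with an input sequence."""
    x = np.zeros(n_reservoir)
    states = []
    for u in u_seq:
        pre = W_in @ np.atleast_1d(u) + W @ x
        x = (1 - leak_rate) * x + leak_rate * np.tanh(pre)
        states.append(x.copy())
    return np.array(states)

# Toy demand series; train the readout to predict the next value
series = np.sin(np.linspace(0, 8 * np.pi, 400))
X = run_reservoir(series[:-1])   # reservoir states for inputs t = 0..T-2
Y = series[1:]                   # targets, shifted by one step

# Ridge regression for the readout: the ONLY trained part of an ESN
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_reservoir), X.T @ Y)
pred = X @ W_out
```

Because training reduces to one linear solve, the model can be refitted cheaply as request statistics drift, which is what makes ESNs attractive for online traffic and popularity prediction.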

To improve the user experience and reduce backhaul traffic, much work related to caching technology has been done in recent years (Li et al., 2020; Hu et al., 2021; Thar et al., 2016; Chhangte et al., 2021). Li et al. (2020) propose a probabilistic cache placement method based on content centrality. It improves the cache hit rate and cache content utilization by considering content centrality and content acquisition delay to adaptively compute each node's caching probability; however, its computational complexity is high, so it suits only small and medium-sized networks. Hu et al. (2021) propose an edge network caching strategy based on social relationship awareness. It maps the strength of users' social relationships from the similarity of their demands and then selects users as auxiliary cache locations according to that strength. The strategy yields some improvement in cache hit rate and system cache delay but falls short on edge cache updates. Thar et al. (2016) propose a core-router cache decision algorithm that improves cache hit rate and reduces content acquisition delay and hit distance, but it does not consider user interest or the popularity of hot content. Chhangte et al. (2021) propose a service that implements distributed caching at the Wi-Fi edge. By combining distributed caching with software-defined networking, it effectively improves the user experience in the target network, but at some cost in content transmission delay.

Although existing caching strategies improve the cache hit rate, many problems remain in practice (Serhane et al., 2021; Ren et al., 2020; Krishnendu et al., 2022). Serhane et al. (2021) propose a chemical-reaction-inspired cache-optimization algorithm; although it reduces energy consumption, it does not consider the importance of content. Ren et al. (2020) propose an unmanned aerial vehicle (UAV) deployment and caching strategy based on user preference prediction. Combined with UAV base station scheduling, it optimizes the average cache hit rate of the system but does not address user distribution. Krishnendu et al. (2022) propose a strategy for wireless edge caching and content popularity prediction using machine learning; however, the popularity of cached content lags behind changes in user preferences, which lowers the cache hit rate. In addition, the statistics of content popularity in real networks are often discretely distributed. To address this, Chen et al. (2017) propose an algorithm that combines the echo state network machine-learning framework with sublinear algorithms to study proactive caching in cloud radio access networks. Compared with traditional content popularity prediction algorithms, the echo state network can learn the distribution of user content requests without extensive training, but that work does not consider content correlation.
