Statistical Inference-Based Cache Management for Mobile Learning

Qing Li, Jianmin Zhao, Xinzhong Zhu
DOI: 10.4018/978-1-60960-539-1.ch006

Abstract

Supporting efficient data access in mobile learning environments has become an active research problem in recent years, and it is harder still when clients use lightweight mobile devices such as cell phones, whose limited storage space prevents them from holding a large cache. A practical solution is to store cached data at nearby proxies, so that mobile devices can access the data from these proxies instead of from data servers, thereby reducing latency. However, when mobile devices move freely, the cached data may not improve overall performance, because it can end up too far away for clients to access. In this article, we propose a statistical caching mechanism that uses prior knowledge (statistical data) to predict user movement patterns and then replicates/migrates cache objects among different proxies. We propose a statistical inference based heuristic search algorithm to accommodate dynamic mobile data access in the mobile learning environment. Experimental studies show that, with acceptable complexity, our algorithm achieves good performance on caching mobile data.
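
To illustrate the kind of mechanism the abstract describes, here is a minimal sketch, not the authors' actual algorithm: assume each user's movement among proxy cells is summarized by a first-order Markov transition model estimated from past traces, and the proxy most likely to host the user next receives the cache object when the expected access benefit outweighs the transfer cost. The transition statistics, cost model, and threshold below are all assumptions made for illustration.

```python
# Illustrative sketch only: a first-order Markov movement predictor driving a
# replicate-or-migrate decision among proxies. The transition statistics,
# cost model, and threshold are assumptions, not the chapter's algorithm.
from collections import defaultdict


class MovementModel:
    """Estimates P(next proxy | current proxy) from observed user traces."""

    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))

    def observe(self, trace):
        # trace: ordered list of proxy ids visited by one user
        for cur, nxt in zip(trace, trace[1:]):
            self.counts[cur][nxt] += 1

    def next_proxy_distribution(self, current):
        outgoing = self.counts[current]
        total = sum(outgoing.values())
        if total == 0:
            return {}
        return {proxy: c / total for proxy, c in outgoing.items()}


def plan_cache_action(model, current_proxy, transfer_cost, access_benefit,
                      migrate_threshold=0.5):
    """Decide whether to keep, replicate, or migrate a cache object.

    Heuristic (assumed for illustration): if the most likely next proxy is
    predicted with enough confidence and the expected benefit of serving the
    object there outweighs the transfer cost, copy or move it ahead of the user.
    """
    dist = model.next_proxy_distribution(current_proxy)
    if not dist:
        return ("stay", current_proxy)
    target, prob = max(dist.items(), key=lambda kv: kv[1])
    expected_gain = prob * access_benefit - transfer_cost
    if expected_gain <= 0:
        return ("stay", current_proxy)
    # High-confidence prediction: migrate (single copy); otherwise replicate.
    return ("migrate" if prob >= migrate_threshold else "replicate", target)


if __name__ == "__main__":
    model = MovementModel()
    model.observe(["A", "B", "C", "B", "C"])
    model.observe(["A", "B", "C"])
    print(plan_cache_action(model, "B", transfer_cost=1.0, access_benefit=5.0))
```

A usage note on the sketch: the thresholded choice between replication (keep copies at both proxies) and migration (move the single copy) stands in for the heuristic search the chapter develops, which weighs these options across proxies using the statistical movement data.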
Chapter Preview

In this section, we review earlier research related to our work. These relevant works fall into three categories: mobility models (movement prediction), data migration, and data caching.
