Introduction
Employees spend more and more of their work time away from their desks, e.g., visiting customers or performing on-site maintenance. Therefore, access to enterprise information systems (EIS) from mobile devices is becoming increasingly important.
Today, most EIS, like customer relationship management (CRM) systems or enterprise resource planning (ERP) systems, use a service-oriented architecture (SOA). They use and compose various backend services to implement their functionality. Hence, frameworks like those described in (Hamdi, Wu & Benharref, 2008; Natchetoi, Kaufmann & Shapiro, 2008; Tergujeff, Haajanen, Leppanen & Toivonen, 2007; Wu, Gregoire, Mrass, Fung & Haslani, 2008) emerged, aiming to support mobile SOA access. These frameworks employ several techniques to address the peculiarities of mobile scenarios. For example, they introduce a client-side cache to bridge temporary loss of network connectivity, or use data compression to avoid long download times in low-bandwidth mobile networks. In general, the scenario of mobile SOA access is shown in the lower part of Figure 1: a client accesses a proxy server via a low-bandwidth, high-latency connection, e.g., EDGE, and the proxy accesses one or more backend servers via high-bandwidth, low-latency connections, e.g., a corporate LAN.
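The cited frameworks' internals are not detailed here; as a minimal sketch of the client-side caching idea mentioned above (all class and method names are hypothetical, not taken from any of the cited frameworks), a cache with a time-to-live could serve stored responses while connectivity is temporarily lost:

```python
import time


class ResponseCache:
    """Client-side response cache: serves previously stored service
    responses, e.g., to bridge a temporary loss of connectivity."""

    def __init__(self, ttl_seconds=300):
        self.ttl = ttl_seconds
        self.entries = {}  # request key -> (response, time stored)

    def put(self, request, response):
        self.entries[request] = (response, time.time())

    def get(self, request):
        entry = self.entries.get(request)
        if entry is None:
            return None
        response, stored_at = entry
        if time.time() - stored_at > self.ttl:
            del self.entries[request]  # entry expired, drop it
            return None
        return response


# While the network is unavailable, a cached response can still be served:
cache = ResponseCache(ttl_seconds=60)
cache.put("getCustomer?id=42", {"name": "ACME Corp."})
print(cache.get("getCustomer?id=42"))
```

A real framework would additionally bound the cache size and define an invalidation policy; this sketch only illustrates the basic mechanism.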
Figure 1. Mobile (lower part) and non-mobile (upper part) setup. In a mobile setup, the position of the low-bandwidth, high-latency link is reversed compared to the non-mobile setup
However, the usability of mobile SOA is still greatly affected by the high latency of mobile network connections. A typical use case for an EIS requires several roundtrips from the mobile device to the SOA infrastructure, as multiple services need to be accessed. Each roundtrip incurs a relatively large overhead during connection setup (Xun, Liao & Zhu, 2008). This leads to noticeable and disturbing delays in the UI (Pervilä & Kangasharju, 2008), reducing the usability of mobile SOA. Keeping the network connection open over long periods to avoid this overhead is not an option, as it consumes too much battery power, which is a scarce resource on mobile devices (Cao, 2002).
In this paper, we present prefetching and caching enhancements for an existing framework for mobile SOA access (Hamdi, Wu & Benharref, 2008) that reduce user-perceived latency. User-perceived latency is the latency measured at the UI level during a typical use case of mobile SOA access, as suggested in (Domenech, Pon, Sahuquillo & Gil, 2007).
Our enhancements do not decrease the battery lifetime of the mobile device, as the prefetched data is sent along with legitimate response data. Prefetching is performed at a proxy server that employs a sequence prediction algorithm to anticipate future requests. Using a sequence prediction algorithm for prefetching avoids the need to hand-craft prefetching functionality at the application level. On the client, the prefetched data is stored in a cache. Thus, no communication overhead occurs should the data actually be requested by the user at a later point in time.
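The introduction does not yet specify which sequence prediction algorithm the proxy uses. As an illustration of the general idea only, and under the assumption of a simple first-order Markov predictor over service request names (all identifiers below are hypothetical), the proxy-side prediction could be sketched as:

```python
from collections import Counter, defaultdict


class SequencePredictor:
    """First-order Markov sequence predictor: counts which request
    typically follows which, and predicts the most frequent successor
    of the most recently observed request."""

    def __init__(self):
        self.successors = defaultdict(Counter)  # request -> successor counts
        self.last = None

    def observe(self, request):
        """Record an incoming request and update the transition counts."""
        if self.last is not None:
            self.successors[self.last][request] += 1
        self.last = request

    def predict(self):
        """Return the most likely next request, or None if unknown."""
        if self.last is None or not self.successors[self.last]:
            return None
        return self.successors[self.last].most_common(1)[0][0]


# A typical EIS use case tends to access services in a recurring order,
# so the predictor learns the pattern from past request sequences:
predictor = SequencePredictor()
for req in ["login", "listCustomers", "customerDetails",
            "login", "listCustomers"]:
    predictor.observe(req)
print(predictor.predict())  # -> "customerDetails"
```

On a prediction, the proxy would fetch the predicted response from the backend and piggyback it onto the current response, where the client stores it in its cache; how the actual framework scores and selects predictions is described later in the paper.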