1. Introduction
The World Wide Web is highly decentralized and distributed by nature. However, the traditional web service model is centralized from the perspective of a single website: all requests from end users (clients) for a given webpage are handled by a single web server (the origin server) hosting the requested content, as shown in Figure 1.
Figure 1. Centralized web service model
In this model, clients closer to the web server can access content faster than clients farther away, and the centralized approach leads to scalability, reliability and performance problems for servers hosting popular websites. As the number of requests grows, so does the load on the server and on the network link connecting to it. When requests exceed the server's processing capacity or the bandwidth of that link, access delays increase and the website may even become unavailable (Hofmann & Beaumont, 2005). With this in view, various techniques have been implemented to improve the Quality of Service (QoS) of web content delivery to end users. These techniques include increasing server capacity, clustering servers, caching on the client side and/or on proxy servers, establishing mirror servers, connecting the server through multiple Internet Service Provider (ISP) links, and outsourcing to Content Delivery Network (CDN) service providers. Alongside these techniques, efforts have been made to develop algorithms for selecting a server to satisfy an end user's request (Gromov & Chebotareva, 2014) and for balancing load across servers while preserving reliability (Gupta et al., 2015) and scalability (Flavel et al., 2015), so as to improve the overall performance of the web. In (Juneja & Garg, 2012), the collective intelligence of the ant colony approach is proposed for finding the optimal network path and balancing the load on web servers, whereas an artificial neural network approach is used in (Amin & Garg, 2014) to model, from web log files, the process users follow in accessing web content.
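The server-selection and load-balancing ideas mentioned above can be sketched in a few lines. The following Python fragment is a hypothetical illustration, not an implementation of any of the cited algorithms: it picks the replica with the lowest measured round-trip time, and it spreads requests round-robin across servers. The server names and RTT figures are made-up assumptions.

```python
def select_server(rtt_ms: dict) -> str:
    """Pick the replica with the smallest recently measured RTT (ms)."""
    return min(rtt_ms, key=rtt_ms.get)

def balance_load(servers: list, requests: list) -> dict:
    """Round-robin load balancing: assign requests evenly across servers."""
    assignment = {s: [] for s in servers}
    for i, req in enumerate(requests):
        assignment[servers[i % len(servers)]].append(req)
    return assignment

# Illustrative measurements (hypothetical): the US edge replica wins.
rtts = {"edge-eu": 40, "edge-us": 25, "origin": 110}
best = select_server(rtts)            # "edge-us"

# Four requests spread over two servers, two apiece.
plan = balance_load(["s1", "s2"], ["r0", "r1", "r2", "r3"])
```

In practice, selection policies combine latency with server load and content availability; this sketch isolates only the latency criterion to keep the idea visible.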
In this paper, Section 2 sheds light on the concept of QoS in the context of web content delivery. Different techniques to improve QoS are analyzed in Section 3. Section 4 gives a comparative analysis of these techniques. In Section 5, CDN is compared with some other distributed systems. Section 6 concludes the paper.
2. Quality of Service of Web Content Delivery
The term Quality of Service has been used to express non-functional requirements of web services (Ran, 2003). It is a set of service requirements that a network needs to meet: a measure of how a particular service responds to the requester, compared with the service expected. It is an important criterion for judging the quality of web content delivery to end users, and it may vary greatly because of the dynamic and unpredictable nature of the Internet.
Various web applications differ in their QoS requirements. Data applications such as file transfer are generally not delay sensitive and can recover from packet loss via retransmission. Media-streaming applications require a fixed bandwidth, while business applications require security and transactional QoS. A best-effort service applied uniformly to all applications may therefore not be adequate. Further, different users may have different QoS requirements depending on their needs, and human patience imposes lower throughput bounds on applications such as web browsing. QoS can thus be defined as providing service differentiation and performance assurance for Internet applications (Zhao et al., 2000). In (Statovci-Halimi & Franzl, 2013), the coexistence of quality-engineering mechanisms with differentiated QoS charging has been suggested as a way to steer the speed and quality of future network development.
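As a hypothetical sketch of what service differentiation can mean in practice, the fragment below tags each request with an application class and serves delay-sensitive classes first. The class names and priority values are illustrative assumptions for this sketch, not a standard classification.

```python
import heapq

# Lower number = higher priority (served first). Illustrative values only.
CLASS_PRIORITY = {
    "media-streaming": 0,  # delay-sensitive: needs steady, timely service
    "web-browsing": 1,     # bounded by human patience
    "file-transfer": 2,    # delay-tolerant: can wait and retransmit
}

def schedule(requests):
    """Order (class, request) pairs by class priority, stable within a class."""
    heap = [(CLASS_PRIORITY[cls], i, req) for i, (cls, req) in enumerate(requests)]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[2] for _ in range(len(heap))]

reqs = [("file-transfer", "f1"), ("media-streaming", "m1"),
        ("web-browsing", "w1"), ("media-streaming", "m2")]
order = schedule(reqs)  # streaming first, then browsing, then transfer
```

A strict-priority scheduler like this can starve low-priority classes under sustained load; real QoS mechanisms typically use weighted or fair variants, but the strict version shows the differentiation idea most plainly.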