Data Caching Patterns

Tony C. Shan (IBM, USA) and Winnie W. Hua (CTS Inc., USA)
Copyright © 2009 | Pages: 11
DOI: 10.4018/978-1-59904-845-1.ch019


As information technology (IT) has become integral to business in today's globalized economy, business models demand increasingly higher performance from information systems to support operations and help the business compete and succeed. IT must strive to be nimble and adaptive, providing a higher level of service while reducing the total cost of ownership (TCO). In most situations, the current enterprise infrastructure must be extended to get the most out of existing investments. Creating innovative solutions is an effective approach to achieving this goal, and scalable data management is one of the most valuable such innovations.
Chapter Preview



In general, a data cache is defined as a data block that contains frequently accessed content in a text or binary format. A persistent data block is saved in storage on either the client or the server side. Alternatively, a transient data block may be stored in a memory buffer for the lifetime of an application, a user session, or a single client request. Caching is a widely used technique to boost the performance of data access. When a program needs a particular data element, the process first checks the data cache to see whether the element has been previously retrieved and stored. If a match is found, the application uses the data directly from the cache rather than accessing the data source again. As a result, a drastic performance gain is realized, since data access in RAM is significantly faster than access to a disk or to an external resource over the network. In addition, a cached data element is usually in a ready-to-use form, so little or no transformation and initialization is needed, further improving processing performance.
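The check-then-load flow described above is often called the cache-aside pattern. A minimal sketch in Java follows; the `DataCache` class name and the use of a `Function` to stand in for the slow backing store (a database or remote service) are illustrative choices, not from the chapter:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

/** Minimal cache-aside lookup: check the cache first, fall back to the source. */
public class DataCache<K, V> {
    private final Map<K, V> cache = new ConcurrentHashMap<>();
    private final Function<K, V> source; // the slow backing store (DB, remote service)

    public DataCache(Function<K, V> source) {
        this.source = source;
    }

    /** Return the cached value if present; otherwise load it once and keep it. */
    public V get(K key) {
        return cache.computeIfAbsent(key, source);
    }

    public int size() {
        return cache.size();
    }
}
```

On a second request for the same key, the value comes straight from the in-memory map and the backing store is not touched again. A production cache would also need eviction and expiry policies, which this sketch omits.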

Value Proposition

Generally speaking, many performance challenges may be resolved via horizontal or vertical scaling. Horizontal scaling adds extra servers; vertical scaling upgrades the existing machines with more and faster processors, additional memory, larger hard disks, and/or higher-speed network connections. In today's competitive environment, however, the ultimate challenge is balancing overall project cost against the on-demand scalability needed to meet capacity requirements. Without examining the end-to-end performance chain, particularly at the application software level, simply investing more in hardware usually does not fix the root cause, even if it temporarily alleviates the symptom. In other words, a more holistic approach should be taken to improve the overall architectural design. The best solution for systematically addressing performance issues is usually aggressive use of data caching technology.


Data Caching Patterns

A wide range of caching patterns can be applied individually or in combination to increase application performance. Each pattern was designed with its own specific merits and addresses a certain type of data access issue. Classifying data caching patterns is a challenging task, and different classification schemes result depending on the criteria applied. For example, the patterns may be categorized as creational, structural, and behavioral. Alternatively, they may be grouped at the level of the method, class, component, application, platform, or system.

A vast majority of today’s distributed applications are developed in either Java or .NET on an n-tier architecture, which consists of a series of logical/physical layers: client, Web, application, integration, and data and enterprise resource. Accordingly, a taxonomic scheme is defined in Figure 1 to sort various caching patterns into appropriate layers. Furthermore, those patterns that can be used in multiple layers are grouped in the cross-layer category.

Figure 1.

Data caching patterns


Key Terms in this Chapter

Connection Pool: A cached store of connections maintained in memory so that the connections in the store can be reused rather than re-established for each request.
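The reuse idea behind a connection pool can be sketched with a fixed-size pool of generic resources. This is an illustrative simplification (the `SimplePool` name and generic resource type are assumptions, not from the chapter); real pools such as JDBC connection pools also handle validation, timeouts, and connection lifecycle:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.function.Supplier;

/** Illustrative fixed-size resource pool: create once up front, then borrow and return. */
public class SimplePool<T> {
    private final BlockingQueue<T> idle;

    public SimplePool(int size, Supplier<T> factory) {
        idle = new ArrayBlockingQueue<>(size);
        for (int i = 0; i < size; i++) {
            idle.add(factory.get()); // pay the creation cost once, at startup
        }
    }

    /** Borrow a resource, blocking until one is free. */
    public T acquire() throws InterruptedException {
        return idle.take();
    }

    /** Return a resource so other callers can reuse it. */
    public void release(T resource) {
        idle.offer(resource);
    }

    public int available() {
        return idle.size();
    }
}
```

The expensive step (opening a connection) happens only during construction; every subsequent `acquire`/`release` cycle reuses an already-open resource.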

Web Application Framework: A reusable, skeletal, semicomplete modular platform that can be specialized to produce custom Web applications, which commonly serve Web browsers via the HTTP(S) protocol.

Data Cache: A data block that contains frequently accessed data in a textual or binary format, which may be either saved to persistent storage on the client or server side, or kept in memory for the lifetime of a single client request, a user session, or an application process.

HTTP Cookie: A message given to a Web browser by a Web server; the browser stores this text string and sends it back to the server each time it requests a page from that server.
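The exchange can be sketched as a pair of hypothetical HTTP messages (the `sessionId=abc123` value and `/account` path are made up for illustration): the server sets the cookie in a response, and the browser echoes it on later requests.

```
HTTP/1.1 200 OK
Set-Cookie: sessionId=abc123; Path=/; HttpOnly

GET /account HTTP/1.1
Cookie: sessionId=abc123
```

Because the cookie travels with every subsequent request to that server, it can serve as a client-side cache key for per-user state such as a session.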

AJAX: Asynchronous JavaScript and XML, a technique for exchanging data with a server and updating parts of a Web page without reloading the whole page.

Edge Side Includes: A markup language that enables partial page caching for HTML fragments.
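A minimal ESI sketch follows, assuming a hypothetical fragment path: the surrounding page can be cached at the edge, while the `<esi:include>` tag marks the one fragment that must be fetched fresh and assembled into the page.

```html
<html>
  <body>
    <h1>Product Catalog</h1>
    <!-- Static shell is cached; only this fragment is fetched per request -->
    <esi:include src="/fragments/latest-prices.html" />
  </body>
</html>
```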

Web Application: A server-based application that is accessed with a Web browser over a network.

O-R Mapping: A technology that integrates object-oriented programming language capabilities with relational databases.

Proxy Server: A software or hardware device that enables applications to connect to other network resources in an indirect fashion.

Design Patterns: Common solutions to common problems, particularly in software design.
