Improving Data Quality in Health Care


Karolyn Kerr, Tony Norris
DOI: 10.4018/978-1-60566-026-4.ch295

Abstract

The increasingly information intensive nature of health care demands a proactive and strategic approach to data quality to ensure the right information is available to the right person at the right time in the right format. The approach must also encompass the rights of the patient to have their health data protected and used in an ethical way. This article describes the principles to establish good practice and overcome practical barriers that define and control data quality in health data collections and the mechanisms and frameworks that can be developed to achieve and sustain quality. The experience of a national health data quality project in New Zealand is used to illustrate the issues.

Background

Tayi and Ballou (1998) define data as “the raw material for the information age.” English (1999) builds on the idea of information as being data in context, with knowledge being information in context, where you know the significance of the information. Translating information into knowledge requires experience and reflection.

Klein and Rossin (1999) note that there is no single definition of data quality accepted by researchers and practitioners in the discipline. Data quality is commonly given a consumer-focused definition (consumers being the people or groups who use organisational data to make business decisions): quality data are “data that are fit for use” (Loshin, 2001; Redman, 2001; Wang, Strong, & Guarascio, 1996). Data quality is ‘contextual’; the user defines what constitutes good data quality for each proposed use of the data, within its context of use (Pringle, Wilson, & Grol, 2002; Strong, Lee, & Wang, 1997). Therefore:

Data are of high quality if they are fit for their intended uses in operations, decision-making, and planning. Data are fit for use if they are free of defects and possess desired features (Redman, 2001).
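
This “fit for use” definition is contextual: the same records can be fit for one purpose and unfit for another. As a minimal sketch (not from the chapter; the field names and per-use rules are hypothetical examples):

```python
# Illustrative sketch: "fitness for use" is contextual, so the same
# records can pass one intended use and fail another. Field names
# ("nhi", "dob", "ethnicity") and rules are hypothetical examples.

records = [
    {"nhi": "ABC1234", "dob": "1970-05-01", "ethnicity": None},
    {"nhi": "XYZ9876", "dob": "1985-11-23", "ethnicity": "Maori"},
]

# Requirements differ per intended use: billing needs an identifier
# and birth date; population reporting additionally needs ethnicity.
rules = {
    "billing": ["nhi", "dob"],
    "population_reporting": ["nhi", "dob", "ethnicity"],
}

def fit_for_use(record, use):
    """A record is 'fit' for a given use if every field that use
    requires is populated."""
    return all(record.get(field) is not None for field in rules[use])

for use in rules:
    fit = sum(fit_for_use(r, use) for r in records)
    print(f"{use}: {fit}/{len(records)} records fit for use")
```

Here the first record is fit for billing but not for population reporting, illustrating that quality is judged against the intended use, not in the abstract.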

Data quality is now emerging as a discipline, with specific research programmes underway within universities, the most significant being the Information Quality Programme at the Sloan School of Management, Massachusetts Institute of Technology (MIT). The field is based upon the well-established quality discipline, drawing on the work of Deming (1982) and the adaptation of the “plan-do-check-act” cycle (the Deming Cycle). It also draws upon the “quality is free” concept of Crosby (1980), arising from the notion that doing things wrong is costly, and imports the ideas behind the Six Sigma approach and Total Quality Management (Juran & Godfrey, 1999), adapted as Total Data Quality Management (TDQM) and the management of information as a product (Wang, Lee, Pipino, & Strong, 1998).

The research programmes are developing ways to combine TDQM with the strategic direction of the organization, aligning the data quality requirements with overall goals. At present, there is little research published in this area, although some organizations do have data quality programmes with some strategic alignment to the business requirements.

Data quality is also becoming an increasingly important issue for health care providers, managers, and government departments. The movement towards total quality management in health care to improve patient safety and health care efficiency demands high quality information. Further, evidence-based care requires the assimilation of large amounts of relevant and reliable research data available at the point of clinical decision making. Strategic prevention, national consistency of improvement practices, evolving data standards, and targeted improvements with increasing consumer involvement are all moving health care towards a TDQM model of data quality management.

Key Terms in this Chapter

Complex Adaptive System: A system with many independent agents, each of which can interact with others.

Emergent Strategy: A series of actions that converges into a pattern, which becomes deliberate when the pattern is recognized and legitimized by senior management.

Data Quality Framework: A tool for the assessment of data quality within an organization; a vehicle that an organization can use to define a model of its data environment, identify relevant data quality attributes, analyse data quality attributes in their current or future context, and provide guidance for data quality improvement.

Data Quality Improvement Strategy: A cluster of decisions centered on organizational data quality goals that determine the data processes to improve, solutions to implement, and people to engage.

Data Quality Dimensions: Quality properties or attributes of data; a set of data quality attributes that most data consumers react to in a fairly consistent way.
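
Dimensions such as completeness and validity can be measured directly against a data collection. The following sketch is illustrative only: the dimension choices, field names, and the simplified identifier format are assumptions, not the chapter's model.

```python
# Hypothetical sketch: scoring a small data set against two common
# data quality dimensions (completeness, validity). Field names and
# the simplified identifier pattern are illustrative assumptions.
import re

records = [
    {"nhi": "ABC1234", "dob": "1970-05-01"},
    {"nhi": "bad-id",  "dob": "1985-11-23"},
    {"nhi": "XYZ9876", "dob": None},
]

# Simplified format check for the identifier field (assumption).
NHI_PATTERN = re.compile(r"^[A-Z]{3}\d{4}$")

def completeness(records, field):
    """Fraction of records in which the field is populated."""
    return sum(r[field] is not None for r in records) / len(records)

def validity(records, field, pattern):
    """Fraction of populated values matching the expected format."""
    values = [r[field] for r in records if r[field] is not None]
    return sum(bool(pattern.match(v)) for v in values) / len(values)

print(f"dob completeness: {completeness(records, 'dob'):.2f}")        # 0.67
print(f"nhi validity:     {validity(records, 'nhi', NHI_PATTERN):.2f}")  # 0.67
```

Scoring each dimension separately, rather than a single aggregate score, reflects the point that different consumers weight the dimensions differently for their context of use.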

Total Data Quality Management: An approach that manages data proactively as the outcome of a process, a valuable asset rather than the traditional view of data as an incidental by-product.

Datum: A fact or value assigned to a variable; a single observational point that characterises a relationship. Data is the plural of datum.

Grounded Theory: A method of extracting meaning and theories from data by systematically and intensively analysing and coding the data, sentence-by-sentence or phrase-by-phrase.
