Future Perspective: Data Validity-Driven Report Optimization

Piotr Augustyniak, Ryszard Tadeusiewicz
Copyright: © 2009 | Pages: 17
DOI: 10.4018/978-1-60566-080-6.ch011

Abstract

This chapter first describes a common approach to ECG interpretation triggering, which assumes that the parameters are updated each time new input data become available. The heartbeat detector runs for each acquired sample, and all heartbeat-based diagnostic parameters (e.g., arrhythmias) are calculated immediately after a positive detection of a heartbeat. This approach keeps the diagnostic parameters up to date at the rate set by the physical variability limit of their source, at the cost of unnecessary computation: slowly changing parameters are significantly over-sampled, and consecutive values show high redundancy. At the end of the chapter, we present the concept of a packet-content manager. This procedure collects all requests concerning diagnostic parameter updates and computation, supervises the propagation of the validity attribute backwards through the processing chain, and provides an exceptional pathway for all abnormality alerts emerging in the processing chain. As a result, all parameters are reported as rarely as possible without violating the Shannon sampling rule.
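To make the idea concrete, the following is a minimal conceptual sketch, in Python, of such a packet-content manager. All names, the dependency graph, and the numbers are illustrative assumptions, not the chapter's actual implementation: each report request carries a required update interval that is propagated backwards onto every prerequisite stage, and abnormality alerts take an exceptional pathway that forces an immediate update.

```python
# Conceptual sketch of a packet-content manager; all names, the dependency
# graph, and the numbers below are illustrative assumptions, not the
# chapter's actual implementation.

from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Stage:
    """One node of the ECG processing chain."""
    name: str
    prerequisites: List[str] = field(default_factory=list)
    update_interval: float = float("inf")   # seconds between recomputations


class PacketContentManager:
    def __init__(self, stages: List[Stage]):
        self.stages: Dict[str, Stage] = {s.name: s for s in stages}

    def request(self, name: str, interval: float) -> None:
        """Collect a report request and propagate the required validity
        backwards through the chain: every prerequisite must be recomputed
        at least as often as the parameter that depends on it."""
        stage = self.stages[name]
        if interval < stage.update_interval:
            stage.update_interval = interval
            for prerequisite in stage.prerequisites:
                self.request(prerequisite, interval)

    def alert(self, name: str) -> None:
        """Exceptional pathway: an abnormality alert forces an immediate
        update of the affected parameter and of everything it depends on."""
        self.request(name, 0.0)


# Example: a rarely reported HRV parameter and a more frequent heart-rate
# report both push their update requirements back onto the shared QRS stage.
manager = PacketContentManager([
    Stage("qrs_detection"),
    Stage("heart_rate", prerequisites=["qrs_detection"]),
    Stage("hrv", prerequisites=["heart_rate"]),
])
manager.request("hrv", 300.0)         # HRV report every 5 minutes
manager.request("heart_rate", 10.0)   # heart-rate report every 10 seconds
```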
Chapter Preview

Uniform Reporting Based on Source Data Availability

Introduction

Early wearable cardio-monitor solutions used modern micro-electronic technology, but their functionality followed that of bedside interpretive electrocardiographs (HP, 1994; Nihon Kohden, 2001; Gouaux et al., 2002; Maglaveras et al., 2002; Bousseljot et al., 2003; Banitsas, Georgiadis, Tachakra, & Cavouras, 2004; Paoletti & Marchesi, 2004; CardioSoft, 2005; González, Jiménez, & Vargas, 2005). Similarly, surveillance networks were conceptually closer to a group of independent cardiologists than to the hierarchy established over the history of medicine. Moreover, the traditional approach assumes unconditional signal acquisition at a uniform time interval and rigid processing that includes all available computation stages. Most of the processing branches end with the conclusion that nothing relevant has changed since the last diagnostic report, because the variability of the diagnostic parameters is much lower than the variability of the signal itself.

Regular Updates of the Input and Program State

Computer programs can be considered deterministic sequential machines whose outputs (described as the current state) depend on the inputs and the previous state. Most software, including the automatic ECG interpretation program, is built following this scheme. The scheme assumes that each change of input could potentially influence the "machine" status and the output values. Following this reasoning, calculations are triggered for each new sample of the incoming raw ECG signal. Formally, all internal data is updated; in practice, fortunately for the computational complexity, most of the processing is launched conditionally upon the detection of a new heartbeat (QRS complex).

The recording of a new heartbeat in the internal database launches measurements such as electrical axes, wave border detection, arrhythmia detection, heart rate variability assessment, the update of the ST-segment description, and many others. Here again, the appearance of new input launches every computation in order to update the output.
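As an illustration of this conventional scheme, the sketch below (in Python, with placeholder class, method, and parameter names that are not the authors' code, and toy computations standing in for the real interpretation stages) updates the signal path on every sample and unconditionally relaunches all beat-based measurements on every detected heartbeat.

```python
# Sketch of the conventional triggering scheme; names are placeholders rather
# than the authors' code, and the "measurements" are toy computations standing
# in for the real interpretation stages.

from collections import deque


class ConventionalInterpreter:
    def __init__(self, qrs_threshold: float = 0.6):
        self.buffer = deque(maxlen=2000)      # local raw/filtered signal buffer
        self.qrs_threshold = qrs_threshold    # toy stand-in for a real QRS detector
        self.report = {}                      # heartbeat-based diagnostic parameters

    def on_new_sample(self, sample: float) -> None:
        """Triggered for every acquired sample (100-1000 times per second)."""
        self.buffer.append(sample)            # buffer and filter update per sample
        if sample > self.qrs_threshold:       # positive heartbeat detection
            self.on_new_heartbeat()           # heavier processing is conditional

    def on_new_heartbeat(self) -> None:
        """Triggered for every detected beat (roughly 0.8-3 times per second);
        every beat-based parameter is recomputed whether or not it has changed."""
        data = list(self.buffer)
        self.report["mean_level"] = sum(data) / len(data)
        self.report["beats_in_window"] = sum(1 for s in data if s > self.qrs_threshold)
        self.report["last_st_level"] = data[-1]
```

The point is the triggering pattern rather than the toy measurements: each positive detection relaunches the whole beat-based branch, which is exactly the redundancy the chapter proposes to remove.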

The analysis of data flow in a typical ECG interpretation chain identifies three main sources of events that are prospective triggers for computation:

  1. The acquisition of new signal samples, implying the update of local signal buffers, the filtered signal, heartbeat detector and pacemaker detector inputs, and other signal-dependent calculations; the sampling rate is high (100-1000 sps) and constant.

  2. The appearance of new positive QRS detector outcomes, causing the subsequent computation of all heartbeat-based parameters: beat measures, beat-sequence detections, and sequences of beat-to-beat measures; the beat rate is low (0.8-3 bps), variable, and subject to physiological limitations such as tissue-dependent stimulus conduction velocity and cell refraction time.

  3. The presence of new pacemaker pulses or patient button inputs, resulting in the launch of alternative or auxiliary procedures aimed at correlating spikes with stimulated contractions; the pulse rate is low and subject to technical limitations, although a burst of spikes can be expected in the case of pacemaker failure.

The characteristics of the events specified above are used in the estimation of the expected processing workload and reporting frequency.
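As a rough illustration of such an estimate, the following back-of-the-envelope calculation (in Python) combines the three event rates with assumed, purely illustrative per-event computation costs to obtain an expected workload in operations per second.

```python
# Back-of-the-envelope workload estimate driven by the three event rates above;
# the per-event computation costs are invented for illustration only.

SAMPLE_RATE_SPS = 500       # raw acquisition: 100-1000 sps, constant
BEAT_RATE_BPS = 1.2         # heartbeats: 0.8-3 bps, variable
PACER_RATE_PPS = 0.1        # pacemaker pulses / button presses: low, bursty on failure

COST_PER_SAMPLE = 50        # assumed operations per sample-triggered update
COST_PER_BEAT = 20_000      # assumed operations per beat-triggered update
COST_PER_PULSE = 5_000      # assumed operations per auxiliary procedure

expected_workload = (SAMPLE_RATE_SPS * COST_PER_SAMPLE
                     + BEAT_RATE_BPS * COST_PER_BEAT
                     + PACER_RATE_PPS * COST_PER_PULSE)   # operations per second

print(f"expected workload: {expected_workload:,.0f} ops/s")
```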
