MED. INFORM. (1989), VOL. 14, NO. 1, 43-51

Automated electrocardiogram analysis: the state of the art

FRANCE BESSETTE and LUONG NGUYEN, Department of Physiology and Biophysics, Faculty of Medicine, University of Sherbrooke, Sherbrooke, Quebec J1H 5N4, Canada

(Received June 1988)

Abstract. The overwhelming number of electrocardiograms (ECGs) now recorded routinely has prompted the development of computer analysis, which in turn has benefited electrocardiography with important technological advances. All automated ECG analysis systems adopt a similar approach: a measurement program and a program that interprets the clinical significance of these measurements, along with a rhythm analysis algorithm. Measurement, selection and classification of parameters vary according to the program used. Data compression is applied to the signal to reduce processing time and allow long-term storage. Diagnostic accuracy, however, is not greatly improved over that of experienced cardiologists. Programs studied using a validated data bank provided by an international group of cardiologists show a variability not only in parameter measurement but also in diagnostic statement and in the way in which such statements are expressed. Recommendations for measurement standards have been made to fulfil the need for exchange of diagnostic criteria. No recommendations concerning the selection of parameters have been proposed, and so new parameters or combinations of parameters can be introduced with the ultimate aim of increased diagnostic performance.

Keywords: Electrocardiogram; Computer analysis; Data compression; Spline functions.

1. Introduction

In 1903 William Einthoven obtained the first recording of electrical signals generated by the heart and detectable at the body's surface. Now, some 80 years later, about 200 million electrocardiograms (ECGs) are recorded annually throughout the world [1]. This overwhelming recognition of the ECG as a primary tool in the screening of heart disease is mainly due to its non-invasive nature and its simplicity as a method.

Computers have played a major role in electrocardiography, allowing transmission of compressed digitized signals from distant centres to a database centre for analysis, making epidemiological studies and serial comparison feasible, and allowing instant analysis near the patient (for instance, patient monitoring in a coronary care unit). However, automated ECG analysis, whether off line or on line (real-time processing), has not resulted in a major improvement in diagnostic accuracy. The measurement of ECG parameters, their classification and their interpretation vary according to the program used [2, 3].

2. Methods of signal acquisition

2.1. Lead systems

The normal electrocardiographic signal consists mainly of a P wave (atrial depolarization), a QRS complex (ventricular depolarization), an ST segment and a T wave (ventricular repolarization).

0307-7640/89 $3.00 © 1989 Taylor & Francis Ltd

Inform Health Soc Care. Downloaded from informahealthcare.com by McMaster University on 11/03/14. For personal use only.


44 F. Bessette and L. Nguyen

Since the polarity of the signal varies according to the orientation of the electrodes with respect to the current flux in the heart during each cardiac cycle, several systems of electrodes or leads to record the ECG have been and are being used. The most common of these systems are the three standard bipolars (I, II, III), the three augmented peripheral unipolars (aVR, aVL, aVF), the six precordial unipolars (V1 to V6) and the three orthogonal derivations (X, Y, Z), also called Frank's leads. Most ECG signals are obtained in four series of three leads recorded simultaneously. Some systems acquire the 12 leads simultaneously: only eight leads are actually recorded while the other four are reconstructed from leads I and II. In order to obtain more information without XYZ orthogonal leads, a combination of three quasi-orthogonal leads is chosen from the 12 conventional leads (e.g. V6, II, V2) [4].
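The reconstruction of the four remaining limb leads from leads I and II follows from Einthoven's law and the definition of the augmented (Goldberger) leads. A minimal sketch (the function name and NumPy array representation are our own, not from any of the systems cited):

```python
import numpy as np

def reconstruct_limb_leads(lead_i: np.ndarray, lead_ii: np.ndarray) -> dict:
    """Derive leads III, aVR, aVL and aVF from simultaneously recorded I and II."""
    return {
        "III": lead_ii - lead_i,           # Einthoven's law: II = I + III
        "aVR": -(lead_i + lead_ii) / 2.0,  # augmented unipolar, right arm
        "aVL": lead_i - lead_ii / 2.0,     # augmented unipolar, left arm
        "aVF": lead_ii - lead_i / 2.0,     # augmented unipolar, left foot
    }
```

Because the four derived leads are linear combinations of I and II, recording them adds no independent information, which is why only eight of the 12 leads need be digitized.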

2.2. Analogue-to-digital conversion

Continuous or analogue variations in the voltage in ECG signals are measured at regular intervals (constant-frequency sampling) and converted to a series of digits corresponding to the amplitude of the voltage. The number of levels available on this numerical scale is determined by the number of basic information units (bits) of the converter. The resolution of the analogue-to-digital (A-D) conversion is dependent on the width of the analogue input window, which should be wide enough to accept the largest signal amplitude, on the sampling frequency and on the conversion capacity (number of bits). The American Heart Association (AHA) [5], the Task Forces of the American College of Cardiology [6] and the Common Standards in Electrocardiography (CSE) Working Party [7] all recommend a sampling frequency or conversion rate of 500 Hz. Some programs apply 250 Hz, while others use 300, 400 or 500 Hz [8]. The best possible resolution with a given converter depends on its bit or word size. An 8-bit digitizer divides the range of amplitudes of the analogue signal into 2^8 or 256 levels, while a 10-bit digitizer produces 2^10 or 1024 levels. The CSE Working Party recommends a resolution of 5 μV which, for an adequate analogue input window, corresponds to a converter with a 10-bit capacity [7, 9, 10].
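The arithmetic linking input window, bit capacity and resolution can be made explicit. A small illustration (the window sizes are examples of ours, not values from the cited recommendations):

```python
def adc_resolution_uV(input_window_mV: float, n_bits: int) -> float:
    """Voltage represented by one quantization level (the least significant bit)."""
    levels = 2 ** n_bits                      # 8 bits -> 256, 10 bits -> 1024
    return input_window_mV * 1000.0 / levels  # convert mV window to uV per level

# For an illustrative 10 mV input window: ~39 uV/level at 8 bits versus
# ~9.8 uV/level at 10 bits; a 5 mV window at 10 bits gives ~4.9 uV/level,
# within the 5 uV recommendation.
```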

3. Results and discussion

3.1. Automation in electrocardiography

The first program for automated wave recognition by computer was developed by Stallman and Pipberger in 1961 and was based on recordings from the XYZ Frank leads [11]. The following year, Caceres and collaborators produced a program for ECG analysis of signals acquired by conventional 12-lead systems [12]. Nowadays, more than 30 programs for the automated analysis of the routine ECG (conventional leads) or vectorcardiogram (Frank leads) have been developed throughout the world [13]. Numerous researchers have studied the performance of systems using 12 leads versus three orthogonal leads but have been unable to show any significant advantage of one over the other. In practice, however, 95% of all ECG processing by computer is achieved with 12-lead systems while three-lead (XYZ) systems account for less than 4%. This is probably because most physicians are familiar with the 12-lead set and have a vast diagnostic experience with it [9, 10].

The quantity of information generated by the heart and available at the body's surface is very large and some of it is redundant. Therefore, for practical reasons, this torrent of information must be limited. Well-defined samples must be obtained, for example, by recording the signal over a limited period of time, by varying the sampling frequency, by compressing the data to eliminate redundancies or by choosing an optimal number of derivations for analysis. There is no unique solution to ECG analysis: it is still performed in a heuristic way [14]. However, automated ECG analysis systems all adopt a similar approach: a measurement program and a program that interprets the clinical significance of these measurements; rhythm analysis is included and sometimes serial comparison of ECGs is also available. The measurement program generally consists of the following steps: signal acquisition and conditioning, wave detection and typification, and wave boundary recognition and parameter extraction. The interpretation program is a program for classifying these parameters which is based either on a methodology of the decision-tree type or on multivariate statistical analysis methods [6, 9, 10, 13].

3.2. The various approaches to automated electrocardiogram analysis

3.2.1. Signal conditioning. Random and periodic components of noise, whether they are of biological or technical origin, affect the accuracy of measurements performed on the ECG. Several filtration techniques are applied to the signal in order to optimize the signal-to-noise ratio: initially, before digitization, to reduce errors that could result from low-amplitude and high-frequency components in the original ECG signal; then, after digitization, to eliminate a.c. power source interference (50 or 60 Hz), baseline wander due to electrode polarization and artefacts from the electrical activity of skeletal muscles. A program's low sensitivity to noise cannot always be attributed to elaborate preprocessing; it can also be the result of more robust boundary detection [1, 9].

Moreover, all measurement programs try to use the redundancy of available complexes in the sampled ECG in order to improve the accuracy of their measurements. Three techniques are currently in use. The first localizes the 'best complex' for analysis, i.e. the complex with the least noise. The second localizes all the complexes in all the leads and averages them; random noise is reduced in this case. The third extracts the measurements of each complex and subsequently averages these measurements. In the last two techniques, morphologically different complexes are not retained in either the average or the measurements. It has not yet been demonstrated whether wave recognition is better performed on isolated or on averaged complexes [6, 9]. Programs analysing an averaged beat show less variability, but at the expense of time-consuming waveform comparison [13].
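The second technique, averaging time-aligned complexes, can be sketched as follows. This is our own minimal illustration: alignment on precomputed fiducial indices (e.g. R-peak positions) is assumed, and morphologically different beats are presumed to have been excluded beforehand.

```python
import numpy as np

def average_beat(signal: np.ndarray, fiducials: list, half_width: int) -> np.ndarray:
    """Average complexes aligned on their fiducial points.

    Zero-mean random noise shrinks roughly as 1/sqrt(n) over n averaged beats;
    the underlying waveform, being repeated, is preserved.
    """
    windows = [signal[f - half_width : f + half_width]
               for f in fiducials
               if f - half_width >= 0 and f + half_width <= len(signal)]
    return np.mean(windows, axis=0)
```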

3.2.2. Waveform recognition. The most crucial decisions in wave detection and analysis lie in the correct identification of the QRS complex and the determination of a precise search region for the other ECG components. There are two large classes of techniques for the detection of QRS complexes: single-lead and multi-lead algorithms. An amplitude threshold, or a threshold on the change of amplitude with time, is applied. With multi-lead algorithms, the spatial velocity is determined from the XYZ Frank leads or from three quasi-orthogonal leads by averaging the amplitude changes as a function of time. Since the applied threshold can lead to the false detection of high-amplitude peaked T waves [7, 10], many programs introduce a dead time after the detection of a QRS complex, and this sometimes results in a failure to detect the following complex. Threshold techniques applied to P-wave detection do not perform as well as for the QRS complex: normal P waves are slow and of low amplitude and are consequently difficult to separate from baseline wander or noise;
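A minimal single-lead detector in the spirit of the threshold techniques described, including the dead time mentioned above (the threshold and refractory values are illustrative, not taken from any particular program):

```python
import numpy as np

def detect_qrs(signal: np.ndarray, fs: float,
               slope_threshold: float, dead_time_s: float = 0.2) -> list:
    """Return sample indices where the absolute slope first exceeds the
    threshold, then skip a refractory period so that tall peaked T waves
    (or the same QRS) are not detected a second time."""
    slope = np.abs(np.diff(signal)) * fs  # amplitude change per second
    dead = int(dead_time_s * fs)
    detections, i = [], 0
    while i < len(slope):
        if slope[i] > slope_threshold:
            detections.append(i)
            i += dead                     # dead time after each detection
        else:
            i += 1
    return detections
```

The dead time suppresses double detections, but, exactly as the text notes, a genuinely early next complex falling inside the refractory window would be missed.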


abnormal waves often coincide with QRS complexes or T waves. In practice, failure of automated systems in the diagnosis of rhythm is most frequently associated with the incapacity to detect P waves [9, 10, 14, 15]. A morphological classification of beats is carried out following detection. This classification, which is called typification, yields a representative or averaged beat or complex along with a more reliable interpretation of rhythm. Several parameters are determined depending on the program (e.g. QRS duration, R-R interval, maximum amplitude). This is followed by evaluation of the intervals between pairs of QRS complexes, which are attributed to P or T waves depending on normalized factors applied to amplitude, slope and duration [9, 10, 16].

Wave boundary recognition is still a problem, even for the QRS complex [17]. The aim of these wave recognition programs is to determine as precisely as possible the beginning (onset) and the end (offset) of P and QRS as well as the end of T; the onset of T is more difficult to identify because of variation in the slope of the ST segment in some abnormalities. Most programs use methods, with a greater or lesser degree of sophistication, related to amplitude differences with time. One method that has become very popular is that of the pattern or template of the distribution of measurements, not only in a localized area but over the whole region of wave onset and offset. A minimum surface area is determined within which the analysed waves must be found. However, this method is not specifically developed for the various waveshapes occurring in the ECG [9, 10]. A refinement of this approach is the method of wave-contour limit discrimination. Its use is determined or limited by the algorithm since it requires the sampling of a greater number of points. The results are none the less more precise and reliable [3].

Increasingly, in order to minimize processing time and memory space, an optimal number of leads is chosen for wave recognition depending on the application foreseen [4]. In ambulatory ECG analysis, QRS wave recognition mainly seeks an appropriate combination of slopes in a single lead. Despite the fact that QRS delineation has been used in several systems for 20 years, no method has gained general acceptance. Discrepancies between programs may be due to the philosophy of the program or to systematic differences between cardiologists concerning onset and offset, particularly of the QRS waveform [17]. The CSE Working Party has recently made recommendations on standards for measurements in clinical electrocardiography, which were obtained after a repetitive study by an international group of cardiologists and statistically validated on a bank of 250 digitized ECGs representing a wide range of electrocardiographic morphologies [7]. Very few reports giving the results (range of values) of these measurement algorithms on well-documented ECGs have been published, even though these limits are fundamental to any diagnostic procedure [14].

3.2.3. Parameter extraction and diagnostic classification. Adequate parameter selection is basic for any pattern or signal classification. As soon as the parts of the signal to be classified have been determined, the question of which parameters to use arises. This question is also of utmost importance in ECG interpretation. Some researchers have assumed that the parameters with which the original ECG can be reconstructed in digital-to-analogue conversion, for instance with Fourier or Karhunen-Loève transforms, are a sufficient basis for selection. However, this is rarely the case: these parameters are ideal for waveform reconstruction but do not necessarily have semantic information content, and they require a rather long processing time [14]. When the parameter measurement approach as such is considered, more than 300 measurements are frequently performed on each ECG. However, the diagnostic relevance of all these measurements has not been demonstrated. Ultimately, only a limited subset is used in classification programs [9]. The only way to obtain an optimal subset of parameters would be to make an exhaustive search of possible combinations using error probability as a criterion: selecting a subset of 66 parameters from 300 initial parameters would mean evaluating 2.48 × 10^67 subsets [18]!
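The combinatorial explosion quoted here is simply a binomial coefficient and can be checked directly:

```python
import math

# Number of distinct 66-parameter subsets of 300 candidate parameters.
n_subsets = math.comb(300, 66)
print(f"C(300, 66) = {n_subsets:.3e}")  # on the order of 10**67
```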

Although a suitable choice of parameters essentially determines the results of classification into different diagnostic categories, two approaches to diagnostic classification have been developed and are still used. In the first approach, which uses logical methods, the cardiologist's methods of ECG analysis are simulated. In the second, prior probabilities from ECG-independent criteria, combined with a series of parameters extracted from the ECG, are used in a multivariate statistical analysis. Until 1986, when a similar study was performed on the conventional 12-lead system [20], this approach was limited to the analysis of ECGs derived from the XYZ Frank leads. Logical decision trees are generally used because they are more closely related to human reasoning than are multivariate statistical analysis methods. Clinical information and knowledge related to the ECG are of primary importance; it is therefore not surprising that most practical solutions to ECG interpretation utilize decision-tree logical reasoning. There is actually a tendency to combine the two approaches in order to arrive at an optimum strategy in ECG interpretation, an interpretation which is hindered by the fact that it is not always possible to establish a one-to-one relationship between the ECG and the cardiac disease implied [9, 14].
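As a toy illustration of the decision-tree style of logic (our own example, not taken from any of the programs reviewed), consider one branch built on the well-known Sokolow-Lyon voltage criterion for left ventricular hypertrophy:

```python
def lvh_statement(s_v1_mV: float, r_v5_mV: float, r_v6_mV: float) -> str:
    """One branch of a decision tree: the Sokolow-Lyon voltage criterion
    (S in V1 plus the larger of R in V5/V6 >= 3.5 mV suggests LVH)."""
    voltage_sum = s_v1_mV + max(r_v5_mV, r_v6_mV)
    if voltage_sum >= 3.5:
        return "consider left ventricular hypertrophy"
    return "no voltage criterion for LVH"
```

A full interpretation program chains hundreds of such threshold tests, which is precisely why the choice and measurement of the underlying parameters dominates the final diagnostic statement.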

3.2.4. Data compression in electrocardiogram analysis. Digital transmission, compact recording (e.g. in ambulatory monitoring) and long-term storage (e.g. for serial ECG comparison) render the application of data compression or reduction techniques to the ECG a prerequisite. The efficiency of a compression algorithm cannot be predicted theoretically; it is evaluated experimentally from the compression ratio C (the ratio of the number of bits in the original signal to the number of bits in the compressed signal) and the quality of its visual and mathematical reproduction (expressed as various types of error) [21]. A high compression ratio is desirable; however, it often results in distortions of the signal to be transmitted, stored or analysed. The analogue signal reproduced from these data must be faithful, particularly in electrocardiography, in order to allow an independent evaluation by cardiologists [22, 23].
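These two evaluation quantities can be written down directly. Here the mathematical error is expressed as the percentage r.m.s. difference (PRD), one commonly used measure; the text itself only says "various types of error", so the choice of PRD is ours:

```python
import numpy as np

def compression_ratio(original_bits: int, compressed_bits: int) -> float:
    """C = bits in the original signal / bits in the compressed signal."""
    return original_bits / compressed_bits

def percent_rms_difference(original: np.ndarray, reconstructed: np.ndarray) -> float:
    """PRD: r.m.s. reconstruction error as a percentage of signal energy."""
    return 100.0 * np.sqrt(np.sum((original - reconstructed) ** 2)
                           / np.sum(original ** 2))
```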

There are two main categories of compression algorithms for ECG processing: transformation techniques and direct compression techniques. The first category transforms the signal into a frequency distribution. The information is then compressed by eliminating the redundancies with an analysis of the main spectral components. Time considerations limit the utility of this approach in ECG analysis [21]. The direct techniques currently used most frequently rely on the splitting of the signal according to slope changes of two consecutive samples (e.g. AZTEC [24], CORTES [25]) or on the linear interpolation of one or more past samples and one or more future samples [e.g. 26]. Linear spline functions, proceeding by iterative interpolation of all past samples for each new sample, have the advantage of reducing noise components while reproducing a signal of good quality [21]. Cubic spline functions, which have recently been applied to ECG compression by our group, have demonstrated a highly superior mathematical and visual reproduction of the signal together with significant compression [27, 28], even though all optimization possibilities have not yet been exploited [29].
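A minimal first-order direct method in the spirit of the interpolation techniques cited (our own sketch; the actual AZTEC/CORTES implementations differ in detail): a sample is retained only when the straight line from the last retained sample can no longer represent the intervening samples within a tolerance.

```python
import numpy as np

def compress_linear(signal: np.ndarray, tol: float) -> list:
    """Return indices of retained samples; every dropped sample lies within
    `tol` of the straight line joining its retained neighbours."""
    kept = [0]
    anchor = 0
    for i in range(2, len(signal)):
        # Trial line from the current anchor to candidate endpoint i
        xs = np.arange(anchor, i + 1)
        line = np.interp(xs, [anchor, i], [signal[anchor], signal[i]])
        if np.max(np.abs(signal[anchor:i + 1] - line)) > tol:
            kept.append(i - 1)  # previous sample becomes the new anchor
            anchor = i - 1
    kept.append(len(signal) - 1)
    return kept
```

On piecewise-linear stretches (isoelectric segments, QRS limbs) almost all samples are dropped, which is where direct methods earn their compression ratio.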

Interpretation programs usually proceed separately from the data compression step, which compresses the data without identifying the significant wave parameters that are useful for subsequent analysis [30]. The same algorithm, with modifications, is sometimes used for signal compression, waveform recognition and parameter extraction, but there is no direct correlation between the various operations [31]. It is well known from the literature that the identification of P waves requires a long processing time unless data compression methods and optimized algorithms are used. Analogue noise filtration or QRS subtraction often results in the loss of the P waves [9, 14]. One method of improving this situation is syntactic analysis, where compression allows the extraction of a reduced number of points which can serve as markers for wave recognition as well as reproducing the features of the original signal with minimum distortion in a reasonable processing time. This approach has been used in rhythm analysis [32] and, more recently, to determine the parameters of the QRS complex and of P and T waves by means of a 'significant point extraction' algorithm. The sampling frequency of the signal was rather low (200 Hz), however, and the reproduction was somewhat wiggly. P-wave analysis was perturbed by non-eliminated noise [30, 33].

It is anticipated that a substantial improvement of wave recognition in real time will be achieved by using cubic spline functions to obtain morphological information syntactically along with successful signal compression. The selection of the points, known as nodal points, by which the iterative interpolation is executed is governed by a set of rules pertaining to slopes and slope changes. They are grouped together around a marked change in slope profile (e.g. isoelectric segment and wave onset) or close to a change in sign of the slope (e.g. amplitude at the apex of a wave). Therefore these patterns of points can easily be associated with each parameter considered (slope, duration, amplitude, ST segment), within acceptable limits of variability, or with a combination of parameters. Consequently, improvement in wave morphology definition can be achieved. Adequate reproduction of a wave, and thus determination of wave morphology, is very sensitive to the pattern of these nodal points: an abscissa displacement of a single nodal point by 1% produces a visible change in the reproduced wave [28]. Localization of a P wave superimposed on a QRS complex then becomes more feasible. The fact that noise is substantially reduced when splines are used generally improves the detection of P waves. A wave-contour approach using cubic spline functions to classify P waves according to the pattern of nodal points required to reproduce them is under way in our laboratory.
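The nodal-point selection rules can be illustrated with a simplified version (our own reduction of the rule set to two conditions; the full method of [27, 28] then interpolates these points with cubic splines):

```python
import numpy as np

def nodal_points(signal: np.ndarray, slope_change_tol: float) -> list:
    """Select candidate nodal points where the discrete slope changes sign
    (wave apex) or changes magnitude sharply (onset/offset of a wave)."""
    slope = np.diff(signal)
    nodes = [0]
    for i in range(1, len(slope)):
        sign_change = slope[i - 1] * slope[i] < 0                        # apex
        sharp_change = abs(slope[i] - slope[i - 1]) > slope_change_tol   # onset/offset
        if sign_change or sharp_change:
            nodes.append(i)
    nodes.append(len(signal) - 1)
    return nodes
```

Because the nodes cluster exactly at the morphologically meaningful places, the same point set serves both compression and syntactic wave recognition, which is the link this section argues for.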

3.3. Evaluation of the performance of electrocardiogram analysis programs

If the same series of ECGs is measured and interpreted using different programs, a substantial number of disagreements results. The use of criteria coding systems (e.g. the Minnesota Code [34]) is limited by the lack of separation between diagnostic classification criteria and purely descriptive criteria for the classification of morphological parameters [6]. Application of these criteria by experienced cardiologists yields a correct interpretation in 50-80% of cases [19]. Since all programs are based on points leading to wave delineation, and these points are determined by various referees using various series of ECGs, systematic differences in automated results frequently reflect intra-observer variability [8]. Programs studied using the CSE data bank have shown a variability not only in parameter measurement but also in diagnostic statement and the way in which such statements are expressed [8, 9, 35].

The diagnostic accuracy of a program can be evaluated using the CSE Working Party data bank or using a validated data bank of ECGs based on ECG-independent evidence, as recommended by Task Force III. The aim of such evaluations is to determine the sensitivity (se) and specificity (sp) of computer classification. The performance is then expressed as (se + sp)/2. A statement such as 'consider diagnosis x' may be very sensitive but poorly specific, whereas a statement 'definite diagnosis x' would be more specific but less sensitive.
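The evaluation scheme reduces to simple counting over a validated test set; a sketch (the variable names are ours):

```python
def performance(tp: int, fn: int, tn: int, fp: int):
    """Sensitivity, specificity and the combined score (se + sp) / 2
    used to summarize diagnostic performance on a validated data bank."""
    se = tp / (tp + fn)  # fraction of diseased ECGs correctly flagged
    sp = tn / (tn + fp)  # fraction of normal ECGs correctly passed
    return se, sp, (se + sp) / 2.0
```

Loosening a diagnostic statement raises tp (higher se) at the cost of more fp (lower sp), which is exactly the trade-off the ROC graphs below make visible.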

Receiver-operating-characteristic (ROC) graphs illustrate this trade-off between sensitivity and specificity [36]. Varying prior probabilities or increasing the number of diagnostic criteria is always a compromise [9, 20, 37]. Taking a relatively high number of electrocardiographic measurements often leads to overdiagnosis and to the classification of an excessive number of normal recordings as abnormal [19]. The selection and classification of parameters is still arbitrary in the formulation of a cardiac pathology according to its manifestations in the ECG. Differentiation between normal and abnormal populations has not yet been satisfactorily resolved despite the numerous studies aimed at improving the ill-defined overlap [9, 10] which have been performed since the introduction of automated ECG analysis [38-40]. The use of prior probabilities has improved diagnostic accuracy but has not produced any changes in the discriminatory power between normal and abnormal ECG-derived parameters per se [9].

Since the optimal choice of the number of parameters is the determining element in classification results, whatever the approach used, it has been and still is a concern in ECG computer processing [4, 14, 20, 41]. The number of parameters affects the processing speed, which is critical, particularly in real-time processing. This is why the number of leads studied is optimized for each step, even in multi-lead analysis programs [4]. Robust single-lead algorithms have recently been developed for the detection of QRS complexes [42, 43] and P waves [44]. New criteria, such as the R/S area ratio, have been introduced in order to improve diagnostic performance. No recommendation has been made concerning the choice of parameters leading to diagnostic classification.

- Diagnostic disagreements based solely on the use of different parameters do not necessarily indicate a program deficiency but rather reflect a choice of different criteria or differences in measurement [10].

- Implying that criteria standardization is equivalent to high-quality interpretation could be incorrect and could inhibit the development of good criteria. ROC curves are needed instead; they provide information on diagnostic performance without restricting new developments [22, 45].

4. Conclusion

Computer ECG analysis has made a substantial contribution to reducing the burden of analysis of the enormous number of ECGs recorded routinely. Applications now in use include on-line monitoring of coronary care units, ambulatory surveying, serial ECG comparison, epidemiological studies and digitized transmission. However, diagnostic accuracy is little improved over that of experienced cardiologists. For several years, some groups have considered the logic used in their programs and the evaluation of program performance as proprietary, and consequently these programs have not been generally available. The need for exchange of diagnostic criteria has prompted the establishment of recommendations for standards in electrocardiographic measurements and the evaluation of diagnostic performance on a validated ECG database. The choice of parameters has not been regulated, thus leaving the way open for further development in automated ECG analysis.

References

1. BANTA, R. H., Jr, DORWARD, P. H., and SCAMPINI, S. A. (1985) New cardiograph family with ECG analysis capability. Hewlett-Packard Journal, 36, 23-28.

2. JENKINS, J. M. (1983) Symposium on computer applications to cardiology. Introduction. Progress in Cardiovascular Diseases, 25, 361-366.

3. VAN DEN AKKER, T. J., ROS, H. H., KOELEMAN, A. S. M., and DEKKER, C. (1982) An on-line method for reliable detection of waveforms and subsequent estimation of events in physiological signals. Computers and Biomedical Research, 15, 405-417.

4. KORS, J. A., TALMON, J. L., and VAN BEMMEL, J. H. (1986) Multilead ECG analysis. Computers and Biomedical Research, 19, 28-46.

5. PIPBERGER, H. V., ARZBAECHER, R. C., BERSON, A. S., BRILLER, S. A., BRODY, D. A., FLOWERS, N. C., GESELOWITZ, D. B., LEPESCHKIN, E., OLIVER, G. C., SCHMITT, O. H., and SPACH, M. (1975) Recommendations for standardization of leads and of specifications for instruments in electrocardiography and vectorcardiography. AHA Committee Report. Circulation, 52, 11-31.

6. RAUTAHARJU, P. M., ARIET, M., PRYOR, T. A., ARZBAECHER, R. C., BAILEY, J. J., BONNER, R., GOETOWSKI, C. R., HOOPER, J. K., KLEIN, V., MILLAR, C. K., MILLIKEN, J. A., MORTARA, D. W., PIPBERGER, H. V., PORDY, L., SANDBERG, R. L., SIMMONS, R. L., and WOLF, H. F. (1978) Task Force III: Computers in diagnostic electrocardiography. American Journal of Cardiology, 41, 158-170.

7. CSE WORKING PARTY (1985) Recommendations for measurement standards in quantitative electrocardiography. European Heart Journal, 6, 815-825.

8. WILLEMS, J. L., ARNAUD, P., VAN BEMMEL, J. H., BOURDILLON, P. J., BROHET, C., DALLA VOLTA, S., ANDERSEN, J. D., DEGANI, R., DENIS, B., DEMEESTER, M., DUDECK, J., HARMS, F. M. A., MACFARLANE, P. W., MAZZOCCA, G., MEYER, J., MICHAELIS, J., PARDAENS, J., PÖPPL, S. J., REARDON, B. C., RITSEMA VAN ECK, H. J., ROBLES DE MEDINA, E. O., RUBEL, P., TALMON, J. L., and ZYWIETZ, C. (1985) Assessment of the performance of electrocardiographic computer programs with the use of a reference data base. Circulation, 71, 523-534.

9. WILLEMS, J. L. (1986) A review of computer ECG analysis: time to evaluate and standardize. CRC Critical Reviews in Medical Informatics, 1, 165-207.

10. JENKINS, J. M. (1983) Automated electrocardiography and arrhythmia monitoring. Progress in Cardiovascular Diseases, 25, 367-408.

11. STALLMANN, F. W., and PIPBERGER, H. V. (1961) Automatic recognition of electrocardiographic waves by digital computer. Circulation Research, 9, 1138-1143.

12. CACERES, C. A., STEINBERG, C. A., ABRAHAM, S., CARBERY, W. J., MCBRIDE, J. M., TOLLES, W. E., and RIKLI, A. E. (1962) Computer extraction of electrocardiographic parameters. Circulation, 25, 356-362.

13. WILLEMS, J. L., ZYWIETZ, C., ARNAUD, P., VAN BEMMEL, J. H., DEGANI, R., and MACFARLANE, P. W. (1987) Influence of noise on wave boundary recognition by ECG measurement programs. Computers and Biomedical Research, 20, 543-562.

14. VAN BEMMEL, J. H. (1982) Recognition of electrocardiographic patterns. In Handbook of Statistics, Vol. 2. P. R. Krishnaiah and L. N. Kanal (eds) (Amsterdam: North-Holland), pp. 501-526.

15. TALMON, J. L., and HASMAN, A. (1981) A new approach to QRS detection and typification. Computers in Cardiology, IEEE Computer Society Proceedings, 479-482.

16. NGUYEN, L. (1988) Paramètres menant à la reconnaissance automatisée des ondes caractéristiques sur le tracé électrocardiographique. MSc Thesis, University of Sherbrooke.

17. SÖRNMO, L. (1987) A model-based approach to QRS delineation. Computers and Biomedical Research, 20, 526-542.

18. JAIN, U., RAUTAHARJU, P. M., and WARREN, J. (1981) Selection of optimal features for classification of electrocardiograms. Journal of Electrocardiography, 14, 239-248.

19. PIPBERGER, H. V., MCCAUGHAN, D., LITTMANN, D., PIPBERGER, H. A., CORNFIELD, J., DUNN, R. A., BATCHLOR, C. D., and BERSON, A. S. (1975) Clinical application of a second generation of electrocardiographic computer program. American Journal of Cardiology, 35, 597-608.

20. WILLEMS, J. L., LESAFFRE, E., PARDAENS, J., and DE SCHREYE, D. (1986) Multigroup logistic classification of the 12- and 3-lead ECG. In Computer ECG Analysis: Towards Standardization. J. L. Willems, J. H. van Bemmel and C. Zywietz (eds) (Amsterdam: North-Holland), pp. 203-210.

21. BENELLI, G., CAPPELLINI, V., and LOTTI, F. (1980) Data compression techniques and applications. Radio and Electronic Engineer, 50, 29-53.

22. SHEFFIELD, L. T., PRINEAS, R., COHEN, H. C., SHOENBERG, A., and FROELICHER, V. (1978) Task Force II: Quality of electrocardiographic records. American Journal of Cardiology, 41, 146-157.

23. CHENG, Q.-L., LEE, H. S., and THAKOR, N. V. (1987) ECG waveform analysis by significant point extraction. II. Pattern matching. Computers and Biomedical Research, 20, 428-442.

24. COX, J. R., NOLLE, F. M., FOZZARD, H. A., and OLIVER, G. C. (1968) AZTEC: A preprocessing program for real-time ECG analysis. IEEE Transactions on Biomedical Engineering, BME-15, 128-129.

25. ABENSTEIN, J. P., and TOMPKINS, W. J. (1982) A new data reduction algorithm for real-time ECG analysis. IEEE Transactions on Biomedical Engineering, BME-29, 43-48.

26. RUTTIMANN, U. E., and PIPBERGER, H. V. (1979) Compression of the ECG by prediction or interpolation and entropy coding. IEEE Transactions on Biomedical Engineering, BME-26, 613-623.

27. BESSETTE, F., and SEYFERT, W. D. (1985) Precise reproduction of electrocardiograms by spline interpolation. Medical Informatics, 10, 215-221.

28. BESSETTE, F., and SEYFERT, W. D. (1985) A method to express complex curves concisely and with high fidelity. Canadian Journal of Physiology and Pharmacology, 63, 1033-1037.

29. LACHNER, G., EICHNER, J.-M., BESSETTE, F., and SEYFERT, W. D. (1986) An algorithm for ECG data compression using spline functions. Computers in Cardiology, IEEE Computer Society Proceedings, 575-578.

30. LEE, H. S., CHENG, Q.-L., and THAKOR, N. V. (1987) ECG waveform analysis by significant point extraction. I. Data reduction. Computers and Biomedical Research, 20, 410-427.

31. KYRKOS, A., GIAKOUMAKIS, E. A., and CARAYANNIS, G. (1987) Time recursive prediction techniques on QRS detection problem. IEEE 9th Annual Conference of the Engineering in Medicine and Biology Society (New York: IEEE), 1885-1886.

32. UDUPA, J. K., and MURTHY, I. S. N. (1980) Syntactic approach to ECG rhythm analysis. IEEE Transactions on Biomedical Engineering, BME-27, 370-375.

33. WEINRIB, J., LEE, H., and THAKOR, N. (1984) A significant point extraction algorithm for ECG data reduction and pattern recognition. Computers in Cardiology, IEEE Computer Society Proceedings, 97-100.

34. BLACKBURN, H., KEYS, A., SIMONSON, E., RAUTAHARJU, P., and PUNSAR, S. (1960) The electrocardiogram in population studies. Circulation, 21, 1160-1175.

35. WILLEMS, J. L., ARNAUD, P., VAN BEMMEL, J. H., BOURDILLON, P. J., DEGANI, R., DENIS, B., HARMS, F. M. A., MACFARLANE, P. W., MAZZOCCA, G., MEYER, J., RITSEMA VAN ECK, H. J., ROBLES DE MEDINA, E. O., and ZYWIETZ, C. (1985) Establishment of a reference library for evaluating computer ECG measurement programs. Computers and Biomedical Research, 18, 439-457.

36. BAILEY, J. J., and HORTON, M. R. (1986) Can ECG criteria be standardized? In Computer ECG Analysis: Towards Standardization. J. L. Willems, J. H. van Bemmel and C. Zywietz (eds) (Amsterdam: North-Holland), pp. 163-170.

37. RAUTAHARJU, P. M., and SMETS, PH. (1979) Evaluation of computer-ECG programs. The strange case of the golden standard. Computers and Biomedical Research, 12, 39-46.

38. DRAPER, H. W., PEFFER, C. J., STALLMANN, F. W., LITTMANN, D., and PIPBERGER, H. V. (1964) The corrected orthogonal electrocardiogram and vectorcardiogram in 510 normal men (Frank lead system). Circulation, 30, 853-864.

39. MICHAELS, L., and KLOVAN, J. E. (1968) Wave amplitude relationships in the normal electrocardiogram. British Heart Journal, 30, 412-418.

40. RIKLI, A. E., TOLLES, W. E., STEINBERG, C. A., CARBERY, W. J., FREIMAN, A. H., ABRAHAM, S., and CACERES, C. A. (1961) Computer analysis of electrocardiographic measurements. Circulation, 24, 643-649.

41. CADY, L. D., Jr, WOODBURY, M. A., TICK, L. J., and GERTLER, M. M. (1961) A method for electrocardiogram wave-pattern estimation. Circulation Research, 9, 1078-1082.

42. LIGTENBERG, A., and KUNT, M. (1983) A robust-digital QRS-detection algorithm for arrhythmia monitoring. Computers and Biomedical Research, 16, 273-286.

43. BRUECKNER, R. P., and GULLER, B. (1987) QRS areas improve the electrocardiographic interpretation of right ventricular hypertrophy. Computers and Biomedical Research, 20, 99-103.

44. DUFAULT, R., and WILCOX, A. (1985) An algorithm for automatic P-wave detection in single or multiple lead surface ECGs. Computers in Cardiology, IEEE Computer Society Proceedings, 117-120.

45. PIPBERGER, H. V. (1986) Diagnostic ECG classification. Discussion. In Computer ECG Analysis: Towards Standardization. J. L. Willems, J. H. van Bemmel and C. Zywietz (eds) (Amsterdam: North-Holland), p. 228.
