International Journal of Bioelectromagnetism Vol. 5, No. 1, pp. 14-17, 2003.
www.ijbem.org
Evolution of Electrocardiographic Research

PM Rautaharju
EPICARE Center, Department of Public Health Sciences, Wake Forest University School of Medicine
Correspondence: PM Rautaharju, 737 Vista Meadows Rd., Weston, FL 33327, USA

Abstract. Electrocardiographic research has evolved in "boom and bust" cycles, like the epochs of the gold rush during the last century. The treatises and writings of ECG theoreticians and basic cardiac experimental electrophysiologists constitute the Holy Scriptures of Electrocardiography. While not really gospel, they should provide ample sources for consultation when deciding whether a new line of applied ECG investigation has a reasonably rational basis. Unfortunately, some heretic concepts prevail in modern ECG reporting. One astounding heresy is the concept of equating, without any theoretical foundation, QT dispersion with dispersion of myocardial repolarization, a most fashionable topic in ECG research since the early 1990s. Bazett's QTc is still the most commonly used formula for QT rate adjustment, including by regulatory agencies, in spite of the strong residual dependence of QTc on heart rate. Comparisons of ECG-LVH and MI prevalence and incidence in contrasting populations have dominated ECG research in epidemiology, although the low classification accuracy of LVH and old-MI criteria often renders such comparisons highly misleading. Concepts such as population attributable risk are often misinterpreted. ECG analysis with computers is often performed as a blind statistical "hunting expedition", and risk analysis is often performed without properly considering confounding factors.
Keywords: Electrocardiography; QT Prolongation; QT Dispersion; Error Analysis; Attributable Risk

1. Prologue

It is a great honor for me to be invited to deliver the Pierre Rijlant lecture. I met Professor Rijlant for the first time in 1958 at the Third World Congress on Cardiology in Brussels. Rijlant was a prominent celebrity and the organizer of that Congress. I was still at the medical school in Helsinki, presenting my first humble paper at an international forum. Vectorcardiography was at the peak of its glory. I remember vividly the presentation of Pierre Rijlant, the beautiful vector loops with "figure of eight" and other configurations dancing on the screen, like the compulsory figures that used to be part of world ice-skating championship competitions. Rijlant's aristocratic, most unconventional personality was fascinating. The next time I met Pierre Rijlant was at Otto Schmitt's biophysics laboratory at the University of Minnesota, where I ended up in 1959 for my graduate work. Pierre Rijlant and Otto Schmitt were friends, and they fully understood each other's language in the hyperspace of dipolar and multipolar Electrocardiography. Subsequently I met Pierre Rijlant periodically in connection with the International Colloquia of Vectorcardiography. Pierre Rijlant was a leading member of the Colloquium, a close international circle of vectorcardiographers, and he cultivated, nurtured and kept the Symposium alive during the difficult post-war period. Our International Congress of Electrocardiology would not be at this advanced phase of evolution without the early contributions of Pierre Rijlant.

2. Cyclic Nature of Electrocardiographic Research

Electrocardiographic research has evolved in "boom and bust" cycles, like the epochs of the gold rush during the last century. There was the initial euphoric, exciting period when almost everything was a new discovery. ECG research experienced a "second wind" with Wilson's chest leads, the central terminal and the ventricular gradient concept, which almost earned a place among the Holy Scriptures. There have been high peaks, like the plateau-like period of vectorcardiography during the post-World War Two golden era of analogue computing. Vectorcardiography had its boom period, with a big effort in research on so-called orthogonal lead systems. The days of the standard 12 leads seemed doomed. With the appearance of digital computers in ECG research in the 1960s, many new subject areas became possible and fashionable in the realm of Electrocardiography. Exercise stress testing became a popular part of clinical Electrocardiography. Ambulatory research got its boost from Norman Holter's innovation, with emphasis on silent ischemic episodes. Ventricular ectopic complex (VEC) counting reached a high level of esteem, as did more recently heart rate variability (HRV) analysis. Wilson's central terminal and the standard 12-lead ECG have survived the test of time, while interest in vectorcardiography and orthogonal leads declined after the 1970s. There has recently been increasing interest in semiorthogonal leads, such as the EASI leads, as a means of transformation into conventional-like leads. The ventricular gradient concept has periodically shown signs of revival in one form or another. The prognostic value of asymptomatic ischemic episodes is still unclear, and expectations of prevention of adverse cardiac events by suppression of ventricular ectopics have not materialized.
3. Holy Scriptures of Electrocardiography

The treatises and the writings of ECG theoreticians and basic cardiac experimental electrophysiologists constitute the Holy Scriptures of Electrocardiography. There are several of these "holy men" amongst us in this Congress, including leading theoreticians, notably Professor Robert Plonsey, my esteemed colleague from North Carolina. Nobody doubts the validity of potential theory in Electrocardiography. However, there are still gaps in our knowledge about the complex cardiac source and extracardiac structures, about ionic channels, and about the mechanisms of the effects of cardioactive drugs on ionic channels, on excitation and particularly on the repolarization process. Adaptation of theoretical models to condition-specific situations requires adaptation of generalizations and continuing modeling and revalidation. While not gospel, the scriptures provide ample sources for consultation when a decision is needed on whether or not a new concept or a new line of applied ECG investigation has a reasonably rational basis. These valuable sources are numerous, and the majority of electrocardiographic research rests on a solid foundation, at as high a scientific level as any other field of applied science.

4. Heretic Concepts in ECG Research

Unfortunately, heretic concepts come and go, and some of them seem to prevail in modern Electrocardiography. I will elaborate on only a few of them, as exemplary "caveats" in electrocardiographic reporting.

4.1. QT Dispersion

An astounding heresy is the QT dispersion concept, a most fashionable topic in ECG research since its introduction in the early 1990s. The QT dispersion concept associates the dispersion of QT measured in individual body surface leads with the dispersion of the end of myocardial repolarization. Basic biophysical considerations tell us that the prerequisite for our ability to detect localized variations in the end of myocardial repolarization from body surface ECGs is the presence of an adequate level of nondipolar energy components in the terminal part of the T wave, large enough to significantly influence the threshold logic used to define the endpoint of the T wave (Rautaharju 2002). Low, microvolt-level nondipolar energy exists in the T wave, but unless and until somebody demonstrates its association with true dispersion of the time point of the end of ventricular repolarization, the QT dispersion concept has no valid foundation. An enormous and expensive investigative effort has been expended, wasting the valuable learning period of research trainees, because basic concepts were not adequately consulted before these working hypotheses were formulated. Continuing confusion seems to prevail due to a lack of understanding that the apparent T wave dispersion is an amplitude-domain rather than an interval-domain ECG abnormality, not related to any physiologically meaningful repolarization interval at the localized myocardial level. A recent report suggesting that T wave nondipolar components, as possible markers of dispersion of repolarization, are associated with a substantial excess risk of adverse cardiac events (Zabel et al. 2002) will require critical evaluation in prospective studies.

4.2. Rate Adjustment of QT

Evaluation of QT prolongation by cardioactive drugs has taken a high priority in clinical trials. A great deal of effort has been invested in comparing various power functions and other nonlinear QT rate adjustment formulas, using the Akaike information criterion and similar measures for ranking them, without realizing that numerous functions give practically equally good results.
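For orientation, the standard forms of the two most common power-function adjustments, together with a generic linear alternative of the kind advocated below, can be sketched as follows (RR denotes the interbeat interval in seconds, and k is an empirically estimated regression slope; the linear form shown is one illustrative possibility rather than a unique prescription):

\[
QTc_{Bazett} = \frac{QT}{RR^{1/2}}, \qquad
QTc_{Fridericia} = \frac{QT}{RR^{1/3}}, \qquad
QT_{adj} = QT + k\,(1 - RR).
\]

The power functions scale QT proportionally (multiplicatively), whereas the linear form adds a rate-dependent offset; this difference underlies the points made below.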
The fundamental problem with Bazett's and Fridericia's formulas is that they ignore the intercept of the QT versus RR regression (Rautaharju and Zhang 2002). With other power functions the basic problem is that they use proportional rather than linear scaling of QT in the adjustment process, whereby the upper and lower normal limits for adjusted QT become rate-variant. Bazett's formula is still commonly used for QT rate adjustment although it leaves a strong residual dependence of QTc on heart rate (r = 0.32). Differences in heart rate response between treatment groups can cause false or missed QT prolongation and bias critical decisions by regulatory agencies such as the FDA. Simple linear rate adjustment appears to be the safest choice for use in a standardized resting state within the range of physiological sinus rates.

4.3. QT Interval Prolongation in Women

QT publications, including proposed new standards for regulatory agencies, claim that the QT interval is prolonged in women (assuming an estrogen effect) and set the limit for QT prolongation in women 20 ms higher than that in men (470 ms versus 450 ms). The fact is that QT is not prolonged in females and that QT changes relatively little through adult life: it shortens in males during adolescence (probably a testosterone effect), prolongs linearly with age, and no notable gender difference remains after the age of 50 years (Rautaharju et al. 1979).

4.4. Comparative Prevalence Evaluation in Contrasting Populations

For the past several decades, the mainline effort in epidemiological Electrocardiography has been focused on reporting comparative prevalence estimates of old myocardial infarction (MI) and left ventricular hypertrophy (LVH) in contrasting populations. Elementary error analysis models that consider the limited classification accuracy of ECG criteria for LVH and MI reveal that these prevalence estimates can be grossly misleading and based on false evidence. For instance, with a criterion specificity of about 95% and a true prevalence of about 5%, false positives alone contribute nearly five percentage points to the apparent prevalence and can outnumber the true positives detected. This problem has been discussed in some detail in the proceedings of the LVH session of this Congress.

4.5. Attributable Risk

The concept of attributable risk (AR) has been around for several decades but has only recently received attention in risk analysis. AR is the excess relative risk that, under certain assumptions, can conceivably be associated with the risk indicator, with the risk among those without the risk indicator as the reference group. The meaning of attributable risk differs considerably depending on how it is defined (Miettinen 1972, 1974). The following two simplified expressions will be considered in this context.
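One standard way to write the first expression, with RR denoting the relative risk of the event among those with versus those without the risk indicator, is

\[
AR_1 = \frac{f(P_1)\,(RR - 1)}{1 + f(P_1)\,(RR - 1)}
\]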
where AR1 is the excess risk in the source population: the excess risk of an adverse event among those with the risk indicator over the overall risk of the event in the source population. The common modifier is f(P1), the prevalence of the risk indicator in the whole source population from which the fatal cases arise. (Strictly, f(P1) should be modified by the patient-years of observation with the risk indicator up to the event, relative to the total patient-years of observation.) AR1 is at times called population attributable risk or community attributable risk (Cole et al. 1971), and the concept has more recently been used in the sense of a causal relationship between mortality risk and ECG abnormalities such as LVH. It is assumed that the mortality risk can be averted if the ECG abnormality is prevented, or that the risk can be reduced if intervention succeeds in reducing the incidence of the abnormality. Next, consider a second expression for AR, simplified by use of the notation for conditional probabilities.
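In the etiologic-fraction form of Miettinen, and with the same notation as above, this second expression can be written as

\[
AR_2 = \frac{f(P_2 \mid D)\,(RR - 1)}{RR}
\]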
Here f(P2 | D) is the proportion of cases with the risk indicator. Thus, AR2 is the excess relative risk associated with the risk indicator among those who die or become diseased and who have the risk indicator. AR2 is called the attributable proportion or etiologic fraction. AR2, rather than AR1, indicates, under certain assumptions, the fraction of mortality or morbidity that could be prevented if the risk indicator prevalence in the subgroup of future cases could be reduced to the level of the indicator prevalence in the non-cases. AR2 is of interest in evaluating disease etiology. Otherwise AR2 is of little interest from the point of view of prevention: we do not know in advance who is going to die, unless all of those with the risk indicator are going to become fatalities or diseased (i.e. f(D | P) = 1). AR1 is best applied when AR2 suggests that it is reasonable to expect a causal relationship between the risk indicator and the outcome, and when the confounding factors are properly under control.

4.6. Confounding Factors

The critical point in ECG risk analysis is proper consideration of confounding factors. Cases and non-cases in terms of the outcome may differ in characteristics other than the risk indicator, and these characteristics associated with the risk indicator may be significant confounding factors (age, gender, clinically overt or subclinical disease, etc.). It is pretentious to attribute excess risk to a risk indicator as an etiologic fraction of the risk unless there is genetic evidence or a strong pathophysiological reason to expect a causal relationship and confounding is well under control, as emphasized by Miettinen (1974). In ECG risk analysis, interest is often focused on a single ECG risk factor candidate, such as T wave negativity, the QRS/T spatial angle or T wave complexity, without asking whether it is a primary, independent risk factor or secondary to some other confounding characteristic that provides a more direct mechanism for increased risk, such as increased left ventricular mass or an old MI with secondary ventricular remodeling, conduction problems and QRS prolongation.

4.7. Singular Value Decomposition and T Wave Complexity

Singular value decomposition has made periodic appearances in ECG research. In singular value decomposition, monotonically decreasing values of the diagonal are obtained, from the largest to the smallest energy or variance of the principal components of the 12-lead ECG. T wave complexity, the ratio of the second to the first component or eigenvalue, has been reported to be an independent risk factor and has been recognized to be related to the old vectorcardiographic measure of T loop width. A closer examination reveals that T wave complexity is not necessarily a simple measure of T loop width. The assertion is valid for T loops elongated in the direction of the mean T vector. T wave complexity reaches its peak value (unity) when the T loop becomes round. Beyond this point in the evolution of the T loop abnormality, the direction of maximal T wave variance and of the first principal component is approximately perpendicular to the mean T vector, and the original first and second principal components change ranks. The complexity of the T wave then decreases and assumes a small value when the T loop becomes flat, that is, when the direction of the terminal T vectors becomes grossly abnormal. The expression T wave complexity is thus more complex than generally realized. This reveals one of the weaknesses of principal component analysis: the omission of spatial directional information.
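As a minimal computational sketch of the measure discussed above (assuming, for illustration only, a NumPy implementation, a lead-by-sample array layout and the function name t_wave_complexity, none of which are taken from a particular published implementation), the ratio can be obtained directly from the singular values of the T wave sample matrix:

    # Illustrative sketch: T wave complexity as the ratio of the second to the first
    # eigenvalue (squared singular value) of the mean-centered T wave sample matrix.
    import numpy as np

    def t_wave_complexity(t_wave):
        # t_wave: array of shape (n_leads, n_samples), e.g. the eight independent
        # leads of the 12-lead ECG sampled over the T wave segment.
        centered = t_wave - t_wave.mean(axis=1, keepdims=True)
        s = np.linalg.svd(centered, compute_uv=False)  # singular values, descending
        eigenvalues = s ** 2                           # variances of the principal components
        return eigenvalues[1] / eigenvalues[0]

Whether the leads are mean-centered, and whether the eigenvalues or the singular values themselves are ratioed, varies between reports; the qualitative behavior described above is the same in either case.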
5. Perspective

One may ask what the reason is for these weak points in ECG reporting. There is enormous pressure on young investigators and research trainees to produce manuscripts for publication. Clinical cardiologists are perhaps no longer adequately versed in basic Electrocardiography to consult the scriptures and to provide proper guidance to research trainees, that is, to deduce whether there is any rational basis for the hypothesis of an investigation being considered. They go along with what seems fashionable in the scientific sessions of the professional societies. Of course, these soft points in electrocardiographic reporting are minor problems compared to the phenomenal, steady progress in numerous areas of ECG research. Risk analysis has become as important an aspect of ECG research as diagnostic classification. Experimental ECG research on inflation-induced ischemia in connection with percutaneous transluminal coronary artery catheterization has been another recent step toward better understanding of the mechanisms of the supply and demand aspects of acute ischemic injury. And just think of the progress in invasive endocardial and epicardial mapping of excitatory atrial pathways in atrial fibrillation, guiding therapeutic intervention procedures, and similar progress in locating pathways of ventricular pre-excitation.

References

Miettinen OS. Components of the crude risk ratio. American Journal of Epidemiology 1972;96(2):168-172.
Miettinen OS. Proportion of disease caused or prevented by a given exposure, trait or intervention. American Journal of Epidemiology 1974;99(5):325-332.
Rautaharju PM, Davignon A, Soumis F, Boissell E, Choquette A. Evolution of QRS-T relationship from birth to adolescence in Frank-lead orthogonal electrocardiograms of 1,492 normal children. Circulation 1979;60(1):196-204.
Rautaharju PM. What killed QT dispersion? Cardiac Electrophysiology Reviews 2002;6:295-301.
Rautaharju PM, Zhang ZM. Linearly scaled, rate-invariant normal limits for QT interval: eight decades of incorrect application of power functions. Journal of Cardiovascular Electrophysiology 2002;13(12):1211-1218.
Zabel M, Malik M, Hnatkova K, Papademetriou V, Pittaras A, Fletcher RD, Franz MR. Analysis of T-wave morphology from the 12-lead electrocardiogram for prediction of long-term prognosis in male US veterans. Circulation 2002;105(9):1066-1070.