DOI: 10.1055/s-0038-1634766
The Delphi Method to Validate Diagnostic Knowledge in Computerized ECG Interpretation
The authors would like to thank Dr. G. van Herpen (University Hospital, Leiden), who made valuable suggestions concerning the questionnaires, and the cardiologists who participated in the Delphi procedure: Dr. N. van Hemel (St. Antonius Hospital, Nieuwegein), Prof. P. G. Hugenholtz (Thorax Center, Rotterdam), Dr. J. F. May (University Hospital, Groningen), Prof. E. O. Robles de Medina (University Hospital, Utrecht), and Dr. F. C. Visser (Free University Hospital, Amsterdam). We are grateful to Prof. J. L. Willems, coordinator of the CSE project, for his permission to use the CSE library for our studies.
Publication History
Publication Date: 06 February 2018 (online)
Abstract
We investigated the applicability of the Delphi method for increasing the agreement among multiple cardiologists on, firstly, their classifications of a set of electrocardiograms and, secondly, their reasons for these classifications. Five cardiologists were requested to judge the computer classifications of a set of thirty ECGs. If a cardiologist disagreed with the computer classification, he had to provide a new classification and a reason for this change. The results of this first round were compiled and anonymously fed back to the cardiologists. In a second round the cardiologists were asked once again to judge the ECGs and to rate the reasons provided in the first round. The level of agreement was estimated by means of the kappa statistic. The Delphi procedure substantially increased the agreement on the classifications among the cardiologists. The final agreement was very high and comparable with the intra-observer agreement. There was also a high level of agreement on the reasons provided by the cardiologists. However, their use in improving the program’s performance is hampered by the qualitative nature of many of the reasons. Suggestions are given for a more formalized elicitation of knowledge.
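As a rough illustration of the agreement measure mentioned above, the following is a minimal sketch of how pairwise agreement between two observers can be quantified with Cohen's kappa. It is not the study's actual computation (the paper used multi-observer extensions and its own ECG categories); the function, data, and category labels below are hypothetical examples only.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning nominal categories to the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)

    # Observed agreement: fraction of items on which the two raters agree.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Chance agreement, derived from each rater's marginal category frequencies.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_expected = sum(freq_a[c] * freq_b[c] for c in categories) / (n * n)

    # Kappa: agreement beyond chance, normalized by the maximum achievable beyond chance.
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical classifications of ten ECGs by two cardiologists (labels are illustrative).
cardiologist_1 = ["MI", "normal", "LVH", "MI", "normal", "normal", "LVH", "MI", "normal", "MI"]
cardiologist_2 = ["MI", "normal", "LVH", "normal", "normal", "normal", "LVH", "MI", "MI", "MI"]
print(f"kappa = {cohens_kappa(cardiologist_1, cardiologist_2):.2f}")  # kappa = 0.69
```

A kappa of 0 indicates agreement no better than chance and 1 indicates perfect agreement; the interpretation thresholds cited in the agreement literature (e.g., Landis and Koch) are conventions rather than fixed rules.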
* Present address: Department of Industrial Design Engineering, Delft University of Technology, Delft, The Netherlands.