DOI: 10.3414/ME0627
Intuitive and Axiomatic Arguments for Quantifying Diagnostic Test Performance in Units of Information
Publication History
Received: 27 November 2008
Accepted: 29 May 2009
Published online: 17 January 2018
Summary
Objectives: Mutual information is a fundamental concept of information theory that quantifies the expected value of the amount of information that diagnostic testing provides about a patient’s disease state. The purpose of this report is to provide both intuitive and axiomatic descriptions of mutual information and, thereby, promote the use of this statistic as a measure of diagnostic test performance.
Methods: We derive the mathematical expression for mutual information from the intuitive assumption that diagnostic information is the average amount that diagnostic testing reduces our surprise upon ultimately learning a patient’s diagnosis. This concept is formalized by defining “surprise” as the surprisal, a function that quantifies the unlikelihood of an event. Mutual information is also shown to be the only function that conforms to a set of axioms which are reasonable requirements of a measure of diagnostic information. These axioms are related to the axioms of information theory used to derive the expression for entropy.
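The definition sketched above can be made concrete in a few lines of code. The sketch below (not from the paper; the prevalence, sensitivity, and specificity values are hypothetical) computes mutual information as the expected reduction in surprisal about the disease state once the test result is known:

```python
import math

def surprisal(p):
    """Surprisal (in bits) of an event with probability p: -log2(p)."""
    return -math.log2(p)

def mutual_information(joint):
    """Mutual information I(D;T) in bits, from a joint distribution
    P(disease state d, test result t) given as a dict {(d, t): prob}."""
    # Marginal distributions of the disease state and the test result
    p_d, p_t = {}, {}
    for (d, t), p in joint.items():
        p_d[d] = p_d.get(d, 0.0) + p
        p_t[t] = p_t.get(t, 0.0) + p
    # Expected reduction in surprise upon learning the diagnosis:
    # I(D;T) = sum over (d,t) of p(d,t) * [surprisal(p(d)) - surprisal(p(d|t))]
    return sum(p * (surprisal(p_d[d]) - surprisal(p / p_t[t]))
               for (d, t), p in joint.items() if p > 0)

# Hypothetical test: prevalence 0.1, sensitivity 0.9, specificity 0.8
joint = {
    ("diseased", "positive"): 0.1 * 0.9,
    ("diseased", "negative"): 0.1 * 0.1,
    ("healthy",  "positive"): 0.9 * 0.2,
    ("healthy",  "negative"): 0.9 * 0.8,
}
print(f"I(D;T) = {mutual_information(joint):.3f} bits")
```

Note that each term compares the pre-test surprisal of a diagnosis, −log₂ p(d), with its post-test surprisal, −log₂ p(d|t); averaging the difference over the joint distribution yields the standard expression for mutual information.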
Results: Both approaches to defining mutual information lead to the known relationship that mutual information is equal to the pre-test uncertainty of the disease state minus the expected value of the post-test uncertainty of the disease state. Mutual information also has the property of being additive when a test provides information about independent health problems.
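The entropy-difference relationship can be verified numerically. The sketch below (illustrative values only, not from the paper) computes the pre-test entropy of the disease state and the expected post-test entropy, whose difference is the mutual information:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical test: prevalence 0.1, sensitivity 0.9, specificity 0.8
p_d = 0.1
p_joint = {("D+", "T+"): p_d * 0.9, ("D+", "T-"): p_d * 0.1,
           ("D-", "T+"): (1 - p_d) * 0.2, ("D-", "T-"): (1 - p_d) * 0.8}

# Marginal probability of each test result
p_t = {}
for (d, t), p in p_joint.items():
    p_t[t] = p_t.get(t, 0.0) + p

# Pre-test uncertainty: entropy of the disease state
h_pre = entropy([p_d, 1 - p_d])

# Expected post-test uncertainty: entropy of the disease state given
# each test result, weighted by the probability of that result
h_post = sum(pt * entropy([p_joint[(d, t)] / pt for d in ("D+", "D-")])
             for t, pt in p_t.items())

mi = h_pre - h_post
print(f"H(D) = {h_pre:.3f} bits, E[H(D|T)] = {h_post:.3f} bits, I = {mi:.3f} bits")
```

Because conditioning on the test result can only reduce the expected entropy of the disease state, the difference is nonnegative, consistent with mutual information being a measure of information gained.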
Conclusion: Mutual information is the best single measure of the ability of a diagnostic test to discriminate among the possible disease states.