Relative entropy as a measure of diagnostic information.

Authors: W A Benish

Affiliation: Department of Internal Medicine, Case Western Reserve University, Cleveland, Ohio, USA. wab4@po.cwru.edu

Abstract: Relative entropy is a concept within information theory that provides a measure of the distance between two probability distributions. The author proposes that the amount of information gained by performing a diagnostic test can be quantified by calculating the relative entropy between the posttest and pretest probability distributions. This statistic, in essence, quantifies the degree to which the results of a diagnostic test are likely to reduce our surprise upon ultimately learning a patient's diagnosis. A previously proposed measure of diagnostic information that is also based on information theory (pretest entropy minus posttest entropy) has been criticized as failing, in some cases, to agree with our intuitive concept of diagnostic information. The proposed formula passes the tests used to challenge this previous measure.

Keywords:
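Note: The abstract names the measure but does not state the formula. For reference, the relative entropy (Kullback-Leibler divergence) of a posttest distribution Q from a pretest distribution P is D(Q||P) = sum_i q_i * log2(q_i / p_i), expressed in bits when the logarithm is taken to base 2. The following is a minimal Python sketch of the computation for a single test result; the three candidate diagnoses, their pretest probabilities, and the test-result likelihoods are hypothetical values chosen for illustration, with the posttest distribution obtained via Bayes' theorem.

    import math

    def relative_entropy(posttest, pretest):
        """D(posttest || pretest) = sum_i q_i * log2(q_i / p_i), in bits."""
        return sum(q * math.log2(q / p)
                   for q, p in zip(posttest, pretest) if q > 0)

    # Hypothetical pretest probabilities over three candidate diagnoses.
    pretest = [0.6, 0.3, 0.1]

    # Hypothetical likelihoods of observing a positive test result under
    # each of the three candidate diagnoses.
    likelihoods = [0.2, 0.8, 0.9]

    # Posttest probabilities by Bayes' theorem.
    joint = [p * l for p, l in zip(pretest, likelihoods)]
    posttest = [j / sum(joint) for j in joint]

    # Diagnostic information gained from this test result, in bits.
    print(round(relative_entropy(posttest, pretest), 3))  # -> 0.331

Because relative entropy is nonnegative for any pair of distributions, an individual test result can never register as negative information under this measure; by contrast, posttest entropy can exceed pretest entropy, so the earlier entropy-difference measure can go negative in some cases.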