Methods Inf Med 2001; 40(04): 293-297
DOI: 10.1055/s-0038-1634424
Original Article
Schattauer GmbH

Evaluation of a Method that Supports Pathology Report Coding

A. Hasman, L. M. de Bruijn, J. W. Arends
1   Department of Medical Informatics, University of Maastricht, The Netherlands; Department of Pathology, Academic Hospital Maastricht, Maastricht, The Netherlands

Publication History

Received 12 July 2000

Accepted 22 March 2001

Publication Date:
08 February 2018 (online)

Summary

Objectives: The paper focuses on the problem of adequately coding pathology reports with SNOMED. Both the agreement among pathologists in coding and the quality of a system that supports them in coding pathology reports were evaluated.

Methods: Six sets of three pathologists each received a different set of 40 pathology reports. Each pathology report was accompanied by five different SNOMED code lines, and the three pathologists evaluated the correctness of each of these code lines. Kappa values and reliability coefficients were determined to gain insight into the variance observed when coding pathology reports. The evaluated system compares a newly entered report, represented as a multi-dimensional word vector, with reports in a library that are represented in the same way and have already been coded. The system presents the code lines belonging to the five library reports most similar to the newly entered one, thereby supporting the pathologist in determining the correct codes. A high similarity between two reports is indicated by a large value of the inner product of the vector of the newly entered report and the vector of a library report.
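The retrieval step can be pictured with a short sketch. The following Python fragment illustrates only the general idea, assuming a simple bag-of-words representation with raw term counts; the abstract does not specify the tokenisation or term weighting actually used, and the function names below are hypothetical.

```python
from collections import Counter

# Sketch of the similarity-based suggestion step described above.
# Raw term counts are assumed here purely for illustration; the
# paper's actual vector representation may differ.

def to_vector(report_text):
    """Represent a report as a bag-of-words vector (term -> count)."""
    return Counter(report_text.lower().split())

def inner_product(vec_a, vec_b):
    """Similarity of two reports: inner product of their word vectors."""
    return sum(count * vec_b.get(term, 0) for term, count in vec_a.items())

def suggest_code_lines(new_report, library, n_neighbours=5):
    """Return the SNOMED code lines of the n library reports most similar
    to the newly entered report.

    `library` is a list of (report_text, code_lines) pairs in which the
    reports have already been coded.
    """
    query = to_vector(new_report)
    ranked = sorted(
        library,
        key=lambda item: inner_product(query, to_vector(item[0])),
        reverse=True,
    )
    return [code_lines for _, code_lines in ranked[:n_neighbours]]
```

In this sketch the pathologist would be shown the code lines returned by suggest_code_lines and would accept, adapt, or reject them, which corresponds to the supporting role of the system as described above.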

Results: Agreement between pathologists in coding was fair (average kappa of 0.44). The reliability coefficient varied from 0.81 to 0.89 over the six sets of pathology reports. The system gave correct suggestions for 50% of the reports; for another 30% its suggestions were helpful to the pathologists.

Conclusions: Given the level of the reliability coefficients, it can be concluded that three pathologists are indeed sufficient for obtaining a gold standard for evaluating the system. The method used for comparing reports is not strong enough to allow fully automatic coding. The system was shown to induce more uniform coding by pathologists. An evaluation of the system's incorrect suggestions indicates that its performance can still be improved.