DOI: 10.1055/s-0040-1704292
A DCNN-BASED SYSTEM FOR CLASSIFICATION OF GASTRITIS LESIONS
Publication History
Publication Date: 23 April 2020 (online)
Aims Endoscopic morphological assessment plays a crucial role in the diagnosis of gastritis. However, classification of gastritis images is subjective, and different endoscopists may judge the same image differently. To reduce the influence of this subjectivity on diagnosis, we used deep learning to classify endoscopic gastritis images more objectively and accurately.
Methods We collected a total of 3621 endoscopic images from 921 patients with atrophic gastritis, erosive gastritis, hemorrhagic gastritis, or normal gastric mucosa. Based on the Sydney System and clinical experience, the images were classified into three lesion types: atrophy, erosion, and hemorrhage. Models were built by learning important features from typical images with a deep convolutional neural network (DCNN). The training set and the validation set were randomly generated from the study data set. The models were trained and validated against the consensus of three experienced endoscopists.
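The abstract does not specify the network architecture, framework, or data split ratio, so the following is only a minimal illustrative sketch of the kind of pipeline described: one binary lesion classifier (e.g., "atrophy" vs. non-atrophy) fine-tuned from a generic pretrained DCNN backbone, with a random train/validation split. The directory layout, ResNet-50 backbone, split ratio, and hyperparameters are assumptions, not details from the study.

```python
# Illustrative sketch only: a binary lesion classifier, assuming PyTorch/torchvision
# and an ImageFolder-style directory of labelled endoscopic images.
# Hypothetical layout: endoscopy_images/{atrophy,non_atrophy}/*.jpg
import torch
from torch import nn
from torch.utils.data import DataLoader, random_split
from torchvision import datasets, models, transforms

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

dataset = datasets.ImageFolder("endoscopy_images", transform=preprocess)

# Random train/validation split, as described in the Methods (80/20 is an assumption)
n_train = int(0.8 * len(dataset))
train_set, val_set = random_split(dataset, [n_train, len(dataset) - n_train])
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)
val_loader = DataLoader(val_set, batch_size=32)

# A generic pretrained DCNN backbone, fine-tuned for two classes
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

for epoch in range(10):
    model.train()
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

In the study, three such models (atrophy, erosion, hemorrhage) would be trained and validated against the endoscopists' consensus labels.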
Results For the training set, the accuracies of the atrophy, erosion, and hemorrhage models were 86%, 78%, and 92%, with sensitivities of 74%, 68%, and 57% and specificities of 90%, 80%, and 97%, respectively. For the test set, the accuracies were 81%, 74%, and 91%, with sensitivities of 84%, 74%, and 80% and specificities of 80%, 74%, and 94%, on a par with the performance of the consensus of the three endoscopists. The intraobserver agreement of the DCNN was evaluated by Cohen's kappa coefficient.
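As a brief illustration of how the reported metrics and the agreement statistic can be derived from model predictions and the consensus labels, the sketch below computes accuracy, sensitivity, specificity, and Cohen's kappa with scikit-learn. The label arrays are placeholders, not study data.

```python
# Illustrative sketch: deriving accuracy, sensitivity, specificity, and Cohen's kappa
# from binary model predictions versus reference (consensus) labels.
import numpy as np
from sklearn.metrics import confusion_matrix, cohen_kappa_score

consensus = np.array([1, 0, 1, 1, 0, 0, 1, 0])   # placeholder reference labels
predicted = np.array([1, 0, 1, 0, 0, 0, 1, 1])   # placeholder model outputs

tn, fp, fn, tp = confusion_matrix(consensus, predicted).ravel()
accuracy = (tp + tn) / (tp + tn + fp + fn)
sensitivity = tp / (tp + fn)   # true-positive rate
specificity = tn / (tn + fp)   # true-negative rate
kappa = cohen_kappa_score(consensus, predicted)

print(f"accuracy={accuracy:.2f} sensitivity={sensitivity:.2f} "
      f"specificity={specificity:.2f} kappa={kappa:.2f}")
```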
Conclusions Our study shows that our models achieve decent specificity and good accuracy in the classification of gastritis lesions. Deep learning has great potential in the field of endoscopic gastritis classification and could assist endoscopists in making an accurate diagnosis after the endoscopic procedure in the future.