Multi-class Classification of Thorax Diseases Using Neural Networks
10 Pages. Posted: 10 Jun 2021. Last revised: 11 Jun 2021
Date Written: May 25, 2021
Abstract
The chest radiograph, or chest X-ray (CXR), is one of the most frequently performed radiology examinations, yet it remains difficult to interpret. The growing availability of imaging equipment has also increased the demand for highly trained staff, and even skilled doctors can overlook subtle findings when making a diagnosis. In this paper, a deep learning method is used to predict thorax disease categories from CXR images and their metadata, using a Convolutional Neural Network (CNN) based on the MobileNet architecture. The large NIH (National Institutes of Health, Maryland) chest X-ray dataset is used to train a multi-class image classifier over 15 different labels. The model is intended as a sanity check, providing a second opinion that helps radiologists and doctors gain confidence in their diagnoses. This paper classifies the different categories of thorax disease with 94.68% accuracy, which is appreciable compared with current radiologist practice and indicates that this model, trained with a deep neural network, has real-world application in the prediction of thorax diseases. The scope and accuracy of the model would increase further as more data is collected from other lab tests, clinical notes, or additional scans.
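To make the described approach concrete, the sketch below shows one plausible way to set up a MobileNet-based 15-class CXR classifier in TensorFlow/Keras. This is not the authors' released code; the input size, frozen backbone, classification head, and optimizer settings are all assumptions for illustration.

```python
# Minimal sketch (assumed, not the paper's exact pipeline) of a MobileNet-based
# classifier for the 15 NIH chest X-ray labels.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 15  # 14 thorax disease labels plus "No Finding" in the NIH dataset

# MobileNet backbone pretrained on ImageNet, with the top classifier removed.
base = tf.keras.applications.MobileNet(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet"
)
base.trainable = False  # freeze the backbone for initial transfer learning

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),        # pool feature maps to one vector
    layers.Dropout(0.3),                    # assumed regularization choice
    layers.Dense(NUM_CLASSES, activation="softmax"),  # one score per category
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(1e-4),
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)
model.summary()
```

Freezing the pretrained backbone and training only the small classification head is a common transfer-learning starting point for medical imaging, where labeled data is scarcer than in natural-image benchmarks; the backbone can be unfrozen later for fine-tuning.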
Note: Funding Statement: This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.
Declaration of Interests: The authors declare that there are no competing interests.
Ethics Approval Statement: The authors declare that this study was approved by an ethics and research committee and that it complied with all applicable measures.
Keywords: Neural networks, Multiclass classification, Chest X-ray, Medical image processing