
Malignant thoracic lymph node classification with deep convolutional neural networks on real-time endobronchial ultrasound (EBUS) images

Authors
Seung Hyun Yong; Sang Hoon Lee; Sang-Il Oh; Ji-Soo Keum; Kyung Nam Kim; Moo Suk Park; Yoon Soo Chang; Eun Young Kim
Citation
TRANSLATIONAL LUNG CANCER RESEARCH, Vol. 11(1): 14-23, 2022-01
Journal Title
TRANSLATIONAL LUNG CANCER RESEARCH
ISSN
 2218-6751 
Issue Date
2022-01
Keywords
Convolutional neural networks (CNNs) ; deep learning ; endobronchial ultrasound (EBUS) ; endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA) ; lung cancer
Abstract
Background: Thoracic lymph node (LN) evaluation is essential for the accurate diagnosis of lung cancer and deciding the appropriate course of treatment. Endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA) is considered a standard method for mediastinal nodal staging. This study aims to build a deep convolutional neural network (CNN) for the automatic classification of metastatic malignancies involving thoracic LN, using EBUS-TBNA.

Methods: Patients who underwent EBUS-TBNA to assess the presence of malignancy in mediastinal LNs during a ten-month period at Severance Hospital, Seoul, Republic of Korea, were included in the study. The corresponding LN ultrasound images, pathology reports, demographic data, and clinical histories were collected and analyzed.

Results: A total of 2,394 endobronchial ultrasound (EBUS) images were collected: 1,459 images of benign LNs from 193 patients and 935 images of malignant LNs from 177 patients. We first trained a visual geometry group (VGG)-16 network to classify malignant LNs using only the traditional cross-entropy classification loss; this baseline achieved a sensitivity, specificity, and accuracy of 69.7%, 74.3%, and 72.0%, respectively, with an overall area under the curve (AUC) of 0.782. Retraining with the newly proposed loss function raised the modified VGG-16 to an AUC of 0.8, with sensitivity, specificity, and accuracy improving to 72.7%, 79.0%, and 75.8%, respectively. In addition, the proposed network can process 63 images per second on a single mainstream graphics processing unit (GPU), making it suitable for real-time analysis of EBUS images.

Conclusions: Deep CNNs can effectively classify malignant LNs from EBUS images. Selecting LNs that require biopsy using real-time EBUS image analysis with deep learning is expected to shorten the EBUS-TBNA procedure time, increase lung cancer nodal staging accuracy, and improve patient safety.
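The pipeline described in the abstract (a VGG-16 backbone, binary benign-vs-malignant output, a cross-entropy baseline loss, and a real-time throughput target) can be sketched as below. This is a minimal illustration in PyTorch, assuming ImageNet-pretrained weights and 224x224 RGB inputs; it is not the authors' implementation, and the paper's modified loss function, which the abstract does not specify, is not reproduced here.

```python
import time

import torch
import torch.nn as nn
from torchvision import models


def build_vgg16_classifier(num_classes: int = 2) -> nn.Module:
    """VGG-16 backbone with its final layer replaced for 2-class output."""
    model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
    model.classifier[6] = nn.Linear(model.classifier[6].in_features, num_classes)
    return model


model = build_vgg16_classifier()
criterion = nn.CrossEntropyLoss()  # the baseline loss named in the abstract
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)

# One illustrative training step on a dummy batch of 8 images;
# labels: 0 = benign, 1 = malignant (assumed encoding).
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 2, (8,))
model.train()
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()

# Rough inference-throughput check, mirroring the reported ~63 images/s
# figure (actual numbers depend on the GPU and input pipeline).
model.eval()
with torch.no_grad():
    batch = torch.randn(63, 3, 224, 224)
    start = time.perf_counter()
    _ = model(batch)
    print(f"{batch.shape[0] / (time.perf_counter() - start):.1f} images/s")
```

Replacing only the final fully connected layer while keeping the pretrained feature extractor is a common transfer-learning choice for small medical-imaging datasets; whether the authors fine-tuned the full network is not stated in the abstract.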
Files in This Item:
T202201223.pdf
DOI
10.21037/tlcr-21-870
Appears in Collections:
1. College of Medicine (의과대학) > Dept. of Internal Medicine (내과학교실) > 1. Journal Papers
Yonsei Authors
Kim, Eun Young (김은영), ORCID: https://orcid.org/0000-0002-3281-5744
Park, Moo Suk (박무석), ORCID: https://orcid.org/0000-0003-0820-7615
Yong, Seung Hyun (용승현)
Lee, Sang Hoon (이상훈), ORCID: https://orcid.org/0000-0002-7706-5318
Chang, Yoon Soo (장윤수), ORCID: https://orcid.org/0000-0003-3340-4223
URI
https://ir.ymlib.yonsei.ac.kr/handle/22282913/188454