
Densely Convolutional Spatial Attention Network for nuclei segmentation of histological images for computational pathology

DC Field: Value
dc.contributor.author: 조남훈
dc.date.accessioned: 2024-03-22T05:51:26Z
dc.date.available: 2024-03-22T05:51:26Z
dc.date.issued: 2023-01
dc.identifier.uri: https://ir.ymlib.yonsei.ac.kr/handle/22282913/198253
dc.description.abstract: Introduction: Automatic nuclear segmentation in digital microscopic tissue images can help pathologists extract high-quality features for nuclear morphometrics and other analyses. However, image segmentation remains a challenging task in medical image processing and analysis. This study aimed to develop a deep learning-based method for nuclei segmentation of histological images for computational pathology. Methods: The original U-Net model sometimes struggles to capture significant features. Herein, we present the Densely Convolutional Spatial Attention Network (DCSA-Net), a U-Net-based model, to perform the segmentation task. The developed model was also tested on an external multi-tissue dataset, MoNuSeg. Training deep learning algorithms to segment nuclei well requires a large quantity of annotated data, which is expensive and often infeasible to obtain. We collected hematoxylin and eosin-stained image datasets from two hospitals to train the model on a variety of nuclear appearances. Because of the limited number of annotated pathology images, we introduce a small publicly accessible prostate cancer (PCa) dataset with more than 16,000 labeled nuclei. To construct the proposed model, we developed the DCSA module, an attention mechanism for capturing useful information from raw images. We also applied several other artificial intelligence-based segmentation methods and tools and compared their results with those of the proposed technique. Results: We evaluated nuclei segmentation performance using Accuracy, the Dice coefficient (DC), and the Jaccard coefficient (JC). The proposed technique outperformed the other methods, achieving accuracy, DC, and JC of 96.4% (95% confidence interval [CI]: 96.2 - 96.6), 81.8 (95% CI: 80.8 - 83.0), and 69.3 (95% CI: 68.2 - 70.0), respectively, on the internal test dataset. Conclusion: Our proposed method demonstrates superior performance in segmenting cell nuclei of histological images from internal and external datasets and outperforms many standard segmentation algorithms used for comparative analysis.
dc.description.statementOfResponsibility: open
dc.format: application/pdf
dc.language: English
dc.publisher: Frontiers Research Foundation
dc.relation.isPartOf: FRONTIERS IN ONCOLOGY
dc.rights: CC BY-NC-ND 2.0 KR
dc.title: Densely Convolutional Spatial Attention Network for nuclei segmentation of histological images for computational pathology
dc.type: Article
dc.contributor.college: College of Medicine (의과대학)
dc.contributor.department: Dept. of Pathology (병리학교실)
dc.contributor.googleauthor: Rashadul Islam Sumon
dc.contributor.googleauthor: Subrata Bhattacharjee
dc.contributor.googleauthor: Yeong-Byn Hwang
dc.contributor.googleauthor: Hafizur Rahman
dc.contributor.googleauthor: Hee-Cheol Kim
dc.contributor.googleauthor: Wi-Sun Ryu
dc.contributor.googleauthor: Dong Min Kim
dc.contributor.googleauthor: Nam-Hoon Cho
dc.contributor.googleauthor: Heung-Kook Choi
dc.identifier.doi: 10.3389/fonc.2023.1009681
dc.contributor.localId: A03812
dc.relation.journalcode: J03512
dc.identifier.eissn: 2234-943X
dc.identifier.pmid: 37305563
dc.subject.keyword: attention mechanism
dc.subject.keyword: computational pathology
dc.subject.keyword: deep learning
dc.subject.keyword: histological image
dc.subject.keyword: nuclei segmentation
dc.contributor.alternativeName: Cho, Nam Hoon
dc.contributor.affiliatedAuthor: 조남훈
dc.citation.volume: 13
dc.citation.startPage: 1009681
dc.identifier.bibliographicCitation: FRONTIERS IN ONCOLOGY, Vol.13 : 1009681, 2023-01
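The abstract above reports three evaluation metrics: Accuracy, the Dice coefficient (DC), and the Jaccard coefficient (JC). The following is a minimal sketch of how such metrics are conventionally computed for binary segmentation masks; it is not the authors' code, and the function name and example masks are purely illustrative.

```python
import numpy as np

def segmentation_metrics(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7):
    """Compute pixel accuracy, Dice coefficient, and Jaccard coefficient
    for two binary masks of identical shape (values 0 or 1)."""
    pred = pred.astype(bool)
    target = target.astype(bool)

    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()

    accuracy = (pred == target).mean()                               # fraction of correctly classified pixels
    dice = 2.0 * intersection / (pred.sum() + target.sum() + eps)    # 2|A∩B| / (|A| + |B|)
    jaccard = intersection / (union + eps)                           # |A∩B| / |A∪B|
    return accuracy, dice, jaccard

# Hypothetical 4x4 predicted and ground-truth masks for illustration only
pred = np.array([[0, 1, 1, 0],
                 [0, 1, 1, 0],
                 [0, 0, 1, 0],
                 [0, 0, 0, 0]])
target = np.array([[0, 1, 1, 0],
                   [0, 1, 0, 0],
                   [0, 0, 1, 0],
                   [0, 0, 0, 0]])
acc, dc, jc = segmentation_metrics(pred, target)
print(f"Accuracy: {acc:.3f}, Dice: {dc:.3f}, Jaccard: {jc:.3f}")
```

In the reported results these scores would be averaged over all test images and expressed as percentages with confidence intervals.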
Appears in Collections:
1. College of Medicine (의과대학) > Dept. of Pathology (병리학교실) > 1. Journal Papers

