
Transformer-Driven Semi-Supervised Learning for Prostate Cancer Histopathology: A DINOv2-TransUNet Framework

DC Field: Value
dc.contributor.author: Rabeya, Rubina Akter
dc.contributor.author: Seo, Jeong-Wook
dc.contributor.author: Cho, Nam Hoon
dc.contributor.author: Kim, Hee-Cheol
dc.contributor.author: Choi, Heung-Kook
dc.date.accessioned: 2026-03-31T02:37:46Z
dc.date.available: 2026-03-31T02:37:46Z
dc.date.created: 2026-03-20
dc.date.issued: 2026-01
dc.identifier.issn: 2504-4990
dc.identifier.uri: https://ir.ymlib.yonsei.ac.kr/handle/22282913/211703
dc.description.abstract: Prostate cancer is diagnosed through a comprehensive review of histopathology slides, which is time-consuming and requires expert interpretation. To reduce this burden, we developed a semi-supervised learning technique that combines transformer-based representation learning with a custom TransUNet classifier. To capture a wide range of morphological structures without manual annotation, our method pretrains DINOv2 on 10,000 unlabeled prostate tissue patches. A bespoke CNN-based decoder then takes the transformer-derived features and merges information from multiple spatial scales through residual upsampling and carefully constructed skip connections. Expert pathologists annotated only 20% of the patches in the full dataset; the remaining unlabeled samples contributed through a consistency-driven learning method that encouraged stable predictions across varied augmentations. On an additional test set, the model achieved precision and recall of 91.81% and 89.02%, respectively, and an accuracy of 93.78%, exceeding the performance of a conventional U-Net and a baseline encoder-decoder network. Overall, combining localized CNN (Convolutional Neural Network) decoding with global transformer attention provides a reliable approach to prostate cancer classification when little annotated data is available.
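The consistency-driven step in the abstract can be illustrated with a minimal sketch: penalize disagreement between the model's class probabilities for two augmented views of the same unlabeled patch. The function names and the mean-squared form of the penalty are illustrative assumptions; this record does not state the exact consistency objective the paper uses.

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over the last axis."""
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def consistency_loss(logits_view_a, logits_view_b):
    """Mean squared disagreement between the class probabilities of two
    augmented views of the same unlabeled patch (illustrative form only;
    the paper's exact consistency loss is not given in this record)."""
    p_a = softmax(np.asarray(logits_view_a, dtype=float))
    p_b = softmax(np.asarray(logits_view_b, dtype=float))
    return float(np.mean((p_a - p_b) ** 2))

# Identical predictions incur no penalty; divergent ones are penalized.
same = consistency_loss([[2.0, 0.5]], [[2.0, 0.5]])
diff = consistency_loss([[2.0, 0.5]], [[0.5, 2.0]])
```

Minimizing such a penalty on unlabeled patches, alongside a supervised loss on the 20% annotated subset, is the standard recipe behind consistency-based semi-supervised learning.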
dc.language: English
dc.publisher: MDPI
dc.relation.isPartOf: MACHINE LEARNING AND KNOWLEDGE EXTRACTION
dc.title: Transformer-Driven Semi-Supervised Learning for Prostate Cancer Histopathology: A DINOv2-TransUNet Framework
dc.type: Article
dc.contributor.googleauthor: Rabeya, Rubina Akter
dc.contributor.googleauthor: Seo, Jeong-Wook
dc.contributor.googleauthor: Cho, Nam Hoon
dc.contributor.googleauthor: Kim, Hee-Cheol
dc.contributor.googleauthor: Choi, Heung-Kook
dc.identifier.doi: 10.3390/make8020026
dc.subject.keyword: prostate cancer
dc.subject.keyword: histopathology
dc.subject.keyword: self-supervised learning
dc.subject.keyword: Vision Transformer (ViT)
dc.subject.keyword: DINOv2
dc.subject.keyword: TransUNet
dc.subject.keyword: computational pathology
dc.contributor.affiliatedAuthor: Cho, Nam Hoon
dc.identifier.scopusid: 2-s2.0-105031090447
dc.identifier.wosid: 001700747800001
dc.citation.volume: 8
dc.citation.number: 2
dc.identifier.bibliographicCitation: MACHINE LEARNING AND KNOWLEDGE EXTRACTION, Vol.8(2), 2026-01
dc.identifier.rimsid: 92135
dc.type.rims: ART
dc.description.journalClass: 1
dc.subject.keywordAuthor: prostate cancer
dc.subject.keywordAuthor: histopathology
dc.subject.keywordAuthor: self-supervised learning
dc.subject.keywordAuthor: Vision Transformer (ViT)
dc.subject.keywordAuthor: DINOv2
dc.subject.keywordAuthor: TransUNet
dc.subject.keywordAuthor: computational pathology
dc.subject.keywordPlus: CIRCUIT
dc.subject.keywordPlus: FINFET
dc.type.docType: Article
dc.description.isOpenAccess: Y
dc.description.journalRegisteredClass: scopus
dc.relation.journalWebOfScienceCategory: Computer Science, Artificial Intelligence
dc.relation.journalWebOfScienceCategory: Computer Science, Interdisciplinary Applications
dc.relation.journalWebOfScienceCategory: Engineering, Electrical & Electronic
dc.relation.journalResearchArea: Computer Science
dc.relation.journalResearchArea: Engineering
dc.identifier.articleno: 26
Appears in Collections:
1. College of Medicine (의과대학) > Dept. of Pathology (병리학교실) > 1. Journal Papers


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.