Automated quantification of brain PET in PET/CT using deep learning-based CT-to-MR translation: a feasibility study

DC Field | Value | Language
dc.contributor.author | Kim, Daesung | -
dc.contributor.author | Choo, Kyobin | -
dc.contributor.author | Lee, Sangwon | -
dc.contributor.author | Kang, Seongjin | -
dc.contributor.author | Yun, Mijin | -
dc.contributor.author | Yang, Jaewon | -
dc.date.accessioned | 2025-11-14T06:40:00Z | -
dc.date.available | 2025-11-14T06:40:00Z | -
dc.date.created | 2025-07-24 | -
dc.date.issued | 2025-07 | -
dc.identifier.issn | 1619-7070 | -
dc.identifier.uri | https://ir.ymlib.yonsei.ac.kr/handle/22282913/208834 | -
dc.description.abstract | Purpose: Quantitative analysis of PET images in brain PET/CT relies on MRI-derived regions of interest (ROIs). However, paired PET/CT and MR images are not always available, and their alignment is challenging when their acquisition times differ considerably. To address these problems, this study proposes a deep learning framework that translates the CT of PET/CT into synthetic MR images (MRSYN) and performs automated quantitative regional analysis using MRSYN-derived segmentation. Methods: In this retrospective study, 139 subjects who underwent brain [F-18]FBB PET/CT and T1-weighted MRI were included. A U-Net-like model was trained to translate CT images to MRSYN; subsequently, a separate model was trained to segment MRSYN into 95 regions. Regional and composite standardised uptake value ratios (SUVrs) were calculated in [F-18]FBB PET images using the acquired ROIs. MRSYN was evaluated with quantitative measures including the structural similarity index measure (SSIM), while MRSYN-based segmentation was evaluated with the Dice similarity coefficient (DSC). The Wilcoxon signed-rank test was performed on SUVrs computed using MRSYN and ground-truth MR (MRGT). Results: Compared to MRGT, the mean SSIM of MRSYN was 0.974 ± 0.005. MRSYN-based segmentation achieved a mean DSC of 0.733 across the 95 regions. No statistically significant difference (P > 0.05) was found between SUVrs from MRSYN-derived ROIs and those from MRGT, except for the precuneus. Conclusion: We demonstrated a deep learning framework for automated regional brain analysis in PET/CT using MRSYN. The proposed framework can benefit patients who have difficulty undergoing an MRI scan. | -
dc.language | English | -
dc.publisher | Springer-Verlag Berlin | -
dc.relation.isPartOf | EUROPEAN JOURNAL OF NUCLEAR MEDICINE AND MOLECULAR IMAGING | -
dc.subject.MESH | Adult | -
dc.subject.MESH | Aged | -
dc.subject.MESH | Automation | -
dc.subject.MESH | Brain* / diagnostic imaging | -
dc.subject.MESH | Deep Learning* | -
dc.subject.MESH | Feasibility Studies | -
dc.subject.MESH | Female | -
dc.subject.MESH | Humans | -
dc.subject.MESH | Image Processing, Computer-Assisted* / methods | -
dc.subject.MESH | Magnetic Resonance Imaging* | -
dc.subject.MESH | Male | -
dc.subject.MESH | Middle Aged | -
dc.subject.MESH | Positron Emission Tomography Computed Tomography* / methods | -
dc.subject.MESH | Retrospective Studies | -
dc.title | Automated quantification of brain PET in PET/CT using deep learning-based CT-to-MR translation: a feasibility study | -
dc.type | Article | -
dc.contributor.googleauthor | Kim, Daesung | -
dc.contributor.googleauthor | Choo, Kyobin | -
dc.contributor.googleauthor | Lee, Sangwon | -
dc.contributor.googleauthor | Kang, Seongjin | -
dc.contributor.googleauthor | Yun, Mijin | -
dc.contributor.googleauthor | Yang, Jaewon | -
dc.identifier.doi | 10.1007/s00259-025-07132-2 | -
dc.relation.journalcode | J00833 | -
dc.identifier.eissn | 1619-7089 | -
dc.identifier.pmid | 39964542 | -
dc.subject.keyword | PET/CT | -
dc.subject.keyword | Amyloid | -
dc.subject.keyword | Quantification | -
dc.subject.keyword | Deep learning | -
dc.subject.keyword | Segmentation | -
dc.contributor.affiliatedAuthor | Lee, Sangwon | -
dc.contributor.affiliatedAuthor | Kang, Seongjin | -
dc.contributor.affiliatedAuthor | Yun, Mijin | -
dc.identifier.scopusid | 2-s2.0-85218130201 | -
dc.identifier.wosid | 001425011400001 | -
dc.citation.volume | 52 | -
dc.citation.number | 8 | -
dc.citation.startPage | 2959 | -
dc.citation.endPage | 2967 | -
dc.identifier.bibliographicCitation | EUROPEAN JOURNAL OF NUCLEAR MEDICINE AND MOLECULAR IMAGING, Vol.52(8) : 2959-2967, 2025-07 | -
dc.identifier.rimsid | 88142 | -
dc.type.rims | ART | -
dc.description.journalClass | 1 | -
dc.subject.keywordAuthor | PET/CT | -
dc.subject.keywordAuthor | Amyloid | -
dc.subject.keywordAuthor | Quantification | -
dc.subject.keywordAuthor | Deep learning | -
dc.subject.keywordAuthor | Segmentation | -
dc.subject.keywordPlus | TEMPLATE | -
dc.type.docType | Article | -
dc.description.isOpenAccess | N | -
dc.description.journalRegisteredClass | scie | -
dc.description.journalRegisteredClass | scopus | -
dc.relation.journalWebOfScienceCategory | Radiology, Nuclear Medicine & Medical Imaging | -
dc.relation.journalResearchArea | Radiology, Nuclear Medicine & Medical Imaging | -
Appears in Collections:
1. College of Medicine (의과대학) > Dept. of Nuclear Medicine (핵의학교실) > 1. Journal Papers

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.