
Cited 80 times

Keratinocytic Skin Cancer Detection on the Face Using Region-Based Convolutional Neural Network

DC Field | Value
dc.date.accessioned | 2022-09-06T06:43:11Z
dc.date.available | 2022-09-06T06:43:11Z
dc.date.issued | 2020-01
dc.identifier.issn | 2168-6068
dc.identifier.uri | https://ir.ymlib.yonsei.ac.kr/handle/22282913/190288
dc.description.abstract |
This diagnostic/prognostic study evaluates a deep learning algorithm designed to identify facial lesions and predict malignancy.

Importance: Detection of cutaneous cancer on the face using deep-learning algorithms has been challenging because various anatomic structures create curves and shades that confuse the algorithm and can potentially lead to false-positive results.

Objective: To evaluate whether an algorithm can automatically locate suspected areas and predict the probability of a lesion being malignant.

Design, Setting, and Participants: Region-based convolutional neural network technology was used to create 924,538 possible lesions by extracting nodular benign lesions from 182,348 clinical photographs. After manually or automatically annotating these possible lesions based on image findings, convolutional neural networks were trained with 1,106,886 image crops to locate and diagnose cancer. Validation data sets (2844 images from 673 patients; mean [SD] age, 58.2 [19.9] years; 308 men [45.8%]; 185 patients with malignant tumors, 305 with benign tumors, and 183 free of tumor) were obtained from 3 hospitals between January 1, 2010, and September 30, 2018.

Main Outcomes and Measures: The area under the receiver operating characteristic curve, F1 score (harmonic mean of precision and recall; range, 0.000-1.000), and Youden index score (sensitivity + specificity - 1; range, 0%-100%) were used to compare the performance of the algorithm with that of the participants.

Results: The algorithm analyzed a mean (SD) of 4.2 (2.4) photographs per patient and reported the malignancy score according to the highest malignancy output. The area under the receiver operating characteristic curve for the validation data set (673 patients) was 0.910. At a high-sensitivity cutoff threshold, the sensitivity and specificity of the model with the 673 patients were 76.8% and 90.6%, respectively. With the test partition (325 images; 80 patients), the performance of the algorithm was compared with that of 13 board-certified dermatologists, 34 dermatology residents, 20 nondermatologic physicians, and 52 members of the general public with no medical background. When disease screening performance was evaluated at high-sensitivity areas using the F1 score and Youden index score, the algorithm showed a higher F1 score (0.831 vs 0.653 [0.126]; P < .001) and Youden index score (0.675 vs 0.417 [0.124]; P < .001) than nondermatologic physicians. The accuracy of the algorithm was comparable with that of dermatologists (F1 score, 0.831 vs 0.835 [0.040]; Youden index score, 0.675 vs 0.671 [0.100]).

Conclusions and Relevance: The results of the study suggest that the algorithm could localize and diagnose skin cancer without preselection of suspicious lesions by dermatologists.

Question: Can an algorithm using a region-based convolutional neural network detect skin lesions in unprocessed clinical photographs and predict the risk of skin cancer?

Findings: In this diagnostic study, a total of 924,538 training image crops including various benign lesions were generated with the help of a region-based convolutional neural network. The area under the receiver operating characteristic curve for the validation data set (2844 images from 673 patients comprising 185 malignant, 305 benign, and 183 normal conditions) was 0.910, and the algorithm's F1 score and Youden index score were comparable with those of dermatologists and surpassed those of nondermatologists.

Meaning: With unprocessed photographs, the algorithm may be able to localize and diagnose skin cancer without manual preselection of suspicious lesions by dermatologists.
dc.description.statementOfResponsibility | open
dc.language | English
dc.publisher | American Medical Association
dc.relation.isPartOf | JAMA DERMATOLOGY
dc.rights | CC BY-NC-ND 2.0 KR
dc.subject.MESH | Adult
dc.subject.MESH | Aged
dc.subject.MESH | Carcinoma, Basal Cell / diagnosis*
dc.subject.MESH | Carcinoma, Basal Cell / pathology
dc.subject.MESH | Carcinoma, Squamous Cell / diagnosis*
dc.subject.MESH | Carcinoma, Squamous Cell / pathology
dc.subject.MESH | Datasets as Topic
dc.subject.MESH | Face
dc.subject.MESH | Female
dc.subject.MESH | Humans
dc.subject.MESH | Image Processing, Computer-Assisted / methods*
dc.subject.MESH | Keratinocytes / pathology
dc.subject.MESH | Male
dc.subject.MESH | Middle Aged
dc.subject.MESH | Neural Networks, Computer*
dc.subject.MESH | Photography
dc.subject.MESH | ROC Curve
dc.subject.MESH | Skin / cytology
dc.subject.MESH | Skin / diagnostic imaging
dc.subject.MESH | Skin / pathology
dc.subject.MESH | Skin Neoplasms / diagnosis*
dc.subject.MESH | Skin Neoplasms / pathology
dc.title | Keratinocytic Skin Cancer Detection on the Face Using Region-Based Convolutional Neural Network
dc.type | Article
dc.contributor.college | College of Medicine (의과대학)
dc.contributor.department | Dept. of Dermatology (피부과학교실)
dc.contributor.googleauthor | Seung Seog Han
dc.contributor.googleauthor | Ik Jun Moon
dc.contributor.googleauthor | Woohyung Lim
dc.contributor.googleauthor | In Suck Suh
dc.contributor.googleauthor | Sam Yong Lee
dc.contributor.googleauthor | Jung-Im Na
dc.contributor.googleauthor | Seong Hwan Kim
dc.contributor.googleauthor | Sung Eun Chang
dc.identifier.doi | 10.1001/jamadermatol.2019.3807
dc.relation.journalcode | J01197
dc.identifier.eissn | 2168-6084
dc.identifier.pmid | 31799995
dc.identifier.url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6902187/
dc.citation.volume | 156
dc.citation.number | 1
dc.citation.startPage | 29
dc.citation.endPage | 37
dc.identifier.bibliographicCitation | JAMA DERMATOLOGY, Vol.156(1) : 29-37, 2020-01
Appears in Collections:
1. College of Medicine (의과대학) > Dept. of Dermatology (피부과학교실) > 1. Journal Papers
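The abstract compares the algorithm and human raters using two screening metrics it defines inline: the F1 score (harmonic mean of precision and recall) and the Youden index (sensitivity + specificity - 1). A minimal Python sketch of those two formulas, using the operating point reported for the 673-patient validation set (sensitivity 76.8%, specificity 90.6%) as an illustrative input; this is not code from the study itself:

```python
def f1_score(precision: float, recall: float) -> float:
    """F1 is the harmonic mean of precision and recall (range 0.000-1.000)."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)


def youden_index(sensitivity: float, specificity: float) -> float:
    """Youden index J = sensitivity + specificity - 1 (0 to 1, often shown as 0%-100%)."""
    return sensitivity + specificity - 1


# Operating point reported in the abstract for the validation set:
# sensitivity 76.8%, specificity 90.6%.
j = youden_index(0.768, 0.906)
print(f"Youden index at the reported operating point: {j:.3f}")  # 0.674
```

Note that the Youden index computed from this validation-set operating point (0.674) is close to, but not the same quantity as, the 0.675 reported for the separate 80-patient test partition.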


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.