Identifying Gender-Specific Visual Bias Signals in Skin Lesion Classification

DC Field	Value
dc.contributor.author	Lee, Heejae
dc.contributor.author	Yang, Sejung
dc.contributor.author	Chu, Yuseong
dc.contributor.author	Oh, Byungho
dc.date.accessioned	2026-03-25T07:32:05Z
dc.date.available	2026-03-25T07:32:05Z
dc.date.created	2026-03-24
dc.date.issued	2026-01
dc.identifier.issn	0302-9743
dc.identifier.uri	https://ir.ymlib.yonsei.ac.kr/handle/22282913/211491
dc.description.abstract	Recent advances in AI-based medical diagnosis have demonstrated impressive accuracy. However, concerns remain regarding the fairness of these models across demographic groups. Gender-related biases embedded in skin lesion images may compromise diagnostic equity and lead to systematic disparities in clinical decision making. In this study, we investigate gender-specific visual bias signals in skin lesion classification using the International Skin Imaging Collaboration 2019 dataset. We trained ConvNeXt models on male-only, female-only, and mixed-gender datasets, and evaluated them across all test sets. In addition, we selected 100 samples per signal from our dataset based on visual prominence scores across 10 dermoscopic features known to influence skin lesion appearance. Our results reveal systematic disparities in model performance across gender groups. For example, the Blue signal led to a sharp performance drop when the female model was evaluated on male data, while the male model performed substantially better on the same subset. This contrast highlights how certain visual signals can hinder cross-gender generalization. These findings suggest that certain visual features affect model reliability depending on patient gender, raising concerns for fairness in real-world clinical deployment. Our work provides empirical evidence and diagnostic insights that can support the development of bias-aware dermatological AI systems.
dc.language	English
dc.publisher	Springer
dc.relation.isPartOf	FAIRNESS OF AI IN MEDICAL IMAGING, FAIMI 2025
dc.relation.isPartOf	Lecture Notes in Computer Science
dc.title	Identifying Gender-Specific Visual Bias Signals in Skin Lesion Classification
dc.type	Article
dc.contributor.googleauthor	Lee, Heejae
dc.contributor.googleauthor	Yang, Sejung
dc.contributor.googleauthor	Chu, Yuseong
dc.contributor.googleauthor	Oh, Byungho
dc.identifier.doi	10.1007/978-3-032-05870-6_6
dc.relation.journalcode	J02160
dc.identifier.url	https://link.springer.com/chapter/10.1007/978-3-032-05870-6_6
dc.subject.keyword	Gender Bias
dc.subject.keyword	Model Fairness in Medical AI
dc.subject.keyword	Skin Lesion Classification
dc.contributor.affiliatedAuthor	Oh, Byungho
dc.identifier.scopusid	2-s2.0-105017965640
dc.identifier.wosid	001678692600006
dc.citation.volume	15976
dc.citation.startPage	53
dc.citation.endPage	62
dc.identifier.bibliographicCitation	FAIRNESS OF AI IN MEDICAL IMAGING, FAIMI 2025, Vol.15976 : 53-62, 2026-01
dc.identifier.rimsid	92259
dc.type.rims	ART
dc.description.journalClass	1
dc.subject.keywordAuthor	Gender Bias
dc.subject.keywordAuthor	Model Fairness in Medical AI
dc.subject.keywordAuthor	Skin Lesion Classification
dc.type.docType	Proceedings Paper
dc.description.isOpenAccess	N
dc.description.journalRegisteredClass	scopus
dc.relation.journalWebOfScienceCategory	Computer Science, Artificial Intelligence
dc.relation.journalWebOfScienceCategory	Computer Science, Theory & Methods
dc.relation.journalWebOfScienceCategory	Radiology, Nuclear Medicine & Medical Imaging
dc.relation.journalResearchArea	Computer Science
dc.relation.journalResearchArea	Radiology, Nuclear Medicine & Medical Imaging
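
The abstract above describes training ConvNeXt models on male-only, female-only, and mixed-gender subsets of ISIC 2019 and evaluating each model on every gender-specific test set. A minimal sketch of that cross-gender train/evaluate protocol is given below; the metadata filename, the "sex" column name, and the use of torchvision's convnext_tiny are illustrative assumptions, not details taken from the paper.

```python
# Hedged sketch of a cross-gender train/evaluate protocol on ISIC 2019.
# Filenames, column names, and the convnext_tiny backbone are assumptions
# made for illustration; the paper's exact setup may differ.
import pandas as pd
from torch import nn
from torchvision import models

meta = pd.read_csv("ISIC_2019_Training_Metadata.csv")  # assumed metadata file

# Gender-stratified subsets: male-only, female-only, and mixed-gender.
splits = {
    "male": meta[meta["sex"] == "male"],
    "female": meta[meta["sex"] == "female"],
    "mixed": meta,
}

def make_model(num_classes: int = 8) -> nn.Module:
    """ConvNeXt-Tiny with a new head (ISIC 2019 has 8 diagnosis classes)."""
    m = models.convnext_tiny(weights="IMAGENET1K_V1")
    m.classifier[2] = nn.Linear(m.classifier[2].in_features, num_classes)
    return m

# Train one model per training split, then score it on every gender-specific
# test set to expose cross-gender generalization gaps (e.g. a female-trained
# model evaluated on male images).
results = {}
for train_group, train_df in splits.items():
    model = make_model()
    # ... fit `model` on the images listed in train_df (training loop omitted) ...
    for test_group, test_df in splits.items():
        # ... evaluate `model` on test_df and record the metric (omitted) ...
        results[(train_group, test_group)] = None  # placeholder for the score
```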
Appears in Collections:
1. College of Medicine (의과대학) > Dept. of Dermatology (피부과학교실) > 1. Journal Papers

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.