Wound image segmentation using deep convolutional neural network

Authors
Hyunyoung Kang; Kyungdeok Seo; Sena Lee; Byung Ho Oh; Sejung Yang
Citation
Progress in Biomedical Optics and Imaging - Proceedings of SPIE, Vol. 12352: 123520F, 2023-03
Journal Title
Progress in Biomedical Optics and Imaging - Proceedings of SPIE
ISSN
1605-7422
Issue Date
2023-03
Abstract
Traditionally, wound diagnosis has relied on visual inspection by an expert, who diagnoses the wound and prescribes treatment by eye. If a wound segmentation algorithm is applied to wound diagnosis, the wound area can be quantified and used as an auxiliary means of treatment. Despite the dramatic development of deep learning technology in recent years, a lack of datasets commonly causes deep learning models to overfit, which leads to poor performance on external datasets. Therefore, we trained the wound segmentation model on a new wound dataset in addition to an existing open dataset, the Diabetic Foot Ulcer Challenge dataset. Machine-learning-based methods were used when producing the ground-truth images for the new dataset: in addition to manual annotation, the Gradient Vector Flow technique was used for ground-truth image production to reduce the time consumed in labeling. The wound segmentation model used in this study is a U-Net with residual blocks, trained with a combination of cross-entropy loss and Dice loss. In our experiments, the wound segmentation accuracy was about 90% in terms of the Dice coefficient.
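For readers unfamiliar with the combined objective mentioned in the abstract, the following is a minimal PyTorch sketch of a cross-entropy plus Dice loss for binary segmentation. It is not the authors' implementation; the weighting factor `alpha`, the smoothing constant, and the binary (sigmoid) formulation are illustrative assumptions.

```python
import torch
import torch.nn as nn


class CombinedLoss(nn.Module):
    """Binary cross-entropy (on logits) plus soft Dice loss.

    A sketch of the combined objective the abstract describes; `alpha`
    and `smooth` are illustrative assumptions, not values from the paper.
    """

    def __init__(self, alpha: float = 0.5, smooth: float = 1.0):
        super().__init__()
        self.alpha = alpha      # weight between the CE and Dice terms
        self.smooth = smooth    # avoids division by zero on empty masks
        self.bce = nn.BCEWithLogitsLoss()

    def forward(self, logits: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        # Cross-entropy term, computed on raw logits for numerical stability.
        ce = self.bce(logits, target)

        # Soft Dice term: 1 - Dice coefficient of the predicted probabilities,
        # where Dice = 2|A ∩ B| / (|A| + |B|).
        probs = torch.sigmoid(logits).flatten(1)   # (N, H*W)
        truth = target.flatten(1)
        intersection = (probs * truth).sum(dim=1)
        union = probs.sum(dim=1) + truth.sum(dim=1)
        dice = (2.0 * intersection + self.smooth) / (union + self.smooth)

        return self.alpha * ce + (1.0 - self.alpha) * (1.0 - dice.mean())
```

In training this would be called as `CombinedLoss()(model(images), masks)` with `masks` as float tensors in {0, 1}; the roughly 90% figure reported in the abstract is the Dice coefficient itself, evaluated on the predicted segmentations.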
Full Text
https://www.spiedigitallibrary.org/conference-proceedings-of-spie/12352/2649913/Wound-image-segmentation-using-deep-convolutional-neural-network/10.1117/12.2649913.short
DOI
10.1117/12.2649913
Appears in Collections:
1. College of Medicine (의과대학) > Dept. of Dermatology (피부과학교실) > 1. Journal Papers
Yonsei Authors
Oh, Byung Ho (오병호), ORCID: https://orcid.org/0000-0001-9575-5665
URI
https://ir.ymlib.yonsei.ac.kr/handle/22282913/199631
