A study on the effectiveness of intermediate features in deep learning on facial expression recognition

KyeongTeak Oh ; Sun K. Yoo
International Journal of Advanced Smart Convergence, Vol.12(2) : 25-33, 2023-06
Journal Title
International Journal of Advanced Smart Convergence
Issue Date
2023-06
Keywords
Intermediate Feature ; Artificial Intelligence ; Facial Expression Recognition
Abstract
The purpose of this study is to evaluate the impact of intermediate features on facial expression recognition (FER) performance. To achieve this objective, intermediate features were extracted from the input images at specific layers (FM1–FM4) of a pre-trained network (ResNet-18). These extracted intermediate features and the original images were used as inputs to a vision transformer (ViT), and the FER performance was compared. When a single input was used, the intermediate features extracted from FM2 yielded the best performance (training accuracy: 94.35%, testing accuracy: 75.51%), versus 91.32% training and 74.68% testing accuracy for the original image alone. When the original image was combined with intermediate features as input, the best FER performance was achieved by combining the original image with FM2, FM3, and FM4 (training accuracy: 97.88%, testing accuracy: 79.21%). These results imply that incorporating intermediate features alongside the original image can lead to superior performance. The findings can be referenced when designing the preprocessing stages of a deep learning model for FER: by considering the effectiveness of intermediate features, practitioners can make informed decisions to enhance the performance of FER systems.
Files in This Item:
T202306008.pdf
Appears in Collections:
1. College of Medicine (의과대학) > Dept. of Medical Engineering (의학공학교실) > 1. Journal Papers
Yonsei Authors
Oh, Kyeong Taek (오경택), ORCID: https://orcid.org/0000-0002-6857-0945
Yoo, Sun Kook (유선국), ORCID: https://orcid.org/0000-0002-6032-4686


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.