
Print: ISSN 0914-4935
Online: ISSN 2435-0869

Sensors and Materials is an international peer-reviewed open-access journal that provides a forum for researchers working in multidisciplinary fields of sensing technology. Sensors and Materials is covered by Science Citation Index Expanded (Clarivate Analytics), Scopus (Elsevier), and other databases.


Publisher
 MYU K.K.
 Sensors and Materials
 1-23-3-303 Sendagi,
 Bunkyo-ku, Tokyo 113-0022, Japan
 Tel: 81-3-3827-8549
 Fax: 81-3-3827-8547


Sensors and Materials, Volume 32, Number 10(1) (2020)
Copyright (C) MYU K.K.
pp. 3243-3259
S&M2336 Research Paper of Special Issue
https://doi.org/10.18494/SAM.2020.2863
Published: October 9, 2020

Continuous Facial Emotion Recognition Method Based on Deep Learning of Academic Emotions

Szu-Yin Lin, Chao-Ming Wu, Shih-Lun Chen, Ting-Lan Lin, and Yi-Wen Tseng

(Received March 15, 2020; Accepted June 23, 2020)

Keywords: academic emotions, face emotion recognition, deep learning, convolutional neural networks, long short-term memory networks

It is important to comprehend students' academic emotions in interactive teaching environments. Academic emotions are the facial expressions that students display, along with their academic performance, during the learning process. By observing students' academic emotions, teachers can provide the most suitable teaching material to improve students' academic performance and motivation, and the results can also be applied to adaptive learning. Recently, some researchers have studied academic emotions with the aid of facial and emotion recognition technologies. However, most studies have focused on the analysis and recognition of a single image and have not considered that academic emotions are expressed continuously in response to the learning situation over a period of time. To address this problem, a continuous facial emotion recognition method based on deep learning is proposed in this study to analyze academic emotions. The method combines a convolutional neural network (CNN) and a long short-term memory (LSTM) network to recognize and analyze the continuous facial expression patterns of students and thus their academic emotions. Through this method, an e-learning system can assess the learning progress of students quickly and accurately and offer them appropriate teaching materials to enhance their academic performance and motivation. The experimental results showed that the recognition accuracies of the CNN model alone and of the CNN plus LSTM were 72.47 and 84.33%, respectively; combining the two networks improved the accuracy by approximately 12% compared with the CNN alone.
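The architecture described in the abstract — a per-frame CNN feature extractor feeding an LSTM over the frame sequence, with a classifier on the final state — can be sketched at a shape level with a minimal NumPy toy. The frame size, kernel count, hidden width, class count, and random weights below are illustrative assumptions, not the authors' configuration or trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cnn_features(frame, kernels):
    """Per-frame feature extractor: valid 2D convolution with each kernel,
    ReLU, then global average pooling -> one scalar feature per kernel."""
    h, w = frame.shape
    feats = []
    for k in kernels:
        kh, kw = k.shape
        conv = np.array([[np.sum(frame[i:i + kh, j:j + kw] * k)
                          for j in range(w - kw + 1)]
                         for i in range(h - kh + 1)])
        feats.append(np.maximum(conv, 0.0).mean())
    return np.array(feats)

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step; gates stacked as [input, forget, cell, output]."""
    n = h.size
    z = W @ x + U @ h + b
    i, f = sigmoid(z[:n]), sigmoid(z[n:2 * n])
    g, o = np.tanh(z[2 * n:3 * n]), sigmoid(z[3 * n:])
    c = f * c + i * g           # new cell state
    h = o * np.tanh(c)          # new hidden state
    return h, c

# Toy dimensions (hypothetical, for illustration only)
T, H, W_img = 5, 16, 16                     # 5 frames of 16x16 grayscale faces
kernels = [rng.standard_normal((3, 3)) for _ in range(4)]  # 4 conv filters
n_feat, n_hid, n_cls = 4, 8, 3              # feature, hidden, and class counts

Wx = rng.standard_normal((4 * n_hid, n_feat)) * 0.1
Uh = rng.standard_normal((4 * n_hid, n_hid)) * 0.1
b0 = np.zeros(4 * n_hid)
V = rng.standard_normal((n_cls, n_hid)) * 0.1  # classifier on final state

frames = rng.standard_normal((T, H, W_img))    # stand-in for a video clip
h, c = np.zeros(n_hid), np.zeros(n_hid)
for t in range(T):                              # CNN per frame, LSTM over time
    h, c = lstm_step(cnn_features(frames[t], kernels), h, c, Wx, Uh, b0)

logits = V @ h
probs = np.exp(logits - logits.max())
probs /= probs.sum()                            # softmax over emotion classes
```

The key design point the paper exploits is visible even in this toy: the CNN sees each frame independently, while the LSTM's recurrent state carries information across the clip, so the final `probs` reflects the whole expression sequence rather than a single image.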

Corresponding author: Szu-Yin Lin


Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Cite this article
Szu-Yin Lin, Chao-Ming Wu, Shih-Lun Chen, Ting-Lan Lin, and Yi-Wen Tseng, Continuous Facial Emotion Recognition Method Based on Deep Learning of Academic Emotions, Sens. Mater., Vol. 32, No. 10, 2020, pp. 3243-3259.



Forthcoming Regular Issues


Forthcoming Special Issues

Special Issue on Signal Collection, Processing, and System Integration in Automation Applications 2026
Guest editors: Hsiung-Cheng Lin (National Chin-Yi University of Technology), Ming-Te Chen (National Chin-Yi University of Technology), and Chin-Yi Cheng (National Yunlin University of Science and Technology)
Call for papers


Special Issue on Advanced GeoAI for Smart Cities: Novel Data Modeling with Multi-source Sensor Data
Guest editor: Prof. Changfeng Jing (China University of Geosciences Beijing)
Call for papers


Special Issue on Advanced Sensor Application Development
Guest editors: Shih-Chen Shi (National Cheng Kung University) and Tao-Hsing Chen (National Kaohsiung University of Science and Technology)
Call for papers


Special Issue on Mobile Computing and Ubiquitous Networking for Smart Society
Guest editors: Akira Uchiyama (The University of Osaka) and Jaehoon Paul Jeong (Sungkyunkwan University)
Call for papers


Special Issue on Advanced Materials and Technologies for Sensor and Artificial-Intelligence-of-Things Applications (Selected Papers from ICASI 2026)
Guest editor: Sheng-Joue Young (National Yunlin University of Science and Technology)
Conference website
Call for papers


Special Issue on Biosensing Devices
Guest editor: Kiyotaka Sasagawa (Nara Institute of Science and Technology)
Call for papers


Copyright (C) MYU K.K. All Rights Reserved.