
Print: ISSN 0914-4935
Online: ISSN 2435-0869
Sensors and Materials is an international peer-reviewed open access journal that provides a forum for researchers working in multidisciplinary fields of sensing technology. Sensors and Materials is covered by Science Citation Index Expanded (Clarivate Analytics), Scopus (Elsevier), and other databases.


Publisher
 MYU K.K.
 Sensors and Materials
 1-23-3-303 Sendagi,
 Bunkyo-ku, Tokyo 113-0022, Japan
 Tel: +81-3-3827-8549
 Fax: +81-3-3827-8547


Sensors and Materials, Volume 32, Number 10(1) (2020)
Copyright (C) MYU K.K.
pp. 3137-3155
S&M2329 Research Paper of Special Issue
https://doi.org/10.18494/SAM.2020.2771
Published: October 9, 2020

Uniform Experimental Design for Optimizing the Parameters of Multi-input Convolutional Neural Networks

Cheng-Jian Lin, Chen-Hsien Wu, Chi-Chia Sun, and Cheng-Hsien Lin

(Received January 17, 2020; Accepted May 25, 2020)

Keywords: image recognition, gender classification, convolutional neural network, uniform experimental design

In this paper, a multi-input convolutional neural network (CNN) based on a uniform experimental design (UED) is proposed for gender classification applications. The proposed multi-input CNN trains multiple CNNs individually and concatenates their outputs. In addition, to avoid determining the architecture parameters of the multi-input CNN by trial and error, a UED was used in this study. The experimental results confirmed that the dual-input CNN with a UED achieved accuracies of 99.68% and 99.06% on the CIA and MORPH datasets, respectively. The accuracy of the proposed CNN increased significantly as the number of inputs increased.
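
A rough sketch of the two ideas summarized above, assuming PyTorch: a good-lattice-point uniform design table picks a small, evenly spread subset of hyperparameter combinations, and a dual-input CNN concatenates the features of two independently trained branches before a shared classifier. All layer shapes, candidate filter counts, and kernel sizes below are hypothetical placeholders, not the parameters reported in the paper.

```python
import torch
import torch.nn as nn

def uniform_design_table(n, generators):
    """Good-lattice-point construction of a U_n(n^s) uniform design table.
    Row i, column j holds level (i * h_j mod n) + 1, so the n runs spread
    each factor's n levels evenly instead of enumerating every combination
    the way a full factorial design would."""
    return [[(i * h % n) + 1 for h in generators] for i in range(1, n + 1)]

class Branch(nn.Module):
    """One CNN branch; each input image passes through its own branch."""
    def __init__(self, filters, kernel):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, filters, kernel, padding=kernel // 2),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(4),  # fixed 4x4 map regardless of input size
        )
        self.out_dim = filters * 4 * 4

    def forward(self, x):
        return torch.flatten(self.features(x), 1)

class DualInputCNN(nn.Module):
    """Two branches whose flattened features are concatenated before a
    shared fully connected layer (two output classes for gender)."""
    def __init__(self, filters, kernel):
        super().__init__()
        self.branch_a = Branch(filters, kernel)
        self.branch_b = Branch(filters, kernel)
        self.classifier = nn.Linear(self.branch_a.out_dim + self.branch_b.out_dim, 2)

    def forward(self, xa, xb):
        feats = torch.cat([self.branch_a(xa), self.branch_b(xb)], dim=1)
        return self.classifier(feats)

# Hypothetical candidate levels for two factors (not the paper's values).
filters_levels = [8, 16, 32, 64, 96]
kernel_levels = [1, 3, 5, 7, 9]

# U_5(5^2): 5 runs cover the 5 x 5 level grid uniformly, versus the
# 25 runs an exhaustive (full factorial) search would require.
for run in uniform_design_table(5, [1, 2]):
    f, k = filters_levels[run[0] - 1], kernel_levels[run[1] - 1]
    model = DualInputCNN(f, k)
    xa = xb = torch.randn(2, 3, 64, 64)  # dummy batch; train and score here
    print(f"filters={f:2d} kernel={k} -> logits {tuple(model(xa, xb).shape)}")
```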

Corresponding author: Cheng-Jian Lin


Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Cite this article
Cheng-Jian Lin, Chen-Hsien Wu, Chi-Chia Sun, and Cheng-Hsien Lin, Uniform Experimental Design for Optimizing the Parameters of Multi-input Convolutional Neural Networks, Sens. Mater., Vol. 32, No. 10, 2020, pp. 3137-3155.



Forthcoming Special Issues

Special Issue on Signal Collection, Processing, and System Integration in Automation Applications 2026
Guest editors: Hsiung-Cheng Lin (National Chin-Yi University of Technology), Ming-Te Chen (National Chin-Yi University of Technology), and Chin-Yi Cheng (National Yunlin University of Science and Technology)
Call for papers


Special Issue on Advanced GeoAI for Smart Cities: Novel Data Modeling with Multi-source Sensor Data
Guest editor: Prof. Changfeng Jing (China University of Geosciences Beijing)
Call for papers


Special Issue on Advanced Sensor Application Development
Guest editors: Shih-Chen Shi (National Cheng Kung University) and Tao-Hsing Chen (National Kaohsiung University of Science and Technology)
Call for papers


Special Issue on Mobile Computing and Ubiquitous Networking for Smart Society
Guest editors: Akira Uchiyama (The University of Osaka) and Jaehoon Paul Jeong (Sungkyunkwan University)
Call for papers


Special Issue on Advanced Materials and Technologies for Sensor and Artificial-Intelligence-of-Things Applications (Selected Papers from ICASI 2026)
Guest editor: Sheng-Joue Young (National Yunlin University of Science and Technology)
Conference website
Call for papers


Special Issue on Biosensing Devices
Guest editor: Kiyotaka Sasagawa (Nara Institute of Science and Technology)
Call for papers


Copyright (C) MYU K.K. All Rights Reserved.