Young Researcher Paper Award 2023
🥇 Winners

Notice of retraction
Vol. 34, No. 8(3), S&M3042

Notice of retraction
Vol. 32, No. 8(2), S&M2292

Print: ISSN 0914-4935
Online: ISSN 2435-0869
Sensors and Materials
is an international peer-reviewed open access journal that provides a forum for researchers working in multidisciplinary fields of sensing technology.
Sensors and Materials
is covered by Science Citation Index Expanded (Clarivate Analytics), Scopus (Elsevier), and other databases.

Instructions to authors
English    Japanese

Instructions for manuscript preparation
English    Japanese

Template
English

Publisher
 MYU K.K.
 Sensors and Materials
 1-23-3-303 Sendagi,
 Bunkyo-ku, Tokyo 113-0022, Japan
 Tel: +81-3-3827-8549
 Fax: +81-3-3827-8547

MYU Research, a scientific publisher, seeks a native English-speaking proofreader with a scientific background. A B.Sc. or higher degree is desirable. This is an in-office position; work hours are negotiable. Call 03-3827-8549 for further information.


MYU Research

(proofreading and recording)


MYU K.K.
(translation service)


The Art of Writing Scientific Papers

(How to write scientific papers)
(Japanese Only)

Published in advance: March 25, 2024

Flexible Temporal Correlation Learning for Human, Animal, and Interactor Detection in Videos [PDF]

Yanjun Feng and Jun Liu

(Received April 8, 2023; Accepted December 15, 2023)

Keywords: object detection, video understanding, attention, temporal learning

Video object detection is a key technology for detecting and tracking humans and animals in behavior-understanding tasks. In particular, detecting the small-scale interactors involved in human activities is challenging, and exploiting the temporal context is important for continuous understanding. Temporal object detection has attracted significant attention, but most commonly used detection methods fail to fully leverage the abundant temporal information in videos. In this paper, we propose a novel approach for detecting humans and animals in videos, called attentional temporal You Only Look Once (ATYOLO), which combines an attention mechanism with convolutional long short-term memory (ConvLSTM). We use the proposed attentional module to integrate a pyramidal feature hierarchy temporally and design a unique structure that includes a low-level temporal unit and a high-level unit for multiscale feature maps. We also develop a temporal analysis group with a temporal attention mechanism tailored for background and scale suppression; this attentional group integrates attention-aware features over time. Extensive comparisons confirm the detection capability of the proposed approach. As a result, ATYOLO achieves fast inference and overall competitive performance on video detection benchmarks, including the ImageNet Video (VID) dataset and the Stanford Drone Dataset (SDD).
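The abstract describes integrating per-frame pyramid features over time with a temporal attention mechanism. As a minimal illustrative sketch only (not the authors' ATYOLO implementation), assuming per-frame feature maps of shape (T, C, H, W) and a hypothetical learned scoring vector, attention-weighted temporal fusion can be expressed as:

```python
import numpy as np

def temporal_attention_fuse(frame_feats, score_w):
    """Fuse T per-frame feature maps into one attention-weighted map.

    frame_feats: array of shape (T, C, H, W), features from T video frames
    score_w:     array of shape (C,), a hypothetical learned scoring vector
    Returns the fused (C, H, W) map and the (T,) attention weights.
    """
    # Score each frame by a linear projection of its global-average-pooled
    # features (a stand-in for whatever scoring the real module learns).
    pooled = frame_feats.mean(axis=(2, 3))        # (T, C)
    scores = pooled @ score_w                     # (T,)
    # Softmax over the temporal axis gives the attention weights.
    w = np.exp(scores - scores.max())
    w /= w.sum()
    # Weighted sum over time: contract the T axis of both arrays.
    fused = np.tensordot(w, frame_feats, axes=(0, 0))  # (C, H, W)
    return fused, w
```

In the paper's design, the attentional module would additionally carry recurrent state (ConvLSTM) across frames and operate at each pyramid level; this sketch shows only the attention-weighted fusion step.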

Corresponding author: Jun Liu




Forthcoming Regular Issues


Forthcoming Special Issues

Special Issue on Applications of Novel Sensors and Related Technologies for Internet of Things
Guest editors: Teen-Hang Meen (National Formosa University), Wenbing Zhao (Cleveland State University), and Cheng-Fu Yang (National University of Kaohsiung)
Call for papers


Special Issue on Advanced Sensing Technologies for Green Energy
Guest editor: Yong Zhu (Griffith University)
Call for papers


Special Issue on Room-temperature-operation Solid-state Radiation Detectors
Guest editor: Toru Aoki (Shizuoka University)
Call for papers


Special Issue on International Conference on Biosensors, Bioelectronics, Biomedical Devices, BioMEMS/NEMS and Applications 2023 (Bio4Apps 2023)
Guest editors: Dzung Viet Dao (Griffith University) and Cong Thanh Nguyen (Griffith University)
Conference website
Call for papers


Special Issue on Advanced Sensing Technologies and Their Applications in Human/Animal Activity Recognition and Behavior Understanding
Guest editor: Kaori Fujinami (Tokyo University of Agriculture and Technology)
Call for papers


Special Issue on Piezoelectric Thin Films and Piezoelectric MEMS
Guest editor: Isaku Kanno (Kobe University)
Call for papers


Copyright (C) MYU K.K. All Rights Reserved.