pp. 5411–5424
S&M 4253 Research Paper
https://doi.org/10.18494/SAM5952
Published: December 8, 2025

Eye–hand Coordination Simulator of Robot Arms for Science, Technology, Engineering, and Mathematics Education

Pimpran Preedanont and Pitikhate Sooraksa
(Received September 30, 2025; Accepted October 14, 2025)

Keywords: STEM education, cyber-physical system, eye–hand coordination, vision system, displacement sensors, pick/place
In this paper, we present an eye–hand coordination simulator for robot arms as a compact cyber-physical science, technology, engineering, and mathematics (STEM) learning unit that links visual perception to robot motion in pick/place interactions with a mobile robot acting as an automatic guided vehicle (AGV). The unit integrates four domains into one workflow: science (kinematics and motion), technology (sensors, motor controllers, and vision), engineering (mechanisms and control states), and mathematics (geometric computation and frame transforms). The pick/place machine prototype combines linear X-, Y-, and Z-axes with rotary and flip joints to realign an item box between a shelf and an AGV. A vision system detects a pair of fiducial circles to estimate the AGV centerline, yaw, and slot positions, while displacement sensors measure the stand-off distance and assist parallel alignment. Performance was evaluated using a mock-up AGV positioned with varied offsets and yaw within a ±10 mm parking tolerance. Across 10 trials, the vision-based estimates of the middle-slot X position and the stand-off distance Sx closely matched tape measurements, achieving 98–100% accuracy. The results show that the simulator is dependable for vision-guided coordination and usable as a simple, accessible platform for STEM education.
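To make the geometry the abstract describes concrete, the sketch below is a minimal Python illustration, not the authors' implementation: the function names, the two-sensor layout, and all numbers are assumptions. It estimates the AGV centerline midpoint and yaw from the two detected fiducial-circle centers, and a parallel-alignment angle from two displacement-sensor stand-off readings taken a known baseline apart.

```python
import math


def centerline_from_fiducials(c1, c2):
    """Estimate the AGV centerline midpoint and yaw from two fiducial-circle
    centers c1 = (x1, y1) and c2 = (x2, y2) in a common coordinate frame.
    Yaw is the angle of the line through the two circles, in radians."""
    (x1, y1), (x2, y2) = c1, c2
    midpoint = ((x1 + x2) / 2.0, (y1 + y2) / 2.0)  # point on the centerline
    yaw = math.atan2(y2 - y1, x2 - x1)             # orientation of the pair
    return midpoint, yaw


def tilt_from_displacement(d_left, d_right, baseline):
    """Parallel-alignment angle from two stand-off readings separated by a
    known baseline: equal readings mean the faces are parallel (angle = 0)."""
    return math.atan2(d_left - d_right, baseline)


# Hypothetical example: circle centers 400 px apart with a slight slope,
# and the right sensor reading 3 mm farther than the left over a 300 mm base.
mid, yaw = centerline_from_fiducials((120.0, 240.0), (520.0, 250.0))
tilt = tilt_from_displacement(50.0, 53.0, 300.0)  # readings and baseline in mm
print(f"midpoint = {mid}, yaw = {math.degrees(yaw):.2f} deg, "
      f"tilt = {math.degrees(tilt):.2f} deg")
```

With two fiducials, the midpoint and yaw together fix the AGV pose in the plane; the displacement sensors then provide an independent check that the approach face is parallel before the pick/place motion, consistent with the workflow described above.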
Corresponding author: Pitikhate Sooraksa

This work is licensed under a Creative Commons Attribution 4.0 International License.

Cite this article: Pimpran Preedanont and Pitikhate Sooraksa, Eye–hand Coordination Simulator of Robot Arms for Science, Technology, Engineering, and Mathematics Education, Sens. Mater., Vol. 37, No. 12, 2025, pp. 5411–5424.