S&M3325 Research Paper https://doi.org/10.18494/SAM4509 Published: July 14, 2023
Tool Wear Prediction Based on Attention Long Short-term Memory Network with Small Samples
Weiwei Yu, Hua Huang, Runlan Guo, and Pengqiang Yang (Received May 11, 2023; Accepted June 20, 2023)
Keywords: attention long short-term memory network, data augmentation, state recognition, k-nearest neighbor classifier, tool wear prediction
In tool wear monitoring, the signal collection environment is typically complex, which leads to insufficient state samples and unbalanced category labels. Moreover, the hidden-state features extracted by neural networks in conventional methods are mixed together, resulting in low tool wear prediction accuracy. Therefore, a tool wear prediction method based on an attention long short-term memory (LSTM) network for imbalanced data is proposed. First, a generative adversarial network (GAN) is used to mitigate the imbalance of state category labels and expand the data samples. Then, the expanded data samples are used as the input of a stacked sparse autoencoder (SSAE) network to adaptively extract features, and a k-nearest neighbor classifier is used to identify the different stages of tool wear. Finally, on the basis of the state identification results, time-series features of the expanded data samples for different tools are extracted and input into the attention LSTM network to map the tool wear values for the corresponding tool wear processes. The experimental results show that the proposed method mitigates the imbalance of category labels, emphasizes the more informative components of the sequence data, and achieves excellent prediction accuracy and generalization.
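The key idea of the final stage, attention pooling over LSTM hidden states before regressing a wear value, can be illustrated with a minimal numpy sketch. This is not the paper's trained model: the hidden states, attention parameters, and output weights below are random stand-ins, and the shapes are hypothetical; it only shows how softmax attention weights select informative time steps before the wear-value mapping.

```python
import numpy as np

rng = np.random.default_rng(0)

T, H = 20, 8                              # sequence length, hidden size (illustrative)
hidden_states = rng.normal(size=(T, H))   # stand-in for LSTM outputs h_1 .. h_T

# Attention: score each time step, softmax to weights, weighted-sum to a context vector.
w_att = rng.normal(size=H)                # illustrative attention parameters
scores = hidden_states @ w_att            # one score per time step, shape (T,)
weights = np.exp(scores - scores.max())
weights /= weights.sum()                  # softmax attention weights, sum to 1

context = weights @ hidden_states         # (H,) attention-weighted sum of hidden states

# Linear head mapping the context vector to a scalar tool-wear value.
w_out, b_out = rng.normal(size=H), 0.0
wear_pred = float(context @ w_out + b_out)
print(wear_pred)
```

In a trained network, `w_att`, `w_out`, and `b_out` would be learned jointly with the LSTM, so the weights concentrate on the time steps most predictive of wear.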
Corresponding author: Hua Huang
This work is licensed under a Creative Commons Attribution 4.0 International License.
Cite this article: Weiwei Yu, Hua Huang, Runlan Guo, and Pengqiang Yang, Tool Wear Prediction Based on Attention Long Short-term Memory Network with Small Samples, Sens. Mater., Vol. 35, No. 7, 2023, pp. 2321–2335.