An attention-based long short-term memory prediction model for working conditions of copper electrolytic plates

Cited: 4
Authors
Zhu, Hongqiu [1 ,2 ]
Peng, Lei [1 ]
Zhou, Can [1 ]
Dai, Yusi [1 ]
Peng, Tianyu [1 ]
Affiliations
[1] Cent South Univ, Sch Automat, Changsha 410083, Peoples R China
[2] Cent South Univ, State Key Lab High Performance Complex Mfg, Changsha 410083, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
plate states prediction; average gray value; LSTM; attention mechanism; SHORT-CIRCUIT DETECTION; MECHANISM;
DOI
10.1088/1361-6501/acc11f
CLC number
T [Industrial Technology];
Subject classification code
08;
Abstract
Copper is an important non-ferrous metal, and electrolytic refining is one of the main methods of producing refined copper. In the electrolytic process, the states of the plates seriously affect the output and quality of the copper, so timely and accurate prediction of the plates' working states is of great significance to the copper electrolytic refining process. To address the large lag, poor anti-interference ability and low accuracy of traditional plate state detection algorithms, this paper proposes a plate state prediction model based on a long short-term memory (LSTM) neural network with an attention mechanism. The average gray values of the plates in infrared imagery are used to characterize the plates' working states. To cope with the large fluctuations in these values and the large volume of time series data involved, a double-layer LSTM network structure is adopted to improve the efficiency and accuracy of model training. Meanwhile, in view of the periodicity of the time series data and the possible correlation between adjacent data points, a dedicated attention mechanism is proposed that enables the model to learn this correlation and thereby improve prediction accuracy. The experimental results show that the accuracy of the proposed model for plate state prediction reaches 95.11%. Compared with commonly used prediction algorithms, the proposed model demonstrates stronger prediction ability.
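The abstract describes a double-layer LSTM whose hidden states are weighted by an attention mechanism over adjacent time steps of average gray values. The following minimal PyTorch sketch illustrates that general architecture only; it is not the authors' released code, and the layer sizes, sequence length, and number of plate-state classes are illustrative assumptions.

    # Sketch of a two-layer LSTM with temporal attention for plate-state
    # prediction from average gray value sequences (illustrative assumptions:
    # hidden_size=64, num_classes=2, seq_len=30).
    import torch
    import torch.nn as nn

    class AttentionLSTMPlateModel(nn.Module):
        def __init__(self, input_size=1, hidden_size=64, num_classes=2):
            super().__init__()
            # "double-layer LSTM" in the abstract -> num_layers=2
            self.lstm = nn.LSTM(input_size, hidden_size, num_layers=2,
                                batch_first=True)
            # simple additive attention over the time dimension
            self.attn = nn.Linear(hidden_size, 1)
            self.fc = nn.Linear(hidden_size, num_classes)

        def forward(self, x):
            # x: (batch, seq_len, 1) sequence of average gray values
            h, _ = self.lstm(x)                      # (batch, seq_len, hidden)
            scores = self.attn(torch.tanh(h))        # (batch, seq_len, 1)
            weights = torch.softmax(scores, dim=1)   # attention over time steps
            context = (weights * h).sum(dim=1)       # weighted sum of hidden states
            return self.fc(context)                  # plate-state logits

    if __name__ == "__main__":
        model = AttentionLSTMPlateModel()
        gray_values = torch.rand(8, 30, 1)           # 8 plates, 30 time steps
        print(model(gray_values).shape)              # torch.Size([8, 2])

The attention weights act as a learned emphasis on the most informative adjacent time steps before classification, which is the role the abstract attributes to its attention mechanism.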
Pages: 11