
Motion estimation for cable-driven distal end-effectors using attention-based bi-directional gated recurrent neural networks

Journal of Southeast University (English Edition) [ISSN: 1003-7985 / CN: 32-1325/N]

Volume:
39
Issue:
2
Page:
187-193
Research Field:
Automation
Publishing date:
2023-06-20

Info

Title:
Motion estimation for cable-driven distal end-effectors using attention-based bi-directional gated recurrent neural networks
Author(s):
Xu Xinzhou¹, Chen Yongfa¹, Liu Guangming², Li Ziqian², Zhao Li³, Wang Zhengyu²
1School of Internet of Things, Nanjing University of Posts and Telecommunications, Nanjing 210003, China
2School of Mechanical Engineering, Hefei University of Technology, Hefei 230009, China
3School of Information Science and Engineering, Southeast University, Nanjing 210096, China
Keywords:
cable-driven distal end-effectors; motion estimation; bi-directional gated recurrent neural networks; attention mechanism
CLC number:
TP241.3
DOI:
10.3969/j.issn.1003-7985.2023.02.010
Abstract:
A data-driven motion estimation approach based on attention-based bi-directional gated recurrent neural networks was proposed to adaptively estimate the motion of a cable-driven distal end-effector. First, short-term temporal sequences were constructed from the raw signals to serve as training samples. These samples were then modeled sequentially by bi-directional gated recurrent neural networks equipped with self-attention modules. Finally, on a motion dataset of cable-driven distal end-effectors, estimation-performance comparison experiments were performed using the motor's position, speed, and the time sequence of the system-control input as sample features. The results show that, compared with conventional sequence-modeling regression approaches, the proposed approach achieves better estimation performance for the motion of the end-effector and can therefore effectively estimate the motion of cable-driven distal end-effectors under complex conditions.
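
As a concrete illustration of the architecture described above, the following is a minimal PyTorch sketch, not the authors' implementation: a bi-directional GRU encodes a short temporal window whose three features stand in for the motor's position, speed, and control-input signal, and an additive attention layer (a simplified stand-in for the paper's self-attention modules) pools the time steps before a linear head produces the motion estimate. The window length, hidden size, and attention form are illustrative assumptions.

import torch
import torch.nn as nn

class AttnBiGRURegressor(nn.Module):
    def __init__(self, n_features=3, hidden=64):
        super().__init__()
        # Bi-directional GRU encodes the short-term temporal window.
        self.gru = nn.GRU(n_features, hidden, batch_first=True,
                          bidirectional=True)
        # Additive attention scores each time step of the GRU output.
        self.attn = nn.Linear(2 * hidden, 1)
        # Linear head maps the attention-pooled context to one estimate.
        self.head = nn.Linear(2 * hidden, 1)

    def forward(self, x):                       # x: (batch, time, features)
        h, _ = self.gru(x)                      # h: (batch, time, 2*hidden)
        scores = self.attn(torch.tanh(h))       # (batch, time, 1)
        weights = torch.softmax(scores, dim=1)  # attention weights over time
        context = (weights * h).sum(dim=1)      # (batch, 2*hidden)
        return self.head(context).squeeze(-1)   # (batch,)

# Toy usage on random data: 16 windows of 20 steps with 3 features each.
model = AttnBiGRURegressor()
x = torch.randn(16, 20, 3)                      # hypothetical sensor windows
y = torch.randn(16)                             # hypothetical motion targets
loss = nn.functional.mse_loss(model(x), y)
loss.backward()

Pooling the bi-directional hidden states with learned attention weights, rather than keeping only the final hidden state, lets the regressor emphasize the most informative time steps within each short window, which is the intuition behind the attention modules in the proposed approach.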


Memo:
Biographies: Xu Xinzhou (1987—), male, doctor, associate professor; Wang Zhengyu (corresponding author), male, doctor, associate professor, wangzhengyu_hfut@hfut.edu.cn.
Foundation items: The China Postdoctoral Science Foundation (No. 2022M711693), the National Natural Science Foundation of China (No. 52175221, 61801241, 62071242), and the Natural Science Foundation of Jiangsu Province (No. BK20191381).
Citation: Xu Xinzhou, Chen Yongfa, Liu Guangming, et al. Motion estimation for cable-driven distal end-effectors using attention-based bi-directional gated recurrent neural networks[J]. Journal of Southeast University (English Edition), 2023, 39(2): 187-193. DOI: 10.3969/j.issn.1003-7985.2023.02.010.
Last Update: 2023-06-20