
Wearable gesture control design for unmanned aerial vehicle based on multi-sensor fusion

Published online by Cambridge University Press:  21 January 2025

Guang Liu
Affiliation:
School of Electronic and Information Engineering, Hebei University of Technology, Tianjin, 300401, China; Innovation and Research Institute of Hebei University of Technology (Shijiazhuang), Shijiazhuang, 050299, China
Yang Liu
Affiliation:
School of Electronic and Information Engineering, Hebei University of Technology, Tianjin, 300401, China; Innovation and Research Institute of Hebei University of Technology (Shijiazhuang), Shijiazhuang, 050299, China
Shurui Fan*
Affiliation:
School of Electronic and Information Engineering, Hebei University of Technology, Tianjin, 300401, China; Innovation and Research Institute of Hebei University of Technology (Shijiazhuang), Shijiazhuang, 050299, China
Weijia Cui
Affiliation:
The 54th Research Institute of CETC, Shijiazhuang, 050081, China
Kewen Xia
Affiliation:
School of Electronic and Information Engineering, Hebei University of Technology, Tianjin, 300401, China
Li Wang
Affiliation:
School of Electronic and Information Engineering, Hebei University of Technology, Tianjin, 300401, China
*
Corresponding author: Shurui Fan; Email: [email protected]

Abstract

Traditional bulky and complex control devices such as remote controllers and ground stations cannot meet the need for fast, flexible control of unmanned aerial vehicles (UAVs) in complex environments. This paper therefore designs a data glove based on multi-sensor fusion that accurately recognizes a variety of gestures and converts them into corresponding UAV control commands. First, the wireless data glove fuses flexible fiber-optic sensors and inertial sensors to construct a gesture dataset. Then, trained neural network models are deployed on the STM32 microcontroller-based data glove for real-time gesture recognition: a convolutional neural network with an attention mechanism (CNN-Attention) recognizes static gestures, and a convolutional neural network with a bidirectional long short-term memory network (CNN-Bi-LSTM) recognizes dynamic gestures. Finally, the recognized gestures are converted into control commands and sent to the vehicle terminal to control the UAV. In simulation tests on the simulation platform, the average recognition accuracy reaches 99.7% over 32 static gestures and 99.9% over 13 dynamic gestures, indicating excellent recognition performance. Task tests in a scene constructed in a real environment show that the UAV responds to gestures quickly and that the proposed method achieves stable, real-time control of the UAV on the terminal side.
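The abstract's final step, converting a recognized gesture into a UAV control command, can be sketched as a confidence-gated dispatch table. This is an illustrative reconstruction, not the authors' implementation: the gesture names, command strings, confidence threshold, and the `gesture_to_command` function are all assumptions for the sketch.

```python
from typing import Optional

# Hypothetical mapping from recognized gesture labels to UAV commands.
# The paper maps 32 static and 13 dynamic gestures; only a few
# representative (assumed) entries are shown here.
GESTURE_TO_COMMAND = {
    "fist": "HOVER",
    "palm_up": "ASCEND",
    "palm_down": "DESCEND",
    "swipe_left": "YAW_LEFT",
    "swipe_right": "YAW_RIGHT",
    "point_forward": "MOVE_FORWARD",
}

def gesture_to_command(gesture: str, confidence: float,
                       threshold: float = 0.9) -> Optional[str]:
    """Return a command only when the classifier is confident enough,
    so a spurious recognition does not move the vehicle."""
    if confidence < threshold or gesture not in GESTURE_TO_COMMAND:
        return None  # ignore low-confidence or unknown gestures
    return GESTURE_TO_COMMAND[gesture]
```

Gating on classifier confidence before emitting a command is one simple way to keep a misrecognized gesture from reaching the flight controller; the actual system may use a different rejection strategy.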

Type
Research Article
Copyright
© The Author(s), 2025. Published by Cambridge University Press

