
Pure Vision-Based Motion Tracking for Data-Driven Design – A Simple, Flexible, and Cost-Effective Approach for Capturing Static and Dynamic Interactions

Published online by Cambridge University Press: 26 May 2022

S. H. Johnston*, M. F. Berg, S. W. Eikevåg, D. N. Ege, S. Kohtala and M. Steinert
Affiliation: Norwegian University of Science and Technology, Norway

Abstract


This paper presents an exploratory case study in which video-based pose estimation is used to analyse human motion in support of data-driven design. Two example use cases related to design are provided. Results are compared to ground-truth measurements and show high correlation for the estimated pose, with an RMSE of 65.5 mm. The paper exemplifies how design projects can benefit from a simple, flexible, and cost-effective approach to capturing human-object interactions. This also opens the possibility of implementing interaction and body capture in the earliest stages of design with minimal effort.
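
As an illustration of how such a vision-based capture pipeline can be assembled, the following is a minimal sketch in Python, assuming the freely available MediaPipe Pose estimator and OpenCV for video input; the file name, model settings, and the RMSE helper are illustrative assumptions rather than the authors' exact setup.

import cv2
import mediapipe as mp
import numpy as np

# Illustrative input file; substitute your own recording.
VIDEO_PATH = "interaction_recording.mp4"

mp_pose = mp.solutions.pose
landmarks_per_frame = []

cap = cv2.VideoCapture(VIDEO_PATH)
with mp_pose.Pose(static_image_mode=False, model_complexity=1) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV reads frames as BGR.
        result = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.pose_world_landmarks:
            # 33 body landmarks in approximate metric coordinates,
            # expressed relative to the hip midpoint.
            landmarks_per_frame.append(
                np.array([[lm.x, lm.y, lm.z]
                          for lm in result.pose_world_landmarks.landmark])
            )
cap.release()

# Illustrative accuracy check: Euclidean RMSE between estimated landmark
# positions and ground-truth measurements of the same points (same units).
def rmse(estimated: np.ndarray, ground_truth: np.ndarray) -> float:
    errors = np.linalg.norm(estimated - ground_truth, axis=-1)
    return float(np.sqrt(np.mean(errors ** 2)))

A single consumer camera and a short script of this kind yield per-frame landmark trajectories, which is what makes the approach attractive for early-stage design experiments; absolute accuracy should nevertheless be verified against a reference measurement, as done in the paper.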

Type
Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives licence (http://creativecommons.org/licenses/by-nc-nd/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is unaltered and is properly cited. The written permission of Cambridge University Press must be obtained for commercial re-use or in order to create a derivative work.
Copyright
The Author(s), 2022.
