Journal of Mechanical Engineering, 2025, Vol. 61, Issue (15): 21-39. DOI: 10.3901/JME.2025.15.021
WANG Baicun1,2, SONG Ci1, YUAN Yixiu1, ZHOU Huiying1,3, BAO Jinsong4, HUANG Sihan5, LIU Weiran6, LIU Tingyu7, RUAN Bing8, TAO Fei6, XIE Haibo1,2, YANG Huayong1,2
Received: 2025-03-03
Revised: 2025-05-23
Published: 2025-09-28
WANG Baicun, SONG Ci, YUAN Yixiu, ZHOU Huiying, BAO Jinsong, HUANG Sihan, LIU Weiran, LIU Tingyu, RUAN Bing, TAO Fei, XIE Haibo, YANG Huayong. Research and Application Progress of Human Motion Digital Twin for Human-centric Smart Manufacturing[J]. Journal of Mechanical Engineering, 2025, 61(15): 21-39.
Related articles:

[1] HUANG Sihan, CHEN Jianpeng, XU Zhe, YAN Yan, WANG Guoxin. Human-robot Autonomous Collaboration Method of Smart Manufacturing Systems Based on Large Language Model and Machine Vision[J]. Journal of Mechanical Engineering, 2025, 61(3): 130-141.
[2] LI Jiajia, YI Qian, FENG Yixiong, ZHU Pengxing, YI Shuping. Research on Dual-agent Work Mechanism with Human-smart System Collaboration in Human-centric Smart Manufacturing Cell[J]. Journal of Mechanical Engineering, 2025, 61(3): 105-118.
[3] ZHANG Jie, DING Pengfei, WANG Baicun, ZHANG Peng, LÜ Youlong, WANG Junliang. Human-robot Collaboration for Human-centric Smart Manufacturing: Developmental Evolution, Integration Applications, and Future Perspectives[J]. Journal of Mechanical Engineering, 2025, 61(15): 4-20.
[4] JIANG Zhoumingju, XIONG Yi, WANG Baicun. Human-machine Collaborative Additive Manufacturing for Industry 5.0[J]. Journal of Mechanical Engineering, 2024, 60(3): 238-253.
[5] MA Nanfeng, YAO Xifan, CHEN Feixiang, YU Hongjun, WANG Kesai. Human-centric Smart Manufacturing for Industry 5.0[J]. Journal of Mechanical Engineering, 2022, 58(18): 88-102.
[6] YAO Xifan, MA Nanfeng, ZHANG Cunji, ZHOU Jiajun. Human-centric Smart Manufacturing: Evolution and Outlook[J]. Journal of Mechanical Engineering, 2022, 58(18): 2-15.
[7] YANG Geng, ZHOU Huiying, WANG Baicun. Digital Twin-driven Smart Human-machine Collaboration: Theory, Enabling Technologies and Applications[J]. Journal of Mechanical Engineering, 2022, 58(18): 279-291.
[8] HUANG Sihan, WANG Baicun, ZHANG Meidi, HUANG Jintang, ZHU Qizhang, YANG Geng. Operator 4.0 Towards Human-centric Smart Manufacturing: Framework, Enabling Technologies and Typical Scenarios[J]. Journal of Mechanical Engineering, 2022, 58(18): 251-264.