High-precision Motion Capture System Based on 3D Space Animation Scene Construction Based on Optical Fiber Communication Network
Abstract
Motion capture technology has advanced rapidly in recent years, driven by progress in sensor technology, inertial navigation, and the hardware used for measurement and computation. With the advancement of communication technology, optical fiber networks have become the most widely used communication medium and compete strongly with other forms of communication. Consequently, combining a motion capture system with an optical fiber communication network offers several potential applications. For instance, optical motion capture can be combined with virtual reality technology to produce highly realistic character animation in large-scale 3D games, and human motion capture is widely used in the film and television animation industry to make characters more lifelike. In this research, a high-precision motion capture system based on three-dimensional space animation scene construction using an optical fiber communication network is proposed. The system was built using optical fiber communication network and motion capture technologies, and an experiment was conducted to evaluate its capture accuracy. The experimental findings indicate that without the high-speed optical fiber time-domain component, the average motion capture accuracy was 46.7%. Adding the high-speed optical fiber time-domain component improved the average capture accuracy by 2.5% and improved the capture performance at each joint point.
Keywords: motion capture system, optical fiber communication network, three-dimensional space, animation scene
Cite As
J. Li, "High-precision Motion Capture System Based on 3D Space Animation Scene Construction Based on Optical Fiber Communication Network", Engineering Intelligent Systems, vol. 33, no. 5, pp. 545-553, 2025.