Eye Tracking and Gesture Recognition System for Intelligent Human-Computer Interaction
Abstract
To address the reduced accuracy of eye tracking and gesture recognition systems under varying lighting conditions, user postures, and hand occlusions, this study constructs an intelligent human-computer interaction system aimed at enhancing the interaction experience in smart home scenarios. The system captures eye movement images using infrared light sources and cameras, extracts iris edges with Canny edge detection, and tracks changes in the user's posture with the Lucas-Kanade optical flow algorithm to improve the accuracy of eye tracking. Multi-scale convolutional neural network (CNN) layers are designed, and a self-attention mechanism is applied to enhance the precision of hand feature extraction. Long short-term memory (LSTM) networks are used to improve the accuracy of dynamic gesture recognition and achieve a more natural and intuitive interaction mode. The experimental results show that eye tracking accuracy reaches 95% at a light intensity of 1000 lux, while gesture recognition accuracy is lowest at 89% among all tested light intensities. Under partial occlusion, the highest gesture recognition success rate is 95%. With added interferences such as head movement and high-frequency blinking, the eye tracking success rates are 95% and 94%, respectively. These results demonstrate the system's efficiency under good lighting conditions and its robustness to complex gestures and occlusion, confirming the accuracy of the optimized system for smart home control.
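The Lucas-Kanade step mentioned above estimates local motion by solving a small least-squares problem over image gradients in a window. The following is a minimal single-point NumPy sketch of that idea, not the paper's implementation (which operates on infrared eye images); the window size and synthetic inputs are illustrative assumptions.

```python
import numpy as np

def lucas_kanade(prev, curr, y, x, win=7):
    """Estimate the optical flow (vx, vy) at pixel (y, x).

    Solves A v = b in the least-squares sense, where A stacks the
    spatial gradients [Ix, Iy] over a win x win window and b holds
    the negative temporal gradient -It over the same window.
    """
    h = win // 2
    # central-difference spatial gradients of the previous frame
    Iy, Ix = np.gradient(prev.astype(float))
    It = curr.astype(float) - prev.astype(float)
    sl = (slice(y - h, y + h + 1), slice(x - h, x + h + 1))
    A = np.stack([Ix[sl].ravel(), Iy[sl].ravel()], axis=1)
    b = -It[sl].ravel()
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v  # (vx, vy)

# Synthetic check: a Gaussian blob shifted one pixel to the right
yy, xx = np.mgrid[0:32, 0:32]
prev = np.exp(-((yy - 16) ** 2 + (xx - 16) ** 2) / (2 * 3.0 ** 2))
curr = np.exp(-((yy - 16) ** 2 + (xx - 17) ** 2) / (2 * 3.0 ** 2))
vx, vy = lucas_kanade(prev, curr, 16, 16)
```

For this synthetic pair the recovered flow is close to (1, 0), matching the one-pixel horizontal shift; a practical tracker would apply this per feature point on a pyramid of image scales.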
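The self-attention mechanism applied to the multi-scale CNN features can be illustrated with single-head scaled dot-product attention. This NumPy sketch uses illustrative shapes and randomly initialized projection matrices, not the paper's architecture; it shows how each position in a feature sequence is rewritten as a weighted mix of all positions.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention.

    X: (n, d) sequence of feature vectors (e.g. hand-region features).
    Wq, Wk, Wv: (d, d_k) projection matrices for queries, keys, values.
    Returns an (n, d_k) sequence where each output mixes all positions,
    so informative regions can compensate for occluded ones.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])
    # numerically stable row-wise softmax over attention scores
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return w @ V

# Illustrative usage with random features and projections
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))          # 5 spatial positions, 8-dim features
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
Y = self_attention(X, Wq, Wk, Wv)    # shape (5, 8)
```

In the full system these attended features would feed the LSTM for dynamic gesture recognition; here only the attention step is sketched.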
Keywords: Intelligent Human-Computer Interaction, Eye Tracking, Gesture Recognition, Convolutional Neural Networks, Long Short-Term Memory
Cite As
K. Wang, W. Zhu, "Eye Tracking and Gesture Recognition System for Intelligent Human-Computer Interaction", Engineering Intelligent Systems, vol. 33, no. 4, pp. 385-395, 2025.