Algorithmic Implementation of Visually Guided Interceptive Actions: Enhancing Motion Perception in Virtual and Augmented Reality Systems
Wangdo Kim and Eunice Ortiz

Abstract
This research introduces an algorithmic framework designed to enhance motion perception and visually guided interceptive actions in virtual and augmented reality (VR/AR) environments. By applying harmonic ratios and stimulation invariants, the proposed algorithms enable real-time prediction of interception points and improve the responsiveness of VR/AR systems. This methodology translates theories of visual perception and motion into practical algorithmic solutions, providing the dynamic prediction capabilities critical for applications such as online gaming, virtual simulation, and neurorehabilitation. Our findings underscore the potential of these algorithms to advance interactive systems, delivering more precise and adaptive motion tracking and control in immersive environments and thereby enhancing the realism and efficiency of VR/AR applications in media, gaming, and other collaborative digital settings.
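The abstract does not specify the algorithms themselves, but the idea of predicting an interception point from an optical invariant can be illustrated with a standard example: Lee's tau, the ratio of an approaching object's optical angle to its rate of expansion, which yields time-to-contact without knowing the object's size or distance. The sketch below is purely illustrative (the sizes, speeds, and function names are assumptions, not taken from the paper):

```python
def time_to_contact(theta: float, theta_dot: float) -> float:
    """Estimate time-to-contact (tau) from the optical angle subtended
    by an approaching object and its rate of expansion: tau = theta / theta_dot.
    This requires no knowledge of the object's physical size or distance."""
    if theta_dot <= 0:
        raise ValueError("object must be looming (theta_dot > 0)")
    return theta / theta_dot

# Illustrative simulation: an object of size s approaches from distance z0
# at constant closing speed v; the observer only measures optical angles.
s, z0, v, dt = 0.22, 10.0, 5.0, 0.01      # metres, m/s, seconds (assumed values)
z1 = z0 - v * dt                          # distance one frame later
theta0, theta1 = s / z0, s / z1           # small-angle approximation of optical size
theta_dot = (theta1 - theta0) / dt        # finite-difference expansion rate

tau = time_to_contact(theta1, theta_dot)  # optical estimate, approx. z0 / v = 2.0 s
```

Here tau recovers the true time-to-contact (z0 / v) from optical quantities alone, which is the sense in which an invariant of stimulation can drive real-time interception prediction in a VR/AR pipeline.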