Sensor-Based and VR-Assisted Visual Training Enhances Visuomotor Reaction Metrics in Youth Handball Players

Abstract

Background: Sensor-based systems and virtual reality (VR) technologies provide new opportunities for the objective, technology-driven assessment and training of visuomotor performance in applied contexts such as sport. Methods: This study examined the effects of an integrated visual training program combining stroboscopic stimulation, VR-based vergence exercises, and instrumented reaction-light tasks in adolescent handball players. Twenty-eight youth athletes completed two baseline assessments separated by six weeks, followed by a six-session training program integrated into regular team practice. Sensor-derived outcome measures included dynamic accommodative performance, simple and choice visual reaction times, peripheral-field response metrics, binocular alignment, stereoscopic depth perception, and basic oculomotor function. Results: Compared with both baseline measurements, the intervention produced selective improvements in accommodative facility, particularly near-far focusing speed, and in multiple reaction-time conditions involving manual and decision-based responses. Specific peripheral-field locations showed increased response scores, whereas binocular alignment, AC/A ratio, near phoria, and stereoscopic acuity remained unchanged. Conclusions: These findings indicate that technology-supported visual training protocols incorporating sensor-based reaction systems and VR stimuli can be associated with measurable adaptations in dynamic visuomotor processing while preserving fundamental binocular vision parameters.