Author First name, Last name, Institution

Ivan Camponogara, Zayed University

Document Type

Article

Source of Publication

Scientific Reports

Publication Date

12-1-2025

Abstract

Everyday actions often involve reaching for targets sensed through audition and proprioception (e.g., reaching for a ringing smartphone in the dark, or tapping it while holding it in the other hand). However, it remains unclear whether reaching performance toward auditory and proprioceptive targets is modality-specific and whether performance improves under multisensory compared to unisensory conditions. Here, we addressed these questions by measuring reaching performance toward auditory, proprioceptive, and combined audio-proprioceptive targets along the azimuth and depth dimensions. Accuracy was similar along the azimuth dimension, whereas precision was generally lower for auditory targets than for proprioceptive and audio-proprioceptive targets. A second experiment investigated whether providing additional proprioceptive information while reaching for an auditory target could improve precision in subsequent auditory-only trials. A slight general improvement was observed, indicating that proprioceptive cues may help reduce spatial variability in auditory-guided actions, though not to the level seen in the multisensory condition. Overall, the results suggest that while the target modality slightly affects movement accuracy, it has a significant impact on movement precision, with proprioceptive input playing a crucial role in enhancing precision. The concurrent availability of auditory and proprioceptive target information does not enhance precision beyond that achieved with proprioceptive information alone, whereas proprioception can modestly improve subsequent auditory-guided reaching.

ISSN

2045-2322

Publisher

Springer Science and Business Media LLC

Volume

15

Issue

1

Disciplines

Medicine and Health Sciences

Keywords

Audio-Proprioceptive, Auditory, Multisensory, Proprioceptive, Reaching

Scopus ID

105026271616

Creative Commons License

Creative Commons Attribution 4.0 International License

Indexed in Scopus

yes

Open Access

yes

Open Access Type

Gold: This publication is openly available in an open access journal/series
