Document Type

Article

Source of Publication

Experimental Brain Research

Publication Date

4-1-2026

Abstract

Object categorisation is a fundamental cognitive process involving the integration of information across the senses. Using smartphones, we investigated whether visual and tactile motion cues could enhance object category learning and generalisation to novel object shapes. Two categories of similar shapes were associated with specific correlated visual and tactile vibration motion cues. After learning the object categories, participants were assessed on categorisation of learned and novel objects across four cue conditions: shape-only, shape-visual motion, shape-tactile motion, and shape-visual and tactile motion. We also assessed whether accuracy was influenced by blocked versus interleaved cue conditions at test. In Experiment 1, we found more accurate categorisation and generalisation when all cues were available at test. In Experiment 2, we replicated this effect even when the reliability of the shape-only cue for predicting category membership was reduced. In Experiment 3, we found that the absence of motion cues during learning removed the benefit of motion cues at test. Overall, our findings suggest that multisensory motion cues benefit the formation of novel object categories and allow for better generalisation. The results have implications for our understanding of the underlying dynamic and multisensory nature of object categories and the predictive role of multisensory features in category formation.

ISSN

0014-4819

Publisher

Springer Science and Business Media LLC

Volume

244

Issue

4

Disciplines

Social and Behavioral Sciences

Keywords

Multisensory perception, Object categories, Object motion, Online testing, Tactile perception

Scopus ID

105030570132

Creative Commons License

Creative Commons Attribution 4.0 International License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Indexed in Scopus

yes

Open Access

yes

Open Access Type

Hybrid: This publication is openly available in a subscription-based journal/series
