Data from: Audiovisual synchrony perception in observing human motion to music
Published Aug 30, 2019 on Dryad.
Cite this dataset
Takehana, Akira; Uehara, Tsukasa; Sakaguchi, Yutaka (2019). Data from: Audiovisual synchrony perception in observing human motion to music [Dataset]. Dryad. https://doi.org/10.5061/dryad.sq25h1q
To examine how individuals perceive synchrony between music and body motion, we investigated the characteristics of synchrony perception during observation of a Japanese Radio Calisthenics routine. Using the method of constant stimuli, we presented video clips of an individual performing an exercise routine, generated stimuli with a range of temporal shifts between the visual and auditory streams, and asked participants to make synchrony judgments. We then examined which movement-feature points agreed with music beats when the participants perceived synchrony. We found that extremities (e.g., hands and feet) reached the movement endpoint or passed through the lowest position at music beats associated with synchrony; movement onsets never agreed with music beats.

To investigate whether visual information about the feature points was necessary for synchrony perception, we conducted a second experiment in which only limited portions of the video clips were presented to the participants. Participants consistently judged synchrony even when the video image did not contain the critical feature points, suggesting that a prediction mechanism contributes to synchrony perception.

To clarify the meaning of these feature points with respect to synchrony perception, we examined the temporal relationship between the motion of body parts and the ground reaction force (GRF) of exercise performers, which reflects the total force acting on the performer. Interestingly, vertical GRF showed local peaks consistently synchronized with music beats for most exercises, with timing closely correlated with that of the movement feature points. This result suggests that synchrony perception in humans is based on some global variable anticipated from visual information, rather than on the feature points found in the motion of individual body parts.
In summary, the present results indicate that synchrony perception during observation of human motion to music depends largely on spatiotemporal prediction of the performer’s motion.
Data of Experiment 1
Point of subjective simultaneity (PSS) and temporal integration window (TIW) data from individual participants for 6 exercises. Units are milliseconds.
Data of Experiment 2
PSS and TIW data from individual participants in the on-beat and pre-beat conditions. Units are milliseconds.
Data of Experiment 3
Temporal differences between music beats and vertical GRF (vGRF) peaks, and between music beats and the relevant movement feature points. Data are shown for all 16 beats of 5 exercises (5 trials × 3 participants). Units are seconds.
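The PSS and TIW files hold one value per participant and exercise, in milliseconds. As a minimal sketch of how such per-participant values might be summarized across participants, the snippet below computes the mean and between-participant standard deviation; the numbers are purely illustrative placeholders, not values from this dataset.

```python
# Minimal sketch: summarizing hypothetical per-participant PSS values (ms)
# for one exercise. The values below are illustrative, NOT from the dataset.
from statistics import mean, stdev

# One entry per participant; by convention, a negative PSS would mean the
# audio stream had to lead the video for synchrony to be perceived.
pss_ms = [-40.0, -10.0, 25.0, 5.0, -20.0]

print(f"mean PSS: {mean(pss_ms):.1f} ms")  # central tendency across participants
print(f"SD:       {stdev(pss_ms):.1f} ms")  # between-participant variability
```

The same pattern applies to the TIW columns, or to the Experiment 3 differences after converting seconds to milliseconds.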
Japan Society for the Promotion of Science (JSPS), Award: KAKENHI Grant Number 18K19823.