Monkeys have rhythm: Macaque and human tapping responses to musical excerpts
Data files (Jan 21, 2026 version, 1.20 MB total):
- data_monkeys_have_rhythm.mat (1.19 MB)
- README.md (4.43 KB)
Abstract
Synchronizing movements to music is one of the hallmarks of human culture whose evolutionary and neurobiological origins remain unknown. The ability to synchronize movements requires 1) detecting a steady rhythmic pulse, or beat, out of a stream of complex sounds, 2) projecting this rhythmic pattern forward in time to predict future input, and 3) timing motor commands in anticipation of predicted future beats. Here, we demonstrate that the macaque is capable of synchronizing taps to a subjective beat in real music, and even spontaneously chooses to do so over alternative strategies. This contradicts the influential “vocal learning hypothesis” that musical beat synchronization is only possible in species with complex vocalizations such as humans and some songbirds. We propose an alternative view of musical beat perception and synchronization ability as a continuum onto which a wider range of species can be mapped depending on their ability to perform and coordinate the general abilities listed above through association with reward.
Dataset DOI: 10.5061/dryad.sf7m0cgjr
Description of the data and file structure
The data provided here reproduce the results reported in ‘Monkeys have rhythm’ by Vani G. Rajendran, Luis Prado, Juan Pablo Márquez, and Hugo Merchant from the National Autonomous University of Mexico.
The data consist of tap times recorded from two monkeys (M1 & M2) and 18 human subjects (H01-H18) who performed three main experiments:
- Experiment 1: Music
- Experiment 2: Scrambled Music
- Experiment 3: Free Tapping
Additionally, the data file includes supplementary data collected from one monkey (M2) comprising:
- Extended tapping to music (Songs 1 & 3 only): 7 taps
- Free tapping to 6 new songs
Files and variables
File: data_monkeys_have_rhythm.mat
This single .mat file contains four datatables.
Variables
- m_datatable: Monkey datatable; contains all data used to create the manuscript’s main figures
- h_datatable: Human datatable; contains the data used to make Supp. Fig. S4, showing human data for the 3 main experiments
- m_suppdata_7taps: Supplementary monkey datatable for Songs 1 & 3, 7 taps (M2 only); used to create Supp. Fig. S6
- m_suppdata_freetapping_newsongs_6taps: Free tapping, new songs, 6 taps (M2 only); used to create Supp. Fig. S7
Each row of the datatables is a trial.
Datatable column definitions:
.timestamp: timestamp of experiment
.justdate: date of the experiment, in yyyymmdd format
.fn: filename of raw data files (available if desired, but datatables provided for convenience)
.exptname and .stimset describe the stimuli used and are the following:
- Monkey data table (m_datatable)
  - Experiment 1: Music (exptname: '01,02,05', stimset: 'mirex' & 'mirex_p' for 0 and pi data, respectively)
  - Experiment 2: Scrambled Music (exptname: '01,02,05', stimset: 'mirexQ' & 'mirexQ_p' for 0 and pi data, respectively)
  - Experiment 3: Free Tapping (exptname: 'libre', stimset: 'backstreet' & 'backstreet_pi' for 0 and pi data, respectively)
- Human data table (h_datatable)
  - Experiment 1: Music (exptname: '01,02,05', stimset: 'mirex' & 'mirex_p' for 0 and pi data, respectively)
  - Experiment 2: Scrambled Music (exptname: '01,02,05', stimset: 'Q' & 'Q_p' for 0 and pi data, respectively)
  - Experiment 3: Free Tapping (exptname: 'libre', stimset: 'backstreet' & 'backstreet_pi' for 0 and pi data, respectively)
- Supplementary monkey data tables:
  - Songs 1 & 3, 7 taps (M2 only) (exptname: 'music(2 songs(0.465, 0.882))' & 'music_pi(2 songs(0.465, 0.882))' for 0 and pi data, respectively; stimset: '7taps')
  - Free tapping, new songs, 6 taps (M2 only) (exptname: 'Extra1', 'Extra2', 'Extra3', 'Extra4', referring to the pairs of songs presented; stimset: '6taps')
.subject: ‘Gil’ or ‘Tomas’ (M1 & M2, respectively), or H01-H18 (human data)
.interval: correct target tempo of the song/stimulus, in seconds
.P: number of intervals that make up the “perception” phase. For intervals <800 ms, the GO signal was presented at approximately P × interval + 0.975 seconds after the start of the trial; for intervals >800 ms, at approximately P × interval + 0.2 seconds. Only taps occurring after the estimated GO signal were used in the analysis.
.S: number of taps required for a trial to be accepted as correct
.taps: raw tap timestamps; subtract this trial’s .stimstart from these values to get tap times in seconds relative to the start of the trial
.stimstart: timestamp of trial start, i.e., time “0”
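The .P, .taps, and .stimstart definitions above can be sketched in code. Below is a minimal Python illustration (the dataset’s own analysis used MATLAB; function names here are illustrative and not from the authors’ scripts, and the behavior at exactly 800 ms is an assumption, since the README only specifies <800 ms and >800 ms):

```python
def estimated_go_time(p, interval):
    """Estimate the GO-signal time in seconds, relative to trial start.

    Per the README: for intervals < 0.8 s the GO signal occurred at
    roughly P * interval + 0.975 s; for longer intervals, at roughly
    P * interval + 0.2 s. (The boundary case of exactly 0.8 s is not
    specified in the README; here it falls in the second branch.)
    """
    offset = 0.975 if interval < 0.8 else 0.2
    return p * interval + offset


def taps_relative_to_trial(taps, stimstart, go_time):
    """Convert raw tap timestamps to trial-relative times and keep only
    taps after the estimated GO signal, as described in the README."""
    relative = [t - stimstart for t in taps]
    return [t for t in relative if t > go_time]
```

For example, for a stimulus with a 0.5 s target interval and P = 3, the estimated GO signal falls at about 2.475 s after trial start, and only taps later than that would enter the analysis.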
Code/software
The data were analyzed in MATLAB R2022a using custom scripts.
Note:
If you have any further questions, feel free to contact vani.g.rajendran@gmail.com or hugomerchant@unam.mx.
Enjoy!
May 26, 2025
Human subjects data
Participants granted their explicit consent to publish the de-identified data in the public domain. Data were de-identified by assigning participants an anonymous ID (H01, H02, ...).
