The microstructure of intra- and interpersonal coordination
Data files
Nov 15, 2023 version (71.41 MB)
- Couple1.mat
- Couple10.mat
- Couple2.mat
- Couple3.mat
- Couple4.mat
- Couple5.mat
- Couple6.mat
- Couple7.mat
- Couple8.mat
- Couple9.mat
- README.md
- Subj1.mat
- Subj10.mat
- Subj11.mat
- Subj12.mat
- Subj13.mat
- Subj2.mat
- Subj3.mat
- Subj4.mat
- Subj5.mat
- Subj6.mat
- Subj7.mat
- Subj8.mat
- Subj9.mat
Abstract
Movements are naturally composed of submovements - i.e., recurrent speed pulses (2-3 Hz) - possibly reflecting intermittent feedback-based motor adjustments. In (unimanual) movement synchronization tasks, submovements produced by interacting partners alternate in time, indicating mutual coregulation based on visual information exchange. However, it is unclear whether submovement coordination is organized differently between individuals and within individuals (between effectors). Indeed, different types of sensory feedback (proprioceptive, visual) can be variably exploited for intrapersonal and interpersonal coordination. In a series of bimanual tasks performed alone or in pairs, we show distinct coordinative structures emerging at the submovement level. Specifically, the relative timing of submovements (between partners/effectors) shifts from alternation to simultaneity and a mixture of the two when using only visual information (interpersonal), only proprioceptive information (intrapersonal, without vision), or both types of information (intrapersonal, full feedback), respectively. These results suggest that submovement coordination represents a behavioral proxy for the adaptive weighting of different sources of information within action-perception loops depending on the sensory feedback that is most relevant and/or available for motor coordination. In sum, the microstructure of movement reveals common principles governing the dynamics of low-level sensorimotor control to achieve both intra- and interpersonal coordination.
README: The microstructure of intra- and interpersonal coordination
https://doi.org/10.5061/dryad.z8w9ghxk7
For all tasks (dyadic/solo) and conditions, data were collected in separate trials during which participants performed the task continuously for 2.5 min (45000 samples). Participants repeated two trials (of 2.5 min each) for each task and experimental condition, with short breaks in between. Participants were instructed to move at a reference pace of 0.25 Hz (flexion/extension movement duration: 2 s; whole flexion-extension cycle: 4 s); they internalized this pace by listening to a metronome prior to the experiment. The order of experimental conditions for both tasks was randomized.
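As a quick sanity check (a minimal sketch, not part of the dataset), the trial duration, sampling rate, and instructed pace reported above are mutually consistent:

```python
# Assumed values, taken directly from the README text above.
FS = 300          # sampling rate (Hz)
TRIAL_MIN = 2.5   # continuous performance per trial (minutes)
PACE_HZ = 0.25    # instructed movement pace (Hz)

n_samples = int(FS * TRIAL_MIN * 60)   # usable samples per trial
cycle_s = 1 / PACE_HZ                  # one full flexion-extension cycle (s)
n_cycles = TRIAL_MIN * 60 / cycle_s    # movement cycles expected per trial

print(n_samples)  # 45000
print(cycle_s)    # 4.0 (2 s flexion + 2 s extension)
print(n_cycles)   # 37.5
```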
DYADIC TASK
Pairs of participants were seated at a table, facing each other on opposite sides of a panel that prevented them from seeing each other’s faces. They were asked to rest the ulnar sides of their right and left forearms on the table and to keep their hands in a closed-fist position, with both their right and left index fingers pointing straight toward their partner’s index fingers.
Participants were instructed to perform rhythmic flexion-extension movements of both index fingers together, as synchronized as possible, either in the same direction (Dyadic INPHASE) or in opposite directions (Dyadic ANTIPHASE). In addition, participants were asked to perform, alone, rhythmic flexion-extension movements of the right and left index fingers, as synchronized as possible in the same direction (in-phase), while seeing their own hands (INPHASE vision). In the latter case, participants were seated alone in a posture similar to that described for the dyadic conditions, but with their own left and right index fingers pointing toward each other.
Data for the DYADIC TASK are stored in .mat files named Couple#.mat.
Each .mat file contains a data structure named extracted_data with the following fields:
- Pos_Raw: this field contains the raw position data of the subjects’ right and left hand movements recorded (sampling rate: 300 Hz) during a single trial (45600 samples; note that the first and last 300 samples must be excluded). Please note that:
  1. For the Dyadic conditions, the first two rows correspond to the right (first row) and left (second row) hand movements of one subject of the couple (by convention “Subj A”), while the last two rows correspond to the right (third row) and left (fourth row) hand movements of the other subject (by convention “Subj B”).
  2. For the INPHASE vision condition, the two rows correspond to the right (first row) and left (second row) hand movements of a single subject (by convention “Subj A” or “Subj B”).
- Condition:
- Dyadic INPHASE
- Dyadic ANTIPHASE
- INPHASE vision
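A minimal Python sketch of how one might unpack Pos_Raw for a dyadic trial. A synthetic array stands in for the contents of a Couple#.mat file; with real data, the structure would first be read with a .mat loader such as scipy.io.loadmat (not shown here):

```python
import numpy as np

FS = 300       # sampling rate (Hz), per the README
N_RAW = 45600  # raw samples per trial
EDGE = 300     # samples to discard at each end, per the note above

# Synthetic stand-in for extracted_data.Pos_Raw of a dyadic trial (4 rows).
pos_raw = np.zeros((4, N_RAW))

# Exclude the first and last 300 samples.
pos = pos_raw[:, EDGE:N_RAW - EDGE]

# Row convention for the dyadic conditions:
subj_a_right, subj_a_left = pos[0], pos[1]  # Subj A: right, left hand
subj_b_right, subj_b_left = pos[2], pos[3]  # Subj B: right, left hand

print(pos.shape)           # (4, 45000)
print(pos.shape[1] / FS)   # 150.0 s, i.e., 2.5 min of usable data
```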
SOLO TASK
Participants were seated alone at a table and were asked to rest the ulnar sides of their right and left forearms on the table, clench their hands into fists, and point their own left and right index fingers toward each other. Participants were asked to perform rhythmic flexion-extension movements, synchronizing their fingers as much as possible either in-phase or anti-phase, while keeping (in separate trials) their eyes open (INPHASE/ANTIPHASE vision) or closed (INPHASE/ANTIPHASE no vision).
Data for the SOLO TASK are stored in .mat files named Subj#.mat.
Each .mat file contains a data structure named extracted_data with the following fields:
- Pos_Raw: this field contains the raw position data of the subject’s right (first row) and left (second row) hand movements recorded (sampling rate: 300 Hz) during a single trial (45600 samples; note that the first and last 300 samples must be excluded).
- Condition:
- INPHASE vision
- INPHASE no vision
- ANTIPHASE vision
- ANTIPHASE no vision
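Analogously, for a solo trial Pos_Raw has two rows. The sketch below (again with a synthetic array standing in for a loaded Subj#.mat file) trims the edges and derives a speed time series, since submovements appear as recurrent pulses in movement speed:

```python
import numpy as np

FS = 300       # sampling rate (Hz), per the README
N_RAW = 45600  # raw samples per trial
EDGE = 300     # samples to discard at each end

# Synthetic stand-in for extracted_data.Pos_Raw of a solo trial (2 rows).
pos_raw = np.zeros((2, N_RAW))

pos = pos_raw[:, EDGE:N_RAW - EDGE]      # drop first/last 300 samples
right_hand, left_hand = pos[0], pos[1]   # row convention for the solo task

# Speed time series via finite differences (position units per second);
# submovements would show up here as 2-3 Hz pulses.
speed = np.abs(np.diff(pos, axis=1)) * FS

print(pos.shape)    # (2, 45000)
print(speed.shape)  # (2, 44999)
```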