Data and code from: Motor origins of timbre in piano performance

Data files (Sep 10, 2025 version, 32.54 KB total):
- Code_Data.zip (29.36 KB)
- README.md (3.18 KB)
Abstract
Creativity in the arts, such as painting and musical performance, hinges on the ability to produce a wide spectrum of perceptual experiences. In music, it has long been believed that the timbre of tones can be altered by nuanced movements of performers. Previous studies have described relationships between fundamental elements of auditory perceptions (e.g., loudness, tempo) and physical movements (e.g., force, speed), but it remains unknown whether and how delicate features of perceptual experiences, such as tone timbre, are manipulated through dexterous motor skills. Here, we bridge this gap using a two-fold experimental approach. First, our listening test revealed that the timbral qualities pianists intended to express in piano playing were perceived as intended by both pianists and musically untrained individuals, with pianists showing a greater perceptual sensitivity to different timbres. Second, through a motor behavioral experiment using a non-contact, high-resolution sensing system, we identified five specific movement features in piano touch that were intricately linked to three categories of perceived timbre: weight, clarity, and brightness. Furthermore, the direct manipulation of a specific key movement feature resulted in systematic changes in perceived timbre, providing evidence for a causal relationship. The result indicates that pianists share common motor skills to modify perceived tone timbre by manipulating specific movement features. Our findings underscore the pivotal roles of subtle physical gestures in creating the rich timbral palette of piano tones, advancing our understanding of the intersection between motor control and artistic expression.
Dataset DOI: 10.5061/dryad.95x69p8z3
Description of the data and file structure
The datasets were collected as part of two experimental studies investigating the relationship between piano performance and perceptual experiences.
Study 1: Pianists played a short excerpt from No. 1 of Hanon's "The Virtuoso Pianist in 60 Exercises" under multiple expression conditions. For each performance, features of the key movements were recorded, and evaluation scores were obtained from the listening experiment. The data (data_study1.mat) link performers, conditions, expressed concepts, movement features, and perceptual ratings.
Study 2: Participants completed a discrimination task in which they compared two sound sources (LargeAcc vs. SmallAcc). The dataset contains trial-level information, including whether SmallAcc was presented first, and whether the participant judged the first-presented sound as “lighter/clearer.” Associated JSON files store the measured key position data corresponding to the stimuli.
Files and variables
File: Code_Data.zip
Description:
create_figure_4, create_figure_5, and create_figure_6: These functions generate Figures 4, 5, and 6, respectively, using the experimental data from Study 1, stored in data_study1.mat.
data_study1.mat: T is a table that stores, for each performer and each perceptual concept under each condition:
- values of the keyboard movement features (vel_escapement, acc_escapement, vel_mute, acc_mute, onset_noise, bottom_noise, offset_noise, overlap, IKI, d_vel_escapement, d_escapement_timing, and d_overlap)
- Score: the average evaluation score for that performance
- Group: the evaluator's group label
- ID: performer ID
- Condition: performance condition (small: 0, large: 1, mechanical: 2)
- Concept: which concept was performed
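A minimal MATLAB sketch of how the table might be inspected, assuming data_study1.mat is on the path and T uses the variable names listed above (the grouping call is illustrative, not part of the provided scripts):

```matlab
% Load the Study 1 table (assumed to be stored as variable T).
S = load('data_study1.mat');
T = S.T;

% Example: mean evaluation score per performance condition
% (small: 0, large: 1, mechanical: 2).
meanScore = groupsummary(T, 'Condition', 'mean', 'Score');
disp(meanScore)

% Example: one movement feature for the "large" condition only.
rows = T.Condition == 1;
disp(T.vel_escapement(rows))
```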
create_figure_7: This function generates Figure 7 using the experimental data from Study 2, which are stored in data_study2.mat, core_output_hackkey_note_largeACC.json, and core_output_hackkey_note_smallACC.json.
data_study2.mat: T is a table containing:
- ID: participant ID
- Concept: which concept condition was performed
- which: whether the SmallAcc sound source was presented first or second within the trial (0: first, 1: second)
- answer: whether the participant judged the first-presented sound as "lighter/clearer" (1 if the first sound was chosen, 0 if the second)
- correct: whether the participant correctly judged SmallAcc as "lighter/clearer" (true/false)
JSON files: The files core_output_hackkey_note_largeACC.json and core_output_hackkey_note_smallACC.json contain the measured key position data.
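The JSON files can be read directly in MATLAB; a minimal sketch (the internal layout of the decoded key position data is not documented here, so inspect the result before using it):

```matlab
% Decode the measured key position data for the LargeAcc stimulus.
raw = fileread('core_output_hackkey_note_largeACC.json');
keyData = jsondecode(raw);   % struct or array mirroring the JSON layout

% Inspect the decoded structure to find the key position fields.
disp(keyData)
```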
Code/software
MATLAB is required to run create_figure_4, 5, 6, and 7; the scripts were created using MATLAB R2024b.
File: error_ellipse.zip
Tool for drawing ellipsoids (used in Figure 4).
File: hex2rgb.zip
Tool for converting a hexadecimal color code (HEX) to its corresponding RGB values.
File: povilaskarvelis-DataViz-3.2.7.0.zip
Tool for creating box plots with individual data points shown as dots.
