Data from: The effect of social environment on bird song: listener-specific expression of a sexual signal

Cite this dataset

Jablonszky, Mónika et al. (2020). Data from: The effect of social environment on bird song: listener-specific expression of a sexual signal [Dataset]. Dryad.


Animal signals should consistently differ among individuals to convey distinguishable information about the signalers. However, behavioral display signals, such as bird song, also carry considerable within-individual variance with mostly unknown function. We hypothesized that the immediate social environment may play a role in mediating such a variance component, and investigated in the collared flycatcher (Ficedula albicollis) whether the identity and quality of listeners could affect song production in signalers. After presenting territorial males with either a female or a male social stimulus, we found in the subsequent song recordings that the among-stimulus effects corresponded to non-zero variance components in several acoustic traits, indicating that singing males are able to plastically adjust their songs according to stimulus identity. Male and female stimuli elicited different responses: the identity of the female stimuli affected song complexity only, while the identity of the male stimuli also altered song length, maximum frequency and song rate. The stimulus-specific effect on song in some cases decreased with time, being particularly detectable right after the removal of the stimulus and ceasing later, but this pattern varied across the sex of the stimulus and the song traits. We were able to identify factors that can explain the among-stimulus effects (e.g. size and quality of the stimuli), with roles that also varied among song traits. Our results confirm that a variable social environment can generate considerable variation in song performance, highlighting that within-individual plasticity of bird song can play important roles in sexual signaling.


After presenting territorial collared flycatcher males with either a female or a male social stimulus, we recorded their song and extracted several song traits from the recordings to investigate whether song is influenced by the immediate social environment. We generally calculated song traits from 20 songs, but we also repeated the analysis with song traits calculated for consecutive bins of 5 songs in order to investigate the temporal change of the effect of the social stimuli.

We provide four tables: two containing song variables calculated from 20 songs per recording (also including, among others, morphological data for the listener birds), and two with song variables calculated for the subsequent bins of 5 songs. In each pair, the two tables correspond to the experiments using male or female listener birds.
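The 5-song binning above can be sketched as follows. This is a minimal illustration of how 20 songs in their original order within a recording break into four consecutive bins of 5; the list contents are placeholder song indices, not values from the dataset.

```python
# Songs 1..20 in their original order within one recording (illustrative indices only).
songs_in_recording = list(range(1, 21))

# Split into consecutive, non-overlapping bins of 5 songs each.
bins = [songs_in_recording[i:i + 5] for i in range(0, len(songs_in_recording), 5)]

for order, bin_songs in enumerate(bins, start=1):
    # 'order' here corresponds to the bin's position within the recording
    # (1 = songs 1-5, 2 = songs 6-10, and so on).
    print(order, bin_songs)
```

Song traits were then calculated once per bin, giving four values per recording instead of one.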

Usage notes

Citation: Mónika Jablonszky, Sándor Zsebők, Miklós Laczi, Gergely Nagy, Éva Vaskuti, László Zsolt Garamszegi (2020): The effect of social environment on bird song: listener-specific expression of a sexual signal. Behavioral Ecology, accepted manuscript
Contact information:
Methods of analysis: R statistical environment, linear mixed models (lme4 package)
Description of files: 4 tables, containing information from different subsets of the data (recordings after a male or a female listener; song traits calculated for 20 songs or for bins of 5 songs)
Variables: the variables in the 4 tables are similar, so they are presented together
    recording_ID: the unique identifier of the song recording
    year: the year of the measurement
    ring: the ring number of the individual, whose song was recorded
    aprildate_of_test: date of the measurement, as days from April 1st
    age_binary: age of the focal individual, binary, 1: one-year old, 2: more than one years old
    time_till_recording: time elapsed between the removal of the listener bird and the start of the song recording
    listener_Male: the ring number of the listener male
    listener_Female: the ring number of the listener female
    songlength: length of the focal song (s)    
    minimumfrequency_khz: minimum frequency of the song (kHz)
    maximumfrequency_khz: maximum frequency of the song (kHz)
    meanfrequency_khz: mean frequency of the song (kHz)    
    frequencyrange_khz: frequency range of the song (kHz)
    tempo: the ratio between the number of syllables within song and song length (1/s)
    complexity: the number of different syllable types/total number of syllables within songs
    repertoiresize: the number of k-mean clusters that could be detected for a given individual based on 20 songs
    songrate: the number of songs in a minute calculated as 60/median of song intervals
    listener_Male_tarsus: tarsus length of the listener male (0.1 mm)
    listener_Male_condition: condition of the listener male (residuals from body mass-tarsus regression separately built for the sexes, and we also controlled for year effects by including year as a random factor)
    listener_Male_wingpatchsize: the size of the wing patch of the listener male (sum of the lengths of the white areas on the outer vanes of the 4th-8th primaries, 0.1 mm)
    listener_Male_foreheadpatchsize: the size of the forehead patch of the listener male (the product of the maximum length and width of this white patch, 0.1 mm * 0.1 mm)
    days_in_captivity: the number of days the listener bird had been held in captivity before the day of the measurement
    order (in the files with song traits calculated for bins of 5 songs): the position of the 5-song bin within the recording (traits were calculated for the 1st-5th songs, the 6th-10th songs, etc.)
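The derived song traits defined above (tempo, complexity, songrate) follow directly from their formulas. The sketch below computes them in Python from toy per-song measurements; all input values and the dictionary layout are hypothetical and serve only to make the definitions concrete.

```python
from statistics import median

# Toy measurements for two songs of one recording (hypothetical values,
# not taken from the dataset).
songs = [
    {"length_s": 3.0, "syllables": ["A", "B", "A", "C", "B", "A"]},
    {"length_s": 2.5, "syllables": ["A", "B", "C", "D"]},
]
# Intervals (s) between consecutive songs in the recording (hypothetical).
song_intervals = [10.0, 12.0, 11.0]

def tempo(song):
    # tempo = number of syllables within the song / song length (1/s)
    return len(song["syllables"]) / song["length_s"]

def complexity(song):
    # complexity = number of different syllable types / total number of syllables
    return len(set(song["syllables"])) / len(song["syllables"])

def song_rate(intervals):
    # songrate = 60 / median of song intervals (songs per minute)
    return 60 / median(intervals)

print(tempo(songs[0]))            # 6 syllables / 3.0 s = 2.0
print(complexity(songs[1]))       # 4 types / 4 syllables = 1.0
print(song_rate(song_intervals))  # 60 / 11.0 ≈ 5.45
```

Note that repertoiresize and the listener condition variable cannot be reproduced this simply: the former comes from k-means clustering of syllables across 20 songs, and the latter from sex-specific body mass-tarsus regressions with year as a random factor.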