Imprinting on time-structured acoustic stimuli in ducklings

Cite this dataset

Monteiro, Tiago; Hart, Tom; Kacelnik, Alex (2021). Imprinting on time-structured acoustic stimuli in ducklings [Dataset]. Dryad. https://doi.org/10.5061/dryad.4mw6m909g

Abstract

Filial imprinting is a dedicated learning process that lacks explicit reinforcement. The phenomenon itself is narrowly heritably canalized, but its content, the representation of the parental object, reflects the circumstances of the newborn. Imprinting has recently been shown to be even more subtle and complex than previously envisaged, since ducklings and chicks are now known to select and represent for later generalization abstract conceptual properties of the objects they perceive as neonates, including movement pattern, heterogeneity, and inter-component relationships of same or different. Here we investigate day-old Mallard (Anas platyrhynchos) ducklings’ bias towards imprinting on acoustic stimuli made from mallards’ vocalizations as opposed to white noise, whether they imprint on the temporal structure of brief acoustic stimuli of either kind, and whether they generalize timing information across the two sounds. Our data are consistent with a strong innate preference for natural sounds, but do not reliably establish sensitivity to temporal relations. This fits with the view that imprinting includes the establishment of representations of both primary percepts and selective abstract properties of their early perceptual input, meshing together genetically transmitted prior predispositions with active selection and processing of the perceptual input.

Methods

Video was recorded using Sony wireless 4K action cameras (FDR-X3000 R) at 30 Hz. A colour thresholding method was implemented using custom Bonsai code (Lopes et al., 2015) to track the position of the subjects and each speaker.
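The original tracking was implemented in Bonsai, not Python; purely as an illustration of the colour-thresholding approach, here is a minimal Python/OpenCV sketch (the HSV thresholds are placeholders, not the values used in the study):

import cv2
import numpy as np

# Illustrative HSV range for a red tag; placeholder values only.
RED_LO = np.array([0, 120, 120])
RED_HI = np.array([10, 255, 255])

def track_red_tag(video_path):
    """Return the per-frame (x, y) centroid of the red-thresholded blob."""
    cap = cv2.VideoCapture(video_path)
    positions = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, RED_LO, RED_HI)
        m = cv2.moments(mask)
        if m["m00"] > 0:  # blob detected
            positions.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
        else:
            positions.append((np.nan, np.nan))  # no tag in this frame
    cap.release()
    return np.array(positions)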

Usage notes

"ALLdata.mat" is a nested structure.

[data] is divided into [.raw] and [.processed].

[data.raw] contains the raw positions (x, y) of each stimulus and of each individual duckling during tests.

Inside [data.raw] there is the following structure (a minimal loading sketch follows the note below):

raw.imprinting.[impringSoundType].[duration].[difficulty], with:

[.impringSoundType] —> artificial or natural

[.duration] —> short or long

[.difficulty] —> easy; int1_soundDIFF; int2_timeDIFF; hard

Note: During revision of the paper, we renamed the “difficulty” label to “test condition”, and the categories “easy”, “int1”, “int2”, and “hard” to “sound&time”, “sound”, “time (imprint=target)”, and “time (imprint≠target)”, respectively.
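A minimal Python sketch for loading and navigating this structure (it assumes the file is not in MATLAB v7.3/HDF5 format; if it is, mat73 or h5py would be needed instead):

from scipy.io import loadmat

# struct_as_record=False plus squeeze_me=True expose MATLAB structs as
# attribute-style objects, matching the dot notation used above.
mat = loadmat("ALLdata.mat", squeeze_me=True, struct_as_record=False)
data = mat["data"]

# e.g. natural sound, short exposure, "easy" (sound&time) test condition:
group = data.raw.imprinting.natural.short.easy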

[.raw] fields are xyRED (position of the red stimulus), xyGREEN (position of the green stimulus), and xyANIMAL (position of the duckling), with rows spanning the entire testing time and individual animals along the 3rd dimension.
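Continuing the sketch above, one duckling's trajectory can be pulled out by indexing the 3rd dimension (the (x, y) column order is assumed from the description above):

import numpy as np

xy_animal = np.asarray(group.xyANIMAL)   # time x 2 x animals

duckling0 = xy_animal[:, :, 0]           # first duckling's full trajectory
x, y = duckling0[:, 0], duckling0[:, 1]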

In all cases, stimuli were marked with visual (red and/or green) tags (not visible to the animals) to facilitate tracking.

For practical reasons, the imprinting exposure stimulus is always marked RED for [short] groups and GREEN otherwise.

Exceptions are group '.natural.short.int1_soundDIFF', where GREEN marked the imprinting stimulus, and '.natural.long.int1_soundDIFF', where RED marked the imprinting stimulus.

[.processed] fields are 600×40 arrays, one per test condition: each pair of columns gives the rotated and normalized x, y position of one animal (600 time samples, 20 animals).
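A sketch of unpacking one such array into per-animal trajectories (the path below is hypothetical: it assumes [.processed] mirrors the nested labels of [.raw], which the notes do not state explicitly):

import numpy as np

proc = np.asarray(data.processed.imprinting.natural.short.easy)  # hypothetical path

# 600 time samples x 40 columns = 20 animals' interleaved (x, y) pairs.
n_animals = proc.shape[1] // 2
trajectories = proc.reshape(600, n_animals, 2)  # -> (time, animal, xy)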

Code for analysis can be found here: https://github.com/PTMonteiro/MonteiroHartKacelnik_2021.git

Funding

Leverhulme Trust, Award: RPG-2016-192