Studies of gestural communication systems find that they originate from spontaneously created iconic gestures. Yet, we know little about how people create vocal communication systems, and many have suggested that vocalizations do not afford iconicity beyond trivial instances of onomatopoeia. It is unknown whether people can generate vocal communication systems through a process of iconic creation similar to gestural systems. Here, we examine the creation and development of a rudimentary vocal symbol system in a laboratory setting. Pairs of participants generated novel vocalizations for 18 different meanings in an iterative ‘vocal’ charades communication game. The communicators quickly converged on stable vocalizations, and naive listeners could correctly infer their meanings in subsequent playback experiments. People's ability to guess the meanings of these novel vocalizations was predicted by how close the vocalization was to an iconic ‘meaning template’ we derived from the production data. These results strongly suggest that the meaningfulness of these vocalizations derived from iconicity. Our findings illuminate a mechanism by which iconicity can ground the creation of vocal symbols, analogous to the function of iconicity in gestural communication systems.
charadesGameAnalysisR
R code for analysis of the charades game.
charadesGameAnalysis.R
charadesGameAnalysisPython
Python code for processing and analyzing data from the charades game.
charadesGameAnalysis.py
charadesGameProcessedData
Processed data from the charades game. Used for the R analysis.
charadesGameProcessed.csv
charadesGameData
Data from the charades game, processed by the corresponding Python code.
charadesGameOriginal.csv
charadesAnalysisPraat
Praat code for measuring charades vocalizations.
charades.praat
mturkAnalysisR
R code for analysis of the playback experiments.
mturkAnalysis.R
mturkAnalysisPython
Python code for analyzing data from the Mturk playback experiments.
mturkAnalysis.py
distanceCorrelationsProcessedData
Correlations between each vocalization's distance to the iconic and median meaning templates and the probability of listeners selecting each meaning.
distanceCorrs.csv
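The distance-to-selection relationship summarized in distanceCorrs.csv can be sketched as a per-meaning Pearson correlation. This is an illustrative sketch, not the published analysis code; the column names ('meaning', 'distance', 'p_selected') are hypothetical stand-ins for whatever the actual CSV uses.

```python
import pandas as pd

def distance_correlations(df: pd.DataFrame) -> pd.Series:
    """Per-meaning Pearson correlation between a vocalization's distance
    to the meaning template and the probability that listeners selected
    that meaning. Column names are hypothetical."""
    return df.groupby("meaning")[["distance", "p_selected"]].apply(
        lambda g: g["distance"].corr(g["p_selected"])
    )
```

A negative correlation for a meaning would indicate that vocalizations closer to its template were more likely to be chosen by listeners.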
speakerNormsProcessedData
Normed means of the five acoustic variables for each vocalizer in the charades game.
speakerNorms.csv
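The per-vocalizer norming behind speakerNorms.csv can be sketched as z-scoring each acoustic variable across all vocalizations and then averaging within each speaker. This is a sketch only: the variable names below are hypothetical placeholders for the five acoustic measures, and the actual norming procedure lives in the Python/R code above.

```python
import pandas as pd

# Hypothetical names for the five acoustic variables measured in Praat.
ACOUSTIC_VARS = ["duration", "intensity", "pitch", "pitch_change", "hnr"]

def speaker_norms(df: pd.DataFrame) -> pd.DataFrame:
    """Z-score each acoustic variable across all vocalizations, then
    average the z-scores within each vocalizer ('speaker' column)."""
    z = df[ACOUSTIC_VARS].apply(lambda col: (col - col.mean()) / col.std())
    z["speaker"] = df["speaker"].values
    return z.groupby("speaker").mean()
```

Norming within the full sample and averaging per speaker yields one row per vocalizer, characterizing how that speaker's productions deviate from the group.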
charadesMturkIconicData
Data from the second playback experiment with more and less iconic stimuli.
charadesMturkIconic.csv
charadesMturkData
Data from the Mturk playback experiment. The Python code uses this file to compute derived variables.
charadesMturkOriginal.csv
charadesMeaningAccuracyData
Processed data from the charades experiment. Contains average accuracy for each meaning. Used to compare to accuracies for each meaning in the playback experiment.
charadesAccMeaning.csv
meaningTemplatesData
Processed data from the charades experiment. Contains median, max, and min values of the acoustic variables for each meaning. Used for calculating iconic and median distance.
meaningTemplates.csv
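The template-distance calculation that uses meaningTemplates.csv can be sketched as a distance between a vocalization's acoustic measures and a meaning's template values (e.g. the per-meaning medians). A Euclidean distance is shown here as one plausible metric; the exact metric and the variable names are defined in the analysis code above, so treat both as assumptions.

```python
import pandas as pd

# Hypothetical names for the five acoustic variables.
ACOUSTIC_VARS = ["duration", "intensity", "pitch", "pitch_change", "hnr"]

def distance_to_template(vocalization: pd.Series,
                         template: pd.Series) -> float:
    """Euclidean distance between a vocalization's acoustic measures and
    a meaning template's values (e.g. the medians per meaning). Smaller
    distances indicate a vocalization closer to the template."""
    diffs = vocalization[ACOUSTIC_VARS] - template[ACOUSTIC_VARS]
    return float((diffs ** 2).sum() ** 0.5)
```

Computing this distance against the iconic template and against the median template for each meaning gives the two predictors correlated with listener guesses in distanceCorrs.csv.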
wordPropsAnalysisPython
Python code to analyze the pronunciation of the target words for comparison to vocalizations.
wordProps.py
wordPropsProcessedData
Processed word properties for comparison to vocalizations.
wordPropsProcessed.csv
wordPropsData
Data containing the acoustic properties of the pronounced target words.
wordPropsOriginal.csv
soundPropsAnalysisPraat
Praat code for measuring the acoustic properties of the pronounced target words.
soundProps.praat