# Title of Dataset: Spontaneous mimicry of live facial expressions: A biological mechanism for emotional contagion

Brief summary of dataset contents, contextualized in experimental procedures and results: the tar/gzip-compressed archive contains the data collected from typically developing participants interacting with a partner who watches an emotion-provoking video. The data include functional near-infrared spectroscopy (fNIRS), face-tracking, and rating files from both partners during the interactive task, as well as the 3D localizer information for each participant.

## Description of the Data and file structure

The data archive was created with tar and gzip and has the following structure.

There is one folder per pair of participants (Participant A and Participant B). These folders are numbered P01-P22. Each PXX folder contains two types of files in its root and two subfolders (described below).

The root of each PXX folder contains 4 .csv files defining the 3D locations of the fNIRS optodes on participants A and B. For each participant there is an "origins" file, giving the locations of the 10-20 fiducial points, and an "others" file, giving the locations of the 40 optodes on that participant's head (80 optode locations across the pair).

The root also contains 6 .csv files whose names begin with FaceReadingPXX_ followed by the timestamp of the data collection. These are the NIRS data exported as OxyHb, DeOxyHb, and TotalHb values per channel, starting at the 4th column. The first 60 channels belong to participant A and the last 60 channels to participant B.
The location of each channel is defined by the "origins" and "others" files described above for each participant (A and B). The first column of each FaceReading file contains the time, the 2nd column contains the trigger information corresponding to the results/log files (described below), and the 3rd column is unused. The timestamp in each filename allows the files to be sorted chronologically and matched with their corresponding "facevideo" and "results" information (described below).

The first subfolder in each PXX directory is named "facevideo". Each "facevideo" folder has a "processed" subfolder containing the OpenFace output recorded during the interaction between the participants. The 320th frame of this output marks the start of the fNIRS recording (time 0 for the fNIRS data). The "processed" folder contains 12 .csv files, prefixed with either sA or sB, holding the OpenFace output for Participant A and Participant B respectively. These files contain the timestamps and frame numbers of the videos they were processed from; the columns follow the standard OpenFace output format, with facial tracking, eye gaze, and Action Unit data. Each filename includes the timestamp of its recording, so the files can be sorted chronologically and matched with the corresponding NIRS data (above) and "results" information (described below). If a folder contains fewer than 12 files, those data were not recorded for that trial; this is detailed per pair below.

The second subfolder in each PXX directory is named "results". It contains 6 files with the information logged during the experiment, including the start-of-run times and the trigger times corresponding to the NIRS files in the root folder.
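The channel layout described above (time in column 1, triggers in column 2, column 3 unused, 120 channels from column 4 onward) can be expressed as a small helper. This is a minimal sketch, not part of the dataset: the function name, the use of NumPy, and the loading call in the usage comment are assumptions, and the delimiter/header settings may need adjusting to match the exported format.

```python
import numpy as np

def split_nirs_matrix(data, n_channels_per_subject=60):
    """Split a FaceReadingPXX_* matrix into its documented parts.

    Per the data description: column 0 is time, column 1 is the
    trigger channel, column 2 is unused, and the remaining columns
    are per-channel haemoglobin values -- the first 60 channels
    belong to participant A and the last 60 to participant B.
    """
    time = data[:, 0]
    triggers = data[:, 1]
    channels = data[:, 3:]
    chan_a = channels[:, :n_channels_per_subject]
    chan_b = channels[:, n_channels_per_subject:2 * n_channels_per_subject]
    return time, triggers, chan_a, chan_b

# Usage (path is hypothetical; adjust delimiter/skiprows as needed):
# data = np.loadtxt("P01/FaceReadingP01_<timestamp>.csv", delimiter=",")
# t, trig, a, b = split_nirs_matrix(data)
```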
The logs also record the rating information and the end of each run. Each run has a start and an end defined by the triggers "s" and "e". Events defined by "a", "b", and "c" repeat 6 times per run; their times are given in the corresponding results file and match the trigger times in the 2nd column of the corresponding NIRS file. The first three runs contain the onsets of video events in which subject A watches the face of subject B, who watches the presented video; the second three runs contain the onsets of video events in which subject B watches the face of subject A, who watches the presented video.

Ratings appear in the results files as lines such as: wheel 2,2,23.497749. Here "wheel" is the input device, a keyboard driver attached to a rotating wheel connected to a USB port of the computer. The first number is the participant number, 1 or 2 (A or B). The second number is the rating of the interaction, on a scale from -5 to 5; only the last rating per interaction per subject should be considered valid. The last value is the timestamp of the input relative to the start of the run. The timestamp in the name of each results file allows the files to be sorted chronologically and matched with their corresponding NIRS data and "facevideo" information (described above).

A csv file named FilesIncluded.csv lists all the included files.

Some data are missing, as described per pair below. Pairs not listed here are complete.

- P01: The NIRS data from the first run failed to record and are not included.
- P01: No facevideo was captured for either participant.
- P02: No facevideo was captured for either participant.
- P03: No facevideo was captured for either participant.
- P12: No facevideo was captured for either participant.
- P16: The last face video failed to record for participant B.
- P20: The last 3 face videos failed to record for participant B.

## Sharing/access Information

Links to other publicly accessible locations of the data: N/A

Was data derived from another source? No
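The rating lines in the results files (e.g. `wheel 2,2,23.497749`) can be reduced to the valid final rating per participant with a short helper. This is a minimal sketch, not part of the dataset: the function name is hypothetical, the exact whitespace layout of the log lines is assumed from the single example given above, and the caller is expected to pass in the lines belonging to one interaction (since only the last rating per interaction per subject is valid).

```python
def last_ratings(log_lines):
    """Return the final wheel rating per participant from the lines
    of one interaction in a results file.

    Rating lines look like ``wheel 2,2,23.497749``: the device name,
    then the participant number (1 = A, 2 = B), the rating (-5..5),
    and the time in seconds relative to the start of the run.  Only
    the last rating per participant is valid, so later lines simply
    overwrite earlier ones.
    """
    latest = {}
    for line in log_lines:
        line = line.strip()
        if not line.startswith("wheel"):
            continue  # skip trigger and other non-rating lines
        fields = line.split(None, 1)[1].split(",")
        participant = int(fields[0])
        rating = float(fields[1])
        timestamp = float(fields[2])
        latest[participant] = (rating, timestamp)
    return latest
```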