Autism-like atypical face processing in Shank3 mutant dogs
Data files
Mar 04, 2025 version (230.73 KB total)

- beagle_house.xlsx (11.74 KB)
- beagle_husky.xlsx (12.78 KB)
- car_house.xlsx (11.46 KB)
- dog_chimp.xlsx (12.72 KB)
- dog_human.xlsx (12.72 KB)
- ECoG_beagle_husky.xlsx (21.61 KB)
- ECoG_face_house.xlsx (22.48 KB)
- ECoG_human_dog.xlsx (21.58 KB)
- eyetracking_beagle_husky.xlsx (19.15 KB)
- eyetracking_dog_chimp.xlsx (19.11 KB)
- eyetracking_dog_human.xlsx (19.17 KB)
- eyetracking_face_house.xlsx (22.65 KB)
- inverted_beagle_inverted_house.xlsx (11.47 KB)
- README.md (12.08 KB)
Abstract
Atypical face processing is a neurocognitive basis of social deficits in autism spectrum disorder (ASD) and a candidate cognitive marker for the disease. Although hundreds of risk genes have been identified in ASD, it remains unclear whether mutations in a specific gene may cause ASD-like atypical face processing. Dogs have acquired exquisite face processing abilities during domestication and may serve as an effective animal model for studying genetic associations of ASD-like atypical face processing. Here, we showed that dogs with Shank3 mutations exhibited behavioral and attentional avoidance of faces, contrasting with wild-type controls. Moreover, neural responses specific to faces (versus objects) recorded from the electrodes over the temporal cortex were significantly decreased and delayed in Shank3 mutants compared to wild-type controls. Cortical responses in the frontal/parietal region underlying categorization of faces by species/breeds were reduced in Shank3 mutants. Our findings of atypical face processing in dogs with Shank3 mutations provide a useful animal model for studying ASD mechanisms and treatments.
https://doi.org/10.5061/dryad.ht76hdrsp
Description of the data and file structure
- The files 'beagle_house', 'beagle_husky', 'car_house', 'dog_chimp', 'dog_human', and 'inverted_beagle_inverted_house' contain the behavioral preference data.
- Each of these files contains the number and proportion of trials in which wild-type dogs and mutant dogs, respectively, chose each category of pictures.
- The files whose names include "ECoG" contain the data from the electrophysiological experiments. The sheets in each file hold the results for the different electroencephalographic components of the corresponding experiment. These data were used mainly for the analyses of variance.
- The files whose names include "eyetracking" contain the data from the eye-tracking experiments. Each file holds the results of each trial for each animal.
- The statistical values reported in the text were obtained by averaging the trials of each animal into a single value for statistical analysis.
- The uploaded code file can be opened in the stimulus-presentation software Presentation (0.71 09.24.03) and is for the ECoG experiments. The file name corresponds to the experimental content.
Visual stimuli
The visual stimuli used in the present study included photos of 16 human faces, 16 beagle faces, 24 husky faces, 8 poodle faces, 16 chimpanzee faces, 16 cars, and 16 houses. Black-and-white stimuli were used to control for potential effects of color differences between face and house stimuli. The details of the stimuli used in each experiment are given in Tables S1 and S2. The human faces were adopted from previous work (28). Chimpanzee faces, dog faces, and cars were collected from public Internet image resources. Faces in full-frontal view with eyes open, direct forward gaze, and neutral expression were used. All images were unfamiliar to the subjects. Luminance and contrast were matched both within and across image categories using the SHINE toolbox in MATLAB. All images were presented on a gray background (122 cd/m2). In the face preference tests, the stimuli produced a 20.3 cm × 25.4 cm picture (resolution: 800 × 1000 pixels) at the center of screens positioned 240 cm from the subject. In the electrophysiological experiments, each stimulus was displayed at a resolution of 400 × 500 pixels on a 23.7-inch screen positioned approximately 65 cm from the subject.
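The within- and across-category matching performed by SHINE amounts to equalizing the mean (luminance) and standard deviation (contrast) of the grayscale pixel values. Below is a minimal Python sketch of that operation; it is not the authors' code (SHINE runs in MATLAB and offers more options), and the target values are illustrative.

```python
# Minimal sketch of luminance/contrast matching across grayscale images.
# Not the authors' pipeline; target_mean/target_std are arbitrary choices.
import numpy as np

def match_lum_contrast(images, target_mean=128.0, target_std=30.0):
    """Rescale each grayscale image to a common mean (luminance)
    and standard deviation (contrast), clipping to the 0-255 range."""
    matched = []
    for img in images:
        z = (img - img.mean()) / (img.std() + 1e-12)  # standardize pixels
        matched.append(np.clip(z * target_std + target_mean, 0.0, 255.0))
    return matched
```

After matching, every image in the set has (up to clipping) the same mean luminance and RMS contrast, so category differences in low-level image statistics cannot drive the neural or behavioral effects.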
Behavioral preference test
The experimental set-up was modified from an approach-avoidance test used in a previous study (34), as illustrated in Fig. 1A. It included a food tray placed close to the bottom of a computer monitor on each side of the room; the two tray-monitor sets were separated by a board. Before the experiment, a dog was allowed to move freely in the test room for about 10 min. The experimenter then turned the dog around so that its back faced the two monitors. At the beginning of each trial, two identical dog snacks were placed, one on each tray. A pair of photos from two different categories was then displayed randomly on the left and right monitors (Fig. 1B), and the photos used in each trial varied randomly across trials. The test dog was guided to sit in the middle of the testing room facing the monitors for 3-5 seconds before being released to walk toward a food tray. After eating the snack on one tray, the subject was led back to the release point for the next trial. A subject completed 16 trials in 15-45 min per day. If a test dog did not approach either of the two sides displaying the photo stimuli in a trial, the test was terminated.
Eye movement recording
The preferential viewing task was adapted from a previous study (64). Eye-tracking measurements began after a calibration procedure using the EyeLink five-point calibration program. The experimenter guided the dogs by finger tapping or food cues to ensure gaze fixation on each calibration point, and then repeated the five-point calibration to obtain the calibration accuracy. Subjects sat or stood approximately 85 cm from a 23.7-inch LCD monitor (1920 × 1080 pixels) while viewing the stimuli. Eye movements were recorded at a sampling rate of 500 Hz using an EyeLink 1000 Plus eye tracker (SR Research Ltd, Mississauga, Ontario, Canada). Each trial started with a moving circle that attracted the subject's attention to the center of the monitor. Once a fixation on the circle lasted longer than 100 ms, two stimuli of different categories were presented simultaneously on the left and right sides of the screen for 3 seconds, with side assignment randomized across trials, similar to the paradigm used in previous human studies (65). Eye-tracking data recorded during stimulus presentation were analyzed. Eye-tracking measurements were conducted in Experiment 2 using photos of dog faces and houses and in Experiment 4 using photos of dog/human faces, dog/chimpanzee faces, and beagle/husky faces. In each experiment, a subject was tested in 16 trials with each set of face stimuli; additional trials were run after failed trials to ensure 16 completed trials per experiment.
ECoG data recording
ECoG signals were recorded from 32 electrodes over the right hemisphere, covering the frontal, parietal, temporal, and occipital cortices, using the protocol described in previous research (Fig. S2) (27). Each electrode disc was 2.0 mm in diameter, and discs were spaced 5 mm apart. Electrode positions were verified using computed tomography scans. ECoG recordings were performed in a sound-attenuated room with dim light. A Zeus data acquisition system (Zeus, Nanjing, China) was used to record ECoG signals at a sampling rate of 1 kHz. The subject was seated approximately 65 cm from the computer screen, which was positioned so that the stimuli appeared at the center of the screen. During ECoG recording, a dog was instructed to sit on the ground or on the experimenter's lap, and a camera allowed the experimenter to monitor the subject's eye gaze. Each trial in Experiment 3 started with the presentation of an image of a face or house for 1500 ms at the center of the gray background, followed by a fixation cross with a duration varying randomly from 250 to 550 ms. Each subject was tested in 4 to 6 sessions on separate days, with 6 runs per session. Each run consisted of 2 blocks of 24 trials, with a 5-s break between consecutive blocks. In Experiments 6 and 7, ECoG was recorded in a repetition suppression paradigm in 5 sessions, with 6 runs per session and 4 blocks of 16 trials per run. The stimulus duration and inter-stimulus interval were the same as in Experiment 3.
Behavioral data analysis
In Experiment 1, we quantified the behavioral preference for faces (against houses) as the percentage of trials in which a dog walked toward the face photo, and the preference for houses (against cars) as the percentage of trials in which a dog initially walked toward the house photo. Behavioral preferences in Experiment 4 were quantified as the percentage of trials in which a dog walked toward faces of its own species (against human or chimpanzee faces) or its own breed (against husky faces). One-sample t-tests were conducted to assess behavioral preferences for faces (or faces of a specific species) in Experiments 1 and 4.
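The preference test above can be sketched as follows with scipy. The per-dog percentages are invented for illustration, and testing against a chance level of 50% is our assumption for a two-choice task; they are not the study's data or code.

```python
# Hedged sketch: one-sample t-test of face-preference percentages
# against the two-choice chance level of 50%. Values are illustrative.
import numpy as np
from scipy import stats

# hypothetical percentage of trials in which each dog approached the face photo
face_pref = np.array([62.5, 68.75, 56.25, 75.0, 81.25, 62.5])

t_stat, p_val = stats.ttest_1samp(face_pref, popmean=50.0)
```

A significant positive t statistic would indicate a group-level preference for approaching faces over the paired non-face stimulus.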
Eye-tracking data analysis
We first quantified the ratio of the time spent looking at each category of stimuli to the total looking time in each trial. Trials with less than 25% screen-looking time were considered invalid and excluded from data analyses. We defined the area on the left side of the screen, extending from the central fixation point to the left edge, as area of interest 1 (AOI-1), and the corresponding area on the right side of the screen as AOI-2; the two AOIs thus represented the two different pictures in each trial. The AOIs were determined for face and house stimuli in Experiment 2 and for faces of different species/breeds in Experiment 5. We also defined the eye region as AOI-3 in Experiment 2. By summing the durations of all fixations falling inside each AOI in each trial, we obtained the total looking time on the target region for each trial. Because the eye-tracking data did not follow a normal distribution, we used the nonparametric Mann-Whitney test to assess differences in behavior between the two testing groups (i.e., WT controls and Shank3 mutants), and the nonparametric Wilcoxon signed-rank test to assess differences between two measures within one testing group (e.g., gaze fixations on faces versus houses in WT controls). These statistical methods were applied in the same way to the individual-based data analyses (see results in the main text) and the trial-based data analyses (see results in the Supplementary Materials, Fig. S1 and Fig. S5). Heatmaps of the gaze distributions were generated with the Python toolbox GazePointHeatMap (https://github.com/TobiasRoeddiger/GazePointHeatMap), based on the viewing times of the stimuli.
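The two nonparametric tests named above can be sketched with scipy as follows. The looking-time ratios are simulated and the group sizes arbitrary; this is a minimal illustration, not the authors' analysis code.

```python
# Between-group comparison (Mann-Whitney U) and within-group comparison
# (Wilcoxon signed-rank) on simulated face-looking-time ratios.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
wt = rng.uniform(0.5, 0.9, size=12)    # hypothetical per-dog ratios, WT controls
mut = rng.uniform(0.2, 0.6, size=10)   # hypothetical ratios, Shank3 mutants

# WT vs. mutants (independent samples)
u_stat, p_between = stats.mannwhitneyu(wt, mut, alternative="two-sided")

# faces vs. houses within the same WT dogs (paired samples)
faces = rng.uniform(0.5, 0.9, size=12)
houses = rng.uniform(0.1, 0.5, size=12)
w_stat, p_within = stats.wilcoxon(faces, houses)
```

For the individual-based analyses, each dog contributes one averaged value per condition; for the trial-based analyses, the same tests are applied to per-trial values.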
ECoG data analysis
ECoG data analyses were performed using MATLAB (version 2020b, Mathworks Inc., Natick, MA) and the EEGLAB toolbox. During pre-processing, the ECoG signals were re-referenced to a common average reference (CAR) montage and band-pass filtered from 0.1 to 30 Hz. ERPs in each condition were averaged separately offline over epochs beginning 200 ms before stimulus onset and continuing for 600 ms after stimulus onset. Trials contaminated by noise exceeding ±200 μV at any electrode were excluded from averaging. The baseline for ERP measurements was the mean voltage across the 200-ms prestimulus time window, and latencies were measured relative to stimulus onset.
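The pre-processing chain can be sketched in Python with NumPy/SciPy. The authors used MATLAB/EEGLAB; the filter design (4th-order Butterworth, zero-phase), the synthetic data, and the onset times below are our assumptions for illustration only.

```python
# Sketch of the described pipeline: CAR re-reference, 0.1-30 Hz band-pass,
# epoching -200..+600 ms, +/-200 uV artifact rejection, baseline correction.
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 1000                                        # sampling rate (Hz)
rng = np.random.default_rng(42)
data = rng.normal(0.0, 20.0, (32, 10 * fs))      # 32 channels x 10 s, in uV

# 1) common average reference: subtract the across-channel mean
car = data - data.mean(axis=0, keepdims=True)

# 2) 0.1-30 Hz zero-phase band-pass (SOS form for numerical stability)
sos = butter(4, [0.1, 30.0], btype="bandpass", fs=fs, output="sos")
filtered = sosfiltfilt(sos, car, axis=1)

# 3) epoch around stimulus onsets: -200 ms to +600 ms
onsets = np.array([1000, 3000, 5000, 7000])      # sample indices of onsets
epochs = np.stack([filtered[:, t - 200:t + 600] for t in onsets])

# 4) reject epochs exceeding +/-200 uV at any electrode
keep = np.abs(epochs).max(axis=(1, 2)) <= 200.0
epochs = epochs[keep]

# 5) subtract the 200-ms prestimulus baseline, then average into an ERP
baseline = epochs[:, :, :200].mean(axis=2, keepdims=True)
erp = (epochs - baseline).mean(axis=0)           # channels x time
```

Each row of `erp` is one electrode's average response; peak amplitudes and latencies per component are then read out from the time windows in Table S3.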
The time windows of the peak responses were defined independently for each ERP component in each experiment (see Table S3 for details). The mean amplitudes and peak latencies of the P1, N1, and P2 components in Experiment 3 were subjected to repeated-measures analyses of variance (ANOVAs) with Stimulus category (human face vs. dog face vs. house) as a within-subjects factor and Testing group (WT vs. Shank3 mutants) as a between-subjects factor. The mean P1 and N2 amplitudes in Experiments 6 and 7 were subjected to ANOVAs with Stimulus category (human vs. dog faces, or beagle vs. husky faces) and Condition (repetition vs. alternating) as within-subjects factors and Testing group (WT vs. mutants) as a between-subjects factor. Repetition suppression (RS) effects were defined as decreased amplitudes to faces of a category in the repetition condition compared to the alternating condition.
To further verify the ERP results of the ANOVAs while controlling for potential effects of within-subject correlations, we conducted linear mixed-effects model (LMM) analyses of the ERP data. The statistical analyses of single-trial ECoG amplitudes used mixed-effects single-trial regression models, similar to the method of previous work (66). In Experiment 3, the fixed-effect factors included Stimuli (first level: -0.67 for house, +0.33 for human, +0.33 for dog; second level: -0.33 for house, -0.33 for human, +0.67 for dog), Testing group (-0.5 for WTs, +0.5 for Shank3 mutants), and their interactions. In Experiments 6 and 7, the fixed-effect factors included Stimuli (-0.5 for dog/beagle faces, +0.5 for human/husky faces), Condition (-0.5 for the alternating condition, +0.5 for the repetition condition), Testing group (-0.5 for WTs, +0.5 for Shank3 mutants), and their interactions. Subject ID and the recording sessions of each subject were included as nested random-effect factors.
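A hedged sketch of such a contrast-coded single-trial model in Python, using statsmodels' MixedLM: the amplitudes are simulated, and for simplicity only a subject-level random intercept is fitted, whereas the analysis described above additionally nests recording sessions within subjects.

```python
# Single-trial LMM with Experiment 6/7-style contrast codes.
# Simulated data; this is not the authors' analysis code.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 800
df = pd.DataFrame({
    "amp": rng.normal(0.0, 5.0, n),              # single-trial amplitude (uV)
    "stim": rng.choice([-0.5, 0.5], n),          # dog/beagle vs. human/husky
    "cond": rng.choice([-0.5, 0.5], n),          # alternating vs. repetition
    "group": np.repeat([-0.5, 0.5], n // 2),     # WT vs. Shank3 mutant
    "subj": np.repeat([f"dog{i}" for i in range(8)], n // 8),
})

# random intercept per subject; fixed effects with all interactions
model = smf.mixedlm("amp ~ stim * cond * group", df, groups=df["subj"])
fit = model.fit()
```

With centered contrast codes like these, each fixed-effect coefficient is directly interpretable as a main effect or interaction, mirroring the ANOVA terms.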
Subjects
Twenty-three WT dogs (beagles) from Sinogene Ltd (Beijing, China) and fifteen Shank3 mutant dogs (beagles) were tested in this study. The mean age did not differ significantly between WT controls and Shank3 mutants (WT, 19.3 ± 1.33 months of age (mean ± SEM); mutants, 24.5 ± 2.75 months of age; U = 125.5, P = 0.164; see Tables S1 and S2 for detailed information about the subjects in each experiment). Four WT controls (80, 201138, 210755, 201115) and three Shank3 mutants (201111, 201112, 201141) had been tested in a previous behavioral study of dog-human interactions (20). Two Shank3 mutants (190203, 190604) had been tested in an ECoG study that required the subjects to passively listen to pure sinusoidal tones (27). None of these studies involved any training with food or employed stimuli of animal/human faces or houses. Three dogs tested in this work were littermates (i.e., WT 201115 and Shank3 mutants 201111 and 201112). WT 201115 participated in Experiment 1, Shank3 mutant 201111 participated in Experiments 1 and 4, and Shank3 mutant 201112 participated in Experiment 4. The Shank3 mutations generate frameshifts and truncated proteins that disrupt the ANK domain and the proline-rich domain of Shank3 in mutant dogs (20). All mutant dogs showed similarly reduced levels of Shank3 protein and similar autism-like social deficits, including social withdrawal and reduced social interactions with humans (20). Each dog was housed in a single cage and maintained on a 12-hour light/12-hour dark cycle with lights on at 7:00 am. All subjects underwent ophthalmological and behavioral evaluations to verify their health before the study. No animal was sacrificed in these studies. All experimental procedures were approved by the Ethical Committee of the Institute of Genetics and Developmental Biology of the Chinese Academy of Sciences (AP2022033).