Data from: Individual identification and confirmation of nest site fidelity in Painted Stork (Mycteria leucocephala) using Deep Transfer Learning
Data files
Jan 21, 2026 version files 93.56 MB
- Details_of_the_PsScarNet_Layers.txt (17.78 KB)
- Morphometrics_of_Ringo.xlsx (16.04 KB)
- PsScarNet_Code.m (250 B)
- PsScarNet.mat (87.83 MB)
- README.md (3.23 KB)
- Ringo-SIFT.zip (5.47 MB)
- Sample_images_PS.zip (228.88 KB)
Abstract
Accurate individual identification is vital in field studies. Traditional marking techniques, though effective, can be intrusive and may disrupt natural behaviours, so identification using natural markings has gained popularity across various taxa as a non-invasive alternative. Here, we report on a Painted Stork (Mycteria leucocephala) with a distinctive neck injury mark, observed at the National Zoological Park (Delhi Zoo) over three consecutive breeding seasons (2022–2024). To verify its identity and assess nest-site fidelity, we employed a non-invasive approach combining morphometric measurements and Deep Transfer Learning-based image analysis. High-resolution photographs were used to extract linear measurements and assess repeatability, while a Deep Transfer Learning classifier further validated the individual’s identity with 98% accuracy. Image-based morphometric measurements were particularly reliable for longer morphological features, confirming that the scar-marked stork observed over three consecutive years is indeed the same individual. The repeated sightings of the scar-marked stork on the same patch provide evidence of nest-site fidelity. Our findings highlight the potential of Deep Transfer Learning and pattern-based recognition as powerful, non-intrusive tools for long-term monitoring of colonial waterbirds.
Dataset DOI: 10.5061/dryad.c2fqz61p0
Description of the data and file structure
File: Details_of_the_PsScarNet_Layers.txt
Description: The layer details of the Deep Transfer Learning model we developed are provided in this text file.
Variables
- The model has 177 layers.
File: PsScarNet_Code.m
Description: The MATLAB code is provided to run the model.
% This PsScarNet_Code.m MATLAB code classifies a new image using a pre-trained neural network model (PsScarNet.mat).
% Ensure you have the following prerequisites before running the code:
% Prerequisites:
% 1. MATLAB installed with the Deep Learning Toolbox.
% 2. The pre-trained neural network model (PsScarNet.mat) should be loaded into the MATLAB workspace.
% 3. Extract the images from Sample_images_PS.zip; the image you want to classify should be in the working directory.
% Instructions to run the code:
% 1. Update the 'newImagePath' variable with the path to your image file.
% Example: newImagePath = 'path_to_your_image.jpg';
% 2. Run the PsScarNet_Code.m code in MATLAB. The code will display the image and the predicted label.
File: Sample_images_PS.zip
Description: The sample photographs (representing the right and left sides of Ringo, and unknown Painted Storks) can be used to test the model.
File: PsScarNet.mat
Description: This is the deep transfer learning model we developed to identify Ringo.
File: Morphometrics_of_Ringo.xlsx
Description: The morphological measurements of Ringo are provided in Excel format; all measurements are in pixels.
Variables - Description
- Name - Image number
- Upper BillM1 - Upper bill (mandible) length
- Lower BillM2 - Lower bill (mandible) length
- Mouth to EyeM3 - Mouth-to-eye distance
- Eye to nostrilM4 - Eye-to-nostril distance
- wing length M5 - Wing length
- Ref mean tibia thicknessM6 - Tibia thickness
- Angle MEN - Angle formed by the Mouth, Ear, and Nostril landmarks
- AngleMTN - Angle formed by the Mouth, Bill_Tip, and Nostril landmarks
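The linear and angular variables above are derived from digitised landmark coordinates. A minimal Python sketch of the underlying geometry, with invented landmark coordinates (the repository's own workflow uses tpsDig and MATLAB, and the vertex choice for the angles is an assumption):

```python
import math

def pixel_distance(p, q):
    """Euclidean distance between two landmarks (x, y), in pixels."""
    return math.hypot(q[0] - p[0], q[1] - p[1])

def angle_deg(vertex, a, b):
    """Angle in degrees at `vertex` between the rays towards landmarks a and b."""
    v1 = (a[0] - vertex[0], a[1] - vertex[1])
    v2 = (b[0] - vertex[0], b[1] - vertex[1])
    cos_t = (v1[0] * v2[0] + v1[1] * v2[1]) / (math.hypot(*v1) * math.hypot(*v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))

# Invented pixel coordinates for three landmarks on one photograph:
mouth, ear, nostril = (120.0, 340.0), (185.0, 310.0), (210.0, 332.0)

m_like = pixel_distance(mouth, ear)         # a linear measure, as for M1-M6
angle_men = angle_deg(ear, mouth, nostril)  # vertex taken at the middle landmark (assumption)
```

Because all coordinates are in pixels, distances are only comparable across photographs taken at similar scale, whereas angles are scale-invariant.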
Code/software
1: The layer details of the Deep Transfer Learning model we developed are available in a text file; the model has 177 layers (Details_of_the_PsScarNet_Layers.txt).
2: The MATLAB code provided runs the model (PsScarNet_Code.m).
3: The sample photographs (right and left sides of Ringo, and unknown Painted Storks) can be used to test the model (Sample_images_PS.zip).
4: The deep transfer learning model we developed to identify Ringo; a MATLAB environment is required to run it (PsScarNet.mat).
File: Ringo-SIFT.zip (SIFT feature extraction)
Description: The sample JPG images (Ringo and non-Ringo) and the MATLAB code (detect_sift_batch_interactive_final.m) required to extract the SIFT features are provided in this zip file.
Access information
Other publicly accessible locations of the data:
- None
Data was derived from the following sources:
- None
1. Morphological measurements of Ringo
We used 82 of 100 photographs of Ringo that had proper orientation. The means of the morphological variables measured from the 2024 images were compared with those from randomly selected images taken in 2022 and 2023. Using tpsDig, we carefully measured the following morphological features in pixel units: (1) upper mandible; (2) lower mandible; (3) the distance from the mouth corner to the inner corner of the eye; (4) the distance from the inner corner of the eye to the top end of the right nostril; (5) the distance from the base of the wing to the wing tip, covering the black and white region; (6) the distance from the feathered region to the tarsus and tibia junction. All of the above measurements are available in Excel format (Morphometrics_of_Ringo.xlsx).
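The year-to-year comparison of measurement means can be sketched with a simple consistency check. All values below are invented for illustration (the actual measurements are in Morphometrics_of_Ringo.xlsx), and the mean-plus-or-minus-two-standard-deviations criterion is an assumption, not the statistical test used in the study:

```python
from statistics import mean, stdev

# Invented upper-bill lengths in pixels for one season and earlier sightings:
m1_2024 = [312.4, 309.8, 315.1, 311.0, 313.6]
m1_earlier = [310.9, 314.2, 312.7]  # invented 2022/2023 values

mu, sd = mean(m1_2024), stdev(m1_2024)

# Crude repeatability check: do the earlier-season values fall within
# mean +/- 2 SD of the 2024 measurements?
consistent = all(abs(x - mu) <= 2 * sd for x in m1_earlier)
```

Because the measurements are in pixels rather than absolute units, such comparisons implicitly assume similar camera distance and scale across photographs; ratios or angles are more robust when scale varies.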
2. Deep Transfer Learning
Starting from the pre-trained ResNet-50 model, we fine-tuned our model with Ringo datasets collected during 2022, 2023, and 2024 for training, validation, and testing. The input size is 224 × 224 pixels, and the output is a multiclass layer (n = 3). Three classes were created: (1) the right side of Ringo (n = 824 images), (2) the left side of Ringo (n = 678 images), and (3) non-marked or unknown Painted Storks (n = 1,755 images). Using this Transfer Learning approach, we developed a model trained on digital images of Ringo and other Painted Storks.
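The data-partitioning step behind "train, validate, and test" can be illustrated with a stratified split over the three classes. The class sizes below come from the description above, but the 70/15/15 ratio and the label names are assumptions for illustration; the source does not state the split used:

```python
import random

# Class sizes as reported for PsScarNet (label names are invented):
CLASS_SIZES = {"ringo_right": 824, "ringo_left": 678, "unknown_stork": 1755}

def stratified_split(class_sizes, train=0.70, val=0.15, seed=42):
    """Shuffle image indices per class, then split each class with the
    same train/val/test ratio so class proportions are preserved."""
    rng = random.Random(seed)
    split = {}
    for label, n in class_sizes.items():
        idx = list(range(n))
        rng.shuffle(idx)
        n_train, n_val = int(n * train), int(n * val)
        split[label] = {
            "train": idx[:n_train],
            "val": idx[n_train:n_train + n_val],
            "test": idx[n_train + n_val:],
        }
    return split

split = stratified_split(CLASS_SIZES)
```

Stratifying per class matters here because the dataset is imbalanced (1,755 unknown-stork images versus 678 left-side images); a naive random split could leave the validation set dominated by the majority class.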
We have provided the Deep Transfer Learning model and its MATLAB code, along with a few sample images to cross-check and validate the model. Running the model requires a MATLAB environment. We have also provided the layer details of the model.
