Unobtrusive tracking of interpersonal orienting and distance predicts the subjective quality of social interactions

Cite this dataset

Lahnakoski, Juha M; Forbes, Paul AG; McCall, Cade; Schilbach, Leonhard (2020). Unobtrusive tracking of interpersonal orienting and distance predicts the subjective quality of social interactions [Dataset]. Dryad.


Interpersonal coordination of behavior is essential for smooth social interactions. Measures of interpersonal behavior, however, often rely on subjective evaluations, invasive measurement techniques or gross measures of motion. Here, we constructed an unobtrusive motion tracking system that enables detailed analysis of behavior at the individual and interpersonal levels, which we validated using wearable sensors. We evaluated dyadic measures of joint orienting and distancing, synchrony and gaze behaviors to summarize data collected during natural conversation and joint action tasks. Our results demonstrate that patterns of proxemic behaviors, rather than more widely used measures of interpersonal synchrony, best predicted the subjective quality of the interactions. Increased distance between participants predicted lower enjoyment, while increased joint orienting toward each other during cooperation correlated with increased effort reported by the participants. Importantly, interpersonal distance was most informative about the quality of the interaction when task demands and experimental control were minimal. These results suggest that interpersonal measures of behavior gathered during minimally constrained social interactions are particularly sensitive to the subjective quality of social interactions and may be useful for interaction-based phenotyping in further studies.


These data were collected using Kinect V2 sensors, Tinkerforge IMU Brick 2.0 inertial measurement units, and questionnaires. Gaze behaviors were manually annotated from video images saved by the Kinect sensors. Anonymized data derived from the Kinect and IMU tracking results are shared in a preprocessed format due to limitations in the ethics approval for the study. Preprocessing steps applied to these intermediate data are included in the provided analysis script. Video images are not included in this dataset to preserve the privacy of our participants and to follow the requirements of the ethics approval.

Usage notes

# InteractionTracking

__Code and motion tracking results for producing results in:__

__Lahnakoski JM, Forbes PAG, McCall C, Schilbach L -__
_Unobtrusive tracking of interpersonal orienting and distance predicts the subjective quality of social interactions_

Note: Because of randomization, the exact statistics produced by the analyses may change slightly on repeated runs of the script.

(C) 2019, 2020 Juha Lahnakoski

For license information, see LICENSE file.

Directory structure:

- Matlab code for loading data and performing analyses, and the Visual Studio project for the Kinect motion tracking application.
- Data containing manual gaze ratings, Kinect data from the corresponding timepoints, and other preprocessed data either created by the analysis script or pre-saved when raw data cannot be shared.
- Basic tracking results converted to Matlab format for reproducing the final results.
- Overall scores of the questionnaires (responses to the Intrinsic Motivation Inventory are included directly in the analysis script).
- Functions used for reading data, estimating thresholds, plotting, etc.
- Any additional files needed for creating the output; currently contains only the colormap legend image.

Code description

__Motion tracking software__

Visual Studio project, including a compiled binary, for the motion tracking system (/Release folder). Requires Windows 8/10, a Kinect V2 sensor with the Kinect for Windows adapter, the associated software, and compatible hardware.

Visual Studio project for recording data from the inertial measurement units. A compiled binary is not included because the sensors have unique, hard-coded IDs that need to be changed individually for each new unit.

__Analysis scripts__

Main code for loading and analysing data. At the beginning of the script, you must set the base directory of these files so that Matlab can find the code and the tracking results.

__Video annotation scripts__

These are provided for reference only. Due to privacy concerns, the video frames used by these scripts are not available outside of the research site; contact authors in case reanalysis of the video frames is required.

Used for verifying that the timepoints marked by the experimenter during the measurement correspond to the actual times when trials start (needed due to duplicate/missing button presses).

Script for manual annotation of gaze directions used by the first rater. Randomly selects timepoints from all conditions in all dyads, and saves Kinect data as well as a subjective annotation of whether each individual is looking at the partner, looking at a target, or elsewhere.
Script for manual annotation of gaze directions used by the second rater. Loads the original rater's data to extract the same (randomly sampled) timepoints.

Script for the third, "tiebreaker" rater for annotating gaze during timepoints when the initial two raters disagreed.

__Supporting functions__

Color palette (can be replaced by other colormaps).

Simple function for transforming correlation values to p-values assuming a t-distribution (similar to the `corr` function in Matlab).
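The underlying conversion is the standard t-test for a Pearson correlation: with n samples, the statistic t = r·sqrt((n−2)/(1−r²)) follows a t-distribution with n−2 degrees of freedom under the null hypothesis. A minimal Python sketch of the same computation (the name `corr_to_p` and the use of SciPy are illustrative, not the repository's code):

```python
import math

from scipy.stats import t as t_dist


def corr_to_p(r: float, n: int) -> float:
    """Two-tailed p-value for a Pearson correlation r computed from
    n samples, assuming a t-distribution with n - 2 degrees of freedom."""
    df = n - 2
    t_stat = r * math.sqrt(df / (1.0 - r * r))
    return 2.0 * t_dist.sf(abs(t_stat), df)


# e.g. r = 0.5 over 30 samples gives p < 0.01
p = corr_to_p(0.5, 30)
```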

Function for producing a Gaussian smoothing kernel.

Convolution for data with NaNs, from Matlab Central:
Copyright (c) 2013, Benjamin Kraus, All rights reserved.
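These two helpers combine naturally: a normalized Gaussian kernel, and convolution that ignores NaNs by convolving the NaN-zeroed signal and the validity mask separately, then renormalizing. A Python sketch of that mask-renormalization idea (the names `gaussian_kernel` and `nanconv` and their signatures are illustrative; the Matlab originals may differ):

```python
import numpy as np


def gaussian_kernel(sigma: float, radius: int) -> np.ndarray:
    """Normalized 1-D Gaussian smoothing kernel of length 2*radius + 1."""
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()


def nanconv(data: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Convolve while ignoring NaNs: convolve the NaN-zeroed signal and
    the validity mask separately, then divide to renormalize the weights."""
    valid = ~np.isnan(data)
    filled = np.where(valid, data, 0.0)
    num = np.convolve(filled, kernel, mode="same")
    den = np.convolve(valid.astype(float), kernel, mode="same")
    # Where no valid samples fall under the kernel, the output stays NaN.
    return num / np.where(den > 0, den, np.nan)
```

Because numerator and denominator use the same weights, smoothing a constant signal with missing samples returns the constant rather than dips near the NaNs.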

Function to visualize lines as patches.
Brett Shoelson (2020). Patchline
MATLAB Central File Exchange. Retrieved January 30, 2020.

Simple function for transforming p-values to correlation values assuming a t-distribution (the inverse of the p-values returned by the `corr` function in Matlab).

Reads headers of Kinect result files saved by the
tracking system.

Reads the Kinect tracking result files saved by the
tracking system.

Reads in IMU values in the format saved by the tracking system.

Function for registering two point clouds.
(Points are assumed to be in the same order in the two clouds).
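With correspondences known, rigid registration has a closed-form solution via the Kabsch algorithm (SVD of the cross-covariance of the centered clouds). A Python/NumPy sketch under that assumption (the name `register_points` is illustrative; the repository's function may use a different method or convention):

```python
import numpy as np


def register_points(P: np.ndarray, Q: np.ndarray):
    """Rigid registration of point cloud P onto Q, where row i of P
    corresponds to row i of Q. Returns rotation R and translation t
    such that P @ R.T + t approximates Q (Kabsch algorithm)."""
    p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
    # Cross-covariance of the centered clouds.
    H = (P - p_mean).T @ (Q - q_mean)
    U, _, Vt = np.linalg.svd(H)
    # Sign correction to guarantee a proper rotation (no reflection).
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0] * (P.shape[1] - 1) + [d])
    R = Vt.T @ D @ U.T
    t = q_mean - R @ p_mean
    return R, t
```

The least-squares residual `np.linalg.norm(P @ R.T + t - Q)` can then serve as a registration-quality check.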

Massively univariate 2-way repeated-measures ANOVA.
Based on rm_anova.m by Aaron Schurger (2005.02.04), derived from Keppel & Wickens (2004) "Design and Analysis", ch. 18.
Edited by Juha Lahnakoski (Jan 10th, 2020): updated to handle multiple variables at once and to save results to a struct rather than a cell table.

Function for producing windowed cross-correlation matrices (and, optionally, peak timecourses).
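As an illustration of the windowed cross-correlation idea, here is a Python sketch that correlates each window of one timecourse with lagged windows of another (the name `windowed_xcorr` and the windowing conventions are ours; the Matlab function may differ):

```python
import numpy as np


def windowed_xcorr(x: np.ndarray, y: np.ndarray, win: int,
                   step: int, max_lag: int) -> np.ndarray:
    """Windowed cross-correlation: Pearson-correlate each window of x
    with lagged windows of y. Returns an (n_windows, 2*max_lag + 1)
    matrix; column j corresponds to lag j - max_lag."""
    # Keep starts far enough from the edges that every lag fits.
    starts = list(range(max_lag, len(x) - win - max_lag + 1, step))
    out = np.empty((len(starts), 2 * max_lag + 1))
    for i, s in enumerate(starts):
        xw = x[s:s + win]
        for j, lag in enumerate(range(-max_lag, max_lag + 1)):
            yw = y[s + lag:s + lag + win]
            out[i, j] = np.corrcoef(xw, yw)[0, 1]
    return out

# Optional peak-lag timecourse: the best lag within each window, e.g.
# peak_lags = np.arange(-max_lag, max_lag + 1)[np.argmax(out, axis=1)]
```

If `y` is a delayed copy of `x`, the peak of each row falls at the true delay, which is a convenient sanity check for lag conventions.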

Creates the motion tracking results from the raw data files.
This is included for reference only; raw data logs cannot be
shared due to privacy concerns.

__Required toolbox licenses for Matlab__



First version: July 18th, 2019 \
Last revision: July 10th, 2020

Juha Lahnakoski, Max Planck Institute of Psychiatry, FZ Juelich