Data from: Visual context influences how humans walk on winding paths
Data files (Feb 13, 2026 version; 24.81 GB total):
- Environ-Salience_Kinematics_ACCL_n28.mat (1.26 GB)
- Environ-Salience_Kinematics_HIFH_n28.mat (5.91 GB)
- Environ-Salience_Kinematics_HIFL_n28.mat (5.90 GB)
- Environ-Salience_Kinematics_STRH_n28.mat (5.86 GB)
- Environ-Salience_Kinematics_STRL_n28.mat (5.86 GB)
- Environ-Salience_MarkerSetKey.xlsx (12.28 KB)
- Environ-Salience_PairStat2MotionTrials.xlsx (71.86 KB)
- Environ-Salience_Participant-Info.xlsx (18.16 KB)
- Environ-Salience_STAT_n28.mat (10.06 MB)
- README.md (7.22 KB)
Abstract
During walking, proactive balance control mechanisms enable individuals to anticipate and respond to changes in their environment, such as terrain layout or obstacles. These mechanisms rely on sensory inputs, particularly vision, to adjust gait patterns in advance. Visuomotor coupling integrates information from central and peripheral vision to guide locomotion. Central vision provides detailed information about the walking path, including surface characteristics and layout. Peripheral vision processes environmental landmarks to support spatial orientation, depth perception, and self-motion. Together, these inputs allow the nervous system to plan and execute gait adjustments. Disruptions to visual information, whether due to reduced visual acuity, contrast sensitivity, or environmental conditions, can significantly alter walking behavior and challenge proactive balance control. The study associated with these data was designed to investigate how changes in the availability of central and peripheral visual information affect walking behavior when the fundamental walking task remains the same. It also investigated whether these effects vary with path complexity. For the study, 28 young healthy human adult participants (16F/12M; Age 26.2±4.2yrs) walked on both straight and winding virtual paths, while visual information from the walking path and surrounding environment was systematically reduced. This dataset includes their head, pelvis, and feet kinematics as they performed each of these tasks. Additional files provide participant characteristics, such as demographics, anthropometrics, and assessment scores, as well as a marker-set definition key. The study was designed to grow our understanding of visual perception-driven gait adaptations during different goal-directed walking tasks. These data offer a resource to investigate visual and mechanical factors that affect dynamic balance control during walking.
[DOI Link: https://doi.org/10.5061/dryad.7sqv9s56j]
Description of the Data and File Structure
We collected data from 28 young, healthy human adult participants (16F/12M; Age 26.2±4.2yrs). Age, height, body mass, and leg length (measured from the greater trochanter to the lateral malleolus) of each participant were recorded. Additionally, participants completed three assessments: a contrast sensitivity test (logCSWeber), a four-choice reaction time test (4CRT), and a four-square step test (FSST).
Participants walked on a 1.2 m wide motorized treadmill in a Motek M-Gait virtual reality system (https://www.motekmedical.com/). They walked on each of two 0.45 m wide virtual paths: Straight (STR) and Winding (HIF). The pseudo-randomly oscillating winding path was created from a sum of three sine waves with incommensurate frequencies:
z(x)= 0.22 sin(A·0.46875x) + 0.05 sin(A·0.625x) + 0.03 sin(A·0.9375x)
where z is the lateral position (in meters) of the path center, A is a frequency scaling factor, and x is the forward treadmill distance (in meters), starting at -0.55 m due to the projection of the path relative to the origin of the treadmill. The frequency scaling factor was A = 0 for the straight (STR) path and A = 4 for the winding (HIF) path.
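The path equation above can be sketched in Python (a hedged translation of the formula for illustration; the dataset itself was generated and processed in Matlab):

```python
import math

def path_center_z(x, A):
    """Lateral path-center position z (m) at forward distance x (m).

    A is the frequency scaling factor from the equation above:
    A = 0 gives the straight (STR) path, A = 4 the winding (HIF) path.
    """
    return (0.22 * math.sin(A * 0.46875 * x)
            + 0.05 * math.sin(A * 0.625 * x)
            + 0.03 * math.sin(A * 0.9375 * x))

# Straight path: z is identically zero for any x.
print(path_center_z(5.0, A=0))  # 0.0
# Winding path: |z| can never exceed 0.22 + 0.05 + 0.03 = 0.30 m,
# so the path center stays within the 1.2 m wide treadmill.
print(path_center_z(5.0, A=4))
```

Because the three frequencies are incommensurate multiples of each other after scaling, the winding path never exactly repeats over a trial, which is presumably why the description calls it pseudo-random.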
Each path shape was presented in two color contrasts: High-Contrast and Low-Contrast, and each combination of path shape and color contrast was presented within each of two virtual environments: Rich Forest and Sparse Plain. The Rich Forest environment featured dense foliage, while the Sparse Plain environment depicted an open, grassy plain.
Therefore, each participant walked under a total of eight experimental conditions: 2 path shapes × 2 path contrasts × 2 environments. For each condition, participants completed two 3-min experimental trials.
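The 2 × 2 × 2 design can be enumerated programmatically. The condition codes below mirror the file-name abbreviations (STR/HIF for path shape, H/L for contrast); the environment labels are descriptive shorthand, not names used inside the data files:

```python
from itertools import product

shapes = ["STR", "HIF"]             # straight, winding (high-frequency)
contrasts = ["H", "L"]              # high-contrast, low-contrast path
environments = ["Forest", "Plain"]  # rich forest, sparse plain scenes

# Each (shape+contrast, environment) pair is one experimental condition.
conditions = [(s + c, env)
              for (s, c, env) in product(shapes, contrasts, environments)]
print(len(conditions))  # 8 conditions, two 3-min trials each
```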
For each trial performed by each participant, motion capture data were recorded with a 10-camera Vicon system (https://www.vicon.com/). These data were cleaned using Vicon Nexus software, and further processed in Matlab (https://www.mathworks.com/). All marker trajectories and path data (treadmill distance) are provided in this data set.
Files and Variables
File: Environ-Salience_Participant-Info.xlsx
Description: Participant characteristics and baseline test scores
Variables
- Group Age (years)
- Group Body Mass (kg)
- Group Body Height (m)
- Group Body Mass Index (BMI; kg/m^2)
- Individual Participant Sex (F/M)
- Individual Participant Leg Length (m)
- Individual Contrast Sensitivity (logCSWeber)
- Individual Four Square Step Test (FSST) Times (s)
- Individual Four-Choice Reaction Time (4CRT) Accuracy Rates (%)
- Individual Four-Choice Reaction Time (4CRT) Mean Reaction Times (s)
File: Environ-Salience_MarkerSetKey.xlsx
Description: Definitions of anatomical landmarks where motion capture markers were placed on each participant.
Variables
- Marker #
- Marker Label
- Marker Description
File: Environ-Salience_PairStat2MotionTrials.xlsx
Description: Key to match static trials with their corresponding motion/walking trials.
Variables
- Static Trial
- Walking Path/File
- Participant
- Environment
- Trial
File: Environ-Salience_STAT_n28.mat
Description: Kinematic (motion capture) marker data for Static Calibration ("STAT") trials performed by each participant. Marker data recorded at 100 Hz and interpolated to 600 Hz.
Variables
- Video Frame #
- Time [s]
- Treadmill distance ("TMDist") [m]
- 3D (XYZ) coordinates of 18 reflective markers
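The 100 Hz → 600 Hz resampling noted in the file descriptions can be illustrated with a minimal linear-interpolation sketch. Linear interpolation is an illustrative assumption here; the description does not state which interpolation method was actually used:

```python
import numpy as np

def upsample(marker_xyz, fs_in=100, fs_out=600):
    """Resample an (n_frames, 3) marker trajectory from fs_in to
    fs_out Hz by per-axis linear interpolation (illustrative only)."""
    n = marker_xyz.shape[0]
    t_in = np.arange(n) / fs_in                          # original timestamps
    t_out = np.arange(int(n * fs_out / fs_in)) / fs_out  # target timestamps
    return np.column_stack([np.interp(t_out, t_in, marker_xyz[:, k])
                            for k in range(marker_xyz.shape[1])])

demo = np.zeros((100, 3))    # 1 s of 100 Hz marker data
print(upsample(demo).shape)  # (600, 3)
```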
File: Environ-Salience_Kinematics_ACCL_n28.mat
Description: Kinematic (motion capture) marker data for Acclimation ("ACCL") trials performed by each participant. Marker data recorded at 100 Hz and interpolated to 600 Hz.
Variables
- Video Frame #
- Time [s]
- Treadmill distance ("TMDist") [m]
- 3D (XYZ) coordinates of 18 reflective markers
File: Environ-Salience_Kinematics_HIFH_n28.mat
Description: Kinematic (motion capture) marker data (600 Hz) for all walking trials performed by each participant on the High-Frequency High-Contrast (HIFH) walking paths in both the Rich Forest and Sparse Plain virtual scenes.
Variables
- Video Frame #
- Time [s]
- Treadmill distance ("TMDist") [m]
- 3D (XYZ) coordinates of 18 reflective markers
File: Environ-Salience_Kinematics_HIFL_n28.mat
Description: Kinematic (motion capture) marker data (600 Hz) for all walking trials performed by each participant on the High-Frequency Low-Contrast (HIFL) walking paths in both the Rich Forest and Sparse Plain virtual scenes.
Variables
- Video Frame #
- Time [s]
- Treadmill distance ("TMDist") [m]
- 3D (XYZ) coordinates of 18 reflective markers
File: Environ-Salience_Kinematics_STRH_n28.mat
Description: Kinematic (motion capture) marker data (600 Hz) for all walking trials performed by each participant on the Straight High-Contrast (STRH) walking paths in both the Rich Forest and Sparse Plain virtual scenes.
Variables
- Video Frame #
- Time [s]
- Treadmill distance ("TMDist") [m]
- 3D (XYZ) coordinates of 18 reflective markers
File: Environ-Salience_Kinematics_STRL_n28.mat
Description: Kinematic (motion capture) marker data (600 Hz) for all walking trials performed by each participant on the Straight Low-Contrast (STRL) walking paths in both the Rich Forest and Sparse Plain virtual scenes.
Variables
- Video Frame #
- Time [s]
- Treadmill distance ("TMDist") [m]
- 3D (XYZ) coordinates of 18 reflective markers
Code/software
Primary data files are in Matlab *.mat format (https://www.mathworks.com/).
There are multiple open-source alternatives to Matlab. Two common alternatives are GNU Octave (https://octave.org/) and Scilab (https://www.scilab.org/), but numerous others exist as well.
Additional / ancillary data files (3) are in Microsoft Excel *.xlsx format (https://www.microsoft.com/).
There are multiple open-source alternatives to Microsoft Excel. The most prominent is probably LibreOffice (https://libreoffice.org/), but other options are also available.
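The *.mat files can also be read from Python. One caveat: .mat files larger than 2 GB are typically saved in Matlab's v7.3 (HDF5-based) format, which scipy.io.loadmat cannot read; those would require an HDF5 reader such as h5py. The self-contained round trip below uses the older v5 format as a demonstration. The variable name "TMDist" echoes the treadmill-distance variable listed above, but the exact internal structure of this dataset's files may differ:

```python
import os
import tempfile

import numpy as np
from scipy.io import loadmat, savemat

# Write and re-read a small v5-format .mat file as a demonstration.
with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "demo.mat")
    savemat(path, {"TMDist": np.arange(5.0)})  # hypothetical variable
    data = loadmat(path)
    # loadmat returns at-least-2-D arrays: a vector comes back as (1, 5).
    print(data["TMDist"].shape)
```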
Access information
Other publicly accessible locations of the data:
- N/A
Data was derived from the following sources:
- N/A
Human subjects data
For this study, human participants gave explicit consent to publish their de-identified data in the public domain.
Primary data are electronic (motion capture) and therefore contain no identifiers. No identifiers are used in file naming; all files are named by randomly assigned participant number. Demographic and anthropometric data for participants (see Excel spreadsheet) are kept to the minimum necessary to process and analyze the data. Variables not directly necessary for data processing are reported as aggregate values (mean ± s.d.).
This experiment included data from 28 healthy human adult participants (16F/12M; Age 26.2±4.2yrs). Data regarding their baseline demographics and relevant assessment scores are provided (*.xlsx file).
Participants walked on a motorized treadmill in a Motek M-Gait virtual reality system (https://www.motekmedical.com/). They walked on 4 distinct walking paths, varying by 2 shapes (straight and winding) and 2 color contrasts (high and low). Each path was presented within two distinct virtual environments. Detailed descriptions of the walking paths and environments are provided in the associated README file. Each participant performed two experimental trials (3 min long each) for all 8 walking conditions.
For each trial performed by each participant, motion capture data were recorded with a 10-camera Vicon system (https://www.vicon.com/). These data were cleaned using Vicon Nexus software, and further processed in Matlab (https://www.mathworks.com/). All marker trajectories and path data (treadmill distance) are provided in this data set.
