
Data from: Multiobjective optimization algorithm for accurate MADYMO reconstruction of vehicle-pedestrian accidents

Cite this dataset

Huang, Jiang et al. (2022). Data from: Multiobjective optimization algorithm for accurate MADYMO reconstruction of vehicle-pedestrian accidents [Dataset]. Dryad. https://doi.org/10.5061/dryad.2ngf1vhs6

Abstract

Uncertainty in reconstruction accuracy is a critical problem in the current traffic accident reconstruction process. The purpose of this study is to explore the use of an improved optimization algorithm combined with MAthematical DYnamic MOdels (MADYMO) multibody simulations and crash data to conduct accurate reconstructions of vehicle–pedestrian accidents. The performance of three commonly employed multiobjective optimization algorithms, namely the nondominated sorting genetic algorithm-II (NSGA-II), the neighbourhood cultivation genetic algorithm (NCGA) and multiobjective particle swarm optimization (MOPSO), was compared and evaluated. The effects of the number of objective functions, the selection of different objective functions and the optimal number of iterations were also investigated. The present study indicated that NSGA-II had better convergence and generated more noninferior solutions and better final solutions than NCGA and MOPSO, and that multibody simulations coupled with optimization algorithms can be used to accurately reconstruct vehicle–pedestrian collisions.

Methods

We selected the commercial DJI Inspire 2 drone (DJI, China) to document the accident scene and built the accident scene model in Context Capture software (Bentley, USA). We compared this model with video information to determine the speed of the vehicle at the time of the accident. A FARO Focus 3D S120 laser scanner and the postprocessing software FARO SCENE (FARO, USA) were used to scan the accident vehicle and obtain point cloud data, which were processed in Geomagic 2017 software (3D Systems, USA) to produce the facet model of the vehicle. We then built the accident collision model in MADYMO using the human body model provided by TNO. The relevant models have been uploaded with the metadata.

Based on the actual accident outcomes, the objective functions were established from different collision markers. The optimization algorithms were then run in Isight software (Dassault Systèmes SIMULIA) to optimize the parameters of the accident crash model and to study the effects of different algorithms and different preimpact parameters on the optimization results. We created two datasets (described below): one containing three subsets of optimization results for the different algorithms, and the other containing 11 subsets of optimization results for the different preimpact parameters.
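To illustrate the idea of the workflow above, the following is a minimal, self-contained sketch of multiobjective reconstruction. It is not the authors' MADYMO/Isight pipeline: the marker names, the surrogate "simulation" formulas, the observed values and the random search are all hypothetical stand-ins. It only demonstrates the non-dominated (Pareto) filtering step that NSGA-II builds on, applied to objective functions defined as errors between simulated and observed collision markers.

```python
import random

# Hypothetical observed collision markers (toy values, not from the dataset):
# pedestrian throw distance (m) and head wrap-around distance (m).
OBSERVED = {"throw": 12.0, "wad": 1.9}

def surrogate_sim(speed, brake_decel):
    """Toy stand-in for a MADYMO multibody run: maps preimpact
    parameters (impact speed, braking deceleration) to markers
    via invented empirical-style formulas."""
    throw = 0.08 * speed ** 2 / (1.0 + brake_decel)
    wad = 1.2 + 0.012 * speed
    return {"throw": throw, "wad": wad}

def objectives(params):
    """Two objective functions: absolute error of each marker
    against the observed accident outcome (both minimized)."""
    sim = surrogate_sim(*params)
    return (abs(sim["throw"] - OBSERVED["throw"]),
            abs(sim["wad"] - OBSERVED["wad"]))

def dominates(a, b):
    """a dominates b if it is no worse in every objective
    and strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(population):
    """Non-dominated filtering (the sorting step at the heart of
    NSGA-II): keep candidates no other candidate dominates."""
    scored = [(p, objectives(p)) for p in population]
    return [(p, f) for p, f in scored
            if not any(dominates(g, f) for _, g in scored)]

# Random candidate preimpact parameters: speed 20-60 km/h,
# braking deceleration 0-8 m/s^2 (illustrative ranges).
random.seed(0)
pop = [(random.uniform(20, 60), random.uniform(0.0, 8.0))
       for _ in range(200)]
front = pareto_front(pop)
print(len(front), "non-dominated solutions out of", len(pop))
```

In the actual study, each objective evaluation is a full MADYMO simulation rather than a closed-form surrogate, and NSGA-II additionally applies crowding-distance selection, crossover and mutation over many generations; this sketch only shows how noninferior solutions are identified.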

Usage notes

The ZIP files contain the accident scene model, the vehicle facet model, two datasets of accident simulation parameters obtained by the algorithms, and two datasets of validation cases.

The download links for the related software are as follows:

Context Capture software: https://www.bentley.com/software/contextcapture/  

FARO SCENE: https://www.faro.com/en/Products/Software/SCENE-Software 

Geomagic: https://cn.3dsystems.com/software and alternative software MeshLab: https://www.meshlab.net/

Isight: https://www.3ds.com/newsroom/press-releases/dassault-systemes-announces-isight-abaqus and alternative software R: https://www.r-project.org/

MADYMO: https://tass.plm.automation.siemens.com/cn/madymo-0 

HyperMesh: https://www.altair.com/hypermesh/ and alternative software ANSA: https://www.ansa-usa.com/software/ansa/

Funding

National Key Research and Development Plan, Award: 2022YFC3302002

National Natural Science Foundation of China, Award: 82171872

National Natural Science Foundation of China, Award: 21ZR1464600

Shanghai Key Laboratory of Forensic Medicine, Academy of Forensic Science, Award: 21DZ2270800

Shanghai Forensic Service Platform, Award: 19DZ2290900

Central Research Institute Public Project, Award: GY2020G-4 and GY2021G-5

Guizhou Provincial College Students' Innovation and Entrepreneurship Training Plan Project, Award: S202110660012