
Survey of Core Facilities Raw Data

Cite this dataset

Kos-Braun, Isabelle; Gerlach, Bjoern; Pitzer, Claudia (2020). Survey of Core Facilities Raw Data [Dataset]. Dryad. https://doi.org/10.5061/dryad.zkh18938m

Abstract

Recently, it has become evident that academic research faces issues with the reproducibility of research data. Because Core Facilities (CFs) occupy a central position in the research infrastructure, they are well placed to promote and disseminate good research standards among their users. To identify the most important factors for research quality, we surveyed 253 CFs across Europe about their practices and analysed in detail the interaction between CFs and their users, from first contact to publication of the results. Although the survey showed that CFs aim to train and advise their users, it highlighted four areas whose improvement would directly increase research quality: 1) motivating users to follow the advice and procedures for best research practice, 2) providing clear guidance on data management practices, 3) improving communication along the whole research process and 4) clearly defining the responsibilities of each party.

Methods

We developed a 68-question online survey on research quality in core facilities (CFs) using the LimeSurvey software. We sent the survey individually by email to the leaders of 1000 CFs. In addition, the survey was publicized in the newsletter of CTLS (Core Technologies for Life Sciences), and several of the facilities we contacted by email forwarded the survey link to their colleagues. The survey was open from December 2019 to July 2020, and all participants were anonymous. We received 276 forms in total (a 28% participation rate), 253 of which were complete.

The survey contained yes/no, multiple-choice and open-field text questions. The survey data were analysed using Microsoft Office 365 Excel. Twenty-eight free-text fields allowed respondents to express themselves freely, to reduce potential bias stemming from suggested answers. Open-field answers were evaluated by reading each reply individually and defining categories manually, so that the categories reflect the opinions of the participants as faithfully as possible (see the last sheet, "explanations", of the Excel file). Keywords were then chosen to allow automatic counting in Excel.
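The counting itself was carried out in Excel; the sketch below shows an equivalent keyword-based tally in Python, purely as an illustration of the procedure. The file name, sheet name, column name, categories and keyword lists are hypothetical and are not taken from the dataset.

```python
import pandas as pd

# Load the free-text answers (sheet and column names below are assumptions,
# not the actual layout of the published Excel workbook).
answers = pd.read_excel("survey_raw_data.xlsx", sheet_name="open_fields")

# Keywords chosen after manually reading the replies, one list per category
# (illustrative examples only).
categories = {
    "training": ["training", "course", "workshop"],
    "data_management": ["data management", "metadata", "storage"],
    "communication": ["communication", "feedback", "meeting"],
}

def count_category(texts: pd.Series, keywords: list[str]) -> int:
    """Count answers mentioning at least one keyword (case-insensitive)."""
    pattern = "|".join(keywords)
    return int(texts.str.contains(pattern, case=False, na=False).sum())

# Tally each category over one hypothetical free-text column.
counts = {name: count_category(answers["q12_free_text"], kws)
          for name, kws in categories.items()}
print(counts)
```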

We analysed the data using standard Excel tools in three different ways: 1) all facilities together, 2) facilities grouped by their type/specialization (genomics, microscopy, etc.) and 3) facilities grouped by their operating mode (full-, hybrid- or self-service).
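For readers who prefer a scripted workflow, the minimal sketch below reproduces the same three grouping levels with pandas. Again, the file, sheet and column names are assumptions for illustration only.

```python
import pandas as pd

# Hypothetical sheet with one row per complete response.
df = pd.read_excel("survey_raw_data.xlsx", sheet_name="complete_responses")

# 1) All facilities together.
overall = df["has_quality_guidelines"].value_counts(normalize=True)

# 2) Grouped by facility type/specialization (genomics, microscopy, ...).
by_type = (df.groupby("facility_type")["has_quality_guidelines"]
             .value_counts(normalize=True))

# 3) Grouped by operating mode (full-, hybrid- or self-service).
by_mode = (df.groupby("operating_mode")["has_quality_guidelines"]
             .value_counts(normalize=True))

print(overall, by_type, by_mode, sep="\n\n")
```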

Funding

Federal Ministry of Education and Research BMBF, Award: 01PW18001
