
Enhancing research informatics core user satisfaction through agile practices


Post, Andrew et al. (2021), Enhancing research informatics core user satisfaction through agile practices, Dryad, Dataset.


Objective: The Huntsman Cancer Institute (HCI) Research Informatics Shared Resource (RISR), a software and database development core facility, sought to address a lack of published operational best practices for research informatics cores. It aimed to apply such practices to enhance its effectiveness after an increase in team size from 20 to 31 full-time equivalents coincided with a reduction in user satisfaction.

Materials and Methods: RISR migrated from a water-scrum-fall model of software development to agile software development practices, which emphasize iteration and collaboration. RISR’s agile implementation emphasizes the product owner role, which is responsible for user engagement and may be particularly valuable in fields such as science, where software development requires close engagement with users.

Results: All of RISR’s software development teams implemented agile practices in early 2020. All project teams are led by a product owner who serves as the voice of the user on the development team. Annual user survey scores for service quality and turnaround time, recorded nine months after implementation, increased by 17% and 11%, respectively.

Discussion: RISR is illustrative of the increasing size of research informatics cores and the need to identify best practices for maintaining high effectiveness. Agile practices may address concerns about the fit of software engineering practices in science. The study had only one time point after the implementation of agile practices and was conducted at a single site, limiting its generalizability.

Conclusion: Agile software development may substantially increase a research informatics core facility’s effectiveness and should be studied further as a potential best practice for how such cores are operated.


We used Huntsman Cancer Institute (HCI)'s annual user survey of its shared resources to evaluate the impact of the Research Informatics Shared Resource (RISR)'s new structure in its first year. The survey is administered by the HCI Research Administration office and is distributed through SurveyMonkey to cancer center members and recent users of at least one HCI shared resource. While the survey asks many questions that apply to RISR, the two questions that are the focus of this analysis are listed below:

  • Overall, how would you rate the quality of the service/product you received from the Research Informatics Shared Resource? Answers: Exceptional, high, average, poor, unacceptable
  • Overall, how would you rate the turnaround time for receiving data, products or other services from the Research Informatics Shared Resource? Answers: Exceptional, high, average, poor, unacceptable

The user survey was open between September 11 and September 24, 2020, and thus provided feedback nine months after RISR introduced agile practices into its operations. A total of 17 of the 52 identified RISR users answered the questions above (33% response rate).
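The reported response rate follows directly from the counts above; a minimal sketch of that arithmetic (using only the figures stated in the text):

```python
# Response-rate arithmetic from the survey figures reported above.
respondents = 17       # users who answered both survey questions
identified_users = 52  # RISR users identified as eligible for the survey

response_rate = respondents / identified_users
print(f"{response_rate:.0%}")  # → 33%
```

Rounding to the nearest whole percent yields the 33% figure reported.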


National Institutes of Health, Award: P30CA042014