
Usability of electronic health record systems in UK emergency departments

Cite this dataset

Bloom, Ben (2021). Usability of electronic health record systems in UK emergency departments [Dataset]. Dryad. https://doi.org/10.5061/dryad.7h44j0zsq

Abstract

Background

The large volume of patients, rapid staff turnover and high work pressure mean that the usability of all systems within the Emergency Department (ED) is important. The transition to electronic health records (EHRs) has brought many benefits to emergency care but imposes a significant data-entry burden on staff. Poor usability has both a direct consequence and an opportunity cost in staff time and resources that could otherwise be spent on patient care. This research measures the usability of EHR systems in UK EDs using a validated assessment tool.

Methods

This was a survey of members and fellows of the Royal College of Emergency Medicine, conducted during summer 2019. The primary outcome was the System Usability Scale score, which ranges from 0 (worst) to 100 (best). Scores were compared against an internationally recognised threshold of acceptable usability of 68. Results were analysed by EHR system, country, healthcare organisation and physician grade. Only EHR systems with at least 20 responses were analysed.

Results

There were 1,647 responses from a total population of 8,794 (19%), representing 192 healthcare organisations (mainly UK National Health Service) and 25 EHR systems. Fifteen EHR systems had at least 20 responses and were included in the analysis. No EHR system achieved a median usability score that met the industry standard of acceptable usability.

The median usability score was 53 (interquartile range [IQR] 35–68). Individual EHR systems’ scores ranged from 35 (IQR 26–53) to 65 (IQR 44–80).

Conclusion

In this survey, no UK ED EHR system met the internationally validated standard of acceptable usability for information technology. Improving the usability of emergency department EHRs could be a cheap and effective way of increasing staff productivity.

Methods

This was an open web-based survey run by the Royal College of Emergency Medicine (RCEM) Informatics Committee. A survey tool (REDCap, https://www.project-redcap.org/) was used to administer the System Usability Scale to members and fellows of RCEM from 25 June to 15 August 2019.

The System Usability Scale has a minimum value of 0 and a maximum value of 100. We defined acceptable usability as a System Usability Scale score of ≥ 68. A score of 68 is both an average score across industries and the threshold of acceptable usability (Figure 1) [10,11]. The System Usability Scale was treated as continuous, and also dichotomised into acceptable and non-acceptable usability at a threshold of 68. EHR system, country, healthcare organisation, consultant status (consultants vs. non-consultants), and trainee status (trainees vs. non-trainees) were a priori defined as potential predictive variables of acceptable usability. Physician grade was categorised in this way to include Staff Grade, Associate Specialist and Specialty Grade (SAS) doctors, who occupy neither training nor consultant grades. In addition, respondents were asked whether their system can link directly to blood tests, x-rays, out-patient notes, or primary care notes, or whether users have to log into a separate system to access them. Lastly, respondents were asked whether they wished to be contacted to take part in future usability work.
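The per-respondent SUS scores in the dataset follow the standard scoring of the ten-item System Usability Scale (Brooke's method): odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the summed contributions are multiplied by 2.5 to give a 0–100 score. A minimal sketch of that calculation (the function name is illustrative, not from the dataset):

```python
def sus_score(responses):
    """Standard System Usability Scale score (0-100) from the ten
    1-5 Likert responses.

    Odd-numbered items (1st, 3rd, ...) contribute (response - 1);
    even-numbered items (2nd, 4th, ...) contribute (5 - response);
    the sum of contributions is multiplied by 2.5.
    """
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses, each between 1 and 5")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5
```

For example, the best possible response pattern (strongly agree with every positively worded item, strongly disagree with every negatively worded one) scores 100.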

Usage notes

There are two tabs: Data and EHR system key.

Data contains five columns: a respondent ID; Healthcare organisation code (an anonymised code representing an NHS trust or board, or other unit of healthcare organisation); EHR system (the vendor of the EHR); EHR system code (a numerical code unique to each EHR system, included for ease of analysis); and the System Usability Scale score calculated for each respondent (SUS).

EHR system key contains the unique numerical code and EHR system to which the code refers.

There are missing values.
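Because the Data tab includes a per-respondent SUS column and contains missing values, a minimal sketch of summarising it as in the abstract (median, IQR and the proportion meeting the acceptability threshold) is shown below. It assumes the tab has been exported to CSV with a column named SUS; the file name and function name are illustrative, and rows with a missing SUS value are simply skipped:

```python
import csv
import statistics

def summarise_sus(path, threshold=68):
    """Summarise the SUS column of a CSV export of the Data tab.

    Rows with an empty SUS value (the dataset contains missing
    values) are skipped. Returns the median, the interquartile
    range, and the proportion of scores at or above `threshold`.
    """
    with open(path, newline="") as f:
        scores = [float(row["SUS"]) for row in csv.DictReader(f)
                  if row["SUS"].strip()]
    q1, median, q3 = statistics.quantiles(scores, n=4)
    acceptable = sum(s >= threshold for s in scores) / len(scores)
    return {"median": median, "iqr": (q1, q3),
            "prop_acceptable": acceptable}
```

The same per-system breakdown used in the paper could be obtained by grouping rows on the EHR system code before summarising.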