The role of health informatics specialists, specifically nursing informaticists, has evolved over the past several decades with advances in health information technology. The American Nursing Informatics Association (2018) defines nursing informatics as a combination of "nursing science, information science, and computer science." Roles and responsibilities of nurse informatics specialists vary by facility but commonly involve quality metrics, clinical documentation, and information technology support. Evolving roles driven by the HITECH Act of 2009, the Meaningful Use incentive programs, and financial reimbursement systems that reward value-based care have increased the need for nursing informaticists (Nagle, Sermeus, & Junger, 2017). As clinical leaders with technical and analytic skills, health informatics professionals can drive change within organizations seeking to improve quality of care for patients and reduce healthcare expenditures. Usability features of a hospital's electronic health record (EHR) system are critical to ensure clinicians are equipped with the necessary tools to provide safe, high-quality care.
This research project assessed the usability of a 54-bed hospital’s electronic health record (EHR) to provide the organization with a snapshot of nursing staff satisfaction, efficiency, and effectiveness. The Data-Information-Knowledge-Wisdom framework was applied to study results and demonstrate the power of nursing perceptions to drive change within a healthcare organization. Efforts to improve patient safety and quality through usable EHR tools align with the goals of health care reform.
The health care system in the United States has sought to reform issues related to access, cost, and quality for over three decades (Sultz & Young, 2014). The Triple Aim Initiative, launched by the Institute for Healthcare Improvement (2017), seeks to "improve the patient experience of care (including quality and satisfaction); improve the health of populations, and reduce the per capita cost of healthcare" (p. 1).
Key legislation passed to meet the goals of health care reform includes the American Recovery and Reinvestment Act of 2009 and the subsequent HITECH Act. Providing federally funded financial incentives for healthcare organizations to adopt electronic health records was an attempt to alleviate issues of cost and quality. Implementing health information technology aims to improve quality and reduce cost by increasing transparency, improving communication between providers and facilities, and exchanging and storing patient data privately and securely. As of 2016, "over 95 percent of all eligible and Critical Access hospitals have demonstrated meaningful use of certified health information technology through participation in the Centers for Medicare & Medicaid Services (CMS) Electronic Health Record (EHR) Incentive Programs" (Office of the National Coordinator for Health Information Technology, 2017, p. 1).
These advances in health information technology are aimed at transforming care delivery processes and supporting patients and clinicians. Despite the Triple Aim's efforts to improve the health care system, clinician burnout and dissatisfaction with electronic medical records are a growing concern among healthcare leaders. Bodenheimer and Sinsky (2014) suggested adding a fourth component to the Triple Aim measures, which includes “the goal of improving the work life of health care providers” (p. 573). Additionally, Bodenheimer and Sinsky (2014) report that clinician dissatisfaction with EHR use decreases the quality of care provided and increases patient safety risks, resulting in avoidable health care costs.
Clinical end-user dissatisfaction has highlighted the need for usability evaluations of electronic health records. The International Organization for Standardization (2018) defines the three main components of usability as efficiency, effectiveness, and satisfaction. Newer literature also explores the emotional experience of the end user alongside these historical components (Bevan, Carter, & Harker, 2015). Incorporating usability principles into the design and development of clinical software applications is imperative to promote patient safety and quality enhancements, as dissatisfaction and poor usability are associated with errors that compromise patient safety (Middleton et al., 2013).
John Brooke developed the System Usability Scale (SUS) in 1986. As a questionnaire-based survey providing a global snapshot of the efficiency, effectiveness, and satisfaction rates of a system, it is one of the most widely used evaluation tools (Usability.gov, 2013). The SUS is beneficial for measuring usability because it is easy to distribute to participants and quick to analyze. The survey comprises ten questions related to software use, which participants rate on a scale ranging from "strongly agree" to "strongly disagree." Results are then analyzed using Brooke's scoring formula, which converts each item response into a score contribution and scales the sum into a composite score. SUS scores range from 0-100, with a score of 68 considered “average” (Usability.gov, 2013).
Moody, Slocumb, Berg, and Jackson (2004) developed another useful usability evaluation tool. Their usability assessment survey sought to capture nurses' perceptions of the effects of electronic health record systems on the quality and safety of patient care. This assessment is still relevant today, as widespread adoption of EHRs has highlighted the need for evaluating the usability of the electronic tools used daily to assist nursing staff in caring for patients. Nurses represent the largest group of EHR users, utilizing electronic tools to record and analyze pertinent health data (Rojas & Seckman, 2014). Designing clinical applications that meet the usability principles of efficiency, effectiveness, and satisfaction for nursing professionals supports the potential to drive change in a healthcare industry seeking methods to improve quality and safety for patients.
Among the many types of usability evaluation tools, surveys remain widely popular due to easy distribution and quick analysis of results. An electronic survey was selected as the usability evaluation tool for this study. The current EHR has been in place for over three years. Nurses utilize EHR software throughout their assigned shift to access pertinent patient information, document results, and communicate with members of the multi-disciplinary care team. Results of this study sought to contribute to the broader knowledge base of health informatics by providing the perceptions of nursing professionals on the usability features of the electronic tools they use to provide direct patient care on a daily basis. As stated by Bodenheimer and Sinsky (2014), equipping health care professionals with the appropriate electronic tools is key to the success of improving quality of care for patients, improving patient safety, and reducing health care expenditures. Results of this study will provide evidence for nursing administration at the performance site to consider workflow improvement strategies or software enhancement requests for vendor development. As the most substantial body of health care professionals, nurses are increasingly respected for their influence on improving quality and patient safety initiatives (Kennedy, 2018).
The American Nurses Association has adopted the Data-Information-Knowledge-Wisdom framework as an integral component of informatics nursing (Topaz, 2013). In its simplest form, the data obtained from this usability evaluation are the perceptions of usability by nursing professionals. These data are turned into information by combining "data + meaning" (Topaz, 2013). Knowledge is the result of understanding the implications of research information, and lastly, wisdom is the application of knowledge to improve healthcare processes for patients. Applying this framework to the Usability Evaluation of Electronic Health Record System for Nursing Professionals allows nursing perceptions about the impact of EHRs on patient care to influence reform efforts on quality and safety initiatives for patients. Understanding the usability of electronic health tools is critical to developing and maintaining health information technology that will assist clinicians in delivering high quality, cost-effective care.
Research study participants included full-time, part-time, and PRN (pro re nata or "as needed") nursing professionals at a 54-bed specialty hospital in south-central Kansas. Eligible participants utilized the hospital's EHR system on a daily basis to perform necessary job activities. Nurses working in a unit that does not utilize an electronic health record for documentation or patient care activities were ineligible to participate in the study.
The study site location has utilized electronic nursing documentation for over five years and implemented a computerized physician order entry (CPOE) system in 2015. Nursing staff utilize health information technology on a daily basis to access patient information, communicate with members of the multi-disciplinary care team, and coordinate patient services. The Usability Evaluation of Electronic Health Record System for Nursing Professionals was used to gather nursing perceptions on the efficiency, effectiveness, and satisfaction of the hospital’s EHR, to provide insight on process improvement and potential product enhancement requests for nursing administration.
After receiving clearance from the study site location, a research protocol was developed and submitted to the University of Cincinnati Institutional Review Board (IRB), in accordance with federal regulations. Required documentation was submitted, including the recruitment letter, information statement, research study protocol, IRB application, and study site letter of support. Additionally, the student and faculty advisor completed required CITI training for research studies conducted through the University of Cincinnati. The student completed the revisions requested by IRB staff, and the revised protocol was approved under expedited IRB review.
The Usability Evaluation of Electronic Health Record System for Nursing Professionals comprises three content areas: Brooke’s (1986) System Usability Scale, Moody et al.’s (2004) Usability Assessment, and open-ended questions related to the positive and negative aspects of the hospital's electronic health record. This mixed methods study utilized quantitative and qualitative data to gather nursing perceptions on the usability features of the EHR, provide insight into process improvement strategies and potential product enhancements, and expand the knowledge base of health informatics.
John Brooke developed the System Usability Scale in 1986. It is one of the most widely used tools to assess the usability features of hardware or software applications. This “quick and dirty” assessment tool seeks to determine if the software utilized by nursing professionals meets the demands of daily job activities. The scale comprises the following ten questions, with responses rated on a Likert Scale from “Strongly Agree” to “Strongly Disagree” (Usability.gov, 2013):
Interpretation of SUS scores is accomplished by calculating a percentile composite ranking. Brooke (Usability.gov, 2013) defines the SUS calculation as follows:
To calculate the SUS score, first sum the score contributions from each item. Each item's score contribution will range from 0 to 4. For items 1, 3, 5, 7, and 9 the score contribution is the scale position minus 1. For items 2, 4, 6, 8, and 10, the contribution is 5 minus the scale position. Multiply the sum of the scores by 2.5 to obtain the overall value of SU.
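Brooke's scoring rule above can be sketched in a few lines of Python. This is an illustrative implementation only; the function name and the 1-5 coding of Likert responses are assumptions for the sketch, not artifacts of this study:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten Likert responses.

    `responses` holds the ten item ratings coded 1 ("strongly disagree")
    through 5 ("strongly agree"), in questionnaire order.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected ten ratings between 1 and 5")
    total = 0
    for item, rating in enumerate(responses, start=1):
        if item % 2 == 1:
            # Odd-numbered (positively worded) items: scale position minus 1.
            total += rating - 1
        else:
            # Even-numbered (negatively worded) items: 5 minus the scale position.
            total += 5 - rating
    return total * 2.5  # scales the 0-40 sum to a 0-100 score


# "Strongly agree" on every odd item and "strongly disagree" on every
# even item is the most favorable response pattern, scoring 100.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # → 100.0
```

Note that the result, while on a 0-100 scale, is not a percentage; it is interpreted against the benchmark score of 68.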
The final composite ranking is a percentile of System Usability. The average score is 68, with numbers above 68 representing above-average usability and scores below 68 representing below-average usability. Brooke (1986) stated that the System Usability Scale is easy to administer and analyze, and provides tangible results on the usability features of an electronic system. While it can be used alone, incorporating subjective questions into a usability assessment survey is beneficial in obtaining a well-rounded picture of system usability.
Moody et al. (2004) developed the Usability Assessment Survey to determine “user satisfaction with the functionality of the current system, perceived problems, barriers, and frustrations associated with the current EHR documentation system, and attitudes in general toward the use of an EHR” (Background, para. 2). This usability survey was first administered to 100 nursing staff at a magnet hospital in Florida. The survey consisted of demographics, access and location of hardware used, and a Likert Scale of nursing perceptions related to electronic nursing documentation.
For the purpose of this capstone research study, the following questions were deemed appropriate for the usability evaluation at the study site location:
Questions related to the location of hardware were omitted, since the study site location had previously addressed these issues. Descriptive statistics were used to analyze usability assessment survey results, including the correlation between demographics and nursing perceptions of the EHR.
Nursing professionals were asked two open-ended questions:
Content analysis was performed to interpret results of these open-ended questions.
Survey results were exported from the online survey site to a CSV file and imported into Microsoft Excel. To analyze the System Usability Scale results, data cleaning was performed by assigning numerical values to categorical data. After data cleaning, the SUS scores were determined based on John Brooke’s calculation guide (Usability.gov, 2013).
The average SUS score among all participants was calculated, along with the average SUS scores among different demographics. This included the average SUS score of participants based on level of education, employment status, and years of nursing experience.
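The overall and per-subgroup averaging described above can be sketched in plain Python. The records below are hypothetical stand-ins, not the study's actual data, and the demographic labels are illustrative:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical cleaned survey export: one tuple per respondent holding
# (computed SUS score, education level, employment status).
records = [
    (40.0, "Bachelor's", "Full-time"),
    (52.5, "Bachelor's", "Part-time"),
    (47.5, "Associate", "Full-time"),
    (37.5, "Master's", "PRN"),
]

# Average SUS score across all respondents.
overall = mean(score for score, _, _ in records)

# Average SUS score within each education subgroup.
by_education = defaultdict(list)
for score, education, _ in records:
    by_education[education].append(score)
subgroup_means = {group: mean(scores) for group, scores in by_education.items()}

print(round(overall, 1))  # overall average
print(subgroup_means)     # per-subgroup averages
```

Grouping by employment status or binned years of experience follows the same pattern, keyed on the relevant tuple field.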
Questions from Moody et al.’s (2004) Usability Assessment Survey were also exported to Microsoft Excel for data analysis. Participant responses to each question were summarized, and demographic relationships and patterns were explored. Responses based on level of education, employment status, and years of nursing experience were compared and contrasted.
Finally, the open-ended question responses related to the positive and negative aspects of the hospital’s EHR were imported into Microsoft Excel for analysis. Content analysis tallied common responses from participants.
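A simple keyword tally of the kind used in this content analysis can be sketched as follows. The responses and keyword list below are invented examples for illustration; the actual coding of participant responses was done manually in Excel:

```python
from collections import Counter

# Hypothetical open-ended responses; the study's real responses are not
# reproduced here.
responses = [
    "cumbersome and slow",
    "too many clicks, takes time away from the patient",
    "slow to load, cumbersome screens",
]

# Count each respondent at most once per keyword, mirroring a tally of
# how many participants mentioned each theme.
keywords = ["cumbersome", "slow", "time", "clicks"]
tally = Counter()
for text in responses:
    for word in keywords:
        if word in text.lower():
            tally[word] += 1

# Most frequently mentioned themes first.
print(tally.most_common())
```

Real content analysis also requires judgment calls that simple string matching cannot make, such as grouping synonyms ("sluggish" with "slow") and separating distinct senses of "time."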
Documentation of consent was waived by the Institutional Review Board at the University of Cincinnati, due to the minimal risk presented to research participants in an electronic survey. Study site human resource personnel distributed a recruitment letter and information statement to eligible nursing professionals before they completed the survey. Contact information for the primary investigator and the IRB at UC was available to potential research participants who wished to discuss any questions or concerns related to the study.
The Usability Evaluation of Electronic Health Records for Nursing Professionals was electronically delivered to 100 nursing personnel. Twenty-eight nurses participated in the study. The majority of participants held a bachelor’s degree (64%), followed by an associate degree (29%) and a master’s degree (7%). The majority of nursing participants were employed full-time (68%); five worked part-time, and five were PRN employees. Participant nursing experience ranged from 1 to 51 years, with an average of 22.32 years of nursing service.
The average SUS score among the twenty-eight participants was 44.9. This score is considered low, falling well below 68, the benchmark John Brooke defined as average system usability. The highest SUS scores were reported by participants with a bachelor’s degree, with an average SUS score of 49.3.
Fifty-five percent of participants agreed that the "use of electronic health records is more of a help than a hindrance to patient care." Full-time employees were more likely to find electronic health records helpful to patient care. Analysis by level of education found no difference of opinion between participants with a bachelor's degree and those with an associate degree: half of each group said they "agree" or "strongly agree" with the statement.
Fifty-three percent of participants agreed with the statement “the use of computerized charting has helped improve documentation of clinical records.” There was no noticeable variance in participant responses based on level of education or employment status.
Sixty percent of all nurses surveyed responded "disagree" or "strongly disagree" that computerized charting has decreased the workload of nurses and other personnel. The majority of participants disagreed with this statement, regardless of demographics. Sixty-three percent of full-time employees disagreed that computerized charting has decreased the workload of nurses and other personnel. Interestingly, fifty-six percent of nurses with less than ten years of experience neither agreed nor disagreed that computerized charting had decreased the workload of nurses and other personnel. However, seventy-nine percent of nurses with greater than ten years of experience disagreed with the statement.
Forty-eight percent of nurses surveyed agreed that electronic health records would lead to improved patient care, while forty percent of participants were neutral. Forty-three percent of full-time employees agreed, while another forty-three percent of full-time employees neither agreed nor disagreed. Level of education did not influence participant response; however, sixty-seven percent of nurses with less than ten years of experience agreed that electronic health records would lead to improved patient care. In contrast, only thirty-eight percent of nurses with greater than 30 years of experience agreed that EHRs would improve patient care.
Nursing professionals were asked to list the positive aspects of the hospital's electronic health record, and sixty-one percent of participants responded. Thirty-five percent of participants listed the ease of accessing patient information, including documentation from previous admissions and test results, as a positive component of the hospital’s EHR. Eighteen percent of participants referenced documentation legibility as a benefit. (See Table 1).
Nursing professionals were asked to list the negative aspects of the hospital’s EHR. Seventy-one percent of participants listed undesirable characteristics of the hospital’s software (See Table 2). Twenty-five percent of participants listed "cumbersome," and fifteen percent of participants listed "slow." Forty-five percent of participants referred to "time" as an issue, whether citing the number of computer clicks to complete nursing documentation, the time needed to access pertinent patient information, or the degree to which computerized documentation reduces time with the patient.
The Data-Information-Knowledge-Wisdom framework applies to the Usability Evaluation of Electronic Health Records for Nursing Professionals. Research study participants responded to questions related to John Brooke's System Usability Scale, Moody et al.’s Usability Assessment Survey, and open-ended questions related to the positive and negative aspects of the hospital's EHR. These questions produced an enormous amount of data; however, these data are of little value without analysis, interpretation, and application to clinical practice.
Data transformation into information occurs with the meaningful analysis of study results. This usability evaluation found that nurses at the study site reported low system usability of the hospital’s EHR. Additionally, nursing professionals believed that EHRs hinder patient care and increase the workload of nursing staff. However, nursing professionals at the study site also reported that EHRs have improved clinical documentation and will eventually lead to improved patient care. Information from this research study can be compared to previous usability evaluations to generate knowledge.
Understanding the relationships between research study information and the various data patterns produces knowledge of usability evaluation outcomes. The study site reported a low overall SUS score of 44.9. A comparison was made with results from a previous study conducted at six different hospitals with 1,879 participants, which reported SUS scores of 56.9, 60.1, 60.5, 63.3, 52.2, and 58.3 (Cho, Kim, Choi, & Staggers, 2016).
Comparing the nursing perceptions of this research study with Moody et al.’s (2004) results, the study site reported lower rates of agreement. Only 50% of nurses at the study site agreed that EHRs are more of a help than a hindrance to patient care, compared to 81% in Moody’s study. Additionally, only 53% of nurses agreed that EHRs have improved documentation, compared to 75% in Moody’s research. Eighteen percent of nurses surveyed reported that EHRs have decreased the workload of nursing personnel, whereas 36% of participants in Moody’s study agreed. Forty-eight percent of nurses agreed EHRs will lead to improved patient care, while 76% of nurses surveyed by Moody agreed.
Possible explanations for this variance include the type of facility surveyed, the number of research participants, and the different software programs utilized by each health care organization. Moody et al.’s (2004) study also took place prior to the Meaningful Use incentive programs, which increased the usage of electronic health records. Changes in reporting requirements and mandated electronic documentation may have also influenced the perceptions of nursing professionals.
The wisdom component of this framework is achieved when nursing leadership applies the knowledge gained from this research study to clinical practice to improve the usability features of the hospital's electronic health record. This may include further investigation into the low SUS score, as 44.9 falls considerably below John Brooke's average SUS score of 68. Submitting the negative aspects of the hospital's EHR to the software vendor as enhancement requests for future product development, such as easier access to pertinent medical information and fewer clicks to complete documentation, would acknowledge issues reported by nursing staff. Additionally, modifying clinical workflows to increase time spent with the patient versus time spent on the computer would address concerns described by survey participants.
Previous studies have sought to gather nursing perceptions on the impact of computerized documentation on patient care. This research study intended to gather information on how EHR usability affects patient care by utilizing John Brooke’s System Usability Scale and Moody et al.’s Usability Assessment Survey. Ongoing research is necessary to ensure nursing professionals are adequately equipped with the appropriate electronic tools to deliver high quality care.
Usability features of electronic health records are a critical component of health care reform. The widespread adoption of EHRs was intended to address long-standing issues related to cost, quality, and access to health care services in the United States. As Bodenheimer and Sinsky (2014) stated, these overarching goals of health care reform cannot be achieved without providing clinicians with adequate health information technology to address these challenges. As this research study found, nurses at the study site reported a low SUS score and found that electronic documentation has taken time away from patient care activities. Study results reinforced the need for clinical insights into the design, build, implementation, and ongoing maintenance of electronic health records. This is essential to avoid clinician dissatisfaction and burnout while improving the quality of patient care and patient safety.
Citation: Nation, J., & Wangia-Anderson, V. (Winter 2019). Applying the Data-Knowledge-Information-Wisdom framework to a Usability Evaluation of Electronic Health Record System for Nursing Professionals. Online Journal of Nursing Informatics (OJNI), 23(1). Available at http://www.himss.org/ojni
Powered by the HIMSS Foundation and the HIMSS Nursing Informatics Community, the Online Journal of Nursing Informatics is a free, international, peer-reviewed publication that is published three times a year and supports all functional areas of nursing informatics.
Jacqueline Nation BSN, RN
Jacqueline Nation is a graduate student at the University of Cincinnati’s Master of Health Informatics program. She graduated from Wichita State University with a Bachelor of Science in Nursing. She worked for six years in inpatient cardiovascular care before transitioning into nursing informatics.
Victoria Wangia-Anderson, PhD, FHIMSS
Associate Professor, Director
She holds a doctorate in Health Informatics from the University of Minnesota and a master's degree in Information Science.
She has more than 10 years of experience as a health informatics educator in higher education and five years as an administrator.
She was previously a fellow at the Centers for Disease Control and Prevention and a faculty member at the University of Kansas Medical Center School of Nursing.
American Nursing Informatics Association. (2018, July 12). Nursing Informatics definition. Retrieved from https://www.ania.org/
Bevan, N., Carter, J., & Harker, S. (2015). ISO 9241-11 Revised: What have we learnt about usability since 1998? Human-Computer Interaction: Design and Evaluation Lecture Notes in Computer Science, 143-151. doi:10.1007/978-3-319-20901-2_13
Bodenheimer, T., & Sinsky, C. (2014). From Triple to Quadruple Aim: Care of the patient requires care of the provider. The Annals of Family Medicine, 12(6), 573-576. doi:10.1370/afm.1713
Brooke, J. (1986). SUS - A quick and dirty usability scale. Retrieved from https://hell.meiert.org/core/pdf/sus.pdf
Cho, I., Kim, E., Choi, W. H., & Staggers, N. (2016). Comparing usability testing outcomes and functions of six electronic nursing record systems. International Journal of Medical Informatics, 88, 78-85. doi:10.1016/j.ijmedinf.2016.01.007
Institute for Healthcare Improvement. (2017). Triple Aim Overview. Retrieved from http://www.ihi.org/Engage/Initiatives/TripleAim/Pages/default.aspx
International Organization for Standardization (2018). ISO 9241-11:2018 Ergonomics of human-system interaction -- Part 11: Usability: Definitions and concepts. Retrieved from https://www.iso.org/standard/63500.html
Kennedy, S. (2018, May 04). Nurses, health care's 'Silent Majority,' are becoming more visible. Retrieved from https://ajnoffthecharts.com/nurses-week-are-health-cares-silent-majority-becoming-more-visible/
Middleton, B., Bloomrosen, M., Dente, M. A., Hashmat, B., Koppel, R., Overhage, J. M., … Zhang, J. (2013). Enhancing patient safety and quality of care by improving the usability of electronic health record systems: recommendations from AMIA. Journal of the American Medical Informatics Association : JAMIA, 20(e1), e2–e8. http://doi.org/10.1136/amiajnl-2012-001458
Moody, L. E., Slocumb, E., Berg, B., & Jackson, D. (2004). Electronic Health Records documentation in nursing. CIN: Computers, Informatics, Nursing, 22(6), 337-344. doi:10.1097/00024665-200411000-00009
Nagle, L. M., Sermeus, W., & Junger, A. (2017). Evolving role of the nursing informatics specialist. Studies in Health Technology and Informatics, 232, 212.
Office of the National Coordinator for Health Information Technology. (2017). Hospitals participating in the CMS EHR Incentive Programs, Health IT Quick-Stat #45. http://dashboard.healthit.gov/quickstats/pages/FIG-Hospitals-EHR-Incent….
Rojas, C. L., & Seckman, C. A. (2014). The informatics nurse specialist role in electronic health record usability evaluation. Computers, Informatics, Nursing: CIN, 32(5), 214-220. doi:10.1097/CIN.0000000000000042
Sultz, H. A., & Young, K. M. (2014). Health care USA: understanding its organization and delivery. Gaithersburg, MD: Aspen.
Topaz, M. (2013). Invited Editorial: The Hitchhiker’s Guide to nursing informatics theory: using the Data-Knowledge-Information-Wisdom framework to guide informatics research. Online Journal of Nursing Informatics (OJNI), 17 (3). Available at http://ojni.org/issues/?p=2852
Usability.gov (2013, October 04). System Usability Scale (SUS). Retrieved from https://www.usability.gov/how-to-and-tools/resources/templates/system-u…; sus.html