Pordeli, L. (November, 2018). Informatics Competency-Based Assessment: Evaluations and Determination of Nursing Informatics Competency Gaps among Practicing Nurse Informaticists. Online Journal of Nursing Informatics (OJNI), 22(3), Available at http://www.himss.org/ojni
In today’s transformed healthcare industry, nursing informatics has become an essential and fast-growing specialty. While nursing informatics competencies have been identified and recommended by national organizations as an essential component of nursing education and practice, limited information is available on the skills and informatics competencies of practicing informaticists.
This evidence-based Doctor of Nursing Practice (DNP) project implemented and evaluated an evidence-based professional development program to address informatics competency gaps among practicing informaticists at a magnet-recognized hospital in northeast Florida and to ensure that the informaticists possessed the essential skills and knowledge to achieve the organization’s informatics needs.
The Nursing Informatics Competency Assessment (NICA) L3/L4 tool was used in this evidence-based project (McGonigle, Hunter, Hebda, & Hill, 2014). The Kruskal-Wallis test was used to compare the pre-test and post-test responses on three selected subcategories to determine whether there were any statistically significant changes between the pre-test and post-test results and competencies.
The results indicated positive outcomes and improvements in competencies following the implementation and evaluation of the evidence-based professional development program. Findings from this project are valuable to healthcare organizations and nurse leaders who wish to identify and address informatics competency gaps among informaticists and provide on-the-job-training to address gaps in knowledge and skills. The results of this quality improvement project contribute to and fill gaps in the literature on informatics competency assessment and development.
In today’s healthcare system, informatics has become an essential part of the infrastructure to improve access to health information, make patient care safer, decrease health care costs, and improve outcomes (Tellez, 2012). Currently, information technology is a critical part of our healthcare industry, and the mandates for electronic health record (EHR) implementation require healthcare providers to have basic computer knowledge, as well as informatics competencies to manage and use technology to deliver care. The nursing profession, with more than four million members in the USA, is the largest group of healthcare providers who must possess computer skills and competencies given the current advanced technology in the workplace (Henry J. Kaiser Family Foundation, 2016).
Still, many nurses entering the workforce are not prepared and lack the needed informatics competencies (Found, 2012). To prepare nursing graduates, both at the undergraduate and graduate levels, nursing educators must incorporate informatics into the nursing curricula. The American Nursing Association (ANA), National League for Nursing (NLN), TIGER Initiative, and American Association of Colleges of Nursing (AACN) all have argued that new nursing graduates must be able to demonstrate informatics competencies, and all call for the integration of informatics into nursing curricula (ANA, 2001; AACN, 2008; De Gagne, Bisaanar, Makowski & Neuman, 2011; Hunter, McGonigle, & Hebda, 2013; NLN, 2008; TIGER, 2014). However, neither the national nursing governing organizations nor their accrediting bodies mandate how informatics should be incorporated into the curricula; therefore, some programs offer a stand-alone course in informatics, others integrate informatics content throughout the curriculum, and some offer neither. This lack of uniformity in informatics education causes uneven preparedness among nurse graduates, thereby negatively impacting their competency level and their use of health information technology (Hunter, et al., 2013; De Gagne et al., 2012).
Even though nursing informatics competencies have been defined by national nursing organizations, the introduction and adoption of the competencies and work-related informatics knowledge and skills for nurses have been very slow (Shultz, 2009). With the use of well-designed technologies, nurse informaticists provide tools to make nursing’s role more visible and support the work of nurses and clinical staff members. Informatics competencies are crucial to safe, efficient and quality care that enhances patient care outcomes (McGonigle, Hunter, Sipes, & Hebda, 2014). The role of nurse informaticists requires proficiency in a broad range of technological functions, but some nurse informaticists do not possess the competencies or required training because their knowledge and skills in informatics competencies were not assessed (Hill et al., 2014; Sipes et al., 2016). In many organizations, nursing informaticists have a diversified role and function with a vague job description and lack in-service educational training. Healthcare organizations must overcome these shortcomings by implementing strategies to improve informatics knowledge and to assess competency levels, as they are key elements of successfully accomplishing organizational goals and optimizing care delivery (Liu, Lee, & Mills, 2015). Nurses who pursue a career in health informatics will likely seek a degree in nursing informatics or health information management; still, healthcare organizations should implement informatics competency assessments and develop continuing education to support and facilitate competency development (Camilli, 2014; Hill et al., 2014).
The general problems addressed in this DNP quality improvement project are ill-defined informatics competencies and role requirements. Unclearly defined informatics competencies and role development requirements can lead to confusion and job ambiguity among nurse informaticists. It is important to identify the essential information technology proficiencies, informatics competencies, and skills defined by the AACN, TIGER Initiative, Institute of Medicine (IOM), and ANA to develop job-specific standardized nurse informaticist competencies and to form effective teaching and learning strategies (ANA, 2001; AACN, 2008; IOM, 2010; TIGER, 2009). There is a need for nurse informaticists to develop essential competencies through targeted training, in-service training, and continuing education.
The specific problem addressed in this project is the lack of role-specific informatics competencies and a staff development curriculum pertinent to the organization’s nursing informatics team. The current competency checklist is very general, and specific role requirements are not measured. Informatics competencies are not formally written, and continuing education or training plans have not been verified to specifically address the clinical informatics competency needs identified by the national nursing and informatics organizations.
This project developed, implemented, and evaluated an evidence-based practice (EBP) informatics competency curriculum to address the identified competency gaps and to support and enrich the team’s informatics knowledge and skills. This project took place at a 304-bed, not-for-profit, magnet-recognized hospital in northeast Florida. The organization’s clinical informatics department consisted of 21 full-time nurses. Evidence-based, role-specific informatics competencies and staff development curricula were not formally written, and no continuing education or training plans were verified to specifically address the needs for clinical informatics competencies in the clinical informatics department.
As the organization prepares for the transition to a new EHR, the clinical informatics leadership is formulating plans for new roles that will expand the organizational structure of the clinical informatics department. One of the necessary preparations is an educational plan to improve and enhance the existing team’s informatics knowledge and competencies. The goal of the organization is to evaluate and enhance the staff’s current informatics competencies, knowledge, and skills by following national organizations’ standards. The achieved level of competency will become a required starting-level competency for new hires as they transition to the new EHR. Through a survey process, the informatics competencies of the current staff have been evaluated and specific competency requirements have been identified.
This DNP quality improvement project provides an EBP professional development program addressing the identified informatics competency needs of a group of practicing nurse informaticists. The outcome of this project will be beneficial to the healthcare organization’s clinical informatics leadership. The results provide guidance on how to identify informatics competency gaps among nursing informaticists, uncover critical competencies needed for the role, and develop in-service educational trainings. While the importance of nursing informatics competencies for nursing students, bedside nurses, and nursing leadership has been explored, this project contributes to the literature within the domain of nursing informatics competencies by focusing on competency gaps among practicing nurse informaticists at a designated institution.
The general problems addressed by this DNP project are ill-defined informatics competencies and role requirements. The specific problem addressed in this DNP project is the lack of evidence-based, role-specific informatics competencies and the absence of a corresponding staff development curriculum pertinent to the organization’s nursing informatics team. Currently, the project site’s clinical informatics department has a basic competency checklist for new nursing informaticist employees. The preceptor completes this checklist during the three-month training period. This checklist includes basic computer competencies, knowledge of clinical informatics processes, and essential informatics competencies recommended by national nursing organizations. The organization’s baseline assessment results indicate gaps in informatics competencies. Developing an educational training program and competency checklists based on an organization’s informatics needs is essential to bridging the competency gaps in nursing informaticist practice.
The purpose of this pre-test, post-test quality improvement project was to evaluate participants’ informatics competencies against the ANA scope and standards of practice (2015) and the TIGER Initiative (2009) informatics competencies (computer skills, informatics skills, and informatics knowledge) using the NICA L3/L4 tool (Sipes et al., 2016). The author implemented and evaluated an evidence-based professional development educational program to address competency gaps identified in a pre-program baseline assessment (NICA L3/L4 tool) and to address participants’ informatics competencies that required further development. The primary objective of this project was to integrate an evidence-based informatics competency professional development program tailored to meet the organization’s informatics competency needs and to develop a competency skills checklist identifying evidence-based competency needs for future new nurse informatics specialists joining the group. The secondary objective was to explore the impact of age, educational background, and years of experience on informatics competencies.
This project took place at a 304-bed, not-for-profit, magnet-recognized hospital in northeast Florida. The organization’s clinical informatics department currently consists of 21 full time nurses. The campus includes a clinic, a 304-bed hospital, and 22 operating rooms. The hospital offers care in more than 35 medical and surgical specialties to patients nationally and internationally.
Currently, due to the transition to a single EHR platform, the organization’s clinical informatics department comprises two informatics groups: the clinical informatics group, which oversees and maintains the legacy EHR and information systems, and a second group whose members are trained and certified to use the future single EHR platform as the transition occurs within the next year and a half. This second group is currently overseeing the development of education curricula, education materials, workflow processes, and go-live planning toward the transition to and implementation of a new single EHR platform.
Convenience sampling was appropriate for this project, as the sample size was limited to the 21 nurse informaticists at the study site. The inclusion criteria for participants consisted of registered nurse informaticists employed in the selected study site’s clinical informatics department at the time of the project interventions and evaluation. Initially, 21 surveys were distributed; 15 participants completed the pre-test survey, and 10 participants joined the professional training programs and completed the post-test survey.
Selection of the Measurement Method
The informatics competency model for nurses was first introduced by Staggers, Gassert, and Curran (2001) and included four levels: “beginning nurse-level 1, experienced nurse-level 2, informatics specialists-level 3, and informatics innovators-level 4” (p. 385). The Nursing Informatics Competency Assessment (NICA) L3/L4 tool was derived from this ground-breaking Delphi study. The instrument was developed by Hill, McGonigle, Hunter, Sipes, and Hebda (2014) as the first valid and reliable self-assessment tool to measure informatics competencies for nurse informatics specialists and informatics innovators. The lack of self-assessment of nursing informatics competencies was identified by the team as one of the gaps in competency development (Hill et al., 2014). The ANA scope and standards of practice (2015) and the TIGER Initiative’s competency sets (2009) guided the development of the competency assessment tool. The competencies are divided into three categories: computer skills, informatics knowledge, and informatics skills (Sipes et al., 2016).
The NICA L3/L4 assessment tool includes a demographic sheet and a self-assessment questionnaire divided into essential competencies, assessing 178 perceived competencies in three main categories: computer skills, informatics knowledge, and informatics skills. Each competency item is rated on a four-point Likert scale of expert, proficient, comfortable, and beginner/N/A. The authors of the NICA L3/L4 instrument have also developed a TIGER-based Assessment of Nursing Informatics Competencies (TANIC) tool to measure basic informatics competencies for level 1 and level 2 nurses.
The project was approved by the clinical informatics department as an evidence-based quality improvement project. The Nursing Informatics Competency Assessment (NICA) L3/L4 tool, developed by Hill et al. (2014), was used as the baseline assessment tool to identify potential informatics competency development needs (gaps). The clinical informatics leadership provided invaluable qualitative insight to identify the essential informatics competencies for the clinical informatics group relevant to the current role requirements. The informatics competencies relevant to the current role requirements that needed further improvement and training fell into three subcategories in the computer skills and informatics knowledge categories. Thus, the project interventions focused on addressing gaps in the competencies of creating spreadsheets and macros, running reports in statistical data management software (Excel), Cerner role security, and data mining. The gaps identified for the informatics skills category (third category) are currently overseen by the information technology (IT) department; therefore, they were not addressed in this QI project. In addition, the author consulted with education specialists regarding Microsoft Excel training, as well as with information technology specialists regarding Cerner role security training, to draft lesson plans and formalize staff training sessions.
Upon institutional review board (IRB) approval, lesson plans were developed, and the investigator co-created the data management software application training, Cerner role security lesson plans, and data-mining training session. Lesson plans for the three continuing education courses and objectives for each lesson plan were aligned to address gaps in competencies identified through the baseline assessment.
Approximately four weeks after the final educational intervention session, the investigator attended the study site’s informatics department staff meeting and explained the study interventions and the post-test tool used to measure the effect of the educational interventions. The post-test survey included a cover page; a demographic sheet; a modified informed consent, which explained the participants’ rights and the purpose of the study; and an abbreviated NICA L3/L4 (questions pertained to the selected subcategorized competencies included in the intervention training sessions). The cover letter disclosed the details of the study and included a statement indicating that by completing and returning the survey to the sealed box, the participants consented to participate in the project. The investigator informed the participants that, to ensure privacy and confidentiality, they should not include their names or other personal information on the questionnaire. Further, it was explained that data collected would be codified and data reporting would be at the aggregate and not the individual level.
Descriptive statistical analysis summarized the participants’ demographics. Competency questions pertained to the three subcategories assessed: general computer skills (data management), security, and data mining. Each question in the pre-test and post-test assessment was summarized statistically by time period (pre-/post-) and domain (i.e., general computer skills, privacy/security, and data mining subcategories) using traditional descriptive statistics. Statistical testing between time periods addressed the primary objective of an improved effect via the education intervention. The Kruskal-Wallis test was used to determine whether there were any statistically significant changes between the pre-test and post-test results. Spearman’s rho and analysis of variance (ANOVA) were used to explore the relationships between demographics, such as age, highest educational level achieved, and length of practice, and the nursing informatics competencies measured in the three subcategories. The precision of statistical differences was described using 95% confidence intervals. All tests were conducted at alpha = 0.05, given the low response rate in the post-test session. Statistical analysis was conducted using Minitab version 17 (Minitab Inc., State College, PA).
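The analysis plan above can be sketched in code. The following Python snippet is an illustrative sketch only, not the authors’ Minitab analysis; all scores and demographic values in it are hypothetical, invented to show how a Kruskal-Wallis comparison of pre-test and post-test responses and a Spearman’s rho correlation against a demographic variable would be computed.

```python
# Illustrative sketch (NOT the project's actual Minitab analysis).
# All values below are hypothetical, made up for demonstration.
from scipy.stats import kruskal, spearmanr

# Hypothetical per-participant summed Likert scores for one subcategory
pre_scores = [8, 9, 7, 10, 9, 8, 11, 9, 10, 8]
post_scores = [13, 15, 12, 16, 14, 13, 17, 14, 15, 14]

# Kruskal-Wallis test between the pre-test and post-test groups
h_stat, p_value = kruskal(pre_scores, post_scores)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:  # alpha used in the project
    print("Statistically significant change between pre-test and post-test")

# Spearman's rho relating a demographic (hypothetical years of
# informatics experience) to the post-test competency scores
years_experience = [2, 5, 1, 8, 4, 3, 10, 5, 7, 3]
rho, p_rho = spearmanr(years_experience, post_scores)
print(f"Spearman's rho = {rho:.2f}, p = {p_rho:.4f}")
```

Because the Likert responses are ordinal and the samples are small, nonparametric tests such as these are a reasonable choice over their parametric counterparts.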
During the design phase, the author used a retrospective pre-test method to assess and examine the results of the 15 anonymous NICA L3/L4 surveys. The focus was not on assessing individual knowledge and skill sets; rather, it was on identifying competency gaps as a group, with reporting done at the aggregate level. A mean score analysis of the responses to each competency item in all three categories (computer skills, informatics knowledge, and informatics skills) was used to identify gaps in informatics knowledge and skills. From the overall mean analysis report, responses revealing a mean score lower than 2 were selected and grouped by subcategory to identify gaps in the competency assessed.
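The gap-identification step described above (flagging competency items whose group mean falls below 2) can be sketched as follows. The item names and responses are hypothetical, and the sketch assumes the four-point scale is coded beginner/N/A = 1 through expert = 4.

```python
# Hypothetical sketch of the gap-identification step: compute the group
# mean for each competency item and flag items whose mean is below 2
# (i.e., below "comfortable" on an assumed coding of beginner/N/A = 1,
# comfortable = 2, proficient = 3, expert = 4).
responses = {
    "write macros or shortcuts for spreadsheets": [1, 2, 1, 1, 2],
    "create basic spreadsheets":                  [3, 4, 3, 3, 4],
    "apply data mining techniques":               [1, 1, 2, 1, 2],
}

gaps = []
for item, scores in responses.items():
    mean_score = sum(scores) / len(scores)
    if mean_score < 2:  # threshold used in the project to flag a gap
        gaps.append((item, mean_score))

for item, mean_score in gaps:
    print(f"Gap identified: {item} (mean = {mean_score:.2f})")
```

Reporting only the flagged items at the aggregate level, as here, keeps individual responses out of the results while still surfacing group-level training needs.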
Following the identification of the competency needs, the mean score analysis results were reviewed with the organization’s clinical informatics leadership. The author consulted with the leadership to select relevant informatics competencies meeting the organization’s needs. The leadership provided invaluable insight toward identifying the informatics competencies essential to the current role requirements. The informatics competencies requiring further improvement and training were identified as the subcategories in the general computer skills and informatics knowledge categories. Training focused on addressing gaps in competencies in creating spreadsheets, macros and reports on statistical data management software (Excel); in Cerner role security; and in data mining.
The gaps identified for the informatics skills category (third category) are currently overseen by the IT department; therefore, they were not addressed in this project. The author consulted with education specialists regarding Excel training, as well as with information technology specialists regarding Cerner role security training. Lesson plans were formalized for the staff professional development program.
The Kruskal-Wallis test was used to compare the pre-test and post-test responses on the three selected subcategories to determine whether there were any statistically significant changes between the pre-test and post-test results and competencies. Results with a p value greater than 0.05 were considered not statistically significant. Table 1 presents the descriptive statistics for the post-test survey questions.
Table 1: Descriptive Statistics – Pre-test and Post-test Questions
The test results for the general computer skills subcategory (computer skills category) indicated a statistically significant difference between the pre-test and post-test results for one of the five questions, “write macros or shortcuts for spreadsheets” (mean = 2.20, SD = 0.422, p = 0.001). When the pre-test and post-test mean scores were analyzed, the mean scores for all questions in this subcategory revealed an overall improvement in the post-test survey. The NICA L3/L4 post-test assessment survey results revealed a 25.41% increase in participants’ self-assessed general computer skills competency (mean = 11.2, SD = 1.98, p = 0.045) (see Table 2).
Table 2: Descriptive Statistics – Kruskal-Wallis Test
The second subcategory, measuring the privacy/security competencies (informatics knowledge category), revealed a 26.21% score increase (mean = 9.10, SD = 3.10, p = 0.155), but the Kruskal-Wallis test did not reveal a statistically significant change in results. The overall increases in post-test mean scores indicated an improvement in the competencies tested, and participants rated themselves as comfortable (mean = 2.27) in four measured competencies.
Finally, the project results indicated a statistically significant improvement in the data mining subcategory (informatics knowledge category). The post-test scores (mean = 14.30, SD = 3.62, p = 0.003) increased by 51.64% compared to the pre-test scores, and participants identified themselves as comfortable on items measuring data mining competencies. P values on four of the five survey questions indicated a statistically significant improvement in the post-test scores. Although no study was found assessing informatics competency gaps among nurse informaticists in the USA, the interpretation of findings was consistent with previous studies on informatics competency assessments and gaps in general computer skills, informatics knowledge, and informatics skills (Chonsilapawit & Rungpragayphan, 2016; Choi & Zucker, 2012; Hill, et al., 2014; Hunter, et al., 2013; Sun & Falan, 2013). The Peltonen et al. (2016) report exploring future nursing informatics research identified data mining as one of the top nursing informatics trends for the future.
Overall, the project results indicated positive outcomes and improvement in competencies following the implementation and evaluation of an evidence-based professional development program. The results support the main objective of the quality improvement project. Table 3 presents the pre-test and post-test percentage change in respondents per competency measured. The design for the interpretation and comparison of the results table was adapted from a study by Chonsilapawit and Rungpragayphan (2016).
Table 3: Descriptive Statistics – NICA L3/L4 Pre-Test and Post-test Results per Subcategories
The anticipated outcome of the quality improvement project was an improvement in participants’ post-test NICA L3/L4 assessments following the implementation and evaluation of an evidence-based professional development program. The post-test modified NICA L3/L4 survey focused on the three sessions offered to address the competency gaps identified in the pre-assessment surveys. Participants’ learning following the intervention was evaluated by comparing the mean pre-test and post-test scores.
Participants’ post-test scores on the three selected subcategories increased by 27.44% in general computer skills, 26.21% in security, and 51.64% in data mining. Figure 1 presents the overall change in the three subcategories.
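The percent-change arithmetic behind these figures is straightforward; in this sketch the pre-test mean is a hypothetical value chosen only to reproduce the reported 51.64% increase for the data mining subcategory, since the actual pre-test means are not listed here.

```python
# Percent-change arithmetic for pre-/post-test mean scores.
# The pre-test mean below is hypothetical, back-computed to illustrate
# the reported 51.64% increase; it is not a value from the project data.
def percent_change(pre_mean, post_mean):
    """Percent increase from the pre-test mean to the post-test mean."""
    return (post_mean - pre_mean) / pre_mean * 100

# Reported data mining post-test mean of the sums = 14.30; a hypothetical
# pre-test mean of 9.43 yields approximately the reported 51.64% increase.
print(f"{percent_change(9.43, 14.30):.2f}%")
```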
Participants rated themselves as proficient (mean of the sums = 11.2, SD = 1.98) in general computer skills, comfortable (mean of the sums = 9.1, SD = 3.10) in privacy/security, and expert (mean of the sums = 14.30, SD = 3.62) in the data mining subcategory. Table 3 compares the overall pre-test and post-test score results. Figure 1 presents the interpretation and ranking information.
Figure 1: Change in three subcategories: Pre-test and Post-test Results
The project methodology examined the informatics competency gaps and described the positive impact of professional development programs on competency, based on the self-assessment post-test results. Although the sample size in both the pre-test and post-test was small, results showed that gaps existed in informatics competencies. Furthermore, positive outcomes and competency improvement were successfully measured following the intervention and staff training sessions. A few factors may have contributed to the post-test results not revealing a statistically significant improvement on all competencies measured. First, the survey questions may have been more general than the content covered in the training sessions. Also, the time gap between the pre-test and post-test assessments may have influenced the answers associated with the competencies measured; a shorter interval between the pre-test, intervention, and post-test survey could be considered.
Following this project, opportunities for future improvement in competencies were evaluated and discussed with the informatics leadership. An ongoing evaluation process will enable the leadership to provide evidence-based training to reduce gaps in informatics competencies. In six months, a refresher training session will be conducted. Overall, this project was overwhelmingly judged as favorable by the leadership and staff. Moving forward, the competency of informaticists participating in this project will be assessed annually through self-learning reviews. These competencies will be incorporated into the new hire orientation through online self-learning trainings, preceptor assistance, and classes.
Although this project provided valuable information on nursing informatics competency assessment and intervention, there were barriers influencing the project. One limitation was the small convenience sample, which may have impacted the results of the study. The NICA L3/L4 instrument includes 178 questions, and some domains did not apply to the participants’ roles, which may have impacted the study results. Another limitation was that limited research was available on informatics competency assessments and interventions. This project was tailored to address the specific informatics competency needs of the informaticists at the study site; therefore, results cannot be generalized.
Informatics competencies are an integral component of today’s clinical practice, and as health information technology continues to change and grow, the need to continually evaluate and refine informatics competencies is necessary (Schleyer, Burch, & Schoessler, 2011). It is evident that nursing informatics competencies, skill sets, and the application of informatics in practice are still not clearly understood in our industry (McGonigle, Hunter, Sipes, & Hebda, 2014). To plan an informatics curriculum, both in the clinical setting and in academic institutions, a baseline assessment should be established to determine competency gaps and training considerations (Choi & Zucker, 2012; Sipes, 2016; Liu, Lee, & Mills, 2015). Using nursing informatics competency self-assessment tools (NICA L3/L4 or TANIC) will enable healthcare organizations to identify gaps in competencies and better understand required skill sets for both bedside nurses and informatics nurses (Sipes, 2016).
Lack of procedures and assessments for determining nurses’ informatics competencies in organizations has been a concern. Although the NICA L3/L4 instrument is detailed and includes ANA and TIGER suggested informatics competencies, the competencies measured may not be unit specific, and the length of the instrument may decrease the accuracy of the measurement. One recommendation is to assess role-specific competencies that are listed in selected categories/subcategories on the tool instead of the entire 178 questions. This may increase the accuracy of the measurement and help to develop professional development programs specific to the unit’s informatics competency needs.
According to Sipes (2016): “we must assess and understand current competencies/skills, then address gaps in education by developing more relevant curricula that will meet the needs of the workforce for 2020” (p. 255). The outcomes of this quality improvement project validate that using the NICA L3/L4 instrument is an effective method to assess competency levels and to design professional development programs and educational trainings to address those gaps. Additionally, the findings could help academic institutions and nursing curricula ensure that graduate nurses entering the workforce are prepared and have achieved informatics competencies.
Very limited research is available on informaticists’ competency assessment. Future qualitative and quantitative research is needed to expand the knowledge on the assessment and implementation of nursing informatics competency trainings. One recommendation for future research is to replicate this quality improvement project at multiple institutions and include a larger number of informaticists to generalize the identified competency gaps. Further examination of the NICA L3/L4 tool in practice will provide insight into the potential need to revise and shorten the assessment tool to enhance competency gap assessments. Furthermore, research is needed to raise awareness of the importance of informatics competencies and to guide the development of the training needed for ongoing proficiency in informatics competencies in practice.
Findings from this quality improvement project have multiple implications for practice and add to the body of knowledge for informatics competency assessment among practicing informatics specialists. Findings suggest an effective process for conducting an informatics competency baseline assessment and for developing professional training programs needed for knowledge and skills enhancement (Sipes et al., 2016). Interpretation of the project results is consistent with previous studies in identifying informatics competency gaps in practice (Choi & Zucker, 2012; Choi & DeMartinis, 2013; Hwang & Parker, 2011; Hunter, et al., 2013). This project demonstrates the effectiveness of using a self-assessment instrument to identify informatics competency gaps, which can guide the design of a professional development program to address the identified gaps in unit-specific competencies (Hill, et al., 2014; Hunter, et al., 2013; Kleib, Simpson, & Rhodes, 2016; Sipes, et al., 2016; Schleyer, et al., 2011).
To improve and advance informatics competencies, regular training in informatics knowledge is needed for both bedside nurses and informatics nurses. As healthcare information technology continues to grow, competencies and training must be revisited, revised, and updated routinely to keep pace with fast-changing technological advances and innovations (Schleyer et al., 2011; Sipes et al., 2016). Nurse leaders should support, promote, and provide continuing informatics education that develops competencies and increases nurse participation in healthcare information technology decisions.
The methodology of this quality improvement project is promising and can serve as a model for evaluating and addressing informatics competency gaps in practice. Because the project was tailored to the specific informatics competency needs of the informaticists at the study site, the results cannot be generalized. The NICA L3/L4 can be used as a baseline assessment to identify competency gaps in practice and as a tool to inform job descriptions and the informatics skills and knowledge they require (Hunter et al., 2013). Although the results of this study may not be generalizable, they provide guidance for future research on nursing informatics competency assessment. This project provides evidence that a self-assessment informatics competency instrument is an effective means of identifying competency gaps and of developing professional development programs to address them. As health information technology changes, healthcare organizations should focus on strategies to develop and improve staff informatics knowledge and skills to ensure high-quality, efficient care. This requires establishing a baseline assessment of informatics competencies to guide curriculum development addressing competency needs (Choi & Zucker, 2012; Liu et al., 2015; Sipes, 2016).
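For teams seeking to replicate the pre-test/post-test comparison described in this project’s methodology, the sketch below illustrates how a Kruskal-Wallis test might be run on self-assessment scores for one competency subcategory using SciPy. All scores are invented for illustration; they are not data from this project.

```python
# Hypothetical sketch of the pre/post Kruskal-Wallis comparison
# described in the project methodology. The scores below are
# invented Likert-style self-ratings (1 = novice ... 5 = expert).
from scipy.stats import kruskal

pre_test = [2, 3, 2, 1, 3, 2, 2, 3]    # hypothetical baseline ratings
post_test = [4, 4, 3, 3, 5, 4, 3, 4]   # hypothetical ratings after training

# kruskal compares the rank distributions of the two groups and
# returns the H statistic and an asymptotic p-value.
statistic, p_value = kruskal(pre_test, post_test)
print(f"H = {statistic:.2f}, p = {p_value:.4f}")

# A p-value below 0.05 would indicate a statistically significant
# change in self-rated competency between pre-test and post-test.
if p_value < 0.05:
    print("Significant change between pre-test and post-test")
```

The same call would be repeated for each of the subcategories being evaluated, with an adjustment for multiple comparisons if several subcategories are tested at once.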
American Association of Colleges of Nursing (2008). The essentials of baccalaureate education for professional nursing practice. Retrieved from http://www.aacn.nche.edu/Education/pdf/BEdraft.pdf
American Nurses Association (2001). Nursing informatics: Scope and standards of practice. Washington, DC: American Nurses Association.
American Nurses Association (2015). Nursing informatics: Scope and standards of practice (2nd ed.). Silver Spring, Maryland: American Nurses Association.
Camilli, S. (2014). Plugging into nursing informatics: Preparation, practice, and beyond. Canadian Journal of Nursing Informatics (CJNI), 9(1-2). Retrieved from http://cjni.net/journal/?p=3508
Choi, J. & Zucker, D. M. (2012). Self-assessment of nursing informatics competencies for doctor of nursing practice students. Journal of Professional Nursing, 29(6), 381-387.
Choi, J. & De Martinis, J. E. (2013). Nursing informatics competencies: Assessment of undergraduate and graduate nursing students. Journal of Clinical Nursing, 22, 1970-1976. doi: 10.1111/jocn.12188
Chonsilapawit, T. & Rungpragayphan, S. (2016). Skills and knowledge of informatics, and training needs of hospital pharmacists in Thailand: A self-assessment survey. International Journal of Medical Informatics, 94, 255-262.
De Gagne, J. C., Bisanar, W. A., Makowski, J. T., & Neumann, J. L. (2012). Integrating informatics into the BSN curriculum: A review of the literature. Nurse Education Today, 32, 675-681. doi: 10.1016/j.nedt.2011.09.003
Found, J. (2012). Developing competency in baccalaureate nursing education: Preparing Canadian nurses to enter today’s practice environment. Canadian Journal of Nursing Informatics, 7(2), 320-329.
Henry J. Kaiser Family Foundation (2016). Total number of professionally active nurses. Retrieved from http://kff.org/other/state-indicator/total-registered-nurses/?currentTimeframe=0
Hill, T., McGonigle, D., Hunter, K. M., Sipes, C., & Hebda, T. L. (2014). An instrument for assessing advanced nursing informatics competencies. Journal of Nursing Education and Practice, 4(7), 104-112.
Hunter, K., McGonigle, D., & Hebda, T. (2013). The integration of informatics content in baccalaureate and graduate nursing education. Nurse Educator, 38(3), 110-113. doi: 10.1097/NNE.0b013e31828dc292
Hwang, J. & Park, H. (2011). Factors associated with nurses’ informatics competency. CIN: Computers, Informatics, Nursing, 29(4), 256-262.
Institute of Medicine (2003). Health professions education: A bridge to quality. Washington, DC: National Academies Press. Retrieved from http://www.iom.edu/Reports/2003/health-professions-education-a-bridge-t…; quality.aspx
Institute of Medicine (2010). The future of nursing: Leading change, advancing health. Retrieved from https://www.nap.edu/catalog/12956/the-future-of-nursing-leading-change-advancing-health
Kleib, M., Simpson, N., & Rhodes, B. (2016). Information and communication technology: Design, delivery, and outcomes from a nursing informatics boot camp. OJIN: The Online Journal of Issues in Nursing, 21(2), Manuscript 5.
Kleib, M., Sales, A. E., Lima, I., Andea-Baylon, M., & Beaith, A. (2010). Continuing education in informatics among registered nurses in the United States in 2000. The Journal of Continuing Education in Nursing, 41(7), 329-336. doi: 10.3928/00220124-20100503-08
Liu, C. H., Lee, T. T., & Mills, M. E. (2015). The experience of informatics nurses in Taiwan. Journal of Professional Nursing, 31(2), 158-164. doi: 10.1016/j.profnur.2014.09.005
McGonigle, D., Hunter, K., Sipes, C., & Hebda, T. (2014). Why nurses need to understand nursing informatics. AORN Journal, 100(3). Retrieved from https://www.ncbi.nlm.nih.gov/pubmed/25172566
National League for Nursing (2008). Preparing the next generation of nurses to practice in a technology-rich environment: An informatics agenda. Retrieved from http://www.nln.org/docs/default-source/professional-development-program…; the-next-generation-of-nurses.pdf?sfvrsn=6
Peltonen, L.-M., Topaz, M., Ronquillo, C., Pruinelli, L., Sarmiento, R. F., Badger, M. K., … Alhuwail, D. (2016). Nursing informatics research priorities for the future: Recommendations from an international survey. Studies in Health Technology and Informatics, 225, 222-226.
Schleyer, R. H., Bruch, C. K., & Schoessler, M. T. (2011). Defining and integrating informatics competences into a hospital nursing department. CIN: Computers, Informatics, Nursing, 29(3), 167-173.
Shultz, C. (2015). Preparing to work in an informatics-based world. National Student Nurses’ Association Publication, 56(3), 36-39. Retrieved from http://www.nsna.org/Portals/0/Skins/NSNA/pdf/Imprint_AprMay09_Feat_Shul…
Sipes, C. (2016). Project management: Essential skill of nurse informaticists. Studies in Health Technology and Informatics, 225, 252-256. doi: 10.3233/978-1-61499-658-3-252
Sipes, C., McGonigle, D., Hunter, K., Hebda, T., Hill, T., & Lamblin, J. (2016). Operationalizing the TANIC and NICA-L3/L4 tools to improve informatics competencies. Studies in Health Technology and Informatics, 225, 292-296. doi: 10.3233/978-1-61499-658-3-292
Staggers, N., Gassert, C. A., & Curran, C. (2001). Informatics competencies for nurses at four levels of practice. Journal of Nursing Education, 40(7), 303-316.
Sun, X., & Falan, S. (2013). What is your informatics skills level? The reliability of an informatics competency measurement tool. Transactions of the International Conference on Health Information Technology Advancement. Retrieved from http://scholarworks.wmich.edu/ichita_transactions/31
Technology Informatics Guiding Education Reform (2009). TIGER informatics competencies collaborative final report. Retrieved from http://tigercompetencies.pbworks.com/f/TICC_Final.pdf
Technology Informatics Guiding Education Reform (2014). Informatics competencies for every practicing nurse: Recommendations from the TIGER collaborative. Retrieved from http://www.thetigerinitiative.org/docs/TigerReport_InformaticsCompetenc…
Tellez, M. (2012). Nursing informatics education: Past, present, and future. CIN: Computers, Informatics, Nursing, 30(5), 229-234. doi: 10.1097/NXN.0b013e3182569f42
Leyla Pordeli, DNP, MBA, RN, completed her DNP in Nursing Leadership and her post-master’s certificate in Nursing Informatics in the spring of 2017. She is an instructor in the Keigwin School of Nursing at Jacksonville University.