Citation: Mwase, A., Bwiino, K. & Kyambadde, A. (2022). An evaluation of a framework for supporting eHealth service delivery in a Ugandan rural setting. Online Journal of Nursing Informatics (OJNI), 26(2).
Background: eHealth is an emerging field at the intersection of medical informatics, public health, business, and clinical practice. It can be applied to meet the rising global demand for cost-effective and reliable services in the health sector. As a result, eHealth has been put forward on the planning agendas of different organizations.
Purpose: Consequently, this study aims at evaluating a designed framework for supporting eHealth service delivery in a rural setting in Uganda, specifically in the Iganga district.
Method: A quantitative research approach was adopted, and the framework was evaluated using the Delphi technique amongst 13 eHealth experts. The evaluation criteria consisted of three parameters: functionality, usability, and traceability. SPSS v20 was used for data analysis, and descriptive statistics were generated.
Results: The results revealed that the framework was usable, understandable, and applicable in addressing the major challenges hindering eHealth service delivery in a rural setting. The study recommends further studies focused on developing a framework for enhancing adoption of eHealth service delivery in Uganda.
In developing countries, especially where a large proportion of the population still resides in rural areas, healthcare access and delivery are often poor and can potentially benefit from innovative service models and supporting technologies (Barjis et al., 2013). Health perspectives often differ between rural and urban communities. The health perceptions of rural and urban residents significantly shape their health-promotion behaviors, health maintenance, and illness treatment (Ouma & Herselman, 2009).
Health service delivery to rural communities has always been challenged, since specialized services and infrastructure are usually less available (Ouma & Herselman, 2009). It is argued that rural communities are confronted with the out-migration of working-age adults from rural to urban areas and the in-migration of former urban dwellers, often at retirement age. This affects the quality of life and health in rural communities (Hage et al., 2013). Nonetheless, electronic health (eHealth) services are seen as one solution to these concerns. This is because eHealth provides diverse web portals and can encompass both core healthcare services and social innovation (Hage et al., 2013). eHealth is the application of information and communication technologies (ICT) across a whole range of functions that affect the healthcare industry (Ouma & Herselman, 2009).
eHealth initiatives present the ability to tackle challenges that exist within the healthcare industry, especially for rural communities (Ouma & Herselman, 2009). These initiatives include artifacts in the form of models, frameworks, and web-based and mobile-based applications or platforms. However, it is argued that a designed artifact ought to be verified for quality, effectiveness, and efficiency using well-grounded assessment approaches (Iivari & Venable, 2009). Offermann et al. (2007) added that a designed artifact can be evaluated in terms of completeness, accuracy, functionality, reliability, consistency, traceability, understandability, and usability. In another related study, the researchers rigorously evaluated designed artifacts using observational, analytical, experimental, testing, and descriptive evaluation methods (Hevner et al., 2004). Oesterle et al. (2010) further argued that artifact evaluation should be done through laboratory experiments, pilot applications (i.e., instantiation of prototypes), simulation procedures, expert reviews (Delphi technique approach), and field experiments (i.e., in several user organizations). This paper, therefore, is aimed at evaluating a designed framework for supporting eHealth service delivery in Iganga District, a rural setting of Uganda, using the expert judgment evaluation method.
Healthcare in Rural Areas
While urban localities have healthcare options ranging from five-star medical colleges to small private dispensaries run by trained doctors, rural areas are often left with untrained private practitioners (Ouma & Herselman, 2009). The quality of healthcare in rural areas is often constrained. In these areas, the challenges to healthcare quality are many, ranging from poor infrastructure, low literacy, and poverty to inadequate monitoring of patients with chronic or serious diseases (Barjis et al., 2013). Patients in rural areas often incur heavy expenditures from traveling long distances and spending much time to consult with specialists in urban areas, due to the lack of specialists locally (Sudhahar et al., 2010). This myriad of challenges requires innovative solutions that are affordable, robust, and sustainable over time (Barjis et al., 2013).
Evaluation of Artifacts
It is argued that evaluation is a core activity in conducting design science research and artifact construction (Baskerville et al., 2018). This is because a novel IT artifact must demonstrate measurable improvements to illustrate technology evolution, advancement, and thus acceptance (Elragal & Haddara, 2019).
According to Venable et al. (2012), evaluations provide evidence that a new technology developed in design science research (DSR) works or achieves the purpose for which it was designed. Without evaluation, outcomes of DSR are unsubstantiated assertions that the designed artifacts, if implemented and deployed in practice, will achieve their purpose. Hevner et al. (2004) further posited that to rigorously reveal the quality of a designed artifact, design science requires proper artifact evaluation. This presents a formal procedure to determine whether the artifact is complete, effective, and applicable. It is suggested that evaluation of a designed artifact may be performed at two levels: the abstract artifact is either assessed directly, or through one or several instantiations (Prat et al., 2014).
Additionally, evaluation criteria are classified along system dimensions and may be decomposed into several levels, forming a hierarchy. The same criterion may be assessed by several generic evaluation methods. Generic evaluation methods vary along four fundamental characteristics: form of evaluation, secondary participant, level of evaluation, and relativeness of evaluation (Prat et al., 2014). The structure of artifacts can be assessed by completeness, simplicity, clarity, style, homomorphism, level of detail, and consistency (March & Smith, 1995), and Sonnenberg and vom Brocke (2012) emphasized the criterion of clarity.
Parameters for the evaluation criteria
A designed artifact can be evaluated in terms of functionality, completeness, consistency, accuracy, performance, reliability, usability, fit with the organization, and other relevant quality attributes (Venable et al., 2004; Hevner et al., 2004). Evaluating artifacts can be time-consuming, mainly because it involves many parameters, some of which are difficult to apply (Van Hee & Van Overveld, 2012; Abima, 2015). Due to time constraints, the study adopted three parameters for the evaluation criteria of the Framework for Supporting eHealth Service Delivery (FSEHSD). It was considered most important to determine whether the framework performed its functions well (functionality), was easy to use (usability), and had traceability. Each of the evaluation parameters is explained below:
Usability (ease of use): Usability is the degree to which a product can be used by specified users to realize intended objectives with efficiency, effectiveness, and satisfaction in a specified context of use (Niazi et al., 2003). The purpose of this parameter is to identify areas of confusion and ambiguity for users which, when improved, increase the efficiency and quality of users’ experience with the framework.
Traceability: Traceability is defined as the ability to chronologically interrelate uniquely identifiable entities in a way that matters (Olsen & Borit, 2012). This parameter was used to measure how well the framework’s steps and guidelines/principles can be traced in the designed framework. It looks at how the framework requirements can be traced from their origin through the interconnections and interdependencies.
Functionality: The functionality of an entity is defined as its intended behavior, the interpretation of that behavior under a goal, a kind of hierarchical abstraction, or its effects on the entity’s environment (Kitamura & Mizoguchi, 1999). This parameter was used to measure whether the framework addresses all the major challenges hindering eHealth service delivery in the Iganga District.
The Delphi technique is a group process used to survey and collect the opinions of experts on a particular subject. The Delphi technique is applied whenever policies, plans, or ideas have to be based on informed judgment. This technique is useful where the opinions and judgments of experts and practitioners are needed (Yousuf, 2007). It is observed that the Delphi technique uses a series of judges as experts to define or evaluate components of a theoretical issue (Linstone & Turoff, 1975).
According to Giannarou and Zervas (2014), there are two important factors when conducting the Delphi technique, namely the panel size and the response rate. In both cases, there are no strict rules for conducting the technique. It is inferred that the group size is highly related to the purpose of the investigation (Giannarou & Zervas, 2014), and the response rate may differ across disciplines according to the participants’ research interests (Mason & Alamdari, 2007). It is proposed that the panel size range from 7 to 30 (Mullen, 2003). Additionally, it is claimed that panel size selection is determined by the panel’s homogeneity; in a homogeneous group, a sample of between 10 and 15 people can yield sufficient results and assures validity (Skulmoski et al., 2007).
This study adopted the Design Science Research (DSR) methodology. Design science methodology attempts to create items that serve human purposes, is technology oriented (March and Smith, 1995) and is a paradigm in information system science for understanding, executing, and evaluating research that aims at designing new and novel artifacts intended to solve identified organizational problems (Hevner et al., 2004). The artifacts can be defined as constructs, models, methods, or instantiations.
A quantitative research approach was adopted and used in the evaluation exercise. Avison and Pries-Heje (2005) argued that quantitative research enables researchers to answer scholarly and practical questions about the interaction of humans and artifacts such as computer systems and applications.
According to Skulmoski et al. (2007), when applying the Delphi technique, a sample of between 10 and 15 people yields sufficient results. In this study, therefore, 13 eHealth experts were purposively selected from the five health facilities within two divisions (Central and Northern) of the Iganga District. A five-point Likert scale questionnaire was administered for data collection. SPSS v20 software was used for data analysis, and descriptive statistics were generated.
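The study reports only that descriptive statistics were generated in SPSS v20. As an illustrative sketch, the same kind of summary for a single five-point Likert item can be computed as follows; the response values below are hypothetical, not the study’s data:

```python
from statistics import mean, median

# Hypothetical 5-point Likert responses (1 = strongly disagree ... 5 = strongly agree)
# from a panel of 13 experts for one questionnaire item. Illustrative only; the
# actual study data were analyzed in SPSS v20.
responses = [4, 5, 3, 4, 4, 2, 5, 4, 3, 4, 5, 4, 4]

def describe(scores):
    """Return the descriptive statistics typically reported for a Likert item."""
    agree = sum(1 for s in scores if s >= 4)  # "agree" or "strongly agree"
    return {
        "n": len(scores),
        "mean": round(mean(scores), 2),
        "median": median(scores),
        "percent_agree": round(100 * agree / len(scores), 1),
    }

print(describe(responses))
# → {'n': 13, 'mean': 3.92, 'median': 4, 'percent_agree': 76.9}
```

The "percent agree" figure (responses of 4 or 5) corresponds to the agreement percentages reported in the results tables below.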
To ensure validity, the questionnaire was tested to check its content, construct, and face validity. Content validity refers to how well an instrument includes a representative sample of questions that relate to the content domain being measured (Patten, 2004), while construct validity concerns the nature of the psychological construct or characteristic being measured by the instrument. Content and construct validity were ensured by experts, supervisors, and peers from Makerere University, who helped in the review to ensure the instrument accurately measured the variables it was intended to measure in the study. Face validity dealt with instrument format and included aspects such as clarity of printing, font size and type, adequacy of workspace, and appropriateness of language, among others identified by peer review.
Reliability indicates the degree to which a survey instrument is consistent in what it measures. To ensure reliability, the instrument was pre-tested with a sample of 20 health workers in the Iganga District, who were not necessarily included in the final sample. The number 20 was chosen for the pre-test because, according to Israel (2003), it is the smallest number that can yield meaningful results on data analysis in a survey study. Pre-test results showed that the questions were easily understood and answered in the same way by the sampled health workers; hence, the instrument was reliable.
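The paper does not state which reliability statistic was computed during the pre-test. A common choice for Likert-scale instruments is Cronbach’s alpha, sketched below on hypothetical pre-test scores purely as an illustration of how internal consistency can be quantified:

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for a list of items, each a list of per-respondent scores."""
    k = len(item_scores)

    def var(xs):
        # Sample variance (n - 1 denominator).
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var = sum(var(item) for item in item_scores)
    totals = [sum(scores) for scores in zip(*item_scores)]  # per-respondent totals
    return (k / (k - 1)) * (1 - item_var / var(totals))

# Hypothetical 5-point Likert scores for 3 items from 5 pre-test respondents
# (illustrative only; not the study's pre-test data).
items = [
    [4, 5, 3, 4, 4],
    [4, 4, 3, 5, 4],
    [5, 4, 2, 4, 4],
]
alpha = cronbach_alpha(items)
print(round(alpha, 2))  # values above ~0.7 are conventionally taken as acceptable
```

Here alpha comes out at roughly 0.80, which would conventionally indicate acceptable internal consistency for a pre-test of this kind.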
This study adopted the expert judgment (Delphi technique approach) evaluation method to evaluate the FSEHSD. This is because expert judgment relies on a group of experienced scientists with a good understanding of the problems at hand, who are the most knowledgeable and capable members of society to judge the relative significance of interventions (Virtanen et al., 1999). Expert judgment further plays a vital role in risk management, uncertainty analysis, and decision-making (Beaudrie et al., 2016).
The Evaluated Framework (FSEHSD)
In this study, the artifact is the designed Framework for Supporting eHealth Service Delivery in a rural setting of Uganda (FSEHSD). This is presented in Figure 1.
The FSEHSD was designed based on design decisions adopted from existing frameworks, namely the Shifo framework for sustaining eHealth and health service delivery (Shifo Foundation, 2015) and the RapidSMS framework for healthcare service delivery (Ministry of Health, 2013), together with decisions derived from primary data. The various components of the FSEHSD were graphically illustrated using Microsoft Office Visio, since this software makes it easy to draw complicated diagrams.
Explanation of the Framework for Supporting eHealth Service delivery in a Ugandan rural setting
A favorable government policy should provide a legal framework that supports the eHealth budget, which in turn makes funds available for eHealth infrastructure. This infrastructure includes secure backup of data to a secondary device or location and power backups such as standby generators and solar power. A constant power supply then enables the internet connectivity needed to support eHealth service delivery.
Availability of funds will also aid capacity building, enabling the recruitment of qualified staff and ICT personnel, ICT training, and the introduction of eHealth to the public. Once the staff are trained, they can develop, operate, and provide sustainable technical support for coordinated eHealth systems. eHealth systems should provide data reporting and a feedback mechanism in the form of SMS alerts and emails to service recipients as well as stakeholders. Finally, appropriate monitoring and evaluation techniques should be implemented across the whole chain to ensure effectiveness and comprehensiveness. Thus, the FSEHSD can be used to support eHealth service delivery in Ugandan rural settings, since most of the challenges concerning eHealth service delivery have been addressed. Moreover, the experts’ opinions have also been incorporated.
A total of 13 participants were purposively selected to participate in the evaluation exercise. These participants included 2 database administrators, 2 IT/IS managers, and 2 IT/IS experts from the five health facilities, and 3 eHealth experts.
Framework usability (ease of use)
The results presented in Table 2 indicate that the majority (72.7%) of the respondents agreed that the framework is easy to understand. In addition, 63.6% of the respondents indicated that the framework requires little or no training to be used. It was also observed that 63.6% of the respondents reported that the framework is easy to learn and use.
The results presented in Table 3 indicate that a majority (81.8%) of the respondents indicated that the various components of the framework are interdependent on each other. In addition, 72.7% of the respondents indicated that guidelines/principles of the framework are interrelated. It was also observed that 54.5% of the respondents indicated that the factors/variables leading to the support of eHealth are logically arranged.
Regarding the functionality evaluation parameter, the results indicated that the majority (54.5%) of the respondents reported that the framework addresses all the major challenges hindering eHealth service delivery in the Iganga district. In addition, 72.7% of the respondents agreed that the framework simplifies the process of supporting eHealth service delivery by providing guidelines or principles to be followed. Lastly, it was also observed that 63.6% of the respondents indicated that the framework contributed to the support of eHealth service delivery in the Iganga district. This implies that the FSEHSD can support eHealth service delivery in the Iganga District since most of the challenges concerning eHealth service delivery have been addressed.
The results in Table 5 indicated that all the experts thought that the framework was easy to learn and use, and the majority (66.7%) agreed that the framework is easy to understand. In addition, 33.3% of the experts indicated that the framework requires little or no training to be used.
The results in Table 6 indicate that 33.3% of respondents reported that the various components of the framework are interdependent on each other, 66.7% of the respondents reported that the guidelines/principles of the framework are interrelated and 33.3% of respondents disagreed that the factors/variables leading to supporting of eHealth were logically arranged.
eHealth experts’ evaluative opinions regarding the parameter of functionality indicated that the majority (66.7%) agreed that the framework addresses all the major challenges hindering eHealth service delivery in the Iganga district. In addition, 33.3% of the respondents indicated that the framework simplifies the process of supporting eHealth service delivery by providing guidelines or principles to be followed. Lastly, it was also observed that 66.7% of the respondents indicated that the framework contributes to the support of eHealth service delivery in the Iganga district. This means that the FSEHSD can be used to support eHealth service delivery in the Iganga District, since the majority of the challenges concerning eHealth service delivery have been addressed.
General Expert recommendations for an improved FSEHSD
Through face-to-face interactions, the experts recommended adding government consideration as one of the framework’s constructs. The researchers took this up and represented it on the framework. Additionally, the experts suggested that the variables of the FSEHSD be rearranged into a more uniform flow, such as bottom-up, top-to-bottom, or left-to-right. The researchers addressed this, and the FSEHSD variables were rearranged into the new top-to-bottom flow seen in Figure 1.
One of the experts suggested the need to consider integration and interoperability of information from different eHealth systems. With the staff ICT training, human resource recruitment, and capacity building constructs as part of the FSEHSD, staff will be equipped with the skills to develop interoperable eHealth systems that integrate information from different eHealth systems. Therefore, the FSEHSD was fine-tuned and improved to incorporate the experts’ suggestions.
In conclusion, this study evaluated a designed framework for supporting eHealth service delivery (FSEHSD) in a rural setting to ensure that health facilities seamlessly deliver services and exchange information amongst health workers and patients. The FSEHSD was evaluated using the Delphi technique, and the findings revealed that the framework was useful and that its layout was understandable and applicable. This affirmed that the FSEHSD can support eHealth service delivery in a Ugandan rural setting.
The potential for effective use of eHealth initiatives in rural healthcare is promising. Therefore, this study recommends that health service providers use the developed framework (FSEHSD) when planning and implementing new electronic services and technological innovations. Further research should focus on developing a framework for enhancing adoption of eHealth service delivery in rural areas.
Powered by the HIMSS Foundation and the HIMSS Nursing Informatics Community, the Online Journal of Nursing Informatics is a free, international, peer reviewed publication that is published three times a year and supports all functional areas of nursing informatics.
Abima, B. (2015). A service-oriented framework for guiding the development of interoperable-health systems in Uganda. Makerere University, Kampala.
Avison, D.E., & Pries-Heje, J. (2005). Research in information systems: a handbook for research supervisors and their students. Butterworth-Heinemann.
Barjis, J., Kolfschoten, G., & Maritz, J. (2013). A sustainable and affordable support system for rural healthcare delivery. Decision Support Systems, 56, 223-233.
Baskerville, R., Baiyere, A., Gregor, S., Hevner, A. & Rossi, M. (2018). Design science research contributions: Finding a balance between artifact and theory. Journal of the Association for Information Systems, 19(5), 358–376.
Beaudrie, C., Kandlikar, M. & Ramachandran, G. (2016). Using expert judgment for risk assessment. In G. Ramachandran (Ed.), Assessing Nanoparticle Risks to Human Health (pp. 109-138). William Andrew Publishing. https://doi.org/10.1016/B978-1-4377-7863-2.00005-4
Elragal, A. & Haddara, M. (2019). Design Science Research: Evaluation in the Lens of Big Data Analytics. Systems, 7(2), 27. https://doi.org/10.3390/systems7020027
Giannarou, L., & Zervas, E. (2014). Using Delphi technique to build consensus in practice. International Journal of Business Science & Applied Management (IJBSAM), 9(2), 65-82.
Hage, E., Roo, J.P., van Offenbeek, M.A. & Boonstra, A. (2013). Implementation factors and their effect on e-Health service adoption in rural communities: a systematic literature review. BMC Health Service Research, 13, Article 19.
Hevner, A., March, S., Park, J. & Ram, S. (2004) Design Science Research in Information Systems. Management Information Systems Quarterly, 28(1), p. 75-105.
Iivari, J. & Venable, J. R. (2009). Action research and design science research-Seemingly similar but decisively dissimilar. European Conference on Information Systems (ECIS) Proceedings. https://core.ac.uk/download/pdf/301355252.pdf
Israel, G. (2003). Determining sample size. University of Florida.
Khoja, S., Durrani, H., Scott, R. E., Sajwani, A & Piryani, U. (2013). Conceptual Framework for Development of Comprehensive e-Health Evaluation Tool. Telemedicine and e-health, 19(1), 48 – 53.
Kitamura,Y. & Mizoguchi, R. (1999). Meta-Functions of Artifacts. Proceedings of The Thirteenth International Workshop on Qualitative Reasoning (QR-99), pp.136-145.
Kothari, C.R. (2009). Research Methodology: Methods & Techniques. New Age International Ltd.
Linstone, H. A., & Turoff, M. (Eds.). (1975). The Delphi method (pp. 3-12). Addison-Wesley.
March, S. T. and Smith, G. F. (1995). Design and natural science research on information technology. Decision Support Systems, 15(4), 251-266.
Mason, K. J. & Alamdari, F. (2007). EU network carriers, low-cost carriers and consumer behavior: A Delphi study of future trends. Journal of Air Transport Management, 13(5), 299-310.
Ministry of Health (2010). Health Sector Strategic Plan III 2010/11-2014/15. https://www.health.go.ug/docs/HSSP_III_2010.pdf
Ministry of Local Government (2013). Principles of Service Delivery in Uganda’s Local Governments Handbook. https://www.undp.org/uganda/publications/principles-service-delivery-uganda%E2%80%99s-local-governments-handbook
Mullen, P. M. (2003). Delphi: Myths and reality. Journal of Health Organization and Management, 17(1), 7-52.
Niazi, M., Wilson, D. & Zowghi, D. (2003). A model for the implementation of software process improvement: a pilot study. Third International Conference on Quality Software. Proceedings 2003, p. 196-203, doi: 10.1109/QSIC.2003.1319103.
Oesterle, H., Becker, J., Frank, U., Hess, T., Karagiannis, D., Krcmar, H., Loos, P., Mertens, P., Oberweis, A. & Sinz, E. (2010). Memorandum on design-oriented information systems research. European Journal of Information Systems, 20(1), 7-10.
Olsen, P. & Borit, M. (2012). How to define traceability. Trends in Food Science & Technology, 29(2), 1-9.
Ouma, S. & Herselman, M. (2009). E-health in rural areas: Case of developing countries. International Journal of Humanities and Social Sciences, 2(4), 560-566.
Patten, M.L. (2004). Understanding research Methods. (4th ed.). Pyrczak Publishing.
Prat, N., Comyn-Wattiau, I., & Akoka, J. (2014). Artifact Evaluation in Information Systems Design-Science Research - a Holistic View. PACIS Proceedings.
Shifo Foundation. (2015). Uganda. https://www.shifo.org/country/uganda
Skulmoski, G. J., Hartman, F. T., & Krahn, J. (2007). The Delphi method for graduate research. Journal of Information Technology Education, 6, 1-21.
Sonnenberg, C., vom Brocke, J. (2012). Evaluations in the Science of the Artificial – Reconsidering the Build-Evaluate Pattern in Design Science Research. In: Peffers, K., Rothenberger, M., Kuechler, B. (eds) Design Science Research in Information Systems. Advances in Theory and Practice. DESRIST 2012. Lecture Notes in Computer Science, vol 7286. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-29863-9_28
Sudhahar, S., Vatsalan, D., Wijethilake, D., Wickramasinghe, Y., Arunathilake, S., Chapman, K., & Seneviratna, G. (2010, February). Enhancing rural healthcare in emerging countries through an eHealth solution. In 2010 Second International Conference on eHealth, Telemedicine, and Social Medicine (pp. 23-28). IEEE.
van Hee, K. & van Overveld, K. (2012). New criteria for assessing a technological design. School for Technological Design, Stan Ackermans Institute. https://www.4tu.nl/sai/testimonials/2012-12-10_2012_April_NewCriteriaSA…
Venable, J., Pries-Heje, J. & Baskerville, R. (2016). FEDS: a Framework for Evaluation in Design Science Research. European Journal of Information Systems, 25, 77–89
Venable, J., Pries-Heje, J., & Baskerville, R. (2012). A comprehensive framework for evaluation in design science research. In K. Peffers, M. Rothenberger & B. Kuechler (Eds.), Design Science Research in Information Systems. Advances in Theory and Practice (Vol. 7286, pp. 423-438). Springer.
Virtanen, Y., Torkkeli, S., & Wilson, B. (1999). Evaluation of a Delphi technique based expert judgement method for LCA valuation DELPHI II. VTT Technical Research Centre of Finland. VTT Tiedotteita - Meddelanden - Research Notes No. 1972 https://publications.vtt.fi/pdf/tiedotteet/1999/T1972.pdf
Yousuf, M. I. (2007). Using Experts` Opinions Through Delphi Technique. Practical Assessment, Research, and Evaluation, 12, Article 4.
Ali Mwase is a Lecturer at Makerere University Business School. He has a Master of Information Technology and has conducted research in hospital management information systems and eHealth. Currently, he is pursuing a PhD in Information Systems and studying the area of fintech security.
Keefa Bwiino is a Lecturer at Makerere University Business School. He has a Master of Information Technology and has conducted research in eLearning, and he is currently pursuing a PhD in Information Technology studying the area of eLearning.
Abdnoor Kyambadde is a Lecturer at Makerere University Business School. He has a Master of Information Technology and has conducted research in cloud computing, and he is currently pursuing a PhD in Information Technology studying the area of cloud computing.