Annals of Circulation
Research Article       Open Access      Peer-Reviewed

Evaluating the relevance and clarity of the heart failure eMeasure implementation toolkit by using a web-based survey instrument

Megha Kalsy1-4*, Katherine Sward4,5, Bruce Bray3-5, Andrew Redd3,6, Karen Eilbeck4,7 and Jennifer H Garvin1-4

1VA Center for Health Information and Communication, Richard L. Roudebush VAMC, 1481 W 10th St, Indianapolis, IN, 46202, USA
2Division of Health Information Management and Systems, School of Health and Rehabilitation Sciences, The Ohio State University, 453 W 10th Ave., Columbus, OH, 43210, USA
3Informatics, Decision-Enhancement, and Analytic Sciences Center (IDEAS 2.0), VA Salt Lake City Health Care System, 500 Foothill Dr., Salt Lake City, UT, 84148, USA
4Department of Biomedical Informatics, University of Utah, 421 Wakara Way #140, Salt Lake City, Utah, 84108, USA
5The College of Nursing, University of Utah, 10 N 2000 E, Salt Lake City, Utah, 84112, USA
6University of Utah School of Medicine, Division of Epidemiology, 295 Chipeta Way, Salt Lake City, UT, 84132, USA
7University of Utah School of Medicine, Department of Human Genetics, 15 N 2030 E, Salt Lake City, UT, 84132, USA
*Corresponding author: Megha Kalsy, PhD, Division of Health Information Management and Systems, School of Health and Rehabilitation Sciences, The Ohio State University, 453 W 10th Ave, Columbus, OH, 43210, USA, Tel: 4409858404; E-mail: megha.kalsy@utah.edu
Received: 21 May, 2019 | Accepted: 03 August, 2019 | Published: 05 August, 2019
Keywords: eMeasures; Quality improvement; Health informatics; Survey; Heart failure; Implementation toolkit

Cite this as

Kalsy M, Sward K, Bray B, Redd A, Eilbeck K, et al. (2019) Evaluating the relevance and clarity of the heart failure eMeasure implementation toolkit by using a web-based survey instrument. Ann Circ 4(1): 001-008. DOI: 10.17352/ac.000014

Background: The impact on workflow is an important component in determining whether a Health Information Technology (HIT) implementation will be successful. Workflow is, unfortunately, a concept that is often ignored when implementing HIT, and the literature on workflow in the domains of quality improvement, system implementation, and process improvement remains inadequate. HIT is not always designed to fit the workflow of a given organization, making it difficult to truly assess the impact of HIT on outcomes or processes.

The context and purpose of the study: The purpose of this study was to establish an initial, generalizable toolkit to assess the impact of implementation on the workflow of quality improvement and information technology professionals and their teams in the inpatient hospital setting. The toolkit, in the form of an implementation guide, is a compilation of resources such as checklists, forms, and planning documents that provide a template for workflow analysis.

Results: The expert evaluators rated the tools as moderately relevant and moderately clear. The post-implementation tools were rated highest for relevance. Comments predominantly highlighted areas of the toolkit that needed additional depth or detail, rather than suggesting additional tools.

Main findings: This study provided a methodology for identifying information needs, detecting conflicts, and implementing possible workflow solutions; thus, the toolkit could serve as a pragmatically useful implementation guide to assess the workflow of quality improvement and information technology professionals and their teams during the implementation of eMeasures in an inpatient hospital setting.

Conclusions: The toolkit provided a useful collection of tools in the form of checklists, forms, and planning documents to enhance the workflow during implementation of heart failure (HF) eMeasures.

Brief summary: We developed an initial eMeasure Implementation Toolkit for the heart failure (HF) eMeasure to allow QI and information technology (IT) professionals and their teams to assess the impact of implementation on workflow. During the development of the toolkit, we conducted stakeholder interviews with VA key informants and subject matter experts, who provided valuable contextual information about workflow during the implementation of eMeasures. The information obtained from this stakeholder engagement highlighted areas and tools that were essential to the toolkit development phase. The final phase involved the evaluation of the eMeasure Implementation Toolkit for relevance and clarity by non-VA experts, who used a survey to evaluate primarily the sections of the toolkit containing the tools for assessing workflow during eMeasure implementation.

Any potential implications: This eMeasure Implementation Toolkit should continue to be refined. First, the toolkit should undergo further evaluation with a variety of subject matter experts from various job categories and medical centers. Second, the toolkit should be tailored to the level of cognition of each user, with attention to the syntax used to describe the sections of the toolkit. Finally, the toolkit could be developed for other quality measures, such as stroke, diabetes, and pneumonia, based on the research methodology used in the development of the existing toolkit.

Abbreviations

HIT: Health Information Technology; QI: Quality Improvement; eMeasure: Electronic Performance Measure; HF: Heart Failure; IT: Information Technology; PARIHS: Promoting Action on Research Implementation in Health Services; QM: Quality Measurement; SME: Subject Matter Expert; VA: Veterans Affairs; UDOH: Utah Department of Health; SQL: Structured Query Language; NLP: Natural Language Processing.

Background

Technology is rapidly transforming healthcare by enabling the sharing of real-time health information across institutions to support patient care, administration, and research [1]. Health Information Technology (HIT) tools are being used as a component of interventions to improve the quality of care and to reduce costs [2]. Given their capacity to reduce costs, informatics methods are integral to healthcare quality metric assessment and reporting. Quality Improvement (QI) activities (e.g., data gathering) from sources such as electronic health records (EHRs), data warehouses, and decision support systems facilitate the evaluation of quality metrics [3]. Although HIT support for QI activities is increasing, little research has been done on the workflow involved in the automation of quality metric assessment and reporting [4-6].

The impact on workflow is an important component in determining whether an HIT implementation will be successful [7-9]. Workflow is, unfortunately, a concept that is often ignored when implementing HIT, and the literature on workflow in the domains of quality improvement, system implementation, and process improvement remains inadequate [10,11]. HIT is not always designed to fit the workflow of a given organization, making it difficult to truly assess the impact of HIT on outcomes or processes [12-14]. The literature demonstrates inadequate sophistication in studies of the role of workflow in the adoption of HIT for QI, owing to the absence of formal workflow designs and methodologies, a lack of comprehensive knowledge about the system, and a lack of interest among quality improvement staff in using the new technology [15,16].

Conducting a comprehensive workflow analysis is a critical step in HIT implementation. Workflow analysis allows health centers to critically analyze the work environment. Workflow is loosely defined as a set of tasks that can be grouped chronologically into processes, together with the set of people or resources needed for those tasks to accomplish an end goal [16]. One solution to implementation challenges such as the lack of attention to workflow is an implementation toolkit [17,18]. An implementation toolkit is an assembly of instruments such as checklists, forms, and planning documents. Implementation toolkits are intended to provide guidance or assistance; they may provide a template or blueprint for what to do, when to do it, and how to do it. Users can apply an implementation toolkit in its entirety, or apply only those portions that are informative for their needs. In this research, the workflow comprises the set of processes needed to transition from a manual approach to data collection to an automated one, the set of people or resources available to perform this transition, and the human-technology interactions between them. To ensure that HIT successfully integrates with workflow, it is essential to understand the current system before implementing the new technology [19,20]. Therefore, an implementation toolkit that supports workflow evaluation for HIT-enabled QI efforts needs to include evaluation of both the current workflow and the potential impact of the new system on workflow.
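The working definition above, tasks grouped chronologically into processes plus the people or resources needed to perform them, can be made concrete with a small data structure. The sketch below is purely illustrative; the process, task, and resource names are hypothetical examples, not items drawn from the toolkit.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Task:
    name: str
    resources: List[str]  # people or systems needed for this task

@dataclass
class Process:
    name: str
    tasks: List[Task] = field(default_factory=list)  # kept in chronological order

# Hypothetical process for moving from manual chart abstraction to an automated eMeasure
emeasure_rollout = Process("HF eMeasure data collection", tasks=[
    Task("identify candidate records", ["QI analyst", "data warehouse"]),
    Task("abstract HF data elements", ["clinical abstractor", "EHR"]),
    Task("compute numerator and denominator", ["IT professional", "SQL tools"]),
])

print([t.name for t in emeasure_rollout.tasks])
```

Representing workflow this way makes the transition explicit: each task names the people or resources it consumes, so replacing a manual task with an automated one shows exactly which human-technology interactions change.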

The purpose of this study was to establish a generalizable toolkit to assess the impact of implementing electronic QI reporting on the workflow of quality improvement professionals and their teams in the inpatient (hospital) setting. During the development phase of the toolkit, we undertook a literature review to determine the components of the toolkit. We conducted stakeholder interviews with HIT and QI key informants and subject matter experts (SMEs) at the US Department of Veterans Affairs (VA). Key informants provided a broad understanding of the context of workflow during eMeasure implementation. Using snowball sampling, we also interviewed additional SMEs recommended by the key informants; these SMEs suggested tools and provided information essential to toolkit development. The second phase involved evaluation of the toolkit for relevance and clarity by experts in non-VA settings. The experts evaluated the sections of the toolkit that contained the tools, via a survey.

The final toolkit provides a distinct set of resources and tools, iteratively developed during the research and available to users in a single source document. The toolkit is a compilation of resources such as checklists, forms, and planning documents that provide a template for workflow analysis. The toolkit information was designed to support decision making on implementation approaches related to workflow analysis. The research methodology provided a strong, unified, overarching implementation framework in the form of the Promoting Action on Research Implementation in Health Services (PARIHS) model, in combination with a socio-technical model of HIT, which strengthened the overall design of the study. The various phases of the research are described in Figure 1.

In this article, we summarize the results of the evaluation of the eMeasure Implementation Toolkit. We used a web-based survey to evaluate stakeholder perceptions of the relevance and clarity of specific toolkit items, to assess for missing or unnecessary information, and to gauge perceptions of the potential generalizability of the toolkit. This was a descriptive study in which we recruited participants from non-VA settings, including federal and state healthcare institutions, private nonprofit healthcare organizations, academic institutions, and academic healthcare institutions. The single-point-in-time, web-based survey was distributed using REDCap, a secure, web-based application for building and managing online surveys and databases.

Materials and Methods

Theoretical Framework

The study was guided by the Promoting Action on Research Implementation in Health Services (PARIHS) model as a unified overarching research framework [21]. This framework, which was developed in 1998 to explain the process of implementing research into practice, has helped explain the variable success of many types of implementation projects. The framework suggests that three elements (evidence, context, and facilitation) exist on a continuum from weak to strong in terms of the extent to which each supports successful implementation. Evidence examines the science behind the innovation to be implemented, context examines the environment in which the implementation will occur, and facilitation examines barriers and suggests strategies to support implementation. The PARIHS model has been used to guide implementation projects in VA settings. The PARIHS model was supplemented by concepts from a Socio-Technical Model for Studying Health IT [22], containing eight dimensions: 1) hardware and software, 2) clinical content, 3) human-computer interface, 4) people, 5) workflow and communication, 6) internal organizational factors, 7) external rules, and 8) measuring and monitoring. For the purposes of this research, we combined the PARIHS and socio-technical approaches to target the areas most useful in guiding this study. This study focused on context and facilitation from the PARIHS model, and on five of the socio-technical model dimensions: hardware and software, clinical content, workflow and communication, people, and internal organizational factors.

Inclusion Criteria

We included 10 Quality Measurement (QM), HIT, and measure automation experts who held national leadership and technical roles and who had knowledge of quality performance measurement within a non-VA setting. The experts reviewed the toolkit online and participated in the online survey. The stakeholders fell into the following job categories:

1. Directors or associate directors of healthcare quality and safety

2. Primary care providers

3. Clinical Quality Program Specialists (QI team members)

4. Informatics professionals or HIT team professionals

Evaluation of the Assessment Toolkit

The QI, HIT, and measure automation experts were asked to review the eMeasure Implementation Toolkit online and to provide their perspectives on the toolkit. The link to the survey was embedded in the Toolkit website, and hyperlinks in the survey allowed the participants to go back and review sections of the Toolkit. The survey contained demographic questions and items representing toolkit elements. The experts were asked to rate each item in the toolkit for relevance and clarity and to add additional comments at the end of the survey.

The total time of participation was approximately 20-30 minutes: approximately 10-15 minutes to review the toolkit and 10-15 minutes to complete the questionnaire. Participation in this study was voluntary, and the experts could choose not to take part in the research. The experts could also omit any question they preferred not to answer without penalty or loss of benefits. We provided the URLs for the eMeasure Implementation Toolkit and the REDCap survey at the end of the questionnaire cover letter.

Generalizability of the Assessment Toolkit

The toolkit was developed with participants who worked at the VA. We assessed the generalizability of the eMeasure Implementation Toolkit by surveying a different set of subject matter experts (SMEs) from VA and non-VA settings, whom we asked to evaluate the toolkit for relevance and clarity. The 10 SMEs were QI, HIT, and measure automation experts drawn from three healthcare quality sectors: government, at the federal level via the Department of Veterans Affairs and at the state level via the Utah Department of Health (UDOH); academic medicine, via the University of Utah Medical Center and Health Sciences; and the private nonprofit sector, via Partners Healthcare.

Analysis

Descriptive Statistics: We obtained survey data from the experts to evaluate the eMeasure Implementation Toolkit. We used descriptive statistics to summarize the survey results for the items rating relevance and clarity.

Content Validity: Evidence for content validity was supported by having QI, HIT, and measure automation experts review the content of the toolkit. Each item was rated for relevance to the underlying constructs (on a scale of 1–5, with 1 being "very relevant") and for clarity (on a scale of 1–5, with 1 being "very clear"). Agreement on the assessment toolkit was measured via similarities in answers across the various non-VA settings. A qualitative research report documented the findings from this analysis.
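As a sketch of how such ratings can be summarized, the snippet below computes means, standard deviations, and a Pearson correlation for one tool's relevance and clarity ratings. The ratings shown are hypothetical illustrations, not the study's data.

```python
from statistics import mean, stdev

# Hypothetical ratings from 10 evaluators for a single tool.
# Scale: 1 = "very relevant"/"very clear" ... 5 = "not at all relevant"/"not at all clear".
relevance = [2, 3, 2, 1, 3, 2, 2, 3, 2, 2]
clarity = [2, 3, 2, 2, 3, 2, 3, 3, 2, 2]

def pearson(x, y):
    """Sample Pearson correlation coefficient."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / ((len(x) - 1) * stdev(x) * stdev(y))

print(f"relevance: mean={mean(relevance):.2f}, sd={stdev(relevance):.2f}")
print(f"clarity:   mean={mean(clarity):.2f}, sd={stdev(clarity):.2f}")
print(f"relevance-clarity r={pearson(relevance, clarity):.2f}")
```

Because 1 is the "very relevant" end of the scale, lower means indicate more favorable ratings, and a positive r indicates that tools rated more relevant also tended to be rated clearer.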

Results

We evaluated the eMeasure Implementation Toolkit by using a survey in REDCap. The non-VA subject matter experts evaluated the eMeasure Implementation Toolkit for relevance and clarity. The PDF of the REDCap survey can be found in the Appendix.

The subject matter experts predominantly evaluated the sections of the toolkit that contained the tools for evaluating the workflow during eMeasure implementation. The total time of participation in the evaluation phase was approximately 20-30 minutes: approximately 10-15 minutes to review the toolkit and 10-15 minutes to complete the questionnaire. Table 1 lists the tools under each section of the eMeasure Implementation Toolkit that were evaluated by the non-VA subject matter experts.

Non-VA Stakeholders Who Participated in the Study

We included non-VA Quality Measurement (QM), Health Information Technology (HIT), and measure automation experts who had national leadership and technical roles and who had knowledge of healthcare quality performance measurement within a non-VA setting. The QM, HIT, and measure automation experts reviewed the toolkit online and participated in the online survey, to evaluate the eMeasure Implementation Toolkit for relevance and clarity.

The first section of the survey collected demographic information. A total of 10 non-VA subject matter experts participated in the online survey. The expert evaluators held job categories such as directors or associate directors of quality and safety, information technology or quality improvement professionals, clinical quality program specialists, primary care providers, and health information coders. The subject matter experts were employed by various types of healthcare organizations; Figure 2 depicts the percentages of expert evaluators by the type of organization where they were employed. The largest group of expert evaluators, 30%, belonged to an academic healthcare system. Evaluators from academic institutions made up 20% of the participants, federal government and state government employees each made up 20%, and nonprofit organizations accounted for the remaining 10%.

Table 2 describes the non-VA stakeholder respondent characteristics by type of organization: the percentage of expert evaluators belonging to each organization type, their position titles, the range of years of work experience at their respective institutions, and the ranges of years of work experience in quality improvement and in health information technology. The respondents encompassed a broad range of work experience, and diverse key stakeholder types were represented, ranging from clinicians to IT professionals to QI specialists.

Summary of the descriptive statistics results of the survey

The expert evaluators were asked to review the tools in the eMeasure Implementation Toolkit and provide their perspectives on the tools. The experts were asked to rate each item in the toolkit for relevance and clarity and to add additional comments at the end of the survey. Each item was rated on a 1 to 5 scale, with 1 representing "very relevant" (or "very clear") and 5 representing "not at all relevant" (or "not at all clear"). For each tool, the correlation between tool relevance and clarity was also assessed.

Section A: Determine pre-implementation requirements

Table 3 shows results for pre-implementation requirements. This section of the toolkit featured three tools: the planning checklist, a template for stakeholder interviews, and flowcharts. Each of the tools in this section was rated moderately relevant and moderately clear. There appears to be a strong correlation between relevance and clarity ratings for each tool. Completeness of the toolkit could be a concern, as comments included “Pre-implementation planning checklist - concerning the jobs lists diary knowing how busy the floor staff can be I’m not sure they would take the time to fill out the form in total”. Other comments included “I was a little unclear how I would use the planning checklist. A few more details or links to instructions would be valuable”.

Section B: Tools for Implementing a single eMeasure

Table 4 displays survey findings for the four tools for implementing a single eMeasure: the analysis of the eMeasure document, standard terminology and data sources for implementing an eMeasure, structured query language (SQL) tools, and natural language processing (NLP) tools.

The analysis of the eMeasure document, the standard terminology and data sources, and the SQL tools were each rated moderately for relevance and clarity. The NLP tools were rated moderately for relevance; for their clarity, the evaluators took a more neutral stance. There appears to be a strong correlation between relevance and clarity ratings for each tool. On average, the tools in this section were seen as moderately relevant and moderately clear. Completeness could be a concern, as comments included "NLP section needs further work as this is an area that may hold the key to capturing data from a large portion of medical records".
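To illustrate the kind of task the NLP tools in this section target, the sketch below uses a simple rule-based pattern to extract left ventricular ejection fraction (LVEF) mentions, a core HF eMeasure data element, from free text. The pattern and the note text are hypothetical and are not the toolkit's actual NLP component; a real system would use far more robust methods.

```python
import re

# Illustrative pattern for LVEF mentions such as "LVEF 35%" or "ejection fraction was 45%".
EF_PATTERN = re.compile(
    r"(?:LVEF|EF|ejection fraction)\D{0,15}?(\d{1,2})\s*%",
    re.IGNORECASE,
)

note = "Echo today: LVEF 35%. Prior ejection fraction was 45% in 2017."

# Extract all EF percentages mentioned in the note
values = [int(m.group(1)) for m in EF_PATTERN.finditer(note)]
print(values)

# A typical HF measure rule: flag documentation of reduced EF (< 40%)
print(any(v < 40 for v in values))
```

Even this toy example shows why evaluators flagged the NLP section as pivotal: much of the data an HF eMeasure needs lives only in free-text notes, beyond the reach of SQL queries over structured fields.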

Section C: Tools for managing team activity

Table 5 displays results for the three tools for managing team activity during the implementation of multiple eMeasures: tools for version control, project evaluation, and templates for planning.

Each of the tools in this section was rated moderately relevant and moderately clear. There appears to be a strong correlation between relevance and clarity ratings for each tool. On average, the tools in this section were seen as moderately relevant and moderately clear. Comments included "HIT terminology is unfamiliar to me so clarity is difficult to rate". Completeness of the tools for managing team activity could be a concern, as other comments suggested that additional tools in each of these tool categories might be helpful. The number of eMeasures being simultaneously implemented, as well as characteristics of the team, could impact eMeasure implementation workflow.

Section D: Determine post-implementation requirements

Table 6 displays results for the three tools for post-implementation requirements. The tools for assessment of barriers and facilitators and for process improvement requirements were rated very relevant and moderately clear. The post-implementation assessment to finalize workflows was rated moderately relevant and moderately clear. There appears to be a strong correlation between relevance and clarity ratings for each tool. On average, the tools in this section were seen as very relevant and moderately clear.

Findings summary

In general, the expert evaluators felt the toolkit was a useful collection of tools for assessing workflow during the implementation of eMeasures. Overall, the reviewers rated the tools in the toolkit as moderately relevant and moderately clear. The post-implementation tools were rated highest for relevance. Comments predominantly highlighted areas of the toolkit that needed additional depth or detail, rather than suggesting additional tools.

Discussion

The impact on workflow is an important component in determining whether an HIT implementation will be successful. Workflow is, unfortunately, a concept that is often ignored when implementing HIT, and the literature on workflow in the domains of quality improvement, system implementation, and process improvement remains inadequate. HIT is not always designed to fit the workflow of a given organization, making it difficult to truly assess the impact of HIT on outcomes or processes [16]. The literature demonstrates inadequate sophistication in studies of the role of workflow in the adoption of HIT for QI, owing to the absence of formal workflow designs and methodologies, a lack of comprehensive knowledge about the system, and a lack of interest among quality improvement staff in using the new technology [16].

One solution to addressing implementation challenges such as the lack of attention to workflow is an implementation toolkit. An implementation toolkit is an assembly of instruments such as checklists, forms, and planning documents. Implementation toolkits are intended to provide guidance or assistance; they may provide a template or blueprint for what to do, when to do it, and how to do it. Users can apply an implementation toolkit in its entirety, or only apply certain portions that are informative for their needs.

To ensure that HIT successfully integrates with workflow, it is essential to understand the current system before implementing the new technology [23,24]. Therefore, an implementation toolkit that supports workflow evaluation for HIT-enabled QI efforts needs to include evaluation of both the current workflow, and the potential impact of the new system on workflow [25].

The purpose of this study was to establish an initial, generalizable toolkit to assess the impact of implementation on the workflow of quality improvement and information technology professionals and their teams in the inpatient hospital setting. The toolkit, in the form of an implementation guide, is a compilation of resources such as checklists, forms, and planning documents that provide a template for workflow analysis. The toolkit was designed to support and provide guidance on developing and implementing plans for achieving optimal workflow in any acute inpatient setting. The toolkit information was designed to support decision making on implementation approaches related to workflow analysis.

The final phase involved evaluation of the eMeasure Implementation Toolkit for relevance and clarity by experts in non-VA settings. Via a survey, the non-VA subject matter experts evaluated the sections of the toolkit that contained the tools for evaluating workflow during eMeasure implementation. During the evaluation phase, the expert evaluators rated the tools as moderately relevant and moderately clear. The post-implementation tools were rated highest for relevance. Comments predominantly highlighted areas that needed additional depth or detail in the toolkit, rather than suggesting additional tools.

The toolkit provides a distinct set of resources and tools, available to users in a single consolidated document. The research methodology provided a strong, unified, overarching implementation framework in the form of the Promoting Action on Research Implementation in Health Services (PARIHS) model [21], in combination with a socio-technical model of HIT [22], which strengthened the overall design of the study.

Conclusion

The purpose of this study was to establish a generalizable toolkit to assess the impact of implementing eMeasures on the workflow of quality improvement and information technology professionals and their teams in the inpatient hospital setting. The toolkit developed during this research was guided by a strong, unified, overarching implementation framework in the form of the Promoting Action on Research Implementation in Health Services (PARIHS) model [21], in combination with a socio-technical model of HIT [22], which strengthened the overall design of our study. The toolkit provided a useful collection of tools, in the form of checklists, forms, and planning documents, to enhance the workflow during implementation of eMeasures.

The final phase involved the evaluation of the eMeasure Implementation Toolkit for relevance and clarity by non-VA experts. Via a survey, the non-VA subject matter experts predominantly evaluated the sections of the toolkit that contained the tools for evaluating workflow during eMeasure implementation. During the evaluation phase, the expert evaluators rated the tools as moderately relevant and moderately clear. The post-implementation tools were rated highest for relevance. Comments predominantly highlighted areas of the toolkit that needed additional depth or detail, rather than suggesting additional tools.

Future directions

Several future directions may be drawn from this study. Some relate to the findings of the study, while others concern potential applications and uses of the developed eMeasure Implementation Toolkit. Each is enumerated below.

Future direction one: This eMeasure Implementation Toolkit should continue to be refined. The toolkit should undergo further evaluation with a variety of subject matter experts from various job categories and medical centers. Further evaluation of the eMeasure Implementation Toolkit may lead to additional tools being identified.

Future direction two: Further research should be done to develop a specialized toolkit for beginners. Different levels of cognition and diverse use cases are involved in the implementation of eMeasures. The toolkit should be tailored to the level of cognition of each user, with attention to the syntax used to describe the sections of the toolkit. A toolkit map could be created to point each category of user to the relevant use cases, simplifying the steps involved in assessing workflow during eMeasurement. In addition, it would be important to provide beginners with additional links to the background information needed to understand the eMeasurement process.

Future direction three: The eMeasure Implementation Toolkit could be implemented in an actual healthcare setting to determine its usefulness. It would be essential to have quality improvement and information technology professionals use the toolkit and then determine whether additional modifications are needed.

Future direction four: The toolkit could be developed for other quality measures, such as stroke, diabetes, and pneumonia, based on the research methodology used in the development of the existing toolkit.

Appendix: REDCap Survey

Acknowledgments

This research is supported by VA HSR&D IBE 09-069. The views expressed are those of the author and not necessarily those of the Department of Veterans Affairs or affiliated institutions. I thank the VA and non-VA stakeholder partners for their valuable contributions to this research study. I also thank the members of my dissertation committee: Dr. Jennifer Garvin PhD, MBA, RHIA, CTR, CPHQ, CCS, FAHIMA (Chair, Department of Biomedical Informatics, Division of Epidemiology, University of Utah; VA Healthcare System Salt Lake City, Utah); Dr. Katherine Sward PhD, RN (Department of Biomedical Informatics, College of Nursing, University of Utah); Dr. Andrew Redd PhD (Statistician, Department of Epidemiology, University of Utah); Dr. Bruce E Bray MD (Department of Biomedical Informatics, College of Nursing, Department of Internal Medicine, University of Utah); and Dr. Karen Eilbeck PhD MSc (Department of Biomedical Informatics, Department of Human Genetics, University of Utah). Additional thanks go to Natalie Kelly MBA (Project Coordinator, VA Healthcare System IDEAS Center, Salt Lake City, Utah), Jonathan Cardwell MS, note taker (Department of Biomedical Informatics, University of Utah), and Troy Perchinsky MBA, note taker (Tippie School of Management, University of Iowa).

© 2019 Kalsy M, et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.