
The Leading Edge

Supporting HITECH implementation and assessing lessons for the future: The role of program evaluation

Emily B. Jones*, Matthew J. Swain, Vaishali Patel, Michael F. Furukawa
Office of Economic Analysis, Evaluation, and Modeling, Office of the National Coordinator for Health Information Technology, United States Department of Health and Human Services, 200 Independence Ave, SW, Washington, DC 20201, United States
* Corresponding author. E-mail address: [email protected] (E.B. Jones).

Article info

Article history:
Received 2 August 2013
Received in revised form 19 November 2013
Accepted 20 December 2013

Keywords: Health information technology; Program evaluation; Implementation; Performance measurement; Research methods

Abstract

In addition to supporting the adoption and use of health IT, HITECH also included funds to support independent national program evaluation activities. The main challenges of evaluating health IT programs of the breadth and scale of the HITECH programs are the importance of context in the implementation and impact of the programs, the complexity and heterogeneity of the interventions, and the unpredictable nature of the innovative practices spurred by HITECH. The lessons learned include the importance of tailoring evaluation activities to each phase of implementation, flexible mixed methods, and continuous formative evaluation.

Published by Elsevier Inc.

1. Introduction

The Health Information Technology for Economic and Clinical Health (HITECH) Act sought to strengthen the U.S. health care system by investing in health information technology (IT) infrastructure. As a part of the American Recovery and Reinvestment Act of 2009, HITECH directed the Office of the National Coordinator for Health Information Technology (ONC) to create programs designed to accelerate the adoption of electronic health records (EHRs), enable electronic health information exchange (HIE), and support the use of health IT to fuel improvements in clinical care.1 HITECH's investments in health IT are foundational to the success of delivery system transformation efforts.2 HITECH included directives to conduct program evaluation activities, recognizing the importance of evaluation in shaping program implementation as well as capturing lessons learned.3 The design of these evaluations—and ultimately the findings—will guide the design, implementation, and evaluation of future programs. Large-scale health IT initiatives are characterized by context-sensitivity, complex interventions, and delivery innovation, which can make assessing the effectiveness and impact of these programs a challenging endeavor. The national approach to evaluation is shaped by the challenges of evaluating programs as wide-ranging and transformative as the HITECH programs.

HITECH targets financial incentives to eligible providers and hospitals to encourage the adoption and meaningful use of health IT.4 Programs supporting EHR adoption and meaningful use include the Regional Extension Center Program, which funds local organizations to provide outreach, technical assistance, and practice coaching to primary care physicians in small independent practices, critical access hospitals, and federally-qualified health centers.5 HITECH also designed a suite of workforce training programs to address the growing demand for a skilled workforce capable of implementing and optimizing health IT systems.6 The State HIE Cooperative Agreement Program awarded funds to 56 states, eligible territories, and qualified state designated entities to enable the electronic exchange of health information through governance, policies, technical services, business operations, and financing mechanisms.7 In pursuit of the triple aim goals,8 ONC established the Beacon Community Program to have local communities test ways to use health IT for improvements in quality, safety, efficiency, and population health.9 The Strategic Health IT Advanced Research Projects (SHARP) program addresses priority areas where technical advancements are needed, such as the privacy and security of electronic health information.10

This perspective examines the approach driving the HITECH independent program evaluation activities. The framework of the HITECH programs specifically includes evaluation as a continuous activity, to ensure not only that the programs promote the meaningful use of health IT, but also that health IT supports delivery system transformation.11 Six contracts totaling over $20 million were awarded: five focused on individual ONC HITECH programs, and one supported a 'global' assessment that examines the interdependencies among the different HITECH initiatives and assesses the collective impact of the programs. These investments in national program evaluations support not only summative evaluation activities, but also formative activities that help to guide program implementation. Below, we explore the inherent challenges in assessing whether the unprecedented infrastructure investments in HITECH support delivery system transformation. In light of these challenges, we discuss the national HITECH evaluation strategy, including the research questions, data sources, and methods.

2. Challenges in evaluating health IT

Evaluating a bundle of programs as transformative and varied as HITECH offers the unprecedented opportunity to explore the process of building and meaningfully using health IT infrastructure across the health care system. Describing the context and implementation of such a broad set of programs is critical, yet inherently challenging, as is assessing the effectiveness—and ultimately the impact—of health IT initiatives like HITECH. This paper focuses on three characteristics of large-scale health IT programs that shape the evaluation activities: the importance of context, the complexity of the interventions, and the difficulties encountered when estimating the impact and effectiveness of innovative practices.

2.1. Context sensitivity

Developing an understanding of contextual factors is an important element of program evaluation because context shapes program implementation and acts as a mediating factor between activities and outcomes.12 Health IT implementation and optimization activities, in particular, are highly influenced by local context, human factors, and characteristics of providers, patients, care teams, and organizations.13 Both social and technical factors are also crucial in understanding the context of these interventions. For example, challenges encountered in enabling data exchange among hospitals in a populous urban state with a highly competitive marketplace like Massachusetts differ from the experiences of a large rural state like South Dakota. Thus, evaluation efforts need to capture and use information on contextual variation and understand how it shapes both the implementation process and the outcomes of interest.14 This information needs to be captured at baseline and regularly monitored, since the context itself might be altered by the interventions being evaluated.15 The influence of context on implementation may mean that each implementation is unique and model fidelity might be low, possibly limiting the generalizability of evaluation findings.16
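
As one concrete illustration of how evaluators can adjust for context when estimating program effects (the specific models used in the actual evaluations are not described here), consider a minimal regression sketch; every variable name and value below is hypothetical:

```python
# A minimal, hypothetical sketch: contextual covariates enter the model
# so that the estimated program effect is not confounded by local market
# and baseline differences. All names and values are illustrative.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "ehr_adoption_change": [0.12, 0.30, 0.08, 0.25, 0.18, 0.22, 0.10, 0.28],
    "program_participation": [0, 1, 0, 1, 1, 1, 0, 1],
    "urban": [1, 1, 0, 0, 1, 0, 0, 1],
    "baseline_adoption": [0.40, 0.35, 0.20, 0.15, 0.50, 0.30, 0.25, 0.45],
    "market_competition": [0.8, 0.9, 0.3, 0.2, 0.7, 0.4, 0.3, 0.8],
})

model = smf.ols(
    "ehr_adoption_change ~ program_participation + urban"
    " + baseline_adoption + market_competition",
    data=df,
).fit()
print(model.params)  # participation coefficient, net of context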

2.2. Complex interventions

Health IT initiatives are multifaceted and encompass many intricacies; the sheer complexity of the activities involved in adopting and meaningfully using health IT poses challenges for implementation as well as evaluation. Formative evaluation results play a critical role in guiding program implementation.17 Since complex interventions typically roll out in many phases, it is necessary to tailor analyses to each phase of program implementation.18 Performance measurement methods and targets must adapt as the program unfolds. Characterizing complex interventions systematically aids in classification and understanding of these initiatives.19 The interactions among the HITECH programs add to the complexity, since an understanding of the interdependent parts of the system is necessary to properly evaluate each program.20

2.3. Delivery innovation

One objective of large-scale health IT initiatives like HITECH is applying innovative practices to transform the health care delivery system. By definition, the effects of innovations are unpredictable,21 and we might not yet understand how health IT innovations fuel clinical transformation.22 Collective understanding of the mechanisms for how health IT impacts care is still evolving. Therefore, the evaluation activities include theory-building and the use of conceptual frameworks to explore how health IT enables providers to pursue the triple aim. In addition, innovative practices might be difficult to disseminate, which has implications for evaluation efforts.23 Learning from the innovators and early adopters at the beginning stages of the diffusion of innovations curve and spreading best practices is important to guide the late majority and the laggards, who may be more resistant to adopting health IT.24

3. The national strategy for HITECH evaluation

These three main challenges shaped the national approach to evaluating HITECH. The program evaluation activities pursue the objectives of steering HITECH implementation and guiding future program design by characterizing the context and intervention activities and, ultimately, assessing factors associated with short-term effectiveness as well as longer-term impact. As shown in the HITECH logic model in Fig. 1, the inputs include contextual factors and HITECH programs, and the activities are the implementation efforts for each program. Effectiveness in the short term is assessed using performance measures related to each program's goals, and longer-term impact encompasses improvements to triple aim goals, reductions in disparities, and sustainability of program activities after the grant period. Below, the research questions, data sources, and research methods are described, with an emphasis on how the three main types of challenges shape HITECH evaluation data collection.
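
A minimal sketch that encodes the logic-model stages just described as a simple data structure; the staging paraphrases the text's account of Fig. 1, while the representation itself is purely illustrative:

```python
# Illustrative encoding of the HITECH evaluation logic-model stages
# described in the text (Fig. 1); the structure is a sketch, not part
# of the actual evaluations.
logic_model = {
    "inputs": ["contextual factors", "HITECH programs"],
    "activities": ["implementation efforts for each program"],
    "short_term_effectiveness": [
        "performance measures related to each program's goals",
    ],
    "longer_term_impact": [
        "improvements to triple aim goals",
        "reductions in disparities",
        "sustainability of program activities after the grant period",
    ],
}

for stage, elements in logic_model.items():
    print(stage.replace("_", " "), "->", "; ".join(elements))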

3.1. Situational awareness of contextual factors

Since the context shapes both the implementation of health IT projects and the outcomes of interest, maintaining ongoing situational awareness of contextual factors is necessary to monitor progress and to assess effectiveness and impact. Context is described and characterized using information on demographics and the health care marketplace, particularly the baseline level of health IT adoption. To understand and describe contextual factors, evaluators used quantitative information from proprietary and publicly available datasets such as the census and the Area Resource File. Relevant policies and regulations are also monitored, particularly in qualitative work such as case studies and key informant interviews. For example, the global evaluation team produces publicly available quarterly monitoring reports summarizing the policy and health IT market landscape, and also develops papers exploring topics such as the vendor marketplace and the role of health IT in delivery system transformation initiatives. See Table 1 for a crosswalk of challenges and the specific evaluation strategies used to address those challenges.
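
For instance, baseline context profiles might be assembled by joining census-style demographics with baseline health IT adoption measures; the sketch below is hypothetical (all column names and values are invented for illustration):

```python
# Hypothetical sketch: merging demographic context with baseline health
# IT adoption to build area-level context profiles for monitoring.
import pandas as pd

demographics = pd.DataFrame({
    "state": ["MA", "SD"],
    "pct_rural": [8.0, 43.0],
    "physicians_per_100k": [450, 230],
})

baseline_adoption = pd.DataFrame({
    "state": ["MA", "SD"],
    "pct_ehr_adoption_2009": [38.0, 22.0],
    "hospital_hie_live": [True, False],
})

# One row per state combining market and adoption context, the kind of
# profile evaluators can capture at baseline and track over time.
context = demographics.merge(baseline_adoption, on="state")
print(context)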

3.2. Timely, flexible mixed methods

The complex nature of the HITECH programs calls for timely, flexible evaluation techniques that employ both quantitative and qualitative research methodologies. Formative 'rapid-cycle'25 evaluation activities provide feedback to guide program implementation, and evaluators work closely with the program implementation teams to ensure evaluation activities are complementary to ongoing program monitoring activities and responsive to evolving program expectations. Evaluating complex interventions requires flexibility and adaptability, since research questions and data sources need to evolve as the program implementation unfolds. By appropriately tailoring research questions to each implementation phase, the HITECH evaluations address four main types of research questions: context, implementation activities, effectiveness, and impact (see Appendix Table 1 for a summary of the types of research questions addressed in each evaluation). Several specific research questions were used to explore each line of inquiry within each evaluation.

Multiple data sources are required to fully characterize these complex programs, interventions, and participants, including original qualitative and quantitative data collection as well as secondary data sources. Mixed methods foster an enhanced understanding of multifaceted interventions. Sources for secondary data include administrative data provided by the grantees to ONC, such as funding proposals, strategic and operations plans, annual reports, and information on governance structures, as well as data from the Centers for Medicare and Medicaid Services EHR Incentive Programs detailing provider program participation.
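
A hypothetical sketch of how such data linkage might work in practice, joining a grantee-reported provider roster to EHR Incentive Program participation records (all identifiers and fields below are invented):

```python
# Hypothetical linkage of grantee-reported providers to incentive-program
# attestation records; NPIs and fields are fabricated for illustration.
import pandas as pd

rec_roster = pd.DataFrame({
    "npi": [111, 222, 333, 444],
    "practice_type": ["solo", "small group", "small group", "critical access"],
})

attestations = pd.DataFrame({
    "npi": [222, 333, 555],  # 555 never received assistance
    "attested": [True, True, True],
})

linked = rec_roster.merge(attestations, on="npi", how="left")
linked["attested"] = linked["attested"].fillna(False).astype(bool)

# Share of assisted providers who went on to attest, by practice type.
print(linked.groupby("practice_type")["attested"].mean())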

Fig. 1. HITECH evaluation logic model.

Table 1
National HITECH program evaluation strategy shaped by the challenges.

Challenge: Context-sensitivity
Summary of strategy: Use a variety of data sources to maintain situational awareness of contextual factors.
Elements of national evaluation strategy:
• Situational awareness of different types of contextual factors: policy factors; health care and health IT market characteristics; patient demographics; baseline health IT adoption
• Multi-level contextual monitoring: national, state, local area, HRR
• Multiple data sources used
• Statistical methods used to control for context when assessing effectiveness/impact

Challenge: Complex interventions
Summary of strategy: Timely, flexible evaluation techniques employ mixed methods.
Elements of national evaluation strategy:
• Rapid-cycle, formative evaluation activities help guide program implementation
• Flexibility and working closely with the program team to ensure evaluation planning is responsive to evolving program expectations
• Segmented research questions according to phase of implementation
• Mixed methods and varied data sources: qualitative and quantitative
• Systematically characterize the interventions using typologies
• Examine the interdependencies and interactions between programs

Challenge: Delivery innovation
Summary of strategy: Original data sources, theory-building, and effective dissemination.
Elements of national evaluation strategy:
• Original data sources: surveys, technical expert panels, interviews and case studies
• Using logic models and conceptual frameworks to build the theoretical foundation for understanding the impacts of health IT
• Dissemination to aid the diffusion of innovations (the translational piece)

Notes: HRR, hospital referral region; IT, information technology.


Other secondary data sources include purchased datasets such as e-prescribing data and information on hospital health IT sophistication from HIMSS Analytics. Original data collection activities include key informant interviews and case studies. Quantitative results drove qualitative data collection activities such as in-depth discussions with implementers and stakeholders, case studies, and focus groups with providers. These qualitative data derived from interviews with grantees and other stakeholders help to capture the implementation experience. Summary information on the data sources used in each evaluation is available in Appendix Table 2.

Systematically characterizing the grantees and interventions helps to simplify complex phenomena. Several evaluations include a formal typology or taxonomy categorizing the structures and activities of program grantees, and cluster and factor analysis is used to reveal latent similarities and groupings among grantees based on common approaches or characteristics. In addition to analysis using typologies to explore the relation between different grantee characteristics and strategies, the relationships between the various HITECH programs were ascertained using qualitative methods. The programs possess deep interdependencies that are critical to understand as each program is evaluated.26 The different HITECH evaluators shared findings through the HITECH Evaluators' Collaborative, ensuring that all program evaluation activities were suffused with an understanding of each program and how they relate to each other.
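
To illustrate the kind of cluster analysis mentioned above (the evaluations' actual features and algorithms are not specified here), a minimal sketch with invented grantee features:

```python
# Illustrative sketch: grouping grantees by strategy/characteristic
# variables to surface a typology. Feature names and values are
# hypothetical, not evaluation data.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Rows: grantees; columns: e.g., share of funds to technical services,
# number of partner organizations, baseline exchange volume.
features = np.array([
    [0.70, 12, 1000],
    [0.65, 10,  900],
    [0.20, 40, 5000],
    [0.25, 35, 4800],
    [0.50, 20, 2500],
    [0.55, 22, 2600],
])

scaled = StandardScaler().fit_transform(features)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scaled)
print(labels)  # cluster assignment per grantee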

3.3. Data, theory, and dissemination

Collecting original data, theory-building, and effective dissemination are needed to evaluate innovative interventions. A variety of data were used as a part of the evaluation, including case studies, technical expert panels, and original surveys. In addition to the qualitative data collection from case studies and key informant interviews, technical expert panels composed of experts in health services research, health informatics, health policy, and commercial industry provided a formal mechanism for eliciting input from experts in the field on the best way to understand the HITECH programs and evaluate their effectiveness and impact. A total of six original surveys were fielded to collect critical information, such as whether students trained in the ONC-funded workforce programs found pertinent employment. These efforts address a common problem in performance measurement and evaluation—the lack of availability of relevant data. The information collected in the surveys is also useful for research beyond the scope of the national evaluations. A national survey of clinical laboratories on HIE will provide valuable information on the exchange of laboratory test results within a state, which will be leveraged for program evaluation purposes. It will also provide an understanding at the national level of whether laboratories will be good trading partners with providers and hospitals trying to meet the requirements of the EHR Incentive Programs. Further details about the original surveys used to evaluate the HITECH programs are available in Appendix 3.

Another strategy responsive to the inherent difficulty of evaluating complex programs is using logic models and conceptual frameworks to build the theoretical foundation for understanding the impacts of health IT. Since the use of information technology is relatively new, particularly in health care, the HITECH evaluations strive to contribute to the evolving understanding of not only whether health IT can lead to positive outcomes, but how health IT changes the structures, processes, and outcomes of care. The intersection between social and technical factors is examined in the evaluations, with the goal of gaining insight into the mechanisms by which health IT changes care delivery.

Finally, when evaluating innovative practices, it is vitally important to learn from the early adopters of health IT and program participants more generally to guide others who have yet to adopt or participate. For example, many communities applied to the Beacon Community Program; the lessons learned generated from this evaluation could provide valuable insights to communities that might be at various stages in implementing health IT strategies to improve population health at a local level. Thus, disseminating the findings from HITECH evaluation activities in a timely manner is integral to the national strategy. Reports are released in both peer-reviewed and grey literature in order to release findings promptly and broadly to different audiences. Presentations in various venues complement the written work, and in some cases, enable results to be disseminated before a formal deliverable is prepared. Translational activities ensuring that the evaluation findings are utilized can accelerate adoption and meaningful use of health IT. The national approach to evaluating HITECH prioritizes the dissemination of emerging findings to broad audiences, not only to researchers and policymakers.

4. Discussion

In addition to supporting the adoption and use of health IT, HITECH also included directives to support the evaluation of these programs and initiatives to understand how health IT supports delivery system transformation. This perspective highlights the three main challenges with evaluating health IT as a framework for describing the national program evaluation efforts for ONC HITECH programs. The national program evaluation activities examine contextual barriers and facilitators of the implementation process, as well as grantee characteristics, the implementation experiences and interventions, and the changes in clinical practice engendered by the HITECH interventions. Different types of data and research methods are leveraged to characterize the HITECH interventions and the relevant contextual factors, as well as program effectiveness and impact. The national approach to HITECH evaluation described in this paper can inform future evaluation efforts. Investment in program evaluation is a critical part of wide-ranging initiatives like HITECH, both to guide program implementation and to ensure that lessons are systematically harvested.

The main challenges of evaluating health IT programs of the breadth and scale of the HITECH programs are the complexity and heterogeneity of the interventions; the importance of context in the implementation and impact of the programs; and the unpredictable nature of the innovative practices spurred by HITECH. In addressing these challenges, the independent national program evaluation activities leverage complementary performance measurement and evaluation efforts. The national evaluation efforts are informed by and coordinated with the monitoring and performance measurement activities of the ONC program teams, technical assistance contractors, federal partners, and HITECH grantee organizations. Taken as a whole, the findings of all these monitoring and evaluation activities can paint a rich picture of HITECH implementation and health IT impacts.

The lessons learned for future researchers and implementers of large-scale health IT initiatives include the importance of tailoring evaluation activities to each phase of program implementation. Flexible mixed methods, including primary data collection and theory-building, and formative, rapid-cycle,27 and continuous28 analysis help guide program implementation, particularly when incremental research questions are tailored to each implementation phase. Evidence suggests that HITECH programs accelerated the diffusion of technological infrastructure and innovation,29 and evaluation activities offer an opportunity to learn from the bright spots as well as the barriers encountered during implementation. While the national HITECH program evaluation activities are a catalyst, further research should foster understanding of how health IT enables delivery system transformation in widely divergent contexts.

Appendix A. Supporting information

Supplementary data associated with this article can be found in the online version at http://dx.doi.org/10.1016/j.hjdsi.2013.12.015.

References

1. Blumenthal D. Implementation of the Federal Health Information Technology Initiative. N. Engl. J. Med. 2011;365:2426–2431.

2. Buntin MB, Jain SH, Blumenthal D. Health information technology: laying the infrastructure for national health reform. Health Aff. 2010;29(6):1214–1219.

3. Blumenthal D. Launching HITECH. N. Engl. J. Med. 2010;362:382–385.
4. Marcotte L, Seidman J, Trudel K, et al. Achieving meaningful use of health information technology: a guide for physicians to the EHR incentive program. Arch. Intern. Med. 2012;172(9):731–736.

5. Lynch K, Kendall M, Shanks K, Haque A, Jones E, Wanis M, Furukawa M, Mostashari F. The Health IT Regional Extension Center Program: evolution and lessons for health care transformation. Health Serv. Res. [online first version released December 2013].

6. NORC at the University of Chicago. Overview of ONC's workforce development program. June 2011. Available from: ⟨http://www.healthit.gov/sites/default/files/pdf/onc-work-force-development-program-annual-report-2011.pdf⟩.

7. Williams C, Mostashari F, Mertz K, Hogin E, Atwal P. From the Office of the National Coordinator: the strategy for advancing the exchange of health information. Health Aff. 2012;31(3):527–536.

8. Berwick D, Nolan T, Whittington J. The triple aim: care, health, and cost. Health Aff. 2008;27(3):759–769.

9. McKethan A, Brammer C, Fatemi P, et al. An early status report on the Beacon Communities' plans for transformation via health information technology. Health Aff. 2011;30(4):782–788.

10. NORC at the University of Chicago. Vision for the strategic health IT advanced research projects (SHARP) program; March 2012. Available from: ⟨http://www.healthit.gov/sites/default/files/pdf/SHARP_VisionReportMarch2012.pdf⟩.

11. Blumenthal D. Launching HITECH. N. Engl. J. Med. 2010;362:382–385.
12. Pawson R, Tilley N. Realistic Evaluation. London, UK: Sage; 1997.
13. Takian A, Petrakaki D, Cornford T, Sheikh A, Barber N. Building a house on shifting sands: methodological considerations when evaluating the implementation and adoption of national electronic health record systems. BMC Health Serv. Res. 2012;12:105.

14. Ancker J, Kern L, Abramson E, Kaushal R. The triangle model for evaluating the effect of health information technology on healthcare quality and safety. J. Am. Med. Inform. Assoc. 2012;19:61–65.

15. Johnson K, Gadd C. Playing smallball: approaches to evaluating pilot health information exchange systems. J. Biomed. Inform. 2007;40:S21–S26.

16. Greenhalgh T, Russell J, Ashcroft R, Parsons W. Why national eHealth programs need dead philosophers: Wittgensteinian reflections on policymakers' reluctance to learn from history. Milbank Q. 2011;89(4):533–563.

17. Lilford R, Foster J, Pringle M. Evaluating eHealth: how to make evaluation more methodologically robust. PLoS Med. 2009;6(11):e1000186.

18. Johnson K, Gadd C. Playing smallball: approaches to evaluating pilot health information exchange systems. J. Biomed. Inform. 2007;40:S21–S26.

19. Lamb S, et al. Reporting of complex interventions in clinical trials: development of a taxonomy to classify and describe fall-prevention interventions. Trials. 2011;12:125–133.

20. Gold M, et al. Health Aff. 2012; Catwell L, Sheikh A. Evaluating eHealth interventions: the need for continuous systemic evaluation. PLoS Med. 2009;6(8):e1000126.

21. Greenhalgh T, Russell J, Ashcroft R, Parsons W. Why national eHealth programs need dead philosophers: Wittgensteinian reflections on policymakers' reluctance to learn from history. Milbank Q. 2011;89(4):533–563.

22. (a) Kern L, Kaushal R. Health information technology and health information exchange in New York state: new initiatives in implementation and evaluation. J. Biomed. Inform. 2007;40:S17–S20;
(b) Greenhalgh T, Russell J. Why do evaluations of eHealth programs fail: an alternative set of guiding principles. PLoS Med. 2010;7(11):e1000360.

23. Berwick D. Disseminating innovations in health care. J. Am. Med. Assoc. 2003;289(15):1969–1975.

24. Rogers EM. Diffusion of Innovations. Glencoe: Free Press; ISBN 0-612-62843-4.
25. Shrank W. The Center for Medicare and Medicaid Innovation's blueprint for rapid-cycle evaluation of new care and payment models. Health Aff. 2013;32(4):807–812.

26. Gold M, McLaughlin C, Devers K, Berenson R, Bovbjerg R. Obtaining providers' 'buy-in' and establishing effective means of information exchange will be critical to HITECH's success. Health Aff. 2012;31(3):514–526.

27. Gold M, Helms D, Guterman S. Identifying, monitoring, and assessing promising innovations: using evaluation to support rapid-cycle change. Issue brief. The Commonwealth Fund; June 2011.

28. Catwell L, Sheikh A. Evaluating eHealth interventions: the need for continuous systemic evaluation. PLoS Med. 2009;6(8):e1000126.

29. Hsiao CJ, Jha A, King J, Patel V, Furukawa MF, Mostashari F. Office-based physicians are responding to incentives and assistance by adopting and using electronic health records. Health Aff. 2013;32(8):1470–1477.
