BRIDGING RESEARCH AND PRACTICE: The ISF, Organizational Readiness, and 100 Million Healthier Lives
Abraham Wandersman, University of South Carolina
27 May 2015
[email protected]

OVERVIEW
• Bridging Research and Practice
• Interactive Systems Framework for Dissemination and Implementation
• Readiness and Readiness Heuristic
• Culture of Health
• 100 Million Healthier Lives by 2020
• Institute for Healthcare Improvement's SCALE project, funded by the Robert Wood Johnson Foundation

RESEARCH PRACTICE

Evidence-based substance abuse and crime prevention programs in schools
• The 2011 study found that "while schools reported implementing a large number of prevention programs during the 2004–05 school year, only a small proportion reported implementing programs (approximately 7.8 percent) supported by research evidence."
• ~44.3% of the estimated 7.8% of research-based programs met minimal standards for fidelity of implementation.
• Therefore, only about 3.5% of the programs were both research-based and well implemented.
U.S. Department of Education, Office of Planning, Evaluation and Policy Development, Policy and Program Studies Service (2011). Prevalence and Implementation Fidelity of Research-Based Prevention Programs in Public Schools: Final Report. Washington, D.C.

Do Electronic Medical Records Save Money?
• Experts have long argued that computerized patient records will save the health system money by helping doctors reduce the number of redundant or inappropriate tests they order. A new study published in Health Affairs disputes that, suggesting that office-based physicians who have access to electronic records of patient care are actually more likely to order additional imaging and laboratory tests than doctors who rely on paper records.
• The study was based on a 2008 federal survey that collected data from 28,000 patient visits to 1,100 doctors.
• It found that doctors who could call up electronic images of a patient's previous imaging tests, such as X-rays and CT scans, ordered new imaging tests in 18 percent of the visits; physicians without such access ordered imaging in only 12.9 percent of the visits. The rate was 70 percent higher for the most advanced and expensive images.
• The study's authors argue that previous research showing savings was done at leading medical centers with sophisticated technology. Many doctors' offices buy off-the-shelf systems, primarily for billing purposes, that may not be able to track down redundancies or lack software to help doctors decide if a test is appropriate.
• We still believe that widespread adoption of electronic medical records will improve care and reduce costs. It is also clear that many office-based physicians will need help in making the transition. (emphasis added)
NY Times editorial on a study published in Health Affairs (March 6, 2012)

BRIDGING RESEARCH AND PRACTICE
• WHERE DO EVIDENCE-BASED PRACTICES COME FROM AND WHERE DO THEY GO?
• RESEARCH-TO-PRACTICE MODELS
• THE INTERACTIVE SYSTEMS FRAMEWORK FOR DISSEMINATION AND IMPLEMENTATION (ISF)

The preventive intervention research cycle [figure with a feedback loop]:
1. Identify problem or disorder(s) and review information to determine its extent (EPIDEMIOLOGY)
2. With an emphasis on risk and protective factors, review relevant information, both from fields outside prevention and from existing preventive intervention research programs (RISK AND PROTECTIVE FACTORS)
3. Design, conduct, and analyze pilot studies and confirmatory and replication trials of the preventive intervention program (EFFICACY TRIALS)
4. Design, conduct, and analyze large-scale trials of the preventive intervention program (EFFECTIVENESS TRIALS)
5. Facilitate large-scale implementation and ongoing evaluation of the preventive intervention program in the community (PRACTICE)
FIGURE 1.1 The preventive intervention research cycle. Preventive intervention research is represented in boxes three and four. Note that although information from many different fields in health research, represented in the first and second boxes, is necessary to the cycle depicted here, it is the review of this information, rather than the original studies, that is considered to be part of the preventive intervention research cycle. Likewise, for the fifth box, it is the facilitation by the investigator of the shift from research project to community service program with ongoing evaluation, rather than the service program itself, that is part of the preventive intervention research cycle. Although only one feedback loop is represented here, the exchange of knowledge among researchers and between researchers and community practitioners occurs throughout the cycle.

Interactive Systems Framework for Dissemination and Implementation [figure]:
• Putting It Into Practice: Delivery System (General Capacity Use; Innovation-Specific Capacity Use)
• Supporting the Work: Support System (General Capacity Building; Innovation-Specific Capacity Building)
• Distilling the Information: Synthesis & Translation System (Synthesis; Translation)
• Surrounding context: Existing Research and Theory, Funding, Macro Policy, Climate

Delivery System: General Capacity

General Capacities (non-exhaustive) and authors:
• Culture: Drzensky et al., 2012; Glisson, 2007; Glisson & Schoenwald, 2005; Hemmelgarn et al., 2006
• Climate: Aarons et al., 2011; Beidas et al., 2013; Damschroder et al., 2009; Glisson, 2007; Greenhalgh et al., 2004; Hall & Hord, 2010; Lehman et al., 2002
• Organizational Innovativeness: Damschroder et al., 2009; Fetterman & Wandersman, 2005; Greenhalgh et al., 2004; Klein & Knight, 2005; Rafferty et al., 2013; Rogers, 2003
• Resource Utilization: Armstrong et al., 2006; Greenhalgh et al., 2004; Klein et al., 2001; Rogers, 2003; Simpson, 2002
• Leadership: Aarons & Sommerfield, 2012; Becan, Knight, & Flynn, 2012; Beidas et al., 2013; Fixsen et al., 2005; Grant, 2013; Rafferty et al., 2013; Simpson et al., 2002
• Structure: Damschroder et al., 2009; Flaspohler et al., 2008; Greenhalgh et al., 2004; Lehman et al., 2002; Rafferty et al., 2013; Rogers, 2003
• Staff Capacity: Flaspohler et al., 2008; McShane & Van Glinow, 2009; Simpson et al., 2002

GC components are "normal" [figure: distribution of a general capacity (Leadership) across organizations, ranging from Low to Medium to High]

Interactive Systems Framework for Dissemination and Implementation. Delivery System: General Capacity; Innovation-Specific Capacity

An Innovation
• Any policy, program, or process that is new to a setting

Innovation-Specific Capacities (non-exhaustive) and authors:
• Innovation-specific knowledge, skills, and abilities: Wandersman, Chien, & Katz, 2012; Fixsen et al., 2005; Greenhalgh et al., 2004; Simpson, 2002
• Program champion: Atkins et al., 2008; Damschroder et al., 2009; Greenhalgh et al., 2004; Gladwell, 2002; Grant, 2013; Rafferty et al., 2013; Rogers, 2003
• Specific implementation supports: Aarons et al., 2011; Beidas et al., 2013; Damschroder et al., 2009; Fetterman & Wandersman, 2005; Greenhalgh et al., 2004; Hall & Hord, 2010; Rogers, 2003; Schoenwald & Hoagwood, 2001; Weiner et al., 2008
• Interorganizational relationships: Aarons et al., 2011; Flaspohler et al., 2004; Powell et al., 2012
Interactive Systems Framework for Dissemination and Implementation. Delivery System: General Capacity; Motivation; Innovation-Specific Capacity

Motivation for Innovation (non-exhaustive) and authors:
• Relative Advantage: Armenakis et al., 1993; Damschroder et al., 2009; Hall & Hord, 2010; Rafferty et al., 2013; Rogers, 2003; Weiner, 2009
• Compatibility: Chinman et al., 2004; Durlak & DuPre, 2008; Fetterman & Wandersman, 2005; Greenhalgh et al., 2004; Rogers, 2003; Simpson, 2002
• Complexity: Damschroder & Hagedorn, 2011; Fixsen et al., 2005; Greenhalgh et al., 2004; Meyers, Durlak, & Wandersman, 2012; Wandersman et al., 2008
• Trialability: Armenakis et al., 1993; Greenhalgh et al., 2004; Rapkin et al., 2012; Rogers, 2003
• Observability: Beutler, 2001; Chinman et al., 2004; Damschroder et al., 2009; Ford et al., 2008; Rossi, Lipsey, & Freeman, 2004
• Priority: Armenakis & Harris, 2009; Greenhalgh et al., 2004; Flaspohler et al., 2008

A Heuristic
Readiness(i) = Motivation(i) × General Capacity × Innovation-Specific Capacity(i), where i refers to a specific innovation
R = MC²
(A worked numerical sketch of this heuristic follows the research questions below.)

Interactive Systems Framework for Dissemination and Implementation [figure]: readiness-building strategies from the Support System target the Delivery System's General Capacity, Motivation, and Innovation-Specific Capacity.

Readiness Overview
• To describe the importance of building organizational readiness to help get innovations into place
• To present preliminary results on the evidence that readiness can be changed
• To discuss implications for providing supportive interventions to help make organizations better at providing services

Readiness: Facilitating Innovation Use
• Organizational readiness is important if we want to help organizations put an innovation into place
  o Flaspohler et al. (2008), Scaccia et al. (2015), Weiner et al. (2008)
• Assessment of organizational readiness is important:
  o Before an innovation is selected (adoption)
  o Before the innovation is put into place
  o During implementation
  o During sustainability planning

Building Readiness: Broad Strategies (Ways to Support an Innovation)
• Tools
• Training
• Technical Assistance
• Quality Assurance/Quality Improvement
Evidence-Based System for Innovation Support (EBSIS) (Wandersman, Chien, & Katz, 2012)

Testing out systems to help assess and develop readiness and capability: Building Readiness Through an Evidence-Based System for Innovation Support (EBSIS) [figure: EBSIS supports (Tools, Training, Technical Assistance, Quality Assurance/Quality Improvement), delivered through relationships, move an organization from its initial readiness (General Capacities, Innovation-Specific Capacities, Motivation) toward improved readiness outcomes (General Capacity, Innovation-Specific Capacity, Motivation) and the achievement of desired outcomes]

Initial Research Questions about Readiness
• What evidence is there that the factors and subcomponents can be changed by support system interventions (e.g., tools, training, etc.)?
• Do support system activities that target specific components of readiness lead to better innovation outcomes than those that do not?
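Returning to the R = MC² heuristic above, here is a minimal numerical sketch of how its multiplicative logic could be applied. This is an illustration only, not the authors' assessment instrument: the subcomponent names come from the tables above, but the 1-7 rating scale, the averaging of subcomponents, and the readiness_score function are assumptions made for this example. The multiplicative form captures the idea that very low motivation, general capacity, or innovation-specific capacity drags overall readiness down, no matter how strong the other components are.

# Illustrative only: a hypothetical scoring sketch of the R = MC^2 heuristic,
# not the measure used by Scaccia et al. (2015).
from statistics import mean

def readiness_score(motivation, general_capacity, innovation_specific_capacity):
    # Multiplicative heuristic: readiness for a specific innovation is the product
    # of the three components, each summarized here as the mean of its subcomponents.
    return (mean(motivation.values())
            * mean(general_capacity.values())
            * mean(innovation_specific_capacity.values()))

# Hypothetical ratings on a 1 (very low) to 7 (very high) scale.
motivation = {"relative_advantage": 6, "compatibility": 5, "complexity": 4,
              "trialability": 5, "observability": 4, "priority": 6}
general_capacity = {"culture": 5, "climate": 4, "organizational_innovativeness": 5,
                    "resource_utilization": 3, "leadership": 6, "structure": 4,
                    "staff_capacity": 5}
innovation_specific_capacity = {"knowledge_skills_abilities": 3, "program_champion": 2,
                                "specific_implementation_supports": 4,
                                "interorganizational_relationships": 5}

print(readiness_score(motivation, general_capacity, innovation_specific_capacity))

Because the slides treat readiness as something to assess before adoption, before implementation, during implementation, and during sustainability planning, a score like this would be a snapshot that is expected to change over time rather than a fixed property of an organization.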
Research Synthesis (Scaccia, 2014)
• A way to analyze large amounts of research across multiple studies
• Looked at the behavioral health, medical, and grey literatures
Synthesis & Translation System: Synthesis; Translation

Article Descriptions
• Content areas: health care (112; 65%), public health (29; 18%), behavioral health (14; 8%), education (11; 6%), business literature (4; 2%), US federal government (3; <2%)
• Articles were published between 1972 and 2013, with 94% published after the year 2000

Readiness Outcomes
• Readiness can be changed by targeted support
  o Odds ratio = 3.1; 95% CI [1.23, 7.48]
  o Log odds: 1.13 (SE = 0.46; p = 0.0137)
  o Wald test: χ²(2, 173) = 58.7, p < 0.001

Innovation Outcomes
• Do support system activities that target readiness lead to better innovation outcomes?
• Yes; almost 7 times more likely to report positive outcomes
  o OR = 6.8; 95% CI [1.18, 38.83]
  o Log odds = 1.917 (SE = 0.84; p = 0.0234)
  o χ²(2, 85) = 6.2, p = 0.044
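As a reading aid for the statistics above: an odds ratio is the exponentiated log odds, and a Wald-type 95% confidence interval exponentiates the log-odds estimate plus or minus about 1.96 standard errors. The sketch below shows that standard conversion; because the slide reports rounded estimates (and may use a different interval method), the interval it produces approximates rather than exactly reproduces the intervals above.

# Standard conversion from a log-odds estimate and its standard error to an odds
# ratio with a Wald-type 95% confidence interval. The inputs are the rounded
# readiness estimates reported above, used purely for illustration.
import math

def odds_ratio_with_ci(log_odds, se, z=1.96):
    point = math.exp(log_odds)
    lower = math.exp(log_odds - z * se)
    upper = math.exp(log_odds + z * se)
    return point, lower, upper

point, lower, upper = odds_ratio_with_ci(log_odds=1.13, se=0.46)
print(f"OR ≈ {point:.1f}, 95% CI ≈ [{lower:.2f}, {upper:.2f}]")  # roughly OR 3.1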
Building a Culture of Health (RWJF)
• If most of health and well-being is determined by social/economic/environmental factors
• Then we need to stop doing business as usual
• Because we aim at something new
• We need to rethink strategy
• Grantees need to rethink operations
• That means more Empowerment Evaluation

Healthy Community Initiatives
• Our Healthier Communities Initiatives (HCI) are built on the concept that local communities can work together to provide healthy choices and support the pursuit of healthy lifestyles. Ys across the country work in collaboration with community leaders to change policies and physical surroundings to bring healthy living within reach of all people.
• CDC
• Let's Move
• And many more

What is 100 Million Healthier Lives?
An unprecedented collaboration of change agents pursuing an unprecedented result: 100 million people living healthier lives by 2020.

How?
• Unite a Guiding Coalition across people and organizations to achieve 100 million people living healthier lives by 2020
• Co-design core strategies and an asset-based implementation plan through workgroups (i.e., the December meeting and ongoing)
• Develop infrastructure for learning, improvement, and motivation (i.e., the RWJF SCALE Initiative)
• Engage health care, people, and communities to improve health
• Create meaningful connections, remove barriers, and provide ongoing support

Key Stats
• 162 members working on the ground to improve health have committed to being part of the community
• 91 partners ready to play a leadership role in 100 Million Healthier Lives
• 7 sponsors willing to provide some level of financial support
• 192 action plans submitted, detailing how individuals/organizations/networks will take action to improve health

Theory of Change (100 Million Healthier Lives): unprecedented collaboration + large-scale improvement innovation + system innovation to remove barriers → 100 million people living healthier lives by 2020

Overview of Spreading Community Accelerators through Learning and Evaluation (SCALE)
Generously supported by the Robert Wood Johnson Foundation: a 20-month intensive "learning and doing" program for 20 communities working on becoming pacesetters, who are paired with an improvement coach and 4 mentor communities.

What are the goals of SCALE?
1. Learn what it takes to create effective inter-community spread (the focus of the majority of the formative evaluation, as this is where there is a great deal to be learned).
2. Develop capability within communities to improve health and achieve a culture of health.
3. Develop a system for effective teamwork between communities that supports synthesis and dissemination (spread system).
4. Explore whether the SCALE approach can be used as the basis of a learning system in the future, for the RWJF Culture of Health and/or the 100 Million Lives Campaign.

Approach to scale: peer-to-peer spread [figure: content bright spots (e.g., CureViolence, 100,000 Homes) and 4 high-functioning communities (peer mentors) support Wave 1 pacesetter communities (20), which in turn support Wave 2 pacesetter communities (20) and Wave 3 adopter/pacesetter communities (20); other communities in our learning system (Wellville, Frontier, CDC, AF4Q, ReThink...) participate alongside, leading to scale-up of other communities and spread of lessons learned to additional pacesetter communities through other community improvement efforts]

SCALE: Learning and Doing
• Community Health Improvement and Leadership Academy (CHILA)
• Intensive support from improvement coaches and other community health leaders
• Peer-to-peer learning activities
• Development of community-specific health goals
• Collaborative and supportive evaluation team
• Peer-learning and knowledge-sharing technology platform
• Support and tools from the broader 100 Million Healthier Lives initiative

Formative Evaluation
• Rossi, Lipsey, and Freeman (2004) emphasize the use of formative evaluation for program improvement.
  o IHI also recommends the use of theory-based formative evaluation (Parry, 2013).
• We build on this and suggest that formative evaluation can help ameliorate limitations or failures in program theory (idea), implementation, and evaluation (Wandersman, 2009).
• Wandersman and colleagues have developed FORECAST (Formative Evaluation and Consultation Systems Technique) (1994, 2013).

Pacesetter Community FORECAST Model (draft: February 9, 2015) [figure]: the phases of the project run from the application process/outreach (development of the RFA, RFA announcement, development of the selection process and readiness criteria (inclusion/exclusion), identification of potential pacesetters, review, selection, and sorting of applicants), to onboarding (develop MOUs, welcoming process, refine scope of work, Phase 2 readiness assessment, create the CHILA and community of practice, needs and asset assessment), to implementation (supports to pacesetters to identify community need and innovation, coordination of the community of practice and CHILA, case studies, EBSIS to evaluate the process of support, formative evaluation and other assessments during this period), to outcome evaluation (short-term outcome evaluation, changes in readiness, anticipated progress on the Triple Aim journey: Triple Aim Levels 3, 4, and 5), to synthesis and translation (synthesize and translate results).

Proposed formative evaluation questions
1. How do we synthesize, document, disseminate, translate, and support implementation lessons learned from SCALE to facilitate spread across additional communities in the future, to make progress towards 100 million healthier lives and a culture of health?
2. What are the accelerators and inhibitors of progress in pacesetter communities across the phases of the SCALE project?
  a. What are the criteria and associated indicators to assess performance of pacesetter communities?
  b. Can indicators of readiness and capability (capacity and motivation) predict progress?
3. How are these accelerators and inhibitors influenced by other aligned initiatives in a community? What can we say about the contribution of the SCALE approach to progress (e.g., CHILA, mentors, meso-system coaches)?
4. What are the facilitators and inhibitors of intercommunity spread (e.g., pacesetters to pacesetters, bright spots to pacesetters, mentors to pacesetters) under the SCALE approach? How can the SCALE approach be improved to increase the effectiveness and efficiency of spread?

Assessing Initial Readiness
• Development of an application (RFA); we looked at:
  1. Sources of motivation to participate in SCALE
  2. Specific skills needed for SCALE tasks (innovation-specific capacity)
  3. General organizational characteristics (general capacity)
• We expect that mentor communities will be "more ready" than pacesetters.

Assessing Ongoing Readiness in SCALE
• Readiness is dynamic and will change over time:
  o After the "onboarding" phase
  o After the Training Academy (CHILA)
  o At regular intervals (e.g., interactions with mentors)
• We will be looking at a smaller subset of communities in a more in-depth, case-study approach.

How does SCALE relate to the ISF? [figure mapping SCALE elements onto the ISF]
• Delivery system: 20 pacesetter communities
  o Multistakeholder coalitions that include healthcare, community, and public health systems
  o Selected based on readiness to adopt the improvement approach (via application) and demonstrated previous capacity in implementing multistakeholder change
• Support elements: Community Health Improvement Academy; intercommunity learning system; meso-system coaches; advanced community peer mentors; improvement advisers; support with sustainable business model development; facilitated connections to content bright spots; switch strategies; redesign of the learning system and approach as needed
• Synthesis and translation elements: harvest bright spots; create toolkits; "content bright spots" (e.g., CureViolence); prior knowledge of what works in communities
• Triple Aim?

Review (quiz, with contestants ABE and BONO)
• ANSWER: The Interactive Systems Framework for Dissemination and Implementation. QUESTION: What is the name of a framework for bridging research and practice that integrates research-to-practice models with community-centered/practice-centered models?
• ANSWER: This heuristic, R = MC², helps us to think about components likely essential to implementation. QUESTION: What is the heuristic formula for readiness?
• ANSWER: This unprecedented collaboration of change agents is pursuing an unprecedented result. QUESTION: Hello! Hello! What's IHI's 100 Million Healthier People by 2020, yeah yeah yeah!
• ANSWER: This initiative looks to Spreading Community Accelerators through Learning and Evaluation. QUESTION: What is the name of the IHI project funded by RWJ that explores intercommunity spread?

References
• Flaspohler, P., Duffy, J., Wandersman, A., Stillman, L., & Maras, M. A. (2008). Unpacking prevention capacity: An intersection of research-to-practice models and community-centered models. American Journal of Community Psychology, 41, 182-196.
• Greenhalgh, T., Robert, G., MacFarlane, F., Bate, P., & Kyriakidou, O. (2004). Diffusion of innovations in service organizations: Systematic review and recommendations. The Milbank Quarterly, 82(4), 581-629.
• Kane, H., Lewis, M. A., Williams, P. A., & Kahwati, L. C. (2014). Using qualitative comparative analysis to understand and quantify translation and implementation. Translational Behavioral Medicine, 4(2), 201-208.
• Miller, W., & Rollnick, S. (2013). Motivational interviewing (3rd ed.). New York: Guilford Press.
• Øvretveit, J. (2003). What are the best strategies for ensuring quality in hospitals? Copenhagen: WHO Regional Office for Europe (Health Evidence Network report; http://www.euro.who.int/document/e82995.pdf, accessed 7.23.13).
• Ragin, C. C. (2009). Qualitative comparative analysis using fuzzy sets (fsQCA). Configurational Comparative Methods, 87-121.
• Scaccia, J. P., Cook, B. S., Lamont, A., Wandersman, A., Castellow, J., Katz, J., & Beidas, R. (in press). A practical implementation science heuristic for organizational readiness: R = MC². Journal of Community Psychology.
• Wandersman, A., Duffy, J., Flaspohler, P., Noonan, R., Lubell, K., Stillman, L., et al. (2008). Bridging the gap between prevention research and practice: The Interactive Systems Framework for Dissemination and Implementation. American Journal of Community Psychology, 41, 171-181.
• Weiner, B., Amick, H., & Lee, S. Y. (2008). Conceptualization and measurement of organizational readiness for change: A review of the literature in health services research and other fields. Medical Care Research and Review.