
Massachusetts State-wide System of Care:
Learning about Case Practice
in Wraparound and In-Home Therapy
using the SOCPR
Jack Simons PhD and Emily Sherwood MPA
Children's Behavioral Health Interagency Initiatives
Massachusetts Executive Office of Health and Human Services
Boston, MA
28th Annual Research & Policy Conference on Child, Adolescent, and Young Adult Behavioral Health
Tampa, FL
March 23, 2015
the winter has gotten us a little dysregulated…
https://www.youtube.com/watch?v=Qv56miXMY7A
Goals

Learn how MA has used intensive case reviews to assess practice in its SOC

Understand how case reviews produce both qualitative and quantitative (“mixed method”) results

Understand costs and benefits of intensive case reviews in a SOC

Identify key decisions you must make in implementing an intensive case review process

Understand how intensive case reviews informed reallocation of critical QI resources for MassHealth
Background

Massachusetts: 6.7 million residents

Children’s Behavioral Health Initiative launched in 2008 in response to a Federal Medicaid lawsuit (Rosie D.)

Required expansion of BH services for children and youth up to age 21 with MassHealth (MA Medicaid)

Over 35,000 children and youth receive services each year through CBHI

Court monitoring continues as of 2015
New MassHealth service components (since 2008)

Primary care screening for BH conditions
Intensive Care Coordination using high‐fidelity Wraparound
Family Partners
In‐home Therapy
Therapeutic Mentoring
In‐home Behavioral Services, using a Positive Behavioral Supports approach
Mobile Crisis Intervention Teams
MA experience with intensive case reviews

Years: 2010 – 2012
Protocol: Community Service Review (CSR; called Quality Service Review, QSR, in other jurisdictions)
Context: Managed by Court Monitor with contract to HSO (Mary Brogan of Hawaii as MA site lead)
Description: Proprietary protocol assessing practice quality
For more information: Human Systems and Outcomes, Inc., Tallahassee FL

Years: 2013 – 2014
Protocol: System of Care Practice Review (SOCPR)
Context: Managed by MA with training and consultation contract to USF
Description: Assesses adherence to SOC principles; has research support
For more information: See SOCPR USF website* for extensive documentation and videos

Years: 2014 – ongoing
Protocol: Massachusetts Practice Review (MPR)
Context: MA ongoing QI process
Description: MA adaptation of SOCPR
For more information: Contact info at end of presentation

* http://cfscommunitysolutions.cbcs.usf.edu/oll/SOCPR.html
Case reviews provide both qualitative and quantitative (“mixed method”) results

Ratings of practice based on the items and domains of the review, on a 5‐point scale (for the MPR) ranging from poorest to best practice. Numerous specific item scores are summed to provide more general domain scores.

Quantitative data derived from ratings are useful for looking at variations across the system, and at changes in practice quality over time.

Qualitative data, including quoted statements, can lead to rich commentary on specific items, or to coherent accounts of the experience of a family working together with a group of providers and natural supports.

Qualitative data provide the human experience and insight behind the numbers.
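As a minimal sketch of the quantitative side described above — item ratings on a 5‑point scale summed into domain scores — the following illustrates the arithmetic. The item and domain names here are hypothetical, not the actual MPR protocol items:

```python
# Illustrative only: item and domain names are hypothetical, not the
# actual MPR/SOCPR protocol items. Each item is rated 1 (poorest
# practice) to 5 (best practice); item scores sum into domain scores.
ITEM_DOMAINS = {
    "engagement": "child_centered",
    "voice_and_choice": "child_centered",
    "team_composition": "community_based",
    "service_access": "community_based",
}

def domain_scores(item_ratings):
    """Sum 1-5 item ratings into per-domain totals."""
    totals = {}
    for item, rating in item_ratings.items():
        if not 1 <= rating <= 5:
            raise ValueError(f"rating out of range for {item}: {rating}")
        domain = ITEM_DOMAINS[item]
        totals[domain] = totals.get(domain, 0) + rating
    return totals

case = {"engagement": 4, "voice_and_choice": 3,
        "team_composition": 5, "service_access": 2}
print(domain_scores(case))  # {'child_centered': 7, 'community_based': 7}
```

Aggregating these domain scores across many reviewed cases is what supports the comparisons across regions, services, and time periods.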
Benefits of intensive mixed-method case reviews
“Seeing is believing” – get direct accounts from providers and families
Inquire into causes at the level of family experience
Generate hypotheses to explain administrative data
Better information to allocate critical, precious resources for implementation support and quality improvement
Benchmark services over time, across regions and services
Generate stories that communicate effectively to stakeholders
Generate teaching / QI cases
Involve decision makers as reviewers or to participate in debriefing sessions, to galvanize efforts at improvement
Costs and risks of intensive case reviews

Can be an expensive way to gather information, in both time and money

  Up-front work: protocol selection and modification, project setup

  Heavy continuing logistical demand:
    Reviewer recruitment and training
    Sampling and recruitment of families to participate
    Reviewers are away from other work for long, intensive days
    Managing the review activities on review days
    Managing the data collection and debrief
    Reporting

  For MA, the maximum was $250k for 120 reviews, not including 50% state reviewer time

  Many costs can be reduced over time, although innovation creates temporary costs and can make data less comparable across time periods

Family and provider burden (but they can also benefit)

Risk of misinterpreted findings if methodology or reporting is not sound
Implementation Decisions

Scope and scale – and cost
How will logistics be managed?
Choice and modification of protocol
Number of data points for each case
Reviewer selection and training
Who will be privy to which information?

Each issue is elaborated in what follows…
Scope and scale

What is the scope of practice to be reviewed?
How many cases?
How frequently will you review?

In MA, we review cases of Intensive Care Coordination and In‐Home Therapy. We rate only the quality of the MassHealth service, not that of other SOC stakeholders. We typically review 120 cases each year, and we are now at a point where reviews of the same services every year are not needed.
How will logistics be managed?

Operational planning and execution are critical in supporting many anxious stakeholders:
  Providers
  Families
  Reviewers
  State program administrators

Logistical tasks, even after one‐time tasks are complete, include: planning dates for reviews, establishing the reviewer panel, obtaining the sample frame, selecting cases, a webinar for participating providers, tracking which families have consented, adding to the sample pool as needed, sending consent materials, purchasing and controlling gift cards, arranging interpreters, arranging space and food for debriefs, obtaining interview schedules and sending them to reviewers, chasing people who owe stuff…

And that’s even before the review week begins! Much more follows!
Choice and modification of protocol

Some modification is needed to ensure that the language used is consistent with your service system (e.g. how you label service categories)

All the protocols we have used result in tremendous gain of knowledge: good reviewers make the best use of whatever protocol you choose
No protocol is without challenges: complex phenomena are hard to put in boxes

Protocol should support interview consistency while allowing for individuality and flow

Protocol dimensions and items need to reflect your system’s values and assess the processes that most affect outcomes

Biggest challenges for us have been:

  ambiguity in the constructs being rated, in many areas but especially in terms of scope of practice (in a team process, who are we rating?)

  situations that call for causal inference by the reviewer, especially in terms of effort versus outcome, particularly with children with complex and severe needs and a challenging stakeholder community
Logistics suggested in protocols may also need modification

Our move from data submission on paper to a web‐based system simplified and sped up the process for reviewers and for the data analyst

Both the CSR and SOCPR recommend group debriefing after every day of review. Despite benefits of immediate debrief, this required reviewers to stay in “econo‐lodgings” at Commonwealth expense, away from family, for the whole review week. This was not sustainable from a reviewer wellness perspective, or a MA budgetary perspective. We have moved to a day‐long debrief at the end of the review week.

Under the CSR we required each reviewer to write a narrative summary and formulation of each case. Reviewers spent much longer on these than expected (e.g. 4 hours per case). Under the SOCPR, and in our own effort to save reviewer time, we no longer require comprehensive summaries. Now we miss them and are looking for a middle ground.
Excellent resource when considering case reviews: SOCPR website

SOCPR Protocol in English and Spanish

Review Team Member Training Manual
  Rationale
  Reviewer qualifications and training
  Logistics
  How to interview
  Data analysis

SOCPR Fact Sheet

Training videos
Number of data points for each case

Data points = record review plus number of interviews

We now allot 2 hours for record review

With children with very complex and challenging needs, reviews are often detective work where each witness may have a somewhat different account

With the CSR we interviewed as many team members as we could identify: mean = 6.7, max = 13. You get a more complete picture with more than 3 informants, but there are diminishing returns, and for reviewers the number of interviews was exhausting.

With the SOCPR you need at least 3 data points. We usually have more:
  Record review (required)
  Caregiver (required)
  Child, if 10 or older
  Primary clinician (required)
  Another team member, including natural supports
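The minimum‑data‑point rule above can be expressed as a simple check. This is a sketch only; the source labels are illustrative, not the protocol's actual field names:

```python
# Illustrative sketch: checks a case's data points against the SOCPR-style
# minimum (record review, caregiver, and primary clinician required; at
# least 3 data points overall). Source labels are hypothetical.
REQUIRED = {"record_review", "caregiver", "primary_clinician"}

def case_is_reviewable(data_points):
    """Return (ok, missing) for a set of collected data-point labels."""
    missing = REQUIRED - set(data_points)
    ok = not missing and len(set(data_points)) >= 3
    return ok, sorted(missing)

ok, missing = case_is_reviewable(
    {"record_review", "caregiver", "primary_clinician", "child"})
print(ok, missing)  # True []
```

Because the three required sources already number three, any case that satisfies the required set automatically meets the minimum count; extra informants such as a child or another team member only add to the picture.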
Reviewer selection and training

Reviewer competence and preparation are critical

Most of our reviewers are clinicians, and some are family partners. A few are non‐clinician administrators.
In MA we currently pay hired reviewers $400, or $450 for a review they conduct in another language, and we do not pay for training or debrief; your mileage may vary.

Bilingual reviewers are preferable to using interpreters.

Mix of state employees and hired reviewers, with care to avoid conflict of interest when reviewers are providers.

Reviewers must have good organizational skills and be able to think on their feet.

Reviewer training length has varied, currently 1 to 1.5 days depending on circumstances, plus shadowing at least one case.
New reviewers must shadow experienced reviewers. For both CSR and SOCPR we benefitted tremendously from bringing in experienced reviewers from other states including NC, HI, AZ, FL and Canada.
SOCPR videos prepare reviewers for intensive training

When you get to live training, we find that rehearsal of interviews is one of the most valuable aspects of training – much more than lecture.
Who will be privy to which information?

Need to clearly define the purpose of the review and who will have access to information.

Providers are asked to explain the process to families and to obtain signed consent when families wish to participate.

What feedback should go to providers (N is very small)?

Will feedback go to payers (we have multiple managed care entities)?

How will you handle seriously deficient practice in real time?

How will data be shared with the wider community?

In MA we do a reviewer group debrief to develop themes and sharpen ratings, then provider feedback meetings (two‐way conversations), and finally we issue a summary report with aggregate ratings and quotes from participants, but no individually identifying information. We’d like to do more, given our investment.
Report Example…
SOCPR to MPR

Numerous changes in emphasis to get more discrimination in areas of practice where we have quality concerns (e.g. clinical assessment) rather than areas where we usually excel (e.g. community‐based practice).

Increased number of areas where we rate impact of services

Some reorganization of constructs

Replaced some questions re cultural competence with others arising from MA consensus process with CANS

Revised 7‐point rating scale to 5‐point scale with new anchors for clarity (CSR scale was 6 points)

Tried to clarify questions that reviewers in our context found confusing or redundant

Moved from paper protocols to HIPAA‐compliant web‐based survey tool
Moving from paper to electronic data collection made reviews easier for us
How the SOCPR shaped our QI

In 2009 we knew it would be a challenge to create Wraparound statewide with high fidelity, for Intensive Care Coordination (ICC) and Family Partners

Over the next 5 years we spent $2 million on training and coaching, plus additional funds for fidelity measurement, and our Community Service Agencies made substantial expenditures of their own, to obtain high fidelity and a sustainable culture of Wraparound practice, with good results.

Our other flagship service, In‐home Therapy (IHT), resembled services our providers already had in place. For IHT we provided documentation on service specifications and a couple of days of centralized training.

What would you predict regarding the consistency of practice and the level of practice in ICC versus IHT?
SOCPR in 2014 showed us the impact of our implementation support for ICC and IHT

While there was room for practice improvement in both services, there was less variation and higher average quality in ICC.

Assessment of child and family needs and strengths was an area where this was especially evident.
Response to SOCPR findings


The SOCPR results confirmed our confidence in the Wraparound implementation supports we’d used:

Defined practice model (National Wraparound Initiative)

Fidelity measurement (Eric Bruns’ WFI and TOM)

Training and coaching (Vroon Vandenberg, LLC)

Common training materials (Vroon Vandenberg, LLC)

Learning Collaboratives (Technical Assistance Collaborative)
…AND in our assessment that the MA Wraparound provider agencies were now successfully training and coaching their own staff to fidelity
Response to SOCPR findings, cont.

As a result, we began shifting implementation support and QI resources from ICC to IHT….

Created a manual of IHT Practice Guidelines in 2014

Provided statewide trainings on the Guidelines in 2015

Will sponsor two Learning Collaboratives for selected IHT providers in FY2016

Plan to develop an IHT self‐assessment tool in FY2016

In collaboration with state agency partners, will provide training in FY2016 for IHT and IHT supervisors re: working with families and youth of color and working with families with young children.

Plan to implement a demonstration trial of a modular evidence‐supported treatment approach
Response to SOCPR findings, cont.

…while still supporting Wraparound fidelity:

Continued fidelity monitoring using the Wraparound Fidelity Index, the Team Observation Measure and the Mass. Practice Review

Statewide and regional Wraparound provider meetings on topics of mutual interest

Trainings and learning collaboratives on specific topics
Going forward…

Continue to improve our case review process

Seek more participation from state agency partners, as reviewers and also as users of the methodology

Continue to refine our methodology to get more meaningful data at less expense

Look for more flexible uses of case reviews

e.g. focus on other services or on specific populations

For a recent study of care coordination in Outpatient we applied our SOCPR experience to shape a 3‐datapoint method that worked extremely well

Find more powerful ways to disseminate lessons learned to our stakeholders.

Integrate case review process more fully with other data gathering and QI processes
Thank you…

For the CSR: Karen Snyder, Court Monitor, and Mary Brogan, MA lead for Human Systems and Outcomes, Inc.

For logistical and analytic support through the SOCPR and MPR: Technical Assistance Collaborative, especially Kelly English and Amy Horton

For training and support with the SOCPR: USF Dept of Child and Family Studies, especially Mario Hernandez and Debra Mowery

Our many participating family members, providers, team members and reviewers
For more information

Emily Sherwood: [email protected]; 617‐573‐1759
Jack Simons: [email protected]; 617‐573‐1791
Children’s Behavioral Health Initiative: www.mass.gov/masshealth/cbhi
SOCPR: http://cfscommunitysolutions.cbcs.usf.edu/oll/SOCPR.html