How to Manage EVS for Optimal Service Quality and Productivity

Jo D’Ambrosio, EVS Director
Cottage Hospital, Santa Barbara, CA
Andrew Ganti, MBB
GE Healthcare, San Ramon, CA
2007 ASHES Annual Conference
September 30-October 4
St. Louis, MO
Introduction
Objectives of the Presentation
• Define dashboard metrics for balancing quality and productivity in EVS
• Select performance targets to promote continuous improvement
• Learn how you compare with your best-performing peers
Audience
• EVS Shift & Group Leads, Supervisors, Mgrs., Directors, Hospital CXOs
Contents & Organization
• Definition of performance monitoring
• Definition of a dashboard and its elements
• Need for balancing the metrics and ways to do it
• Some key metrics in EVS
• Examples of service quality and productivity metrics
• Approaches to facilitate continuous improvement
• Benchmarks for comparing performance with the best-in-class hospitals
Quiz
• Service Quality: COPIS, Expectations/Metrics, Feedback, Monitoring
• Productivity: What is it? Why? How to measure and monitor it
• Management: What is it? Why? Balance between behavioral, financial & core processes
• Optimization: What is it? Why? How to measure and monitor multiple metrics
Performance Monitoring
• To improve anything, it must first be measured
• Monitoring compares expected results with actual results over time
• Applicable to individuals, groups, sections, departments,
divisions, hospitals, multi-hospitals
• Monitoring may cover one or more aspects e.g. quality,
service, cost, revenue, productivity, profitability, margins
(operating, contributing) etc.
• Monitoring involves metrics or indicators of performance
and can be quantitative or descriptive (e.g. pass/fail,
satisfactory/unsatisfactory)
• Monitoring is not just to measure but to improve, reward
and recognize staff
• Follow-up and feedback are important elements
Performance Monitoring
[Diagram: Supplier → Process → Customer. Services and products flow from supplier through process to customer; expectations (standards) are set for the process and its outputs, and feedback loops back from the customer.]
Monitoring Tools
• By User Types - External or Internal
• Dashboards for External Users
• Scorecards for Internal Users
– Metrics: single, or multiple for a balanced view
– Formats: graphs, tables, gauges
– Frequency: daily, weekly, monthly, yearly
– Processing methods: manual, computerized
– Display: hardcopy or electronic; B&W or color
– Dissemination: bulletin board, e-mail, on-line
Dashboard Components
• Metrics, units & definitions
• Target or expected performance (can also be a range, a.k.a. tolerance limits)
• Actual performance
• Legend, labels & units for axes for a graph
• Interpretation guide, i.e., what is favorable
• Variance and Root Cause Analysis
• Corrective Action (what, who, when & how) if
performance is unfavorable
• Feedback mechanism
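A minimal sketch (Python) of how one dashboard metric row could be represented; the metric name, target range, and value below are illustrative assumptions, not figures from the presentation:

```python
from dataclasses import dataclass

@dataclass
class DashboardMetric:
    name: str          # metric name, per the components above
    unit: str          # units for the metric
    target_low: float  # expected performance can be a range (tolerance limits)
    target_high: float
    actual: float      # actual performance

    def variance(self) -> float:
        """Signed distance outside the tolerance range; 0.0 when in range."""
        if self.actual < self.target_low:
            return self.actual - self.target_low
        if self.actual > self.target_high:
            return self.actual - self.target_high
        return 0.0

    def favorable(self) -> bool:
        return self.variance() == 0.0

# Hypothetical example: a cleanliness score against an 80-100% target range.
m = DashboardMetric("Room Cleanliness", "%", 80.0, 100.0, 77.9)
print(m.favorable())           # False -> trigger root cause analysis
print(round(m.variance(), 1))  # -2.1
```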
SPECIFIC WORK DUTIES
POSITION/TITLE: Floor Care
JOB: 3rd & 6th Floor, ENVIRONMENTAL SERVICES
SHIFT: 7:00-15:30
DATE: May 2007
ENVIRONMENTAL SERVICES SUPERVISOR ON DUTY
Daily Duties:
• 3rd & 6th East patio needs to be cleaned daily.
• Spot mop & sweep all 3rd & 6th floor hallways.
• Scrub & buff assigned rooms as available.
• Check elevator landings and scrub & buff as needed.
• Check waiting area furniture & clean as needed.
Weekly Projects:
• Monday: Scrub and buff 6 Central hallways.
• Tuesday: Scrub and buff 3 Central hallways.
• Wednesday: Scrub and buff 3 North & L&D.
• Thursday: Scrub and buff patient rooms as available.
• Friday: Scrub and buff 6 Central hallways.
• Saturday: Scrub and buff 3 Central hallways.
• Sunday: Scrub and buff patient rooms as available.
Additional duties as assigned by supervisor.
Balancing the Metrics
Types of Metrics
– Behavioral (customer satisfaction, employee satisfaction, training effectiveness)
– Financial (dollars related)
– Core Processes (Quality, productivity, response/turnaround
times, discharge cleaning)
Need for Balancing
– Multiple aspects to any operation
– Influencing factors are interdependent
– Focusing only on one is undesirable
– Reality poses competing priorities
Ways to Balance
– Prioritize the metrics and select the top few
– Use a weighted average of several metrics if there are too many (see the sketch below)
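A minimal sketch of such a weighted average in Python; the metric scores and weights are illustrative assumptions:

```python
def weighted_score(scores: dict, weights: dict) -> float:
    """Collapse several 0-100 metric scores into one weighted average."""
    total_weight = sum(weights.values())
    return sum(scores[k] * weights[k] for k in weights) / total_weight

# Hypothetical scores and weights, for illustration only.
scores = {"cleanliness": 87.3, "turnaround": 74.0, "satisfaction": 85.7}
weights = {"cleanliness": 0.4, "turnaround": 0.3, "satisfaction": 0.3}
print(round(weighted_score(scores, weights), 1))  # 82.8
```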
Some Key Metrics in EVS
• Quality (service)
– Cleanliness (per internal/external expectation)
– Response & turnaround time (external expectation)
• Productivity
– By employee, by group, job category
• Operating Cost
– Labor, Supplies & Other
• Satisfaction
– Customer (external/internal), Staff
Press Ganey Monthly Results

                                    Dec     Jan     Feb     Mar     April   May
Room Cleanliness                    80.1%   83.4%   77.9%   78.4%   80.2%   87.3%
Courtesy of Person Cleaning Room    84.9%   85.2%   83.0%   85.0%   85.7%   87.9%

[Charts: monthly Room Cleanliness and Courtesy of Person Cleaning Room scores plotted on a 0-100% scale.]
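A minimal sketch, in Python, of trending these scores month over month, using the Room Cleanliness figures from the table above:

```python
months = ["Dec", "Jan", "Feb", "Mar", "April", "May"]
room_cleanliness = [80.1, 83.4, 77.9, 78.4, 80.2, 87.3]  # from the table

for prev, cur, month in zip(room_cleanliness, room_cleanliness[1:], months[1:]):
    print(f"{month}: {cur:.1f}% ({cur - prev:+.1f} pts)")
# Jan: 83.4% (+3.3 pts), Feb: 77.9% (-5.5 pts), ..., May: 87.3% (+7.1 pts)
```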
Service Quality
COPIS
– Customer, Outputs, Processes, Inputs, Suppliers
Expectations
– Customers:
• External: patients & families
• Internal: nursing, ancillary, support, medical staff
Metrics
– Cleanliness by user type (public, patient care, office & other)
– Response time to clean spills to prevent incidents
– Turnaround time to clean and ready the bed after
patient discharge
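A minimal sketch, in Python, of computing average response and turnaround times from timestamps; the timestamps and record layout are illustrative assumptions:

```python
from datetime import datetime

FMT = "%H:%M"
# (request received, cleaning started, bed ready) -- hypothetical timestamps
requests = [
    ("09:00", "09:40", "10:02"),
    ("11:15", "11:25", "11:47"),
]

response_min, turnaround_min = [], []
for req, start, done in requests:
    t_req, t_start, t_done = (datetime.strptime(t, FMT) for t in (req, start, done))
    response_min.append((t_start - t_req).total_seconds() / 60)
    turnaround_min.append((t_done - t_req).total_seconds() / 60)

print(f"Avg response: {sum(response_min) / len(response_min):.0f} min")        # 25
print(f"Avg turnaround: {sum(turnaround_min) / len(turnaround_min):.0f} min")  # 47
```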
Service Quality (Cont’d)
Monitoring
– Data collection
– Data Processing & Dashboard Preparation
– Variance Analysis
– Corrective Action
Dissemination & Feedback
– Dashboard data display, recognition
– Variance explanation & follow-up support
DIRTY TO CLEAN BED TRACKING

                               WK1  WK2  WK3  WK4  WK5  WK6  WK7  WK8  WK9  WK10
Dirty                           64   66   65   66   64   59   64   65   65    63
Clean Next                      43   47   41   49   38   44   41   41   46    41
Stat                            36   35   40   35   33   27   37   35   32    39
Total Number of Rooms Cleaned  789  779  749  772  809  658  741  744  821   847
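A minimal sketch of deriving the weekly request mix from this table (week 1 shown), assuming the three request types sum to that week's requests:

```python
# Week 1 counts from the tracking table above.
dirty, clean_next, stat = 64, 43, 36
total_requests = dirty + clean_next + stat
print(f"Week 1 requests: {total_requests}")        # 143
print(f"STAT share: {stat / total_requests:.0%}")  # 25%
```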
Productivity
COPIS
– Customer (see below)
– Outputs (dashboard, variance explanation, corrective actions)
– Processes (collect data, prepare dashboard, distribute)
– Inputs (time, $, sq. ft., counts of other units of measure)
– Suppliers (Finance, EVS)
Expectations
– Customers: internal (CXO, multi-dept. director, manager, supervisor, lead)
Definitions (a sketch follows):
– Traditional: Productivity = Output/Input, e.g., meals/hour
– Modified: Productivity = Input/Output, e.g., hours/meal; nursing hours/patient day
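A minimal sketch of both definitions in Python, with made-up inputs for illustration:

```python
# Hypothetical inputs, for illustration only.
worked_hours = 1200.0
net_sq_ft_cleaned = 450_000.0

traditional = net_sq_ft_cleaned / worked_hours        # output / input
modified = worked_hours / (net_sq_ft_cleaned / 1000)  # input / output

print(f"{traditional:.0f} sq ft cleaned per worked hour")  # 375
print(f"{modified:.2f} worked hours per 1000 net sq ft")   # 2.67
```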
7/2/2007
Reports for BedTracking® Statistics
Santa Barbara Cottage Hospital
Response/Turn-Time
Date: 6/29/2007, All Users, All Patient Units, Cleaning time 5-60 min., Max. Response time 120 min., Based on Last Requests

ID: 1  Employee 1
Status       Requests   Compliant/Adjusted   Avg. Response   Avg. Clean   Avg. Turn
Dirty        13         8 (62%) / 5 (38%)    1:00            0:19         1:19
Clean next   4          4 (100%) / 0 (0%)    0:10            0:22         0:32
Stat         2          2 (100%) / 0 (0%)    0:05            0:13         0:18
Totals (all statuses): Requested: 19; Compliant Cleans: 14 (74%); Adjusted Cleans: 5 (26%)

ID: 0  Employee 2
Status       Requests   Compliant/Adjusted   Avg. Response   Avg. Clean   Avg. Turn
Dirty        7          6 (86%) / 1 (14%)    0:44            0:16         1:00
Stat         1          1 (100%) / 0 (0%)    0:16            0:23         0:39
Totals (all statuses): Requested: 8; Compliant Cleans: 7 (88%); Adjusted Cleans: 1 (13%)
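A minimal sketch reproducing the compliance arithmetic in the report above (Employee 1's totals):

```python
# Employee 1 totals from the report above.
compliant, adjusted = 14, 5
requested = compliant + adjusted                         # 19
print(f"Compliant cleans: {compliant / requested:.0%}")  # 74%
print(f"Adjusted cleans:  {adjusted / requested:.0%}")   # 26%
```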
Productivity (Cont’d)
Metrics:
Time per unit of measure, or units (of measure) per unit of time
Time = worked or paid; hours or minutes
Unit of Measure:
– Area cleaned per day, pay period, month, or year, per 1000 sq ft; gross or net; sq ft by area type, such as public; patient care by type (med/surg, ICU; routine vs. isolation); office, hallways, stairs
– Units cleaned by count, i.e., number of bathrooms cleaned; patient days, adjusted patient days
– e.g., worked hours per pay period per 1000 net sq ft cleaned; per adjusted patient day; per adjusted discharge
Conversion factors (a sketch follows):
– Paid hours to worked hours: multiply by 0.9 (worked hours run about 90% of paid hours)
– Hours per year to hours per month: divide by 12
– Hours per month to hours per pay period: multiply by 12, then divide by 26
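A minimal sketch of these conversions in Python, assuming 2080 paid hours per year for illustration:

```python
paid_hours_per_year = 2080.0  # assumed full-time paid hours, for illustration

worked_hours_per_year = paid_hours_per_year * 0.9  # paid -> worked (rule of thumb)
hours_per_month = worked_hours_per_year / 12       # per year -> per month
hours_per_pay_period = hours_per_month * 12 / 26   # per month -> per biweekly period

print(f"{worked_hours_per_year:.0f} worked hrs/yr")  # 1872
print(f"{hours_per_month:.0f} worked hrs/mo")        # 156
print(f"{hours_per_pay_period:.0f} hrs/pay period")  # 72
```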
_______________________________ Hospital
Metrics Development Worksheet
Performance Measurement:
Proposed Measurement: ____________________________________________
How will the data be collected? _______________________________________
How will the proposed measurement be calculated? _______________________
What will be included or excluded in the measure? ________________________
With what frequency will the indicator be reported (monthly, quarterly)? ________
Goals/Benchmarks:
What is the goal for the measurement? _________________________
What is the source of the measurement goal? ___________________________
Performance Improvement:
How does this indicator relate to the organizational goal of Patient Safety?
Signature ______________________________
Date ________________
Approval _________________________
Productivity (Cont’d)
Monitoring
– Data collection (Type, duration, sample size)
– Data Processing & Dashboard Preparation
– Variance Analysis
– Corrective Action
Dissemination & Feedback
– Dashboard data display, recognition
– Variance explanation & follow-up support
Management
Definition:
– Doing more with less
– Doing the right thing at the right time
– Processing by priorities
– Balancing competing priorities
– Balancing competing forces, e.g., staff needs vs. dept. needs in staff scheduling
Purpose: Balance between behavioral, financial & core processes
Tools: Dashboards, Standard Operating Procedures
Optimization
Definition:
– The maximum value obtainable when there is more than one variable, e.g., the optimum speed for the highest gas mileage (miles per gallon) is 55 mph; any speed above or below 55 reduces the mileage
– Quality vs. productivity: productivity can be increased up to a point, after which quality may suffer
– Measure and monitor two or more metrics that are likely to move in opposite directions
Monitoring:
– Watch the trends in two or more metrics to ensure one is not improved at the expense of another (a sketch follows)
– Note that improving multiple metrics is not mutually exclusive, but there is an optimum level
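A minimal sketch, in Python, of such a watch on two opposing metrics; all numbers are illustrative assumptions:

```python
# Hypothetical monthly trends, for illustration only.
months = ["Jan", "Feb", "Mar", "Apr"]
sq_ft_per_hour = [350, 375, 410, 440]   # productivity (higher is better)
cleanliness = [85.0, 84.8, 82.1, 78.4]  # quality score (higher is better)

for i in range(1, len(months)):
    if sq_ft_per_hour[i] > sq_ft_per_hour[i - 1] and cleanliness[i] < cleanliness[i - 1]:
        print(f"{months[i]}: productivity up but quality down; "
              f"the optimum may have been passed")
```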
Approaches to Continuous Improvement
Definition: Improvement is a journey and not a destination. That is
why the improvement needs to be continuous
Approaches/ Methodologies:
• CQI: Continuous Quality Improvement
• TQI: Total Quality Improvement
• SDCA: Standardize (stabilize), Do, Check, Act
• PDCA: Plan, Do, Check and Act
• Six Sigma: DMAIC (Define, Measure, Analyze, Improve, Control) to reach the six sigma level of almost zero defects (99.999+% good)
• Lean: Toyota Production System using Value Stream Maps, Single Piece Flow, Just in Time, Line Balancing, and 5S (Sort, Sweep, Shine, Standardize & Sustain) as part of Kaizen bursts
• Benchmarking & best practices: comparisons with peers, or with the best in class outside the hospital industry
Benchmarking & Best Practices
Definition: Improvement is a journey with multiple milestones. Benchmarking against peers is one way to select the target or the next milestone, and best practices help reach it in the continuous improvement process. Each can be used independently of the other.
Benchmarking in EVS:
Metrics with benchmark data:
– Service Quality: Cleanliness, Response & Turnaround times
– Productivity: worked hrs/1000 sft, Worked hrs/adj. Discharge,
Worked hrs/patient day; worked hrs/discharge
Benchmarking Data Sources:
– Hospitals in the same multi-hospital system
– Informal networking
– Local, state, and national trade associations
– Contract management companies; EVS equipment/supplies vendors
– Commercial benchmarking companies (e.g., Solucient)
Benchmarking Metrics - Examples
SKILL MIX
Custodial Worker %
Management %
Other %
Hours Worked: Hours Paid %
Non-Payroll Hrs: Total Hrs Paid %
WORKLOAD / SERVICE INTENSITY
Non-Hospital Sq Ft: Tot Net Sq Ft %
Disch / Trans Cleanings / Discharge
LABOR PRODUCTIVITY RATIOS
HOURS WORKED PER
1000 Net Sq Ft Cleaned/Pay Period
Adj Patient Day
Adj Discharge
COST RATIOS
Staff Average Rate / Hour
Regional Adj Rate / Hour
EXPENSE $ / 1000 NET SQ FT CLEANED
Labor
Regional Adj Labor
Other Direct
Cleaning Supply
Management Contract
Total
Total w. Regional Adj Labor
EXPENSE $ / ADJ PATIENT DAY
Labor
Regional Adj Labor
Other Direct
Cleaning Supply
Management Contract
Total
Total w. Regional Adj Labor
EXPENSE $ / ADJ DISCHARGE
Labor
Regional Adj Labor
Other Direct
Cleaning Supply
Management Contract
Total
Total w. Regional Adj Labor
Total Dept Exp: Total Hospital Exp %
Source: Solucient
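A minimal sketch, in Python, of computing a few of the labor productivity ratios listed above; all inputs are illustrative assumptions:

```python
# Hypothetical department figures for one pay period, for illustration only.
worked_hours = 1850.0
net_sq_ft_cleaned = 620_000.0
adj_patient_days = 1150.0
adj_discharges = 260.0

print(f"{worked_hours / (net_sq_ft_cleaned / 1000):.2f} hrs / 1000 net sq ft")  # 2.98
print(f"{worked_hours / adj_patient_days:.2f} hrs / adj patient day")           # 1.61
print(f"{worked_hours / adj_discharges:.2f} hrs / adj discharge")               # 7.12
```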
Adjust for the Differences
Statistical Characteristics
– % carpeted, % infectious waste, % special projects, etc.
– Whether hours include the manager
Organization Characteristics
– Clean rooms after discharge/transfer; set up meeting rooms; clean kitchen floors, grounds; replace light bulbs; move beds, furniture, etc.
Dept. Operational Characteristics
– Staff assigned to OR, OB, ED; contract management; computer system; special equipment; floor storage
Quiz
• Service Quality
– COPIS
– Expectations/Metrics, Feedback & Monitoring
• Productivity
– What is it? Why? How to measure it? Monitor it?
• Management
– What is it? Why? Balance between behavioral, financial & core processes, with an example of each
• Optimization
– What is it? Why? How to measure & monitor multiple
metrics