Educating Future Physicians to Track Health
Care Quality: Feasibility and Perceived Impact
of a Health Care Quality Report Card for
Medical Students
Sean M. O’Neill, PhD, Bruce L. Henschen, MD, MPH, Erin D. Unger, MD,
Paul S. Jansson, MS, Kristen Unti, Pietro Bortoletto, Kristine M. Gleason, MPH, RPh,
Donna M. Woods, PhD, and Daniel B. Evans, MD
Abstract

Purpose
Quality improvement (QI) requires measurement, but medical schools rarely provide opportunities for students to measure their patient outcomes. The authors tested the feasibility and perceived impact of a quality metric report card as part of an Education-Centered Medical Home longitudinal curriculum.

Method
Student teams were embedded into faculty practices and assigned a panel of patients to follow longitudinally. Students performed retrospective chart reviews and reported deidentified data on 30 nationally endorsed QI metrics for their assigned patients. Scorecards were created for each clinic team. Students completed pre/post surveys on self-perceived QI skills.

Results
A total of 405 of their patients’ charts were abstracted by 149 students (76% response rate; mean 2.7 charts/student). Median abstraction time was 21.8 (range: 13.1–37.1) minutes. Abstracted data confirmed that the students had successfully recruited a “high-risk” patient panel. Initial performance on abstracted quality measures ranged from 100% adherence on the use of beta-blockers in postmyocardial infarction patients to 24% on documentation of dilated diabetic eye exams. After the chart abstraction assignment, grand rounds, and background readings, student self-assessment of their perceived QI skills significantly increased for all metrics, though it remained low.

Conclusions
Creation of an actionable health care quality report card as part of an ambulatory longitudinal experience is feasible, and it improves student perception of QI skills. Future research will aim to use statistical process control methods to track health care quality prospectively as our students use their scorecards to drive clinic-level improvement efforts.
New models of care such as the
patient-centered medical home (PCMH)
show promise at lowering costs and
improving patient care by empowering
patients and mobilizing primary care
physicians as leaders of health teams.2,3
The core principles of the PCMH align
with many of the Carnegie Foundation
recommendations for medical education
reform,4,5 and leading physician
organizations have called on medical
schools to teach the principles of the
PCMH.6,7 However, current medical
school curricula provide minimal, if
any, exposure to emerging primary care
models such as the PCMH.8,9 Some
schools offer students meaningful
opportunities to follow patients
longitudinally,10–12 but few such models
exist in medical school curricula.13
Opportunities for medical students to
track health care quality metrics and
measure outcomes for an authentic
patient panel as part of a longitudinal
clerkship have not been reported in the
literature.
In 2011, the Education-Centered
Medical Home (ECMH) launched at
Northwestern University’s Feinberg
School of Medicine (NUFSM).14 The
ECMH is a longitudinal clerkship
designed to integrate teams of medical
students into outpatient clinics that
focus on adopting the principles of the
PCMH, including continuity with a
personal physician; team-based care;
care coordination and integration;
quality and safety; and enhanced access
to care.6 Specific components of the
ECMH learning environment include
longitudinal clinical experiences; care
coordination of medically complex
patients; peer teaching; quality assessment
and improvement activities; and monthly
“grand rounds” didactic sessions. After a
one-year pilot, the ECMH was expanded
to 13 clinics and 202 students (29% of the
NUFSM student body) for the 2012–2013
academic year (see Table 1).
A core objective of the ECMH
curriculum was for every student to
contribute to the development of a
quality metric scorecard using National
Quality Forum–endorsed measures.15
Our goal in this study was to introduce
medical students, who had little
background knowledge related to quality
improvement (QI), to the concepts and
process of QI by having them participate
in a quality assessment practicum.
We set out to create a practical tool for
tracking ECMH program performance
centrally and for highlighting quality
deficits and areas for improvement at
the individual clinic level. We tasked
all medical students enrolled in the
ECMH to measure the quality of their
clinic’s care, with the intention of
providing firsthand experience in QI data
abstraction, using available QI metrics,
applying these metrics to an authentic
patient panel, and then using clinical data to drive improvement efforts. Here we report the year one results regarding the feasibility and perceived impact of our ECMH quality report card curriculum.

Table 1
Medical Students Participating in the Education-Centered Medical Home and Chart Abstraction Practicum, Northwestern University Feinberg School of Medicine, 2012–2013

Class       | Total students | Students abstracting ≥1 chart
First year  | 50             | 31
Second year | 51             | 34
Third year  | 51             | 43
Fourth year | 50             | 41
Total       | 202            | 149
Method
ECMH setting and student participants
The organizational design of the ECMH
has been previously reported,14,16 but here
we provide a brief summary. We recruited
students from all four years of medical
education at NUFSM to participate in
the ECMH curriculum. We matched
them in teams of 16 (4 students from
each class) to existing faculty practices.
Each week, 7 or 8 students from each
team would attend clinic and provide
direct patient care under the traditional
preceptor model, with third- and fourth-year students directly observing and
mentoring their first- and second-year
peers. Students also assumed the roles
of health coaches and care coordinators
in an effort to deliver PCMH-level
care.1,6,14,17
In the 2011–2012 academic year, 56
students worked across 4 pilot clinics.
On the basis of positive reactions from
students, preceptors, and patients, the
program was expanded to 202 students
across 13 clinics for the 2012–2013
academic year (see Table 1). Three of the
pilot clinics continued uninterrupted,
and we refer to these as “experienced”
clinics in this report. One hundred ninety-seven of the 202 available students consented to be part of the educational study and to allow their aggregated assessment results and surveys to be analyzed and reported. The Northwestern University institutional review board approved this study.
Patient selection
From their existing panels, preceptors
recruited “high-risk” patients, defined as
any patient who had complex medical
or social issues, required three to four
office visits per year, and/or had two
or more emergency room visits or
hospital admissions during the past
year. Preceptors verbally explained to
potential enrollees that they would be
seeing medical students in clinic who
would serve as their care coordinator
and who would contact them outside
clinic for follow-up. Patients who
agreed to participate had their routine
appointments steered toward ECMH
clinic days but could also schedule
appointments with their primary
physician for urgent issues at other times.
Patients were defined as “enrolled” with
the ECMH team on the date they were
identified by the preceptor as “in need of
outreach” and were contacted by phone
for enrollment, or the date of their first
visit with the student team, whichever
came first. This was done to capture
patients who may have been called by
students following enrollment but prior
to coming to clinic. We noted each
patient’s date of enrollment.
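To make the enrollment rule concrete, the logic reduces to taking the earlier of two dates. The following is a minimal illustrative sketch, not part of the study’s actual tooling; the field names are hypothetical:

```python
from datetime import date
from typing import Optional

def enrollment_date(outreach_contact: Optional[date],
                    first_student_visit: Optional[date]) -> Optional[date]:
    """Enrollment is the earlier of the phone-outreach date and the first
    visit with the student team; a missing date means the event never occurred."""
    candidates = [d for d in (outreach_contact, first_student_visit) if d is not None]
    return min(candidates) if candidates else None

# A patient called in January but first seen in March counts as enrolled in January.
print(enrollment_date(date(2012, 1, 9), date(2012, 3, 2)))  # 2012-01-09
```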
ECMH interventions
Students were tasked to serve as “health coaches” for between one and six patients, depending on the student’s training level, under the supervision of their clinic preceptor. In addition to coordinating their patients’ ambulatory visits, students in the ECMH provided a range of services to empaneled patients, including telephone outreach, calls to specialists to coordinate care, health behavior coaching, and real-time identification of quality measure deficits during outpatient visits (e.g., immunization or diabetic foot exam deficiencies).

Quality indicators
We incorporated a total of 27 adult and 3 pediatric National Quality Forum–endorsed quality indicators into an ECMH “Quality Scorecard.”15 These included 19 measures reflecting chronic disease management as well as 9 adult and 2 pediatric preventive care measures. Metrics suggested by the Commonwealth Fund for tracking QI in the PCMH environment were heavily represented on the ECMH scorecard.18 Scorecards were used by each ECMH clinic to view the clinic’s aggregate performance and to discuss strategies for real-time clinical improvement.

Data abstraction and analysis
Students participating in the ECMH attended a large-group session on quality measurement and data abstraction. Most students reported little previous experience with quality measurement or improvement (5 students reported >50 hours of prior QI training, but the remaining 192 students estimated an average of 2.8 hours). Over two months, all students were given a series of background readings on health care QI,18–22 were sent a link to a secure Web-based abstraction instrument built using SurveyMonkey.com (SurveyMonkey, Palo Alto, California), and were instructed to use the tool to abstract quality metrics from patients in their personal panel. This abstraction tool included 157 items designed to calculate each of the quality indicators; skip logic built into the instrument reduced the number of items answered for patients who did not meet qualifying criteria (e.g., questions for diabetes-specific indicators were skipped if a patient did not have diabetes).
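As an illustration of how such skip logic trims a long instrument, consider a toy subset of condition-gated items. This sketch is ours, not the study’s instrument; the item IDs and condition names are hypothetical:

```python
# Hypothetical subset of the 157-item instrument; each item names the
# qualifying condition a patient must have before the item is presented.
ITEMS = [
    {"id": "dm_eye_exam",  "requires": "diabetes", "text": "Dilated eye exam documented?"},
    {"id": "dm_foot_exam", "requires": "diabetes", "text": "Foot exam documented?"},
    {"id": "flu_shot",     "requires": None,       "text": "Influenza immunization documented?"},
]

def items_to_ask(patient_conditions: set) -> list:
    """Skip logic: present only unconditional items and items whose
    qualifying condition appears in the patient's problem list."""
    return [item for item in ITEMS
            if item["requires"] is None or item["requires"] in patient_conditions]

# A patient without diabetes is asked only the unconditional item.
print([item["id"] for item in items_to_ask({"hypertension"})])  # ['flu_shot']
```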
Using the electronic medical record,
students entered deidentified patient
data from 2010, 2011, and 2012 using
the abstraction instrument. Patient
data were abstracted for 2010, prior
to the ECMH launch, to assess each
patient’s baseline performance with each
quality metric. For patients who had no
data from 2010, we still included their
data from 2011 and 2012 in the study.
Students practiced using the abstraction
instrument by entering “test patients” to
become familiar with the data-gathering
process; these practice sessions were
later dropped from the final dataset. We
deidentified patients by assigning them
a generic ECMH identification number.
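One common way to implement this style of deidentification is a local lookup table that issues sequential study IDs, so the medical record number (MRN) never leaves the clinic. A minimal sketch under that assumption (the study collected data through SurveyMonkey forms, so this is illustrative only):

```python
import itertools

class Deidentifier:
    """Issue generic ECMH identification numbers; the MRN-to-ID map stays
    local, and only the generic ID accompanies abstracted data."""
    def __init__(self) -> None:
        self._ids = {}
        self._counter = itertools.count(1)

    def ecmh_id(self, mrn: str) -> str:
        # Reuse the same generic ID if the patient has been seen before.
        if mrn not in self._ids:
            self._ids[mrn] = "ECMH-{:04d}".format(next(self._counter))
        return self._ids[mrn]

d = Deidentifier()
print(d.ecmh_id("MRN-88812"))  # ECMH-0001
print(d.ecmh_id("MRN-88812"))  # ECMH-0001 (stable across abstractions)
```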
The validity of the data was assessed by
a physician (D.E.) who performed an
implicit review of a random 5% sample
(n = 20) of the student-abstracted charts.
To develop a picture of the reliability of
our abstraction process, a team of three
trained chart reviewers independently
abstracted the same sample. We then
calculated the pairwise agreement and
kappa statistic between the trained
and student reviewers to determine
the eligibility and pass criteria for all
indicators.
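For readers unfamiliar with these statistics: pairwise agreement is the fraction of items on which two abstractors give the same answer, and Cohen’s kappa corrects that fraction for chance agreement, kappa = (p_o − p_e)/(1 − p_e). A self-contained sketch with made-up ratings:

```python
def agreement_and_kappa(rater_a, rater_b):
    """Percent agreement and Cohen's kappa for two raters on the same items.
    p_o = observed agreement; p_e = agreement expected by chance,
    computed from each rater's marginal category frequencies."""
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    categories = set(rater_a) | set(rater_b)
    p_e = sum((rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories)
    return p_o, (p_o - p_e) / (1 - p_e)

# Hypothetical pass/fail calls by a student and a trained abstractor:
student = ["pass", "pass", "fail", "pass", "fail"]
trained = ["pass", "fail", "fail", "pass", "fail"]
print(agreement_and_kappa(student, trained))  # (0.8, 0.615...)
```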
We asked students to complete the
previously developed Quality Assessment
and Improvement Curriculum (QAIC)
Toolkit survey on self-perceived QI skills
both pre- and postintervention.23,24 We
compared pre- and postintervention
surveys using the Wilcoxon signed rank
test because the QAIC survey collects
responses on an ordinal, four-point
Likert-type scale.
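The paired, nonparametric comparison can be reproduced with standard tools. The authors used STATA; an equivalent call in Python, with fabricated ratings purely for illustration, would be:

```python
from scipy.stats import wilcoxon

# Hypothetical paired confidence ratings on the QAIC's four-point scale
# (1 = not at all ... 4 = extremely), one pre/post pair per student.
pre  = [2, 2, 3, 1, 2, 2, 3, 2]
post = [3, 2, 3, 2, 3, 3, 3, 3]

# Wilcoxon signed rank test: appropriate for paired ordinal data, since it
# uses only the signs and ranks of the within-student differences.
stat, p_value = wilcoxon(pre, post)
print(stat, p_value)
```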
We calculated descriptive statistics
regarding the student abstraction process
and the patient panel. We determined
the proportion of eligible patients who
received the specified care for each quality
indicator in each year. All statistical
analyses were conducted in STATA
statistical software version 12.0 (STATA
Corporation, College Station, Texas). As
data were collected, we communicated
ongoing results to students at monthly
grand rounds large-group sessions, as
well as through regular e-mails from the
ECMH director (D.E.) to the students.
Clinic-specific quality scorecards were
generated using the final abstraction
results and given to each student team
for the purpose of finding quality deficits
and encouraging patient outreach.
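The descriptive computation behind each scorecard cell is a simple conditional proportion. The study used STATA; an equivalent pandas sketch, with hypothetical column names, might look like this:

```python
import pandas as pd

# Hypothetical abstraction extract: one row per patient per indicator.
df = pd.DataFrame({
    "clinic":    ["A", "A", "B", "B", "B", "B"],
    "indicator": ["influenza_immunization"] * 6,
    "eligible":  [True, True, True, True, False, True],
    "met":       [True, False, True, False, False, True],
})

# Proportion of eligible patients who received the specified care,
# overall and per clinic -- the numbers reported on each scorecard.
eligible = df[df["eligible"]]
print(eligible.groupby("indicator")["met"].mean())
print(eligible.groupby(["clinic", "indicator"])["met"].mean())
```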
Results
Seventy-six percent of students (149/197)
completed at least one patient record
abstraction (range: 1–10; see Table 1).
In total, 405 patient records were
abstracted (mean: 2.7 per student; SD:
1.8). Third-year students abstracted 3.3
records on average compared with 1.6
records for first-year students. Median
abstraction time was 21.8 minutes
(interquartile range: 13.1–37.1). Students
in the experienced ECMH clinics that had participated in the pilot year tended to abstract more records (3.3 versus 2.5) but showed no difference in abstraction time.
Mean agreement across all abstractor
pairs (n = 20) was 86%, and the kappa
statistic was 0.59. Mean agreement across abstractor pairs that included a first-year student abstractor (n = 4) was substantially lower (61%; kappa = 0.12) than across pairs with second- through fourth-year abstractors (92%; kappa = 0.60).
Among the abstracted records, 100
patients were enrolled by the end of 2011;
355 were enrolled by the end of 2012,
and 405 patients were enrolled by the end
of the study in February 2013. Patients’
characteristics are displayed in Table 2.
Patients were eligible for an average of 8.0
indicators (SD 5.0, range 0–21), meaning
that they met criteria to be screened for
those indicators, and more than 10% of
the sample population were eligible for
22 out of the 27 adult indicators (see
Table 3). Weight screening (85%) and
influenza immunization (85%) were the
most common indicators that patients
were eligible for; by contrast, less than 5%
of patients were eligible for several heart disease–related metrics and pediatric measures.
Overall performance on quality measures
ranged from 100% for beta-blocker use
for patients with a history of myocardial
infarction to a low of 24% for dilated
diabetic eye exams (see Table 3). We
performed exploratory analyses examining
trends in performance by year, ECMH
enrollment, duration of enrollment, and
clinic experience. From 2010 (pre-ECMH)
to 2012, the greatest performance gains
overall were observed in diabetic foot
exams (22% versus 55%), chlamydia
screening rates (32% versus 62%), medical
attention to diabetic nephropathy (65%
versus 91%) and the use of inhaled
steroids for moderate-to-severe persistent
asthma (71% versus 100%).
Table 2
Characteristics of Abstracted Education-Centered Medical Home Patients, Northwestern University Feinberg School of Medicine, 2012–2013

Characteristic | No. (%)
Total patients | 405 (100)
Gender
  Male | 139 (34)
  Female | 251 (62)
  Not recorded | 15 (4)
Race
  White, non-Hispanic | 146 (36)
  Black or African American | 151 (37)
  White, Hispanic, or Latino | 75 (19)
  Asian or Pacific Islander | 10 (2)
  Native American | 6 (1)
  Unknown or declined | 17 (4)
Chronic medical conditions
  Hypertension | 233 (58)
  Diabetes | 99 (24)
  Coronary artery disease | 66 (16)
  Active smoker | 70 (17)
  Asthma (moderate-to-severe) | 29 (7)
  Atrial fibrillation | 28 (7)
  History of myocardial infarction | 19 (5)
  Congestive heart failure | 18 (4)
BMI (kg/m2)
  Underweight (<18.5) | 9 (2)
  Normal (18.5–25) | 66 (18)
  Overweight (25–30) | 93 (25)
  Obese (>30) | 163 (44)
    30–35 | 79 (21)
    35–50 | 78 (21)
    >50 | 6 (2)
  Not recorded | 71 (19)
Enrollment in ECMH (cumulative)
  2010 | 0 (0)
  2011 | 100 (25)
  2012 | 355 (88)
  2013 | 405 (100)
Clinic type
  Internal medicine | 234 (58)
  Family medicine | 158 (39)
  Pediatrics | 13 (3)

A total of 147 of 197 students (75%) completed both the pre- and postintervention QAIC surveys on self-assessment of QI skills (see Table 4).25 Mean baseline ratings indicated that students were at least “slightly” confident in their QI skills in most domains, with postintervention ratings advancing to between “slightly” and “moderately” confident. At the end of the ECMH-QI project, students were asked to rate the educational value and impact of the exercise. Sixty-six percent of students agreed or strongly agreed with the statement “reviewing the quality of care for my individual patients was a valuable exercise,” and 77% agreed or strongly agreed with the statement “prospectively following ECMH quality metrics going forward will be a valuable exercise.”

Discussion and Conclusions
Our results demonstrate the feasibility of medical student participation in firsthand health care quality measurement using a Web-based abstraction instrument for patients they have seen on a longitudinal clerkship. A quick-turnaround “quality scorecard” is capable of highlighting specific care deficits on which to focus improvement efforts. A significant number of patients were eligible for most of the indicators used, which illustrates the usefulness of this particular scorecard for this set of primary care clinics.
Table 3
Education-Centered Medical Home Patient Performance on Quality Report Card Indicators for Calendar Year 2012, Northwestern University Feinberg School of Medicine

Title | National Quality Forum indicator no. | No. of eligible patients (2012) | % of eligible patients (2012) | % of eligible patients meeting metric (2012)
Total patients | — | 405 | 100.0 | —
Chronic disease management measures
Adult weight screening and follow-up | 421 | 344 | 84.9 | 59.9
Controlling high blood pressure | 18 | 205 | 50.6 | 52.2
Chronic stable coronary artery disease: antiplatelet therapy | 67 | 52 | 12.8 | 82.7
Chronic stable coronary artery disease: beta-blocker therapy—prior myocardial infarction (MI) | 70 | 17 | 4.2 | 100.0
Chronic stable coronary artery disease: lipid control | 74 | 57 | 14.1 | 82.5
Ischemic vascular disease: complete lipid profile and LDL control <100 | 75 | 66 | 16.3 | 50.0
Heart failure: angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) therapy for left ventricular systolic dysfunction | 81 | 15 | 3.7 | 93.3
Heart failure: beta-blocker therapy for left ventricular systolic dysfunction | 83 | 16 | 4.0 | 93.8
Atrial fibrillation: anticoagulation therapy prescribed | 624 | 19 | 4.7 | 78.9
Diabetes: eye exam | 55 | 99 | 24.4 | 24.2
Diabetes: foot exam | 56 | 99 | 24.4 | 55.6
Diabetes: hemoglobin A1c (HbA1c) testing | 57 | 99 | 24.4 | 89.9
Diabetes: HbA1c control (<8.0%) | 575 | 99 | 24.4 | 53.5
Diabetes: medical attention to nephropathy | 62 | 99 | 24.4 | 90.9
Diabetes lipid management: low-density lipoprotein cholesterol (LDL-C) <130 | 64 | 99 | 24.4 | 62.6
Diabetes lipid management: LDL-C <100 | 64 | 99 | 24.4 | 43.4
Use of appropriate medications for people with asthma | 36 | 24 | 5.9 | 100.0
Monitoring chronic medications | 21 | 194 | 47.9 | 80.4
Preventive care measures
Tobacco use assessment | 28 | 326 | 80.5 | 93.3
Tobacco cessation intervention | 28 | 67 | 16.5 | 79.1
Breast cancer screening | 31 | 120 | 29.6 | 59.2
Colorectal cancer screening | 34 | 176 | 43.5 | 52.8
Cervical cancer screening | 32 | 172 | 42.5 | 61.0
Influenza immunization | 41 | 344 | 84.9 | 50.9
Pneumonia vaccination status for older adults | 43 | 78 | 19.3 | 75.6
Pneumonia vaccination status for older adults and adults with severe medical conditions | 43 | 187 | 46.2 | 62.0
Chlamydia screening | 33 | 50 | 12.3 | 62.0
Pediatric preventive care measures
Childhood immunization status at 2 years of age | 38 | 5 | 1.2 | 100.0
Well-child visits before 15 months | 1,392 | 4 | 1.0 | 100.0
Several aspects of this project are
unique from an educational perspective.
First, data abstraction for quality
measurement is typically carried out
by trained abstractors who focus on
quality measurement. We were able to
show that a distributed data collection
mechanism, employing medical
students for only a few hours at most,
was able to rapidly produce useful
quality measurement data. Second, by
being so deeply involved in the quality
measurement process, students are
experiencing firsthand the limitations
and frustrations that go along with
translating the complex act of clinical
care into objective statistics.
Students’ self-assessments further reflect
these frustrations as they, on average,
reported only moderate comfort with
their skills after the exercise. The low
pre- and postassessment QAIC ratings
are similar to students’ and residents’
Table 4
Students’ (n = 147) Mean Rating of Their Self-Assessment of Quality Improvement Learning Objectives, Northwestern University Feinberg School of Medicine, 2012–2013

Learning objectives | Preprogram confidence rating, mean (SD)* | Postprogram confidence rating, mean (SD)* | P value
Writing a clear problem statement (goal, aim) | 2.5 (0.7) | 2.9 (0.6) | <.001
Applying the best professional knowledge | 2.3 (0.8) | 2.7 (0.6) | <.001
Using measurement to improve your skills | 2.3 (0.7) | 2.7 (0.6) | <.001
Studying the process | 2.2 (0.7) | 2.6 (0.7) | <.001
Making changes in a system | 2.0 (0.8) | 2.3 (0.7) | <.001
Identifying whether a change leads to improvement in your skills | 2.3 (0.7) | 2.6 (0.6) | <.001
Using small cycles of change | 2.0 (0.8) | 2.6 (0.7) | <.001
Identifying best practices and comparing these to your local practice/skills | 2.2 (0.7) | 2.7 (0.6) | <.001
Implementing a structured plan to test a change | 2.1 (0.8) | 2.5 (0.7) | <.001
Using the PDSA model† as a systematic framework for trial and learning | 1.5 (0.7) | 2.2 (0.7) | <.001
Identifying how data are linked to specific processes | 2.0 (0.8) | 2.5 (0.7) | <.001
Building your next improvement on prior success or failure | 2.2 (0.8) | 2.7 (0.6) | <.001

*Rating scale of confidence: 1 = not at all, 2 = slightly, 3 = moderately, 4 = extremely.
†PDSA indicates the Plan, Do, Study, Act model of quality improvement, a method of organizing the process of quality improvement efforts. See: American Academy of Family Practice. Model for Improvement Video. http://www.aafp.org/practice-management/pcmh/overview/videos.html. Accessed August 7, 2013.
QI self-confidence ratings seen in recent
studies26,27 and may also stem from little
prior exposure to data abstraction and
from the short study length. Further
exposure to directed didactic and training
modules for QI skills, in addition to the
much broader ECMH “grand rounds”
curriculum, may have helped to increase
self-confidence ratings. Despite these
challenges, student feedback was largely
positive and may reflect an appreciation
for the importance of accurate and timely
data collection for informing QI efforts.
After seeing their clinic’s performance outcomes, participating students shared during grand rounds discussions that the process forced them to consider clinical care at a level they had not fully appreciated before. We believe that exposure to such a “deep-dive” view of their own clinical care during medical school will engender familiarity with, and an informed perspective on, the quality metrics by which they will be assessed for the remainder of their careers. Future planned analyses of the ECMH intervention include tracking students’ academic and clinical performance over time.
This study has several limitations.
First, our data collection process
was distributed widely among 149
independent abstractors, and agreement
statistics of 86% may not be optimal.
There are no widely accepted thresholds
for interrater reliability, but high
reliability is optimal when using quality
indicators to drive organizational
improvement.28 However, given the wide
variation in abstractor training level
(first- through fourth-year students) and
experience, and the ECMH’s primary
emphasis on hands-on educational
experience (versus “gold standard”
quality measurement), our results are an
encouraging first step toward building
the practical tools by which our students
will eventually assess their performance
using clinical measures. Second, the
ECMH intervention itself is occurring
across multiple clinics in multiple
settings with differing resources, practice
habits, and patient populations, and
the challenges facing each clinic vary.
Thus, the treatment effect of the ECMH
model itself on patient care is difficult to
ascertain, and some improvements may
be attributable to the Hawthorne effect.
Third, we did not prospectively identify a group of control clinics that would have allowed us to randomize the ECMH intervention.
However, applying traditional biomedical
research methodology to complex social
and organizational interventions is
difficult, and its utility is controversial.29,30
The ECMH is an evolving, ongoing
intervention within a medical school
and multiple independent physician
groups, and research projects are under
way to evaluate its educational and
QI effectiveness through a variety of
methodological approaches.
In summary, we demonstrated the
feasibility of an authentic QI curriculum
for medical students embedded in a
longitudinal clerkship structure. By
using a rapid and convenient abstraction
instrument, students were able to
identify opportunities to improve the
quality of care they delivered to their
patients. Through the curriculum, students’ perceptions of their own QI skills improved compared with baseline. We
hope that by participating integrally in a
real-time quality measurement process,
medical students will be inspired to
generate meaningful QI interventions
within the ECMH. Furthermore, learning
QI techniques in a practical setting may
provide ECMH graduates with the tools
needed to measure and improve their
future workplaces. Further work is required
to refine our abstraction instrument and
follow these data over time to see if our
students can improve the quality of care for
their personal patient panels.
Acknowledgments: The authors wish to thank the
physicians and practices who hosted our students:
Jennifer Bierman, MD, at Northwestern Medical
Faculty Foundation, Chicago, Illinois; Rebecca
DeHoek, MD, and Alisha Thomas, MD, at PCC
Wellness, Chicago, Illinois; Allison Cromwell, MD,
Christopher Konstantelos, MD, Jessica Kennedy,
MD, and Paul Pickering, MD, at Northwestern
Memorial Physicians Group, Chicago, Illinois;
Frank Castillo, MD, and Laura Zimmerman, MD,
at Erie Family Health, Chicago, Illinois; Mark
Loafman, MD, at Norwegian American Hospital,
Chicago, Illinois; and Caroline Kalember, MD, at
Associates in Internal Medicine (AIM), Chicago,
Illinois. The authors also would like to recognize
Ricardo Rosenkranz, MD, Elizabeth Ryan, EdD,
Allison Hammer, BA, Berna Jacobson, BA, Mark
Loafman, MD, Sharon Unti, MD, Jennifer Bierman,
MD, Heather Heiman, MD, and John Butter, MD,
for their contributions to the Education-Centered
Medical Home (ECMH) steering committee.
Finally, the authors thank Raymond Curry, MD,
and John X. Thomas, PhD, for their guidance and
administrative support during the implementation
phase of our ECMH program.
Funding/Support: Intramural funding for this
study was provided by Northwestern University
Feinberg School of Medicine. No external
funding was obtained.
Other disclosures: None.
Ethical approval: This study was approved by
the institutional review board of Northwestern
University.
Previous presentations: The abstract of an earlier
version of this article was presented as a poster
on February 1, 2013, at the 3rd Annual Education
Day, Feinberg Academy of Medical Educators,
Northwestern University Feinberg School of
Medicine, Chicago, Illinois.
An overview of the ECMH program was
presented at the Research in Medical Education
(RIME) Conference, AAMC Annual Meeting,
November 2012, San Francisco, California. Oral
presentation: “The patient centered-medical
home as curricular model: Feasibility of building
an education-centered medical home.”
A panel discussion on the ECMH as a vehicle for
teaching quality improvement was presented at
the AAMC Integrating Quality Annual Meeting,
June 2012, Rosemont, Illinois. Title: “The
‘Education-Centered Medical Home’: A new
clerkship model where EVERY student gets a
quality report card.”
Dr. O’Neill is an MD candidate, Feinberg School of
Medicine, Northwestern University, Chicago, Illinois.
Dr. Henschen is an internal medicine resident,
Department of Medicine, Northwestern McGaw
Medical Center, Northwestern University, Chicago,
Illinois.
Dr. Unger is an internal medicine resident,
Department of Medicine, Northwestern McGaw
Medical Center, Northwestern University, Chicago,
Illinois.
Mr. Jansson is an MD candidate, Feinberg School
of Medicine, Northwestern University, Chicago,
Illinois.
Ms. Unti is an MD candidate, Feinberg School of
Medicine, Northwestern University, Chicago, Illinois.
Mr. Bortoletto is an MD candidate, Feinberg
School of Medicine, Northwestern University,
Chicago, Illinois.
Ms. Gleason is clinical quality leader, Clinical
Quality Department, Northwestern Memorial
Hospital, Chicago, Illinois.
Dr. Woods is research associate professor, Center
for Healthcare Studies–Institute for Public Health and
Medicine, Northwestern University, Chicago, Illinois.
Dr. Evans is assistant professor, Division of General
Internal Medicine, Northwestern Medical Faculty
Foundation, Northwestern University, Chicago, Illinois.
References
1 Patient-Centered Primary Care Collaborative. Defining the medical home. http://www.pcpcc.net/content/joint-principles-patient-centered-medical-home. Accessed June 6, 2013.
2 Stange KC, Nutting PA, Miller WL, et al. Defining and measuring the patient-centered medical home. J Gen Intern Med. 2010;25:601–612.
3 Jackson GL, Powers BJ, Chatterjee R, et
al. The patient-centered medical home:
A systematic review. Ann Intern Med.
2013;158:169–178.
4 Irby DM, Cooke M, O’Brien BC. Calls
for reform of medical education by the
Carnegie Foundation for the Advancement
of Teaching: 1910 and 2010. Acad Med.
2010;85:220–227.
5 Cooke M, Irby D, O’Brien B. Educating
Physicians: A Call for Reform of Medical
School and Residency. San Francisco, Calif:
Jossey-Bass; 2010.
6 Baxley E, Dearing J, Esquivel M, et al. Joint
Principles for the Medical Education of
Physicians as Preparation for Practice in the
Patient-Centered Medical Home. http://www.
acponline.org/running_practice/delivery_
and_payment_models/pcmh/understanding/
educ-joint-principles.pdf. Accessed June 6,
2013.
7 Lieberman SA, McCallum RM, Anderson
GD. A golden opportunity: The coevolution
of medical and education homes. Acad Med.
2011;86:1342.
8 Joo P, Younge R, Jones D, Hove J, Lin S,
Burton W. Medical student awareness of the
patient-centered medical home. Fam Med.
2011;43:696–701.
9 Lausen H, Kruse JE, Barnhart AJ, Smith TJ.
The patient-centered medical home: A new
perspective for the family medicine clerkship.
Fam Med. 2011;43:718–720.
10 Teherani A, Irby DM, Loeser H. Outcomes
of different clerkship models: Longitudinal
integrated, hybrid, and block. Acad Med.
2013;88:35–43.
11 Hirsh D, Gaufberg E, Ogur B, et al.
Educational outcomes of the Harvard
Medical School–Cambridge integrated
clerkship: A way forward for medical
education. Acad Med. 2012;87:643–650.
12 Poncelet A, Bokser S, Calton B, et al. Development of a longitudinal integrated clerkship at an academic medical center. Med Educ Online. 2011;16:5939. http://med-ed-online.net/index.php/meo/article/view/5939. Accessed August 7, 2013.
13 Ogrinc G, Mutha S, Irby DM. Evidence for
longitudinal ambulatory care rotations:
A review of the literature. Acad Med.
2002;77:688–693.
14 Henschen BL, Garcia PM, Jacobson B, et
al. The patient centered medical home as
curricular model: Perceived impact of the
“education-centered medical home.” J Gen
Intern Med. 2013;28:1105–1109.
15 National Quality Forum. Measures, reports
and tools. http://www.qualityforum.org/
Measures_Reports_Tools.aspx. Accessed June
6, 2013.
16 Evans D. The patient-centered medical home
as a curricular model: Medical students
need an “educational home.” Acad Med.
2011;86:e2.
17 American College of Physicians. The Patient-Centered Medical Home Neighbor: The Interface of the Patient-Centered Medical Home With Specialty/Subspecialty Practices. http://www.acponline.org/advocacy/where_we_stand/policy/pcmh_neighbors.pdf. Published August 1, 2010. Accessed June 6, 2013.
18 Rosenthal MB, Abrams MK, Bitton A, and the
Patient-Centered Medical Home Evaluators’
Collaborative. Recommended Core Measures
for Evaluating the Patient-Centered Medical
Home: Cost, Utilization, and Clinical
Quality. http://www.commonwealthfund.
org/~/media/Files/Publications/Data%20
Brief/2012/1601_Rosenthal_recommended_
core_measures_PCMH_v2.pdf. Accessed
June 6, 2013.
19 McGlynn EA, Asch SM, Adams J, et al.
The quality of health care delivered to
adults in the United States. N Engl J Med.
2003;348:2635–2645.
20 Batalden PB, Davidoff F. What is “quality
improvement” and how can it transform
healthcare? Qual Saf Health Care. 2007;16:
2–3.
21 Campbell SM, Braspenning J, Hutchinson
A, Marshall MN. Research methods used in
developing and applying quality indicators in
primary care. BMJ. 2003;326:816–819.
22 Aron DC, Headrick LA. Educating physicians
prepared to improve care and safety is no
accident: It requires a systematic approach.
Qual Saf Health Care. 2002;11:168–173.
23 Oyler J, Vinci L, Johnson J, Arora V. Quality
Assessment and Improvement Curriculum
(QAIC) Toolkit. http://medqi.bsd.uchicago.
edu/documents/QAICToolKit5-09.pdf.
Accessed June 6, 2013.
24 Oyler J, Vinci L, Arora V, Johnson J.
Teaching internal medicine residents quality
improvement techniques using the ABIM’s
practice improvement modules. J Gen Intern
Med. 2008;23:927–930.
25 American Academy of Family Practice. Model
for Improvement Video. http://www.aafp.org/
practice-management/pcmh/overview/videos.
html. Accessed August 7, 2013.
26 Shunk R, Dulay M, Julian K, et al. Using
the American Board of Internal Medicine
practice improvement modules to teach
internal medicine residents practice
improvement. J Grad Med Educ. 2010;2:
90–95.
27 Levitt DS, Hauer KE, Poncelet A, Mookherjee
S. An innovative quality improvement
curriculum for third-year medical students.
Med Educ Online. 2012;17:18391. http://
med-ed-online.net/index.php/meo/article/
view/18391/html. Accessed August 7, 2013.
28 Quality Measurement and Health Assessment
Group (QMHAG), Center for Medicare
and Medicaid Services. CMS Measures
Management Special Project. Adopted
from NQF Final Evaluation Criteria. http://
www.hsag.com/services/special/mms.aspx.
Accessed June 6, 2013.
29 Grembowski D, Anderson ML, Conrad DA, et
al. Evaluation of the group health cooperative
access initiative: Study design challenges
in estimating the impact of a large-scale
organizational transformation. Qual Manag
Health Care. 2008;17:292–303.
30 Berwick DM. The science of improvement.
JAMA. 2008;299:1182–1184.