
Evaluation of the Sanford Inspire Program
Teacher Preparation Analytics LLC
March 2015
This is the eighth in a series of evaluation reports presenting notable findings and project accomplishments for the Sanford Inspire
Program. Independent evaluation of the Sanford program is made possible by cooperation from Sanford Inspire Program leaders
and team members, from teacher candidates and school and district administrators, and from the leadership and faculty of the
Mary Lou Fulton Teachers College (MLFTC) at Arizona State University.
This evaluation report analyzes and reports on these components of the Sanford Inspire Program: the work to develop,
test, and disseminate on-demand modules for teacher candidates and in-service teachers; Sanford’s role in working
with college faculty and school partners to pilot surveys of K-12 students; its marketing and dissemination strategies;
and reports from K-12 partners about the impact of the Sanford Inspire Program on their work. The evaluation
evidence behind this report comes from data collected by Sanford and shared with the external evaluator for
independent analysis, from an in-depth review of six Sanford on-demand modules, and from interviews with Sanford team
members, ASU site coordinators, local school leaders, and the CEO of a partner organization.
On-Demand Modules for Candidates and In-Service Teachers
TPA reported last fall on the initial stages of work by the Sanford Inspire Program team to create and test on-line
professional development resources. These on-demand learning modules for teacher candidates and novice in-service
teachers are being created to meet specific needs of their audiences, based on research by Sanford. As we reported
earlier, the Sanford Inspire Program team gathered data from in-service teachers, graduates of Teachers College, and
ASU teacher preparation instructors. Results were discussed with preparation program and College leaders. Sanford
also used teacher candidate focus groups to identify teaching skills topics where the candidates wanted additional
information, and the Sanford team tapped performance assessment scores on the TAP rubric to identify areas where
2013-14 ASU teacher candidate performance indicated the need for reinforcement or refinement.
Six on-demand modules are profiled and reviewed in this evaluation report, including a fuller discussion of the
“Causes of Misbehavior” resource that was profiled briefly in October 2014. In each case the module was reviewed
from beginning to end through the ASU Professional Learning Library (PLL) by Sanford’s external evaluator. This
included viewing and interacting with all module components, completing the built-in assessment, reviewing the planning
and implementation aids included with the modules, and looking at the background research literature included
with each module. A brief description of module purposes and goals adapted from the modules themselves is presented
for each of the six:
Causes of Misbehavior: In order to respond to classroom misbehavior, teachers have to understand its “root
causes”. The module addresses some of these causes and provides resources through which teacher candidates
and classroom teachers can identify them and develop effective interventions. The module also presents material
on why teachers sometimes do not respond effectively to misbehavior.
Designing Effective Procedures: A safe and productive learning environment requires effective classroom
procedures. The module helps users identify what makes classroom procedures effective and describes three steps
for creating successful ones.
Giving Clear Directions for a Task: This skill-building module aims to help users give clear directions that
minimize off-task behavior in the classroom “and set your students up for success.” Among other things,
candidates and teachers utilizing the module will learn about the link between clear directions and fairness or
equity in the classroom, identify criteria that make for good directions, and review how to give directions
successfully.
Identifying Necessary Procedures: The main purpose of this module is to help teachers select procedures that
will enable them to manage the classroom efficiently. As the module overview notes, “This module will
introduce you to a thought process that will help you to identify procedures to fill needs, fix problems, and solve
inefficiencies in your classroom.”
Physical Environment of Your Classroom: The module’s purpose is to help teachers arrange or modify their classroom
environment to meet the learning needs of students. The module explains how classroom environment affects
behavior, motivation, and achievement; it offers guidance and practice tools for improving the physical set-up of
a classroom.
Teaching Procedures: The module prompts users to think about the value of classroom procedures, with
examples of how to teach, practice, and reinforce key procedures with students. Users are given the opportunity
to create a plan for teaching a procedure to students, with templates and on-line guidance.
The modules can be accessed through Arizona State University’s Professional Learning Library (https://pll.asu.edu/p/)
or by linking to the Sanford Inspire Program’s Curriculum Resource Catalog (https://pll.asu.edu/p/node/192622). The
work discussed in this document was accessed through the January 21, 2015 edition of the Resource Catalog. Each
module has similar components—an introductory “foundation” section that defines an issue or teaching challenge; a
resource section that offers practice tools for learning about and implementing module materials; the assessment
component where users test and refine their knowledge; the application section of the module where users actually
apply their new knowledge to their own classroom setting; and the conclusion which points to further reading for those
interested in learning more about a module topic.
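For readers who find a schematic helpful, this shared five-part structure can be sketched as a simple data model. The Python sketch below is purely illustrative; the class and field names are this report's invention and are not drawn from the PLL platform itself.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ModuleSection:
    """One of the five standard sections of an on-demand module."""
    name: str      # e.g., "Foundation"
    purpose: str   # what a user does in this section

@dataclass
class OnDemandModule:
    """Hypothetical data model; not drawn from the actual PLL platform."""
    title: str
    sections: List[ModuleSection] = field(default_factory=lambda: [
        ModuleSection("Foundation", "defines an issue or teaching challenge"),
        ModuleSection("Resources", "practice tools for learning and implementation"),
        ModuleSection("Assessment", "users test and refine their knowledge"),
        ModuleSection("Application", "users apply new knowledge in their own classroom"),
        ModuleSection("Conclusion", "points to further reading on the topic"),
    ])

# Every reviewed module follows the same section sequence
module = OnDemandModule(title="Causes of Misbehavior")
for section in module.sections:
    print(f"{section.name}: {section.purpose}")
```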
Some readers of this report may wonder about the need for these on-line learning resources or about the meticulous
attention to small details in the module content, implementation tools, and assessment instruments. Do teachers and
teacher candidates really need to give this much focus to procedural issues in their classrooms? Doug Lemov, in Teach
Like A Champion, provides one answer. Lemov refers to “specific, concrete, actionable techniques” as the basis for
successful teaching. He goes on to tell readers that “many of the techniques you will read about in this book may at
first seem mundane, unremarkable, even disappointing” (page 5), but “carefully built and practiced routines” can have
a powerful impact on student achievement (page 7).
As the Coalition for Psychology in Schools and Education of the American Psychological Association recently stated
in its 2015 publication, Top Twenty Principles from Psychology for PreK-12 Teaching and Learning: “Improved
social and classroom behavior, like academic skill, can be shaped and taught. In the most effective classrooms,
classroom rules and expectations represent a social curriculum that is taught and retaught throughout the academic
year” (page 29). Tapping an extensive research literature, the APA Coalition went on to note that:
Classroom rules and expectations can be taught and re-taught using the same principles as those used in
academic instruction including: clear presentation of a goal, task or behavior; opportunities for practice with
timely and specific feedback; reinforcement for desired behavior; and behavioral correction as needed (page
29).
Further empirical evidence for the value of these modules comes from the Measures of Effective Teaching Study
(MET). Employing videotaped observations of 3,000 teachers in six school districts, researchers found significant need
for improvement in areas tackled by the Sanford on-demand modules. For example, although 75% of observed
teachers got the highest positive score for classroom management on the Danielson Framework For Teaching (FFT),
only 5% of the same teachers received the highest score for managing classroom procedures. Similarly, a majority of
observed teachers obtained the top behavior management rating on the CLASS observation instrument, but only 15%
scored at that level for what CLASS calls “productivity” (defined as how well instructional time and routines are
handled). And MET study ratings on the UTeach observation instrument—known as UTOP—indicated that 40% of
observed teachers received the highest score for “students on task” and for classroom management, but only 20% scored
equally well on lesson organization, and only 15% on allocation of time during instruction.1
The Sanford Inspire Program’s curriculum team relied on evidence about candidate and teacher needs to identify
module topics, and they reached into the literature about effective teaching to shape the content of these modules. As
Lemov and the APA group indicate, the needs are real from the perspective of teachers and their students. The payoffs
for successful practice are equally real—well-organized classrooms, maximized opportunities for students to learn, and
the likely outcome of enhanced student achievement.
Developing and Testing the On-Line Modules
All six modules were developed, pilot tested, refined, and completed through a formal development process. The key
steps included deciding on pre-service and in-service teacher needs from the data analysis strategies described in the
first paragraph of this section; reviewing research literature to ensure that module content provides a research-based
learning resource; consulting with university faculty and/or subject-area experts; embedding learning assessment tools
in the module to gauge how well users are mastering its content; pilot testing with teachers, teacher candidates, and
teacher educators; refining the module based on this actual use and feedback; and final production steps to ensure
professional polish as well as ease of use by those hoping to improve their practice.
All modules are pilot tested and feedback is reviewed carefully to ensure that products and processes are strong and
meet the needs of users. Sanford team members described for the evaluator the pilot testing process as well as specific
changes made as a result of feedback. The premise of pilot testing is that modules “are not perfect out of the gate” and
can be improved through this extensive review and feedback process. Changes that resulted from feedback have
included additional research studies as sources for module content, improved navigation, additional grade- or
content-area examples in the modules, branching within a module to give users more choices to fit their specific needs,
closed captioning, and editing to reduce the size of sections within modules so that users can return to a specific section more
quickly as needed. Sometimes a module goes through a second round of pilot testing after modifications based on first
round feedback.
Feedback from actual users was based on their experiences with pilot versions of each module. There were 157
participants in these pilot tests, including groups of teacher candidates (N=75), practicing teachers (N=35), ASU
teacher educators (N=18), and school administrators (N=20). Participants were asked to evaluate module content, ease
of use, relevance to their own teaching, the extent to which the module use experience was engaging, whether they
would use its content in their own classroom, and how likely they were to recommend it to others.
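To illustrate how such feedback reduces to the percentages summarized below and in the table that follows, here is a minimal tallying sketch; the response values are invented, not Sanford's actual pilot data.

```python
from collections import Counter

# Hypothetical Likert responses to one survey statement ("easy to navigate");
# these are invented values, not Sanford's actual pilot records.
responses = [
    "Strongly Agree", "Agree", "Neutral", "Agree", "Disagree",
    "Strongly Agree", "Agree", "Agree", "Strongly Agree", "Agree",
]

def percent_agree(responses):
    """Percent of respondents who chose Agree or Strongly Agree."""
    counts = Counter(responses)
    return 100.0 * (counts["Agree"] + counts["Strongly Agree"]) / len(responses)

print(f"Agree/Strongly Agree: {percent_agree(responses):.0f}%")  # prints 80%
```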
As a general summary of this feedback for the six modules, nearly everyone found the on-line module easy to use
(except for Identifying Necessary Procedures, where about 28% in the pilot test disagreed with the survey statement
that the module was “easy to navigate throughout”). With regard to the feedback about Identifying Necessary
Procedures, the Sanford team responded by improving the on-line navigation process, reducing the amount of typing
users have to do, and making some content optional. Nearly every user was able to complete a module in 60 minutes or
less (about 95% of all pilot users), an important finding for the Sanford marketing strategy since potential users have
many claims on their time and energy. Pilot test groups reported that module content in all six examples was relevant
to their own practice and/or to teaching. Over 80% of respondents found modules to be “engaging”, an important trait
for a self-directed online resource that most users will be interacting with in the context of limited time and perhaps
after a long day or week of teaching. In the case of the Teaching Procedures module, however, only 59% of those in
the pilot group found it engaging. According to a communication to the evaluator from the Sanford team, the module is
under review to address this finding. Even so, all of them said they would use its content in their own work, a finding
echoed by almost every participant in the six pilots. And finally, between 86% and 100% of those providing feedback said they would
recommend the module to a colleague. This suggests that users of the modules will become an important channel of
dissemination to their colleagues. It can also be seen from the table below that nearly every pilot user of the modules—
from four different groups—reported that the module they tested was “high quality”.
Slight variations in these findings likely owe more to the small sample sizes of individual module pilot test groups
than to real differences among the modules. Even so, it would be worthwhile for the Sanford team to take a look at the
responses for Identifying Necessary Procedures and Teaching Procedures, where feedback was somewhat less positive
than for the other modules.
Selected Survey Results from Pilot Tests of On-Demand Modules2
(percent who agreed or strongly agreed with each survey statement)

Module                               Easy to Navigate   Engaging   High Quality   Would Use in Teaching   Recommend to Others
Causes of Misbehavior                      100%            92%         100%               100%                  100%
Designing Effective Procedures              86%            86%          91%                88%                   90%
Giving Clear Directions for a Task          88%            84%          95%               100%                   96%
Identifying Necessary Procedures            71%            86%         100%                94%                   95%
Physical Environment of Classroom           92%            81%          92%               100%                   93%
Teaching Procedures                         82%            59%          89%               100%                   86%
Observations3
Apart from the comments above, it is also noteworthy what individual pilot test participants had to say about the
quality and value of the on-demand modules—for themselves and for other candidates or classroom teachers.
Reviewers of Causes of Misbehavior had these comments about the module and its value:
• “I thought it was concise, coherent and left you with tangible take away to use immediately in your classroom.”
• “My initial thoughts regarding the module on misbehavior was that it was good review from the content we have
studied in previous classes. I also thought that it would help a beginning teacher think about misbehavior and how
tools can be used to assess students.”
• Another reviewer said: "This module would push novice teachers to consider behavior from a different perspective
and hold themselves accountable for not making assumptions."
A sample quote from reviewers of Giving Clear Directions also indicates its value for novice teachers:
"Yes, I think novice teachers would find this very helpful. Giving instructions efficiently and effectively
helps eliminate many of the problems that novice teachers are most likely to struggle with, including
difficulty meeting objectives and classroom management issues. The information was also presented in a
very practical and applicable way, which I appreciated and which I'm sure new teachers will appreciate as
well. In the teaching program, we receive a lot of theoretical information with little practical application, and
this was very refreshing in its practicality."
One reviewer for Designing Effective Procedures gave positive feedback as well as this suggestion: "Good, but some
of the examples were a bit too generic. However, for a first year teacher I could see why it would be beneficial. I think
the module should have gone over being consistent with procedures because I think that's even harder to do."4 Another
person said: "This module provides valuable tips and sequential steps for the development of procedures."
Pilot use of Identifying Necessary Procedures prompted insightful thoughts from reviewers. One comment had
advice for which users would be most likely to benefit: "I think procedural issues often occur b/c teachers are unaware
of the disorganization/chaos. The checklist is helpful but I imagine the module would be better if assigned to someone
who has been identified to struggle in this area. I don't imagine it to be a self-selected module." Another
reviewer identified an interesting issue in marketing and using teaching related modules, commenting that novice
teachers would not be likely to find it helpful because "I think a lot of novice teachers feel they don't need
improvement.” As noted in the table above, most reviewers were quite positive in their assessments of this module’s
quality and relevance to their classroom—including, presumably, classroom teachers who were not all veterans. The
survey findings indicate that most reviewers have been in their current role for four or fewer years, and Sanford reports
that only a handful of reviewers had more than five years of experience. Survey answers suggest that “93% of in-service teachers who piloted these modules were in our 5 or less years target audience range.” This means that pilot
test reviewers overwhelmingly come from the same teaching experience pool as teachers for whom the modules have
been designed.
The Teaching Procedures on-demand module also received interesting comments from reviewers. One responded, "I
do think that novice teachers would find this helpful. I think that the clarify, explain, rehearse, reinforce is a perfect
model for novice teachers to follow,” while another reviewer agreed with this but went on to say: "However, I would
include videos to supplement each part of the module. I think this would exemplify the concepts more clearly and
provide a clear illustration of practical classroom strategies for teaching procedures. The students can then better
understand the steps in the module and what it looks like in an actual classroom."
Recommendations
These on-demand modules have obvious value to teacher candidates, current practitioners, and teacher educators at
Arizona State University and elsewhere. Comments and ideas cited earlier from Doug Lemov, the APA Coalition, and
the Measures of Effective Teaching project indicate the extent to which the Sanford Inspire Program is responding to
real needs in preparation programs and K-12 classrooms. Aside from relatively minor comments about small-scale user
issues in the modules that will be conveyed separately to Sanford, there are two significant recommendations that grow
out of this external evaluation of the on-demand module initiative:
• It will be important to track and share impact data from implementation of the final version of each module,
including frequency of use and by whom (e.g., candidates, teachers, program faculty associated with ASU and
externally). Sanford plans to do this, and a future evaluation report will tap into this information to explore how the
modules are being used.
• Sanford should expand the research base that supports development and implementation of each module.5 It is
worth going more deeply into the research behind future modules to establish the rigor of the knowledge base
employed by the Sanford team. This step would strengthen the value of on-demand modules to university faculty,
making it more likely that courses and course experiences would be linked to or influenced by the modules.
  – Designing Effective Procedures is an example of a module where a fuller set of references would strengthen its
appeal to some potential users. Unless there really is no other relevant literature, so short a list is likely to get
in the way of faculty adopting and recommending the work, and it may raise questions about how well the
developers know the literature related to the topics addressed through the modules.
  – If developers have relied on studies in addition to those cited at the conclusion of each module, they should
consider adding these to the list or creating a link between each module and a full set of references organized by
module title.
  – Finally, Sanford might consider identifying a tenured or tenure-track faculty member with expertise in the subject
to be an adviser on each module development project. The idea is not to turn each module development
project into a graduate thesis or a research paper but to ensure that the module is grounded in—and is seen to be
grounded in—a reasonably extensive research base. The Sanford Design Team made clear in interviews with the
external evaluator that feedback from College faculty is sought and highly valued, so this recommendation
should not be taken to imply that faculty feedback is absent from the module development process.
K-12 Student Surveys
The Sanford Inspire Program’s K-12 student survey initiative has continued and expanded over time, reaching more
schools, classrooms, and students. Through partnerships with individual schools and with the Teacher Retention
Project, the Sanford survey initiative is providing new and valued feedback across the ASU service area. Beyond these
activities, the use of student surveys is being embedded in teacher preparation programs at the Mary Lou Fulton
Teachers College as a source of feedback for teacher candidates and as evidence of performance for programs.
By the end of 2014, K-12 student surveys had been administered in 76 classrooms at twelve different schools. Almost
two thousand students participated by completing surveys on paper or on-line. This external evaluator observed
survey administration in a second grade classroom where just over two dozen students answered the Sanford survey
questions with pencil and paper. Survey administration in this second grade classroom had multiple purposes, as it did
in other schools and classrooms. Surveys have generated findings of interest to school leaders and their teachers,
produced useful feedback for ASU teacher candidates and their instructors, and given College program leaders insights
about the value of student surveys as a source of useful performance data. And beyond all these benefits, lessons from
each survey experience have been used to improve the overall process of administration, analysis, and reporting.
According to Sanford, each student survey cycle has yielded insights about how to generate this feedback from
students more efficiently and with less intrusion on the time and energy of school administrators. Survey instruments
have been tweaked to reduce the number of questions and even lower the average number of words per item, resulting
in significantly fewer items on the Sanford elementary and secondary survey instruments than are found on typical
commercial products such as Tripod or MyVoice. And as TPA reported last summer, the College is planning to pilot
surveys with all teacher candidates in 2015-16. In the 2016-17 academic year, the College will require all teacher
candidates in the program to survey their students by including the survey as an assignment in the student teaching
course.
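The words-per-item measure mentioned above is simple to compute. Here is a minimal sketch using invented survey items rather than the actual Sanford instruments.

```python
# Invented example items; the actual Sanford survey instruments are not
# reproduced in this report.
items = [
    "My teacher explains things clearly.",
    "I know what to do when I finish my work.",
    "My teacher helps me when I am stuck.",
]

def average_words_per_item(items):
    """Mean word count across survey items, using whitespace tokenization."""
    return sum(len(item.split()) for item in items) / len(items)

print(f"{len(items)} items, {average_words_per_item(items):.1f} words per item")
```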
It is also important to note that one of Sanford’s partners in the student survey initiative—the Teacher Retention
Project (TRP)—utilized the survey without direct assistance from the Sanford Inspire Program. TRP coaches who work
with novice or relatively inexperienced teachers in Phoenix-area school districts administered the survey themselves.
Results were tallied by Sanford and returned to TRP.6 These results are also being shared with TRP teachers. A separate
interview with Dr. Teressa Ribbens, chief executive officer of the Teacher Retention Project, provided other details of
this experience as well as examples of the benefits from collaborating with Sanford. Sanford also partners with the
Teacher Retention Project through access to the on-demand modules discussed in the previous section of this report.
TRP coaches are assigned to teachers in schools across Arizona, working to help them improve their practice and
thereby build successful careers in the teaching profession. According to Dr. Ribbens, TRP coaches were trained to
administer the student survey by Valerie Simmons, Sanford’s internal evaluator, and 34 teachers supported by TRP
have used the Sanford survey with their own students (surveys were primarily completed in school computer labs) in all
but one school served by the TRP. The resulting survey findings gave “clear and accurate results” helpful to teachers,
coaches, and schools—and the findings have “driven next quarter goals for our teachers.” Dr. Ribbens also praised the
readability of survey reports generated by Sanford; she said they stand alone as a coaching and teacher development
resource but also are valuable because their content links back to the Danielson Framework For Teaching used as an
observation instrument in local schools. She also commented about the student survey marketing opportunity that
exists through her TRP coaches who can talk up the survey’s value with principals and others in the schools where
they work with individual teachers.
By way of recommendations about the student survey experience, comments from Dr. Ribbens echoed the Sanford
experience described in a previous evaluation report about the surveys. As Val Simmons said in discussing the survey
initiative with the external evaluator, Sanford is “being careful not to overask or overreach” in working with schools
and districts. This outlook has facilitated the early successes of the student survey initiative. Arranging permissions
with schools to administer surveys has been challenging for Sanford in part because of University institutional research
board (IRB) deadlines and policies and because Sanford has to navigate school district-level policies as well. These
issues are being worked out as more experience is gained in the survey work, supporting the longer-term focus on
student surveys that will ensure smoother administration in the future. Dr. Ribbens pointed to survey results indicating
that TRP teachers could use additional development in areas such as the general academic domain, academic engagement of
students, questioning strategies and skills, and differentiated instruction. These areas also stand out in the MET report
as weaknesses for most of the 3,000 teachers who participated in that study.
As Valerie Simmons and Ryen Borden wrote in “The Use of Student-Facing Surveys in the Classrooms of Teacher
Candidates to Advance Teacher Preparation Practice,” a paper submitted to the 2015 annual conference of the
American Association of Colleges for Teacher Education (AACTE), “We see the use of student surveys with
preservice teachers as an innovative practice with great potential value to the teacher candidates and ultimately, the
students they serve.” They also commented that Sanford’s survey administration and reporting processes were
developed so that they are “free to low cost... sustainable in its administration and distribution protocols...[and]
replicable in other institutions with little background training and/or knowledge”.7 In addition to providing context for
Sanford’s goals and experiences, the papers cited here provide solid evidence about the dissemination strategy the
Sanford Inspire Program is using to help others learn from its work.
Earlier evaluation reports commented on the growing national interest in K-12 student surveys as a source of feedback
for teachers, teacher candidates, and preparation programs. Perhaps stimulated by the highly visible Measures of
Effective Teaching project, states, schools, and preparation programs across the country have been developing student
surveys. As MET noted in “Asking Students About Teaching,” good quality student surveys can complement
classroom teaching observations and measures of student achievement in providing insights about teachers and their
impact on students. “Teaching is a complex interaction among students, teachers, and content that no one tool can
measure. The search for different-but-aligned instruments has led many to use student surveys,” according to MET
researchers. Moreover, they have enhanced value when standardized test data are not available:
“Analysis by the Measures of Effective Teaching (MET) project finds that teachers’ student survey results
are predictive of student achievement gains. Students know an effective classroom when they experience
one. That survey results predict student learning also suggests surveys may provide outcome-related results
in grades and subjects for which no standardized assessments of student learning are available”.8
A task force of education leaders and measurement experts convened by the American Psychological Association also
cited the benefits of asking students about the performance and attitudes of their teachers. In “Assessing and
Evaluating Teacher Preparation Programs,” the APA Task Force said that “States should work with teacher preparation
program providers to design systems of data collection that include information collected at the stages of selection,
progression, program completion, and postcompletion”.9 The APA group went on to make these observations about the
value and quality of student surveys as a measure of teaching performance:
8
“[S]tudent surveys of teacher effectiveness have considerable support in the empirical literature. Scores of
constructs based on observable behaviors are internally consistent and stable, are related to achievement
outcomes in both college and K–12 students, are more highly correlated with student achievement than are
teacher self-ratings and ratings by principals, and distinguish between more- and less-effective teachers
identified using other metrics”.10
And finally, in work produced by Teacher Preparation Analytics (TPA) for the Council on Accreditation of Educator
Preparation (CAEP), the authors of “Building an Evidence-Based System for Teacher Preparation” included K-12
student surveys of teaching practice as a key indicator of preparation program quality.11 While most states and
preparation programs are not yet deploying student surveys for program improvement or for accountability, the
Arizona State University experience offers valuable insights about ways to move in this direction.
Marketing Strategies to Support Phase 2 of the Sanford Inspire Program
Since the early 2014 transition from Phase 1 to Phase 2 of the Sanford Inspire Program, marketing has focused on
supporting the transition in a variety of ways. This includes the marketing team’s work reviewing and revising Phase 1
curriculum products to ensure that these products will still be available and attractive as pre-service and in-service
curriculum or professional development resources. Phase 2 marketing has seen the team play an important role in the
work of pilot testing on-demand modules, engaging with schools and districts to support the student survey initiative,
and disseminating to other teacher preparation programs a Recruitment Toolkit built from the Sanford Inspire
Program’s early Phase 1 recruitment activities.
While marketing involves “driving demand to the modules from potential users”, the Marketing Team participates
significantly in several design and production phases. Working with the design and production teams, marketing vets
the clarity and content of individual modules (e.g., module titles to convey content as clearly as possible) from the
perspective of prospective users. Team members look at a module’s ease of use and provide other feedback to
maximize its value to that user. They also work to set up and manage pilot reviews of the modules to generate the
kinds of feedback discussed above in this report. This ensures that marketing is fully in sync with product development
and that product development is a marketing asset.
The Sanford Inspire Program’s student survey initiative also depends on marketing expertise. Team members traveled
to schools and districts with Valerie Simmons for meetings and to observe early implementation of the surveys, with the
idea that the team will have direct responsibility as the student survey initiative widens its scope in terms of schools
and districts. In late fall 2014 evaluation interviews, the team felt that the student surveys had been the most successful
Phase 2 Sanford product marketed to schools because of its benefits to individual schools and their leaders.12 One
team member explained this in terms of the pitch to school leaders: “we’d like to come in and survey your students to
help you better understand your teachers’ effectiveness through the eyes of your students.”
At the same time, marketing work aimed towards schools and districts that have used the student surveys will
emphasize the on-demand modules as professional development resources and their linkages to survey findings.
Sanford is also underscoring connections that surveys and modules have with teacher evaluation rubrics used in
surrounding districts, schools, and charters. An internal Sanford document developed in 2014 is a kind of crosswalk or
Venn diagram showing overlaps among local evaluation systems—the TAP rubrics in districts employing that system,
Danielson’s Framework For Teaching, the Marzano system, and Teaching as Leadership (TAL).
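Such a crosswalk lends itself to a simple mapping structure. The sketch below is illustrative only; the rubric components shown are plausible examples chosen by the evaluator, not the contents of the internal Sanford document.

```python
# Illustrative crosswalk from Sanford module topics to components of local
# teacher evaluation systems. Entries are the evaluator's plausible examples,
# not taken from the internal Sanford document described above.
crosswalk = {
    "Teaching Procedures": {
        "TAP": "Managing Student Behavior",
        "Danielson FFT": "2c: Managing Classroom Procedures",
        "Marzano": "Rules and Procedures",
        "TAL": "Execute Effectively",
    },
    "Physical Environment of Your Classroom": {
        "Danielson FFT": "2e: Organizing Physical Space",
    },
}

# Show which rubric components a given module can be aligned with
for rubric, component in crosswalk["Teaching Procedures"].items():
    print(f"{rubric}: {component}")
```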
Just listing these multiple ways through which schools try to assess teaching quality points up the complexity of
developing resources to help large numbers of schools, their teachers and pupils. But intentional Sanford efforts to
align these approaches with the student survey and the on-demand module initiative—and to make the alignment
connections explicit—are evidence of a carefully designed and thoughtful set of strategies to advance Phase 2 work across
all of its many fronts.
In broad strokes, marketing activities have two big audiences. Outside Teachers College, these include the schools and
districts discussed above, as well as initiatives like the Teacher Retention Project whose leader and coaches report
great value in their work with Sanford. Communications materials are being created to share with this audience, and as
time goes on, marketing will have sole responsibility for promoting the use of surveys and managing the transition
through which schools and districts will take up the survey work on their own. In this sense, marketing is also about
sustainability.
Within Teachers College, marketing includes recruiting faculty reviewers of the pilot modules, but it is also about efforts
to inform as many internal constituencies as possible. This is no easy task in a large and complicated College where
faculty and staff work at multiple locations, where they staff very large programs, and where they are subject to
relatively frequent large-scale upheavals. Sanford outreach seeks to inform and engage course instructors, tenured and
tenure-track faculty, program coordinators, and program leaders—especially about Phase 1 resources they may want to
use.
One marketing team member summed up the challenge of this work at a very large institution by noting, “different
faculty have different perceptions of what we [Sanford] do”. Some know about Phase 1, others are familiar with the
prior recruitment focus. “They have different snippets of what we do... most people have heard about us but they’re not
clear about what we do.” The most productive outreach seems to be with those involved in courses for which Sanford
has a resource—instructors, course and program coordinators, and those who lead programs where Sanford resources
can and do fit in. All this is not to say that Sanford is unsuccessful in the work of informing, engaging, and involving
faculty across the College. Participation in module reviews, College support for scaling up the student surveys across
all pre-service teacher education, and partnering in research and dissemination activities are concrete examples of
awareness and support.
Recommendations
Sanford Inspire Program team members made the effort to find commonalities among a number of teacher observation
and evaluation systems used by the College and by school or district partners where Sanford resources are used or
marketed. This is a clever way to ensure that school and district leaders see the student surveys and on-demand
modules not as added work or as peripheral to their goals, but as complementary to their efforts to improve instruction and
raise student achievement.
In that same spirit, Sanford should consider obtaining and analyzing local district teacher evaluation instruments other
than those mentioned already, looking for relevance and alignment with the student surveys and on-demand modules.
Teressa Ribbens from the Teacher Retention Project stressed the value of this alignment for her coaches and their
efforts to improve instruction and persistence in teaching. Making specific connections between evaluation rubric
components and surveys or modules can open the door to conversations with principals or district leaders. The next
step might be sharing actual evaluation results—not for individuals but maybe for grade levels or subject areas—to
explore ways in which the surveys and current or future on-demand modules can be used to solve specific problems.
Dissemination Activities
An important aspect of the Sanford Inspire Program from the beginning has been disseminating information and
resources across the ASU community, and to local schools, districts, and other partners. Sanford’s work in Phase 1
and Phase 2 has big implications for teacher preparation and teacher professional development, and so dissemination
also has aimed to reach and engage those communities as well. As this report is being completed in February 2015,
Sanford team members are preparing to make presentations at two national conferences of teacher educators—for the
Association of Teacher Educators (ATE) and the American Association for Colleges of Teacher Education (AACTE).
At ATE, papers are being presented on data-driven decision-making and on the role of K-12 student surveys in
fostering reflective practice by teaching candidates. A few weeks later, AACTE conference attendees will learn about
Sanford’s work on the student survey initiative and about research findings on “the outcomes of a clinically-based
student investment project.” AACTE’s meeting agenda has at least one other session about student surveys—presented
by the teacher education accrediting body, because well-done surveys will be accepted as a form of evidence about
program performance. Thanks to Sanford, Arizona State University is a national pioneer in developing survey-based
evidence about teacher candidates and practicing teachers. Another example of dissemination to education
professionals is a paper developed by Valerie Simmons and accepted to the 2015 conference of the Society for Information
Technology and Teacher Education (SITE). The Sanford team is also sharing its work widely through a quarterly
newsletter and by providing the on-demand modules to teacher candidates through their course and field experience
instructors.
Other dissemination work took place through the Learning to Teach (LTT) conference in late fall 2014, involving
innovators, funders and policymakers from across the country. Media coverage has highlighted Sanford initiatives such
as Quest2Teach and mentor teacher training. A redesigned website for the Sanford Inspire Program is generating
significant traffic; Sanford resources are much easier to identify and link to through this polished website.
Teachers College Dean Mari Koerner was one of a small number of education deans announced as members of “Deans
for Impact” in January 2015. This is a change-focused group that has received funding support from the Schusterman
Family Foundation because members “are working together to improve student learning and other outcomes” of
teacher preparation programs.13 Teachers College at ASU has seen massive changes to policies and practices under
Dean Koerner’s leadership; her support for the Sanford Inspire Program (as well as growing awareness of Sanford’s
work) has increased the visibility of the teacher preparation reform agenda being implemented at ASU and may well
have an impact on the work of Deans for Impact.
It should also be clear that Sanford’s engagement with the Teacher Retention Project, the Paideia Academies, and
Academia del Pueblo Elementary Charter School is about dissemination through close partnerships. Feedback from
these organizations about the student surveys and on-demand modules shows the Sanford Inspire Program is making a
difference to their work in very specific ways. Through this outreach and support, Sanford is also laying the
groundwork to sustain Phase 2 of the Sanford Inspire Program.
Final Thoughts
Interviews and other evidence collected and analyzed for this report indicate that the Sanford Inspire Program has
made a very rapid transition into Phase 2 of its work. At the same time, the Sanford team has been taking steps to
improve and continue its support for the Phase 1 curriculum resources that have been assessed in previous evaluation
reports. As the teams work on multiple fronts, extensive interviews and internal documents from team meetings indicate
the existence of clearly articulated strategies being pursued by each of the teams interviewed for this report. Team
members have a clear understanding of their goals, well-developed and detailed strategies for getting to them, and
strong commitment to the overall effort to serve the needs of schools, teachers, teacher candidates and others touched
by the Sanford work.
As in previous interactions with them, Sanford Inspire Program team members continue to demonstrate a remarkable
openness to feedback and constructive ideas from their partners, from resource reviewers, and from this external
evaluator. With the work moving forward on new fronts at a rapid pace (as always), assessment of this complex project
is coming closer than ever to having impact data as the College continues to build its own capacity to tap data on
program outcomes.
1 Findings from the Measures of Effective Teaching study are contained in several reports available through
http://www.metproject.org/.
2 The surveys were developed and administered by Valerie Simmons, internal evaluator for the Sanford Inspire
Program, who generously shared findings and analysis with TPA for this report.
3 Notes taken while reviewing the six modules discussed here include other observations about smaller-scale issues
that cropped up—positive assessments of how a module addresses a specific topic (e.g., in Causes of Misbehavior,
“The emphasis on specific easily understood steps that can be used to diagnose and respond to particular
events/incidents is a strength—users can draw parallels to their own classroom, they can also ground the module’s
ideas in practice, something that is too often missing in teacher prep coursework.”) as well as notes about typos or
duplicated choices in some assessment questions (such as examples from Designing Effective Procedures). These
notes are provided to the program separately from this evaluation report.
4 Sanford’s response to this feedback: “Because we wanted to keep the modules short (<60 min) and focused, we
address this in another module… This is addressed in the Teaching Procedures module, which explains the
importance of reinforcing/re-teaching procedures as necessary over time.”
5 The team working on creating modules told this external evaluator: “On average, each of the last half-dozen modules
that have been sent to production has had a minimum of fifteen references (consisting almost wholly of books and
scholarly articles).”
6 As with other data in this report, such as survey feedback about the on-line modules, information about student survey
administration has been provided to TPA by Valerie Simmons, Sanford’s internal evaluator.
7 For more see http://sanfordinspireprogram.org/wp-content/uploads/2015/01/AACTE-Student-Survey.pdf and
http://sanfordinspireprogram.org/wp-content/uploads/2015/01/Abstract-of-Presentation-Student-Surveys-ATElive.pdf.
8 For more on the Measures of Effective Teaching Project’s findings and recommendations regarding student surveys,
see http://www.metproject.org/downloads/Asking_Students_Summary_Doc.pdf.
9 American Psychological Association, 2014. Assessing and Evaluating Teacher Preparation Programs. Washington,
DC: APA Center on Psychology in Schools and Education. Retrieved from
http://www.apa.org/ed/schools/cpse/teacher-preparation-programs.pdf.
10 APA Task Force, page 24. Citations embedded in the quoted text have been omitted but can be found in the original
reference.
11 A summary of this report can be found at https://caepnet.files.wordpress.com/2014/12/tpa_report_highlights.pdf.
12 When evaluation interviews were conducted, only one on-demand module was publicly available; the strength of the
survey initiative as a marketing opportunity should not be taken to imply weak demand for or interest in the modules.
13 For more see http://www.deansforimpact.org/.