
edTPA Evidence
The edTPA evidence begins with the CIP matrix and timeline, followed by the fall 2013 and spring 2014 summaries and data reports.
Continuous Improvement Plan
The matrix below is organized by the components it documents: yearly goals and objectives (3.A), implementation year (3.G), candidate requirements review process, candidate accountability (3.D), EPP activities/initiatives (3.F), human and capital resources, and continuous improvement data.

Goal 1: Phase out of 6 NCDPI electronic evidences (2012-2013).

Objective 1: Complete review of pipeline candidate evidences (Fall 2012).
• Candidate Requirements Review Process: All candidates were required to submit 6 NCDPI evidences. EPP faculty reviewed and approved by program.
• Candidate Accountability: High stakes. Candidates were required to successfully complete evidences to be recommended for a NC teaching license.
• EPP Activities/Initiatives: Awaited state-level process for submission and program review.
• Human and Capital Resources: Human: time required to conduct reviews by EPP faculty, KMA, and OPE. Capital: our EPP was asked to house and analyze the state-level electronic evidence program reviews; OPE was one of 3 official trainers for the state-level pilot.
• Continuous Improvement Data: None. The pilot review by NCDPI made it clear that the capacity to review all candidate evidence throughout NC did not exist. While our candidates were successful, inconsistencies in training and evaluation made cross-program (Unit) analysis difficult and impacted reliability and validity.

Objective 2: Complete transition plan for remaining candidates (Spring 2013).
• Candidate Requirements Review Process: All candidates were required to submit 6 NCDPI evidences. EPP faculty reviewed and approved by program.
• Candidate Accountability: High stakes. Candidates were required to successfully complete evidences to be recommended for a license.
• EPP Activities/Initiatives: EPP engaged in initial edTPA consortium meetings and training. KMA and OPE worked with programs to determine which programs and candidates would begin edTPA implementation in Fall 2013; Elementary MAT volunteered for the pilot.
• Human and Capital Resources: Human: EPP developed a core edTPA group to begin transition planning; the EPP core group attended consortium training. Financial: the EPP redesigned SAGE to accommodate edTPA presubmission (approximately $25,000).
• Continuous Improvement Data: The core edTPA team created a transition document to share with EPP programs and faculty. An edTPA Moodle site was created to house the transition documents in addition to training materials and edTPA handbooks.

Goal 2: Develop a plan for full edTPA implementation (2013-2016).

Objective 1: Provide additional support for faculty implementation of edTPA.
Objective 2: Provide support for candidate use of edTPA.
Objective 3: Implement and evaluate edTPA artifacts across all programs.

Fall 2013:
• Candidate Requirements Review Process: All other programs completed 6 NCDPI evidences. Programs volunteered to participate in the pilot edTPA administration (n=45). EPP faculty, University Supervisors, and doctoral students reviewed edTPA artifacts developed by candidates. Each pilot edTPA portfolio was scored by 2 independent raters on the SCALE 3-point local evaluation rubrics.
• Candidate Accountability: Low stakes. Candidates were required to complete edTPA artifacts and successfully student teach.
• EPP Activities/Initiatives: Introduced and trained faculty on the edTPA process related to the transition plan. Hired an edTPA manager to support faculty and candidates. Trained faculty and staff on the SCALE local evaluation rubric. Provided candidate support for edTPA through workshops, website, and online resources. Scored each fall completer's edTPA portfolio with 2 independent reviewers.
• Human and Capital Resources: Human and financial: EPP committed resources for an edTPA manager ($16,000 plus tuition waiver) and for faculty and staff to attend local and national edTPA training (approximately $10,000). Financial: EPP provided resources to purchase 100 digital cameras for candidates (approximately $30,000). Human: EPP provided 2 half-day SCALE local scoring rubric trainings for EPP faculty and University Supervisors; EPP faculty and University Supervisors volunteered to score edTPA portfolios.
• Continuous Improvement Data: Data on the pilot were collected and analyzed (see edTPA pilot evidence). The edTPA manager conducted 4 candidate edTPA state consortium training sessions, created a faculty electronic warehouse (Moodle) to store all relevant documents, templates, and meeting information (accessible on site), and created a candidate Google site to store all handbooks, templates, webinars, and training information (see edTPA evidence). EPP analyzed fall data and created program reports based on rubrics (see edTPA evidence).

Spring 2014:
• Candidate Requirements Review Process: All spring completers were required to complete edTPA artifacts (n=220). Our EPP used current P-12 teachers and doctoral students with teaching experience for edTPA pilot portfolio review (see edTPA evidence). Each pilot portfolio was scored on the SCALE 3-point local evaluation rubrics.
• Candidate Accountability: Low stakes. Candidates were required to complete edTPA and successfully student teach.
• EPP Activities/Initiatives: Trained P-12 teachers and doctoral students on the SCALE local evaluation rubric. Provided candidate support for edTPA through workshops, the edTPA Google site, and online resources. Scored each spring completer's edTPA portfolio.
• Human and Capital Resources: Human: EPP provided 1 half-day SCALE local scoring rubric training for P-12 teachers, doctoral students, and University Supervisors. Human and financial: EPP provided resources to pay P-12 teachers and doctoral students to score spring edTPA portfolios ($25,000).
• Continuous Improvement Data: EPP provided rubric analysis by program in SnapShot.

Objective 4: Transition from locally scored (3-point) to Pearson-scored (5-point) edTPA portfolios (Fall 2014).
• Candidate Requirements Review Process: All completers will submit edTPA portfolios to Pearson for scoring. All completer edTPA portfolios will be scored by Pearson.
• Candidate Accountability: Moderate stakes. Candidates are required to complete, with a higher threshold for quality; resubmission is required for candidates not meeting program-specified criteria.
• EPP Activities/Initiatives: Provide candidate support for edTPA through workshops, website, and online resources.
• Human and Capital Resources: Financial: EPP will provide funding for the Pearson scoring (approximately $12,000).
• Continuous Improvement Data: EPP will provide results from Pearson to programs through the SnapShot.

Objective 5: Conduct review of 2014-15 data to determine EPP cut score (Spring 2015).
• Candidate Requirements Review Process: All completers will submit edTPA portfolios to Pearson for scoring. All completer edTPA portfolios will be scored by Pearson.
• Candidate Accountability: Moderate stakes. Candidates are required to complete, with a higher threshold for quality; resubmission is required for candidates not meeting program-specified criteria.
• EPP Activities/Initiatives: Provide candidate support for edTPA through workshops, website, and online resources.
• Human and Capital Resources: Financial: EPP will provide funding for the Pearson scoring ($60,000). Human: KMA and OPE will work with programs to create a cut score policy and to determine candidate remediation procedures.
• Continuous Improvement Data: EPP will conduct an analysis of completer scores along with national data to determine an appropriate cut score. EPP will provide results from Pearson to programs through the SnapShot.

Objective 6: Implement EPP cut score for licensure recommendation (Fall 2015).
• Candidate Requirements Review Process: All completers will submit edTPA portfolios to Pearson for scoring. All completer edTPA portfolios will be scored by Pearson.
• Candidate Accountability: High stakes. Candidates must meet the EPP-determined cut score for licensure recommendation.
• EPP Activities/Initiatives: Provide candidate support for edTPA through workshops, website, and online resources.
• Human and Capital Resources: Financial: EPP will provide funding for the Pearson scoring (approximately $12,000).
• Continuous Improvement Data: EPP will provide results from Pearson to programs through the SnapShot.

Timeline

Fall 2012
• Candidates complete NCDPI evidences
• State pilot review

Spring 2013
• Candidates complete NCDPI evidences
• EPP begins edTPA review

Fall 2013
• Most candidates complete NCDPI evidences
• edTPA pilot begins

Spring 2014
• All candidates complete edTPA (low stakes)
• EPP uses SCALE 3-point local rubric

Fall 2014
• First candidate submission to Pearson (moderate stakes)
• EPP creates cut score policy and remediation procedures

Spring 2015
• Candidates submit to Pearson (moderate stakes)
• EPP provides scores to programs via SnapShot

Fall 2015
• Candidates submit to Pearson (high stakes)
• EPP analyzes results
Fall 2013 Pilot Summary
The Unit implemented edTPA for all graduating teacher candidates in Fall 2013.
Forty-two candidates were required to complete an edTPA portfolio in lieu of state
evidences for this pilot. Each candidate's edTPA portfolio was assigned two raters and
scored according to the three-point Local Rubric provided by SCALE. Each rater was
required to complete a day-long rubric training in order to be qualified to review. Most
fall reviewers were faculty and university supervisors. An inter-rater reliability analysis
using Cohen's Kappa was performed to determine consistency across raters. Two
reliability tests were conducted: an absolute score and a pass/fail score. The absolute
Kappa statistic determined the degree to which raters agreed on ratings for each of the 3
score levels. The pass/fail Kappa statistic determined the degree to which raters
agreed on passing (achieving level 2 or 3 for a standard) or failing (achieving level 1 for
a standard). Both calculations showed reliabilities no greater than chance; the small
sample size may be a factor in the analysis. Therefore, percentages were derived from
the total number of reviewer scores on each rubric to determine the overall decision of
quality given by reviewers. The following is a summary of these percentages for each
department and each rubric. Appendix A has a Unit summary table as well as an
example of a program detail report.
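The two reliability checks described above can be sketched in a few lines. This is an illustrative computation only, not the Unit's actual analysis code, and the paired rater scores below are made-up examples standing in for the real 3-point rubric data:

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same portfolios."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: proportion of portfolios rated identically.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c]
                   for c in set(rater_a) | set(rater_b)) / n ** 2
    if expected == 1.0:  # both raters used a single category throughout
        return 1.0 if observed == 1.0 else 0.0
    return (observed - expected) / (1 - expected)

# Absolute agreement on the 3-point local rubric (hypothetical scores).
rater_1 = [1, 2, 2, 3, 3, 2, 1, 3]
rater_2 = [2, 2, 1, 3, 2, 2, 1, 3]
kappa_absolute = cohen_kappa(rater_1, rater_2)          # about 0.43 here

# Pass/fail agreement: level 2 or 3 counts as pass, level 1 as fail.
to_pass_fail = lambda scores: ["pass" if s >= 2 else "fail" for s in scores]
kappa_pass_fail = cohen_kappa(to_pass_fail(rater_1),
                              to_pass_fail(rater_2))    # about 0.33 here
```

Dichotomizing to pass/fail before computing kappa, as in the second call, is what distinguishes the Unit's two reliability tests.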
Overall, Secondary Social Studies portfolios scored the highest percentage rate
across all 15 edTPA rubrics. Within these portfolios, teacher candidates scored 100%
on Rubrics 4, 6, 9, and 11; the lowest scoring rubrics within this department were
Rubrics 12 and 15. Special Education candidates achieved an 81% pass rate for all
rubrics combined, achieved 100% on Rubric 11, and scored lowest on Rubrics 9 and 15.
Elementary Education candidates achieved 80% for their edTPA portfolios as a whole;
achievement on Rubric 7 was the highest at 96%, with Rubrics 2 and 13 having the
lowest achievement at 67%. The English as an Additional Language department
achieved 78% on their edTPA portfolios as a whole. Rubrics 4 and 15 constituted the
highest achievement at 100%, while the department's portfolios showed achievement
difficulties on Rubrics 6 (50%), 7 (50%), 8 (67%), 9 (42%), and 10 (58%); many of these
rubrics scored video excerpts that were unavailable, incorrectly uploaded, or unrelated
to the rubric prompt for this department. Finally, the Secondary English department
portfolio (2 raters for 1 portfolio) showed an achievement of 77% across all rubrics, with
a 100% achievement rating for Rubrics 1, 2, 3, 4, 5, 6, 8, 10, and 15.
Each rubric was given an overall percentage to determine the highest and lowest
achievement areas for the college. According to raters, Rubric 4 was achieved by the
most portfolios (92%), and Rubrics 1 and 3 showed achievement of 87%. Raters
determined that the portfolios were weakest on achieving the goals of Rubrics 7 (71%),
9 (69%), 13 (66%), and 14 (71%).
Each program was provided with a detailed summary of its rubric scores.
Included in the report were the percent of scores at each level of each rubric as well as
which specific "look for" items were included or omitted from the portfolio. The Unit
compiled and included this information in the program reports to aid program
improvement.
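A program detail report of this kind reduces to a simple aggregation over reviewer scores. The sketch below illustrates the idea with hypothetical data (the program names, rubric labels, and scores are invented for illustration; "pass" follows the report's level 2-or-3 convention):

```python
from collections import defaultdict

def program_report(scores):
    """Percent of reviewer scores at each rubric level (1-3) and the
    pass rate (level 2 or 3), grouped by (program, rubric)."""
    buckets = defaultdict(list)
    for program, rubric, level in scores:
        buckets[(program, rubric)].append(level)
    report = {}
    for key, levels in buckets.items():
        n = len(levels)
        pct = {lvl: 100 * levels.count(lvl) / n for lvl in (1, 2, 3)}
        report[key] = {"percent_at_level": pct, "pass_rate": pct[2] + pct[3]}
    return report

# Hypothetical reviewer scores: (program, rubric, level) triples.
scores = [
    ("Elementary Ed", "S1", 3), ("Elementary Ed", "S1", 2),
    ("Elementary Ed", "S1", 1), ("Elementary Ed", "S1", 2),
    ("Special Ed", "S1", 3), ("Special Ed", "S1", 1),
]
report = program_report(scores)
# Elementary Ed S1: 3 of 4 scores at level 2 or 3, a 75% pass rate.
```

Grouping by (program, rubric) keys is what lets the same pass-rate logic feed both the Unit summary table and each per-program detail report.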
Overall, the Unit saw three areas where support is needed. First, candidates
struggled with some of the technology requirements: several video clips could not be
viewed, had corrupt links, or were of poor quality. The Unit is providing additional
technology training in digital media, compressing video files, and camera use. In
addition, an assignment was added to ED 312 (the assessment course) for students to
practice recording and uploading small segments of instruction. The Unit has purchased
120 small video cameras for checkout in METRC; candidates can check out equipment
free of charge. The second area of support is the use of academic language. Rubric data
revealed that candidates struggle with the academic language portions of the rubric. The
Unit has added more discussion and practice with academic language to the ED 204
course, which is required of all candidates. The spring 2014 sections of ED 204 included
this additional support, so we should see an improvement in this area for the 2016
cohort. The third area of improvement focuses on timing issues and logistics. The Unit
provided a timeline for programs to have students complete and submit edTPA
documents. The 2013-2014 academic year was a low stakes year for candidates:
candidates were required to submit edTPA portfolios, but results were not used for
graduation or licensure purposes. Scoring for 2013-2014 was done using the local
scoring rubric, and candidates were not provided rubric scores. Moving forward, in
academic year 2014-15, all edTPA portfolios will be submitted to Pearson for full
scoring. Although this academic year is still low stakes, candidates must submit
documents earlier than in previous years. The Unit is working with a subgroup of
program coordinators to draft a semester timeline for candidates and programs so all
materials are submitted before the unit deadline. Programs requested more help with
timeline creation and implementation. The edTPA timeline is impacted by the student
teaching year-long placement process, so both timelines are being developed
simultaneously to ensure a smooth transition.
The following table is the Program Summary by Rubric for the Fall 2013 administration.
For each rubric, the detail report lists the passing rate for all candidates, the count of scores at each level (1 = Emerging Performance, 2 = Proficient Performance, 3 = Advanced Performance), and the specific "look for" criteria included from each score level or not included from the level above. The fifteen rubrics are:

Planning Rubrics
S1: Planning for Literacy Learning. How do the candidate's plans build students' literacy skills and an essential strategy for comprehending or composing text?
S2: Planning to Support Varied Student Learning Needs. How does the candidate use knowledge of his/her students to target support for students' literacy learning?
S3: Using Knowledge of Students to Inform Teaching and Learning. How does the candidate use knowledge of his/her students to justify instructional plans?
S4: Identifying and Supporting Language Demands. How does the candidate identify and support language demands associated with a key literacy learning task?
S5: Planning Assessments to Monitor and Support Student Learning. How are the formal and informal assessments selected or designed to monitor students' use of the essential strategy and requisite skills to comprehend or compose text?

Instructional Rubrics
S6: Learning Environment. How does the candidate demonstrate a positive literacy learning environment that supports students' engagement in learning?
S7: Engaging Students in Learning. How does the candidate actively engage students in integrating strategies and skills to comprehend or compose text?
S8: Deepening Student Learning. How does the candidate elicit student responses to promote thinking and develop literacy skills and the essential strategy to comprehend and/or compose text?
S9: Subject-Specific Pedagogy (Elementary Literacy). How does the candidate support students to apply the essential literacy strategy?
S10: Analyzing Teaching Effectiveness. How does the candidate use evidence to evaluate and change teaching practice to meet students' varied learning needs?

Assessment Rubrics
S11: Analysis of Student Learning. How does the candidate analyze evidence of student learning?
S12: Providing Feedback to Guide Further Learning. What type of feedback does the candidate provide to focus students?
S13: Student Use of Feedback. How does the candidate provide opportunities for focus students to use the feedback to guide their further learning?
S14: Analyzing Students' Language Use and Literacy Learning. How does the candidate analyze students' use of language to develop content understanding?
S15: Using Assessment to Inform Instruction. How does the candidate use the analysis of what students know and are able to do to plan next steps in instruction?

Spring 2014 edTPA Implementation Summary
The Unit implemented edTPA for all graduating teacher candidates in Spring
2014. Two hundred twenty (220) candidates were required to complete an edTPA
portfolio to be eligible for program completion. Each candidate's edTPA portfolio was
assigned to a rater and scored according to the three-point Local Rubric provided by
SCALE. Each rater was required to complete a day-long rubric training in order to be
qualified to review. Most spring reviewers were cooperating teachers or university
supervisors.
Rubric score analysis is currently ongoing. Preliminary results indicate that
spring completers are outperforming fall completers on all rubrics. It is important to note
that the spring semester saw a large increase in the number of portfolios submitted.
Upon completion of the data analysis, each program will be provided with a
detailed summary of its candidate rubric scores. Included in the report are the percent of
scores at each level of each rubric as well as which specific "look for" items were
included or omitted from the portfolio. The Unit compiles and includes this information in
the program reports to aid program improvement. Attached are the program rubric
summaries and a sample of a program report for the spring 2014 administration.
Spring 2014 Program Summary of Pass Rates by Rubric
[The per-rubric, per-level breakdown of this table (rubrics S1-S15 for each program) was flattened in extraction and is not reliably recoverable. The totals row (overall pass rate and candidates scored per program) is recovered below by positional alignment; the Sec Sci column carries only a stray 0 with no candidate count in the source.]

Program         Overall Pass Rate   Candidates Scored
Ag Ed           85.3%               14
Bus Ed          82.7%                5
ESL             75.3%               15
MAT: Elem Ed    85.4%               47
Math            79.3%               16
Mid LA          84.6%               13
Mid Sci         82.7%               23
Mid SS          82.0%               20
Sec Eng         80.1%               31
Sec Sci         0                   (none reported)
Sec SS          90.7%               19
Spec Ed         82.7%               10
Tech and Eng    80.5%               14
UG Elem         82.3%               55
World Lang      88.4%                6
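Each program's overall pass rate in the totals row appears to be the simple mean of its fifteen rubric-level passing rates (S1-S15). A minimal sketch of that arithmetic, assuming this interpretation, using the Ag Ed column as best it can be re-aligned from the flattened table (the S12 value, 86, is back-solved from the reported mean; the other fourteen are read from the column):

```python
# Ag Ed rubric-level passing rates S1-S15, in percent.
# S12 (86) is back-solved from the reported overall mean of 85.333333;
# the other fourteen values are read directly from the Ag Ed column.
ag_ed_rates = [93, 93, 85, 92, 79, 100, 86, 93, 86, 79, 72, 86, 79, 72, 85]

# Overall program pass rate = unweighted mean across the 15 rubrics.
overall = sum(ag_ed_rates) / len(ag_ed_rates)
print(f"{overall:.6f}")  # 85.333333, matching the totals row for Ag Ed
```

The six repeating decimals in the source (85.333333, 82.666667, and so on) are consistent with an unweighted mean over 15 rubrics rather than a candidate-weighted rate.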
For each rubric below, the summary reports the passing rate across all scored candidates, the count of candidates at each performance level, the criteria candidates' work included at each score level, and the criteria not included from the score level above. [The two-column source layout interleaved the per-criterion percentages with the criteria text; the percentages could not be reliably re-associated and are omitted here.]

Planning Rubrics

S1: Planning for Literacy Learning - How do the candidate's plans build students' literacy skills and an essential strategy for comprehending or composing text?
Passing rate: 90.00%
Emerging Performance (n = 5). Level 1 included: teacher directed; focused on practice of skills/facts/procedures/conventions; limited classroom student opportunities to develop subject-specific understandings; consistent content errors; not aligned with learning outcomes. Level 1 did NOT include (from Level 2): built on skills/facts/procedures and subject-specific understandings; aligned with learning outcomes.
Proficient Performance (n = 13). Level 2 included: built on skills/facts/procedures and subject-specific understandings (but may be unbalanced); aligned with learning outcomes. Level 2 did NOT include (from Level 3): sequenced in a learning progression across lessons; built on skills/facts/procedures/conventions AND deep subject-specific understandings across all lessons; supportive of classroom students' understanding of the relationship between skills/facts/procedures/conventions and subject-specific understandings.
Advanced Performance (n = 30).

S2: Planning to Support Varied Student Learning Needs - How does the candidate use knowledge of his/her students to target support for students' literacy learning?
Passing rate: 78.00%
Emerging Performance (n = 10). Level 1 included: supports superficially aligned with learning outcomes; supports limited or missing; did not address IEP/504 requirements. Level 1 did NOT include (from Level 2): supports aligned with learning outcomes.
Proficient Performance (n = 15). Level 2 included: aligned with learning outcomes; appropriate for the needs of the whole class; addressing IEP/504 requirements. Level 2 did NOT include (from Level 3): designed to scaffold learning for a variety of students; identifying and responsive to potential misconceptions or partial understandings.
Advanced Performance (n = 21).

S3: Using Knowledge of Students to Inform Teaching and Learning - How does the candidate use knowledge of his/her students to justify instructional plans?
Passing rate: 89.00%
Emerging Performance (n = 5). Level 1 included: superficial descriptions of classroom students' prior learning; superficial descriptions of classroom students' lived experiences; pervasive negative portrayals of students' backgrounds, educational experiences, or family/community characteristics. Level 1 did NOT include (from Level 2): concrete and specific connections between tasks and prior learning; at least surface-level discussion of theory or research.
Proficient Performance (n = 20). Level 2 included: concrete and specific connections between tasks and prior learning; at least surface-level discussion of theory or research. Level 2 did NOT include (from Level 3): concrete, specific connections between tasks and prior learning; grounded discussion of theory or research.
Advanced Performance (n = 21).

S4: Identifying and Supporting Language Demands - How does the candidate identify and support language demands associated with a key literacy learning task?
Passing rate: 89.00%
Emerging Performance (n = 5). Level 1 included: vocabulary was the only demand identified; a mismatch between language demands and language function; a mismatch between language demands and language supports; a mismatch between language demands and the learning tasks; supports that were not included or focused only on vocabulary. Level 1 did NOT include (from Level 2): language demands including function, vocabulary AND discourse/syntax; supports that generally addressed vocabulary and one other identified demand (function, discourse, or syntax).
Proficient Performance (n = 21). Level 2 included: language demands including function, vocabulary AND discourse/syntax; supports that generally addressed vocabulary and one other identified demand (function, discourse, or syntax). Level 2 did NOT include (from Level 3): supports strategically designed to address all language demands for students with varying characteristics and language needs.
Advanced Performance (n = 20).

S5: Planning Assessments to Monitor and Support Student Learning - How are the formal and informal assessments selected or designed to monitor students' use of the essential strategy and requisite skills to comprehend or compose text?
Passing rate: 85%
Emerging Performance (n = 7). Level 1 included: a majority of assessments provided minimal evidence of subject-specific understandings; a majority of assessments were not aligned with the full scope of subject-specific outcomes; IEP/504 requirements for adaptations/modifications were not addressed. Level 1 did NOT include (from Level 2): the majority of assessments providing evidence of subject-specific understandings; IEP/504 requirements for adaptations/modifications addressed.
Proficient Performance (n = 18). Level 2 included: the majority of assessments providing evidence of subject-specific understandings; IEP/504 requirements for adaptations/modifications addressed. Level 2 did NOT include (from Level 3): assessments that provided evidence of the full range of subject-specific understandings; assessments used in each lesson; assessments differentiated so that classroom students showed understandings in various ways.
Advanced Performance (n = 21).

Instructional Rubrics

S6: Learning Environment - How does the candidate demonstrate a positive literacy learning environment that supports students' engagement in learning?
Passing rate: 84.00%
Emerging Performance (n = 7). Level 1 included: disrespectful interactions; disruptive behaviors; a controlling or directive environment; minimal support for learning goals. Level 1 did NOT include (from Level 2): rapport; mutual respect; low risk.
Proficient Performance (n = 19). Level 2 included: rapport; mutual respect; low risk. Level 2 did NOT include (from Level 3): challenge; perspective.
Advanced Performance (n = 20).

S7: Engaging Students in Learning - How does the candidate actively engage students in integrating strategies and skills to comprehend or compose text?
Passing rate: 96.00%
Emerging Performance (n = 2). Level 1 included: links to prior learning or lived experiences were limited; classroom students were confused by links to content; loose connections between tasks and the central focus; tasks that focused on low-level content.
Proficient Performance (n = 22).
Advanced Performance (n = 22).

S8: Deepening Student Learning - How does the candidate elicit student responses to promote thinking and develop literacy skills and the essential strategy to comprehend and/or compose text?
Passing rate: 85%
Emerging Performance (n = 7). Level 1 included: surface-level questions; candidate talk; consistent or egregious content inaccuracies. Level 1 did NOT include (from Level 2): questions prompting some higher-order thinking related to subject-specific understandings.
Proficient Performance (n = 16). Level 2 included: questions prompting some higher-order thinking related to subject-specific understandings. Level 2 did NOT include (from Level 3): questions built on classroom student thinking about subject-specific understandings; interactions among students.
Advanced Performance (n = 23).

S9: Subject-Specific Pedagogy, Elementary Literacy - How does the candidate support students to apply the essential literacy strategy?
Passing rate: 93.00%
Emerging Performance (n = 3). Level 1 included: no attempt to teach classroom students how to use the key strategy to support comprehension or composition; a clear mismatch between or among strategies, skills, and students' readiness to learn; significant content inaccuracies within the material that would lead to student misunderstandings; modeling of the key strategy or skills without opportunities for classroom students to practice or apply them. Level 1 did NOT include (from Level 2): modeling the key strategy with limited opportunities for practice.
Proficient Performance (n = 13). Level 2 included: modeling the key strategy with limited opportunities for practice; explicitly teaching classroom students how to apply the strategy and providing opportunities for guided practice. Level 2 did NOT include (from Level 3): explicitly teaching classroom students when to apply the strategy in meaningful contexts.
Advanced Performance (n = 30).

S10: Analyzing Teaching Effectiveness - How does the candidate use evidence to evaluate and change teaching practice to meet students' varied learning needs?
Passing rate: 89.00%
Emerging Performance (n = 5). Level 1 included: proposed changes that addressed the candidate's own behavior without reference to student learning; proposed changes that suggested "more practice" or time to work on similar or identical tasks without revision; proposed changes that addressed problems with classroom student behavior and how to "fix" it. Level 1 did NOT include (from Level 2): proposed changes that re-engaged students in a new, revised, or additional task; proposed changes that included surface-level discussion of research or theory.
Proficient Performance (n = 24). Level 2 included: proposed changes that addressed gaps in whole-class learning/understanding; proposed changes that re-engaged students in a new, revised, or additional task; proposed changes that included surface-level discussion of research or theory. Level 2 did NOT include (from Level 3): proposed changes that were concrete, specific, and elaborated; proposed changes that addressed gaps in student learning for different students in different ways; proposed changes grounded in principles from theory or research.
Advanced Performance (n = 17).

Assessment Rubrics

S11: Analysis of Student Learning - How does the candidate analyze evidence of student learning?
Passing rate: 88.00%
Emerging Performance (n = 5). Level 1 included: listing correct OR incorrect answers (but not both); claims that were unsupported by work samples; no alignment between assessment and objectives. Level 1 did NOT include (from Level 2): listing some areas where the whole class excelled or struggled.
Proficient Performance (n = 10). Level 2 included: listing both correct AND incorrect answers; listing some areas where the whole class excelled or struggled. Level 2 did NOT include (from Level 3): descriptions of classroom students' understandings and struggles, citing evidence; learning trends related to individual or group understandings/misunderstandings.
Advanced Performance (n = 28).

S12: Providing Feedback to Guide Further Learning - What type of feedback does the candidate provide to focus students?
Passing rate: 86.00%
Emerging Performance (n = 6). Level 1 included: listing correct OR incorrect answers; claims that were unsupported by work samples; no alignment between assessment and objectives. Level 1 did NOT include (from Level 2): listing some areas where the whole class excelled or struggled.
Proficient Performance (n = 21). Level 2 included: listing correct AND incorrect answers; listing some areas where the whole class excelled or struggled. Level 2 did NOT include (from Level 3): balanced, specific feedback on strengths AND weaknesses; guiding student self-evaluation of strengths and weaknesses.
Advanced Performance (n = 16).

S13: Student Use of Feedback - How does the candidate provide opportunities for focus students to use the feedback to guide their further learning?
Passing rate: 54.00%
Emerging Performance (n = 10). Level 1 included: generic discussion of the use of feedback; no discussion of the use of feedback; no feedback given on samples. Level 1 did NOT include (from Level 2): explicit discussion of how classroom students use feedback to improve work.
Proficient Performance (n = 4). Level 2 included: explicit discussion of how classroom students use feedback to improve work. Level 2 did NOT include (from Level 3): discussion of support for student use of feedback; feedback that led to deeper understandings of current or future work.
Advanced Performance (n = 3).

S14: Analyzing Students' Language Use and Literacy Learning - How does the candidate analyze students' use of language to develop content understanding?
Passing rate: 82.00%
Emerging Performance (n = 8). Level 1 included: listing only vocabulary use; listing language use that was not connected to identified vocabulary or other demands. Level 1 did NOT include (from Level 2): listing and explaining students' use of vocabulary and related function.
Proficient Performance (n = 14). Level 2 included: listing and explaining students' use of vocabulary and related function; listing and explaining students' use of discourse or syntax. Level 2 did NOT include (from Level 3): listing and explaining vocabulary, function, and syntax or discourse used by the whole class OR students with varied needs; language use that clearly supported content understandings.
Advanced Performance (n = 21).

S15: Using Assessment to Inform Instruction - How does the candidate use the analysis of what students know and are able to do to plan next steps in instruction?
Passing rate: 93.00%
Emerging Performance (n = 3). Level 1 included: next steps that did not make sense; next steps that were not aligned to learning objectives; next steps that presented vague information. Level 1 did NOT include (from Level 2): next steps that generally attended to whole-class needs in relation to content.
Proficient Performance (n = 24). Level 2 included: next steps that generally attended to whole-class needs in relation to content; discussions of research/theory that were at least surface level. Level 2 did NOT include (from Level 3): strategic support for individuals AND groups related to subject-specific knowledge; next steps grounded in research/theory.
Advanced Performance (n = 16).