Project Document Cover Sheet

Project Information

Project Acronym: MAC
Project Title: Making Assessment Count
Start Date: 1 November 2008
End Date: 31 October 2010
Lead Institution: University of Westminster
Project Director: Professor Gunter Saunders, [email protected]
Project Manager & contact details: Ms Yanna Nedelcheva, [email protected], 02079115803
Partner Institutions: None
Project Web URL: https://sites.google.com/a/staff.westminster.ac.uk/mac/
Programme Name (and number): Curriculum Delivery
Programme Manager: Lisa Gray

Document Name

Document Title: Interim Report
Reporting Period (for interim progress reports only): September 09 – February 10
Author(s) & project role: Professor Gunter Saunders, Project Director
Date: February 2010
URL (if document is posted on project web site):
Access: [ ] Project and JISC internal   [X] General dissemination

Document History

Version | Date | Comments
Filename: Mac_interim_March10_V4
Transforming Curriculum Delivery through Technology Programme
Interim Reporting Template
Project Name: Making Assessment Count, University of Westminster
Report compiled by: Professor Gunter Saunders
With contributions from: Ms Yanna Nedelcheva, Dr. Mark Kerrigan, Dr. Mark Clements
Reporting period: September 09 – February 10
Section One: Summary
Project Overview
Making Assessment Count (MAC), which completes in autumn 2010, was piloted during
2008-2009 in the School of Biosciences in conjunction with the university’s Learning
Teaching & Development Centre. At the core of the project is a feedback model that engages
students in deeper reflection on the comments they receive on coursework assignments.
There are emerging plans to adopt the process or aspects of the process in other academic
areas at Westminster, currently including Cognitive Science, Integrated Health and
Languages as the project moves into its third and final stage.
Key Developments in current reporting period
1. Creation of a scalable, corporate-level process for MAC. The new system integrates
with student records (SRS), facilitating the generation of reports for students and
ensuring that students receive their reports within minutes of completing the online
questionnaire part of the process.
2. The linkage with SRS makes it possible to use the online questionnaire part of MAC
with any student and any item of coursework with a record on SRS.
3. The new MAC system has been provided for use by the 2009/10 undergraduate
cohort in Biosciences (~400 students) and all personal tutors (35 staff). To date
311 students have used the system, collectively generating over 1000 MAC reports.
Deliverables/Outputs
1. Functioning and scalable Making Assessment Count process linked to student
records
2. Revised Student Guide and support videos for using e-Reflect
3. E-Reflect Handbook for Staff
4. Conference papers and presentations (see links in section 3 below)
5. Updated project website
6. Object explaining the MAC process created for the Design Studio
7. Student-developed website on MAC
Key Achievements
1. Successful (as determined from initial evaluation) roll-out of MAC across the
undergraduate provision in Biosciences (now part of the School of Life Sciences)
2. Significant participation from staff in the School during ‘roll out’ of MAC
3. Integration of ‘know-how’ from the TWOLER JISC project into the development of
MAC with significant support from central computing services
4. Presentations given at numerous UK e-learning and staff development events,
helping to raise the profile of the project.
Section Two: Activities and Progress
The main operational objectives for the reporting period were to:
1. Complete the re-development of MAC (version 2.0) in light of feedback from the pilot
and with the input of central computer services and the TWOLER project team (a
JISC project in the Institutional Innovation programme investigating the development
and use of Google apps by students).
2. ‘Roll-out’ MAC across the undergraduate provision in the Biosciences
3. Conduct early, initial evaluation of MAC version 2.0
4. Continue dissemination activities both internally and externally
The project team worked with central computing services over the summer to re-develop
MAC, creating MAC 2.0. This new version sought to take into account the key suggestions
for improvement made by students and staff after the pilot, as well as developing a more
automated process that could easily be scaled across the institution. Thus the new version
of MAC is fully integrated with student records, and technological approaches derived from
TWOLER enable the generation and return of reports to students within minutes of
questionnaire completion.
The integration with student records makes it very easy for any coursework in any subject
area across the university to make use of MAC (on the assumption that the same
questionnaire used for the Biosciences provision is used). Should a subject area require or
desire a different question set then some work would be required in the SQL reporting part of
the process. As indicated above, the development of MAC 2.0 has made use of know-how
and ideas generated through the JISC-funded TWOLER project. A diagram of the
questionnaire-processing part of MAC 2.0 is shown in Figure 1.
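To make the flow shown in Figure 1 concrete, the sketch below walks through the questionnaire-processing step under stated assumptions: a relational store holding questionnaire and SRS-derived data, and an SMTP relay for report delivery. All table, column and function names (and the mail addresses) are hypothetical illustrations, not the production schema.

```python
# Illustrative sketch only: matching a completed MAC questionnaire to a
# student record and emailing the report back. Table, column and helper
# names are hypothetical; the real system queries the university's SRS.
import sqlite3
import smtplib
from email.message import EmailMessage

def process_submission(db_path, student_id, module_code, answers):
    """Look up the student's SRS-derived record, build the feedback
    report, and email it within minutes of questionnaire completion."""
    conn = sqlite3.connect(db_path)
    row = conn.execute(
        """SELECT s.email, s.forename, c.coursework_title, c.mark
           FROM students s
           JOIN coursework c ON c.student_id = s.student_id
           WHERE s.student_id = ? AND c.module_code = ?""",
        (student_id, module_code),
    ).fetchone()
    conn.close()
    if row is None:
        return False  # no matching record: hold for manual checking
    email, forename, title, mark = row
    body = (f"Dear {forename},\n\nYour MAC report for '{title}' "
            f"(mark: {mark}):\n"
            + "\n".join(f"- {q}: {a}" for q, a in answers.items()))
    msg = EmailMessage()
    msg["Subject"] = f"Your MAC report for {title}"
    msg["From"] = "[email protected]"  # hypothetical sender address
    msg["To"] = email
    msg.set_content(body)
    with smtplib.SMTP("localhost") as smtp:  # hypothetical mail relay
        smtp.send_message(msg)
    return True
```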
In September 2009 the project team provided training to staff on the operation of MAC 2.0
and from early November the system went ‘live’ to all new undergraduate students in
Biosciences who were inducted into the process through the level 4 core module Human
Physiology and Anatomy. Prior to ‘going live’ the process was promoted to new students on
a weekly basis in order to prepare them for the return of their first piece of marked
coursework.
The first student coursework was returned in late November, and well over 100 students
completed a questionnaire around this coursework within the first few days. Between the end
of November and the end of the calendar year a further three courseworks were returned to
students, and by early January 311 students had between them completed over 1000
questionnaires, leading to the generation of an equivalent number of reports emailed to
students individually. Initially there were unexpected problems with the automated emailing
of reports to students and for the first item of coursework manual intervention was needed to
trigger the emailed report to be sent. However, this was subsequently ‘fixed’ for the second
coursework return and currently students are, as was planned, receiving their reports within
minutes of submitting a completed questionnaire. One item of coursework, on the Human
Physiology and Anatomy module, included an assessed reflection on the MAC report
derived from the coursework (the coursework was a laboratory report worth 25% of the
module marks with 5% of that attributable to the quality of the student reflection in their
learning journal). A surprisingly high number of students (120 out of 364) did not for some
reason complete an entry in their learning journal for that particular coursework. Of the 311
students who have completed questionnaires, a total of 232 went on to complete reflective
learning journal entries based around the MAC report derived from the questionnaire. As
part of the evaluation to be done, the reason why such a high number did not complete an
e-reflection potentially worth some marks will be investigated. As part of this deeper evaluation
we will determine the range and spread of e-reflective cycles completed by individual
students, in order to find out why some students have completed more e-Reflect reports than others.
Staff participation has been, and continues to be, a key issue (see the preliminary evaluation
described below). So far 24 staff members have made comments on their tutees’ learning
journals, leaving 11 who have not. At this stage of the process we have no easy way to tell
how many staff may have given face-to-face feedback to their tutees on their learning journal
entries at tutorial meetings. However, two staff who have not commented on student blogs
have told the project team that they have preferred to read a tutee’s blog and then give
feedback face-to-face at a tutorial meeting. Whilst not an expected use of the tool, this process
does still meet the main objectives of the MAC philosophy.
Early in January we conducted a preliminary evaluation of the operation of MAC 2.0 by
questionnaire to students and themes emerging from the responses obtained are presented
in section 4 below. Staff feedback has been, at this stage, only via informal one-to-one or
small-group meetings. In fact feedback from some staff came immediately after students
started receiving their first MAC 2.0 reports and led to amendments being made to some of
the automated comments generated on reports. The project team has continued to present
on the project at a range of external and internal events and these are detailed in section 6,
Communication and Dissemination.
[Figure 1: diagram of the questionnaire-processing part of MAC 2.0]
Section Three: Outputs and Deliverables
Expected Outputs from the Project
For the previous interim report we produced a table of expected types of output; this has
been updated (see attached) to indicate where outputs have already been achieved and to
include any new expected outputs.
Expected technical outputs/system integrations
MAC version 1.0 used existing technology (blogs within Blackboard, Google forms). The
diagnostic questionnaire that students complete after each coursework in MAC version 1.0
was processed using Excel macros. In MAC version 2.0 the questionnaire is linked to an
SQL database and student data from the student records system is integrated with that
database using SQL queries, enabling more efficient questionnaire processing and report
generation. A diagram detailing the way in which data from the questionnaire is matched to a
student record leading to generation and email of a report to the student is shown in figure 1.
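As a rough indication of what the report-generation step can involve, the sketch below shows rules of the kind that might turn questionnaire answers into the automated comments that appear on a report (section 4 mentions, for example, advice on hours spent). The question keys, thresholds and wording here are invented for illustration, not the actual MAC 2.0 rules.

```python
# Hypothetical sketch of turning questionnaire answers into the automated
# comments printed on a MAC report. Question keys, thresholds and wording
# are invented; the actual rules used by MAC 2.0 differ.
RECOMMENDED_HOURS = 10  # assumed effort for a typical coursework

def automated_comments(answers):
    comments = []
    hours = answers.get("hours_spent", 0)
    if hours < RECOMMENDED_HOURS:
        comments.append(
            f"You reported {hours} hours on this coursework; around "
            f"{RECOMMENDED_HOURS} hours would normally be expected.")
    # Answers below are on an assumed 1-5 agreement scale (low = disagree).
    if answers.get("understood_feedback", 5) <= 2:
        comments.append("Discuss the written feedback with your personal tutor.")
    if answers.get("used_past_feedback", 5) <= 2:
        comments.append("Re-read feedback on earlier work before your next submission.")
    return comments

# Example: a student who spent little time and found the feedback unclear.
print(automated_comments({"hours_spent": 4, "understood_feedback": 2,
                          "used_past_feedback": 4}))
```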
Outputs produced in this reporting period
• Functioning MAC version 2.0 process with enhanced questionnaire processing,
integrated with student records and scalable across academic schools
• Student and staff guides to using MAC version 2.0
• Design studio object detailing the MAC version 2 process
• Four conference papers/presentations (at ALT-C 2009; the ePortfolio 4YF Conference,
September 2009, organised by the EC’s Leonardo da Vinci Programme
(http://scas.acad.bg/epconference/); the Ulster E-learning Conference; and the JISC
Experts Group)
An unexpected output at this stage is the ‘idea’ to make use of the data collected via student
questionnaires to generate reports for module and course leaders, which would amount
almost to ‘real time’ feedback on the assessments provided for students. These reports
would essentially provide an opportunity for course teams to act on feedback from students
about assessments within the academic year in which the student feedback was received.
The project team are due to submit (by April 1) a second-stage bid for funding to the
Biosciences subject centre to take this aspect further.
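To make the idea concrete, such a module-leader report could be produced by aggregating questionnaire responses per item of coursework; the sketch below shows one possible query over an invented schema (the questionnaire_responses table and its columns are assumptions, not the actual MAC database).

```python
# Hypothetical sketch of the proposed module/course-leader report:
# aggregate questionnaire responses per coursework so course teams can
# act on student feedback within the same academic year.
import sqlite3

def module_summary(db_path, module_code):
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        """SELECT coursework_title,
                  COUNT(*)                 AS responses,
                  AVG(hours_spent)         AS avg_hours_spent,
                  AVG(understood_feedback) AS avg_feedback_clarity
           FROM questionnaire_responses
           WHERE module_code = ?
           GROUP BY coursework_title
           ORDER BY coursework_title""",
        (module_code,),
    ).fetchall()
    conn.close()
    return rows
```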
Section Four: Evaluation
The project team has largely kept up with the detailed evaluation plan but the bulk of this
reporting period has been focussed on developing version 2.0 of the MAC system and on
implementing the system across the undergraduate degree in Biosciences. However, we
have collected baseline data from the 09/10 undergraduate Biosciences intake via
questionnaire on their previous experiences (pre-university) of assessment and feedback.
103 questionnaires were completed and preliminary analysis suggests that general
student perception of assessment and feedback is similar between pre-university and
university. Interestingly, students recognised that less assessment guidance is provided at
university, and they do not have the opportunity to submit drafts of work prior to the
assessment deadline. In addition we were able to get 40 students to respond to an online
questionnaire in early January to provide some initial feel for students’ perceptions of the
MAC 2.0 process. Out of the 40 students 4 gave an overall negative view of MAC with
specific comments including:
– Time consuming
– Complete waste of time
– Report revealed nothing new
– Didn’t understand the report
The remainder all gave positive responses to the questions posed on the evaluation
questionnaire and as a group clearly could see the value of MAC. Themes that emerged
from this group included a clear understanding of the potential link between MAC and
improvement. Specific comments that appeared commonly included high praise for the value
of certain components of the report such as the graphical representation of performance
over time and the advice on how many hours should have been spent on completing the
coursework. However, students from both groups highlighted the importance of tutors
responding in a timely fashion to their reflective learning journal entries.
Our next major evaluation task as detailed in the evaluation plan (April – May 2010) will be to
answer the question: How did the wider implementation of MAC in Biosciences go?
What other Schools have taken up the idea? What has been achieved in Biosciences
and in other Schools?
The project team have drawn up the following outline implementation plan for the next phase
of the evaluation:
Task: Baseline data on new students
Methodology: Questionnaire to determine previous experience of feedback
Lead and Notes: MK. Questionnaire done – analysis to be completed and written up.

Task: Collect student data on use of MAC in current period
Methodology: Preliminary questionnaire already completed online; additional questionnaire to be done – use the same questionnaire as we used for the pilot? Will this be done in class?
By when: Now, and then again before Easter break? Is that feasible and does it make sense to do it twice? How many courseworks do they still have to take through e-Reflect?
Lead and Notes: MC. Mark has circulated questions used last time. Can we feed back to him any suggested changes/additions?

Task: Student focus groups
Methodology: Questionnaire? Follow up with individual interviews.
By when: Easter
Lead and Notes: TEAM

Task: Collect initial overall staff views of MAC; find out how staff who engaged managed their involvement (e.g. how did they manage commenting on blogs, how did they use e-Reflect in tutorial meetings?)
Methodology: Meeting with individual/groups of staff
By when: Easter
Lead and Notes: GS/YN

Task: Analysis of learning journals
Methodology: Read and summarise; categorise on thematic basis
By when: Easter
Lead and Notes: GS
Section Five: Outcomes and Lessons Learned
A significant outcome to date has been the establishment of a real difference in perceptions
between students and staff regarding the use of feedback. This was established in the pilot
phase where students said they both read and valued their tutors’ comments whilst teaching
staff felt there was little evidence that students acted on the advice given, representing a
poor return on the time and effort expended. We have still to follow up and explore this
difference with the current new cohort of students (and potentially with last year’s pilot group)
but it is interesting that a potentially significant driver behind the development of MAC (staff
perception that students do not read their feedback) may not be ‘real’ for a significant
proportion of students. However, although more students than we thought do seem to read
their feedback, evidence the project has gathered suggests that they find it difficult to
contextualise and use the feedback strategically (this came out of the student focus group).
This outcome has already made staff think about whether the feedback they give is
appropriate (particularly when a proforma is used). An unanticipated outcome therefore is
that staff have started to change the way they give feedback and now try to include a few
suggested action points to help students identify how to improve their future performance.
It seems reasonably clear from the feedback we have obtained so far that the MAC reports
can also help some students to use their feedback in a more structured way. In addition
anecdotally (at present) the project team have gained the impression that some students see
the MAC reports as something of significance and importance that they can keep and show
to others (not necessarily just their tutors). On the other hand it is clear that some students
(we currently believe a minority) see the reports as trivial and irrelevant. Although we are at
a relatively early stage of evaluation it is tempting even now to suggest that these two
extremes may derive from students who have been graded at opposite ends of the mark
range, with the poorly graded students more often not finding the process as
beneficial as we would have hoped. It could well emerge that MAC is of little benefit to
students who are performing very badly unless the process is able to lead them to
productive face-to-face meetings with tutor and/or assessor. The project team will endeavour
to collect evidence to support (or otherwise) this emerging assumption through the
evaluation to be done.
Section Six: Communication and Dissemination Activities
The project team have in the current reporting period presented at three external conferences
(ALT-C 2009; the ePortfolio 4YF Conference, September 2009, organised by the EC’s
Leonardo da Vinci Programme (http://scas.acad.bg/epconference/); and the Ulster E-learning
Conference). This builds on the three conferences in the last reporting period. We feel that the
presentations we are doing at conferences (and at events organised by the JISC and HEA
e.g. we presented at the JISC Experts Group in this reporting period) are proving most effective at
disseminating the existence, goals and initial progress with the Making Assessment Count
project. For example later in February we are presenting at a staff development conference
at Middlesex University and this has arisen directly from one of the external presentations
we have previously given (a Pro-Vice Chancellor at Middlesex actually attended a
presentation and reported back to their e-learning director). Middlesex are planning a
strategic development around e-feedback and are interested in ensuring that the principles
of Making Assessment Count are fully considered in their planning. In the same way the
delivery workshops at the University of Ulster arose via the profile raised through giving
presentations at other conferences.
We have continued to maintain the project website and a blog but have no statistics to show
the extent to which these are an effective means of ‘real’ dissemination. Our feeling is that
personal contact at conferences and events is proving to be the best method of
dissemination, with the website (and blog) serving as useful resources to which to refer
interested parties prior to any subsequent follow-up meetings.
In a similar manner the project team have continued to vigorously promote the Making
Assessment Count project internally at Westminster with presentations (for example the
project team recently delivered the first in a new series of learning & teaching seminars at
Westminster streamed live both internally and externally and also presented at the October
meeting of the institution’s Learning Teaching & Student Support Committee). In addition
shorter updates to groups of staff at meetings of School Learning Teaching & Quality
Committees have been made.
Section Seven: Issues, Challenges and Opportunities
Our aim has always been to consistently have all students and staff participating in the
project. Although to date student and staff participation have been good, the
project team has some concern that students who have a negative view of MAC (which we
believe from preliminary evaluation to be a minority) could have a disproportionate influence
on the sustainability of the approach within the Biosciences provision. The effect could be
particularly significant if indeed it is the case that the poorly performing students are in the
main the ones that have difficulty seeing the benefit of MAC. Clearly a major aim of all
Schools is to help the poorer performing students (at the same time as enhancing the
development of students who do well from the outset).
It is too early to say how we will address this: not a new risk, but a potentially unexpected
effect arising from a subset of students. The more complete evaluation of the roll-out which is planned over the
next few months, if favourable, will be an important tool to convince staff of the desirability of
sustaining the system for the majority of students. It may also be sensible in the future to
treat students with low marks slightly differently through the MAC process.
There have been no ‘big’ changes to the original project plan. However, a significant change
in the implementation of the plan has seen the nature of the technology driving the
questionnaire and report part of MAC change dramatically. At the outset of the project it had
not been envisaged that the process would become so integrated with corporate systems so
quickly. However, through the link with the TWOLER JISC project it has been possible to put
in place a quite substantial part of the technical infrastructure required for any School in the
university to, for example, make use of the information on the student record system to
facilitate the generation of MAC reports.
A further opportunity that has arisen has already been highlighted in section 3. The
possibility of making use of the data collected via student questionnaires to generate reports
for module and course leaders is an exciting one which we believe could change the way in
which student feedback on the conduct of a course from their perspective is monitored. A
proposal has currently juts passed the first filter for HEA funding.
Section Eight: Collaboration and Support
We have seen the programme manager at programme meetings (both specific to the
Delivery programme and also at a joint meeting of the Delivery and Design projects) and
other related JISC meetings (e.g. the Experts group). These face-to-face encounters have
provided ample opportunity to discuss any issues associated with the progress on and
management of the project. We have also had extensive feedback and support from the
programme manager and support team on our first interim report. We would hope that
through contributing at these meetings we would have, through exchange of ideas and
synthesis of project outputs, been able in some small way to input into discussions for
subsequent programme ideas.
We have previously asked for help with our evaluation work and had planned a meeting with
the external experts provided to us specifically to discuss ways of analysing qualitative
information. However, we have simply not had the time to take up the offer. We will in the
next few months seek to address this as we move into the final evaluation phase and will
especially be seeking some help and advice around analysis of student learning journals.
We are in the process of completing a case study around the project for a JISC publication
on e-assessment/feedback. We are working with Ros Smith and the case study will include a
video scheduled for filming on the 10th March 2010.
We like our cluster group a lot. As stated in our first interim report the cluster group, and the
critical friend, provide plenty of scope for external ‘sounding out’ in relation to our progress
and next steps. With the input of the programme manager/support team and critical
friend/cluster group we have not been short of positive encouragement, ideas and
information about what is going on of relevance elsewhere.
JISC also offer a wide range of other opportunities for us to find out more about what is
happening (e.g. the Elluminate series of seminars & face-to-face seminars). However, finding
the time for all of the opportunities made available to us is difficult and at times the number
of emails we receive inviting us to attend one event or another seems ‘overpowering’. We
endeavour to attend as many as possible and ask for resources from meetings we cannot
attend.
Section Nine: Financial Statement
Total Grant: £200,000
Duration of project: 2 years
Reporting Period: 1 September 2009 – 28 February 2010

Budget Headings          | Total budget allocated | Expenditure this reporting period | Total expenditure to date
Staff                    |                        |                                   |
Travel & Subsistence     | 15000                  | 1648                              | 4338
Equipment                | 2500                   | 129                               | 3691
Dissemination activities | 17000                  | 2476                              | 4313
Evaluation activities    | 5000                   | 0                                 | 0
Other (please specify)   | 10000                  | 0                                 | 0

Further information:
Checklist:
Before you return this report:
[ ] Ensure that your project webpage on the JISC site is up to date and contains the
correct information. Attach details of any required amendments to this report. Project
webpages can be found from: www.jisc.ac.uk/curriculumdelivery
[ ] If there have been any changes to the original project plan and/or work packages,
ensure that amended copies of the relevant sections of your project plan are
attached to this report.
[ ] Identify and name any areas within this report that you’d like removed before the
report is made public (*see below)
*Please note the interim reports will be made available on the JISC website and on the
Circle site with the budgetary information removed. We recognise that projects may
occasionally address very sensitive issues. We would like you to present as full a picture in
this report as you can as the lessons you learn are valuable to us. We assure you that any
issues you identify as confidential are removed before the report is made public. Where such
issues do represent valuable lessons for the community we will involve you in further
discussion as to how they could be passed on without identifying institutions or individuals.