NVivo in Education Research: Two Examples from New Mexico
April 16, 2015 (Webinar for QSR)
Summary Sheet Prepared by Scott D. Hughes, PhD
UNM Center for Education Policy Research

Introduction

The Co-Teaching Collaborative Schools Initiative
- A model for teacher training based on a mentorship relationship between an experienced teacher and a student teacher
- Useful sources for information on CTCS:
  o A New Student Teaching Model for Pairing Interns with Clinical Teachers
    http://www.edutopia.org/blog/co-teaching-internship-model-teresa-heck
  o Co-Teaching Collaborative Schools (CTCS)
    https://coe.unm.edu/departments-programs/teelp/co-teaching-collaborative-schools/index.html

The CNM Faculty Evaluation Survey
- Administered to both full- and part-time faculty at Central New Mexico Community College, known as CNM, during the spring of 2014
- Website for CNM:
  o Central New Mexico Community College
    https://www.cnm.edu/

The CEPR was contracted to conduct evaluations of both.

The Co-Teaching Collaborative Schools Initiative (CTCS)

What it is:
- Initially developed at St. Cloud State University in Minnesota.
- Depends heavily on a mentorship arrangement whereby a student teacher (teacher candidate) works with:
  o A cooperating teacher, who provides guidance in:
    - skill development,
    - co-teaching strategies, and
    - enculturation into the school community.
  o An embedded faculty member, who provides:
    - a presence at the school (for the UNM COE),
    - help to mitigate any issues or conflicts that arise in relation to the program operations, and
    - additional support for the teacher candidates.
- During the third semester, the teacher candidate is:
  o present in the collaborative classroom 5 days a week, and
  o provided a cumulative total of 900 clinical hours of pre-service experience.
Data Sources:
- For this project, there were a number of data sources, including:
  o student-level quantitative data collected from the district;
  o weekly monitoring data submitted by each of the teacher candidates; and
  o guided discussion group transcripts.

How collected:
- Teacher candidates are required to:
  o complete a series of weekly monitoring surveys that tally data on the teaching strategies used in the CTCS approach (completed via SurveyMonkey), and
  o meet regularly with their mentor in what are termed paired team interviews.
- Cooperating teachers and administrators meet in separate guided group discussions, termed focus groups, by school (N=3):
  o Total of 9 cooperating teacher and 3 administration groups
  o Facilitated by a member of the COE coordinating team
  o Responded to a series of questions:
    - administrator groups were asked 7 questions, and
    - cooperating teachers were asked 9 questions.
  o Audio recorded and later transcribed.

Analysis:
- CEPR formed a small team of two senior researchers and two graduate students.
- One senior member was assigned to analyze the quantitative data:
  o weekly teacher candidate surveys;
  o student assessment data collected via the district and state; and
  o used Stata and Excel.
- Another senior member conducted the qualitative analysis:
  o Transcripts from the guided discussions:
    - 9 cooperating teacher discussion group transcripts
    - 3 collaborative school administrator group transcripts
  o Used NVivo.

Challenges:
- The biggest challenge was the steep initial learning curve when first learning NVivo.
  o I had previous experience with QSR's NUD*IST software (late 1990s), but that skill set had faded.
  o Addressed the training need by:
    - utilizing web-based training videos, and
    - reading "Qualitative Data Analysis with NVivo" by Patricia Bazeley and Kristi Jackson.
  o Time commitment of roughly a week (on and off) of viewing the videos, reading the book, and getting my hands dirty with the software to develop user competence.
- A second challenge worth discussing surrounds the transcripts:
  o The discussion groups were held in school buildings, and the proceedings were sometimes interrupted by schedule bells that at times drowned out the comments.
- A final challenge also relates to the transcripts and the discussion group process:
  o At times some participants would talk across each other, which resulted in choppy transcript copy (i.e., difficulty verifying which part of the transcript belonged to which participant).
  o Issue addressed by:
    - referring back to the audiotape;
    - "over-highlighting" text to be dropped into a node; and
    - stitching the separate pieces back into a cohesive flow for use in the report narrative.

(I highly recommend the Bazeley and Jackson book mentioned above as a general reference and user guide to the software.)

CNM Faculty Evaluation Survey

What it is:
- CNM:
  o is the acronym for Central New Mexico Community College, located in Albuquerque, NM;
  o provides several different training and certificate programs, as well as awarding associate's degrees in various disciplines; and
  o is the largest IHE (institution of higher education) in New Mexico.

The Survey:
- Conducted in spring 2014 by the school's faculty senate to collect input on the faculty evaluation system.
- Comprised of:
  o 6 questions for full-time faculty, and
  o 5 questions for part-time faculty.
- Structure of questions:
  o The first question in each survey requested rank ordering of items, and
  o the remaining questions requested written responses.
- Administered via SurveyMonkey.
- Faculty (spring 2014 counts): 320 full-time and 730 part-time faculty.
- Solicited via email messages.
- Response rates: 96 full-time faculty, or roughly 30%; and 188 part-time faculty, roughly 25.8%.
- The CNM Faculty Senate contracted with CEPR in late August 2014 to evaluate the results.
- NVivo was used as the evaluation software.

Data Sources:
- The CNM Faculty Senate administered separate surveys to each of the full-time and part-time faculty groups.
- Provided in Excel file format:
  o Data from the first question were provided in numeric form, and
  o data from the remaining questions were in typewritten form.
- The number of responses varied by question.

Challenges:
- The biggest challenge faced was one of time, in terms of billable hours:
  o unfamiliarity with the application of NVivo in the text environment being confronted;
  o inability to use the autocoding function of the software; and
  o an underestimate of the project's time commitment by 50%.
- A second challenge was tied to the variance in the responses made by the faculty to each question:
  o On one hand, a high enough degree of consistency in responses allowed for coding across a handful of nodes;
  o on the other, a sufficient number of faculty responded in ways that resulted in the creation of one- or two-response nodes.
  o Solution followed:
    - A three-item threshold was established for a response node to be included in the graphs the client requested be created per question.
    - Responses that did not meet this threshold were recoded to a general or miscellaneous response node.
    - The recoding process is described in both the online training videos and Bazeley and Jackson's "Qualitative Data Analysis with NVivo," mentioned earlier.

Closing Comments

As illustrated in this presentation, NVivo:
- provides an effective software tool for application in the field of education research, and
- possesses a fairly steep initial learning curve.

The software has the capacity for providing an effective means of:
- coding and organizing large volumes of data quickly and efficiently, and
- facilitating report preparation.

Future Applications of NVivo at CEPR:
- A second round of analyses on the CTCS project later this spring, and
- another evaluation project with a group of early childhood educators that includes a set of interviews.
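As a brief aside, the three-item threshold recoding described in the CNM analysis was performed inside NVivo, but the logic itself is simple enough to sketch in a few lines of general-purpose code. The node labels and data below are hypothetical and only illustrate the rule: any node coded fewer than three times is collapsed into a general "Miscellaneous" node.

```python
from collections import Counter

def recode_with_threshold(coded_responses, threshold=3, misc_label="Miscellaneous"):
    """Collapse any response node coded fewer than `threshold` times
    into a single general/miscellaneous node."""
    counts = Counter(coded_responses)
    return [
        node if counts[node] >= threshold else misc_label
        for node in coded_responses
    ]

# Hypothetical node assignments for responses to one survey question:
nodes = ["Clarity", "Clarity", "Clarity", "Fairness", "Fairness",
         "Fairness", "Timing", "Workload"]
recoded = recode_with_threshold(nodes)
# "Timing" and "Workload" each appear only once, so both are
# recoded to "Miscellaneous"; the other nodes are kept as-is.
```

This is only a sketch of the thresholding rule; in practice the recoding was done by moving coded references between nodes within NVivo itself.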
I want to close by emphasizing the successful outcomes realized in these projects: the production of reports that have helped inform important policy discussions on how best to educate student teachers in New Mexico, and on how to build an effective faculty evaluation system at one of the key institutions of higher education in the state.

I have provided my email on the last page if you are interested in contacting me. If any of you are interested in a summary sheet with the major points made in this presentation, I have included the website for CEPR and the title of the document, which is available under the "CEPR Publications & Presentations" tab.