exploration - Centre for the Study of Higher Education

Exploring the relationship between learning strategies and learning outcomes in inquiry-based online resources

Associate Professor Barney Dalgarno, Faculty of Education, Charles Sturt University, Wagga Wagga
Acknowledging the contribution of: Associate Professor Gregor Kennedy (The University of Melbourne) and Associate Professor Sue Bennett (University of Wollongong)

Overview
—  Theoretical support for inquiry-based and discovery learning
—  Empirical and theoretical question marks
—  This study: comparing free exploration and scaffolded observation for discovery learning
—  Initial results showing very little difference
—  Reanalysis taking exploration strategy into account, showing an advantage for systematic exploration
—  Implications for inquiry learning research
—  Implications for the use of learning analytics

Inquiry-based/discovery learning
—  Learning designs where students:
  —  explore learning resources, physical objects and spaces, virtual environments or computer-based simulations,
  —  with minimal explicit instruction, and
  —  with minimal guidance during exploration.
—  Inquiry-based learning:
  —  is a key element of constructivist theories of learning, and
  —  underpins Problem Based Learning (PBL).

Seminal criticisms of 'pure' discovery learning
—  Mayer, R. (2004). Should there be a three-strikes rule against pure discovery learning? American Psychologist, 59(1), 14-19.
—  Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41(2), 75-86.
—  Alfieri, L., Brooks, P. J., Aldrich, N. J., & Tenenbaum, H. R. (2011). Does discovery-based instruction enhance learning? Journal of Educational Psychology, 103(1), 1.

Guidance in discovery learning
—  To address the limitations of pure unguided discovery learning, guidance and scaffolding strategies including the following have been proposed (see, for example, de Jong and van Joolingen, 1998):
  —  instructional information in advance of exploration
  —  hints on exploration strategy during exploration
  —  structuring the task and/or learning resource to reduce complexity
  —  instructional feedback during exploration
  —  peer guidance (e.g. through cooperative/collaborative learning designs)

Computer-based simulations
—  De Jong and van Joolingen (1998) distinguish between conceptual and operational simulations.
—  Conceptual computer-based simulations:
  —  allow the learner to develop an understanding of concepts/constructs within a topic domain and their inter-relationships,
  —  allow the learner to observe the effect of various input variables on various output variables,
  —  have particular efficacy in scientific topic domains,
  —  are often used within a predict-observe-explain learning design (White and Gunstone, 1992).

This study
—  Explored the value of individual manipulation of input parameters, and prediction of their effect on output, while using conceptual computer-based simulations
—  Learning outcomes after exploration and manipulation were compared with outcomes from observation of output from preset simulation parameters

Experimental design
Content Area                | Condition: Observation | Condition: Exploration
Blood Alcohol Concentration | N=73                   | N=85
Global Warming              | N=85                   | N=73
Experimental design
—  Two content areas (global warming, blood alcohol concentration), considered separately in the analysis
—  Computer-based simulations developed to support two alternative learning conditions (exploration, observation) in each content area
—  Each participant (n=158) completed:
  —  a pre-test on knowledge within each content area,
  —  the exploration condition in one content area and the observation condition in the other content area, assigned at random, and
  —  a post-test on knowledge within each content area.
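The counterbalanced assignment just described (each participant randomly receives exploration in one content area and observation in the other) can be sketched as follows. `assign_conditions` and its field names are hypothetical illustrations, not the study's actual code.

```python
import random

def assign_conditions(participant_ids, seed=None):
    """Counterbalanced random assignment: each participant explores one
    content area and observes the other (a sketch, not the study's code)."""
    rng = random.Random(seed)
    assignments = {}
    for pid in participant_ids:
        # A coin flip decides which content area is explored;
        # the other content area is then observed.
        if rng.random() < 0.5:
            assignments[pid] = {"global_warming": "exploration",
                                "blood_alcohol": "observation"}
        else:
            assignments[pid] = {"global_warming": "observation",
                                "blood_alcohol": "exploration"}
    return assignments

assignments = assign_conditions(range(158), seed=1)
```

Note that simple per-participant randomisation like this yields roughly, not exactly, equal cells, which is consistent with the 73/85 split in the design table rather than with forced equal group sizes.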
Resource designs
—  Exploration
  —  A series of instructional screens providing background to the content area and explanation of terminology, but not including explanation of key concepts
  —  A series of screens allowing for setting of simulation parameters and mental prediction of output, observation of results, and mental explanation
—  Observation
  —  The same series of instructional screens
  —  A series of simulation output screens, each showing the effect of one set of input parameters

[Screenshots: Blood Alcohol Concentration – Exploration; Blood Alcohol Concentration – Observation; Global Warming – Exploration; Global Warming – Observation]

Data collection
—  Identical pre-test and post-test on conceptual understanding within these content areas
  —  Global warming: 7 items
  —  Blood alcohol concentration: 9 items
—  Questionnaires on cognitive load and engagement
—  Student actions within the simulations were also logged, to allow later analysis of their exploration strategies

Example test item – Blood Alcohol
—  A person with greater body weight:
  q Will have a higher Blood Alcohol Concentration (BAC) than a lighter person.
  q Will have a lower BAC than a lighter person.
  q Will have the same BAC as a lighter person.
  q Will have their BAC increase at a greater rate than a lighter person.

Example test item – Global Warming
—  Which of the following environmental factors have a direct or an indirect effect on the Global Average Surface Temperature (GAST)?
q The amount of Carbon Dioxide (CO2) absorbed by plants q The thickness of the ozone layer q The percentage of CO2 in the atmosphere q The Greenhouse insulation effect Results – Pre test versus Post test Content Area Global Warming Blood Alcohol Condi,on Pre-­‐Test M (SD) Observa/on (n=85) 1.82 (1.51) Post Test M (SD) 1.42 (1.29) t Explora/on (n=73) 1.68 (1.42) 1.72 (1.85) 0.20 (p=0.841) Observa/on (n=73) 3.55 (1.25) 3.42 (1.31) 0.60 (p=0.552) Explora/on (n=85) 3.93 (1.40) 2.33 (p=0.022*) 3.60 (1.24) —  Little or no improvement on post test 2.26 (p=0.027*) Results – Observa/on v Explora/on Content Area p Global Warming Mean Post-­‐Test Mean Post-­‐Test F Observa,on Explora,on 1.42 1.72 F(1,155)=2.40 Blood Alcohol 3.42 0.02* 3.93 F(1,155)=5.52 0.124 —  No effect of learning condition for Global Warming (F (1,155) = 2.40; p = .124) —  Main effect of learning condition for Blood Alcohol (F (1, 155) = 5.52; p = .02) Explora/on strategy characterisa/on —  Ultimately, we arrived at the following characterisation of participants: —  Systematic Exploration Participants: —  4 or more cycles with only one variable changed from previous cycle or —  4 or more cycles with only one variable changed from the provided example (‘Bill’s values’ or ‘2006 values’) —  Non Systematic Exploration Participants: —  All other exploration participants Results by Strategy Pre test versus Post test Content Area Condi,on Pre-­‐Test M (SD) Observa/on (n=85) 1.82 (1.51) Post Test M t (SD) Global Warming 1.42 (1.29) t(84)=2.26 (p=0.027) Non-­‐systema/c 1.44 (1.18) 1.33 (1.52) t(47)=0.44 Explora/on (n=48) (p=0.662) Systema/c 2.16 (1.72) 2.48 (2.20) t(24)=0.83 Explora/on (n=25) (p =0.415 ) Blood Alcohol (n=73) 3.55 (1.25) 3.42 (1.31) t(72)=0.60 —  Little or nObserva/on o improvement on post test (p=0.552) Non-­‐systema/c 3.47 (0.17) 3.51 (1.30) t(50)=0.24 Explora/on (n=51) (p=0.814) Systema/c 3.79 (1.25) 4.56 (1.33) t(33)=3.25 Explora/on (n=34) (p=0.003) Results by strategy Content 
Results by strategy

Content Domain | Post-Test Tutorial M (SD) | Post-Test Non-Systematic Discovery M (SD) | Post-Test Systematic Discovery M (SD) | F    | p
Global Warming | 1.42 (1.29)               | 1.33 (1.52)                               | 2.48 (2.20)                           | 4.17 | .017
Blood Alcohol  | 3.42 (1.31)               | 3.51 (1.30)                               | 4.56 (1.33)                           | 8.69 | <.001

—  Significant main effect of learning condition for both content domains
—  In each case, Systematic Discovery > Non-Systematic = Tutorial

Implications for inquiry learning
—  The finding that pure discovery learning can be inefficient or ineffective is consistent with Mayer (2004), Kirschner et al. (2006) and Alfieri et al. (2011).
—  For some students, however, an active discovery-based design is ideal, which to some extent counters criticism of such designs.
—  For others, the benefits of active exploration are countered by confusion caused by unsystematic exploration.
—  These students need support or scaffolding to ensure that they explore the environment in a systematic way and focus on the key concepts within it.

Positioning this work in the learning analytics landscape
—  "Learning analytics is concerned with the collection, analysis and reporting of data about learning in a range of contexts. It informs and provides input for action to support and enhance learning experiences, and the success of learners." (Simon Buckingham Shum, The Open University, ascilite 2011 keynote presentation)
—  Some purposes of learning analytics:
  —  to inform future institutional student support strategies
  —  to identify students at risk
  —  to inform future subject/unit design
  —  to dynamically support students according to their learning needs

Characterising student learning strategies
—  The log file data in this study provided us with a number of variables that could be used to characterise learners' strategies
—  For example:
  —  time spent on the task as a whole
  —  time spent on specific screens representing aspects of the task (e.g.
planning, manipulating, reviewing output)
  —  the number of iterations through the simulation
  —  the number of variables changed during each iteration
  —  the values chosen

Characterising student learning strategies
—  A number of alternative approaches have been used by others:
  —  Thompson and Reimann (2010), drawing on Levy and Wilensky (2005), used rules based on the values chosen by learners, the time spent and the number of iterations, and characterised learner strategies in manipulating an agent-based model as 'straight to the point', 'homing' or 'oscillating'.
  —  Kennedy and Judd (2004) used cluster analysis to identify clusters of students with interaction patterns illustrating distinct learning strategies, in the context of exploration of a digital learning resource.
  —  Kennedy et al. (2012) developed Hidden Markov Models characterising expert and novice performance in a surgical simulator, and dynamically provided feedback to learners depending on which model their actions best matched.

Characterising exploration strategies
—  An alternative approach we considered (but didn't use here) was cluster analysis, drawing on the following variables:
  —  time spent on the background material preceding the simulation,
  —  total time spent on the simulation,
  —  number of cycles in which exactly one variable was changed from the previous cycle,
  —  number of cycles in which exactly one variable was changed from the provided base values,
  —  number of cycles in which at least one variable was changed from the previous cycle, and
  —  the sum of the number of variables changed per cycle across all cycles.

Characterising exploration strategies
—  Cluster analysis for the Blood Alcohol condition led to a three-cluster solution, discriminated by:
  —  time spent on the simulation, and
  —  the degree to which the student manipulated single variables in the simulation.
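As a sketch of how the six candidate clustering variables listed above could be derived from a participant's interaction log; the log structure and field names here are assumptions, since the study's actual logging format is not shown.

```python
def n_changed(a, b):
    """Count the variables whose value differs between two settings dicts."""
    return sum(1 for k in a if a[k] != b[k])

def cluster_features(log):
    """Derive the six candidate clustering variables from one participant's
    log: {'background_secs', 'simulation_secs', 'base': {...},
    'cycles': [{...}, ...]} (a hypothetical schema)."""
    base, cycles = log["base"], log["cycles"]
    pairs = list(zip(cycles, cycles[1:]))
    return {
        "time_background": log["background_secs"],
        "time_simulation": log["simulation_secs"],
        "one_change_from_prev": sum(1 for p, c in pairs if n_changed(p, c) == 1),
        "one_change_from_base": sum(1 for c in cycles if n_changed(base, c) == 1),
        "any_change_from_prev": sum(1 for p, c in pairs if n_changed(p, c) >= 1),
        # The first cycle is counted against the base values (an assumption).
        "total_changes": (n_changed(base, cycles[0]) if cycles else 0)
                         + sum(n_changed(p, c) for p, c in pairs),
    }
```

Feature vectors like these could then be standardised and fed to a clustering routine (e.g. k-means) to look for solutions such as the three- and four-cluster solutions described here.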
—  Cluster analysis for the Global Warming condition led to a four-cluster solution, using the same variables as above.
—  The additional cluster in the Global Warming condition contained students with interaction patterns indicative of a complete lack of engagement with the program.

Take home messages for learning analytics
—  Learning designers and academic staff need more sophisticated understandings of the relationship between learning activities and outcomes
—  Techniques such as cluster analysis and Hidden Markov Models have promise in characterising learning strategies
—  We need tools that make it easier to:
  —  develop empirically informed characterisations of successful and unsuccessful strategies in specific discipline/learning design contexts, and
  —  automatically provide tailored support based on these characterisations

Follow up research
—  Currently analysing the engagement and cognitive load questionnaire data, to determine whether there is a relationship between these factors and learning strategies and outcomes for each learning condition.
—  Also exploring whether prior knowledge is a factor contributing to systematic exploration, as predicted by Kirschner et al. (2006).
—  We have also undertaken a study using the same experimental design, but with participants in an MRI scanner, to determine whether brain activation differences are evident between the two conditions (see Dalgarno, Kennedy & Bennett, 2010).

Acknowledgements
—  Support for this project has been provided through a Charles Sturt University Small Grant and a University of Wollongong Centre for Research in Interactive Learning Environments Seed Grant.
—  Dr Terry Judd of the University of Melbourne contributed to the development of the simulation interaction logging modules.
—  Dr Michael Lew of the University of Melbourne provided the blood alcohol concentration simulation model.

Questions