
WHEATON COLLEGE GRADUATE SCHOOL
Lessons From The Little Engine That Could: How
Breakthroughs in Neuroscience and Epigenetics Are
Confirming The Need For Teachers To Have And Maintain
High Expectations Of Students
A Thesis Submitted To The Faculty Of The Graduate School In
Partial Fulfillment Of The Requirements For The Degree Of
Master Of Arts
Education Department
by
Kirsty Rouse Thompson
Wheaton, Illinois
May 2010
Lessons From The Little Engine That Could: How
Breakthroughs in Neuroscience and Epigenetics Are
Confirming The Need For Teachers To Have And Maintain
High Expectations Of Students
by
Kirsty Rouse Thompson
Approved:
_________________________
_______________
Dr. Barwegen, First Reader
Date
_________________________
________________
Dr. Loomis, Second Reader
Date
Disclaimer
The views expressed in this thesis are those of the student and do not
necessarily express the views of Wheaton College Graduate School.
WHEATON COLLEGE
Wheaton, Illinois
Date_______________ 20 ___
Lessons From The Little Engine That Could: How Breakthroughs
in Neuroscience and Epigenetics Are Confirming The Need For
Teachers To Have And Maintain High Expectations Of Students
Wheaton College
Department of Education
Master of Arts Degree
Permission is herewith granted to Wheaton College to make copies of the above title, at
its discretion, upon the request of individuals or institutions at their expense.
__________________________
Signature of Author
Extensive quotation or further reproduction of this material by persons or agencies
other than Wheaton College may not be made without the express permission of the
writer.
Dedication
To the LORD who has created us so fearfully and wonderfully, and whose love and grace
have sustained me.
To my husband Neil and to my family for their continual love, support, and prayers.
To three Wheaton College professors who had high expectations of me and challenged
me to strive for excellence: Dr. Barwegen, Dr. Loomis, and Dr. Davis.
Abstract
In the midst of striking achievement gaps that exist within the nation’s schools,
many teachers, in spite of the plethora of educational literature on the subject, still
underestimate the role of teacher expectations and student self-efficacy beliefs with
regard to student achievement. This research calls attention to the growing fields of
neuroplasticity and epigenetics and highlights the ways in which both disciplines can be
used to combat the often erroneous idea of genetic determinism as it relates to beliefs
about the nature of intelligence. The assertion is that scientific breakthroughs ought to
influence both educational policy and practice and should therefore be an essential part
of teacher preparation programs as well as continuing professional development for in-service teachers. In addition, the research explores the possible implications that our
evolving knowledge of neuroscience and the mind/brain debate may have on Christian
theology, and how this might impact Christian educators’ understanding of the role of
education.
Key words: achievement gaps, teacher expectations, theories of intelligence,
neuroplasticity, epigenetics, brain-based education.
Table of Contents
I. INTRODUCTION
Definition of Important Terms
II. REVIEW OF LITERATURE
A. Education
i) Student Beliefs
ii) Teacher Expectations
B. Scientific Breakthroughs
i) Epigenetics
ii) Neuroplasticity
iii) Brain-based Education
C. The Mind/Brain Debate and Questions of Theology
III. DISCUSSION
A. Implications of Research
i) Student Beliefs and Teacher Expectations: To be a Flower Girl or a
Duchess, that is the Question
ii) Epigenetics: Genes are “so twentieth century”
iii) Neuroplasticity: Mind Altering Stuff
iv) Transforming Minds: Does a Christian Understanding of the
Mind/Brain Debate Have Any Impact on One’s View of the
Role of Education?
B. RECOMMENDATIONS
IV. CONCLUSION AND FUTURE STUDY
REFERENCES
I. Introduction
When I first decided that I wanted to teach, a significant part of my motivation was to see
each of my students succeed; to witness, and be a part of, each of them reaching their true
potential—academically, socially, even spiritually. Regrettably, when one looks at the statistics
emerging from the nation’s educational system, striking achievement gaps exist, and it is plainly
apparent that not every child is reaching his or her full potential (Gandara, 2009; Planty, Kena, &
Hannes, 2009; Vanneman, Hamilton, Anderson, & Rahman, 2009). Consequently, one cannot
ignore the pressing need for educational reform and intervention to help eliminate the gaps that
exist among students within the nation’s schools. Indeed, achievement gaps should concern
educators at all levels today, especially in light of the threats these gaps pose to education
quality and social equity.
The federal government, along with state and local governments, has recognized this need
within the public school system and has made steps to provide funding accordingly. President
Obama has allocated $100 billion to education in his economic stimulus plan (Moses,
2009). Within this amount, $200 million has been allocated for the Teacher Incentive Fund;
$650 million for more innovative programs; and $39 billion for professional development, school
staff, after‐school programs, and early‐childhood education (Moses, 2009).
It is encouraging to see that a substantial amount has been allocated to professional
development. Richards (2009) asserts that one of the primary reasons for the achievement gap
is lack of teacher professional development, which he describes as the debilitating crack in the
foundation of America’s public education system. Furthermore, Gallagher (2009) and Zeichner
(1995) assert that professional development is one of the primary tools used to strengthen
teacher craft and consequently address the achievement gap.
The Department of Education at Wheaton College, Wheaton, Illinois, where I am studying to
obtain my Master of Arts in Teaching (MAT), has a conceptual framework that envisions the
teacher as “an agent of change.” The way I see it, one aspect of being such an agent of change
is taking seriously the moral responsibility to keep as up-to-date as possible with research, not
only in one’s own field, but in all fields which have the potential to inform teaching practice.
Part of this keeping up-to-date ought to come from a teacher’s own initiative, but the other part
invariably needs to come from outside resources, such as those provided to teachers through
professional development opportunities. Certainly, students, via their teachers, should have
access to the latest research on learning in order to enrich and empower them with both
learning tools and life skills. As Eric Jensen reminds us, students’ brains “will be altered by the
experiences they have in school. As educators, we must—ethically, morally, and
opportunistically—pay attention to how we ask students to spend time with us” (2005, pp. 1-2).
Not surprisingly, one such area that has started to receive attention in educational
professional development is the brain itself (Battro, Fischer & Lena, 2008; Diamond & Hopson,
1998; Hart, 1985; Jensen, 2005; Posner & Rothbart, 2007; Ramey & Ramey, 1996; Sousa, 2001;
Sylwester, 1995; Westwater & Wolfe, 2000; Wolfe & Brandt, 1998; Zull, 2002). Indeed, the
brain/mind is one of the most exciting and important areas currently under investigation by
researchers in neuroscience, psychology, philosophy, biology, and numerous other disciplines,
and many important breakthroughs have been, and are currently being, made in the field of
neuroscience. For example, it was only a few short decades ago that scientists considered the
brain to be fixed or hardwired, incapable of change in any real or significant way (Kandel, 2006).
Furthermore, not that long ago, to suggest that one’s thoughts could have an impact on the
structure of one’s brain would have incurred the risk of ridicule, or worse still, being branded a
scientific naïf (Damasio, 1994). This dogma (that the brain is incapable of change), combined
with the dogma of total genetic determinism (Herrnstein & Murray, 1994), has had profound
consequences, and the lingering effects are still being witnessed today in the ways that some
teachers regard their students and their ability to learn and change.
Thus, the significance of this study is to highlight the real and far-reaching paradigm shifts
that have taken place in the fields of neuroscience and cellular biology and to advocate a brain-based understanding of the importance of teachers having and communicating high
expectations to their students. While I acknowledge that there already exists an extensive body
of research that describes how teachers’ expectations can influence student performance
(Bamburg, 1994; Benard, 1995; Edmonds, 1986; Howard, 1990; Lumsden, 1997; Pardini, 2007;
Rosenthal & Jacobson, 1968; Zeichner, 1995), as well as a vast body of research that explores
the ways in which self-efficacy and self-concept correlate with academic achievement (Bandura,
1993; Hackett & Betz, 1989; Hillman, 1984; Lent, Brown, & Larkin, 1984; Musher-Eizenman,
Nesselroade & Schmitz, 2002; Schunk, 1991; Zimmerman, Bandura, & Martinez-Pons, 1992), I do
not believe that the connection between the brain’s ability to transform itself and the
implications that this has for improving self-efficacy and self-concept (through teachers
maintaining and communicating high expectations of their students) has been formally explored
or established—even within the newly developed field of brain-based education. This is a
noteworthy oversight, and it is in this regard that my research will offer original contributions to
the literature.
Such research is important because, in spite of recent discoveries and developments with
regard to epigenetics and neuroplasticity, there still exists, in both students and teachers, a
trend towards academic fatalism—a belief that we as people do not really have free will but are
simply the victims of our genes (Herrnstein & Murray, 1994). Consequently, many of the beliefs
that drive the lives of students are false and self-limiting, and in too many instances, teachers
are doing little to counteract these cognitive distortions (often because these teachers have not
been trained in effective ways of doing so). Undoubtedly, this negative way
of thinking about students’ capacities to change and learn contributes to some of the glaring
disparities that exist in the achievement levels of students. Certainly, there is no question that
genetics plays a role in who students are and how they behave and learn, but breakthroughs in
our understanding of epigenetics and neuroplasticity suggest that each student has a far greater
capacity and potential for change than was previously acknowledged (Wolfe &
Brandt, 1998; Ramey & Ramey, 1996, 2002).
Indeed, science is advancing at a rapid pace and the onus is on teachers to keep abreast of
new developments. In fact, Dr. Bruce Lipton (a cellular biologist who taught cell biology at the
University of Wisconsin School of Medicine, Madison, and later performed groundbreaking
studies at Stanford University’s School of Medicine) asserts that “we are living in
exciting times, for science is in the process of shattering old myths and rewriting a fundamental
belief of human civilization” (2005, p. 17). Consequently, the purpose of this research study is to
identify some of the “old myths” that Lipton refers to (particularly old rules and outdated ways
of thinking about the brain and the mind), and to establish how new facts and knowledge in the
fields of cell biology and neuroscience can contribute to pedagogy. Specifically, the objectives of
this research are threefold: (i) to investigate the current breakthroughs in cell biology and
neuroscience with regard to our understanding of epigenetics and neuroplasticity; (ii) to show
how an understanding of these two phenomena can be used to inform and validate the need for
teachers to have, and communicate, high expectations of their students; and (iii) to explore the
possible implications that our evolving knowledge of neuroscience may have on our Christian
theology—particularly as it relates to our call to be transformed by the renewing of our minds
(Romans 12:2) and how this may impact a Christian teacher’s understanding of the role of
education.
Definition of Important Terms
Plasticity—the ability to permanently change or deform.
Neuroplasticity—refers specifically to the brain’s lifelong ability to reorganize neural networks
as a result of new experiences or learning. In other words, neuroplasticity means the rewiring of
the brain. This can result in not only one region of the brain colonizing another, but also in the
wholesale remodeling of neural networks.
Epigenetics—literally translates as being “in addition to or above the genome.” Using an analogy,
if one thinks of the human genome as being like the hardware of a computer, then the
epigenome is like the software which tells the computer when to work, how to work, and how
much. The term has evolved to include any process that alters gene activity without changing
the DNA sequence.
Self-efficacy—a person’s belief about his or her ability and capacity to accomplish a task or
successfully deal with the challenges of life.
Brain-based education—the incorporation of strategies and teaching methodologies based on
principles which have been acquired from an understanding of neuroscience and other brain-related sciences.
Dualism—the view that the mind and the body are composed of two different things: the body
is physical and the mind is nonphysical. This theory of mind was established as long ago as the
fourth century BC (Plato defended such a view), but it is most commonly associated with
seventeenth-century philosopher René Descartes.
Physicalism—the philosophical position that holds that everything which exists is nothing more
than its physical properties; in other words, that there are no kinds of things other than physical
things. In contemporary philosophy, physicalism is most frequently associated with philosophy
of mind, and in particular the mind/body problem, in which physicalism holds that the mind is a
physical thing in every sense of the word; that is, all that has once been ascribed to “mind” is
more correctly ascribed to “brain.” Physicalism is sometimes referred to as “materialism” and
can also be considered a monistic viewpoint because it holds that only matter, or the physical
body, exists.
Non-reductive physicalism—the idea that while mental states are physical they are not
reducible to physical properties.
II. REVIEW OF LITERATURE
A. Education
i) Student Beliefs
One of the most prominent voices calling for attention to be paid to the concept of self,
and particularly self-efficacy beliefs of students, has been that of Albert Bandura, professor of
psychology at Stanford University, Palo Alto, California. In 1977, with the publication of “Self-Efficacy: Toward a Unifying Theory of Behavioral Change,” Bandura argued that individuals
create and develop self-perceptions of capability that become instrumental to the goals they
pursue and to the control they are able to exercise over their environments. He argued that
beliefs about one’s ability to successfully perform a given task or behavior determine whether
coping behavior will be initiated, how much effort will be expended, and how long effort will be
sustained in the face of obstacles and aversive experiences. In Bandura’s theory, individuals are
seen as proactive and self-regulating rather than as reactive and controlled by their
environment and other biological factors.
Bandura’s (1997) book, Self-Efficacy: The Exercise of Control, summarizes the importance
of self-efficacy in the following way: People make causal contributions to their own psychosocial
functioning through mechanisms of personal agency, among which none is more crucial and
widespread than beliefs of personal efficacy. This is because unless people actually believe that
they can produce desired effects by their actions, they have very little incentive or motivation to
act. Consequently, efficacy belief is a fundamental basis for action (p. 2). What Bandura (1977,
1986, 1997) ultimately asserts is that it is not enough for people to possess the requisite
knowledge and skills to perform a task; in addition, one also has to have the certitude that s/he
can successfully execute the required behavior (even under difficult circumstances).
The self-efficacy contribution to skill utilization identified by Bandura is illustrated in a
study by Collins (1982). In the study, Collins selected children at three levels of mathematical
ability—low, medium, and high. Within each of these ability levels, she found children who
were secure in their perceived mathematical self-efficacy and others who had self-doubts. Each
child was given the same set of difficult problems to solve. At each ability level, children who
believed strongly in their capabilities were quicker to discard faulty strategies, and they
performed better overall. In addition, they chose to rework more of the problems they had
failed on and they did so more accurately than those children of equal ability who were plagued
by self-doubts. Thus, positive attitudes towards mathematics were better predicted by
perceived self-efficacy than by actual ability.
The tenets of self-efficacy have since been tested in varied disciplines and settings and
have received support from a growing body of research from diverse fields (Maddux & Stanley,
1986; Multon, Brown & Lent, 1991). The Multon et al. (1991) research found 36 studies
conducted between 1977 and 1988 on the relationship between self-efficacy and academic
performance or persistence that met their inclusion for a meta-analysis (i.e., those containing a
measure of self-efficacy and academic performance and providing sufficient information to
calculate effect-size estimates). They computed that efficacy beliefs were related to
performance and accounted for approximately 14% of the variance in academic performance.
They noted, however, that effect sizes depended on specific characteristics of the studies: The
strongest effects were obtained by researchers who compared specific efficacy judgments with
basic cognitive skills measures of performance, developed highly concordant self-efficacy/performance indexes, and administered them at the same time.
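To give a more intuitive sense of the size of this relationship, the 14% figure can be translated back onto the correlation scale (a rough illustrative calculation, assuming, as is conventional, that the percentage refers to variance explained, r²):

    r^2 \approx 0.14 \quad\Longrightarrow\quad r \approx \sqrt{0.14} \approx 0.37

In other words, the average relationship between efficacy beliefs and academic performance reported by Multon et al. (1991) corresponds to a moderate positive correlation rather than a trivial one, with the strongest relationships emerging when the efficacy measures were closely matched to the performance tasks being predicted.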
This is not surprising given that an important aspect of self-efficacy is its domain
specificity (Bandura, 1997). In other words, personal efficacy is not a general disposition devoid
of context, but rather a self-assessment that is specific to the activity domain. As a result,
high/low self-efficacy in one realm does not necessarily automatically result in high/low efficacy
in another. For example, a student may have high self-efficacy for analyzing a Shakespearean
sonnet, but low self-efficacy for completing algebraic equations in mathematics.
As a result, many educational researchers have mismeasured self-efficacy due, in large
part, to their misunderstanding of the construct (Bandura, 1997; Pajares, 1996). As Pajares
(1996) points out, “Because judgments of self-efficacy are task and domain specific, globally or
inappropriately defined self-efficacy assessments weaken effects” (p. 547). Consequently, a
researcher attempting to predict academic outcome is far more likely to find a strong
relationship between self-efficacy and academic outcome if the study assesses specific aspects
of the task. As Bandura (1997) says, “this requires clear definition of the activity domain of
interest and a good conceptual analysis of its different facets, the types of capabilities it calls
upon, and the range of situations in which these capabilities might be applied” (p. 42). Thus,
what Bandura terms the “omnibus measures” of general contextless studies have relatively
weak predictive power.
Hence, there have been studies that report a weak, or even absent, relationship
between self-efficacy and performance (Benson, 1989; Wilhite, 1990; Smith, Arnkoff, & Wright,
1990). In each of these studies, the efficacy beliefs being tested were more globally assessed
and did not correspond with the criterial tasks with which they were being compared. As a
result, their predictive value was diminished or even nullified (Pajares, 1996, p. 557). Therefore,
in general, when self-efficacy is assessed correctly, there is ample evidence to suggest that it is a powerful predictor of performance. Indeed, results from a recent meta-analysis of
more than 100 empirical studies conducted over the last 20 years found that out of nine
commonly researched psychosocial constructs, academic self-efficacy was the strongest single
predictor of college students’ academic achievement and performance (Robbins, Lauver, Le,
Davis, Langley, & Carlstrom, 2004).
One additional student belief system that affects cognitive functioning is concerned with
how students understand ability (Dweck, 1999; Dweck & Leggett, 1988; Nicholls, 1984). In the
1980s, Carol Dweck, professor of psychology at Columbia University, began investigating the
self-theories behind student behavior, particularly with regard to how students respond in the
face of challenges and failure. Her extensive research in the field led her to the conclusion that
students’ implicit beliefs about the nature of intelligence have a significant impact on the way
they approach challenging intellectual tasks. Some children tend to think of ability as an
acquirable skill that can be improved by gaining new knowledge and competency. Dweck (1999)
termed this view a malleable or incremental theory of intelligence. As a result, these children
seek out challenges that provide them with opportunities to expand their knowledge and
competency. Furthermore, they regard errors as a natural part of the acquisition process and,
for this reason, they are not easily perturbed by difficult situations along the way. By
comparison, other children view ability as an inherent capacity, or what Dweck (1999) referred
to as a fixed or entity theory of intelligence. For these children, performance is indicative of
their inherent intellectual capabilities. Thus, sub-optimal performances carry a high evaluative
threat: the implication that they lack basic intelligence. Therefore, they prefer to opt for tasks that
minimize the possibility for error and reveal their proficiency, rather than looking for
opportunities to expand their knowledge and skill set. Such children also regard the exertion of
high effort on tasks negatively because they regard effort as revealing that one is not inherently
smart. Dweck and others have conducted numerous studies to explore the relationship between
theories of intelligence and student achievement (Aronson, Fried, & Good, 2002; Blackwell,
Trzesniewski, & Dweck, 2007; Dweck & Leggett, 1988; Mangels, Butterfield, Lamb, Good, &
Dweck, 2006; Robins & Pals, 2002; Stipek & Gralinski, 1996; Ying-yi, Dweck, Lin, & Wan, 1999) all
of which indicate that a malleable view of intelligence improves academic performance.
The research by Aronson, Fried, and Good (2002) referred to above was
particularly concerned with reducing the effect of stereotype threat on African-American
students (i.e., the underachievement of African-American students) by shaping theories of
intelligence (in other words by encouraging students to see intelligence as a malleable rather
than a fixed capacity). Three groups of African-American and Caucasian (both male and female)
undergraduates participated in the study (N= 79 Stanford undergraduates). One group
participated in an intervention (a pen-pal program) that employed various attitude-changing
techniques designed to teach and help the students internalize the theory that intelligence is
expandable (malleable pen-pal condition). The attitudes and achievement outcomes for this
group were compared to those of two control groups: one that took part in the same
intervention with a different intelligence orientation (control pen-pal condition) and a third
group that did not take part in the intervention (non pen-pal condition). The attitude-changing
intervention itself took place during laboratory sessions, described below. A few days after the
intervention, the participants completed a measure of their beliefs about their intelligence as a
check on the manipulation, but the remaining measures were given several months after the
start of the intervention. These later measures included participants’ beliefs about the nature of
intelligence, ratings of their enjoyment of academics, their grades, and items designed to assess
their perceptions of stereotype threat.
Participants in the malleable and control pen-pal conditions came into the laboratory on
three occasions (each for the duration of one hour), purportedly to participate in the pen-pal
program. Those in the malleable pen-pal group were instructed to write letters to (fictional) “at
risk” middle school students, impressing upon them the need to work hard in spite of their
difficulties, and more importantly, to realize that new research was showing how intelligence is
not a finite endowment but rather an expandable capacity that “grows like a muscle” (p. 117).
In order to reinforce the scientific validity of their message, participants were shown a video clip
that discussed how the brain, and hence intelligence, is capable of growing and making new
connections throughout life. Those students in the control pen-pal group were also told to write
encouraging letters to a younger student, with the only difference being that they were told to
stress how intelligence is not a single entity, but rather composed of “many different talents,
and, as a result, every person has both intellectual strengths and weakness” (p. 118). To bolster
this view, these students were shown a video clip that discussed how psychologists were
beginning to understand intelligence not as a single unit but as composed of many different
abilities.
At the end of the intervention, the following results were obtained. Participants who
had written letters advocating the malleability of intelligence indicated greater enjoyment of the
educational process than those in the control conditions; they reported valuing academics more
than those in the controls; and, most importantly, they obtained higher grades than those in the
controls. The African-American students tended to obtain higher grades in the malleable pen-pal
condition, both compared to non pen-pal participants, t(72) = 2.19, p < .05, and compared to those in the
pen-pal control, t(72) = 2.24, p < .05. This strongly suggests that it was the malleability-of-intelligence
message, and not some other feature of the intervention, that was responsible for the gains in
academic achievement. For White participants, the condition difference was only marginally
significant, but in the same direction as predicted; higher grades were obtained by participants
in the malleable pen-pal condition than in either the pen-pal control condition, t(72) = 1.76, p < .09,
or the non pen-pal control condition, t(72) = 1.82, p < .08. All this was achieved after
just three sessions advocating the malleability of intelligence (which raises the question of
what could be achieved if this message were constantly reinforced by all teachers across
grades and subject matters).
In the research by Blackwell, Trzesniewski, and Dweck (2007), two studies were
conducted to explore the role of implicit theories of intelligence in adolescents’ mathematics
achievement. In the first study, which involved 373 seventh graders, the belief that intelligence
is malleable (incremental theory) predicted an upward trajectory in grades over the two years of
junior high school, while a belief that intelligence is fixed (entity theory) predicted a flat
trajectory. In the second study, an intervention teaching incremental theory to seventh graders
(N=48) promoted positive change in classroom motivation, compared with a control group
(N=43). Simultaneously, students in the control group displayed a continuing downward
trajectory in grades, while this decline was reversed for students in the experimental group.
ii) Teacher Expectations
One of the most seminal studies to address the relationship between teacher
expectations and student achievement was Pygmalion in the classroom: Teacher expectations
and pupils’ intellectual development (Rosenthal & Jacobson, 1968). Following that study, an
extensive body of research has been developed that describes how teachers’ expectations can
influence student performance (Bamburg, 1994; Benard, 1995; Edmonds, 1986; Howard, 1990;
Lumsden, 1997; Pardini, 2007; Raudenbush, 1984; Zeichner, 1995). While it would not only be
misleading but inaccurate to state that teacher expectations definitively determine a student’s
success, the research discussed below does suggest that teacher expectations play a significant
role in influencing how well and how much students learn.
In the first educational Pygmalion study, Rosenthal, a German-born U.S. psychologist,
and Jacobson, a U.S. schoolteacher (1968), conducted an experiment at (what the authors
termed) Oak School public elementary school, in order to investigate what role teacher
expectations play in student achievement. The real name of the school was not identified, but
the school was an elementary school in the South San Francisco Unified School District. In the
experiment, Rosenthal and Jacobson told teachers that certain children in their classes could be
expected to be “growth spurters” during the next eight months of school, based on the
students’ results on the Harvard Test of Inflected Acquisition. (The Harvard test was in fact nonexistent and the children designated as “spurters” were simply chosen at random.) Oak School
had 18 classrooms, three at each of the six grade levels. Within each grade level,
the three classrooms were composed of children with above-average ability, average ability, and
below-average ability, respectively. Within each of the 18 classrooms approximately 20% of the
children were chosen at random to form the experimental group. Each teacher was given the
names of the children from his or her class who were in the experimental condition, the so-called “spurters.” Therefore, the only difference between the experimental group children and
the control group children was in the mind of the teacher.
By giving the teachers this information, the researchers hypothesized that the
elementary school teachers might unconsciously behave in ways that facilitated and
encouraged those students’ success. Thus, what Rosenthal and Jacobson hoped to determine by
the experiment was the degree, if any, to which changes in teacher expectation produced
changes in student achievement. The results were what they had hypothesized: At the end of
the school year, eight months later, all the children were re-tested with the same test of
intelligence. Considering the school as a whole, the children from whom the teachers had been
led to expect greater intellectual gain did in fact show a greater intellectual gain than did the
children of the control group (Rosenthal & Jacobson, 1966, 1968). Consequently, Rosenthal and
Jacobson’s study has become a classic in the sociology of education in terms of our
understanding of how the attitudes that teachers have about their students’ abilities can
unwittingly influence student achievement.
However, what matters most to researchers is not only that claims to knowledge be
based on credible empirical evidence but also, and perhaps more importantly, that phenomena
claimed as knowledge be replicable. Rosenthal recognized this and went on to conduct several
other studies (Rosenthal, 1984, 1991, 1998). These studies found that what one person expects
of another tends to elicit that behavior from the other person. When discussing the Pygmalion
results, Rosenthal (in Aronson, 2002) noted that the average magnitude of the effect of teacher
expectations could be expressed as a correlation of approximately r = .30 (a substantial
magnitude) between what was expected of research participants and what was actually
obtained from them (Aronson, 2002, p. 32).
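To give a concrete sense of what a correlation of this size can mean in practice, Rosenthal’s own binomial effect size display re-expresses r as the difference between two hypothetical “success” rates centered on 50% (the figures below are an illustrative translation, not results reported in the studies cited here):

    \text{success rate} \approx 0.50 \pm \tfrac{r}{2} = 0.50 \pm 0.15, \quad \text{i.e., roughly } 65\% \text{ versus } 35\%

Read this way, an expectancy effect of r = .30 is comparable to moving the proportion of students who “succeed” from about 35% under low expectations to about 65% under high expectations, which underscores why Rosenthal describes the magnitude as substantial.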
An additional study which investigated teacher expectations and student learning has
been highlighted by Bamburg (1994). As evidence for the impact of teacher expectations and
student achievement, Bamburg recounts a survey conducted by the Center for Effective Schools
(CES) at the University of Washington, Seattle, Washington. During the spring of 1992, the CES
surveyed the staff of 87 elementary and secondary schools in four urban school districts
(Chicago, Detroit, Indianapolis, and Milwaukee) as part of the data collection activities of the
Academy for Urban School Leaders. The surveys, based on CES research, were designed to
assess staff perceptions of their school on nine school variables—one of them being high
expectations for student achievement. The survey results on the high expectations for student
achievement variable indicated that a large percentage of the 2,378 teachers surveyed did not
have high expectations for the academic achievement of students in their schools. (Examples of
survey questions include: Teachers in my school generally believe most students are able to
master the basic read/math skills; I expect that most students in my school will perform above
the national average in academic achievement; Nearly all of my students will be at or above
grade level by the end of this year. Teachers were asked to indicate whether they strongly
agreed, agreed, were undecided, disagreed, or strongly disagreed with the statements.)
Bamburg asserts that the results raised an important question: Do the expectations for
student achievement expressed by teachers in these urban schools differ from the expectations
generally found in schools? To answer this, he compared the mean score on the high
expectations for student achievement variable for all of the schools that the CES had surveyed
during the past seven years (N of schools = 800) with the mean score for the schools surveyed in
this project (N of schools = 87). On a five-point scale (1= low, 5= high), the mean score for all
schools was 3.61, while the mean score for the 87 schools in this project was 3.01. Percentile
norms established by the Center show that the average mean score for the 87 schools in this
study would place them at the seventh percentile in comparison with all schools. This result
suggests that teachers in urban schools—regardless of grade level—have lower expectations for
their students.
Both the Pygmalion study and the survey conducted by the CES raise the question
of the ways in which teachers’ beliefs about students may translate into differential behavior
towards their students, resulting in those students for whom the teachers have low
expectations performing at a lower level than those students for whom the teachers have high
expectations. A study by Bamburg (1994) shows that either consciously or unconsciously,
teachers often behave differently towards students based on beliefs and assumptions that they
have about them. For example, Bamburg’s research showed that teachers tend to engage in
affirming behaviors such as smiling, leaning toward, and making eye contact with students more
frequently when they believe they are dealing with high ability students than when they think
they are interacting with low ability students. Furthermore, as observed by Cotton (1989),
students who are perceived by teachers to be low in ability may also be given fewer
opportunities to learn new material: they are often asked less challenging and stimulating
questions; they are provided with scantier and less informative feedback; and they are praised
less frequently for success, called upon less frequently, and given less time to respond than
students who are perceived to be of higher ability.
Ultimately, this research suggests that teacher expectations, operating as a kind of self-fulfilling prophecy,
can have either a positive or a negative impact. In other words, when teachers expect students
to do well, they tend to do well; when teachers expect students to do poorly, they tend to do
poorly. For obvious ethical reasons, Rosenthal and Jacobson’s Oak School experiment and
other studies like it focused only on favorable or positive expectations and their impact on
intellectual competence, but it is reasonable to infer that unfavorable expectations (such as
those identified in the CES survey and the studies by Bamburg and Cotton) could also lead to a
corresponding decrease in student performance.
Pygmalion in the classroom was followed up by many other school-based studies that
examined these assertions from slightly different perspectives. Prominent among the works on
this subject is “Student social class and teacher expectations: the self-fulfilling prophecy in
ghetto education” (Rist, 2000). Over the course of a two-year observational study, conducted
for one and one-half hours twice weekly, one class of children in a ghetto school was observed
during their kindergarten, first, and second grade years. The study shows how the kindergarten
teacher placed the students into reading groups that mirrored the social class composition.
These groups then lasted throughout the course of the first several years of elementary school.
Furthermore, the way in which the teacher behaved and interacted with the students was
shown to have an important influence on the children’s achievement. Thus, this study highlights
not only the impact of teacher expectations, but also the ways in which teacher expectations
and consequently behavior influence student attainment, which ultimately contributes to
achievement gaps and broader concerns of class and social justice.
The idea of teacher expectations as self-fulfilling prophecy can work in the opposite
direction as well (i.e., expectations that students have of teachers). Feldman and Prohaska
(1979) conducted an experiment to study the effect of student expectations on teachers. In the
experiment, one group of students was told that their teacher was “quite effective,” while the
other group was told that their teacher was “incompetent.” The effects of these positive and
negative expectations were then measured in three different ways: student attitudes
toward the teacher, scores on tests, and students’ non-verbal behavior toward the
teacher. The teachers were not informed of the competency level that had been ascribed to
them. The results of the experiment showed clear differences in all three measures based on
positive or negative expectation. Students with a negative expectation rated the lesson as
being more difficult, less interesting, and less effective. Students with a positive expectation
scored 65.8% on the test, whereas those with a negative expectation scored 52.2% on the test.
In terms of non-verbal behavior, subjects leaned forward more toward teachers they thought were
good than toward those they thought were poor. In addition, there was some
evidence that students with a positive expectation made better eye contact with the teacher.
Overall, it appeared that the expectations had an impact on the learning outcomes. These
findings were replicated in a follow-up study by Feldman and Theiss (1982) which showed that
both teachers and students can concurrently hold expectations about each other, and that these
expectations affect the entire teaching situation.
Having ascertained the importance of student belief systems and teacher expectations, I
now turn my attention to exploring some of the scientific breakthroughs (and consequent
paradigm shifts) taking place in the fields of cellular biology and neuroscience, in order to gain a
better understanding of how these discoveries might contribute to pedagogy.
B. Scientific Breakthroughs
i) Epigenetics
Until recently, the idea that your environment might change your heredity without
changing a gene sequence was scientific heresy. Over 50 years have passed since Watson and
Crick first published the three-dimensional structure of the DNA double helix (Watson & Crick,
1953), and yet the field of epigenetics is now showing that we have become too invested in the
theory that genes control everything. Aravinda Chakravarti, an expert in computational biology
and a renowned geneticist, and Peter Little (2003) argue that “unfortunately the double helix
has not, so far, revealed as much as one would have hoped. The primary reason is an inability to
determine how nurture fits into the DNA paradigm” (p. 412). These authors argue that “a
number of presently fashionable ideas about genetics are simplistic” and that “the influence of
nature and nurture cannot be neatly divided… human beings are each the product of a unique
genome and a unique set of experiences. Both need to be understood to intervene effectively”
(p. 413, emphasis added). The authors continue to assert that “scientifically, it is necessary to
understand a deceptively simple equation: genes + environment = outcome” (p. 414).
Chakravarti and Little recognize the need for balance to be restored in the nature/nurture
debate: “If the past 50 years has seen the revolution of DNA, then the revolution cannot be
completed without an appreciation of both genetic and environmental individuality; only then
will individuals understand the meaning of their inheritance” (p. 414).
Azim Surani (2001), a permanent member of Europe’s premier consortium on epigenetic
research, The Epigenome Network of Excellence, has helped to establish the notion that DNA
blueprints passed down through genes are not set in stone at the time of birth (as was once
believed). Rather, environmental influences, including factors such as nutrition, stress, and
emotions, can alter those genes, and they can do so without changing their basic blueprint.
Epigeneticists like Surani, along with Reik, Dean, and Walter (2001), have discovered that
these modifications can be passed on to future generations in much the same way as DNA
blueprints are passed via the double helix. Surani writes: “alterations to epigenetic
modifications allow a switch in patterns of gene expression, which are central to genomic
plasticity and transdifferentiation. While emphasizing the intrinsic nature of the switch [genes],
environmental factors also play a fundamental role” (2001, p.127).
Perhaps one of the most famous examples of epigenetic breakthroughs came from
Randy Jirtle, a professor of radiation oncology at Duke University, and his postdoctoral student
Robert Waterland (Waterland & Jirtle, 2003). In this groundbreaking genetic experiment, the
two researchers started out with a pair of fat, yellow mice, known to scientists as agouti mice
because they carry the agouti gene. This gene, in addition to making these mice obese, also
makes them prone to premature death as a result of cancer and diabetes. When agouti mice
breed, their offspring are typically identical to their parents—that is, they are yellow, fat, and
susceptible to life-shortening diseases. However, in the Jirtle and Waterland experiment, the
majority of offspring were slender and mousy brown, and in addition, they did not display their
parents’ susceptibility to cancer and diabetes. Thus, the effects of the agouti gene had virtually
been erased. In order to achieve these results, the researchers did something surprisingly
simple—they changed the mother’s diet. What was most remarkable about this experiment
was that Jirtle and Waterland achieved this transformation without altering a single letter of the
mouse’s DNA. In other words, the mothers passed along the agouti gene to their offspring, but
thanks to a new methyl-rich pregnancy diet, they had added to the gene a chemical “switch”
that had dimmed the gene’s deleterious effects.
Substances, however, like the dietary changes referred to above, are not the only
sources of epigenetic changes. Dr. Moshe Szyf, editor-in-chief of the newly launched scientific
journal, Epigenetics, the first publication to be devoted to the field, has done research linking
epigenetic changes to social interaction. This research showed that the licking, grooming, and
nursing methods that mother rats use with their pups can affect the long-term behavior of their
offspring, and those results can be tied to changes in DNA methylation and histone acetylation
at a glucocorticoid receptor gene promoter in the pup’s hippocampus. This finding was
published in Nature Neuroscience (Weaver, Cervoni, Champagne, D’Alessio, Sharma, Seckl et al.,
2004).
Initially, most epigenetic modification, by whatever mechanism, was believed to
be erased with each new generation, during gametogenesis and after fertilization. However,
new research is beginning to challenge this belief, suggesting that epigenetic changes may
endure in at least four subsequent generations of organisms. Michael Skinner, a professor of
molecular biosciences and director of the Center for Reproductive Biology at Washington State
University, Pullman, Washington, and his team described in an issue of Science how they briefly
exposed pregnant rats to relatively high levels of either the insecticide methoxychlor or the
fungicide vinclozolin, and documented effects such as decreased sperm production and
increased infertility in the male pups (Anway, 2005). Digging for more information, they
found altered DNA methylation of two genes. As they continued the experiment, they
discovered the adverse effects (decreased sperm production and increased infertility) lasted in
about 90% of the males in all four subsequent generations that they followed, with no
additional pesticide exposures. The findings are not known to have been reproduced, but
Skinner and his colleagues are conducting follow-up studies, assessing many other genes and
looking at other effects such as breast and skin tumors, kidney degeneration, and blood defects.
Still other studies have found that epigenetic effects occur not just in the womb, but
over the full course of a human life span. Manel Esteller, director of the Cancer Epigenetics
Laboratory at the Spanish National Cancer Center in Madrid, and his colleagues evaluated 40
pairs of identical twins, ranging in age from three to 74, and found a striking trend: Younger twin
pairs and those who shared similar lifestyles and spent more years together had very similar
DNA methylation and histone acetylation patterns, but older twins, especially those who had
different lifestyles and had spent fewer years of their lives together, had significantly different
patterns. Thus, although twins are epigenetically indistinguishable during the early years of life,
older monozygous twins exhibited remarkable differences in their overall content and genomic
distribution of 5-methylcytosine DNA and histone acetylation, affecting their gene-expression
portrait in many different tissues, such as lymphocytes, epithelial mouth cells, intra-abdominal
fat, and selected muscles. These findings indicate how an appreciation of epigenetics has been
missing from our understanding of how different phenotypes can originate from the same
genotype (Fraga, Ballestar, Paz, Ropero, Setien, Ballestar et al., 2005).
Similar studies, conducted by Petronis and his colleagues, set out to investigate why
twins, who carry the exact same DNA, can end up in the situation in which one twin develops a
hereditary disease and the other one does not (Petronis, Paterson, & Kennedy, 1999; Petronis,
Gottesman, Kennedy, & Paterson, 2003). What their studies revealed was that in about half of
the cases of schizophrenia found in twins, only one twin developed the condition, even though
schizophrenia is widely considered to be genetic in origin. By studying sets of twins in which one
twin had a psychiatric disorder and the other didn’t, Petronis and his colleagues found that the
psychiatric patients had more in common with each other, epigenetically, than they did with
their own twins. This is because, as Petronis explains, any two random people share 99.7% of
their DNA, but at the epigenetic level, people are very, very different.
Lastly, another author with a keen interest in the field of epigenetics is the cell biologist and
philosopher Dr. Bruce Lipton. Lipton was involved in some pioneering studies on the cell
membrane performed at Stanford University’s School of Medicine (1977a, 1977b, 1991, 1992).
His experiments examined in great detail the processes by which cells receive information. By
conducting the abovementioned series of experiments, Lipton discovered that the cell
membrane, the outer layer of a cell, is the organic equivalent of a computer chip, and the cell’s
equivalent of a brain (not the nucleus as was previously thought). Because almost all of the
cell’s genes are stored in its largest organelle, the nucleus, conventional science maintained that
the nucleus represented the “command center” of the cell—a notion based upon the
assumption that genes control (determine) the expression of the cell (Vinson, Pernell, Chin, &
Marx, 2000). However, if the brain is removed from any organism, the necessary consequence
of such an action is immediate death of the organism. But Lipton’s research showed that if the
nucleus is removed from a cell, the cell does not necessarily die. In fact, his experiments
showed that some enucleated cells can survive for two or more months without possessing any
genes. In the absence of a nucleus, cells maintained their metabolism, digested food, excreted
waste, breathed, and moved through their environment recognizing and appropriately
responding to other cells, predators, or toxins. Lipton therefore argued that the fact that cells
maintain a successful and integrated life in the absence of genes, reveals that genes cannot be
the “brain” of the cell. In other words, the primary reason why genes cannot “control” biology is
that they are not self-emergent (Nijhout, 1990). This means that genes cannot self-actualize;
they are chemically unable to turn themselves on or off. Therefore, gene expression is under the
regulatory control of environmental signals through epigenetic mechanisms. Dr. Lipton was one
of the first scientists to posit such extra-cellular control, but this has been supported by the
research of others (Nijhout, 1990; Symer & Bender, 2001).
In his lay book, The Biology of Belief (2005), Dr. Lipton addressed some of the practical
applications of epigenetics to everyday life. Drawing on his studies, mentioned above, he
suggested that genes and DNA do not control our biology; instead, he argued that genes are
turned on or off by signals outside the cell (including messages emanating from our positive and
negative thoughts). Thus he asserts that our beliefs (true or false, positive or negative) affect
genetic activity and actually alter our genetic code. Lipton also traces the influence that Charles
Darwin’s The Origin of Species has had on biology’s adoption of the primacy of DNA, highlighting
how the primacy of DNA within biology has contributed significantly to the notion of genetic
determinism (as evidenced by the work of Herrnstein & Murray, 1994).
Lipton’s work, however, is not without its critics. There are those who have argued that
Lipton overextends the research and has made the transition from academic
scientist to proselytizer of not only a new biology but also a new-age religion. Indeed, Lipton’s
assertions run counter to what is considered by many as the “established” scientific view,
namely that life is controlled by genes (Dawkins, 1989, 2006; Gazzaniga, 2005). In his own
defense, Lipton has said, “There is no doubt that human beings have a great capacity for sticking
to false beliefs with great passion and tenacity and hyper-rational scientists are not immune”
(2005, p. 16). He is also quick to point out that “I need to make it very clear that I do not
believe that simply thinking positive thoughts always leads to physical cures” (p. 127).
ii) Neuroplasticity
Scholars have long argued about what intelligence is and how it should be measured
(Gardner, 1983; Gould, 1981; Herrnstein & Murray, 1994; Lewontin, Rose, & Kamin, 1984; Neisser,
Boodoo, Bouchard, Boykin, Brody, Ceci et al., 1996). But one question that has produced strong
opinions at both extremes concerns the malleability of intelligence, in other words, whether
intelligence (and by association, the brain) is expandable or fixed/hardwired at birth. For more
than a century, medical science firmly believed that the human brain could not grow new brain
cells (Kandel, 2006). However, after decades of experiments and hundreds of studies,
neuroplasticity is now established as a fact of mainstream science.
Although the notion of a fixed or hardwired brain was the widely held view for most of
the previous century, there were some scientists who were clearly ahead of their time with
regard to their view of the brain. One such example was Spanish neuroscientist and Nobel
Laureate Santiago Ramón y Cajal, who spent his later life looking in vain for proof of brain
plasticity, but nonetheless proposed in 1894 that “the organ of thought is, within certain limits,
malleable, and perfectible by well-directed mental exercise” (in Doidge, 2007, p. 201).
As early as the 1930s, Canadian neurosurgeon Dr. Wilder Penfield began experimenting
with the development of brain maps in order to try and ascertain where in the brain various
parts of the body were represented as well as the activities that those parts of the brain
processed. Penfield's medical exploration began with the causes and treatment of epilepsy,
which was, at the time, considered incurable. Alongside his colleague, Herbert Jasper, he
invented a procedure in which he treated patients with severe epilepsy by destroying nerve cells
in the brain where the seizures originated. Prior to conducting the operation, Penfield
stimulated the brain with electrical probes while the patients remained conscious on the
operating table (Penfield only used local anesthesia), and thus he was able to observe their
responses. By doing this, he could more accurately target the areas of the brain responsible for
the seizures and therefore reduce the side effects of the surgery. This method also allowed him
to directly observe the living brain and consequently to amass a large body of data which he
used to construct maps of the sensory and motor cortices of the brain and to reveal their
connections to the various organs and limbs of the body. These maps are still used today,
practically unaltered. During the course of his career, Penfield operated on approximately 400
patients, and eventually summarized his findings in a 1950 book called The Cerebral Cortex of
Man. The book, which was written with his colleague Theodore Rasmussen, is a detailed
account of Penfield’s electrical explorations of the human brain. While Penfield was a
groundbreaking researcher and highly original surgeon, his work led many scientists to believe
that the human brain consisted of specialized areas designed for specific functions (i.e., that one
area of the brain could not do the work of another area of the brain). Owing to that fact,
scientists thought that the brain was incapable of change, and they concluded that the brain’s
mental maps were permanent and unvarying from one individual to another. Nobel Laureate
Eric Kandel attests to this position: “when I was a medical student in the 1950s, we were taught
that the map of the brain… was fixed and immutable throughout life” (2006, p. 216).
Michael Merzenich, however, discovered that these brain maps were neither immutable
nor common to all humans. Merzenich took the sensory cortex maps developed by his
predecessors such as Wilder Penfield, Clinton Woolsey (Woolsey, Marshall, & Bard, 1937;
Woolsey & Travis, 1956; Woolsey, Erickson, & Gilson, 1979), Vernon Mountcastle (1978), Wade
Marshall, and Philip Bard, and refined them using dense micro-electrode mapping techniques.
The experiment that perhaps made neuroplasticity most famous among neuroscientists, and,
as Doidge (2007) asserts, “did more to win over skeptics than any plasticity experiment before
or since” (p. 60), was Merzenich’s mapping of a monkey’s hand representation in the
brain (Merzenich, Nelson, Stryker, Cynader, Schoppmann, & Zook, 1984). During the
experiment, Merzenich and his team mapped adult owl monkeys’ hand maps in their brains
using microelectrode mapping techniques. Following the initial mapping process, they surgically
amputated digit 3, or both digits 2 and 3, and then digital nerves were tied to prevent their
regeneration within the amputation stump. Two to eight months later, they re-mapped the
monkeys’ brains and found that the brain maps for the amputated digit(s) had disappeared and
that the maps for the adjacent fingers had grown into the area that had originally mapped
the middle finger. Successive maps were derived in several monkeys to determine the nature of
changes in map organization in the same individuals over time. In all monkeys studied, the
representations of adjacent digits and palmar surfaces expanded topographically to occupy
most or all of the cortical territories formerly representing the amputated digit(s). Ultimately,
this experiment confirmed that brain maps are dynamic, responding and adapting to changes in
the environment.
Around the same time that Merzenich was conducting his research during the 1960s,
David Hubel and Torsten Wiesel were involved in Nobel Prize-winning research to micromap the
visual cortex in order to learn how vision is processed. They conducted their work with kittens,
and it was during this experiment that they discovered that there was a “critical period”
between the third and eighth week of life in which the newborn kitten’s brain had to receive
visual stimulation in order for vision to develop normally. In this experiment, the researchers
sewed one eyelid of the kitten shut during the critical period so that the eye was unable to
receive visual stimulation. Later, when they opened this shut eye, they discovered that the
visual areas in the brain map that ordinarily processed input from the shut eye had failed to
develop and that this left the kitten blind in that eye for life. Although this experiment
demonstrated that the brains of the kittens during the critical period were plastic (their
structure literally shaped by experience), it also established the idea, now regarded as erroneous,
that the brain is hardwired by the end of infancy to perform functions in fixed locations
(Doidge, 2007, pp. 51-52).
While Eric Kandel may have been taught at medical school that the brain was fixed for
life, his own research allowed him to discover a very different reality. Kandel used giant marine
snails, called Aplysia, to see whether he could gain any insight into how human nervous system
tissue functions. He elected to use the Aplysia snail in his research because of the snail’s
unusually large neurons. 1 By inserting microelectrodes into the snail’s neurons, Kandel was able
to demonstrate that as the snail learned to avoid shocks, its nervous system changed, enhancing
the synaptic connections between its sensory and motor neurons. Thus, Kandel was the first to
show that, as we learn, our individual neurons alter their structure and strengthen synaptic
connections between them. In addition, he was the first to demonstrate that when we make
long-term memories, the neurons in our brains actually change their anatomical shape and
increase the number of synaptic connections that they have to other surrounding neurons
(Kandel, 2003). As Doidge (2007) points out, when a single neuron develops a long-term
memory for sensitization, it might go from having 1,300 to 2,700 synaptic connections, “a
staggering amount of neuroplastic change” (p. 220).
1. Doidge (2007) points out that the cells of the Aplysia snail are a millimeter wide and visible to the naked eye (p. 218).
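Kandel's finding that learning strengthens the synaptic connections between a snail's sensory and motor neurons is often abstracted computationally as Hebbian plasticity ("neurons that fire together wire together"). The short Python sketch below is my own illustrative toy example of that abstraction, not Kandel's model, and its numbers are arbitrary.

# Toy Hebbian-style update: repeated co-activation of a sensory and a motor
# neuron strengthens the connection between them. Illustrative only; this is
# not a model of Aplysia physiology.
def hebbian_update(weight: float, pre: float, post: float, rate: float = 0.1) -> float:
    """Strengthen the synaptic weight in proportion to joint activity."""
    return weight + rate * pre * post

weight = 0.2                      # arbitrary starting synaptic strength
for trial in range(10):           # ten paired stimulus/response trials
    pre_activity, post_activity = 1.0, 1.0
    weight = hebbian_update(weight, pre_activity, post_activity)

print(round(weight, 2))           # 1.2 -- the connection has strengthened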
Schwartz and Begley’s (2000) neuroplasticty work focuses primarily on the treatment of
OCD (obsessive compulsive disorder) patients. Until the mid 1960s, the psychiatric and
psychological professions deemed OCD treatment-intractable—meaning that nothing could be
done to release patients from its grip. In the late 1960s and early 1970s psychiatrists started
turning to drugs to offer patients some relief. 2 Schwartz and Begley entered the field in the mid
1980’s when cognitive therapy was beginning to be used to effectively treat patients suffering
from depression. In 1987, they started a group therapy session for OCD patients, meeting every
Thursday afternoon, in conjunction with an ongoing study of the disease’s brain abnormalities.
In an analysis of the PET scans of 24 patients, they were able to pinpoint several structures in
the brain that appeared to be consistently involved in OCD. 3 The net result of their research
was the discovery that because OCD patients have a hyperactive orbital frontal cortex, their
error-detection circuitry appeared to be inappropriately stimulated, causing OCD sufferers to
be constantly bombarded with signals that something is amiss. Using a four-step program
(Relabel, Reattribute, Refocus, Revalue) their patients were able to effect significant and lasting
changes in their own neural pathways, 4 offering them relief from their debilitating OCD
symptoms.
The authors make the case that their research with OCD patients provides strong
evidence to suggest that willful, mindful effort can alter brain function—thus proving their thesis
2. Studies since the 1980s have shown that about 60% of OCD patients respond somewhat to so-called selective serotonin reuptake inhibitors (SSRIs) such as Prozac, Paxil, Zoloft, Luvox, and Celexa, and for those 60% there is about a 30-40% reduction in symptoms (Schwartz & Begley, 2002, p. 58).
3. Compared to the brains of “normal” controls, the brains of the OCD patients showed hypermetabolic activity in the orbital frontal cortex, as well as a trend toward hyperactivity in the caudate nucleus. By 1990, five different studies by three different teams had all shown elevated metabolism in the orbital frontal cortex in patients with OCD (Schwartz & Begley, 2002, pp. 62-63).
4. Using the four steps, patients were able to switch gears, activating circuits in the dorsal prefrontal cortex connecting to adaptive basal ganglia circuits, rather than the OCD circuits connecting the orbital frontal cortex to the anterior cingulate and caudate (Schwartz & Begley, 2002, p. 93).
that self-directed cerebral changes (neuroplasticity) are a genuine reality. Furthermore, by
presenting a diverse selection of animal and human experiments, including an extended
examination of the now infamous Silver Spring Monkeys deafferentation experiments (Taub,
1980), Schwartz and Begley show how these successes are not limited to OCD patients and that
similar neuroplasticity is possible in stroke victims—often with the result that patients gain a
return of function previously considered inconceivable. Their work also shows how similar
procedures (which rely upon focused attention) produced equally successful therapies for
patients with depression, Tourette's syndrome, and autism.
Schwartz and Begley are convinced of the genuine power of mental force and its integral
role in a quantum-based mind-brain theory. Their ideas of mental force are framed in terms of
the ancient Buddhist concept of mindfulness, and the book provides the reader with
an in-depth understanding of what causes a person to produce certain thoughts and how one is
able to regulate unwanted thoughts and consequently change brain morphology. The authors’
Buddhist bent clearly invites a spiritual contextualization of the issues, which does not
necessarily preclude a Christian interpretation.
Other examples of neuroplasticity experiments abound, including Bach-y-Rita's work
with stroke victims and sensory substitution devices (Bach-y-Rita, Collins, Saunders, White, &
Scadden, 1969), one of which has enabled people who had been blind from birth to see;
Rosenzweig and colleagues' examination of rats in stimulating and non-stimulating environments,
which showed that activity could produce changes in the structure of the brain (Rosenzweig,
Krech, Bennett, & Diamond, 1962); Taub's (1977, 1980) work with the Silver Spring monkeys and
stroke victims, which revealed how constraint therapy can combat learned non-use;
Pascual-Leone's work with Braille students, which demonstrated that the brain maps for people's Braille
reading fingers were larger than the maps for their other index fingers and also those for the
index fingers of non-Braille readers (Pascual-Leone, Hamilton, Tormos, Keenan, & Catala, 1999);
and Ramachandran’s work with amputee victims suffering from phantom limb syndrome
(Ramachandran, Rogers-Ramachandran, & Stewart, 1992).
iii) Brain-Based Education
As already noted, neuroscientific research is gradually being translated into classroom
practice, despite the criticism from some educators (Bruer, 1999; Jorgenson, 2003; Strauss, 2001;
Willingham, 2006) that neuroscience has little to offer classroom practice. 5 However, educators
who strongly believe in brain-based education and who therefore argue that breakthroughs in
neuroscience ought to influence both educational policy and practice (and should therefore be
an essential part of the preparation of future teachers as well as continuing professional
development for in-service teachers) include Davis, 1997; Diamond and Hopson, 1998; Hart,
1985; Jensen, 2005; Posner and Rothbart, 2007; Ramey and Ramey, 1996, 2002; Westwater and
Wolfe, 2000; Wolfe, 2001; Wolfe and Brandt, 1998; and Zull, 2002.
Craig Ramey, a University of Alabama psychologist, has conducted several research
studies involving thousands of children at dozens of research centers, which have shown that
intervention programs for impoverished children can prevent low IQs and
mental retardation. Two such projects are the Abecedarian Project (Ramey & Campbell, 1984)
and Project CARE (Ramey, Bryant, Sparling & Wasik, 1985). The Abecedarian Project (N = 111
children) and Project CARE (N=63 children) were single-site randomized controlled trials that
enrolled children at birth who were biologically healthy, but who came from very poor and
5. Critiques from such authors center primarily on the fact that findings of brain research have been adopted uncritically by educators, and that we currently do not know enough about brain development and neural function to link that understanding, in any meaningful way, to educational practice.
under-educated families. For example, the mean maternal education in both projects was
approximately 10 years of schooling, and approximately three-quarters of the mothers were
unmarried. Due to ethical concerns, the control group families (sometimes referred to in
Ramey’s studies as the follow-up group) were not a totally untreated group. 6 The Early
Intervention groups received the same services as the control group, but, in addition, they
received an Early Childhood Education Program known as Partners for Learning (Sparling, Lewis
& Ramey, 1995). This program was administered within the context of a specially developed
Child Development Center. The Center admitted children after six weeks of age and maintained
low child/teacher ratios (e.g., 3:1 for children < 1 year; 4:1 for children between 1 and 3 years)
as well as an ongoing in-service curriculum training and technical assistance program for
teachers. Partners for Learning is an educational curriculum that is concerned with 31 child
development areas that are consistently oriented toward adult-child transactions involving well-formed and conversational language about topics of everyday interest to young children
(McGinness & Ramey 1981; Ramey, McGinness, Cross, Collier, & Barrie-Blackley, 1981). The
cognitive results from both these Early Education programs are highlighted in the graphs below,
which show Bayley Mental Development Index (MDI) scores and Stanford-Binet (S-B) IQ scores.
6. The children in the control group received pediatric follow-up services on a schedule recommended by the American Academy of Pediatrics. In addition, the children were provided unlimited iron-fortified formula. Families of control group children also received social work services and home visits. Their provision, therefore, makes the demonstration of group differences less likely due to their presumed positive influence on the control group children and families.
[Graphs taken from Ramey and Ramey, 2002]
As indicated by the graphs, by 36 months, the mean IQ scores in the Abecedarian
Project are 101 and 84 for the Early Education and control/follow-up group respectively, and in
Project CARE the comparable 36-month scores are 105 and 93. Thus, the results provide a
strong causal inference concerning the preventive power of the early educational curriculum.
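For readers who want the reported group differences side by side, the short Python snippet below simply tabulates the 36-month means quoted above; the values are those reported by Ramey and Ramey (2002), while the snippet itself is my own illustration.

# Reported 36-month mean IQ scores (Ramey & Ramey, 2002), tabulated for comparison.
# This only restates the figures quoted in the text above.
scores = {
    "Abecedarian": {"early_education": 101, "control": 84},
    "Project CARE": {"early_education": 105, "control": 93},
}

for project, groups in scores.items():
    gap = groups["early_education"] - groups["control"]
    print(f"{project}: treatment {groups['early_education']}, "
          f"control {groups['control']}, difference {gap} IQ points")
# Abecedarian: treatment 101, control 84, difference 17 IQ points
# Project CARE: treatment 105, control 93, difference 12 IQ points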
The Abecedarian and CARE Projects continued with the Early Childhood Educational
Program until the children entered public kindergarten. The cognitive differences between the
treated and control groups continued to persist at statistically significant levels (Burchinal,
Campbell, Bryant, Wasik, & Ramey, 1997). Follow-up assessments of the Abecedarian children
have now been completed at eight, 12, 15 and 21 years of age. In standardized measures of
intelligence, the differences between the treatment and comparison groups narrowed, but still
continued to be statistically and educationally meaningful at approximately five IQ points. The
results from reading and math achievement assessments showed that, at all ages from eight to
21, the treatment group had significantly higher academic achievement scores in both reading
and mathematics, as seen in the table below, which details the results for reading performance
for the Abecedarian children.
[Table taken from Ramey and Ramey, 2002]
Furthermore, in the follow-up study at age 21 (Campbell, Ramey, Pungello, Sparling, &
Miller-Johnson, 2002), it was found that not only did the children in the preschool treatment
group earn significantly higher scores on intellectual and academic measures as young adults,
but they had also attained more years of total education, were three times
more likely to attend a four-year college (12% vs. 36%), and showed a reduction in teenaged
births compared to preschool controls. (An earlier follow-up at age 15 had demonstrated that
they were less likely to be retained in grade or placed in special education.) In summary, risk for
poor cognitive development was offset to a considerable degree by high-quality preschool
education that focused on cognitive development, conversational skills and social competence.
One of the most direct applications of neuro-scientific research to the classroom can be
found in the work of noted research scientists Drs. Michael Merzenich and Bill Jenkins at the
University of California, San Francisco, and Drs. Paula Tallal and Steven Miller at Rutgers
University, New Brunswick, New Jersey (Tallal, Miller, Jenkins, & Merzenich, 1997; Merzenich,
Saunders, Jenkins, Peterson, & Tallal, 1999; Miller, Linn, Tallal, Merzenich, & Jenkins, 1999).
Through their research collaboration, they discovered that difficulty in learning to read, in some
cases, stems from a language processing delay in the student’s brain. Armed with this
information, they developed a plasticity-based computer program designed to correct this delay.
With the help of computers, the complex speech sounds in sentences (such as phonemes) could
be slowed down and digitally enhanced so that they could be easily differentiated. The
scientists discovered that by using this acoustically modified speech technology in an intensive,
adaptive product, students could build a wide range of critical language and reading skills such
as phonological awareness, phonemic awareness, fluency, vocabulary, comprehension,
decoding, syntax, grammar, and other skills that had previously been inaccessible to them.
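To make the idea of "acoustically modified speech" concrete, the sketch below shows one naive way of stretching a speech waveform in time so that rapid acoustic transitions are spread out. It is only my own illustration of the general concept; it is not the Fast ForWord algorithm, which relies on proprietary acoustic processing and adaptive exercises.

# Illustrative sketch only: slow a waveform by resampling so that rapid
# acoustic transitions are spread out in time. Not the Fast ForWord algorithm.
import numpy as np

def slow_down(signal: np.ndarray, stretch: float = 1.5) -> np.ndarray:
    """Time-stretch a 1-D audio signal by linear interpolation.

    stretch > 1 lengthens the signal (e.g., 1.5 = 50% slower). Note that
    simple resampling also lowers pitch; real speech-modification tools
    use phase vocoders or similar methods to preserve pitch.
    """
    old_idx = np.arange(len(signal))
    new_idx = np.linspace(0, len(signal) - 1, int(len(signal) * stretch))
    return np.interp(new_idx, old_idx, signal)

# Example: a 0.1-second synthetic "speech-like" burst at a 16 kHz sampling rate
sr = 16000
t = np.linspace(0, 0.1, int(sr * 0.1), endpoint=False)
burst = np.sin(2 * np.pi * 300 * t) * np.exp(-30 * t)
slowed = slow_down(burst, stretch=1.5)
print(len(burst), len(slowed))    # 1600 2400 -- the burst now lasts 50% longer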
The Fast ForWord program is one of the first products of brain research with specific applications to
the classroom, and it has already benefited countless students with auditory processing
disorders, as well as those with dyslexia, attention problems, autism, and English as a Second
Language (ESL) students. In the decade since the initial foundational research, Fast
ForWord has been validated by numerous research studies ranging from university-based fMRI
brain imaging studies through scientifically designed studies in schools and clinical settings to
teacher and parent observational reports (Bao, Chang, Davis, Gobeske, & Merzenich, 2003;
Burns, 2003; Dronkers, Husted, Deutsch, Tayler, Saunders, & Merzenich, 1999; Friel-Patti,
DesBarres, & Thibodeau, 2001; Gaab, Gabrieli, Deutsch, Tallal, & Temple, 2007; Gillam, 1999;
Loeb, Stoke, & Fey, 2001).
C. The Mind/Brain Debate and Questions of Theology
The third objective of this study is to explore the possible implications that our evolving
knowledge of neuroscience may have on our Christian theology—particularly as it relates to our
call to be transformed by the renewing of our minds (Romans 12:2). In order to do so, a review
of the current literature on the mind/brain debate is necessary.
In The Mind and the Brain (2002), Jeffrey Schwartz and Sharon Begley investigate the
perennial mind/brain dichotomy debate. The authors explain the various positions in the
debate and provide deft descriptions of functionalism, epiphenomenalism, emergent
materialism, agnostic physicalism, process philosophy, and dualistic interactionism (2002, pp.
38-46). Functionalism denies that the mind is anything more than the brain state; in other
words, “mind” is nothing more than a consequence or by-product of the brain’s physical activity.
Advocates of this position include Paul and Patricia Churchland (1998), and Daniel Dennett
(1991). Epiphenomenalism accepts that mind is a real phenomenon, but insists that it cannot
have any effect on the physical world. From this perspective, consciousness is an
epiphenomenon of neuronal processes. The position maintains that the causal arrow between
brain and mind is unidirectional (from brain to mind, material to mental). Emergent
materialism asserts that mind arises from brain in a way that cannot be fully predicted from or
reduced to brain processes. That is, the attributes of mind cannot exclusively be explained by
the physical activity of the brain. In addition, the mind may have the power to effect both
mental and physical change. This position is associated with Nobel prize-winning neuroscientist,
Roger Sperry (1992). Agnostic physicalism holds that mind derives exclusively from the matter
of the brain. The “agnostic” qualifier reflects the fact that adherents to this school of thought
acknowledge that this may not be the whole picture; thus, they do not deny the existence of
nonmaterial forces. Process philosophy holds that mind and brain are manifestations of a single
reality, one that is in constant flux. Dualistic interactionism holds that consciousness and other
aspects of mind can occur independently of brain. In this view, mental states have the power to
shape brain or cerebral states, and the mind cannot in any sense be reduced to the brain.
Very broadly speaking, these positions can be summed up as those who hold a dualist
(separate parts) view of human nature and those who hold a monist (single entity) view of
human nature. In the traditional dualist position the material and immaterial aspects of human
personhood are believed to be separate. In this view, the physical body (including the brain) is
completely distinct from the non-physical mind/soul. In a monistic perspective, humans are
understood to exist as a single physical entity, and it is this complex entity that gives rise to our
thoughts, beliefs, desires, values, and so forth (aspects which have often been attributed to the
soul in a more dualistic understanding). As highlighted by the definitions in Schwartz and Begley,
monism has also been associated with other terms such as materialism, physicalism, and
reductionism—all three terms highlighting the notion that human personhood is solely
comprised of that which is material or physical.
In the past, a dualistic view has been widely accepted among Christians and non-Christians alike (Augustine, Thomas Aquinas, John Calvin, René Descartes). For Christians, the
strongest Scriptural reference to support such a position is found in Philippians 1:21-24: “For to
me, to live is Christ and to die is gain. If I am to go on living in the body, this will mean fruitful
labour for me. Yet what shall I choose? I do not know! I am torn between the two: I desire to
depart and be with Christ, which is better by far; but it is necessary for you that I remain in the
body.” The monistic view, with its emphasis on the physical, has received criticism from
Christians. This is most likely due to the fact that monism has come to be associated
with extremely reductive materialism and determinism, as evidenced by prominent non-Christian
authors such as Crick (1994), Damasio (1994, 1999, 2003), Dawkins (1989, 2006) and
Pinker (1997, 2002), each of whom would have us believe that we are nothing more than the
result of physical and chemical activity in the brain.
In spite of this, there are increasing numbers of Christians who argue that a non-reductive
version of monism is not only conceivable, but is also the most biblical and accurate
understanding of personhood (Allen, 1985; Brown, Murphy & Malony, 1998; Brown & Jeeves,
1999; Green, 2002; Green & Palmer, 2005). In Whatever Happened to the Soul? Scientific and
Theological Portraits of Human Nature (1998), noted scholars Brown, Murphy, and
Malony explore the possibility of a position that respects both Christian
theology and scientific advances while at the same time avoiding the reductive and deterministic
aspects of monism. Their offered solution is what they term non-reductive physicalism. In their
understanding, the position accepts the biological/physical nature of humans, but the non-reductive
qualifier attests to the fact that conscious decisions are real “hard” phenomena that
have the capacity to exert a top-down causal influence on the brain. Key to this position is the
notion that human beings are souls (as opposed to human beings possessing additional non-material souls).
Not surprisingly, non-reductive physicalism has been criticized by those who worry that
reductionism is inevitable and that simply inserting a qualifier does not rule out the possibility
(Jaegwon, 1994). Another significant worry with the position has been raised by Mackay (1974),
who cautions against what he terms “nothing buttery”—a phrase which denotes ontological
reductionism as expressed by the likes of Crick and Dawkins mentioned above, namely that we
are nothing but a bunch of neuronal responses.
Cooper (1989) is one Christian scholar who argues strongly for a (holistic) dualist
understanding of personhood and undertakes a detailed exegetical approach to the issue. Using
a wide selection of biblical texts (as well as intra- and extra-biblical data) 7, Cooper largely
grounds his dualistic position in the doctrine of the intermediate state, a doctrine
which he argues presupposes dualism and is affirmed by the majority of
historic Christianity. Cooper adopts a viewpoint very similar to that of Evans (1981), in which
our souls are considered not as separate substances but as one substance with the body while we
are alive, separable upon death, only to be reunited at the resurrection.
7. Cooper is critical of commentators and theologians who fail to take all the relevant intra- and extra-Biblical data into consideration when making pronouncements about Biblical texts: “The Gospels cannot be read apart from the eschatology of first century Judaism” (1989, p. xxv).
III. DISCUSSION
A. Implications of Research
i) Student Beliefs and Teacher Expectations: To be a Flower Girl or a Duchess, That is the
Question 1
The literature has clearly established that teacher expectations can have an impact on
student performance and achievement. Indeed, it is a sad indictment of society that it is not
uncommon for negative expectations to be based on arbitrary or biased grounds. In such cases,
teachers can hold expectations of students based purely on appearances and other factors that
have nothing to do with actual intellectual ability. Examples of this include racial and gender
stereotyping as well as academic expectations based on socioeconomic status.
This kind of teacher behavior (whether conscious or not) urgently needs to be addressed
because it reproduces in the classroom the social cycle of advantage and disadvantage,
ultimately widening the gap between the haves and the have-nots.
Understood in this light, the motivation behind this paper is not solely to improve the academic
performance of students as an end in and of itself, no matter how admirable a goal that might
appear to be; rather, my goal is to improve the academic performance of students as a means of
social justice: recognizing the inherent worth of each individual and the right that they have to
human and academic flourishing. As Robert Sternberg (1996) asserts, the traditional model of
1. The title of this section refers to George Bernard Shaw's play Pygmalion, which was later turned into the well-known musical My Fair Lady. In Shaw's play, Professor Henry Higgins claims he can take a Cockney flower girl, Eliza Doolittle, and turn her into a duchess. But, as Eliza herself points out to Higgins' friend Pickering, it isn't what she learns or does that determines whether she will become a duchess or not, but rather how she is perceived and subsequently treated.
intelligence may be a cause of, rather than a potential answer to, educational problems in
particular, and societal problems in general.
Teacher expectations are closely linked to the ideas of epigenetics and neuroplasticity
discussed in the previous section. It goes without saying that it is important not only for
students to know about the benefits associated with epigenetics and neuroplasticity, but also for
teachers to know (and, more importantly, truly believe) them. Unfortunately, in the
U.S. many teachers subscribe to what Bamburg (1994) dubs a philosophy of “educational
predestination” (para. 13). Such teachers regard intelligence as something that is genetically
determined and cannot be changed over time. As Linda Lumsden (1997) explains, “poor
performance in school is often attributed to low ability, and ability is viewed as being immune to
alteration, much like eye or skin color” (para. 14). Teachers who view ability as an unalterable
quality clearly cannot have high expectations for students who appear to be performing poorly.
Thus, Lumsden points out the need for teachers who “view intelligence as dynamic and fluid
rather than static and unchanging” (1997, para. 26).
Readers may be surprised to discover that Alfred Binet, inventor of the IQ test (a test commonly
assumed to summarize children's unchangeable intelligence), was actually a great advocate of the
view that intelligence could change over time. In fact, the IQ test was designed to identify
students who were struggling academically in the Parisian public school system in order that
new educational programs could be designed to help get them back on track. In his Modern
Ideas About Children, published in 1909, he had the following to say:
A few modern philosophers…assert that an individual's intelligence is a fixed quantity, a
quantity which cannot be increased. We must protest and react against this brutal
pessimism… With practice, training, and above all method, we manage to increase our
attention, our memory, our judgment and literally to become more intelligent than we
were before. (Quoted in Dweck, 2006, p. 5)
As I have already suggested, this way of thinking is especially important for teachers
who have poorly performing students. If teachers do not instill in these students the
confidence that they can improve academically, these students often come to the
erroneous and devastating belief that it does not matter how much effort they put into
their studies, because they are convinced that this effort will not pay the dividends of
improved performance. Research suggests that, over time,
students readily internalize the beliefs of their teachers, at least when it comes to their
academic abilities. In the case of low teacher expectations, when this internalization occurs, the
student’s self-concept and motivation to achieve may gradually decline to such a degree that
eventually the student’s ability to achieve to his or her potential is significantly compromised.
As Bandura’s (1997) research highlighted, an important notion to clarify with selfefficacy theory is that self-efficacy is a belief about one’s perceived capability, and as such, does
not necessarily correlate with one’s actual ability in a specific domain. Consequently, teachers
should not only have high expectations, or try to raise self-efficacy, for those students whom
they already perceive as capable. As obvious as it may sound, it is important for teachers to
have high expectations for all students. In fact, Bandura (1986) has argued that the most useful
efficacy judgments are those that slightly exceed one’s actual capabilities, because these
overestimations of ability can actually increase effort and persistence during challenging
academic situations. Thus, it is not mere wishful thinking or false hope to have expectations of
students that are higher than the students’ present abilities (provided, of course, that the
teacher is committed to providing the students with the necessary academic support to reach
the desired goal).
One possible critique of self-efficacy theory concerns causality and the
direction of causality: in other words, the proverbial chicken-and-egg problem. Thus, the
question is often asked whether feeling good about one’s academic abilities is primarily
responsible for increased achievement or whether successful performance is largely responsible
for stronger feelings of self-efficacy. Pajares (1996) argues that “because of the reciprocal
nature of human motivation and behavior, it is unlikely that such a question can be resolved” (p.
566). However, Pajares (1996) also makes the argument that, “how individuals interpret the
results of their performance informs and alters their environments and self-beliefs, which in
turn inform and alter their subsequent performances” (p. 544). What Pajares describes here is
essentially a vicious circle, or what Bandura (1978, 1986) identifies as reciprocal determinism,
the view that a) personal factors in the form of cognition, affect, and biological events, b)
behavior, and c) environmental influences create interactions that result in triadic reciprocity.
While I understand what the two authors are saying, I think that beliefs always come before
behavior, and that only thereafter does the reciprocal cycle continue. In this way of looking at
things, it is useful to see behavior as a symptom of our beliefs. Stated slightly differently,
behavior is a natural consequence of beliefs and internal dialogue. Proof of this can be found in
the fact that the reason most people are unsuccessful at personal change is that they
attempt to change behavior rather than the belief systems that are the root cause of all
behavior. Dr. James Chestnut (2005) addresses this common problem when he talks about
behavior modification versus belief system modification. He contends that belief system
modification is always far more likely to yield successful results, especially in the long term.
Behavior modification can only get someone so far, because when the proverbial rubber hits the
road, we will always revert to behavior that is in line with what we truly believe.
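To make the reciprocal cycle described above more concrete, the toy simulation below (my own illustration, not a model drawn from Bandura, Pajares, or Chestnut) shows how a small difference in initial self-efficacy belief can compound over repeated tasks once beliefs and performance feed back into each other; the update rule and numbers are purely hypothetical.

# Toy feedback loop: belief drives effort and performance, and the interpreted
# result nudges belief up or down. Purely hypothetical numbers.
def simulate(initial_belief: float, rounds: int = 10) -> float:
    """initial_belief is in [0, 1]; returns the belief after `rounds` tasks."""
    belief = initial_belief
    for _ in range(rounds):
        performance = 0.5 + 0.5 * belief                      # crude stand-in, in [0.5, 1.0]
        belief = min(1.0, max(0.0, belief + 0.2 * (performance - 0.75)))
    return belief

print(round(simulate(0.4), 2))    # ~0.24 -- a low starting belief drifts downward
print(round(simulate(0.8), 2))    # 1.0  -- a high starting belief drifts upward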
I contend that many teachers, in spite of the plethora of educational literature on the
subject, still underestimate, and do not fully believe, the crucial role of expectations. This is a
frustrating reality because the expectation of success is a very powerful force. Skilled physicians
have known for centuries that positive expectations play a crucial role in healing many illnesses.
Until about 150 years ago, medical therapeutics mainly involved the doctor-patient relationship
and the placebo effect. (Actually, most treatments by physicians in times past would have been
more harmful than beneficial to patients, if it weren’t for their faith in the healing power of the
treatments and their own recuperative powers.) The benefits of the placebo effects are
determined by the expectations and hopes shared by the patient and the doctor in much the
same way that high expectations by teachers have an impact on students. What is worth
clarifying here is that placebo responses are clearly not simply a case of patients fooling or
tricking themselves out of pain—placebo administration can produce real physiological changes.
The same is true for raising student expectations or self-efficacy beliefs: real physiological
changes in the brain are possible.
To summarize, most of the research relating to student
beliefs reveals that many students have difficulty performing in school, not because they are
incapable of performing successfully, but rather because they are incapable of believing that
they can perform successfully. This belief usually stems from the notion that they are
not smart enough in general (fixed intellectual ability), not skilled enough in the particular subject
at hand, or both. To a large extent, teachers need to be held accountable for this situation, for failing to
have high expectations of all their students and for not actively dispelling the belief of genetic
determinism and fixed intellectual capacity (both in their own minds and the minds of their
students).
ii) Epigenetics: Genes are “so twentieth century!” 2
As pointed out in the review of literature, the field of epigenetics is showing how we
have become too invested in the theory that genes control everything. Indeed, we regularly
read in scholarly journals or hear on the news that yet another mental or physical trait has been
linked to a particular gene. Everything from various forms of psychosis, to alcoholism, to
intelligence, to sexual preference (and the list is constantly growing) has been suggested to be
genetically determined. However, the pathway from gene to human trait is far more complex
and less predictable than we have been led, particularly by the popular media, to believe.
Dorothy Nelkin, an American sociologist, has focused much of her research on raising awareness
about the ramifications of unchecked scientific advances and the unwariness of the public
towards scientific authority. She writes:
In a diverse array of popular sources, the gene has become a supergene, an almost
supernatural entity that has the power to define identity, determine human affairs,
dictate human relationships, and explain social problems. In this construct, human
beings in all their complexity are seen as products of molecular text…the secular
equivalent of a soul—the immortal site of the true self and determiner of fate. (Nelkin &
Lindee, 1995, p. 193)
I have certainly witnessed this phenomenon taking place throughout my educational
experience. When I was learning science in high school during the mid 1990s, we were taught
the position that physical structures (e.g., skin tone, height, hair color, eye color, etc.) were
determined by genes; however in the decade following my graduation from high school, there
2. This humorously scornful phrase is borrowed from the children of Dawson Church (Church, 2007, p. 57).
has been a slow but steady shift with regard to what genes are held accountable for, to the
point that we have recently become so invested in the “primacy of DNA” that we have fully
bought into the notion that genes are responsible for nearly everything—emotional and
behavioral responses included. But what has caused us to get to this place where genes have
become so omnipotent?
What is important to point out here is that behind science is philosophy; in fact, it would
not be unfair to say that in many instances science is largely a proof of philosophy.
Therefore, in order to understand the nature of the science we find ourselves living with today,
it is incumbent upon us to go back and understand the philosophical roots that have led to
today’s world of modern science, particularly as it relates to our understanding of genetic
theory. 3
Less than a hundred years after Isaac Newton formalized the science of
physics in Philosophiæ Naturalis Principia Mathematica (1687), 4 biology emerged as a formal
science in its own right. However, biologists used physics to inform their own discipline,
and within this framework they set off on the premise that all living things are essentially
physical machines. Biologists thus reasoned that if you take things apart (e.g., a body) you will
be able to understand how they work—and biologists wanted to know what mechanism
controlled the human body. During this time, René Descartes, in his 1649 work Les passions de
l'âme (Passions of the Soul), was advocating the concept that we are composed of a separate
mind and body. Physicists, however, considered the mind an epiphenomenon, something that
derived from the operation of a physical machine. But the mind itself was energy, and as energy
3. For this brief history of biology, I am largely indebted to notes that I took during a Bruce Lipton conference in September 2007 in London, England, entitled “The Biology of Belief.”
4. Latin for “Mathematical Principles of Natural Philosophy.”
in a physical world, it could not impact matter. (This was a given in the world of atomists: only
matter can affect matter.) Since the mind is not matter, they reasoned that it could not affect
the body, and consequently, for the time being, biologists let go of the idea of the influence of
the mind on human biology. Instead, they decided that they would look into matter.
In 1859, Darwin published his On the Origin of Species. In it he argued that traits in an
organism were related to traits of the parents, and the theory of evolution was born. At this
point in history, scientists were very involved in breeding plants and animals, and their
conclusion in the face of Darwin's discovery was that whatever was controlling the traits of a
person must have been provided via the germ cell, since that was the physical thing
that the parents provided that gave rise to new offspring. Consequently, they reasoned that
there must be something in the germ cell (the gametes such as egg cells and sperm cells) that
controls the traits of individuals. This belief system set scientists off on a mad hunt to try and
dissect life down to its molecular nuts and bolts.
In 1892, August Weismann, a German biologist studying isolated cells, began to observe a
structure that appeared just before the cell divided (chromosomes) and noted that each
daughter cell received a complete set. It was then suggested by Weismann (and others) that
chromosomes represented the physical units of heredity. When scientists isolated
chromosomes they found that they were made up of both protein and DNA, and they posed the
question of which one was involved in the carrying of traits. In an attempt to answer this
question, they started to isolate protein and DNA. Then in 1944, Avery, MacLeod, and McCarty
discovered that they could change the character of one species by employing the DNA of
another species (and that the same thing didn’t happen when they tried this with protein). Their
discoveries were published in the Journal of Experimental Medicine. Prior to this work, no
biological assay was available to link genetic information with DNA. Although the authors are
commonly not as well-known to the general public, the work of Avery, MacLeod, and McCarty
preceded by nearly a decade the Nobel Prize-winning work of James Watson and Francis Crick.
Consequently, Nobelist Sir Peter Medawar called the Avery team’s isolation of genes in pure
chemical form “the most interesting and portentous biological experiment of the twentieth
century” (Lederberg, 1994).
Nearly a decade later, the information discovered in 1944 by the Avery team was
elaborated upon and revealed to the public by Watson and Crick, who professed to have
uncovered the secret of life in the discovery of the DNA double helix (Watson & Crick, 1953).
When asked what controls DNA, Watson and Crick concluded that DNA controls
itself because if you split the double helix, one strand would make a complementary copy of
itself. Thus, the DNA double helix appeared to be the “be all and end all” for information
control in biology. Based upon this information, Francis Crick came up with what is now
recognized as the Central Dogma of biology, namely that information flow in biological systems
is a one-way street. Essentially, then, according to the Central Dogma, we humans are simply
protein machines and our behaviors are controlled by our proteins, and the character of our
proteins comes from our DNA.
During the 1960s and 1970s, research began in earnest into how genes control life. This
ultimately led to the concept of genetic determinism, the lingering effects of which are still
strongly present today. Under this way of thinking, at the moment of conception, our fate is
preprogrammed by the selection of genes in our genome. By the 1980s the Central Dogma had
been so widely accepted and painstakingly spelled out in science textbooks that many biologists,
rather arrogantly, assumed that they knew how life worked! Therefore, they figured that the
final and ultimate project that they needed was the human genome project (which was largely
driven by the possibilities of genetic engineering).
In 1987 the project was founded, and in 2001 the results came in. What scientists
discovered was that there was a great mismatch: the number of genes found was significantly
smaller than the number of proteins, and if each protein requires its own gene, then there is a
major problem with the theory. Thus, one of the key problems with the Central Dogma is that the
number of genes in the human genome is insufficient to carry all the information required
to create and run a human body. In the 2001 issue of the prestigious European journal Nature,
an issue which focused on the human genome project, Nobel Laureate David Baltimore, a Ph.D.
geneticist, had the following to say in the face of the human genome results:
But unless the human genome contains a lot of genes that are opaque to our computers,
it is clear that we do not gain our undoubted complexity over worms and plants by using
more genes. Understanding what does give us our complexity—our enormous
behavioral repertoire, ability to produce conscious action, remarkable physical
coordination, precisely tuned alterations in response to external variations of the
environment, learning, memory… need I go on? – remains a challenge for the future. (p.
816, emphasis added)
Plainly put, we cannot be controlled by our genes because there simply aren’t enough of
them, and this is where the new direction of biology comes in. Thus, to refer back to the quote I
used for the title of this section, “the old view that our genes contain indelible instructions
governing the functioning of our bodies is… ‘So twentieth century!’” (Church, 2007, p. 57)
However, if genes are not responsible, then we are left to conclude that there must be some
other mechanisms of control. Given the Central Dogma, this conclusion is nothing short of a
radical shift in biology. As Dr. Bruce Lipton explains, “genes cannot turn themselves on or off.
In more scientific terms, genes are not ‘self-emergent.’ Something in the environment has to
trigger gene activity” (2005, p. 26, emphasis added). The next logical question to ask, then, is
what does control our biology: what is it in the environment that triggers gene activity? One of
the possible answers, which is of most significance for the purposes of this study, would be the
mind. 5
In Bruce Lipton’s (2007) understanding of epigenetic influences, one of the major
switches that controls biology is perception, not genes. He points out that there is a “switch” in
the human body that largely controls perceptions, and that this switch is the hypothalamus,
because it is the portion of the brain that remembers the value of perceptions. If the
hypothalamus receives a signal, it communicates with the pituitary gland (which is often called
the “master gland” because it acts like a government, determining our next move). If the
pituitary gland receives a negative perception, it triggers the fight/flight mode: the adrenal
system is activated, releasing the relevant hormones. The first things that happen are that growth
and the immune system are shut down. Cerebral blood vessels are squeezed, which
forces blood to the hindbrain, so we rely on reflex (rather than logic and reason) thinking. In other
words, when we are under stress, we are less intelligent!
But where do perceptions come from in the first place? 6 They can come from three
places: First, they can, in fact, come from genetics. These types of perceptions would be called
instincts and in the nature/nurture debate, they would fall under the nature category. Second,
5. Certainly other environmental triggers can have an impact on gene activity. Other examples include trauma (physical damage to the nervous system) or chemical triggers (e.g., toxins or changes in diet). Perhaps one of the most famous examples of epigenetic breakthroughs is the previously mentioned Waterland and Jirtle (2003) study on Agouti mice. As one recalls, during this experiment, the researchers were able to alter the expression of the Agouti gene simply by feeding the mother mice a methyl-rich diet during pregnancy. In other words, the environmental “switch” which dimmed the gene's deleterious effects was the change in diet.
6. I will also deal further with perceptions and the brain in the next section of this paper.
we can have perceptions from our subconscious mind, which would largely be termed learned
habits. These perceptions would fall under the nurture category. Third, we have perceptions
from our conscious mind and these are created by volitional (creative) programming. These
would not fit neatly into either the nature or the nurture category, but would instead fall under
the category of consciousness. Within the epigenetic framework, this third category would
constitute the most powerful of the three groups because ultimately conscious or volitional
creative programming of the mind can re-write (or trump) both nature and nurture. As Lipton
(2007) boldly asserts, how you choose to see, or perceive, your life is how your biology will
respond. This might be considered a very audacious claim to make, and certainly a claim that
would not be without its challenges from certain scientific quarters; however, as I learn more
and more about the way the human body works, it is a claim that I have not only come to accept,
but one that I would strongly advocate. Consequently, when one recalls the educational
research on expectations and self-efficacy, one can see that such claims may, in fact, not be as
far-fetched as one might initially imagine.
This assertion by Lipton is largely reinforced by the research of Dawson Church (2007).
Church’s work is primarily focused on epigenetic medicine and healing, but he contributes to
this paper in his understanding of perception and intentionality. As Church (2007) points out,
one possible mechanism that would explain the ability of our brains to override our senses, and
indeed our genes, is the surprising recent discovery that the bundles of our nerve cells running
from our brain to our senses outnumber the ones running in the other direction by a factor of
ten to one. Thus, for every neural bundle running from our senses to our brain, there are
roughly ten neural bundles running from our brain to our sensory organs. In other words,
“there’s a lot more bandwidth for signals going from the brain than there is for signals going to
it” (Church, 2007, p. 100). Consequently, as University of Oregon, Portland, neuroscientist
Michael Posner says, “the idea that perceptions can be manipulated by expectations is
fundamental to the study of cognition” (Church, 2007, p. 100).
Where does all this leave us in terms of application to the classroom? What it shows us
is that we have become far too reliant on the notion that genes control our destiny. Many
students have bought into the lie sold to them that intelligence is a fixed trait and that
there is nothing they can do to change that. What the field of epigenetics is showing is that
“genes are not fate,” and that we can in fact be active participants in our biology. Therefore,
the perceptions that students hold, particularly their self-perceptions, become very important.
There are some critics of this position who point to certain diseases that are clearly
attributable to genetic influence and argue that the field of epigenetics is getting ahead of
itself in terms of its applications. Lipton (2005) agrees that there is no doubt that some diseases
(such as Huntington's chorea, beta thalassemia, and cystic fibrosis) can be blamed entirely on
one faulty gene. However, he is equally quick to highlight the fact that “single gene disorders
affect less than 2% of the population” (p. 51). Thus, he argues that the strong convictions that
people have about genetic determinism can largely be blamed on the popular media: “The
confusion occurs when the media repeatedly distort the meaning of two words: correlation and
causation. It’s one thing to be linked to a disease; it’s quite another to cause a disease, which
implies a direct, controlling action” (p. 51).
As I have already highlighted, the result of such confused thinking is that we have
developed a mentality that we are victims of heredity, wholeheartedly believing that we are
stuck with the genes that control our life. This is the problem that I alluded to above with
regard to students. When this type of thinking starts to dominate a person’s life, it can very
quickly lead to feelings of powerlessness. For example, if I believe that a particular gene (for the
sake of argument, let's take one for low intelligence) runs in my family, I can start to live a life of
fear that I will never be smart or that I will never be able to make a successful living for myself.
Furthermore, if I believe that I can't do anything about the situation, then it is very easy to
develop irresponsible or apathetic attitudes and lifestyle choices. Thus my behavior and my
lifestyle choices are influenced by faulty perceptions, which have far-reaching consequences.
This is clearly an attitude that teachers need to be able to address responsibly and scientifically
with their students.
The good news about advances in epigenetics is that we are learning that such changes
are possible. What is equally encouraging is the fact that the epigenome can change in response
to the environment throughout an individual’s lifetime. It was originally thought that the
pattern of an individual’s epigenome was firmly established during early fetal development;
however, while that early period is still seen as a particularly crucial time, scientists are
discovering that change in the epigenome can occur at any age (Watters, 2006). So while genes
are certainly an ontological necessity for human traits and behavior, they are by no means
sufficient: “Interactions with the environment are crucial at every stage of development. While
genes provide a range of possibilities for specific behaviors, the environment moderates and
modulates genetic control, and thus softens what otherwise might be complete genetic
determinism” (Anderson, quoted in Brown, Murphy, & Malony, 1998, p. 69, emphasis added).
An interesting, although not often publicized, fact is that even Charles Darwin conceded,
near the end of his life, that his evolutionary theory had not given due attention to the role of
the environment. In an 1876 letter to Moritz Wagner, Darwin wrote:
In my opinion, the greatest error which I have committed has been not allowing
sufficient weight to the direct action of the environment, i.e., food, climate, etc.,
independently of natural selection… When I wrote Origin, and for some years
afterwards, I could find little good evidence of the direct action of the environment;
now there is a large body of evidence. (In Lipton, 2005, p. 50)
Clearly then, even great minds can be mistaken. The expression of our genes is not random; it is
based on what stimuli our genes receive. Furthermore, the stimuli the genes receive are not
random; they are based on the environment (which includes our thoughts). As James Chestnut
(2005) argues:
Who we are… and the quality of our lives are determined by the synapses and
neurological pathways we consistently build and fire and the genes we consistently
stimulate and express. As oversimplified as it seems, if you want to be depressed,
consistently think depressing thoughts. If you want to be happy, consistently think
happy thoughts. In other words, how we consistently talk to ourselves, the content of
our internal dialogue, determines which pathways we fire, which genes we stimulate
and which biochemistry and physiology we express. (p. 84)
With the ample research that now supports the idea that the environment plays an equally
important, if not greater, role than genes in human biology, is it not time that students start to
learn about the implications and benefits that such research can provide not only in terms of
their learning capabilities but also in terms of their overall self-concept? After all, as Sternberg
(1996) points out, “perhaps more important from an educational point of view is to realize that
the heritability of intelligence is a question entirely distinct from that of the modifiability of
intelligence. Intelligence could be partially or even highly heritable and, at the same time,
partially or highly modifiable” (p. 14). Sternberg goes on to give two excellent examples that
highlight this point: Height has a heritability coefficient in excess of .9, but heights have gone up
in recent generations in the U.S. and elsewhere. Phenylketonuria has a heritability of 1 (i.e., it is
completely heritable), but its symptoms (such as mental retardation) can be alleviated by a
wholly environmental intervention (withholding of phenylalanine from the diet from the time of
birth). Thus, our ability to modify intelligence is not determined in the least by the heritability of
intelligence (pp. 14-15).
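Sternberg's distinction can also be illustrated with a small simulation. The sketch below (my own toy example, not taken from Sternberg) generates a trait that is roughly 90% heritable within a group and then applies a uniform environmental improvement: the within-group heritability stays high while the group mean shifts, echoing the height and phenylketonuria examples.

# Toy illustration: high heritability within a group does not prevent the
# environment from shifting the group mean.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
genetic = rng.normal(0, 1, n)          # genetic contribution
env_noise = rng.normal(0, 0.33, n)     # small within-group environmental variation

# Group B is identical to Group A except for a uniform environmental boost
# (e.g., better nutrition or schooling) worth 10 points for everyone.
trait_a = 100 + 15 * (genetic + env_noise)
trait_b = 100 + 15 * (genetic + env_noise) + 10

h2 = np.var(15 * genetic) / np.var(trait_a)
print(f"within-group heritability ~ {h2:.2f}")                          # roughly 0.90
print(f"mean A = {trait_a.mean():.1f}, mean B = {trait_b.mean():.1f}")  # ~100 vs ~110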
iii) Neuroplasticity: Mind Altering Stuff
As previously stated, for over a century the prevailing idea in medical science was that
the human brain could not grow new brain cells, or neurons. Thus, until fairly recently, scientific
literature led us to believe that we were doomed by the notion of a hardwired brain, and that
consequently we should resign ourselves to the fact that the proverbial thinking about old dogs
and new tricks had scientific validity. However, through seminal research, such as the studies
referred to in the review of literature, literally countless numbers of studies have shown that
humans do indeed grow new neurons, and neurogenesis takes place not only during childhood
and early adolescence but throughout an individual’s lifetime. What’s more is that these new
neurons become fully functional and integrated. Consequently, neuroplasticity (also referred to
as brain re-mapping or cortical plasticity) is now established as a fact of mainstream science.
Interestingly, the fields of neuroplasticity and epigenetics have a great deal of overlap.
Doidge (2007) explains that when we learn, we alter which genes in our neurons are expressed
or turned on. Essentially our genes have two functions. The first function is the “template
function” which allows our genes to replicate or make copies of themselves in order that they
can be passed on from one generation to the next. This template function, much like a genetic
Xerox machine, is beyond our control. The second function is the “transcription function.”
Every cell in our body contains each one of our genes, but obviously not all of these genes get
expressed. When a gene is expressed, it manufactures a new protein which changes the
structure, as well as the function, of the cell. The reason that this is called the transcription
function is that when the gene is expressed, information about how to make these proteins
(a recipe if you will) is transcribed or read from the individual gene. The key distinction here is
that we do have control over this process, because the transcription function is influenced by
what we think and do. Thus, when we learn, our minds also affect which genes in our neurons
are transcribed: With our thoughts we can shape our genes (epigenetics) which in turn shapes
our brain’s microscopic anatomy (neuroplasticity).
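Purely as a caricature of this two-function distinction (the class, method, and gene names below are my own hypothetical illustration, not a biological model from Doidge), one might picture replication as a faithful copy that is outside our control, and transcription as a switch that experience can flip:

```python
# Toy illustration of Doidge's distinction (hypothetical names, not a biological model):
# the template function copies a gene unchanged, while the transcription function --
# whether the gene is read and expressed as protein -- can be switched by experience.

class Gene:
    def __init__(self, name: str, expressed: bool = False):
        self.name = name
        self.expressed = expressed  # transcription state, open to environmental influence

    def replicate(self) -> "Gene":
        """Template function: a faithful copy, beyond our control."""
        return Gene(self.name)

    def transcribe(self, signal_from_experience: bool) -> None:
        """Transcription function: expression is turned on or off by signals that,
        on Doidge's account, ultimately trace back to what we think and do."""
        self.expressed = signal_from_experience


gene = Gene("hypothetical_learning_gene")
gene.transcribe(signal_from_experience=True)   # learning switches expression on
print(gene.expressed)                           # True
print(gene.replicate().expressed)               # the copy itself starts unexpressed
```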
In keeping with the neuroplastic understanding that our thoughts can alter the structure
of our brains, I want to turn to the work of Dr. Joe Dispenza and his 2007 book, Evolve Your Brain. 7
In this book, Dispenza highlights what he perceives as a major shift going on in science: “In the
old paradigm, your mind or consciousness (what is considered the essential ‘you’) is regarded as
an epiphenomenon of your brain. In the new paradigm, your consciousness is the ground of
7
Dr. Dispenza’s research is motivated by his own powerful story of personal transformation. In
1986, when Dispenza was 23 years old, he was competing in a triathlon in Palm Springs when he was
knocked off his bicycle by an SUV traveling at 55 mph. The orthopedic surgeon diagnosed him with
multiple compression fractures of the thoracic spine with the T-8 vertebra more than 60% collapsed. The
normal procedure in such cases of spinal injury is complete thoracic laminectomy with Harrington rod
surgery. The rods that Dispenza required would need to be eight to twelve inches, from the base of his
neck to the base of his spine. He was told that there was a greater than 50% chance that he would never
walk again if he opted to not have the surgery. According to the specialist, the deformity created a
structural imbalance that would cause the spinal bone fragments to move into the spinal cord area and cause
instant paralysis: Dispenza would be paralyzed from the waist down. The surgeon told Dispenza that he
had never heard of a patient in the United States opting against the surgery. Much to the disapproval of the
surgeons, Dispenza elected to forego the surgery. As a chiropractor, he believed in the innate intelligence
of his body to heal itself. He understood that this innate intelligence runs through the central nervous
system from the midbrain and other subcortical regions of the brain to the body. After checking out of
hospital, for six weeks Dispenza spent three hours a day (morning, noon, and night) mentally visualizing
the reconstruction of his spine. In addition Dispenza ate a raw food diet, and started conservatively
stressing his spine on an incline board, starting with two degrees above horizontal and gradually building
up to sixty degrees. He did this because he realized that in order for the proper amounts of calcium to be
laid down in his broken bones, he would need to apply some gravitational stress on the damaged segment.
After this he gradually started swimming. Nine and a half weeks later, Dispenza got up and walked back
into his busy life. Having made a complete recovery without the need for invasive surgery, Dispenza
decided that he would spend a major portion of his life studying the phenomenon of mind over matter and
how consciousness creates reality (condensed from the first chapter of Dispenza’s 2007 book, Evolve Your
Brain: The Science of Changing Your Mind).
being and your brain is the epiphenomenon” (2007, p. xv). Consequently, Dispenza argues that
we need to use our conscious minds to effect neuroplastic change in our brains. In order to do
this we have to understand what happens in our brains and bodies when we have certain
thoughts.
Every time we have a thought, the brain makes a chemical and as a result, we start to
feel the way we think. In other words, if we have a negative thought, in literally a matter of
seconds, we begin to feel that way. Then we begin to think the way we feel, which makes more
chemicals and so on and so forth. This becomes a vicious circle which can be very hard to break,
because ultimately this ongoing process of thinking and feeling and feeling and thinking enables
us to “memorize” ourselves as a personality: We define our identity based on a feeling.
Unfortunately, something very important takes place as a result. This ongoing process of
thinking and feeling and feeling and thinking conditions the body over time to become the mind.
(In this way, a habit is actually an example of when the body becomes the mind because with
habits the body almost seems to act independently of the mind.)
Let’s look at another example: A person may have had a negative experience in the past.
For the sake of this research, let’s say that a student did poorly on an important Math exam and
as a result concluded that she was useless at Math and that she must therefore not be very
intelligent. This student had this experience, she remembers this experience, and every time
she re-visits the experience in her mind, she makes the same chemicals in her brain as if the
experience were happening all over again. 8 Essentially that is what memories are: A memory is
when we create the same chemistry of that experience. So this student may go through the
process of thinking and feeling and feeling and thinking, but then, after a period of time, she just
feels, and she reaches the mistaken conclusion that this feeling is who she is and how she thinks.
As I have previously explained, what happens when we engage in such cycles is that we
condition the body to be the mind (when the body is meant to be the servant of the mind). In
other words, the servant becomes the master. This is a problematic and dangerous substitution
because what it means is that the mind essentially goes to sleep and the body starts running the
show, even though the mind thinks that it is running the show.
The question this raises is: if we believe that our thoughts can change our brains and can shape reality (through our perceptions), then why do we keep creating the same
negative circumstances over and over again? Surely, the student in the above example does not
wish to feel like such an academic failure? The answer may, in part, seem obvious: If we are not
thinking anything differently, our brains will never change. To change, then, is not only to think
greater than the environment, but it is to think greater than we feel. If we use feelings as a
barometer for change, we will always talk ourselves out of greatness. 9 Unfortunately, many of
us, and many of our students, have become addicted to our emotions, and when this is the case,
8
This example essentially highlights similar principles to those described in Pavlov's (1927) classical conditioning model. The original and most famous example of classical conditioning involved the salivary conditioning of Pavlov's dogs. During his research on the physiology of digestion in dogs, Pavlov observed that, instead of only salivating in the presence of meat powder (an innate response to food that Pavlov referred to as an "unconditioned response"), the dogs began to salivate in the presence of the lab technician who normally fed them. As a result of this observation, Pavlov predicted that, if a particular stimulus in the dog's surroundings were present when the dog was given meat powder, the result would be that, over time, this stimulus would become associated with food and cause salivation on its own. In his initial experiment, Pavlov used a metronome to call the dogs to their food and, after a few repetitions, the dogs started to salivate in response to the metronome. Thus, a neutral stimulus (metronome) became a conditioned stimulus (CS) as a result of consistent pairing with the unconditioned stimulus (US—meat powder in this example). Pavlov referred to this learned relationship as a conditional reflex (now called a conditioned response).
9
At this point, it is interesting, and I believe by no means coincidental, to note that the world's great charismatic leaders have always believed in a destiny that they could not perceive with
their senses—they couldn’t see it, taste it, touch it, smell it, or hear it—but it was alive for them in their
minds. Whether it was Mahatma Gandhi, Queen Elizabeth, Martin Luther King or Nelson Mandela—they
all essentially said the same thing: I have this vision, I have this dream, I have this concept in my mind, and
I am going to live as if it’s already happening. From a Christian perspective this is why we are called to
live lives of faith (Hebrews 11:1), trusting that what we don’t see is reality (the substance of things hoped
for, the evidence of things not seen) and living our lives as if our prayers have already been answered.
there is always a gap between the way things appear and the way things really are. In other
words, we perceive reality according to the way our brains are wired. Thus, many students believe
that they are not academically capable, not based on reality, but rather because they have
reminded themselves of their failures so many times that they now believe the cognitive, or
perceptual, distortion.
What needs to be clarified here is that there are two elements of perception. The first
is known as “bottom-up” perception and this type of perception is driven by the physical
characteristics of a stimulus which enter our cognitive system through one of our five senses—
sight, sound, smell, touch, or taste. Once information from one of our senses has made its way
into the brain through various electrical and chemical processes, it then has to be interpreted.
This is where “top-down” perception comes into play. During top-down processing, the
electrical signals are interpreted based on an individual’s knowledge, expectations, or thoughts
in order to make sense of the world (Ratey, 2001). Thus, it can be seen that perception is not
simply determined by incoming sensory stimulus, but it is better understood to be the brain's
“best” interpretation of incoming data. If we recall the recent discovery referred to by Church
(2007) that we have ten times as many neural bundles running from our brain to our sensory organs as we do running in the opposite direction, it is clear that "top-down" processing holds the greater sway in perception. Thus, while the
brain will always try to give an accurate interpretation of reality, it is, in a very real sense, bound
by an individual’s previously constructed beliefs and attitudes. Consequently, it is easy for the
brain to have misperceptions. As Ratey (2001) explains:
an act of perception is a lot more than capturing an incoming stimulus. It requires a
form of expectation, of knowing what is about to confront us and preparing for it… We
automatically and unconsciously fit our sensations into categories that we have learned,
often distorting them in the process (p. 55, emphasis added).
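As a toy illustration of this weighting (my own sketch, not a model taken from Ratey or Church), an interpretation can be pictured as a blend of incoming evidence and prior expectation, with the prior often carrying most of the weight:

```python
# Toy illustration (not drawn from Ratey or Church): how a strong prior expectation
# can dominate the interpretation of an ambiguous sensory signal.

def interpret(sensory_evidence: float, prior_expectation: float, prior_weight: float) -> float:
    """Blend bottom-up evidence with a top-down expectation.

    A prior_weight near 1.0 means the interpretation is driven mostly by what the
    perceiver already believes; near 0.0 means it tracks the incoming data.
    """
    return prior_weight * prior_expectation + (1 - prior_weight) * sensory_evidence


# A student who expects to fail (prior_expectation = 0.2 on a 0-1 "I can do this" scale)
# reads a mixed result (evidence = 0.6) far more negatively when the prior dominates.
print(interpret(0.6, 0.2, prior_weight=0.8))  # ~0.28: the expectation wins
print(interpret(0.6, 0.2, prior_weight=0.2))  # ~0.52: the evidence wins
```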
The solution to faulty perceptions and the way out of the problem of the body becoming
the master of the mind is to recondition the body with a new mind. In other words, we have to
show the body who’s the boss! This is where Donald Hebb’s neuroscientific law comes into play
(Hebb, 1949). Hebb’s law can best be summarized with the popular neuroscientific slogan,
“nerve cells that fire together wire together.” If someone thinks the same thoughts and
performs the same actions over and over again, then they will force the brain to fire in very
predictable ways. Over time, the brain will hardwire into "a box" (as in the concept of "thinking outside the box": the "box" in your brain is the most commonly fired, most strongly wired set of circuits that we reinforce every day by thinking and acting in the same way). Consequently,
to think outside the box is to force the brain to work in new patterns and sequences. This is not
always easy because neuroplasticity can work both in our favor and against us. As Doidge (2007)
points out: “The plastic paradox is that the same neuroplastic properties that allow us to change
our brains and produce more flexible behaviors can also allow us to produce more rigid ones” (p.
242). Thus, in most situations where certain thought patterns have become fixed in our brains,
“unlearning” has to take place before we can re-learn and re-wire. If someone begins to re-wire
their brain and they haven’t addressed the automatic thoughts, the automatic feelings, and the
automatic behaviors that they live by every day, it will only take one stray thought from the
environment and they will begin to do the same things again and yet expect a different result.
In order to successfully make use of Hebb’s law to help fix the problem of misperception,
we have to stop the association between certain nerve cells firing together. If a student thinks
of mathematics and automatically views herself as an academic failure, she needs to consciously
make the effort to form new associations in her brain. While this will undoubtedly take time
and focused attention, the good news is that the converse of Hebb’s law also holds true: nerve
cells that no longer fire together no longer wire together. Consequently, if this same student,
every time she thinks of her mathematical ability, starts reminding herself that math ability is
malleable over time and that she has the potential to be a better math student, eventually her
previous negative association will atrophy and cease to exist. As Dispenza reminds us, “How and
where we place our attention, what we place our attention on, and for how long we place it,
ultimately defines us on a neurological level” (2007, p. 3). Neurological change is indeed
possible.
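The thesis describes Hebb's principle only qualitatively; purely as an illustration, the strengthening-and-atrophy dynamic can be sketched as a simple update rule (the learning-rate and decay values below are arbitrary assumptions of mine, not parameters from Hebb or Dispenza):

```python
# Minimal sketch of Hebb's rule ("cells that fire together wire together") and its
# converse (connections that stop being used decay). Illustrative only.

def hebbian_update(w: float, pre: float, post: float,
                   learning_rate: float = 0.1, decay: float = 0.05) -> float:
    """Strengthen the connection w when pre- and post-synaptic activity coincide;
    otherwise let it decay back toward zero."""
    return w + learning_rate * pre * post - decay * w


w = 0.0
for _ in range(20):                       # repeated pairing: the association grows
    w = hebbian_update(w, pre=1.0, post=1.0)
print(round(w, 2))                        # a strengthened weight

for _ in range(20):                       # the pairing stops: the association atrophies
    w = hebbian_update(w, pre=1.0, post=0.0)
print(round(w, 2))                        # the weight has decayed toward zero
```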
As the above example suggests, one of the best ingredients for this process is new
knowledge. Old thought patterns need to be replaced with new ones. Bruce Lipton sums this
notion up nicely. He says, “As we all know, knowledge is power and consequently, knowledge of
self provides self-empowerment" (2005, p. 17). Giving our students the gift of accurate self-knowledge is one of the most empowering things that we as teachers can do for them.
Returning to the history of biology in the previous section, we have now
established that a so-called “immaterial” mind can indeed leave material traces on the brain. 10
As Doidge points out, “each thought alters the physical state of your brain synapses at a
microscopic level" (2007, p. 213). Findings such as those of Kandel (2003, 2006), Dispenza
(2007), Ratey (2001) and countless others, overturn the centuries of confusion that have grown
out of the work of French philosopher René Descartes, who argued that the mind and brain are
made out of different substances and consequently are governed by different laws (Damasio,
10
The connection between the brain and the mind will be further explored in a following section.
However, it is worth noting at this point, that some scientists and philosophers argue that the reason why an
immaterial mind can leave material traces on the brain is because the two are in fact one and the same
thing, they are not separate entities at all.
1994). Indeed, Descartes’ notion of the “ghost in the machine” 11 has perhaps slowed the
acceptance of brain plasticity more than any other idea. What is interesting is that even
within his own theory, Descartes could never really credibly explain how the immaterial mind
could influence the material brain. His view seemed to open up an unbridgeable gap between
mind and brain which is finally beginning to be traversed in the fields of philosophy (including
theology), cell biology, and neuroscience.
iv) Transforming Minds—Does A Christian Understanding of the
Mind/Brain Debate Have Any Impact on One’s View of the Role of Education?
Therefore, I urge you, brothers, in view of God’s mercy, to offer your bodies as living sacrifices,
holy and pleasing to God—this is your spiritual act of worship. Do not conform any longer to the
pattern of this world, but be transformed by the renewing of your mind. Then you will be able to
test and approve what God’s will is—His good, pleasing and perfect will (Romans 12: 1-2, New
International Version, emphasis added).
As referenced earlier, some scientists and philosophers are reaching the conclusion that
brain and mind, or brain and soul, are not two separate entities as was previously thought, but
rather, that they are one and the same thing. Although neuroscience originally began as a
biological discipline, in more recent years it has expanded to include many other disciplines, two
of which are (neuro)philosophy and (neuro)theology. One possible reason
for this is explained in the book, The Educated Brain: “If the neurosciences seem so crucial, it is
ultimately because—beyond what is relevant for each area of application or cooperation—they
show the extent to which the brain is a fundamental organ for the constitution of human
personhood” (Battro et al., 2008, p. 20, emphasis added). In other words, how we understand
the brain/mind has implications for how we view ourselves as human beings. Furthermore, if
our role as educators involves influencing and changing the neurological structures of our
11
Descartes’ concept of the soul (or mind) as an independent constituent of humans, controlling
the body which it animates, was called the “ghost in the machine” by Gilbert Ryle (1949) in The Concept of
the Mind.
students’ brains, what repercussions might this have on students’ sense of personhood? At an
even deeper level, if mind and soul are regarded as interchangeable, and mind and brain are
one and the same thing, are we in fact having an influence on our students’ very souls as we
engage them in education? 12 These are profound and important questions to consider.
As I have tried to form my own position within the various debates, I have found that I
can closely relate to the experience of Nancey Murphy (2006). She reports:
When I first became interested in the topic of human nature I [falsely] believed that a
close look at the Bible and the development of Christian theology could settle the issue
of what Christians ought to believe about human nature. Surely I could grab a book
from the library that traced the history of the issue. So far I have failed to find one.
Since I am not competent to do primary research in either church history or biblical
studies, I turned to secondary sources in order to try and put together my own account
(p. 6).
I must confess to finding myself very frustrated that I, like Murphy, am not equipped to
engage in primary research on church history or biblical studies. Consequently, I have felt very
"at the mercy" of biblical scholars' knowledge and interpretation of texts, and I have found that
I have changed my mind several times as a result of reading various pieces of scholarship. As a
result, my prayer as I have sought to wrestle with these issues has been to be guided by the still
small voice of the Holy Spirit and to ask Him to direct me to a view of human persons that can
inform the role of education in a way that is glorifying to our Maker. In doing so, I have reached
the conclusion that the philosophy that I can most endorse and defend is a more
12
Note that I am by no means suggesting that teachers might have any influence on the ultimate
destiny of a student’s soul. Clearly the saving redemptive work of Christ can only be achieved in an
individual by the grace of the Holy Spirit working in that person’s life. What I am suggesting is the
possibility that we may have an influence on the personal characteristics or “soulish” elements of an
individual—characteristics that may well last into eternity.
(w)holistic/single entity understanding of human persons, and thus I have chosen the position of
non-reductive physicalism. 13
Non-reductive physicalism admits the biological nature of humans (physicalism) but
rejects the “reductive” component that asserts that we are “nothing but our bodies” (hence the
qualifier). Thus, the "non-reductive" prefix points to the fact that human behavior cannot
be exhaustively explained by neurobiology (lower levels), because higher level explanations
supervene on lower levels. In this understanding, the human soul is a physical feature, not a
separate immaterial essence. 14 Humans are bodies, they do not have bodies; humans are souls,
they do not have souls. 15 Murphy argues that part of the dualistic confusion has arisen from
the fact that “mind” and “soul” are nouns, and as such we assume that there has to be an object
to which these terms correspond (quoted in Brown et al., 1998, p. 18). However, that need not
be the case: "mind" and "soul" may simply be interchangeable terms for the same thing. All that the
non-reductive physicalist position asserts is that if soul is not in fact a separate entity, then
higher human capacities once attributed to soul simply require a different explanation: “In part,
they are explainable as brain functions, but their full explanation requires attention to human
social relationships, to cultural factors, and, most importantly, to our relationship with God”
(Murphy, 2006, pp. 69-70).
13
My journey to arriving at this position was strongly informed by the views espoused by Brown,
1998; Green, 2002; Mackay, 1979; Murphy, 2006; Oppewal, 1985; and Wong, 2002.
14
What is important to note is that the Christians who hold the non-reductive physicalist position
do not deny the Biblical distinction between the physical and the spiritual world in terms of the material world
in which we live versus the nonmaterial world of God.
15
In Genesis 2:7 we read “And the Lord God formed man of the dust of the ground and breathed
into his nostrils the breath of life and man became a living soul.” It is widely agreed now that the Hebrew
word translated “soul”—nephesh—did not mean what later Christians meant by “soul.” In most cases it is
simply a way of referring to the whole, living person.
This is an important distinction to make because a common critique of the non-reductive physicalist account is that its "non-reductive" qualification simply cannot be maintained.
Put another way, reductionism is inevitable, and before you know it, we have slid down the
slippery slope to what Mackay (1974) terms “nothing buttery.” 16 This sort of ontological
reductionism is evident in the extreme thinking of Nobel Laureate Francis Crick, who in The Astonishing Hypothesis 17 asserts that both the concept of a Divine Creator and that of a separate soul
are mistaken and antiquated ways of looking at life, and that humans are essentially “nothing
but a pack of neurons.” But, as Malcolm Jeeves (and others) 18 points out, the ever-tightening
link between mind and brain espoused by the neuroscientific community need not necessitate a
descent into ontological reductionism (in Brown et al., 1998, p. 87). This point is
illustrated by the response of another Nobel Laureate, Sir John Eccles (1989), who from his
earliest writings right up to his most recent publications, has advocated a strong dualist position.
Furthermore, somewhere between Crick's materialism and Eccles's dualism is yet a third
Nobel Laureate, Roger Sperry, who opposes Crick’s “nothing buttery” position while at the same
time making it clear that he does not adhere to Eccles's dualism. Sperry thus subscribes to the
non-reductive physicalist account.
A slightly different position has been suggested by philosopher C. S. Evans (1981). He
argues for what he terms “minimal dualism.” While Evans still elects to use the term dualism, he
16
This “nothing buttery” is a characteristic of ontological reductionism which is characterized by
the notion that “by reducing any phenomenon to its components you not only explain it, but explain it
away” (Mackay, 1974, p. 43).
17
In The Astonishing Hypothesis (1994), Crick asserted that “You, your joys and your sorrows,
your memories and your ambitions, your sense of personal identity and free will, are in fact no more than
the behavior of a vast assembly of nerve cells and their associated molecules” (quoted in Jeeves, 2004,
180).
18
Donald Mackay contends that "nothing could be more fraudulent than the pretence that science
requires or justifies a materialist ontology in which ultimate reality goes to what can be weighed and
measured, and human consciousness is reduced to a mere epiphenomenon” (in Brown et al., 1998, p. 90).
Francis Crick therefore sets up a false conflict between what he takes to be a religious view of persons and
a neuroscientific view of persons.
makes a noteworthy distinction between the notions of "separate" and "separable," and as such,
I have incorporated this distinction into my own understanding of non-reductive physicalism. I
do not see his position as incompatible with my own. Evans points out:
Separable is a possibility word: hence the distinction involves the difference between
what is the case and what could be the case…Two things which are separable do not
necessarily exist independently. However, they are still “different” in the sense that
they could exist separately in some sense of the word “could”… Minimal dualism is the
claim that human souls (understood as selves or persons rather than as parts of persons
or selves) and human bodies are related in the following way: the soul is functionally
separate from the body, but as a substance, it is only separable. (Evans, 1981, p. 314)
When it comes to trying to establish a Scriptural basis for human anthropology, various
techniques are employed by different biblical scholars. Those with a linguistic bent have chosen to dive into etymological depths, attempting to find out exactly what certain words (soul, body, spirit, etc.) mean in their original languages. In the process, some of them have fallen into the temptation of preloading their established theology, asking themselves what these words must mean in order to make their theology, as they understand it, consistent and
defensible. Joel Green has warned against such pitfalls, arguing that conceptualizations of
human nature “are not necessarily tied to particular vocabulary (etymologically speaking); and
biblical writers did not develop highly specialized or denotative vocabulary for describing human
existence" (quoted in Wong, 2002, p. 83). Thus, when one looks to Scripture for
answers to the question of human personhood, biblical scholars are, not surprisingly, somewhat
divided. In my own research, I came across multiple different interpretations of texts (from the
Old and New Testaments as well as inter-testamental passages). 19
Certainly, there tends to be more agreement among biblical scholars concerning the Old
Testament/Hebraic view (which is largely seen as holistic and knows little of body-soul dualism)
than among New Testament scholars. In this regard, Murphy contends that “most of the
dualism that has appeared to be biblical teaching has been a result of poor translations. The
original Aramaic and Hebrew terms were first translated into Greek, and later taken to mean
what Greek philosophers would mean by them” (p. 37). Furthermore, Joel Green asserts that
the New Testament is actually not as dualistic as the traditions of theology and biblical
interpretation have led us to believe. In fact he asserts that the predominant view of the human
person in the NT, as in the OT, is one of ontological monism: "The central concern of these
New Testament writers is the concept of soteriological wholism; in their portraits of human
nature they place a premium on the person’s relatedness to God and others” (quoted in Brown
et al., 1998, p. 28). Thus, I would tend to be in agreement with Mackay (1979) who insists that
dualistic metaphors in the New Testament (such as those referring to us being clothed by our
bodies and our bodies being a tent) are just that: metaphors. Mackay says, "I confess that to
me, the two-way relationship between brain activity and conscious experience seems really too
close to justify taking metaphors in terms of ‘clothing’ or ‘tents’ as proving that the soul is an
invisible ‘substance’ inhabiting the body” (1979, p. 33).
Ultimately then, I could only conclude with Murphy (2006) that the Bible has no clear
teaching on the subject. She therefore argues that this has made it possible for
19
I am not going to attempt to elaborate on all these various positions, but for those who are
interested in this topic, my references should provide a good starting point for exploring the various
positions of biblical scholars.
Christians in different eras to recognize a variety of views in the texts, and, perhaps more
importantly, to have read a variety of views into the texts. Thus, Murphy asserts that this gives
contemporary Christians freedom to choose among several options, insofar as the Bible is
normative for Christians. A similar point is made by Malcolm Jeeves (2004) who agrees with
Murphy that Scripture simply is not clear on the matter:
The Bible’s witness to the nature of human life is at once naïve and profound. It is not
naïve in the sense of gullibility or primitiveness, but because it has not worked out, in
what we may regard as a philosophically satisfying way, the nature of physical existence
in life, death, and afterlife. It is profound in its presentation of the human person
fundamentally in relational terms, and in its assessment of the human being as
genuinely human and alive only within the family of humans brought into being by
Yahweh and in relation to the God who gives life-giving breath. (p. 241)
One of the primary issues that I had with Cooper (1989) was that, while he certainly appears to provide ample Scriptural references to support his case, New Testament scholar Joel Green highlights the problematic nature of Cooper's method of using eschatology to determine semantics and
anthropology. Green asserts that using evidence of this nature (i.e., of eschatology) can always
only be analogical and speculative, since discussion of the afterlife is carried out by those who
have no firsthand knowledge on which to draw. Green therefore argues that the better method
is to say “given the biblical evidence regarding the nature of the human person this side of the
eschaton, what can we say about human nature on the other? [Rather than to ask as Cooper
does] given our theories about eschatology, what must we say about the human person in the
present” (in Brown et al., 1998, pp. 153-154).
Furthermore, even though Cooper does support his position with ample Scriptural
exegesis, he also asserts on several occasions that his position makes logical sense. 20 I find this an inadequate justification because it suggests a rather limited view of God's omnipotence: the implication in Cooper's understanding is that the intermediate state must be correct because it is the only position that makes logical sense (to our finite, human brains).
Donald Mackay (in Brown et al., 1998) has suggested that our eternal existence need not be the
continuation of some aspect of our current embodiment, but rather would be a new creation,
outside of the time and space we presently occupy. Thus the nature of our eternal existence
would be up to God to determine 21 within this new creation, and would not be dependent on a
currently existing separate nonphysical essence that survives death. In explaining this more fully,
Mackay makes use of a wonderful analogy that I feel warrants quoting at length:
If the concept of creation is to be thought of by any analogy with creation as we
ourselves understand it—as, for example, the creation of a space-time in a novel—then
a new creation is not just the running on and on of events later in the original novel: it is
a different novel. A new creation is a space-time in its own right. Even a human author
can both meaningfully and authoritatively say that the new novel has some of the same
characters in it as the old. The identity of the individuals in the new novel is for the
novelist to determine. So if there is any analogy at all with the concept of a new
creation by our divine Creator, what is set before us is the possibility that in a new
creation the Author brings into being, precisely and identically, some of those whom He
came to know in and through His participation in the old creation. (in Brown et al., 1998,
p. 100, emphasis added)
Thus, as Wong (2002) asserts “we can all agree that this takes a miracle of God’s omniscience
and power… During life, this ‘identity-information-soul’ is an integral part of the whole person.
20
Cooper argues, “Unless this topic wholly transcends logic—in which case we can say nothing
about it at all—it is not possible to affirm and deny that persons can exist apart from bodies” (1989, p. 4).
21
After all, He is God, and as such can act in whatever way he chooses.
It and the physical body are one. Neither one without the other would have a meaning or raison
d’être by itself” (p. 93).
Aside from Cooper’s disagreement with a non-reductive physicalist position, there is a
concern that Christian educators who hold a monistic view of persons would be more inclined to
place greater emphasis on the physical structure of the brain at the expense of a spiritual
emphasis (because the soul is not regarded as a separate entity worthy of due development). I
disagree with this concern, and posit rather that a far greater problem is the false
dichotomy between the sacred and the secular which is commonly associated with a dualist
position. 22 And while proponents of body-soul dualism often point to the fact that dualism has
played an important, and often positive, role in the history of Christian ethics and Western
ethics in general, 23 it is worth pointing out that a counter-argument can just as readily be made.
Indeed various forms of body-soul dualism have been implicated not only in the sacred/secular
divide referred to above, but also as justification for slavery, degradation of the pleasure
principle (especially with regard to sexual relations in marriage), a lack of concern for the
environment, and the inferiority of women, among other things. Consequently, Post argues that
“the shift from a dualist to a nonreductive physicalist account of the person entails no significant
loss of resources for Christian ethics and in fact this ‘agapic inclusivist tradition’ may well be
enhanced by a physicalist anthropology” (in Brown et al., 1998, pp. 28-29).
22
Even Cooper (1989) recognizes the fact that the body-soul distinction is partly to blame for the
tragic loss of an active and intellectually robust Christianity: “Since the body-soul distinction has inclined
Christians to surrender vast areas of life and knowledge to the forces of secularism, Christian educators, for
example, fault it for the lack of a genuinely Biblical approach to classroom management, teaching methods,
and curricular content even in the nation’s Christian schools” (p. 28). This view leads to students being
treated as pure intellects and nothing else.
23
In this capacity, the argument is made that dualism has served as the foundation for arguments
for the value and sanctity of human life (with regard to the issues of abortion, euthanasia, and protection of
those persons with diminished cognitive capacities).
It would appear to me that some of the opposition to the non-reductive physicalist
position is motivated by a fear that we are compromising our faith by allowing science to dictate
what we believe. David Myers (1978) points out that, oftentimes, Christians' positions are
motivated out of a fear of science:
Most Christians, if they stopped to think about it, would agree that God acts within and
through nature because, after all, nature is his creation. Still, when it comes right down
to understanding and talking about human nature, many Christians evidence very much
of a God-of-the-gaps religion. Since, by this view, that which is left for God to explain
grows smaller and smaller as natural explanation expands, there emerges a great need
to ‘protect’ God by attacking science… These are not biblical responses, because in
isolating God from the natural processes, they deny the incarnational current that runs
through the Bible. (pp. 16-17)
In the end, what is important to bear in mind is that no matter what further discoveries are
made in neuroscience, brain imaging is unlikely to provide conclusive evidence for or
against the existence and action of God. Equally though, Murphy (2006) concedes that science
will never be able to prove that dualism is wrong:
It is important to note that no such accumulation of [neuroscientific] data can ever
amount to a proof that there is no immaterial mind or soul in addition to the body. But
if we recognize that the soul was originally introduced into Western thought not from
Hebraic Scripture but as an explanation for capacities that appeared not to be
explainable in biological terms, then we can certainly say that for scientific purposes the
hypothesis has been shown to be unnecessary. (p. 69, emphasis added)
God is the Author of both science and Scripture and both should be seen as part of His divine
revelation. Therefore, it stands to reason that principles in God’s Word ought to find
correspondence with genuine scientific truth (and its correct interpretation). This is not to say
that one should be used to "prove" the other, but rather to suggest that, given that they have the
same Author, one might expect that there should be some sort of congruency. Thus, when we
are forming our opinions about these issues, it is important not to be looking for "proof" to
validate what we already believe. We should be extremely wary of subjecting either theology to
science or science to theology. Sometimes discomfort might result from wrestling with these
difficult issues, but we would do well to remember the wise words from C. S. Lewis:
In religion, as in war and everything else, comfort is the one thing you cannot get by
looking for it. If you look for truth, you may find comfort in the end: if you look for
comfort you will not get either comfort or truth—only soft soap and wishful thinking to
begin with and, in the end, despair. (1960, p. 25)
Ultimately, as Christian educators, the main pitfalls for us to avoid in our understanding of human nature are the truly reductive elements of what Mackay (1974) refers to as machine-mindedness. This view "is deterministic 24 and depersonalizing, with implications which are demoralizing" (1974, p. 13). Non-reductive physicalism does no such thing. Furthermore,
neither epigenetics nor neuroplasticity need imply a deterministic viewpoint. On the contrary,
by endowing us with intentional, free volition, God has created us in such a way that, with the
assistance of the Holy Spirit, we have the capacity to be involved in the process of the renewing of
our minds. Thoughts are indeed powerful, and according to the writer of Proverbs, they have
creative ability (Proverbs 23:7). If our thoughts are going to affect what and who we become,
24
Mackay (1974) clarifies between two types of determinism. The first kind is innocuous and
attests to the fact that in scientific methodology all physical events have physical causes. He points out that
we may even discover that this doctrine is not in fact true, but at least it does not deny human freedom.
The second kind of determinism, however, is the “philosophical belief that the future is inevitable, that we
have no real choices, that our sense of freedom is an illusion, and our responsibility a figment of the
moralist’s imagination” (p. 13). This is the kind of thinking that is most damaging. He does, however,
point out that progression from a more Newtonian model to a quantum physics model, as well as using
Heisenberg’s principle of uncertainty, as a means of refuting the moral deterministic position, is
inadequate: “Anyone who tries to make the indeterminacy of the models of physics a basis for restoring the
idea of freedom, and combating machine-mindedness, must admit that at the level of man-sized objects this
unpredictability normally makes no practical difference” (p. 15).
then it should be a priority that we think right thoughts (and as educators teach our students to
think right thoughts). Furthermore, we have not been created as depersonalized automatons,
but rather God has placed us on earth within a human community which, through relationship
with Himself and others, recognizes the dignity, uniqueness and worth of each human person.
Lastly, because we have been endowed with free will, we can be held accountable for our
actions: There are real distinctions between what is right and wrong.
In conclusion, if we accept a non-reductive physicalist account of human personhood, I
would argue that the stakes for what we do as (Christian) educators are significantly raised. As
we shape the brains and minds of our students, we bring about changes in individuals that will
be carried over with them in their resurrected bodies into eternity. The same applies for our
own minds. As we are transformed and neurologically changed, we are able to offer our
transforming selves as a sacrifice to the Lord, a sacrifice which is holy and pleasing to Him.
Finally, I would like to quote, at length, from Mackay (1979) who highlights the
immense responsibility that we are faced with:
As long as we do these things unthinkingly and unscientifically our consciences are
curiously at rest. It is only when our attention is drawn to what we are doing from a
scientific [and I would add spiritual] standpoint that we [might] begin to feel uneasy…
The danger then is that we slide in one of two opposite directions. On the one hand
there are many people today who react with horror. ‘Who am I,’ they say, ‘to impose
my views on my children? I must do my best to avoid conditioning them, and let them
make up their own minds in everything’… The opposite reaction, perhaps less common,
is to rush for all the latest textbooks on behavioral manipulation. (p. 75)
Clearly, neither response is justified, but what this quote does address is that once we become
aware of what we as teachers are potentially doing, we may feel a little uneasy because of the
ramifications. As teachers we recognize our role to provide our students with information that
will aid in their intellectual growth and development, but we must never abuse the immense
responsibility and authority that we have. Mackay (1979) goes on to point out a surprising fact:
In our society the power of others to make physical changes in our brains by direct
physical assault is reasonably limited both by law and by the thickness of our skulls; but
at the ‘informational’ or ‘programming’ level it is astonishingly unrestricted. We have
only to think how jealously a computer operator would guard his programs from
unwarranted interference to realize just how surprising this is… Information has
effects—for good or ill—which are just as material as those of ale or arsenic or the virus
of polio. (p. 87)
Clearly then, to pass on some advice given to our beloved arachnid friend, Spider-Man, "with great power comes great responsibility." We would do well to use our (God-given) knowledge
and power as wisely as we know how, calling always on the strength and wisdom of the Lord to
guide us in all that we do.
B. Recommendations
Having examined the research on student beliefs, as well as epigenetics and
neuroplasticity, it is important to address the practical implications. While theory is
undoubtedly important, it is of little use if it does not get translated into useful application.
Ernest Rossi (2002) asks “are these [findings] to remain abstract facts safely sequestered in
academic textbooks, or can we take these facts into the mainstream of human affairs?” (p. xvii).
Furthermore, Pajares (1996) highlights the fact that while research has made invaluable
contributions to teachers’ understanding of student motivation, the connection from theory to
practice has been slow:
Classroom teachers and policymakers may well be impressed by the force of research
findings arguing that self-efficacy beliefs are important determinants of performance
and mediators of other self-beliefs, but they are more apt to be interested in useful
educational implications, sensible intervention strategies, and practical ways to alter
self-efficacy beliefs when they are inaccurate and debilitating to children. (p. 568,
emphasis added)
It goes without saying that teachers cannot successfully pass on to their students beliefs
that they have not internalized for themselves. Clearly teachers need to change their own belief
structures before they can change the belief structures of their students. It would seem obvious
then, that if teachers are to implement this information into their classrooms, then they need to
be trained in how to do this. Consequently, the importance of pre-service and in-service
training cannot be overemphasized. First, teachers need to be made aware of possible
conscious or unconscious views that they hold concerning the issues of genetic determinism and
brain plasticity as well as any biases they may have toward students of different gender, race, or
socioeconomic status.
One reasonably straightforward way of assessing these views is to survey teachers using
a Likert scale questionnaire. (Examples of possible questions might include: “According to my
experience, students’ achievement mostly remains constant in the course of the year,” or “As a
teacher I have no influence on students’ intellectual ability”.) Although it is possible that
teachers can misrepresent their true feelings on such issues, such a survey would at least
provide a starting point for discussion. After talking through their answers, teachers should
then be provided with scientific evidence from studies (such as those which I have referred to)
that highlight the changes that are possible in the human genome and brain. Furthermore,
teachers also need to be made aware of the extremely detrimental effects of holding differential
expectations for students. Epigenetic and neuroplastic principles hold true for all students.
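Purely as a sketch of how responses to such a questionnaire might be scored (the items, the 1-5 response scale, and the reverse-scoring scheme below are illustrative assumptions of mine, not a validated instrument):

```python
# Minimal sketch: scoring a short Likert-scale survey of teacher beliefs about the
# malleability of student ability. Responses run 1-5 (strongly disagree to strongly
# agree); items worded in a fixed-mindset direction are reverse-scored.

ITEMS = {
    "achievement_constant": True,     # "students' achievement mostly remains constant" (reverse-scored)
    "no_influence_on_ability": True,  # "as a teacher I have no influence on intellectual ability" (reverse-scored)
    "ability_can_grow": False,        # hypothetical positively worded item
}

def growth_orientation_score(responses: dict) -> float:
    """Average the responses so that higher scores reflect a stronger belief that
    ability is malleable; reverse-scored items are flipped with 6 - x."""
    scored = [(6 - responses[item]) if reverse else responses[item]
              for item, reverse in ITEMS.items()]
    return sum(scored) / len(scored)


# A teacher who largely agrees that ability is fixed ends up with a low score,
# flagging a useful starting point for discussion during in-service training.
print(growth_orientation_score(
    {"achievement_constant": 5, "no_influence_on_ability": 4, "ability_can_grow": 2}))
```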
Nationally known educators Theodore and Nancy Sizer (1999) highlight the fact that
every child has to be treated equally:
What young people should not experience is sustained hypocrisy. The school which
claims that “everyone can be what he can be” but which demonstrably discriminates or
silently tolerates discrimination imposed by higher authorities sends a devastating
message: Do as I say, but not as I do. No message is more corrosive, especially for
teenagers. (p. 117)
Once teachers have become aware of the fact that their own perceptions can have an impact on
how they teach and treat students, it should be apparent to them that student perceptions will
therefore play a significant role in student behavior and performance. Consequently, teachers
would be well served by paying as much attention to students’ perceptions of competence as to
actual competence, for it is the perceptions that may more accurately predict students’
motivation and academic performance. If we know from neuroscience that we can change our
brains just by thinking, we need to ask ourselves, and our students, what we spend most of our
time mentally rehearsing and thinking about. Thus, teachers need to have a means of assessing
student perceptions, and one way of doing this is the same method that the teachers
themselves would have (hopefully) experienced: a Likert scale survey. 25 This could be
administered at the beginning of a school semester.
Following the administration of the surveys, the teacher should then have a period of
discussion with the students in which the implications and results are addressed. Teachers need
25
If teachers do administer surveys of students at the beginning of the semester, one important
point to highlight is that self-efficacy theory is domain specific and that generalized definitions tend to
weaken the effects. This is why Bandura (1986) has cautioned against over-generalizing. This makes
sense because students may be very confident of their abilities in some aspects and not at all in others. That
said, Multon et al. (1991), as well as Pajares and Miller (1995) have found that significant relationships are
obtained even within generalized domain-specific self-perceptions. Here it is important to realize that the
teacher is not primarily conducting these surveys for research purposes; rather s/he is conducting them to
find out ways in which s/he can better address student needs and therefore be a more effective teacher.
to explain the role that perception plays to students and students should be provided with
meta-cognitive strategies for addressing their thought patterns. We have learned that students
can easily be locked into certain abilities because of the areas that are myelinated within each
student’s brain and because they have become “addicted” to the chemicals associated with
various frequent thought patterns (Dispenza, 2007). As Doidge (2007) points out:
Freud was right when he said that the absence of plasticity seemed related to force of
habit. Neuroses are prone to being entrenched by force of habit because they involve
repeating patterns of which we are not conscious, making them almost impossible to
interrupt and redirect without special techniques. (p. 243)
First then, students need to be made aware of their unconscious beliefs, and then they need to
be given tools to combat them. Some of these strategies have already been highlighted in
previous sections and they include the disciplines of “unlearning,” focused attention, and being
aware of associations, that is, the Hebb principle of “nerve cells that fire together, wire
together” (Hebb, 1949).
Thus, it also becomes important to teach students about the knowledge we are gaining
from research in cell biology (epigenetics) and neuroscience (neuroplasticity). Examples from
each of the two sections in this paper that deal with these topics could be taken and presented
to students. Certainly, students need to be informed about their brain structure and their
natural abilities. The problem is that, since most students lack knowledge of how their brains function, they have little control over their negative emotions and behaviors. Teachers need to
develop in students the concept of “ownership” of their brain. The road to success is to
empower the student with a new freedom of choice through increased information and
awareness. Once students have been provided with basic information about their brains, even
something as simple as having signs/posters placed around the classroom can reinforce, on an
ongoing basis, the message that students are responsible for the development of their own
brains (e.g., a poster that poses the question “Have you developed new dendrites today?”).
Research by Carol Dweck and others is already starting to explore the benefits of
teaching students about the workings of their brains (Blackwell, Trzesniewski, & Dweck, 2007).
Dweck and her colleagues looked at several hundred students going into seventh grade, and
assessed which students believed their intelligence was unchangeable (entity/fixed theory) and
which students believed their intelligence could change over time (incremental/growth theory).
They then monitored the students' math grades over the next two years. What they observed was that
those with a growth mentality had steadily increasing grades, whereas those students with a
fixed view of intelligence actually showed a decrease in their grades. This finding prompted
Dweck to do a follow-up study. In the second study, Dweck and her colleague Lisa Blackwell
took 100 seventh graders, all of whom were doing poorly in math, and randomly assigned them to
two workshops on study skills. One workshop group was given lessons on how to study well,
and the other group was taught about the expanding nature of intelligence and the brain:
“Basically, the students were given a mini-neuroscience course on how the brain works”
(Trudeau, 2007). By the end of the semester, the group of students who had been taught that
the brain is capable of growing smarter had significantly better math grades than the other
group. This second study provides a great deal of encouragement and possibility for further
research.
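For readers interested in the structure of this kind of design, the basic scaffold (random assignment to two workshop conditions, followed by a comparison of mean change in math grades) might look as follows; this is my own illustrative sketch and not the analysis actually used by Blackwell, Trzesniewski, and Dweck:

```python
import random
from statistics import mean

# Illustrative scaffold only: randomly assign students to two workshop conditions
# and compare the mean pre-to-post change in math grades between the groups.

def assign_conditions(student_ids: list, seed: int = 0) -> dict:
    """Randomly split students into a study-skills group and a brain-lessons group."""
    rng = random.Random(seed)
    shuffled = list(student_ids)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"study_skills": shuffled[:half], "brain_lessons": shuffled[half:]}

def mean_grade_change(group_ids: list, pre: dict, post: dict) -> float:
    """Average post-minus-pre grade change for one group."""
    return mean(post[s] - pre[s] for s in group_ids)

# With real pre- and post-semester grade records in hand, one would compare
# mean_grade_change(groups["brain_lessons"], pre, post) against the study-skills group.
```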
However, it is worth pointing out that self-efficacy theory postulates that people acquire
information to evaluate efficacy beliefs from three primary sources: i) enactive mastery
experiences; ii) observation of others (vicarious experiences); and iii) forms of persuasion, both
verbal and otherwise (Bandura, 1997, p. 79). The experiment by Dweck and Blackwell would
loosely fall into the third category (verbal persuasion). Of these three information sources,
research has shown that the first kind (mastery experiences) is the most influential source of efficacy information because such experiences provide the most direct and authentic evidence (Bandura,
1977, 1997). Bandura cautions that “verbal persuasion alone may be limited in its power to
create enduring increases in perceived efficacy” (Bandura, 1997, p. 101). This is because
students cannot be fooled by empty praise and condescending encouragement. In fact, overly
optimistic persuasive comments tend to be ineffective, particularly if the student being
persuaded ultimately fails. The same holds true if students are taught that their intelligence is
capable of growth, but their academic experience after receiving that information
contradicts what they have been taught. The result in such situations can be devastating,
because it serves to discredit the persuader (in this instance the teacher), and undermine the
recipient’s efficacy beliefs (Bandura, 1977, 1997). The obvious outcome in such situations is a
loss of trust between student and teacher, something which may not be easily
restored.
Thus, if teachers choose to engage in verbal persuasion methods with their students,
whether it be direct instruction about the brain’s capacity for change or the direct
communication of high expectations for their students, it is imperative that alongside the verbal
persuasion they also provide authentic mastery experiences. One possible way of
ensuring this is to present topics in a growth framework and to give students process feedback.
Students should also be given the opportunity to draft and re-work assignments before final
grading. This reinforces the idea that intelligence is something that grows and develops over time, as opposed to being an inherent fixed capability.
There are at least three possible critiques that I can think of with regard to the
suggestions that I have made. The first critique, and likely the most common, centers on the fact that teachers are already overloaded with a broad curriculum that needs to be covered in limited time; where, then, are they going to find the time to teach their students about the brain's capabilities? Indeed, in the wake of reform acts
such as No Child Left Behind, and perceived plummeting academic standards, parents and
educators have demanded a “back-to-basics” approach to curriculum and teaching practice. In
this back-to-basics national mood, students’ emotions and perceptions are not seen as being at
the top of the priority list. 26 Instead the argument is made that teachers need to focus on
raising achievement test results. The irony of course is that research is beginning to show that
teaching students about their brains and helping them to accurately assess their perceptions
and emotions will have a positive effect on test scores. This is not to suggest that finding the
time in among a packed curriculum will be easy. The Sizers (1999) acknowledge the inherent
balancing act: “Where is the balance between teaching curriculum and addressing student
needs?" (p. 2).
Unfortunately, given the sweeping nature of high school curricula, few teachers
think that they are able to allow the time necessary to teach about such concerns. Again, the
26
As previously mentioned, I am indebted to the work of my college professor, Dr. Steve Loomis
for informing my thinking on the matter of human capital theory and market standardization. In a journal
article published with Jacob Rodriguez entitled “A new view of institutions, human capital, and market
standardization” (Education, Knowledge and Economy, 2007), Rodriguez and Loomis address what they
see as an information distortion creeping into markets at an economic, political, and cultural level. Of
particular interest to my research is their theory on the division of information. The authors have
recognized an ironic trend: in a bid to address social issues of inequality (through the provision of greater
access to schools), the educational institution has expanded, but has had as its consequence, not the
decrease of social inequality as was hoped for, but rather an increase in social cost. This is due to the fact that
there has been a substitution in terms of information dissemination: Low cost “universal” information
(perceived as necessary for the achievement of raising test scores) at the expense of high cost “particular”
information (necessary for human flourishing). Therefore, the kind of change that I am calling for in
classrooms is deemed particularly difficult because it involves more of the “particular” kind of information
that is being squeezed out of curriculums as educational institutions expand.
81
tragic irony is apparent: Teachers tend to be too focused on the short-term rather than the longterm goals. Thus, they worry about covering curriculum instead of teaching students how to
most effectively use their minds and teach them skills for life. As long as the end result of high
school is measured in “coverage,” there will always be little time left over for “additional”
student concerns.
Surely one of the core goals of schools is to prepare students to use their minds well?
My argument here is that while it is entirely understandable that educators are primarily concerned with teaching students skills and content-area knowledge, results from more than thirty years of self-concept theory cannot be ignored: “Simply possessing skills does not ensure that students will be motivated to apply them” (Schunk, 1991, p. 227). In fact, as I have already mentioned, much of the research suggests that students’ perceptions are a more accurate predictor of student motivation and academic competence. Furthermore, as Bandura
(1986) has rightly argued:
Educational practices should be gauged not only by the skills and knowledge they impart
for present use but also by what they do to children’s beliefs about their capabilities,
which affects how they approach the future. Students who develop a strong sense of
self-efficacy are well equipped to educate themselves when they have to rely on their
own initiative. (p. 417)
In my opinion, Bandura's view provides a far more accurate picture of the goal of education, and as such, teachers cannot afford to shirk their responsibility to teach students this life-empowering information. In a similar vein, the editors of a new book on neuroeducation entitled The Educated Brain (2008) make the following bold statement:
Today the necessary task in educational and didactic systems is not to indoctrinate the
child and the pre-adolescent by transmission of knowledge through traditional
textbooks but to make them aware of their own capabilities and help them to use these
capabilities to progress from the passive condition of ‘loading’ information into memory
to the active condition of learning by direct experience. (Battro, Fischer, & Lena, pp. xxii-xxiii, emphasis added)
A second possible critique of my suggestions is that teachers might feel unqualified to teach about the brain because the knowledge lies outside their field of expertise. This is not a valid objection: teachers whose subject matter falls outside the scientific disciplines are not suddenly expected to become experts on the brain. All that is required is a basic understanding, which any competent teacher should be capable of mastering.
This is another reason why I am such a strong advocate of continuing professional development: it is during such training that teachers can be equipped with the knowledge and skills they need to implement my suggested strategies. Furthermore, it is my hope that, in time, more materials like the Fast ForWord and Brainology programs will be developed that teachers can use to help them present this information to students.
The third possible critique of my suggestions may arise from those who are skeptical of
“brain-based” education in general (Bruer, 1999; Jorgenson, 2003; Strauss, 2001; Willingham,
2006, among others). The majority of these critiques center on the premature adoption of
neuroscientific studies into classroom practice: “You can’t go from neuroscience to the
classroom because we don’t know enough about neuroscience” (Strauss, 2001, para. 1). While I
agree that some teachers have prematurely adopted principles that have subsequently been
shown to be largely erroneous (e.g., right brain/left brain theories and theories concerning the
critical or sensitive period), the kind of application that I am arguing for is, I believe, sufficiently well established in the literature—for example, the neuroplastic nature of our brains and the role that our thoughts play in both neuroplastic and epigenetic processes. Consequently, while I would caution
teachers to be diligent in choosing and evaluating which research to present to their students as
evidence of the phenomena of neuroplasticity and epigenetics, I do not believe that our grounds for proceeding are insufficient. Furthermore, I am in agreement with some
of the critics of brain-based education who are calling for the development of a genuine two-way relationship between neuroscientists and educators. As Battro et al. (2008) point out,
“neuroscientists must move into the classroom, and teachers must bring their questions to the
laboratory” (p. 23). This reciprocal relationship can only prove beneficial.
My final point is to encourage teachers to regard these suggestions as a means of
addressing issues of social justice and moral responsibility. In a telling study, Dweck and her
colleagues found that roughly equal numbers of people (about 45% each) held one of the two theories of intelligence (entity vs. malleable); the remaining 10% could not be definitively classified (Dweck, Hong, & Chiu, 1993). What this study highlights to me is that close to half of the students in our classrooms hold false ideas about intelligence (an entity theory) and consequently false ideas about themselves.
This fact in and of itself should be a wake-up call and strong motivator for teachers to
implement the above-suggested strategies. As I have already said, I am not suggesting that this
will be an easy process. To understand students’ perceptions at a deeper level will ultimately take more than conducting a single survey at the beginning of the semester.
Furthermore, over and above teaching students about the capabilities of the brain and
communicating uniformly high expectations to each student, teachers need to get to know their
students on more than an academic or purely superficial level. This takes time and dedication
and goes above and beyond what some teachers regard as their “official” teaching
responsibilities. However, as the Sizers (1999) rightly point out, the good teacher is able to take
each student on his or her own merits, “to convey, not a generic hope, not a one-size-fits-all
confidence, but the specific version which can only come from the student’s own facts, and from
knowing each child well” (p. 114, emphasis added). Ultimately, this comes full circle to the
reasons I mentioned in my introduction for wanting to teach—namely to see each of my
students succeed; to witness, and be a part of, each of them reaching their true potential—
academically, socially, and spiritually.
IV. CONCLUSION AND FUTURE STUDY
One of the great frustrations for me during the course of this research was the fact that I
did not have access to a classroom of my own where I could test the theories that I was putting
forward. Since I was not conducting primary research, it was difficult to claim conclusively that teaching students an incremental theory of intelligence, while at the same time maintaining and communicating high expectations for all students, would in fact result in measurable, observable changes in the brain. Consequently, I could only make this claim on the basis of existing neuroplasticity experiments.
Furthermore, although Carol Dweck’s research into teaching students about their brains
comes closest to what I am advocating, I still think that further action research in this field is
needed. It would be especially interesting to incorporate brain imaging techniques into the research methodology to investigate what specific neuroplastic changes can be measured over time. In addition, more longitudinal research is required to see how long the effects of teaching students about their brains and the nature of intelligence endure. I would speculate that for such an intervention to be truly effective it would need to be reinforced continually, at every grade level and across subject areas. Of course, with any research involving human subjects it is virtually impossible to isolate all the variables, so it is always somewhat problematic to attribute an outcome directly to any particular intervention; there is always the possibility that factors beyond the researcher's consideration are also influencing the subjects being studied, and such factors are, for all practical purposes, impossible to control.
That being said, I strongly believe that, above all else, teachers need to have as their
foundation the belief that all students are capable of learning and that each child needs to be
given the tools to reach his/her full potential. Asa Hilliard (1991) ended an article with the
following statement, which I think makes an apt conclusion and, as such, is worth quoting in its
entirety:
The risk for our children in school is not a risk associated with their intelligence. Our
failures have nothing to do with poverty, nothing to do with race, nothing to do with
language, nothing to do with style, nothing to do with the need to discover new
pedagogy, nothing to do with the development of unique and differentiated special
pedagogues, nothing to do with the children’s families. All of these are red herrings.
The study of them may ultimately lead to some greater insight into the instructional
process; but at present they serve to distract attention from the fundamental problem
facing us today. We have one and only one problem: Do we truly will to see each and
every child in this nation develop to the peak of his or her capacities? (p. 36)
REFERENCES
Allen, D. (1985). Philosophy for understanding theology. Atlanta, GA: John Knox Press.
Anway, M. (2005). Epigenetic transgenerational actions of endocrine disruptors and male fertility. Science, 308(5727), 1466-1469.
http://search.ebscohost.com.ezproxy.wheaton.edu
Aronson, J. M. (Ed.). (2002). Improving Academic Achievement: Impact of Psychological Factors
on Education. New York: Academic Press.
Aronson, J. M., Fried, C. B., & Good, C. (2002). Reducing the effects of stereotype threat on
African American college students by shaping theories of intelligence. Journal of
Experimental Social Psychology, 38, 113-125.
Avery, O. T., MacLeod, C. M., & McCarty, M. (1944). Studies on the chemical nature of the
substance inducing transformation of pneumococcal types: Induction of transformation
by a deoxyribonucleic acid fraction isolated from pneumococcus type III. Journal of
Experimental Medicine, 79, 137-158.
Bach-y-Rita, P., Collins, C. C., Saunders, F. A., White, B., & Scadden, L. (1969). Vision substitution
by tactile image projection. Nature, 221(5184), 963-64.
Baltimore, D. (2001). Our genome unveiled. Nature, 409, 814-816.
Bamburg, J. D. (1994). Raising Expectations to Improve Student Learning. NCREL Monograph.
Retrieved August 2008, from http://www.ncrel.org/sdrs/areas/issues/educatrs/leadrshp/le0bam.htm
Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change. Psychological
Review, 84, 191-215.
Bandura, A. (1978). The self system in reciprocal determinism. American Psychologist, 33, 344-358.
Bandura, A. (1986). Social foundations of thought and action: a social cognitive theory.
Englewood Cliffs, NJ: Prentice Hall.
Bandura, A. (1993). Perceived self-efficacy in cognitive development and functioning.
Educational Psychologist, 28(2), 117-148.
Bandura, A. (1997). Self-efficacy: The exercise of control. New York: W.H. Freeman and Company.
Bao, S., Chang, E. F., Davis, J. D., Gobeske, K. T., & Merzenich, M. M. (2003). Progressive
degradation and subsequent refinement of acoustic representations in the adult
auditory cortex. Journal of Neuroscience, 23(34), 10765-75.
Battro, A. M., Fischer, K. W., & Lena, P. J. (Eds.). (2008). The educated brain: Essays in
neuroeducation. Cambridge, UK: Cambridge University Press.
Benard, B. (1995). Fostering resiliency in urban schools. In B. Williams (Ed.). Closing the
achievement gap: A vision to guide change in beliefs and practice. Oak Brook, IL:
Research for Better Schools and North Central Regional Educational Laboratory.
Benson, J. (1989). Structural components of statistical test anxiety in adults: An exploratory
study. Journal of Experimental Education, 57, 247-261.
Blackwell, L. S., Trzesniewski, K. H., & Dweck, C. S. (2007). Implicit theories of intelligence predict
achievement across an adolescent transition: A longitudinal study and an intervention.
Child Development, 78(1), 246-263.
Brown, W. S., & Jeeves, M. A. (1999). Portraits on human nature: Reconciling neuroscience and
Christian anthropology. A report from a seminar at the combined meeting of the
American Scientific Affiliation and Christians in science, Churchill College, Cambridge
University, August 1998. Retrieved November 24, 2008, from
http://www.asa3org/aSA/topics/PsychologyNeuroscience/S&C10-99BrownJeeves.html
Brown, W. S., Murphy, N., & Malony, H. N. (Eds.). (1998). Whatever happened to the soul?
Scientific and theological portraits of human nature. Minneapolis: Fortress Press.
Bruer, J. T. (n.d.). Put brain science on the back burner. Retrieved November 13, 2008, from
http://mysite.verizon.net/pulsar/Library_Ref/Biology/Brain%20Development/Put%20Br
ain%20Back%20Burner/brain%2520back%2520burner.htm
Bruer, J. T. (1999). In search of…brain-based education. Phi Delta Kappan. Retrieved November
13, 2008, from http://www.pdkintl.org/kappan/kbru9905.htm
Burchinal, M.R., Campbell, F.A., Bryant, D.M., Wasik, B.H., & Ramey, C.T. (1997). Early
intervention and mediating processes in cognitive performance of children of low-income African American families. Child Development, 68, 935-954.
Burns, M. (2003, March - April). Fast ForWord Products Open a Child's Window to Language.
Autism Asperger's Digest.
Chakravarti, A., & Little, P. (2003). Nature, nurture and human disease. Nature, 421, 412-414.
Campbell, F.A., Ramey, C.T., Pungello, E., Sparling, J., & Miller-Johnson, S. (2002). Early
childhood education: Outcomes as a function of different treatments. Applied
Developmental Science, 6, 42-57.
Chestnut, J. L. (2005). The innate state of mind and emotional hygiene. Victoria, BC: Global Self
Health Corp.
Church, D. (2007). The genie in your genes: Epigenetic medicine and the new biology of intention.
Santa Rosa, CA: Elite Books.
Churchland, P. M., & Churchland, P. S. (1998). On the contrary: Critical essays, 1987-1997.
Cambridge, MA: MIT Press.
Collins, J. L. (1982, March). Self-efficacy and ability in achievement behavior. Paper presented at
the annual meeting of the American Educational Research Association, New York.
Cooper, J. W. (1989). Body, soul, and life everlasting: Biblical anthropology and the monism-dualism debate. Grand Rapids, MI: Wm. B. Eerdmans Publishing Co.
Corcoran, K. J. (2006). Rethinking human nature: A Christian materialist alternative to the soul.
Grand Rapids, MI: Baker Academic.
Cotton, K. (1989). Expectations and student outcomes. Portland, Oregon: Northwest Regional
Educational Laboratory.
Crick, F. (1994). The astonishing hypothesis: The scientific search for the soul. New York: Maxwell
Macmillan International.
Darwin, C. (1959). On the origin of species by means of natural selection microform. London: J.
Murray.
Damasio, A. (1994). Descartes’ error: Emotion, reason, and the human brain. New York: Avon
Books.
Damasio, A. (1999). The feeling of what happens: Body and emotion in the making of
consciousness. New York: Harcourt Brace.
Damasio, A. (2003). Looking for Spinoza: Joy, sorrow, and the feeling brain. Orlando, FL:
Harcourt.
Davis, J. (1997). Mapping the mind: The secrets of the human brain and how it works. Secaucus,
NJ: Birch Lane Press.
Dawkins, R. (1989). The selfish gene. Oxford, New York: Oxford University Press.
Dawkins, R. (2006). The God delusion. Boston, MA: Houghton Mifflin.
Dennett, D. C. (1991). Consciousness explained. Boston: Little, Brown.
Descartes, R. (1649). The passions of the soul. Translated by Stephen H. Voss. (1989).
Indianapolis, IN: Hackett Publishing Company.
Diamond, M., & Hopson, J. (1998). Magic trees of the mind: How to nurture your child’s
intelligence, creativity, and healthy emotions from birth through adolescence. New York:
Penguin Putnam.
Dispenza, J. (2007). Evolve your brain: The science of changing your mind. Deerfield Beach, FL:
Health Communications, Inc.
Doidge, N. (2007). The brain that changes itself: Stories of personal triumph from the frontiers of
brain science. New York, NY: Penguin Group.
Dronkers, N. F., Husted, D. A., Deutsch, G., Tayler, M. K., Saunders, G., & Merzenich, M. M. (1999).
Lesion site as a predictor of improvement after “Fast ForWord” treatment in adult
aphasic patients. Brain and Language, 69, 450-452.
Dweck, C. S., Hong, Y., & Chiu, C. (1993). Implicit theories: Individual differences in the likelihood
and meaning of dispositional inference. Personality and Social Psychology Bulletin, 19,
644-656.
Dweck, C. S. (1999). Self-theories: Their role in motivation, personality and development.
Philadelphia: The Psychology Press.
Dweck, C. S., & Leggett, E. L. (1988). A social cognitive approach to motivation and personality.
Psychological Review, 95(2), 256-273.
Dweck, C. S. (2006). Mindset: The new psychology of success. New York: Ballantine Books.
Eccles, J. (1989). Evolution of the brain: Creation of the self. London, UK: Routledge.
Edmonds, R. (1986). Characteristics of effective schools. In U. Neisser (Ed.), The school achievement of minority children: New perspectives (pp. 93-104). Hillsdale, NJ: Lawrence Erlbaum.
Evans, C. S. (1981). Separable souls: A defense of ‘minimal dualism.’ Southern Journal of
Philosophy, 19, 313-31.
Feldman, R., & Prohaska, T. (1979). The student as Pygmalion: Effect of student expectation on the teacher. Journal of Educational Psychology,
http://search.ebscohost.com.ezproxy.wheaton.edu
Feldman, R., & Theiss, A. (1982). The teacher and student as Pygmalions: Joint effects of teacher
and student expectations. Journal of Educational Psychology. 74(2), 217-223.
http://search.ebscohost.com.ezproxy.wheaton.edu
Fraga, M. F., Ballestar, E., Paz, M. F., Ropero, S., Setien, F., Ballestar, M. L., Heine-Suner, D.,
Cigudosa, J.C., Urioste, M., Benitez, J., Boix-Chornet, M., Sanchez-Aguilera, A., Ling, C.,
Carlsson, E., Poulsen, P., Vaag, A., Stephan, Z., Spector, T., Wu, Y., Plass, C., & Esteller, M.
(2005). Epigenetic differences arise during the lifetime of monozygotic twins.
Proceedings of the National Academy of Sciences, 102(30).
Friel-Patti, S., DesBarres, K., & Thibodeau, L. (2001). Case studies of children using Fast ForWord.
American Journal of Speech-Language Pathology, 10(3), 203-215.
Gaab, N., Gabrieli, J. D. E., Deutsch, G. K., Tallal, P., & Temple, E. (2007). Neural correlates of
rapid auditory processing are disrupted in children with developmental dyslexia and
ameliorated with training: An fMRI study. Restorative Neurology and Neuroscience, 25,
295-310.
Gandara, P. (2009). On Hispanic Education—Progress and Stagnation: 25 Years of Hispanic
Achievement. Diverse Issues in Higher Education, 26(9), 37-38.
http://search.ebscohost.com.ezproxy.wheaton.edu
Gardner, H. (1983). Frames of mind: The theory of multiple intelligences. New York: Basic Books.
Gazzaniga, M. (2005). The ethical brain. New York: Dana Press.
Gillam, R. B. (1999). Treatment for temporal processing deficits: Computer-assisted language
intervention using Fast ForWord(R): Theoretical and empirical considerations for clinical
decision-making. Language, Speech, and Hearing Services in Schools, 30(4), 363-370.
Gould, S. J. (1981). The mismeasure of man. New York: W.W. Norton & Co.
Green, J. B. (1998). Eschatology and the nature of humans: A reconsideration of pertinent
biblical evidence. Science and Christian Belief, 14(1), 33-50.
Green, J. B., & Palmer, S. L. (Eds.). (2005). In search of the soul: Four views of the mind-body
problem. Downers Grove, IL: InterVarsity Press.
Hackett, G., & Betz, N. E. (1989). An exploration of the mathematics self-efficacy/mathematics performance correspondence. Journal for Research in Mathematics Education, 20, 261-273.
Hart, L. (1985). Human brain, human learning. New York: Longman.
Hebb, D. O. (1949). The organization of behavior: A neuropsychological theory. New York: John
Wiley & Sons.
Herrnstein, R. J., & Murray, C. (1994). The bell curve: Intelligence and class structure in American
life. New York: Free Press.
Hilliard, A. (1991). Do we have the will to educate all children? Educational Leadership, 49(1),
31-36.
Hillman, S. (1984). Contributions to achievement: the role of expectations and self-efficacy
students, teachers, and principals. Abstract of paper presented at the annual meeting of
the American Educational Research Association, New Orleans, Louisiana.
Howard, J. (1990). Getting smart: The social construction of intelligence. Lexington, MA: The
Efficacy Institute.
Kim, J. (1994). Supervenience. In S. Guttenplan (Ed.), A companion to the philosophy of mind (pp.
575-584). Oxford: Blackwell.
Jeeves, M. (Ed.). (2004). From cells to souls—and beyond: Changing portraits of human nature.
Grand Rapids, MI: William B. Eerdmans Publishing Company.
Jensen, E. (2005). Teaching with the brain in mind (2nd ed.). Alexandria, VA: Association for
Supervision and Curriculum Development.
Jorgenson, O. (2003). Brain scam? Why educators should be careful about embracing ‘brain
research.’ Kappa Delta Pi. Retrieved November 13, 2008, from
http://findarticles.com/p/articles/mi_qa4013/is_200307/ai_n9272564?tag=content;col
1
Kandel, E. R. (2003). The molecular biology of memory storage: A dialogue between genes and
synapses. In H. Jornvall, ed., Nobel Lectures, Physiology of Medicine, 1996-2000.
Singapore: World Scientific Publishing Co., 402.
Kandel, E. R. (2006). In search of memory: The emergence of a new science of the mind. New
York, NY: W.W. Norton & Company.
Lederberg, J. (1994). Honoring Avery, MacLeod, and McCarty: The team that transformed
genetics. The Scientist, 8(4), 11.
LeDoux, J. (2002). Synaptic self: How our brains become who we are. New York, NY: Penguin
Books.
Lent, R. W., Brown, S. D., & Larkin, K. C. (1984). Relation of self-efficacy expectations to
academic achievement and persistence. Journal of Counseling Psychology, 31(3), 356-362.
Lewis, C. S. (1960). Mere Christianity. New York: Macmillan.
Lewontin, R. C., Rose, S., & Kamin, L. J. (1984). Not in our genes. New York: Pantheon.
Lipton, B. H. (1977a). A fine structural analysis of normal and modulated cells in myogenic
culture. Developmental Biology, 60, 26-47.
Lipton, B. H. (1977b). Collagen synthesis by normal and bromodeoxyuridine-treated cells in
myogenic culture. Developmental Biology, 61, 153-165.
Lipton, B. H., Bensch, K. G., et al. (1991). Microvessel endothelial cell transdifferentiation:
Phenotypic characterization. Differentiation, 46, 117-133.
Lipton, B. H., Bensch, K. G. et al. (1992). Histamine-modulated transdifferentiation of dermal
microvascular endothelial cells. Experimental Cell Research, 199, 279-291.
Lipton, B. (2005). The biology of belief: Unleashing the power of consciousness, matter and
miracles. Cambridge, UK: Cambrian Printers.
Lipton, B. H. (2007, September). The biology of belief. Lecture presented in London, England.
Loeb, D. F., Stoke, C., & Fey, M. E. (2001). Language changes associated with Fast ForWord-Language: Evidence from case studies. American Journal of Speech-Language Pathology,
10(3), 216-230.
Lumsden, L. (1997). Expectations for students. ERIC Digest. Retrieved July 2008, from
http://cepm.uoregon.edu/publications/digests/digest116.html
Mackay, D. M. (1974). The clock work image: A Christian perspective on science. Downers Grove,
IL: InterVarsity Press.
Mackay, D. M. (1979). Human science and human dignity. Downers Grove, IL: InterVarsity Press.
Maddux, J. E., & Stanley, M. A. (Eds.). (1986). Self-efficacy theory in contemporary psychology
[Special issue]. Journal of Social and Clinical Psychology. 4(3).
Mangels, J. A., Butterfield, B., Lamb, J., Good, C., & Dweck, C. S. (2006). Why do beliefs about
intelligence influence learning success? A social cognitive neuroscience model. Social,
Cognitive, and Affective Neuroscience, 1, 75-86.
McGinness, G., & Ramey, C. T. (1981). Developing sociolinguistic competence in children.
Canadian Journal of Early Childhood Education, 1, 22-43.
Merzenich, M. M., Nelson, R. J., Stryker, M. P., Cynader, M. S., Schoppmann, A., & Zook, J. M.
(1984). Somatosensory cortical map changes following digit amputation in adult
monkeys. Journal of Comparative Neurology, 224(4), 591-605.
Merzenich, M. M., Saunders, G., Jenkins, W. M., Peterson, B. E., & Tallal, P. (1999). Pervasive developmental disorders: Listening training and language abilities. In S. H. Broman & J. M. Fletcher (Eds.), The changing nervous system: Neurobehavioral consequences of early brain disorders (pp. 365-385). Oxford: Oxford University Press.
Miller, S. L., Linn, N., Tallal, P., Merzenich, M. M., & Jenkins, W. M. (1999). Acoustically modified speech and language training: A relationship between auditory word discrimination training and measures of language outcome. Speech and Language Therapy, 197, 159-182.
Moses, A. (2009, February 13). $100 billion to education in Obama’s stimulus package. The
George Lucas Educational Foundation. Retrieved July 3, 2009, from
http://www.edutopia.org/economic-stimulus-education-school-budget
Mountcastle, V. (1978). An Organizing Principle for Cerebral Function: The Unit Model and the
Distributed System. In G. M. Edelman & V. B. Mountcastle (Eds.), The Mindful Brain,
Cambridge, MA: MIT Press.
Multon, K. D., Brown, S. D., & Lent, R. W. (1991). Relation of self-efficacy beliefs to academic
outcomes: A meta-analytic investigation. Journal of Counseling Psychology, 28, 30-38.
Murphy, N. (2006). Bodies and souls, or spirited bodies? New York: Cambridge University Press.
Musher-Eizenman, D. R., Nesselroade, J. R., & Schmitz, B. (2002). Perceived control and
academic performance: A comparison of high- and low-performing children on within-person change patterns. International Journal of Behavioral Development, 26(6), 540-547.
Myers, D. G. (1978). The human puzzle: Psychological research and Christian belief. San
Francisco: Harper and Row.
Neisser, U., Boodoo, G., Bouchard, T. J., Jr., Boykin, A. W., Brody, N., Ceci, S. J., Halpern, D. F.,
Loehlin, J. C., Perloff, R., Sternberg, R. J., & Urbina, S. (1996). Intelligence: Knowns and
unknowns. American Psychologist, 51(2), 435-442.
Nelkin, D., & Lindee, M. S. (1995). The DNA mystique: The gene as a cultural icon. New York:
Freeman.
Nicholls, J. G. (1984). Achievement motivation: Conceptions of ability, subjective experience,
task choice, and performance. Psychological Review, 91, 328-346.
Nijhout, H. F. (1990). Metaphors and the roles of genes in development. BioEssays, 12(9), 441-446.
Pajares, F., & Miller, M. D. (1995). Mathematics self-efficacy and mathematics outcomes: The
need for specificity of assessment. Journal of Counseling Psychology, 42, 190-198.
Pajares, F. (1996). Self-efficacy beliefs in academic settings. Review of Educational Research, 66,
543-578.
Pardini, P. (2007). Higher expectations challenge teachers and students to succeed. JSD, 28(4),
10-13.
Pascual-Leone, A., Hamilton, R., Tormos, J. M., Keenan, J. P., & Catala, M. D. (1999).
Neuroplasticity in the adjustment to blindness. In J. Grafman & Y. Christen, eds.,
Neuronal plasticity: Building a bridge from the laboratory to the clinic. New York:
Springer-Verlag, 94-108.
Pavlov, I. P. (1927). Conditioned Reflexes: An Investigation of the Physiological Activity of the
Cerebral Cortex (translated by G. V. Anrep). London: Oxford University Press.
Penfield, W. & Rasmussen, T. (1950). The Cerebral Cortex of Man. New York: Macmillan.
Petrill, S. A., & Justice, L. M. (2007). Bridging the gap between genomics and education. Mind,
Brain and Education, 1(4), 153-161.
Petronis, A., Paterson, A. D., & Kennedy, J. L. (1999). Schizophrenia: An Epigenetic Puzzle?
Schizophrenia Bulletin, 25(4), 639-655.
Petronis, A., Gottesman, I. I., Kennedy, J. L., & Paterson, A. D. (2003). Monozygotic twins exhibit
numerous epigenetic differences: Clues to twin discordance? Schizophrenia Bulletin,
29(1), 169-178.
Pinker, S. (1997). How the mind works. New York: Norton.
Pinker, S. (2002). The blank slate: The modern denial of human nature. New York: Viking.
Pintrich, P. R., & De Groot, E. V. (1990). Motivational and self-regulated learning component of
classroom academic performance. Journal of Educational Psychology, 82, 33-40.
Planty, M., Kena, G., Hannes, G., & National Center for Education Statistics. (2009). The
Condition of Education 2009 in Brief, NCES 2009-082. National Center for Education
Statistics, http://search.ebscohost.com.ezproxy.wheaton.edu
Posner, M., & Rothbart, M. (2007). Educating the human brain. Washington, DC: American
Psychological Association.
Ramachandran, V. S., Rogers-Ramachandran, D., & Stewart, M. (1992). Perceptual correlates of
massive cortical reorganization. Science, 258(5085), 1159-60.
Ramey, C. T., McGinness, G., Cross, L., Collier, A., & Barrie-Blackley, S. (1981). The Abecedarian
approach to social competence: Cognitive and linguistic intervention for disadvantaged
preschoolers. In K. Borman (Ed.), The social life of children in a changing society (pp.
145-174). Hillsdale, NJ: Erlbaum Associates.
Ramey, C. T., & Campbell, F. A. (1984). Preventative education for high-risk children: Cognitive
consequences of the Carolina Abecedarian Project. American Journal of Mental
Deficiency, 88, 515-523.
Ramey, C. T., Bryant, D. M., Sparling, J. J., & Wasik, B. H. (1985). Project CARE: A comparison of
two early intervention strategies to prevent retarded development. Topics in Early
Childhood Special Education, 5(2), 12-25.
Ramey, C. T., & Ramey, S. L. (1996). At risk does not mean doomed. National Health/Education
Consortium Occasional Paper #4. Paper presented at the meeting of the American
Association of Science, February 1996.
Ramey, C. T., & Ramey, S. L. (2002, April 30). Early childhood education: From efficacy research
to improved practice. Retrieved September 25, 2009 from
http://www.ed.gov/teachers/how/early/cognitivedevsummit02/page_pg2.html
Raudenbush, S. W. (1984). Magnitude of teacher expectancy effects on pupil IQ as a function of
credibility induction: A synthesis of findings from 18 experiments. Journal of Educational
Psychology, 76, 85-97.
Ratey, J. J. (2001). A user’s guide to the brain: Perception, attention, and the four theatres of the
brain. New York: Vintage Books.
Reik, W., Dean, W., & Walter, J. (2001). Epigenetic reprogramming in mammalian development.
Science, 293, 1089-1093.
Richards, M. (2009). Closing the Gap between Professional Development and Student
Achievement. Online Submission, http://search.ebscohost.com.ezproxy.wheaton.edu
Rist, R. (2000). Student social class and teacher expectations: The self-fulfilling prophecy in
ghetto education. Harvard Educational Review, 70(3), 266-301.
http://search.ebscohost.com.ezproxy.wheaton.edu
Robins, R. W., & Pals, J. L. (2002). Implicit self-theories in the academic domain: Implications for
goal orientation, attributions, affect, and self-esteem change. Self and Identity, 1, 313-336.
Robbins, S.B., Lauver, K., Le, H., Davis, D., Langley, R., & Carlstrom, K. (2004). Do psychosocial
and study skill factors predict college outcomes? A meta-analysis. Psychological Bulletin,
130, 261-288.
Rodriguez, J. P., & Loomis, S. R. (2007). A new view of institutions, human capital, and market
standardization. Education, Knowledge, and Economy, 1(1), 93-105.
Rosenthal, R., & Jacobson, L. (1966). Teachers’ expectancies: Determinants of pupils’ IQ gains.
Psychological Reports, 19, 115-118.
Rosenthal, R., & Jacobson, L. (1968). Pygmalion in the classroom: Teacher expectation and pupils’ intellectual development. New York: Holt, Rinehart and Winston.
Rosenthal, R. (1984). Meta-analytic procedures for social research. Newbury Park, CA: Sage.
Rosenthal, R. (1991). Teacher expectancy effects: A brief update 25 years after the Pygmalion
experiment. Journal of Research in Education, 1, 3-12.
Rosenthal, R. (1998). Interpersonal expectancy effects: A forty year perspective. Psychology
Teacher Network, 8, 2-4.
Rosenzweig, M. R., Krech, D., Bennett, E. L., & Diamond, M. C. (1962). Effects of environmental
complexity and training on brain chemistry and anatomy: A replication and extension.
Journal of Comparative and Physiological Psychology, 55, 429-37.
Rossi, E. (2002). The psychobiology of gene expression. New York: Norton.
Ryle, G. (1949). The concept of mind. Chicago: University of Chicago Press.
Schunk, D. H. (1991). Self-efficacy and academic motivation. Educational Psychologist, 26, 207-231.
Schwartz, J., & Begley, S. (2002). The mind and the brain: Neuroplasticity and the power of
mental force. New York: Harper Perennial.
Sizer, T. R., & Sizer, N. F. (1999). The students are watching: Schools and the moral contract.
Boston: Beacon Press.
Smith, R. J., Arnkoff, D. B., & Wright, T. L. (1990). Test anxiety and academic competence: A
comparison of alternative models. Journal of Counseling Psychology, 37, 313-321.
Sousa, D. A. (2001). How the brain learns: A classroom teacher’s guide. Thousand Oaks, CA:
Corwin Press.
Sperry, R. W. (1992). Turnabout on consciousness: A mentalist view. Journal of Mind and
Behavior, 13, 259-280.
Sparling, J. J., Lewis, I., & Ramey, C. T. (1995). Partners for learning: Birth to 36 months. Lewisville,
NC: Kaplan Press.
Sternberg, R. J. (1996). Myths, countermyths, and truths about intelligence. Educational
Researcher, 25(2), 11-16.
Stipek, D., & Gralinski, J. H. (1996). Children’s beliefs about intelligence and school performance.
Journal of Educational Psychology, 88, 397-407.
Strauss, V. (2001). Brain research oversold, experts say. Washington Post. Retrieved November
13, 2008, from http://teachers.net/gazette/APR01/strauss.html
Surani, A. M. (2001). Reprogramming of genome function through epigenetic inheritance.
Nature, 414, 122–128.
Sylwester, R. (1995). Celebration of neurons: An educator’s guide to the human brain. Alexandria,
VA: Association for Supervision and Curriculum Development.
Symer, D. E. & Bender, J. (2001). Hip-hopping out of control. Nature, 411, 146-149.
Tallal, P., Miller, S., Jenkins, B., & Merzenich, M. (1997). The role of temporal processing in developmental language-based learning disorders: Research and clinical implications. In B. Blachman (Ed.), Foundations of reading acquisition. Newark: Lawrence Erlbaum Associates.
Taub, E. (1977). Movement in non-human primates deprived of somatosensory feedback. In J. F.
Keogh, ed., Exercise and sport sciences reviews (pp. 335-374). Santa Barbara, CA: Journal
Publishing Affiliates.
Taub, E. (1980). Somatosensory deafferentation research with monkeys: Implications for
rehabilitation medicine. In L. P. Ince, ed., Behavioral psychology in rehabilitation
medicine: Clinical applications (pp. 371-401). Baltimore: Williams & Williams.
Trudeau, M. (2007). Students’ view of intelligence can help grades. NPR website. Retrieved November 17, 2008, from http://www.npr.org/templates/story/story.php?storyId=7406521
Vanneman, A., Hamilton, L., Anderson, J., Rahman, T., & National Center for Education Statistics,
(2009). Achievement Gaps: How Black and White Students in Public Schools Perform in
Mathematics and Reading on the National Assessment of Educational Progress.
Statistical Analysis Report. NCES 2009-455. National Center for Education Statistics,
http://search.ebscohost.com.ezproxy.wheaton.edu
Vinson, V., Pernell, B., Chin, G., & Marx, J. (2000). Macromolecular ballet. Science, 288, 1369.
Waterland, R. A., & Jirtle, R. L. (2003). Transposable elements: Targets for early nutritional effects on epigenetic gene regulation. Molecular and Cellular Biology, 23(15), 5293-5300.
Watson, J. D., & Crick, F. H. C. (1953). A structure for deoxyribose nucleic acid. Nature, 171, 737-738.
Watters, E. (2006). DNA is not destiny: The new science of epigenetics rewrites the rules of
disease, heredity, and identity. Discover: Science, Technology and The Future. Retrieved
September 25, 2009, from
http://discovermagazine.com/2006/nov/cover/article_view?b_start:int=0&-C=
Weaver, I., Cervoni, N., Champagne, F., D’Alessio, A., Sharma, S., Seckl, J., et al. (2004).
Epigenetic programming by maternal behavior. Nature Neuroscience, 7(8), 847-854.
http://search.ebscohost.com.ezproxy.wheaton.edu
Westwater, A., & Wolfe, P. (2000). The Brain-Compatible Curriculum. Educational Leadership,
58(3), 49. http://search.ebscohost.com.ezproxy.wheaton.edu
Wilhite, S. C. (1990). Self-efficacy, locus of control, self-assessment of memory ability, and study
activities as predictors of college course achievement. Journal of Educational Psychology,
82, 696-700.
Willingham, D. T. (2006). “Brain-based” learning: More fiction than fact. American Educator.
Retrieved November 13, 2008, from http://www.aft.org/pubsreports/american_educator/issues/fall2006/cogsci.htm
Wolfe, P. (2001). Brain matters: Translating the research to classroom practice. Alexandria, VA:
Association for Supervision and Curriculum Development.
Wolfe, P., & Brandt, R. (1998). What do we know from brain research? Educational Leadership,
56(3), 8. http://search.ebscohost.com.ezproxy.wheaton.edu
Wong, J. B. (2002). Christian wholism: Theological and ethical implications in the postmodern
world. Lanham, MD: University Press of America.
Woolsey, C. N., Marshall, W. H., & Bard, P. (1937). Cortical representation of tactile sensibility as
indicated by cortical potentials. Science, 85, 388-90.
Woolsey, C. N., & Travis, A. M. (1956). Motor performance of monkeys after bilateral partial and
total cerebral decortications. American Journal of Physical Medicine and Rehabilitation,
35, 273-310.
Woolsey, C. N., Erickson, T. C., & Gilson, W. E. (1979). Localization in somatic sensory and motor
areas of human cerebral cortex as determined by direct recording of evoked potentials
and electrical stimulation. Journal of Neurosurgery, 51, 476-506.
Hong, Y., Dweck, C. S., Lin, D., & Wan, W. (1999). Implicit theories, attributions, and coping: A meaning system approach. Journal of Personality and Social Psychology, 77(3), 588-599.
http://search.ebscohost.com.ezproxy.wheaton.edu
Zeichner, K. M. (1995). Educating teachers to close the achievement gap: Issues of pedagogy,
knowledge, and teacher preparation. In B. Williams (Ed.). (1996). Closing the
achievement gap: A vision to guide change in beliefs and practice. Alexandria, VA:
Association for Supervision and Curriculum Development.
Zimmerman, B. J., Bandura, A., & Martinez-Pons, M. (1992). Self-motivation for academic
attainment: The role of self-efficacy beliefs and personal goal setting. American
Educational Research Journal, 29(3), 663-676.
Zull, J. E. (2002). The art of changing the brain: Enriching the practice of teaching by exploring
the biology of learning. Sterling, VA: Stylus Publishing.