
Chapter 11
HYPOTHESIS TESTING USING THE
ONE-WAY ANALYSIS OF VARIANCE
Moving Forward
Your goals in this chapter are to learn:
• The terminology of analysis of variance
• When and how to compute Fobt
• Why Fobt should equal 1 if H0 is true, and why
it is greater than 1 if H0 is false
• When and how to compute Tukey’s HSD
• How eta squared describes effect size
Analysis of Variance
• The analysis of variance is the parametric
procedure for determining whether significant
differences occur in an experiment with two
or more sample means
• In an experiment involving only two
conditions of the independent variable, you
may use either a t-test or the ANOVA
An Overview of ANOVA
One-Way ANOVA
• Analysis of variance is abbreviated as ANOVA
• An independent variable is also called a factor
• Each condition of the independent variable is
called a level or treatment
• Differences produced by the independent
variable are a treatment effect
Between-Subjects Factor
• A one-way ANOVA is performed when one
independent variable is tested in the
experiment
• When an independent variable is studied
using independent samples in all conditions, it
is called a between-subjects factor
• A between-subjects factor involves using the
formulas for a between-subjects ANOVA
Within-Subjects Factor
• When a factor is studied using related
(dependent) samples in all levels, it is called a
within-subjects factor
• This involves a set of formulas called a within-subjects ANOVA
Diagram of a Study Having Three
Levels of One Factor
Assumptions of the ANOVA
1. All conditions contain independent samples
2. The dependent scores are normally
distributed, interval or ratio scores
3. The variances of the populations are
homogeneous
Experiment-Wise Error
• The probability of making a Type I error
somewhere among the comparisons in an
experiment is called the experiment-wise
error rate
• When we use a t-test to compare only two
means in an experiment, the experiment-wise
error rate equals α
Comparing Means
• When there are more than two means in an
experiment, the multiple t-tests result in an
experiment-wise error rate much larger than
the α we have selected
• Using the ANOVA allows us to make all our
decisions and keep the experiment-wise error
rate equal to α
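As a rough sketch (not from the slides): if the c pairwise comparisons were independent and each were tested at α, the experiment-wise error rate would be about 1 - (1 - α)^c, which grows quickly as more means are compared.

```python
# Approximate experiment-wise Type I error rate when c comparisons are
# each tested at alpha. Assumes independent comparisons, which multiple
# t-tests on the same data are not exactly -- this is only an illustration.
alpha = 0.05
for c in (1, 3, 6):  # 3 means require 3 pairwise t-tests; 4 means require 6
    experiment_wise = 1 - (1 - alpha) ** c
    print(f"{c} comparisons: experiment-wise error rate ~ {experiment_wise:.3f}")
```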
Statistical Hypotheses
H0: μ1 = μ2 = ... = μk
Ha: not all μs are equal
The F-Test
• The statistic for the ANOVA is F
• When Fobt is significant, it indicates only that
somewhere among the means at least two of
them differ significantly
• It does NOT indicate which specific means
differ significantly
• When the F-test is significant, we perform
post hoc comparisons
Post Hoc Comparisons
• Post hoc comparisons are like t-tests
• We compare all possible pairs of level means
from a factor, one pair at a time to determine
which means differ significantly from each
other
Components of the ANOVA
Mean Squares
• The mean square within groups describes the
variability in scores within the conditions of an
experiment. It is symbolized by MSwn.
• The mean square between groups describes
the differences between the means of the
conditions in a factor. It is symbolized by MSbn.
The F-Ratio
• The F-ratio equals the mean square between
groups divided by the mean square within
groups
Fobt = MSbn / MSwn
• When H0 is true, Fobt should equal 1
• When H0 is false, Fobt should be greater than 1
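A minimal simulation sketch of the last two points (not part of the slides; the population mean of 50, standard deviation of 10, and n = 6 are arbitrary choices): when all samples are drawn from the same population, Fobt varies from sample to sample but averages near 1.

```python
import numpy as np
from scipy import stats

# When H0 is true, all samples come from identical populations, so any
# differences between sample means are due only to chance and Fobt
# should be near 1 on average.
rng = np.random.default_rng(0)
f_values = []
for _ in range(2000):
    groups = [rng.normal(loc=50, scale=10, size=6) for _ in range(3)]
    f_obt, _ = stats.f_oneway(*groups)
    f_values.append(f_obt)

print(np.mean(f_values))  # close to 1 (theoretically df_wn / (df_wn - 2))
```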
Performing the ANOVA
Sum of Squares
• The computations for the ANOVA require the
use of several sums of squared deviations
• The sum of squares is the sum of the squared
deviations of a set of scores around the mean
of those scores
• It is symbolized by SS
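A quick illustration with hypothetical scores:

```python
# Sum of squares: squared deviations of a set of scores around their own mean.
scores = [3, 5, 7, 9]                      # hypothetical scores
mean = sum(scores) / len(scores)           # 6.0
ss = sum((x - mean) ** 2 for x in scores)  # (-3)^2 + (-1)^2 + 1^2 + 3^2
print(ss)                                  # 20.0
```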
Summary Table of a
One-way ANOVA
Computing Fobt
1. Compute the sums and means for each level:
• ΣX
• ΣX²
• X̄ (the mean)
Add the ΣX from all levels to get ΣXtot. Add together the ΣX² from all levels to get ΣX²tot. Add the ns together to get N.
Computing Fobt
2. Compute the total sum of squares (SStot):
SStot = ΣX²tot - (ΣXtot)²/N
Computing Fobt
3. Compute the sum of squares between groups (SSbn):
SSbn = Σ[(ΣX in column)²/n in column] - (ΣXtot)²/N
Computing Fobt
4. Compute the sum of squares within groups (SSwn):
SSwn = SStot - SSbn
Computing Fobt
5. Compute the degrees of freedom
• The degrees of freedom between groups
equals k – 1 where k is the number of levels
in the factor
• The degrees of freedom within groups
equals N – k
• The degrees of freedom total equals N – 1
Computing Fobt
6. Compute the mean squares
MSbn = SSbn / dfbn
MSwn = SSwn / dfwn
Computing Fobt
7. Compute Fobt
Fobt = MSbn / MSwn
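Putting steps 1-7 together, here is a minimal Python sketch of the computational formulas above; the three groups of scores are hypothetical placeholders.

```python
# Sketch of steps 1-7 using the computational formulas on these slides.
groups = [
    [3, 5, 7, 9],      # hypothetical level 1 scores
    [4, 6, 8, 10],     # hypothetical level 2 scores
    [2, 4, 6, 8],      # hypothetical level 3 scores
]

# Step 1: sums, sums of squares, and counts
sum_x = [sum(g) for g in groups]                   # ΣX per level
sum_x2 = [sum(x * x for x in g) for g in groups]   # ΣX² per level
ns = [len(g) for g in groups]
sum_x_tot = sum(sum_x)
sum_x2_tot = sum(sum_x2)
N = sum(ns)
k = len(groups)

# Steps 2-4: sums of squares
ss_tot = sum_x2_tot - sum_x_tot ** 2 / N
ss_bn = sum(sx ** 2 / n for sx, n in zip(sum_x, ns)) - sum_x_tot ** 2 / N
ss_wn = ss_tot - ss_bn

# Step 5: degrees of freedom
df_bn, df_wn = k - 1, N - k

# Steps 6-7: mean squares and Fobt
ms_bn = ss_bn / df_bn
ms_wn = ss_wn / df_wn
f_obt = ms_bn / ms_wn
print(f_obt)
```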
Sampling Distribution of F
When H0 Is True
Degrees of Freedom
The critical value of F (Fcrit) depends on
• The degrees of freedom (both the
dfbn = k – 1 and the dfwn = N – k)
• The α selected
• The F-test is always a one-tailed test
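If software is available, Fcrit can be looked up with scipy instead of an F table; a short sketch (the df values of 2 and 15 match the worked example later in this chapter):

```python
from scipy import stats

# Looking up Fcrit. The F-test is one-tailed, so Fcrit cuts off the
# upper alpha of the F distribution for the given degrees of freedom.
alpha = 0.05
df_bn, df_wn = 2, 15                       # k - 1 = 2, N - k = 15
f_crit = stats.f.ppf(1 - alpha, df_bn, df_wn)
print(round(f_crit, 2))                    # 3.68
```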
Tukey’s HSD Test
When the ns in all levels of the factor are equal, use the Tukey HSD multiple comparisons test:
HSD = (qk)√(MSwn/n)
where qk is found using Table 5 in Appendix B
Tukey’s HSD Test
• Determine the difference between each pair
of means
• Compare each difference between the means
to the HSD
• If the absolute difference between two means
is greater than the HSD, then these means
differ significantly
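A short sketch of the HSD formula in Python; it takes qk from scipy's studentized range distribution (SciPy 1.7 or later) rather than from Table 5, and plugs in the numbers from the worked example that follows.

```python
import math
from scipy import stats

def tukey_hsd(ms_wn, n, k, df_wn, alpha=0.05):
    # q_k from the studentized range distribution; Table 5 in Appendix B
    # gives essentially the same value (3.675 for k = 3, df = 15).
    q_k = stats.studentized_range.ppf(1 - alpha, k, df_wn)
    return q_k * math.sqrt(ms_wn / n)

# Numbers from the worked example later in the chapter:
# MSwn = 2.233, n = 6 per level, k = 3 levels, df_wn = 15
print(tukey_hsd(2.233, 6, 3, 15))          # about 2.24
```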
Effect Size and Eta²
Proportion of Variance
Accounted For
Eta squared (η²) indicates the proportion of variance in the dependent variable scores that is accounted for by changing the levels of a factor:
η² = SSbn/SStot
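Using the sums of squares from the worked example that follows, a one-line check of η²:

```python
# Eta squared from the sums of squares in the worked example below.
ss_bn, ss_tot = 22.111, 55.611
eta_squared = ss_bn / ss_tot
print(round(eta_squared, 3))   # 0.398 -- about 40% of the variance
```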
Example
Using the following data set, conduct a one-way ANOVA. Use α = 0.05.
The 18 scores, with n = 6 scores in each of the three groups (N = 18):
14, 14, 10, 13, 11, 15, 13, 10, 12, 11, 14, 13, 14, 15, 11, 10, 14, 15
The group totals are ΣX1 = 80 (Group 1), ΣX2 = 67 (Group 2), and ΣX3 = 82 (Group 3).
Example
SStot = ΣX²tot - (ΣXtot)²/N
      = 2969 - (229)²/18
      = 55.611
Example
SSbn = Σ[(ΣX in column)²/n in column] - (ΣXtot)²/N
     = (80²/6 + 67²/6 + 82²/6) - (229)²/18
     = 22.111
Example
SSwn = SStot - SSbn
     = 55.611 - 22.111
     = 33.50
Example
• dfbn = k – 1 = 3 – 1 = 2
• dfwn = N – k = 18 – 3 = 15
• dftot = N – 1 = 18 – 1 = 17
Example
MSbn = SSbn/dfbn = 22.111/2 = 11.055
MSwn = SSwn/dfwn = 33.50/15 = 2.233
Fobt = MSbn/MSwn = 11.055/2.233 = 4.951
Example
• Fcrit for 2 and 15 degrees of freedom and α = 0.05 is 3.68
• Since Fobt = 4.951 is larger than Fcrit = 3.68, the ANOVA is significant
• A post hoc test must now be performed
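As a cross-check, a short sketch that recomputes Fobt from the summary values reported on these slides:

```python
# Recomputing the example's Fobt from the summary quantities on the slides:
# group sums 80, 67, 82 with n = 6 each, sum of all squared scores = 2969.
group_sums = [80, 67, 82]
n, N = 6, 18
k = len(group_sums)
sum_x_tot = sum(group_sums)                                        # 229
ss_tot = 2969 - sum_x_tot ** 2 / N                                 # 55.611
ss_bn = sum(s ** 2 / n for s in group_sums) - sum_x_tot ** 2 / N   # 22.111
ss_wn = ss_tot - ss_bn                                             # 33.500
f_obt = (ss_bn / (k - 1)) / (ss_wn / (N - k))
print(round(f_obt, 3))                                             # about 4.95
```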
Example
HSD = (qk)√(MSwn/n)
    = (3.675)√(2.233/6)
    = 2.242

X̄1 - X̄2 = 13.333 - 11.167 = 2.166
X̄3 - X̄2 = 13.667 - 11.167 = 2.500
X̄3 - X̄1 = 13.667 - 13.333 = 0.334
Example
Because 2.500 > 2.242 (HSD), the mean of sample 3 differs significantly from the mean of sample 2. The other two differences (2.166 and 0.334) do not exceed the HSD, so those pairs of means do not differ significantly.
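The same pairwise comparisons in code, using the level means and HSD from the slides:

```python
# Each absolute difference between level means compared to the HSD.
means = {"sample 1": 13.333, "sample 2": 11.167, "sample 3": 13.667}
hsd = 2.242
pairs = [("sample 1", "sample 2"),
         ("sample 3", "sample 2"),
         ("sample 3", "sample 1")]
for a, b in pairs:
    diff = abs(means[a] - means[b])
    verdict = "significant" if diff > hsd else "not significant"
    print(f"{a} vs {b}: |difference| = {diff:.3f} -> {verdict}")
```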