SAMPLE SYLLABUS WITH COMPUTATIONAL FORMULAS

1. The Language of Statistics
A. What Is Statistics?
B. Why Study Statistics?
1. Personal Reasons
2. Professional Reasons
3. Our Goals
C. Statistics as a Second Language
D. What You Need to Use This Book Successfully
2. Definitions and Scaling
A. Statistics: Some Basic Vocabulary
1. Variable
2. Population and Parameters
3. Sample, Statistic, and Sampling
B. Scales of Measurement
1. The Nominal Scale
2. The Ordinal Scale
3. The Interval Scale
4. The Ratio Scale
C. Two Basic Uses of Statistics
1. Descriptive Statistics
2. Inferential Statistics
3. The Frequency Distribution
A. Defining the Frequency Distribution
1. Continuous Variables and Discrete Variables
2. Real Limits and Apparent Limits
B. Percentage or Relative Frequency and Cumulative Frequency Distributions
$\%age = \frac{f}{N}(100)$
Percentage frequency from frequency
$Cum\ \%age = \frac{Cum\ f}{N}(100)$
Cumulative percentage from cumulative frequency
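A minimal Python sketch of both formulas (the frequency counts are invented for illustration):

    # Percentage and cumulative percentage from a frequency distribution
    freqs = [2, 5, 8, 4, 1]            # invented frequencies for five classes
    N = sum(freqs)

    cum_f = 0
    for f in freqs:
        cum_f += f                     # running (cumulative) frequency
        pct = (f / N) * 100            # %age = (f / N)(100)
        cum_pct = (cum_f / N) * 100    # Cum %age = (Cum f / N)(100)
        print(f"f={f:2d}  %={pct:5.1f}  cum f={cum_f:2d}  cum %={cum_pct:5.1f}")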
4. Graphing Data
A. Rules for Graphing
B. The Frequency Polygon
1. Comparison of Distributions Using the Percentage or Relative Frequency Polygon
2. Shapes of Frequency Polygons
3. The Cumulative Frequency (or Cumulative Percentage) Polygon
C. The Histogram
D. The Bar Graph
E. The Stem-and-Leaf Plot
1. Comparing Groups With the Stem-and-Leaf Plot
F. The Line Graph
5. Measures of Central Tendency
A. The Mode, Mo
B. The Median, Md
1. Locating the Median by the Counting Method: Even Number of Scores
2. Locating the Median by the Counting Method: Odd Number of Scores
C. The Mean, or Arithmetic Average, $\bar{X}$ or $\mu$
$\bar{X} = \frac{\Sigma X}{N}$ or $\bar{X} = \frac{\Sigma fX}{N}$
Mean; mean for a frequency distribution
1. Rounding Conventions
2. The Mean as a Balancing Point
$\Sigma(X - \bar{X}) = 0$ or $\Sigma f(X - \bar{X}) = 0$
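A short Python sketch of both forms of the mean, plus a check of the balancing-point identity (data invented):

    scores = [2, 4, 4, 5, 7, 8]                       # invented raw scores
    mean = sum(scores) / len(scores)                  # X̄ = ΣX / N

    values = [2, 4, 5, 7, 8]                          # the same data recast as
    freqs  = [1, 2, 1, 1, 1]                          # a frequency distribution
    mean_fd = sum(f * x for f, x in zip(freqs, values)) / sum(freqs)

    balance = sum(x - mean for x in scores)           # Σ(X − X̄) should be 0
    print(mean, mean_fd, balance)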
D. Comparing Measures of Central Tendency
E. Positions of Measures of Central Tendency on a Frequency Polygon
6. Measures of Dispersion and Standard Scores
A. The Range, R = HS – LS, highest score minus lowest score
B. The Average Deviation
$AD = \frac{\Sigma|X - \bar{X}|}{N}$ or $AD = \frac{\Sigma f|X - \bar{X}|}{N}$
Average deviation; average deviation for a frequency distribution
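For example, the raw-score form in Python (data invented):

    scores = [3, 5, 7, 7, 8]                               # invented raw scores
    mean = sum(scores) / len(scores)
    AD = sum(abs(x - mean) for x in scores) / len(scores)  # AD = Σ|X − X̄| / N
    print(AD)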
C. The Variance and the Standard Deviation
1. The Variance, $\sigma^2$, $s^2$
$s^2 = \frac{\Sigma X^2 - \frac{(\Sigma X)^2}{N}}{N - 1}$ or $s^2 = \frac{\Sigma fX^2 - \frac{(\Sigma fX)^2}{N}}{N - 1}$
Computational formulas for sample variance
2. The Standard Deviation, $\sigma$, $s$
$s = \sqrt{\frac{\Sigma X^2 - \frac{(\Sigma X)^2}{N}}{N - 1}}$ or $s = \sqrt{\frac{\Sigma fX^2 - \frac{(\Sigma fX)^2}{N}}{N - 1}}$
Computational formulas for sample standard deviation
3. Visualizing the Standard Deviation on a Frequency Polygon
4. Approximating the Standard Deviation
$s_{approx} = \frac{R}{4}$
5. The Sum of Squares
$SS = \Sigma X^2 - \frac{(\Sigma X)^2}{N}$ or $SS = \Sigma fX^2 - \frac{(\Sigma fX)^2}{N}$
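A sketch tying the computational formulas together, including the R/4 approximation (data invented):

    scores = [4, 6, 6, 7, 9, 10]           # invented raw scores
    N = len(scores)
    sum_x  = sum(scores)                   # ΣX
    sum_x2 = sum(x * x for x in scores)    # ΣX²

    SS = sum_x2 - sum_x**2 / N             # SS = ΣX² − (ΣX)²/N
    s2 = SS / (N - 1)                      # sample variance
    s  = s2 ** 0.5                         # sample standard deviation

    R = max(scores) - min(scores)
    print(s, R / 4)                        # s vs. the rough R/4 approximation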
D. Standard Scores (z Scores)
$z = \frac{X - \bar{X}}{s}$ or $z = \frac{X - \mu}{\sigma}$
Formulas to find a z score from a raw score
$X = zs + \bar{X}$ or $X = z\sigma + \mu$
Formulas to find a raw score from a z score
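A two-line Python check of the round trip between raw scores and z scores (values invented):

    X, mean, s = 75, 70, 5
    z = (X - mean) / s            # z = (X − X̄) / s
    print(z, z * s + mean)        # X = zs + X̄ recovers the raw score: 1.0 75.0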
7. Probability
A. Thinking About Probability
1. Probability and the Individual
2. Theoretical Probability
3. Real-World Probability
4. Subjective Probability
B. Rules of Probability
1. The Addition Rule
p(A or B) = p(A) + p(B)
Formula to find the probability of either one event or another, mutually exclusive events
2. The Multiplication Rule
p(A, B) = p(A) × p(B)
Formula to find the probability of a sequence of independent events
p(A, B) = p(A) × p(B|A)
Formula to find the probability of a sequence of nonindependent events
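A worked sketch of the three rules using a standard 52-card deck:

    p_heart_or_spade = 13/52 + 13/52       # addition rule: mutually exclusive suits
    p_two_aces_repl  = (4/52) * (4/52)     # multiplication rule, independent
                                           # (drawing with replacement)
    p_two_aces_norepl = (4/52) * (3/51)    # nonindependent: p(A) × p(B|A)
    print(p_heart_or_spade, p_two_aces_repl, p_two_aces_norepl)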
3. More on Conditional Probability
4. Bayesian Statistics
C. The Binomial Probability Distribution
8. The Normal Distribution
A. Curves and Probability
B. Characteristics of the Normal Curve
C. Review of z Scores
$z = \frac{X - \bar{X}}{s}$ or $z = \frac{X - \mu}{\sigma}$
$X = zs + \bar{X}$ or $X = z\sigma + \mu$
D. Using the Normal Curve Table
E. Finding Areas Under the Curve
1. Finding the Percentile Rank of a Score
2. Finding the Percentage of the Normal Curve Above a Score
3. Finding Percentage Frequency
4. Finding an Area Between Two Scores
5. Probability and Areas Under the Curve
F. Finding Scores Cutting Off Areas
1. Finding the Score That Has a Particular Percentile Rank
2. Finding Deviant Scores
3. Probability and Deviant Scores
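One way to reproduce these normal-curve-table lookups programmatically, assuming SciPy is available:

    from scipy.stats import norm

    print(norm.cdf(1.0) * 100)    # percentile rank of z = 1.0 (≈ 84.13)
    print(norm.sf(1.0) * 100)     # percentage of the curve above z = 1.0 (≈ 15.87)
    print((norm.cdf(1.0) - norm.cdf(-1.0)) * 100)   # area between ±1 z (≈ 68.27)
    print(norm.ppf(0.95))         # z score cutting off the top 5% (≈ 1.645)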
9. Confidence Intervals and Hypothesis Testing
A. The Sampling Distribution of Means
1. Derivation
2. Properties
B. Estimation and Degrees of Freedom
1. Estimated Standard Error of the Mean
$s_{\bar{X}} = \frac{s}{\sqrt{N}}$
2. t Score Based on Sampling Distribution of Means
$t_{\bar{X}} = \frac{\bar{X} - \mu}{s_{\bar{X}}}$
C. Confidence Intervals
$95\%\ CI = \bar{X} \pm t_{.05}\, s_{\bar{X}}$
$99\%\ CI = \bar{X} \pm t_{.01}\, s_{\bar{X}}$
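A sketch of the standard error and a 95% confidence interval, assuming SciPy for the t critical value (sample data invented):

    import math
    from scipy.stats import t

    scores = [10, 12, 13, 15, 15, 16, 18, 19]        # invented sample
    N = len(scores)
    mean = sum(scores) / N
    s = math.sqrt(sum((x - mean)**2 for x in scores) / (N - 1))
    se = s / math.sqrt(N)                            # s_X̄ = s / √N

    t_crit = t.ppf(0.975, N - 1)                     # two-tailed t.05, df = N − 1
    print(mean - t_crit * se, mean + t_crit * se)    # 95% CI around X̄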
D. Hypothesis Testing: One-Sample t Test
$t = \frac{\bar{X} - \mu}{s_{\bar{X}}}$
1. Seven-Step Procedure for Testing the Null Hypothesis
2. Directional and Nondirectional Tests
3. Type I and Type II Errors
4. The Power of a Statistical Test
5. Meta-Analysis
6. Should Hypothesis Testing Be Abandoned?
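In practice, the one-sample t test above is what scipy.stats.ttest_1samp computes; a minimal sketch with invented data:

    from scipy.stats import ttest_1samp

    scores = [10, 12, 13, 15, 15, 16, 18, 19]     # invented sample
    result = ttest_1samp(scores, popmean=12)      # H0: µ = 12
    print(result.statistic, result.pvalue)        # reject H0 if p < α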
10. Significance of the Difference Between Two Sample Means
A. The Sampling Distribution of the Differences Between Sample Means
1. Derivation
2. Properties
3. Standard Error of the Difference Between Means
$s_{\bar{X}_1 - \bar{X}_2} = \sqrt{\left[\frac{(N_1 - 1)s_1^2 + (N_2 - 1)s_2^2}{N_1 + N_2 - 2}\right]\left[\frac{1}{N_1} + \frac{1}{N_2}\right]}$
Computational formula for the estimated standard error of the differences, independent samples
4. Two-Sample t Test—Independent Samples
$t_{\bar{X}_1 - \bar{X}_2} = \frac{\bar{X}_1 - \bar{X}_2}{s_{\bar{X}_1 - \bar{X}_2}}$
B. Computing t: Independent Samples
$t_{\bar{X}_1 - \bar{X}_2} = \frac{\bar{X}_1 - \bar{X}_2}{\sqrt{\left[\frac{(N_1 - 1)s_1^2 + (N_2 - 1)s_2^2}{N_1 + N_2 - 2}\right]\left[\frac{1}{N_1} + \frac{1}{N_2}\right]}}$
1. One-Tailed Versus Two-Tailed Tests
2. Assumptions of the Two-Sample t Test
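scipy.stats.ttest_ind with its default pooled-variance setting computes the same t as the formula above; a quick sketch (data invented):

    from scipy.stats import ttest_ind

    group1 = [12, 14, 15, 15, 17, 18]      # invented independent samples
    group2 = [10, 11, 13, 13, 14, 16]
    result = ttest_ind(group1, group2)     # pooled-variance t (equal_var=True)
    print(result.statistic, result.pvalue)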
C. Computing t: Dependent Samples
1. Obtaining Dependent Samples
2. The Direct Difference Method
$t = \frac{\bar{X}_D}{s_{\bar{X}_D}}$ or $t = \frac{\bar{X}_D \sqrt{N}}{s_D}$
Computational formula for the t test for dependent samples
$s_{\bar{X}_D} = \frac{s_D}{\sqrt{N}}$
Formula for the estimated standard error of the differences
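The direct-difference logic is what scipy.stats.ttest_rel computes; a minimal sketch with invented paired scores:

    from scipy.stats import ttest_rel

    pre  = [12, 15, 14, 16, 13, 17]        # invented pretest scores
    post = [14, 17, 15, 18, 15, 18]        # matching posttest scores
    result = ttest_rel(post, pre)          # t on the difference scores
    print(result.statistic, result.pvalue)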
11. One-Way Analysis of Variance With Post Hoc Comparisons
A. Between-Subjects ANOVA
1. Visualization of ANOVA Concepts
B. Measuring Variability: The Sums of Squares
C. Computing the Sums of Squares
1. Total Sum of Squares
$SS_{tot} = \Sigma X^2 - \frac{(\Sigma X)^2}{N}$
Computational formula
2. Sum of Squares Within Groups
$SS_W = \sum_g\left[\Sigma X_g^2 - \frac{(\Sigma X_g)^2}{N_g}\right]$
Computational formula
3. Sum of Squares Between Groups
$SS_b = \sum_g\left[\frac{(\Sigma X_g)^2}{N_g}\right] - \frac{(\Sigma X)^2}{N}$
Computational formula
4. The ANOVA Summary Table
ANOVA Summary Table

Source           SS        df       MS             F
Between groups   SS_b      K – 1    SS_b / df_b    F = MS_b / MS_w
Within groups    SS_w      N – K    SS_w / df_w
Total            SS_tot    N – 1
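A sketch that fills in the summary table by hand and checks F against scipy.stats.f_oneway (data invented):

    from scipy.stats import f_oneway

    groups = [[4, 5, 6, 7], [6, 7, 8, 9], [8, 9, 10, 11]]   # invented groups
    all_x = [x for g in groups for x in g]
    N, K = len(all_x), len(groups)

    SS_tot = sum(x * x for x in all_x) - sum(all_x)**2 / N
    SS_w = sum(sum(x * x for x in g) - sum(g)**2 / len(g) for g in groups)
    SS_b = SS_tot - SS_w
    F = (SS_b / (K - 1)) / (SS_w / (N - K))                 # MS_b / MS_w

    print(F, f_oneway(*groups).statistic)                   # the two F values match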
D. Post Hoc Comparisons
1. The Fisher LSD
 1
1 
+
LSDα = tα MSw 

N
N
2 
 1
2. The Tukey HSD
$HSD_\alpha = q_\alpha \sqrt{\frac{MS_w}{N_g}}$
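A sketch of the Fisher LSD for one pair of groups, using SciPy only for the t critical value (MS_w, df, and group sizes are invented):

    import math
    from scipy.stats import t

    MS_w, df_w = 2.5, 9                  # invented values from a summary table
    N1 = N2 = 4                          # group sizes
    t_crit = t.ppf(0.975, df_w)          # two-tailed, α = .05
    LSD = t_crit * math.sqrt(MS_w * (1/N1 + 1/N2))
    print(LSD)                           # a mean difference must exceed this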
E. Repeated Measures ANOVA
1. Subjects Sum of Squares
$SS_{subj} = \sum_s\left[\frac{(\Sigma X_m)^2}{K}\right] - \frac{(\Sigma X)^2}{N}$
Computational formula
2. Error Sum of Squares
$SS_{error} = SS_{tot} - SS_b - SS_{subj}$
Computational formula
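A small sketch of SS_subj for a subjects × conditions layout (data invented):

    data = [[3, 5, 6],                   # one row per subject,
            [4, 6, 8],                   # one column per condition
            [2, 4, 5]]
    K = len(data[0])                     # number of conditions
    all_x = [x for row in data for x in row]
    N = len(all_x)

    SS_subj = sum(sum(row)**2 / K for row in data) - sum(all_x)**2 / N
    print(SS_subj)                       # SS_error = SS_tot − SS_b − SS_subj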
12. Two-Way Analysis of Variance
A. Main Effects and Effects of Interaction
B. Advantages of the Two-Factor Design
C. Logic of the Two-Way ANOVA
D. Interpretation of Results
13. Correlation and Regression
A. Linear Correlation
1. Classes of Correlation
2. Correlation and Causation
B. The Pearson Product-Moment Correlation Coefficient
1. Computing the Pearson r
$r = \frac{N\Sigma XY - \Sigma X\,\Sigma Y}{\sqrt{[N\Sigma X^2 - (\Sigma X)^2][N\Sigma Y^2 - (\Sigma Y)^2]}}$
Computational formula for the Pearson r
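The computational formula translated line by line into Python (paired scores invented):

    import math

    X = [1, 2, 3, 4, 5]                  # invented paired scores
    Y = [2, 4, 5, 4, 6]
    N = len(X)
    sx, sy   = sum(X), sum(Y)
    sxy      = sum(x * y for x, y in zip(X, Y))
    sx2, sy2 = sum(x * x for x in X), sum(y * y for y in Y)

    r = (N*sxy - sx*sy) / math.sqrt((N*sx2 - sx**2) * (N*sy2 - sy**2))
    print(r)                             # ≈ 0.85 for these data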
2. The Effect of Range on Correlation
3. Testing r for Significance
4. The Linear Regression Equation
$\hat{Y} = \left(\frac{r s_Y}{s_X}\right)X + \left[\bar{Y} - \left(\frac{r s_Y}{s_X}\right)\bar{X}\right]$
Regression equation for predicting Y from X
5. The Coefficient of Determination, $r^2$
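A sketch of the regression equation, using scipy.stats.pearsonr for r (same invented data as above):

    from statistics import mean, stdev
    from scipy.stats import pearsonr

    X = [1, 2, 3, 4, 5]                  # invented paired scores
    Y = [2, 4, 5, 4, 6]
    r, _ = pearsonr(X, Y)

    b = r * stdev(Y) / stdev(X)          # slope: r(s_Y / s_X)
    a = mean(Y) - b * mean(X)            # intercept: Ȳ − (r s_Y / s_X)X̄
    print(a + b * 3.5)                   # predicted Ŷ for X = 3.5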
C. The Spearman Rank Order Correlation Coefficient
$r_S = 1 - \frac{6\Sigma d^2}{N(N^2 - 1)}$
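For example, with invented tie-free ranks:

    rank_x = [1, 2, 3, 4, 5]             # invented ranks on two variables
    rank_y = [2, 1, 4, 3, 5]
    N = len(rank_x)
    d2 = sum((a - b)**2 for a, b in zip(rank_x, rank_y))   # Σd²
    r_s = 1 - (6 * d2) / (N * (N**2 - 1))
    print(r_s)                           # 0.8 for these ranks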
1. Other Correlation Coefficients
D. The General Linear Model
14. Chi Square
A. Nonparametric Tests
B. Chi-Square Goodness-of-Fit Test
$\chi^2 = \Sigma\frac{(O - E)^2}{E}$
Equation for chi square, used for both the goodness-of-fit test and the chi-square test of independence (significance)
C. The Chi-Square Test of Independence
1. Finding Expected Frequencies
$E = \frac{RT \times CT}{N}$
Equation to find expected frequencies (RT = row total, CT = column total)
D. Degrees of Freedom
1. Goodness-of-Fit Test, df = K – 1
2. Test of Independence, df = (R – 1)(C – 1)
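Both tests are available in SciPy; a minimal sketch with invented counts:

    from scipy.stats import chisquare, chi2_contingency

    # Goodness of fit against equal expected frequencies, df = K − 1
    print(chisquare([18, 22, 20, 40]))

    # Test of independence on a 2 × 2 table, df = (R − 1)(C − 1);
    # expected frequencies are computed internally as E = (RT × CT) / N
    print(chi2_contingency([[10, 20], [30, 40]]))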
E. Restrictions on Chi Square
15. Alternatives to t and F
A. The Mann–Whitney U Test
$U = N_1 N_2 + \frac{N_1(N_1 + 1)}{2} - R_1$
$U' = N_1 N_2 - U$
Equations for U and U′
$z = \frac{U - \frac{N_1 N_2}{2}}{\sqrt{\frac{(N_1)(N_2)(N_1 + N_2 + 1)}{12}}}$
Equation to convert large-sample U or U′ to a z score
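A sketch of U and U′ from rank sums, using scipy.stats.rankdata (small invented samples without ties):

    from scipy.stats import rankdata

    g1, g2 = [3, 5, 8, 9], [1, 2, 4, 6, 7]     # invented samples
    ranks = rankdata(g1 + g2)                  # ranks over the combined data
    R1 = sum(ranks[:len(g1)])                  # rank sum for group 1
    N1, N2 = len(g1), len(g2)

    U = N1*N2 + N1*(N1 + 1)/2 - R1
    print(U, N1*N2 - U)                        # U and U′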
B. The Wilcoxon Matched-Pairs Signed-Ranks Test
1. T = Sum of the Ranks With the Less Frequent Sign
2. Formula to Convert Large-Sample T to z
$z = \frac{T - \frac{N(N + 1)}{4}}{\sqrt{\frac{N(N + 1)(2N + 1)}{24}}}$
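A one-formula sketch of the conversion (T and N are placeholder values, not real data):

    import math

    T, N = 14, 20      # invented: T = sum of ranks with the less frequent sign
    z = (T - N*(N + 1)/4) / math.sqrt(N*(N + 1)*(2*N + 1)/24)
    print(z)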
C. The Kruskal–Wallis One-Way ANOVA
$H = \frac{12}{N(N + 1)}\,\Sigma_i \frac{R_i^2}{N_i} - 3(N + 1)$
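A sketch of H computed from ranks (invented tie-free data):

    from scipy.stats import rankdata

    groups = [[6, 8, 10], [1, 3, 5], [2, 4, 9, 7]]   # invented samples
    all_x = [x for g in groups for x in g]
    N = len(all_x)
    ranks = list(rankdata(all_x))

    term, i = 0.0, 0
    for g in groups:                     # Σ R_i² / N_i, group by group
        R_i = sum(ranks[i:i + len(g)])
        term += R_i**2 / len(g)
        i += len(g)

    H = 12 / (N * (N + 1)) * term - 3 * (N + 1)
    print(H)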