
Chapter 14

Inferential Statistics II:

Beyond Two Means

Outline

Concepts

I.  Selecting an Appropriate Statistical Test

 

II.  Comparisons of More Than Two Means: Analysis of Variance

analysis of variance: a test of statistical significance that compares the means of two or more groups.

pooled variance (abbreviated sp²): the mean of the variances of subgroups involved in comparisons using analysis of variance (weighted for sample sizes in the case of unequal sample sizes).
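Not part of the original outline: a minimal Python sketch, using made-up group scores, of the pooled variance as a degrees-of-freedom-weighted mean of the group variances.

```python
import numpy as np

# Hypothetical scores for three subgroups (illustration only)
groups = [np.array([4, 5, 6, 7]), np.array([6, 8, 9]), np.array([5, 5, 7, 9, 10])]

# Weight each group's variance by its degrees of freedom (n - 1),
# which handles unequal sample sizes automatically
df = [len(g) - 1 for g in groups]
variances = [g.var(ddof=1) for g in groups]
pooled_variance = sum(d * v for d, v in zip(df, variances)) / sum(df)
print(pooled_variance)
```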

     A. Oneway Analysis of Variance

oneway analysis of variance: a statistical tool that permits comparison of several means for one independent variable
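For illustration only (hypothetical data, not from the text), a oneway analysis of variance can be sketched with SciPy's f_oneway, which returns the F ratio and its p value.

```python
from scipy import stats

# Hypothetical scores for three levels of one independent variable
group_a = [4, 5, 6, 7, 6]
group_b = [6, 8, 9, 7, 8]
group_c = [5, 5, 7, 9, 10]

# Oneway ANOVA: compares the three means with a single F test
f_statistic, p_value = stats.f_oneway(group_a, group_b, group_c)
print(f_statistic, p_value)
```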

     B. What to Do after Finding  Statistical Significance

 

          1. Multiple Comparison Tests

multiple comparison tests: tests completed to identify locations of differences among means identified as significant with analysis of variance
--Tukey's HSD (abbreviation for John Tukey's Honestly Significant Difference test): used to make all possible comparisons when means are taken two at a time (the most powerful multiple comparison test for making pairwise comparisons; see the sketch after this list)
--Scheffé's critical S: used to make complex comparisons of means
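As an illustrative aside (not from the text), Tukey's HSD pairwise comparisons can be sketched in Python, assuming SciPy 1.8 or later and the same hypothetical groups as above.

```python
from scipy import stats

# Hypothetical groups; HSD compares all pairs of means, two at a time
group_a = [4, 5, 6, 7, 6]
group_b = [6, 8, 9, 7, 8]
group_c = [5, 5, 7, 9, 10]

# tukey_hsd (SciPy 1.8+) reports each pairwise difference with a confidence interval
result = stats.tukey_hsd(group_a, group_b, group_c)
print(result)
```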

          2.   Determining Effect Sizes

Eta (η) (also known as the "correlation ratio"): directly interpreted as a correlation and used to compute effect sizes following analysis of variance or F. Eta may also be used to identify nonlinear as well as linear effects.

Eta squared (η²): a coefficient of determination computed from eta
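For illustration (hypothetical data, not from the text), eta squared can be computed as the between-groups sum of squares divided by the total sum of squares, with eta as its square root.

```python
import numpy as np

# Hypothetical scores for three groups
groups = [np.array([4, 5, 6, 7, 6]), np.array([6, 8, 9, 7, 8]), np.array([5, 5, 7, 9, 10])]
all_scores = np.concatenate(groups)
grand_mean = all_scores.mean()

# Eta squared = SS between groups / SS total; eta is its square root
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_total = ((all_scores - grand_mean) ** 2).sum()
eta_squared = ss_between / ss_total
eta = np.sqrt(eta_squared)
print(eta, eta_squared)
```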

          3.   Looking for Nonlinear Relationships

trend analysis: a method to isolate the nature of linear and nonlinear trends in effects identified as statistically significant by analysis of variance
mean square: a synonym for the variance as computed in analysis of variance (shorthand for "the mean of the squared differences of scores from their mean")

    --Interval Estimation Methods: use of a range of values that capture population parameters; permits identification of differences among groups by looking for means that are outside the confidence interval around another mean
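A minimal sketch (hypothetical scores, not from the text) of interval estimation: a 95% confidence interval around one sample mean, using the t distribution.

```python
import numpy as np
from scipy import stats

# Hypothetical sample of scores
scores = np.array([6, 8, 9, 7, 8, 10, 7])

mean = scores.mean()
standard_error = scores.std(ddof=1) / np.sqrt(len(scores))

# 95% confidence interval around the mean
t_critical = stats.t.ppf(0.975, df=len(scores) - 1)
lower, upper = mean - t_critical * standard_error, mean + t_critical * standard_error
print(lower, upper)
```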

 

    C.  Factorial Analysis of Variance

variable factor (also called factor): a variable that has been divided into levels or groups
factorial analysis of variance: analysis of variance applied to multiple independent variables that have been divided into levels or groups

main effects: effects on the dependent variable from each independent variable considered separately
interaction effects: effects on the dependent variable from the independent variables taken together, involving variation that arises from special combinations of levels of the independent variables

          1.  Computing Factorial ANOVA
         

grand mean: in factorial analysis of variance, the average of the condition means from the predictor variables

within groups variance: in analysis of variance, the pooled variance (sp²), also known as the residual.

error variance: another name for “within groups variance” or “residual variance.”
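As a sketch only (hypothetical 2 x 2 data and variable names, not from the text), a factorial ANOVA table with both main effects, the interaction, and the within groups (residual) term can be produced with statsmodels.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical 2 x 2 factorial data: two factors, one dependent variable
data = pd.DataFrame({
    "source": ["expert", "expert", "expert", "peer", "peer", "peer"] * 2,
    "message": ["one-sided"] * 6 + ["two-sided"] * 6,
    "attitude": [7, 6, 8, 5, 4, 5, 6, 7, 6, 6, 7, 8],
})

# The '*' in the formula requests both main effects and the interaction
model = ols("attitude ~ C(source) * C(message)", data=data).fit()
anova_table = sm.stats.anova_lm(model, typ=2)
print(anova_table)  # rows for each main effect, the interaction, and the residual
```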

          2.  Examining Effect Patterns

    --Main effects can be interpreted as if a simple one-way analysis of variance had been completed.

    --If an interaction effect is shown to be a crossed interaction (disordinal), then the researcher should not interpret main effects for variables involved, since such information would be misleading.

    --Yet, if the interaction effect is uncrossed (ordinal), main effects may be interpreted without fear of misleading the reader.

    --A Guide to Advanced Statistical Methods

 

          Multiple regression correlation     

multiple regression correlation (a.k.a. multiple correlation): a correlation of multiple predictors with a single output variable
--beta weights: measures of the contribution made by each predictor to the overall correlation (see the sketch after this list)
--multicollinearity: correlation among the predictors themselves; multiple regression assumes that predictors are uncorrelated with one another
multiple discriminant analysis: a method to predict membership in particular groups from a knowledge of a number of predictor variables (measured on interval or ratio levels)
log-linear analysis: extension of chi-square testing for analysis of more than two variables measured on the nominal level
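For illustration (hypothetical data and variable names), beta weights and the multiple correlation can be sketched by regressing a standardized criterion on standardized predictors.

```python
import numpy as np

# Hypothetical data: two predictors and one output (criterion) variable
x1 = np.array([2.0, 4.0, 5.0, 7.0, 8.0, 10.0])
x2 = np.array([9.0, 3.0, 7.0, 1.0, 5.0, 2.0])
y = np.array([4.0, 5.0, 7.0, 7.0, 10.0, 11.0])

# Standardize everything so the least-squares weights are beta weights
z = lambda v: (v - v.mean()) / v.std(ddof=1)
predictors = np.column_stack([z(x1), z(x2)])
betas, *_ = np.linalg.lstsq(predictors, z(y), rcond=None)

# Multiple correlation R: correlation of predicted with observed criterion scores
predicted = predictors @ betas
multiple_r = np.corrcoef(predicted, z(y))[0, 1]
print(betas, multiple_r)
```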

         Multivariate analyses

multivariate analyses: analyses that deal with more than one dependent variable at a time
canonical correlation: an extension of multiple regression, correlating two sets of variables
--redundancy index: tells whether sets of variables should be
   interpreted differentially for additional canonical component roots
MANOVA (Multivariate Analysis of Variance): extension of analysis of variance for multiple dependent variables
--test of sphericity: a measure of variation indicating the
   interrelationship of dependent variables
multivariate multiple correlation: extension of multiple regression for many interrelated dependent measures
multivariate analysis of covariance: extension of MANOVA to adapt analysis of covariance for multiple interrelated dependent variables
Hotelling's T²: a t test for intercorrelated dependent variables

         Modeling Methods

path models: use of correlational tools to interpret relationships to identify causal models with exogenous (input variable) sources, endogenous (mediating) variables, and dependent (output or criterion) variables
LISREL (Linear Structural Relations): a computer program to isolate relationships by examining covariances among variables

III.  Nonparametric Testing

 

     A.  The Nature of Nonparametric Tests

nonparametric tests: statistical methods that do not make assumptions about population distributions or population parameters (sometimes called "distribution-free" statistics)

            --The Randomization Assumption: one assumption made for nonparametric tests is randomization

 

      B.  Tests for Nominal Level Dependent Variables

chi-square test: designed to deal with "count" data; a test that permits comparing observed frequencies of events with expected frequencies

            1.  The One Sample Chi-Square Test

one sample chi-square test (a.k.a. the "goodness of fit" test): a test of statistical significance for nominal level variables for which frequency data are obtained

chi-square (χ²) distribution: a probability distribution of squared differences of scores.

equal probability hypothesis: a method of  determining expected frequencies for the one-sample chi-square test assuming an equal proportion of  events in each category
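A minimal sketch (hypothetical counts, not from the text) of the one sample chi-square test under the equal probability hypothesis; SciPy's chisquare uses equal expected frequencies by default.

```python
from scipy import stats

# Hypothetical observed counts across four categories (100 respondents in all)
observed = [30, 22, 18, 30]

# Equal probability hypothesis: expected frequency of 100 / 4 = 25 per category
chi_square, p_value = stats.chisquare(observed)
print(chi_square, p_value)
```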

            2.  The Chi-Square Test of Independence

chi-square test of independence: an adaptation of chi-square to the analysis of contingency tables
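For illustration (hypothetical contingency table, not from the text), the chi-square test of independence can be sketched with SciPy.

```python
from scipy import stats

# Hypothetical 2 x 3 contingency table of observed frequencies
table = [[20, 15, 25],
         [10, 30, 20]]

chi_square, p_value, degrees_of_freedom, expected = stats.chi2_contingency(table)
print(chi_square, p_value, degrees_of_freedom)
```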

 

--factor analysis: a statistical method that helps the researcher discover and identify the unities or dimensions, called factors, behind many measures

            3.  Determining Effect Sizes

contingency coefficient: a method to compute effect sizes from the observed chi-square value
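As an illustrative sketch (reusing a hypothetical table), the contingency coefficient can be computed from the observed chi-square value as C = sqrt(chi-square / (chi-square + N)).

```python
import numpy as np
from scipy import stats

# Hypothetical contingency table of observed frequencies
table = np.array([[20, 15, 25],
                  [10, 30, 20]])

chi_square, _, _, _ = stats.chi2_contingency(table)
n = table.sum()

# Contingency coefficient: an effect size for chi-square
contingency_coefficient = np.sqrt(chi_square / (chi_square + n))
print(contingency_coefficient)
```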