Social Research Glossary
Citation reference: Harvey, L., 2012-20, Social Research Glossary, Quality Research International, http://www.qualityresearchinternational.com/socialresearch/
This is a dynamic glossary and the author would welcome any e-mail suggestions for additions or amendments. Page updated 19 December 2019. © Lee Harvey 2012–2020.
Analysis of variance
Analysis of variance (ANOVA) tests whether there are significant differences between several means or correlation coefficients.
Analysis of variance (ANOVA) is a flexible statistical technique that has many uses. Two of the most important are (a) to test whether there are significant differences between the means of several different groups of observations, and (b) to test the significance of simple and multiple correlation coefficients.
ANOVA is a statistical test that is particularly useful where there are several means or correlations to test at the same time. Testing each separately by means of a z-test or t-test would inflate the probability of falsely rejecting the null hypothesis: the error probabilities accumulate across tests, so with twenty independent tests at the 5% significance level there is roughly a 64% chance that at least one spuriously significant result will be reported. See significance tests.
Analysis of variance does not suffer from this problem of increased probability of a false rejection.
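The inflation of error across repeated tests can be sketched numerically. The following is an illustration, not part of the glossary; the function name is invented for the example, and it assumes the tests are independent.

```python
# Illustration (not from the glossary): the family-wise error rate when
# k independent tests are each run at significance level alpha.
def familywise_error_rate(k, alpha=0.05):
    """Probability of at least one false rejection across k independent tests."""
    # Each test is correct with probability (1 - alpha); all k are correct
    # with probability (1 - alpha) ** k, so at least one error occurs with
    # the complementary probability.
    return 1 - (1 - alpha) ** k

# With twenty tests at the 5% level, the chance of at least one
# spuriously significant result is roughly 64%.
rate = familywise_error_rate(20)
```

A single test at the 5% level gives a 5% error rate, but the rate rises quickly with the number of tests, which is why ANOVA's single overall test is preferred.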
The first step in an analysis of variance is to obtain sums of squares. The total sum of squares (TSS) is the sum of the squares of the deviations of all the measurements from the mean of all the measurements.
The TSS may be split into components, the most basic of which are an explained sum of squares (ESS) and a residual sum of squares (RSS). ESS is the proportion of TSS that can be explained by a small number of factors or variables (such as sex, educational level, etc.) in which the researcher is interested. RSS is the proportion of TSS that cannot be so explained.
Thus TSS = ESS + RSS.
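The decomposition can be verified on a small example. This is a minimal sketch with invented data (the group names and values are illustrative, not from the glossary), for a one-factor design using only the Python standard library.

```python
from statistics import mean

# Hypothetical measurements grouped by one explanatory factor.
groups = {
    "group_a": [4.0, 5.0, 6.0],
    "group_b": [7.0, 8.0, 9.0],
}

all_values = [x for g in groups.values() for x in g]
grand_mean = mean(all_values)

# Total sum of squares: squared deviations of every measurement
# from the mean of all measurements.
tss = sum((x - grand_mean) ** 2 for x in all_values)

# Explained sum of squares: squared deviations of each group mean
# from the grand mean, weighted by group size.
ess = sum(len(g) * (mean(g) - grand_mean) ** 2 for g in groups.values())

# Residual sum of squares: squared deviations of each measurement
# from its own group mean.
rss = sum((x - mean(g)) ** 2 for g in groups.values() for x in g)

# For this data: tss = 17.5, ess = 13.5, rss = 4.0, and TSS = ESS + RSS.
```

The identity TSS = ESS + RSS holds exactly for any such grouping, not just this data.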
ESS can often, if desired, be split into further parts corresponding to each single explanatory variable and/or the interaction(s) between the explanatory variables. To each sum of squares there corresponds a mean square, obtained by dividing the sum of squares by the appropriate degrees of freedom. There is thus an explained mean square (EMS) and a residual mean square (RMS) and, if desired, mean squares corresponding to single variables or interactions.
The point of computing mean squares is that they are variance estimates and, as variance estimates, they may be compared using the F-statistic. The RMS is taken as the standard of comparison and if any of the other mean squares are shown to be sufficiently greater than the RMS then the variables corresponding to these mean squares explain a significant proportion of the variance. When this is so, such conclusions may be drawn as that there are differences between group means, that a multiple correlation is significant, and so on.
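The F comparison described above can be sketched for the one-factor case. The data below are invented for illustration; with k groups and n observations in total, the explained sum of squares has k − 1 degrees of freedom and the residual has n − k.

```python
from statistics import mean

# Hypothetical one-factor data: three groups of observations.
groups = [[4.0, 5.0, 6.0], [7.0, 8.0, 9.0], [5.0, 6.0, 7.0]]

all_values = [x for g in groups for x in g]
grand_mean = mean(all_values)
n, k = len(all_values), len(groups)

# Explained and residual sums of squares, as before.
ess = sum(len(g) * (mean(g) - grand_mean) ** 2 for g in groups)
rss = sum((x - mean(g)) ** 2 for g in groups for x in g)

# Mean squares: each sum of squares divided by its degrees of freedom.
ems = ess / (k - 1)   # explained mean square, k - 1 degrees of freedom
rms = rss / (n - k)   # residual mean square, n - k degrees of freedom

# The F-statistic compares the two variance estimates; the larger it is,
# the stronger the evidence that the group means really differ.
f_stat = ems / rms    # here ems = 7.0, rms = 1.0, so F = 7.0
```

In practice the computed F would be compared against the F distribution with (k − 1, n − k) degrees of freedom to decide whether the explained variance is significantly greater than the residual variance.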
Colorado State University (1993–2013) defines analysis of variance as:
ANOVA (Analysis of Variance): A method of statistical analysis broadly applicable to a number of research designs, used to determine differences among the means of two or more groups on a variable. The independent variables are usually nominal, and the dependent variable is usually interval.
ANCOVA (Analysis of Co-Variance): Same method as ANOVA, but analyzes differences between dependent variables.
Colorado State University, 1993–2013, Glossary of Key Terms, available at http://writing.colostate.edu/guides/guide.cfm?guideid=90, accessed 21 November 2019.
copyright Lee Harvey 2012–2020