I could use some help with this statistics question. Thank you!
Solution
16)
Two key ANOVA requirements are:
1. Approximately Normal Data
2. Equal Variances
Assumption #1: Your dependent variable should be measured at the interval or ratio level (i.e., it is continuous). Examples of variables that meet this criterion include revision time (measured in hours), intelligence (measured using IQ score), exam performance (measured from 0 to 100), weight (measured in kg), and so forth.
Assumption #2: You should have independence of observations, which means that there is no relationship between the observations in each group or between the groups themselves. For example, there must be different participants in each group, with no participant appearing in more than one group. This is more of a study design issue than something you can test for, but it is an important assumption of the one-way ANOVA. If your study fails this assumption, you will need to use another statistical test instead of the one-way ANOVA (e.g., a repeated measures design).
Assumption #3: There should be no significant outliers. Outliers are single data points within your data that do not follow the usual pattern (e.g., in a study of 100 students' IQ scores, where the mean score was 108 with only a small variation between students, one student had a score of 156, which is very unusual and may even put her in the top 1% of IQ scores globally). The problem with outliers is that they can have a negative effect on the one-way ANOVA, reducing the validity of your results. Fortunately, when using SPSS Statistics to run a one-way ANOVA on your data, you can easily detect possible outliers.
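To make the two main requirements (approximate normality and equal variances) concrete, here is a minimal sketch in Python of how you might check them before running a one-way ANOVA. The group data below are synthetic, for illustration only; the test choices (Shapiro-Wilk for normality, Levene for equal variances) are common conventions, not the only options.

```python
# Sketch: checking one-way ANOVA requirements on synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Three illustrative groups with equal spread and slightly different means.
groups = [rng.normal(loc=m, scale=2.0, size=30) for m in (10, 11, 12)]

# Requirement 1: approximately normal data in each group (Shapiro-Wilk).
for i, g in enumerate(groups, start=1):
    w, p = stats.shapiro(g)
    print(f"group {i}: Shapiro-Wilk p = {p:.3f}")

# Requirement 2: equal variances across groups (Levene's test).
lev_stat, lev_p = stats.levene(*groups)
print(f"Levene p = {lev_p:.3f}")

# If neither check rejects, proceed with the one-way ANOVA itself.
f_stat, f_p = stats.f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {f_p:.4f}")
```

Large p-values on the Shapiro-Wilk and Levene tests mean the data are consistent with the two requirements; a small ANOVA p-value then suggests at least one group mean differs.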
18)
There are four principal assumptions that justify the use of linear regression models for purposes of inference or prediction:
(i) linearity and additivity of the relationship between dependent and independent variables:
(a) The expected value of the dependent variable is a straight-line function of each independent variable, holding the others fixed.
(b) The slope of that line does not depend on the values of the other variables.
(c) The effects of different independent variables on the expected value of the dependent variable are additive.
(ii) statistical independence of the errors (in particular, no correlation between consecutive errors in the case of time series data)
(iii) homoscedasticity (constant variance) of the errors
(a) versus time (in the case of time series data)
(b) versus the predictions
(c) versus any independent variable
(iv) normality of the error distribution.
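The four assumptions above can be checked from the residuals of a fitted model. Below is a minimal sketch on synthetic data; the specific diagnostics used (a hand-computed Durbin-Watson statistic for independence, Shapiro-Wilk for normality, and a crude variance-ratio check for homoscedasticity) are illustrative choices, not the only valid ones.

```python
# Sketch: fitting a simple linear regression and inspecting residuals
# against the four assumptions. Data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 100)
y = 3.0 + 2.0 * x + rng.normal(scale=1.0, size=x.size)  # true slope = 2

res = stats.linregress(x, y)
residuals = y - (res.intercept + res.slope * x)

# (ii) independence of errors: Durbin-Watson statistic, computed by hand.
# Values near 2 indicate no first-order autocorrelation.
dw = np.sum(np.diff(residuals) ** 2) / np.sum(residuals ** 2)

# (iv) normality of errors: Shapiro-Wilk test on the residuals.
_, shapiro_p = stats.shapiro(residuals)

# (iii) homoscedasticity: crude check comparing residual variance in the
# lower and upper halves of the x range (a ratio near 1 is consistent
# with constant variance).
half = x.size // 2
var_ratio = residuals[:half].var() / residuals[half:].var()

print(f"slope = {res.slope:.2f}, DW = {dw:.2f}, "
      f"Shapiro p = {shapiro_p:.3f}, variance ratio = {var_ratio:.2f}")
```

In practice, assumption (i) (linearity) is usually assessed by plotting residuals against the fitted values or against each independent variable and looking for curvature.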
