Suppose you are given a sample of n observations and the following structure is hypothesized to generate the observed data:
Yi = α1 + β1 Xi + γ1 Wi + δ1 Zi + Ui for i = 1,..., n1
Yi = α2 + β2 Xi + γ2 Wi + δ2 Zi + Ui for i = n1+1,..., n1+n2
Yi = α3 + β3 Xi + γ3 Wi + δ3 Zi + Ui for i = n1+n2+1,..., n
You wish to test the null hypothesis Ho: β1 = β2 = β3 and γ1 = γ2 = γ3. Develop the appropriate test of Ho. Include in your development
(i) the assumptions you need;
(ii) the regression(s) you would run;
(iii) the statistic(s) you would compute;
(iv) the distributions of the computed statistic(s);
(v) the conditions under which you would reject Ho.
Solution
(i) The assumptions you need:
1) The model is linear in the parameters within each of the three sub-samples, and the regressors Xi, Wi and Zi are exogenous: E(Ui | Xi, Wi, Zi) = 0.
2) There is no perfect multicollinearity among the regressors in any sub-sample, and each sub-sample contains more observations than parameters (n1, n2, n3 > 4), so all three regressions can be estimated.
3) The disturbances Ui are independent across observations, with mean zero and the same variance σ² in all three sub-samples; this common variance is what allows the residual sums of squares from the separate regressions to be pooled.
4) The disturbances Ui are normally distributed, so that the test statistic developed below has an exact F distribution in finite samples.
(ii) the regression(s) you would run;
Two regressions are needed (see the sketch below).
The unrestricted regression: estimate the three equations separately on their own sub-samples, or equivalently run one pooled regression in which every group has its own intercept and its own coefficients on Xi, Wi and Zi (12 parameters in total). Record its residual sum of squares RSSU.
The restricted regression: the same pooled regression with the null imposed, i.e. a single common coefficient on Xi and a single common coefficient on Wi, while the intercepts and the coefficients on Zi remain group-specific (8 parameters). Record its residual sum of squares RSSR.
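A minimal sketch of how these two design matrices could be built, assuming the data are held in NumPy arrays y, x, w, z together with an integer array group taking the values 0, 1, 2 to mark the three sub-samples (all of these names are hypothetical):

import numpy as np

def design_matrices(x, w, z, group):
    """Build the unrestricted and restricted design matrices.

    Unrestricted: every group has its own intercept and its own
    coefficients on x, w and z (12 columns).
    Restricted: one common coefficient on x and one on w, but
    group-specific intercepts and z coefficients (8 columns).
    """
    d = np.eye(3)[group]                       # group dummies, shape (n, 3)
    X_u = np.hstack([d, d * x[:, None], d * w[:, None], d * z[:, None]])
    X_r = np.hstack([d, x[:, None], w[:, None], d * z[:, None]])
    return X_u, X_r

def rss(y, X):
    """Residual sum of squares from an OLS fit of y on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return resid @ resid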
(iii) the statistic(s) you would compute;
Compute the F statistic for the q = 4 linear restrictions imposed by Ho:
F = [(RSSR - RSSU) / q] / [RSSU / (n - k)]
where RSSR and RSSU are the residual sums of squares from the restricted and unrestricted regressions, q = 4 is the number of restrictions, and k = 12 is the number of parameters in the unrestricted regression.
(iv) the distributions of the computed statistic(s);
Under Ho and the assumptions in (i), the statistic follows an F distribution with q = 4 numerator and n - k = n - 12 denominator degrees of freedom:
F ~ F(4, n - 12).
(A single restriction could equivalently be tested with t = b / SE(b), which follows a t distribution with n - 12 degrees of freedom, but the joint null here involves four restrictions and therefore requires the F statistic.)
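Continuing the sketch above (same hypothetical array and function names), the statistic and its P-value could be computed as:

from scipy.stats import f

X_u, X_r = design_matrices(x, w, z, group)
rss_u = rss(y, X_u)
rss_r = rss(y, X_r)

n = len(y)
k = X_u.shape[1]                       # 12 parameters in the unrestricted model
q = k - X_r.shape[1]                   # 4 restrictions
F_stat = ((rss_r - rss_u) / q) / (rss_u / (n - k))
p_value = f.sf(F_stat, q, n - k)       # upper-tail probability of F(q, n - k)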
(v) the conditions under which you would reject Ho.
The decision can be based on either the critical value or the P-value, where alpha is the chosen level of significance.
Reject Ho at level alpha if the computed F exceeds the upper-alpha critical value of the F(4, n - 12) distribution, that is, if F > F_alpha(4, n - 12), or equivalently if the P-value is less than alpha.
Otherwise do not reject Ho: the data are consistent with the coefficients on Xi and Wi being the same in all three sub-samples.
A short decision-rule sketch follows.
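Again using the hypothetical names from the sketches above, the decision rule could be written as:

alpha = 0.05                               # chosen significance level (assumed here)
critical = f.ppf(1 - alpha, q, n - k)      # upper-alpha critical value of F(q, n - k)
reject_H0 = (F_stat > critical)            # equivalently: p_value < alpha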

