Zero-Order Correlations and Multicollinearity

a. "Since the zero-order correlations are very high, there must be serious multicollinearity." Comment.

b. Would you drop variables X2 and X3 from the model?

c. If you drop them, what will happen to the value of the coefficient of Xi?


10.11. Stepwise regression. In deciding on the "best" set of explanatory variables for a regression model, researchers often follow the method of stepwise regression. In this method one proceeds either by introducing the X variables one at a time (stepwise forward regression) or by including all the possible X variables in one multiple regression and rejecting them one at a time (stepwise backward regression). The decision to add or drop a variable is usually made on the basis of the contribution of that variable to the ESS, as judged by the F test. Knowing what you do now about multicollinearity, would you recommend either procedure? Why or why not?*
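To make the forward variant concrete, here is a minimal Python sketch using only numpy and simulated data; the function names rss and forward_stepwise and the entry threshold f_in are illustrative choices, not part of the exercise. At each step the candidate regressor with the largest partial F statistic (equivalently, the largest incremental contribution to the ESS) is added, and the procedure stops when no candidate passes the F test.

```python
# Illustrative sketch of stepwise *forward* regression (simulated data).
import numpy as np

def rss(y, X):
    """Residual sum of squares from an OLS fit of y on X (with an intercept)."""
    Z = np.ones((len(y), 1)) if X is None else np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    return resid @ resid

def forward_stepwise(y, X, f_in=4.0):
    """Add regressors one at a time, each time choosing the largest partial F."""
    n, k = X.shape
    selected, remaining = [], list(range(k))
    while remaining:
        rss_cur = rss(y, X[:, selected] if selected else None)
        best_f, best_j = -np.inf, None
        for j in remaining:
            rss_new = rss(y, X[:, selected + [j]])
            df_resid = n - (len(selected) + 2)        # residual df of the enlarged model
            f_stat = (rss_cur - rss_new) / (rss_new / df_resid)
            if f_stat > best_f:
                best_f, best_j = f_stat, j
        if best_f < f_in:                             # no remaining candidate passes the F test
            break
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

# Collinear regressors: x3 is nearly a copy of x2, so the procedure typically
# keeps one of the pair and stops, which is the multicollinearity concern of 10.11.
rng = np.random.default_rng(0)
x2 = rng.normal(size=200)
x3 = x2 + rng.normal(scale=0.05, size=200)
x4 = rng.normal(size=200)
X = np.column_stack([x2, x3, x4])
y = 1.0 + 2.0 * x2 + 0.5 * x4 + rng.normal(size=200)
print(forward_stepwise(y, X))    # column indices of the selected regressors
```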

10.12. State with reason whether the following statements are true, false, or uncertain:

a. Despite perfect multicollinearity, OLS estimators are BLUE.

b. In cases of high multicollinearity, it is not possible to assess the individual significance of one or more partial regression coefficients.

c. If an auxiliary regression shows that a particular R2 is high, there is definite evidence of high collinearity.

d. High pair-wise correlations do not suggest that there is high multicollinearity.

e. Multicollinearity is harmless if the objective of the analysis is prediction only.

f. Ceteris paribus, the higher the VIF is, the larger the variances of OLS estimators.

g. The tolerance (TOL) is a better measure of multicollinearity than the VIF.

h. You will not obtain a high R2 value in a multiple regression if all the partial slope coefficients are individually statistically insignificant on the basis of the usual t test.

i. In the regression of Y on X2 and X3, suppose there is little variability in the values of X3. This would increase $\operatorname{var}(\hat{\beta}_3)$. In the extreme, if all the values of X3 are identical, $\operatorname{var}(\hat{\beta}_3)$ is infinite. (A numerical sketch of the auxiliary R2, VIF, and TOL measures follows this list.)
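As a concrete illustration of statements (c), (f), (g), and (i), the sketch below (Python with numpy, simulated data; the helper name r_squared is illustrative) computes, for each regressor, the auxiliary-regression R2, the variance-inflation factor VIF_j = 1/(1 - R_j^2), and the tolerance TOL_j = 1 - R_j^2 = 1/VIF_j.

```python
# Auxiliary R^2, VIF, and TOL for each regressor (illustrative sketch).
import numpy as np

def r_squared(y, X):
    """R^2 of an OLS regression of y on X (with an intercept)."""
    Z = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    return 1.0 - (resid @ resid) / np.sum((y - y.mean()) ** 2)

rng = np.random.default_rng(1)
x2 = rng.normal(size=100)
x3 = 0.95 * x2 + rng.normal(scale=0.3, size=100)   # highly collinear with x2
x4 = rng.normal(size=100)                           # roughly orthogonal to the others
X = np.column_stack([x2, x3, x4])

for j in range(X.shape[1]):
    others = np.delete(X, j, axis=1)
    r2_aux = r_squared(X[:, j], others)   # auxiliary regression of X_j on the remaining X's
    vif = 1.0 / (1.0 - r2_aux)            # the larger the VIF, the larger var(beta_j_hat)
    tol = 1.0 - r2_aux                    # TOL = 1/VIF; the two carry the same information
    print(f"X{j + 2}: aux R^2 = {r2_aux:.3f}, VIF = {vif:.2f}, TOL = {tol:.3f}")
```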

10.13. a. Show that if $r_{1i} = 0$ for $i = 2, 3, \ldots, k$, then

$$R_{1.23 \ldots k} = 0$$

b. What is the importance of this finding for the regression of variable X1 (= Y) on X2, X3, ..., Xk?

10.14. Suppose all the zero-order correlation coefficients of X1 (= Y), X2, ..., Xk are equal to r.

a. What is the value of $R^2_{1.23 \ldots k}$?

b. What are the values of the first-order correlation coefficients?

"10.15. In matrix notation it can be shown (see Appendix C) that

$$\hat{\boldsymbol{\beta}} = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{y}$$

a. What happens to $\hat{\boldsymbol{\beta}}$ when there is perfect collinearity among the X's?

b. How would you know if perfect collinearity exists?
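A brief numpy sketch of the situation described in 10.15, using made-up data in which one regressor is an exact linear function of another: X'X is then (numerically) singular, the inverse does not exist, and the OLS formula cannot be evaluated sensibly. A determinant of X'X that is essentially zero, or an enormous condition number, is one practical way of recognizing the problem (part b).

```python
# Perfect collinearity: X'X becomes singular and (X'X)^{-1} X'y breaks down.
import numpy as np

rng = np.random.default_rng(2)
n = 50
x2 = rng.normal(size=n)
x3 = 2.0 * x2 - 1.0                        # exact linear function of x2
X = np.column_stack([np.ones(n), x2, x3])  # intercept column plus the two regressors
y = 1.0 + x2 + rng.normal(size=n)

XtX = X.T @ X
# Two common symptoms of perfect (or near-perfect) collinearity:
print("det(X'X)  =", np.linalg.det(XtX))   # essentially zero
print("cond(X'X) =", np.linalg.cond(XtX))  # enormous condition number
# The textbook formula either raises LinAlgError or returns numerically
# meaningless coefficient values, depending on rounding:
try:
    beta_hat = np.linalg.solve(XtX, X.T @ y)
    print("beta_hat =", beta_hat)
except np.linalg.LinAlgError:
    print("X'X is singular: the OLS estimator is not uniquely defined")
```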

See if your reasoning agrees with that of Arthur S. Goldberger and D. B. Jochems, "Note on Stepwise Least-Squares," Journal of the American Statistical Association, vol. 56, March 1961, pp. 105-110.

*Optional.


*10.16. Using matrix notation, it can be shown that

$$\operatorname{var\text{-}cov}(\hat{\boldsymbol{\beta}}) = \sigma^{2}(\mathbf{X}'\mathbf{X})^{-1}$$

What happens to this var-cov matrix:

a. When there is perfect multicollinearity?

b. When collinearity is high but not perfect?

*10.17. Consider the following correlation matrix:
