
Note: We are using the normal distribution because σ² is assumed for convenience to be known. Hence the use of 1.96, the 95% confidence factor for the normal distribution.

The standard errors corresponding to the various r23 values are obtained from Table 10.1.


"Insignificant" t Ratios

Recall that to test the null hypothesis that, say, β₂ = 0, we use the t ratio, that is, β̂₂/se(β̂₂), and compare the estimated t value with the critical t value from the t table. But as we have seen, in cases of high collinearity the estimated standard errors increase dramatically, thereby making the t values smaller. Therefore, in such cases, one will increasingly accept the null hypothesis that the relevant true population value is zero.¹³
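This effect can be sketched in a short simulation. The data-generating process and all names below are illustrative assumptions, not from the text: as the correlation between the two regressors rises, the estimated standard error of a slope coefficient inflates, which is what drags the t ratio down.

```python
# Illustrative sketch: how collinearity between X2 and X3 inflates se(b2).
# The data-generating process below is an assumption for demonstration only.
import numpy as np

rng = np.random.default_rng(42)
n = 50

def slope_se_and_t(r):
    """Fit Y = b1 + b2*X2 + b3*X3 + u with corr(X2, X3) ~ r; return se(b2), t(b2)."""
    x2 = rng.standard_normal(n)
    x3 = r * x2 + np.sqrt(1 - r**2) * rng.standard_normal(n)  # corr(x2, x3) ~ r
    y = 1.0 + 0.5 * x2 + 0.5 * x3 + rng.standard_normal(n)
    X = np.column_stack([np.ones(n), x2, x3])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    sigma2 = resid @ resid / (n - X.shape[1])   # unbiased estimate of sigma^2
    cov_b = sigma2 * np.linalg.inv(X.T @ X)     # variance-covariance matrix of b
    se_b2 = np.sqrt(cov_b[1, 1])
    return se_b2, b[1] / se_b2

se_low, t_low = slope_se_and_t(0.1)     # mild collinearity
se_high, t_high = slope_se_and_t(0.99)  # near-perfect collinearity
print(se_low, se_high)                  # se_high is markedly larger
```

With r = 0.99 the variance-inflation factor 1/(1 − r²) is about 50, so se(b̂₂) is several times its value under mild collinearity, and the same numerator yields a much smaller t ratio.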

A High R² but Few Significant t Ratios

Consider the k-variable linear regression model:

Yᵢ = β₁ + β₂X₂ᵢ + β₃X₃ᵢ + ⋯ + βₖXₖᵢ + uᵢ

In cases of high collinearity, it is possible to find, as we have just noted, that one or more of the partial slope coefficients are individually statistically insignificant on the basis of the t test. Yet the R² in such situations may be so high, say, in excess of 0.9, that on the basis of the F test one can convincingly reject the hypothesis that β₂ = β₃ = ⋯ = βₖ = 0. Indeed, this is one of the signals of multicollinearity: insignificant t values but a high overall R² (and a significant F value)!

We shall demonstrate this signal in the next section, but this outcome should not be surprising in view of our discussion on individual vs. joint testing in Chapter 8. As you may recall, the real problem here is the covariances between the estimators, which, as formula (7.4.17) indicates, are related to the correlations between the regressors.
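The signal can also be sketched with simulated data (the data-generating process and all names below are illustrative assumptions): two nearly collinear regressors that jointly explain Y well give a high R² and a large overall F statistic, while the individual slope t ratios are typically small.

```python
# Illustrative sketch: high R^2 and significant F, yet small individual t ratios.
# The simulated data below are an assumption for demonstration only.
import numpy as np

rng = np.random.default_rng(0)
n = 30
x2 = rng.standard_normal(n)
x3 = x2 + 0.01 * rng.standard_normal(n)      # corr(x2, x3) ~ 1: near collinearity
y = 1.0 + 1.0 * x2 + 1.0 * x3 + 0.5 * rng.standard_normal(n)

X = np.column_stack([np.ones(n), x2, x3])
k = X.shape[1]
b, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ b
rss = resid @ resid
tss = ((y - y.mean()) ** 2).sum()
r2 = 1 - rss / tss
F = (r2 / (k - 1)) / ((1 - r2) / (n - k))    # overall F statistic for all slopes = 0
cov_b = (rss / (n - k)) * np.linalg.inv(X.T @ X)
t_vals = b[1:] / np.sqrt(np.diag(cov_b))[1:] # t ratios of the two slopes
print(r2, F, t_vals)
```

Jointly the two regressors carry almost all of the variation in Y, so R² and F are large; but because each slope's standard error is inflated by the collinearity, neither t ratio on its own is likely to clear a conventional critical value.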

Sensitivity of OLS Estimators and Their Standard Errors to Small Changes in Data

As long as multicollinearity is not perfect, estimation of the regression coefficients is possible but the estimates and their standard errors become very sensitive to even the slightest change in the data.
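A minimal numeric sketch of this sensitivity (the figures below are illustrative, not the Table 10.3 data): with X₃ nearly proportional to X₂, perturbing a single Y observation by 0.1 can move the OLS slope estimates by far more than 0.1.

```python
# Illustrative sketch: near-collinear regressors make OLS slopes hypersensitive
# to tiny data changes. The numbers below are assumptions, not Table 10.3.
import numpy as np

def ols(X, y):
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return b

x2 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x3 = np.array([2.0, 4.1, 5.9, 8.0, 10.1])   # x3 ~ 2 * x2: near collinearity
y  = np.array([3.0, 5.0, 7.1, 9.0, 11.0])
X = np.column_stack([np.ones_like(x2), x2, x3])

b_orig = ols(X, y)

y_pert = y.copy()
y_pert[2] += 0.1                            # perturb one observation slightly
b_pert = ols(X, y_pert)
print(b_orig[1:], b_pert[1:])               # slopes shift by much more than 0.1
```

Because (XᵀX) is nearly singular, its inverse has large entries, so the small change in y is amplified along the near-collinear direction; the individual slopes swing while their near-collinear combination stays stable.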

To see this, consider Table 10.3. Based on these data, we obtain the following multiple regression:

Ŷᵢ = 1.1939 + 0.4463X₂ᵢ + 0.0030X₃ᵢ
se = (0.7737)   (0.1848)   (0.0851)
t  = (1.5431)   (2.4151)   (0.0358)          (10.5.6)

R² = 0.8101   r₂₃ = 0.5523   cov(β̂₂, β̂₃) = −0.00868   df = 2
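As a quick arithmetic check on the output in (10.5.6): each reported t ratio is the coefficient estimate divided by its standard error, so the implied coefficient estimates can be recovered as t × se.

```python
# Arithmetic check using the se and t values reported in (10.5.6):
# coefficient = t ratio * standard error.
se = [0.7737, 0.1848, 0.0851]
t  = [1.5431, 2.4151, 0.0358]
coefs = [ti * si for ti, si in zip(t, se)]
print([round(c, 4) for c in coefs])   # -> [1.1939, 0.4463, 0.003]
```

Note how the coefficient of X₃ᵢ (about 0.003) is dwarfed by its standard error (0.0851), which is exactly the "insignificant t ratio" symptom discussed above.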

¹³In terms of the confidence intervals, the value β₂ = 0 will increasingly lie in the acceptance region as the degree of collinearity increases.
