
Regression (10.5.6) shows that none of the regression coefficients is individually significant at the conventional 1 or 5 percent levels of significance, although β̂₂ is significant at the 10 percent level on the basis of a one-tail t test.

Now consider Table 10.4. The only difference between Tables 10.3 and 10.4 is that the third and fourth values of X3 are interchanged. Using the data of Table 10.4, we now obtain

    Ŷᵢ = 1.2108 + 0.4014X₂ᵢ + 0.0270X₃ᵢ
    se = (0.7480)  (0.2721)  (0.1252)
    t  = (1.6187)  (1.4752)  (0.2158)          (10.5.7)

    R² = 0.8143    r₂₃ = 0.8285    cov(β̂₂, β̂₃) = −0.0282    df = 2

As a result of a slight change in the data, we see that β̂₂, which was statistically significant before at the 10 percent level of significance, is no longer significant even at that level. Also note that in (10.5.6) cov(β̂₂, β̂₃) = −0.00868, whereas in (10.5.7) it is −0.0282, a more than threefold increase in magnitude. All these changes may be attributable to increased multicollinearity: in (10.5.6) r₂₃ = 0.5523, whereas in (10.5.7) it is 0.8285. Similarly, the standard errors of β̂₂ and β̂₃ increase between the two regressions, a usual symptom of collinearity.
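These mechanics can be checked numerically. The sketch below, using plain NumPy, runs OLS on a hypothetical five-observation dataset chosen so that interchanging the third and fourth values of X₃ moves r₂₃ from 0.5523 to 0.8285 — the correlations quoted above for Tables 10.3 and 10.4, which are not reproduced in this excerpt. The standard errors of both slope coefficients rise with the collinearity:

```python
import numpy as np

def ols(X, y):
    """Return OLS coefficients, standard errors, and coefficient covariance matrix."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (n - k)          # unbiased estimate of the error variance
    cov = sigma2 * np.linalg.inv(X.T @ X)     # variance-covariance matrix of the estimates
    return beta, np.sqrt(np.diag(cov)), cov

# Hypothetical data in the spirit of Tables 10.3 and 10.4 (assumed, not from this excerpt):
y  = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([2.0, 0.0, 4.0, 6.0, 8.0])
x3_low  = np.array([4.0, 2.0, 12.0, 0.0, 16.0])   # moderately collinear with x2
x3_high = np.array([4.0, 2.0, 0.0, 12.0, 16.0])   # 3rd and 4th values interchanged

results = {}
for name, x3 in [("low", x3_low), ("high", x3_high)]:
    X = np.column_stack([np.ones_like(x2), x2, x3])
    beta, se, cov = ols(X, y)
    r23 = np.corrcoef(x2, x3)[0, 1]
    results[name] = (r23, beta, se)
    print(f"r23 = {r23:.4f}  se(b2) = {se[1]:.4f}  se(b3) = {se[2]:.4f}")
```

Swapping just two observations raises r₂₃ and inflates both slope standard errors, while the sum of the two slope estimates barely moves — the pattern described in the text.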

We noted earlier that in the presence of high collinearity one cannot estimate the individual regression coefficients precisely, but that linear combinations of these coefficients may be estimated more precisely. This fact can be substantiated from regressions (10.5.6) and (10.5.7). In the first regression the sum of the two partial slope coefficients is 0.4493 and in the second it is 0.4284, practically the same. Moreover, their standard errors are practically the same: 0.1550 versus 0.1823.¹⁴ Note, however, that the coefficient of X₃ has changed dramatically, from 0.003 to 0.027.

¹⁴These standard errors are obtained from the formula se(β̂₂ + β̂₃) = √[var(β̂₂) + var(β̂₃) + 2 cov(β̂₂, β̂₃)].

Note that increasing collinearity increases the variances of β̂₂ and β̂₃, but these increases may be offset if there is a high negative covariance between the two, as our results clearly point out.
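The footnote formula can be verified directly from the figures reported for regression (10.5.7) — a minimal check, using only the standard errors and covariance quoted above (the small discrepancy from 0.1823 reflects rounding in the reported inputs):

```python
import math

# Values reported for regression (10.5.7)
se_b2, se_b3 = 0.2721, 0.1252
cov_b2_b3 = -0.0282

# se(b2 + b3) = sqrt(var(b2) + var(b3) + 2*cov(b2, b3))  -- footnote 14
se_sum = math.sqrt(se_b2**2 + se_b3**2 + 2 * cov_b2_b3)
print(round(se_sum, 4))   # close to the 0.1823 quoted in the text
```

The large negative covariance nearly cancels the two inflated variances, which is why the sum β̂₂ + β̂₃ is estimated about as precisely in both regressions.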

356 PART TWO: RELAXING THE ASSUMPTIONS OF THE CLASSICAL MODEL

### Consequences of Micronumerosity

In a parody of the consequences of multicollinearity, and in a tongue-in-cheek manner, Goldberger cites exactly analogous consequences of micronumerosity, that is, analysis based on a small sample size.¹⁵ The reader is advised to read Goldberger's analysis to see why he regards micronumerosity as being as important as multicollinearity.
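Goldberger's point can be illustrated with a small simulation — a sketch under assumed parameters, not taken from the text. Even with a single, well-behaved regressor and no collinearity at all, shrinking the sample from 500 to 5 observations inflates the slope's standard error roughly tenfold, producing the same symptoms (large standard errors, insignificant t ratios) that high collinearity does:

```python
import numpy as np

rng = np.random.default_rng(42)

def slope_se(n):
    """Standard error of the OLS slope in y = 1 + 0.5x + u for one sample of size n."""
    x = rng.normal(size=n)
    y = 1.0 + 0.5 * x + rng.normal(size=n)
    X = np.column_stack([np.ones(n), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (n - 2)
    return float(np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1]))

# Average over replications so the comparison is not driven by a single draw
big   = np.mean([slope_se(500) for _ in range(200)])
small = np.mean([slope_se(5) for _ in range(200)])
print(f"avg se(slope): n=500 -> {big:.3f}, n=5 -> {small:.3f}")
```

Since var(β̂) shrinks roughly in proportion to 1/n, the tiny sample behaves just like a collinear one: the data simply carry too little independent information about the slope.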

## 10.6 AN ILLUSTRATIVE EXAMPLE: CONSUMPTION EXPENDITURE IN RELATION TO INCOME AND WEALTH

To illustrate the various points made thus far, let us reconsider the consumption-income example of Chapter 3. In Table 10.5 we reproduce the data of Table 3.2 and add to it data on wealth of the consumer. If we assume that consumption expenditure is linearly related to income and wealth, then, from Table 10.5 we obtain the following regression:

    Ŷᵢ = 24.7747 + 0.9415X₂ᵢ − 0.0424X₃ᵢ
    se = (6.7525)   (0.8229)   (0.0807)
    t  = (3.6690)   (1.1442)   (−0.5261)          (10.6.1)

    R² = 0.9635    R̄² = 0.9531    df = 7

Regression (10.6.1) shows that income and wealth together explain about 96 percent of the variation in consumption expenditure, and yet neither of the slope coefficients is individually statistically significant. Moreover, not only is the wealth variable statistically insignificant, but it also has the wrong sign.