
24. J. B. Ramsey, "Tests for Specification Errors in Classical Linear Least Squares Regression Analysis," Journal of the Royal Statistical Society, series B, vol. 31, 1969, pp. 350-371.

522 PART TWO: RELAXING THE ASSUMPTIONS OF THE CLASSICAL MODEL

(why? see Chapter 3), the residuals in this figure show a pattern in which their mean changes systematically with $\hat{Y}_i$. This would suggest that if we introduce $\hat{Y}_i$ in some form as regressor(s) in (13.4.6), it should increase $R^2$. And if the increase in $R^2$ is statistically significant (on the basis of the F test discussed in Chapter 8), it would suggest that the linear cost function (13.4.6) was mis-specified. This is essentially the idea behind RESET. The steps involved in RESET are as follows:

1. From the chosen model, e.g., (13.4.6), obtain the estimated $Y_i$, that is, $\hat{Y}_i$.

2. Rerun (13.4.6) introducing $\hat{Y}_i$ in some form as additional regressor(s). From Figure 13.2, we observe that there is a curvilinear relationship between $\hat{u}_i$ and $\hat{Y}_i$, suggesting that one can introduce $\hat{Y}_i^2$ and $\hat{Y}_i^3$ as additional regressors. Thus, we run

$$Y_i = \beta_1 + \beta_2 X_i + \beta_3 \hat{Y}_i^2 + \beta_4 \hat{Y}_i^3 + u_i \qquad (13.4.7)$$

3. Let the $R^2$ obtained from (13.4.7) be $R^2_{\text{new}}$ and that obtained from (13.4.6) be $R^2_{\text{old}}$. Then we can use the F test first introduced in (8.5.18), namely,

$$F = \frac{(R^2_{\text{new}} - R^2_{\text{old}})/\text{number of new regressors}}{(1 - R^2_{\text{new}})/(n - \text{number of parameters in the new model})} \qquad (8.5.18)$$

to find out if the increase in R2 from using (13.4.7) is statistically significant.

4. If the computed F value is significant, say, at the 5 percent level, one can accept the hypothesis that the model (13.4.6) is mis-specified.
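The four RESET steps can be sketched numerically. The data below are hypothetical stand-ins for the cost-output example (the true relation is made cubic, so the linear fit should fail RESET); the helper `ols_fit` and all numbers are illustrative, not the text's own:

```python
import numpy as np

# Hypothetical data standing in for the cost-output example (n = 10).
# The true relation is cubic, so a linear fit should fail RESET.
rng = np.random.default_rng(0)
X = np.arange(1.0, 11.0)
Y = 140 + 60 * X - 10 * X**2 + X**3 + rng.normal(0, 5, 10)

def ols_fit(cols, y):
    """Fitted values and R^2 from an OLS regression of y on cols (plus intercept)."""
    Z = np.column_stack([np.ones(len(y))] + list(cols))
    b, *_ = np.linalg.lstsq(Z, y, rcond=None)
    yhat = Z @ b
    r2 = 1 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)
    return yhat, r2

# Step 1: fit the restricted (linear) model and keep the fitted values Y_hat.
yhat, r2_old = ols_fit([X], Y)

# Step 2: rerun with Y_hat^2 and Y_hat^3 as extra regressors, as in (13.4.7).
_, r2_new = ols_fit([X, yhat**2, yhat**3], Y)

# Step 3: the F test of (8.5.18); 2 new regressors, 4 parameters in the new model.
n, new_regs, params_new = len(Y), 2, 4
F = ((r2_new - r2_old) / new_regs) / ((1 - r2_new) / (n - params_new))

# Step 4: a large F (relative to the critical value) signals mis-specification.
print(F)
```

Because the powers of $\hat{Y}_i$ pick up the curvature the linear model misses, the computed F here comes out far above any conventional critical value, mirroring the decision rule in Step 4.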

Returning to our illustrative example, we have the following results (standard errors in parentheses):

$$\hat{Y}_i = 2140.7223 + 476.6557X_i - 0.09187\hat{Y}_i^2 + 0.000119\hat{Y}_i^3$$
$$\text{se} = (132.0044)\quad(33.3951)\quad(0.00620)\quad(0.0000074) \qquad (13.4.9)$$

Note: $\hat{Y}_i^2$ and $\hat{Y}_i^3$ in (13.4.9) are obtained from (13.4.8). Now applying the F test we find

$$F = 284.4035$$

The reader can easily verify that this F value is highly significant, indicating that the model (13.4.8) is mis-specified. Of course, we have reached the same conclusion on the basis of the visual examination of the residuals as well as the Durbin-Watson d value.

One advantage of RESET is that it is easy to apply, for it does not require one to specify what the alternative model is. But that is also its disadvantage because knowing that a model is mis-specified does not help us necessarily in choosing a better alternative.

Lagrange Multiplier (LM) Test for Adding Variables. This is an alternative to Ramsey's RESET test. To illustrate this test, we will continue with the preceding illustrative example.

If we compare the linear cost function (13.4.6) with the cubic cost function (13.4.4), the former is a restricted version of the latter (recall our discussion of restricted least-squares from Chapter 8). The restricted regression (13.4.6) assumes that the coefficients of the squared and cubed output terms are equal to zero. To test this, the LM test proceeds as follows:

1. Estimate the restricted regression (13.4.6) by OLS and obtain the residuals, $\hat{u}_i$.

2. If in fact the unrestricted regression (13.4.4) is the true regression, the residuals obtained in (13.4.6) should be related to the squared and cubed output terms, that is, $X_i^2$ and $X_i^3$.

3. This suggests that we regress the $\hat{u}_i$ obtained in Step 1 on all the regressors (including those in the restricted regression), which in the present case means

$$\hat{u}_i = \alpha_1 + \alpha_2 X_i + \alpha_3 X_i^2 + \alpha_4 X_i^3 + v_i \qquad (13.4.11)$$

where $v$ is an error term with the usual properties.

4. For large-sample size, Engle has shown that $n$ (the sample size) times the $R^2$ estimated from the (auxiliary) regression (13.4.11) follows the chi-square distribution with df equal to the number of restrictions imposed by the restricted regression, two in the present example since the terms $X_i^2$ and $X_i^3$ are dropped from the model.25 Symbolically, we write

$$nR^2 \overset{\text{asy}}{\sim} \chi^2_{(\text{number of restrictions})} \qquad (13.4.12)$$

where asy means asymptotically, that is, in large samples.

5. If the chi-square value obtained from (13.4.12) exceeds the critical chi-square value at the chosen level of significance, we reject the restricted regression. Otherwise, we do not reject it.
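The five LM steps can be sketched as follows. The data are again a hypothetical cubic cost schedule, not the text's Table 13.1 figures; with noise-free cubic data the auxiliary $R^2$ is essentially 1, so $nR^2 \approx n$:

```python
import numpy as np

# Hypothetical cubic cost data (n = 10); the restricted model is linear.
X = np.arange(1.0, 11.0)
Y = 140 + 60 * X - 10 * X**2 + X**3   # true model is cubic

def design(*cols):
    """Design matrix with an intercept column prepended."""
    return np.column_stack([np.ones(len(X))] + list(cols))

# Step 1: restricted (linear) regression; keep its residuals u_hat.
b, *_ = np.linalg.lstsq(design(X), Y, rcond=None)
u = Y - design(X) @ b

# Step 3: auxiliary regression of u_hat on X, X^2, X^3, as in (13.4.11).
Za = design(X, X**2, X**3)
a, *_ = np.linalg.lstsq(Za, u, rcond=None)
resid = u - Za @ a
r2_aux = 1 - np.sum(resid**2) / np.sum((u - u.mean()) ** 2)

# Steps 4-5: nR^2 is asy. chi-square with df = number of restrictions (2 here);
# 9.21 is the 1 percent critical chi-square value for 2 df.
nR2 = len(X) * r2_aux
print(nR2, nR2 > 9.21)   # exceeds 9.21, so the linear restriction is rejected
```

Note that because OLS residuals are already orthogonal to the intercept and $X_i$, the auxiliary $R^2$ measures only what the added terms $X_i^2$ and $X_i^3$ explain, which is exactly what the restriction test needs.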


25. R. F. Engle, "A General Approach to Lagrangian Multiplier Model Diagnostics," Journal of Econometrics, vol. 20, 1982, pp. 83-104.


For our example, the restricted regression is the linear cost function (13.4.13), where Y is total cost and X is output. The standard errors for this regression are already given in Table 13.1.

When the residuals from (13.4.13) are regressed as just suggested in Step 3, we obtain the following results:

$$\hat{u}_i = -24.7 + 43.5443X_i - 12.9615X_i^2 + 0.9396X_i^3$$
$$\text{se} = (6.375)\quad(4.779)\quad(0.986)\quad(0.059) \qquad R^2 = 0.9896 \qquad (13.4.14)$$

Although our sample size of 10 is by no means large, just to illustrate the LM mechanism we obtain $nR^2 = (10)(0.9896) = 9.896$. From the chi-square table we observe that for 2 df the 1 percent critical chi-square value is about 9.21. Therefore, the observed value of 9.896 is significant at the 1 percent level, and our conclusion would be to reject the restricted regression (i.e., the linear cost function). We reached a similar conclusion on the basis of Ramsey's RESET test.
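The decision rule can be checked directly with the numbers reported above (the variable names are ours; the 9.21 critical value is the tabulated 1 percent chi-square value for 2 df):

```python
# Numeric check of the LM decision using the values reported in the text.
n = 10                      # sample size
r2_aux = 0.9896             # R^2 of the auxiliary regression (13.4.14)
chi2_crit_1pct_2df = 9.21   # 1 percent critical chi-square value, 2 df

nR2 = n * r2_aux
print(nR2 > chi2_crit_1pct_2df)   # True: reject the linear cost function
```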