## Example 11.5

THE BREUSCH-PAGAN-GODFREY (BPG) TEST

As an example, let us revisit the data (Table 11.3) that were used to illustrate the Goldfeld-Quandt heteroscedasticity test. Regressing Y on X, we obtain the following:

Step 1.

Ŷᵢ = 9.2903 + 0.6378Xᵢ
se = (5.2314)   (0.0286)
RSS = 2361.153   R² = 0.9466   (11.5.18)

Step 2. Obtain σ̃² = Σûᵢ²/n = 2361.153/30 = 78.7051.

Step 3. Divide the squared residuals ûᵢ² obtained from regression (11.5.18) by 78.7051 to construct the variable pᵢ.

Step 4. Assuming that pᵢ is linearly related to Xᵢ (= Zᵢ) as per (11.5.14), we obtain the regression

p̂ᵢ = -0.7426 + 0.0101Xᵢ
se = (0.7529)   (0.0041)
ESS = 10.4280   R² = 0.18   (11.5.19)

Step 5. Compute

Θ = ½(ESS) = ½(10.4280) = 5.2140   (11.5.20)

Under the assumptions of the BPG test, Θ in (11.5.20) asymptotically follows the chi-square distribution with 1 df. [Note: There is only one regressor in (11.5.19).] Now from the chi-square table we find that for 1 df the 5 percent critical chi-square value is 3.8414 and the 1 percent critical χ² value is 6.6349. Thus, the observed chi-square value of 5.2140 is significant at the 5 percent but not the 1 percent level of significance. Therefore, we reach the same conclusion as the Goldfeld-Quandt test. But keep in mind that, strictly speaking, the BPG test is an asymptotic, or large-sample, test and in the present example 30 observations may not constitute a large sample. It should also be pointed out that in small samples the test is sensitive to the assumption that the disturbances uᵢ are normally distributed. Of course, we can test the normality assumption by the tests discussed in Chapter 5.²³
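The five steps above can be sketched in Python. Since Table 11.3 itself is not reproduced here, the data below are synthetic stand-ins (with n = 30 as in the text), so the numerical results will differ from (11.5.18)-(11.5.20), but the mechanics of the test are the same.

```python
import numpy as np
from scipy import stats

# Synthetic stand-in for the Table 11.3 data (hypothetical values);
# the error variance is made to grow with X so heteroscedasticity is present.
rng = np.random.default_rng(0)
n = 30
X = np.linspace(80, 280, n)
Y = 9.0 + 0.64 * X + rng.normal(scale=0.05 * X)

# Step 1: regress Y on X (with intercept) and collect the residuals.
Z = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(Z, Y, rcond=None)
u = Y - Z @ beta

# Step 2: sigma_tilde^2 = RSS / n (note: divided by n, not n - k).
sigma2 = (u @ u) / n

# Step 3: construct p_i = u_i^2 / sigma_tilde^2.
p = u**2 / sigma2

# Step 4: regress p on X and compute the explained sum of squares (ESS).
gamma, *_ = np.linalg.lstsq(Z, p, rcond=None)
p_hat = Z @ gamma
ess = ((p_hat - p.mean()) ** 2).sum()

# Step 5: Theta = ESS / 2, asymptotically chi-square with 1 df
# (one regressor in the step-4 regression).
theta = ess / 2
crit_5 = stats.chi2.ppf(0.95, df=1)  # about 3.8414
print(f"Theta = {theta:.4f}, 5% critical value = {crit_5:.4f}")
```

Heteroscedasticity is indicated when Θ exceeds the critical χ² value at the chosen level of significance.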

²²See Adrian C. Darnell, A Dictionary of Econometrics, Edward Elgar, Cheltenham, U.K., 1994, pp. 178-179.

²³On this, see R. Koenker, "A Note on Studentizing a Test for Heteroscedasticity," Journal of Econometrics, vol. 17, 1981, pp. 107-112.


White's General Heteroscedasticity Test. Unlike the Goldfeld-Quandt test, which requires reordering the observations with respect to the X variable that supposedly caused heteroscedasticity, or the BPG test, which is sensitive to the normality assumption, the general test of heteroscedasticity proposed by White does not rely on the normality assumption and is easy to implement.²⁴ As an illustration of the basic idea, consider the following three-variable regression model (the generalization to the k-variable model is straightforward):

Yᵢ = β₁ + β₂X₂ᵢ + β₃X₃ᵢ + uᵢ   (11.5.21)

The White test proceeds as follows:

Step 1. Given the data, we estimate (11.5.21) and obtain the residuals, ûᵢ.

Step 2. We then run the following (auxiliary) regression:

ûᵢ² = α₁ + α₂X₂ᵢ + α₃X₃ᵢ + α₄X₂ᵢ² + α₅X₃ᵢ² + α₆X₂ᵢX₃ᵢ + vᵢ   (11.5.22)

That is, the squared residuals from the original regression are regressed on the original X variables or regressors, their squared values, and the cross product(s) of the regressors. Higher powers of regressors can also be introduced. Note that there is a constant term in this equation even though the original regression may or may not contain it. Obtain the R2 from this (auxiliary) regression.

Step 3. Under the null hypothesis that there is no heteroscedasticity, it can be shown that the sample size (n) times the R² obtained from the auxiliary regression asymptotically follows the chi-square distribution with df equal to the number of regressors (excluding the constant term) in the auxiliary regression. That is,

n · R² ~ χ²_df (asymptotically)   (11.5.23)

where df is as defined previously. In our example, there are 5 df since there are 5 regressors in the auxiliary regression.

Step 4. If the chi-square value obtained in (11.5.23) exceeds the critical chi-square value at the chosen level of significance, the conclusion is that there is heteroscedasticity. If it does not exceed the critical chi-square value, there is no heteroscedasticity, which is to say that in the auxiliary regression (11.5.22), α₂ = α₃ = α₄ = α₅ = α₆ = 0 (see footnote 25).
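The White procedure can likewise be sketched in Python. The data-generating process below (the X values, coefficients, and error structure) is invented purely for illustration; the point is the construction of the auxiliary regression and the n·R² statistic with 5 df.

```python
import numpy as np
from scipy import stats

# Hypothetical data for the three-variable model (11.5.21),
# with errors whose variance depends on X2 (heteroscedastic by design).
rng = np.random.default_rng(1)
n = 30
X2 = rng.uniform(1, 10, n)
X3 = rng.uniform(1, 10, n)
u = rng.normal(scale=0.5 + 0.3 * X2)
Y = 2.0 + 1.5 * X2 - 0.8 * X3 + u

# Step 1: estimate the original model and keep the residuals.
Z = np.column_stack([np.ones(n), X2, X3])
b, *_ = np.linalg.lstsq(Z, Y, rcond=None)
resid = Y - Z @ b

# Step 2: auxiliary regression of the squared residuals on the
# regressors, their squares, and their cross product, as in (11.5.22).
A = np.column_stack([np.ones(n), X2, X3, X2**2, X3**2, X2 * X3])
a, *_ = np.linalg.lstsq(A, resid**2, rcond=None)
y2 = resid**2
fitted = A @ a
r2 = 1 - ((y2 - fitted) ** 2).sum() / ((y2 - y2.mean()) ** 2).sum()

# Step 3: n * R^2 is asymptotically chi-square with 5 df
# (5 regressors in the auxiliary regression, excluding the constant).
stat = n * r2
crit = stats.chi2.ppf(0.95, df=5)  # about 11.0705

# Step 4: reject homoscedasticity if the statistic exceeds the critical value.
print(f"n*R^2 = {stat:.4f}, 5% critical value = {crit:.4f}")
```

Note that the test statistic needs only the auxiliary-regression R² and the sample size; no assumption of normally distributed disturbances is used.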

²⁴H. White, "A Heteroscedasticity Consistent Covariance Matrix Estimator and a Direct Test of Heteroscedasticity," Econometrica, vol. 48, 1980, pp. 817-838.

²⁵Implied in this procedure is the assumption that the error variance of uᵢ, σᵢ², is functionally related to the regressors, their squares, and their cross products. If all the partial slope coefficients in this regression are simultaneously equal to zero, then the error variance is the homoscedastic constant equal to α₁.
