based on the MLE of β. Since each MLE requires the other, how can we proceed to obtain both? The answer is provided by Oberhofer and Kmenta (1974), who show that for certain models, including this one, one can iterate back and forth between the two estimators. (This is the same estimator we used in Section 11.7.2.) Thus, the MLEs are obtained by iterating to convergence between (13-66) and the feasible GLS estimator of β.

The process may begin with the (consistent) ordinary least squares estimator, then (13-66), and so on. The computations are simple, using basic matrix algebra. Hypothesis tests about β may be carried out using the familiar Wald statistic. The appropriate estimator of the asymptotic covariance matrix is the inverse matrix in brackets in (13-55).
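The iteration described above can be sketched in code. The following is an illustrative numpy implementation, not the textbook's own; it assumes the data arrive as lists of per-group vectors `y_groups` and matrices `X_groups` (n groups, T observations each), and the function name and interface are my own.

```python
import numpy as np

def iterated_fgls(y_groups, X_groups, tol=1e-8, max_iter=100):
    """Oberhofer-Kmenta iteration: alternate between the residual-based
    estimate of the cross-group covariance matrix Sigma (the analog of
    (13-66)) and the GLS estimate of beta, until beta converges.
    Illustrative sketch only."""
    n = len(y_groups)
    T = len(y_groups[0])
    K = X_groups[0].shape[1]
    # Step 0: pooled OLS gives a consistent starting value for beta.
    X_all = np.vstack(X_groups)
    y_all = np.concatenate(y_groups)
    beta = np.linalg.lstsq(X_all, y_all, rcond=None)[0]
    for _ in range(max_iter):
        # Residual matrix, one column per group, and Sigma = E'E / T.
        E = np.column_stack([y_groups[i] - X_groups[i] @ beta
                             for i in range(n)])
        Sigma = E.T @ E / T
        S_inv = np.linalg.inv(Sigma)
        # GLS step: beta = [sum_ij s^{ij} Xi'Xj]^{-1} [sum_ij s^{ij} Xi'yj].
        A = np.zeros((K, K))
        b = np.zeros(K)
        for i in range(n):
            for j in range(n):
                A += S_inv[i, j] * (X_groups[i].T @ X_groups[j])
                b += S_inv[i, j] * (X_groups[i].T @ y_groups[j])
        beta_new = np.linalg.solve(A, b)
        if np.max(np.abs(beta_new - beta)) < tol:
            beta = beta_new
            break
        beta = beta_new
    # inv(A) estimates the asymptotic covariance matrix used in Wald tests.
    return beta, Sigma, np.linalg.inv(A)
```

The returned inverse matrix is the bracketed inverse referred to in the text, so a Wald test of restrictions on β can be computed directly from it.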

For testing the hypothesis that the off-diagonal elements of Σ are zero—that is, that there is no correlation across firms—there are three approaches. The likelihood ratio test is based on the statistic

$$\lambda_{LR} = T\left(\sum_{i=1}^{n}\ln\hat{\sigma}_i^{2} - \ln|\hat{\Sigma}|\right), \tag{13-67}$$

where $\hat{\sigma}_i^2$ are the estimates of $\sigma_i^2$ obtained from the maximum likelihood estimates of the groupwise heteroscedastic model and $\hat{\Sigma}$ is the maximum likelihood estimator in the unrestricted model. (Note how the excess variation produced by the restrictive model is used to construct the test.) The large-sample distribution of the statistic is chi-squared with n(n − 1)/2 degrees of freedom. The Lagrange multiplier test developed by Breusch and Pagan (1980) provides an alternative. The general form of the statistic is

$$\lambda_{LM} = T\sum_{i=2}^{n}\sum_{j=1}^{i-1} r_{ij}^{2}, \tag{13-68}$$

where $r_{ij}$ is the ijth residual correlation coefficient. If every individual had a different parameter vector, then individual-specific ordinary least squares would be efficient (and ML) and we would compute $r_{ij}$ from the OLS residuals (assuming that there are sufficient observations for the computation). Here, however, we are assuming only a single parameter vector. Therefore, the appropriate basis for computing the correlations is the residuals from the iterated estimator in the groupwise heteroscedastic model, that is, the same residuals used to compute $\hat{\sigma}_i^2$. (An asymptotically valid approximation to the test can be based on the FGLS residuals instead.) Note that this is not a procedure for testing all the way down to the classical, homoscedastic regression model. That case, which involves different LM and LR statistics, is discussed next. If either the LR statistic in (13-67) or the LM statistic in (13-68) is smaller than the critical value from the table, the conclusion, based on that test, is that the appropriate model is the groupwise heteroscedastic model.
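Both test statistics are simple functions of the restricted-model residuals. The sketch below computes them with numpy; the function name and the assumed inputs (a T × n residual matrix from the groupwise heteroscedastic model and an unrestricted ML estimate of Σ) are illustrative choices, not the textbook's notation.

```python
import numpy as np

def lr_lm_stats(E_het, Sigma_unres):
    """LR statistic (13-67) and Breusch-Pagan LM statistic (13-68) for
    H0: no correlation across groups (Sigma diagonal).
    E_het      : (T, n) residuals from the restricted (groupwise
                 heteroscedastic) model, one column per group.
    Sigma_unres: (n, n) ML estimate of Sigma from the unrestricted model.
    Each statistic is compared with the chi-squared critical value with
    n(n-1)/2 degrees of freedom.  Illustrative sketch only."""
    T, n = E_het.shape
    s2 = (E_het ** 2).mean(axis=0)            # sigma_i^2 hats, restricted model
    # LR: excess variation of the restrictive model vs. the unrestricted one.
    lam_lr = T * (np.sum(np.log(s2)) - np.log(np.linalg.det(Sigma_unres)))
    # LM: sum of squared residual correlations r_ij over the lower triangle.
    S = E_het.T @ E_het / T
    d = np.sqrt(np.diag(S))
    R = S / np.outer(d, d)                    # r_ij matrix
    lam_lm = T * np.sum(np.tril(R, k=-1) ** 2)
    df = n * (n - 1) // 2
    return lam_lr, lam_lm, df
```

Either statistic would then be compared with the chi-squared critical value for the returned degrees of freedom.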

For the groupwise heteroscedasticity model, ML estimation reduces to groupwise weighted least squares. The maximum likelihood estimator of β is feasible GLS. The maximum likelihood estimators of the group-specific variances are given by the diagonal elements in (13-66), while the cross-group covariances are now zero. An additional useful result is provided by the negative of the expected second derivatives matrix of the log-likelihood in (13-65) with diagonal Σ,

$$-E\left[\frac{\partial^{2}\ln L}{\partial\beta\,\partial\beta'}\right] = \sum_{i=1}^{n}\frac{1}{\sigma_i^{2}}\,X_i'X_i.$$
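Groupwise weighted least squares, with the inverse of this matrix as the estimated asymptotic covariance of β, can be sketched as follows. This is an illustrative numpy implementation under assumed inputs (lists of per-group y and X arrays); the function name is my own.

```python
import numpy as np

def groupwise_wls(y_groups, X_groups, tol=1e-10, max_iter=100):
    """ML for the groupwise heteroscedastic model via iterated groupwise
    weighted least squares.  The asymptotic covariance of beta is the
    inverse of sum_i (1/sigma_i^2) Xi'Xi, the negative expected Hessian.
    Illustrative sketch only."""
    n = len(y_groups)
    X_all = np.vstack(X_groups)
    y_all = np.concatenate(y_groups)
    beta = np.linalg.lstsq(X_all, y_all, rcond=None)[0]   # OLS start
    for _ in range(max_iter):
        # Group-specific variance estimates from current residuals.
        s2 = np.array([np.mean((y_groups[i] - X_groups[i] @ beta) ** 2)
                       for i in range(n)])
        # Weighted least squares step: weight each group by 1/sigma_i^2.
        H = sum(X_groups[i].T @ X_groups[i] / s2[i] for i in range(n))
        g = sum(X_groups[i].T @ y_groups[i] / s2[i] for i in range(n))
        beta_new = np.linalg.solve(H, g)
        if np.max(np.abs(beta_new - beta)) < tol:
            beta = beta_new
            break
        beta = beta_new
    # inv(H) is the estimated asymptotic covariance matrix of beta.
    return beta, s2, np.linalg.inv(H)
```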
