We will analyze this estimation technique in some detail in Chapter 18, so we will only sketch the important results here. It is useful to consider the instrumental variables case, as it is fairly general and we can easily specialize it to the simpler regression model if that is appropriate. Thus, we depart from the model specification in (10-1), but at this point, we no longer require that $E[\varepsilon_i \mid \mathbf{x}_i] = 0$. Instead, we adopt the instrumental variables formulation in Section 10.2.4. That is, our model is
$$y_i = \mathbf{x}_i'\boldsymbol{\beta} + \varepsilon_i, \qquad E[\varepsilon_i \mid \mathbf{z}_i] = 0$$
for K variables in $\mathbf{x}_i$ and for some set of L instrumental variables, $\mathbf{z}_i$, where $L \ge K$. The earlier case of the generalized regression model arises if $\mathbf{z}_i = \mathbf{x}_i$, and the classical regression form results if we add $\boldsymbol{\Omega} = \mathbf{I}$ as well, so this is a convenient encompassing model framework.
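As a minimal numerical sketch of this encompassing framework (the data-generating process, sample sizes, and variable names here are hypothetical, chosen only for illustration), the instrumental variables estimator for $L \ge K$ can be computed directly, and setting $\mathbf{Z} = \mathbf{X}$ collapses it to ordinary least squares:

```python
import numpy as np

rng = np.random.default_rng(0)
n, K, L = 500, 2, 3  # hypothetical sizes with L > K instruments

# Simulate L instruments and a disturbance correlated with one regressor
Z = rng.normal(size=(n, L))
u = rng.normal(size=n)
e = 0.5 * u + rng.normal(size=n)                 # disturbance
X = np.column_stack([
    Z[:, 0] + Z[:, 1] + u,                       # endogenous: shares u with e
    Z[:, 2] + rng.normal(size=n),                # exogenous regressor
])
beta_true = np.array([1.0, -0.5])
y = X @ beta_true + e

# IV (two-stage least squares) estimator:
# b_IV = (X' P_Z X)^{-1} X' P_Z y,  with  P_Z = Z (Z'Z)^{-1} Z'
PZ_X = Z @ np.linalg.solve(Z.T @ Z, Z.T @ X)     # P_Z X without forming P_Z
b_iv = np.linalg.solve(PZ_X.T @ X, PZ_X.T @ y)

# Setting Z = X collapses P_Z X to X, which recovers least squares;
# here OLS is inconsistent because E[e | x] != 0 for the first regressor
b_ols = np.linalg.solve(X.T @ X, X.T @ y)
```

The point of the sketch is only the nesting: with $\mathbf{z}_i = \mathbf{x}_i$ the projection step disappears and the IV estimator reduces to the usual regression estimator.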
In the next section on generalized least squares estimation, we will consider two cases, first with a known $\boldsymbol{\Omega}$, then with an unknown $\boldsymbol{\Omega}$ that must be estimated. In estimation by the generalized method of moments, neither of these approaches is relevant, because we begin with much less (assumed) knowledge about the data-generating process. In particular, we will consider three cases:
• Classical regression: $\mathrm{Var}[\varepsilon_i \mid \mathbf{X}, \mathbf{Z}] = \sigma^2$.
• Heteroscedastic regression: $\mathrm{Var}[\varepsilon_i \mid \mathbf{X}, \mathbf{Z}] = \sigma_i^2$.
• Generalized model: $\mathrm{Cov}[\varepsilon_t, \varepsilon_s \mid \mathbf{X}, \mathbf{Z}] = \sigma^2\omega_{ts}$, where $\mathbf{Z}$ and $\mathbf{X}$ are the $n \times L$ and $n \times K$ observed data matrices. (We assume, as will often be true, that the fully general case will apply in a time-series setting. Hence the change in the subscripts.) No specific distribution is assumed for the disturbances, conditional or unconditional.
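The three covariance structures can be made concrete by constructing $\sigma^2\boldsymbol{\Omega}$ for each case. In this sketch the dimensions, variances, and the AR(1)-style pattern $\omega_{ts} = \rho^{|t-s|}$ used for the generalized case are hypothetical, chosen only to illustrate the nesting:

```python
import numpy as np

n, sigma2 = 5, 2.0

# Classical: Omega = I, so Cov[e | X, Z] = sigma^2 * I
omega_classical = np.eye(n)

# Heteroscedastic: Omega diagonal with distinct variances (hypothetical values)
omega_hetero = np.diag([0.5, 1.0, 1.5, 2.0, 2.5])

# Generalized: a full matrix; here an AR(1)-style pattern omega_ts = rho^|t-s|
rho = 0.6
t = np.arange(n)
omega_general = rho ** np.abs(t[:, None] - t[None, :])

cov_general = sigma2 * omega_general  # Cov[e_t, e_s | X, Z] = sigma^2 * omega_ts
```

Each case nests the previous one: setting $\rho = 0$ in the generalized pattern gives a diagonal $\boldsymbol{\Omega}$, and equal diagonal entries give the classical case.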
The assumption $E[\varepsilon_i \mid \mathbf{z}_i] = 0$ implies the following orthogonality condition:
$$\mathrm{Cov}[\mathbf{z}_i, \varepsilon_i] = \mathbf{0}, \quad \text{or} \quad E[\mathbf{z}_i(y_i - \mathbf{x}_i'\boldsymbol{\beta})] = \mathbf{0}.$$
By summing the terms, we find that this further implies the population moment equation,
$$E\left[\frac{1}{n}\sum_{i=1}^{n} \mathbf{z}_i(y_i - \mathbf{x}_i'\boldsymbol{\beta})\right] = E[\bar{\mathbf{m}}(\boldsymbol{\beta})] = \mathbf{0}.$$
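The sample counterpart of the population moment equation, and the estimator that makes a weighted quadratic form of it as close to zero as possible, can be sketched as follows. The function names and the simulated data in the usage note are hypothetical; the closed form solves the first-order condition of the GMM criterion $\bar{\mathbf{m}}(\boldsymbol{\beta})'\mathbf{W}\bar{\mathbf{m}}(\boldsymbol{\beta})$:

```python
import numpy as np

def moment_bar(beta, y, X, Z):
    """Sample counterpart (1/n) sum_i z_i (y_i - x_i' beta) of the
    population moment E[z_i (y_i - x_i' beta)] = 0; an L-vector."""
    resid = y - X @ beta
    return Z.T @ resid / len(y)

def gmm_estimate(y, X, Z, W):
    """Minimize q(beta) = mbar(beta)' W mbar(beta) for a given L x L
    weighting matrix W. The objective is quadratic in beta, so the
    first-order condition gives the closed form
        b = (X'Z W Z'X)^{-1} X'Z W Z'y."""
    A = X.T @ Z @ W @ Z.T @ X
    return np.linalg.solve(A, X.T @ Z @ W @ Z.T @ y)
```

With $L > K$ the L sample moments cannot all be driven exactly to zero; choosing $\mathbf{W} = (\mathbf{Z}'\mathbf{Z}/n)^{-1}$ reproduces the two-stage least squares estimator as a special case.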