## Appendix 7A

### 7A.1 DERIVATION OF OLS ESTIMATORS GIVEN IN EQUATIONS (7.4.3) TO (7.4.5)

Differentiating the equation

$$\sum \hat{u}_i^2 = \sum \big(Y_i - \hat{\beta}_1 - \hat{\beta}_2 X_{2i} - \hat{\beta}_3 X_{3i}\big)^2$$

partially with respect to the three unknowns and setting the resulting equations to zero, we obtain

$$\frac{\partial \sum \hat{u}_i^2}{\partial \hat{\beta}_1} = 2 \sum \big(Y_i - \hat{\beta}_1 - \hat{\beta}_2 X_{2i} - \hat{\beta}_3 X_{3i}\big)(-1) = 0$$

$$\frac{\partial \sum \hat{u}_i^2}{\partial \hat{\beta}_2} = 2 \sum \big(Y_i - \hat{\beta}_1 - \hat{\beta}_2 X_{2i} - \hat{\beta}_3 X_{3i}\big)(-X_{2i}) = 0$$

$$\frac{\partial \sum \hat{u}_i^2}{\partial \hat{\beta}_3} = 2 \sum \big(Y_i - \hat{\beta}_1 - \hat{\beta}_2 X_{2i} - \hat{\beta}_3 X_{3i}\big)(-X_{3i}) = 0$$

Simplifying these, we obtain Eqs. (7.4.3) to (7.4.5).

In passing, note that the three preceding equations can also be written as

$$\sum \hat{u}_i = 0$$

$$\sum \hat{u}_i X_{2i} = 0$$

$$\sum \hat{u}_i X_{3i} = 0$$

which show the properties of the least-squares fit, namely, that the residuals sum to zero and that they are uncorrelated with the explanatory variables $X_2$ and $X_3$.
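These properties are easy to verify numerically. The following sketch (simulated data; the numbers are illustrative and not from the text) fits a three-variable regression by least squares and checks that the residuals sum to zero and are orthogonal to each regressor:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
X2 = rng.normal(size=n)
X3 = rng.normal(size=n)
Y = 1.0 + 2.0 * X2 - 0.5 * X3 + rng.normal(size=n)  # simulated data

# Design matrix with an intercept column; fit by least squares.
X = np.column_stack([np.ones(n), X2, X3])
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
u = Y - X @ beta  # residuals

# Properties of the least-squares fit: the first-order conditions imply
# that the residuals sum to zero and are uncorrelated with X2 and X3.
print(abs(u.sum()) < 1e-8)   # True
print(abs(u @ X2) < 1e-8)    # True
print(abs(u @ X3) < 1e-8)    # True
```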


Incidentally, notice that to obtain the OLS estimators of the k-variable linear regression model (7.4.20) we proceed analogously. Thus, we first write

$$\sum \hat{u}_i^2 = \sum \big(Y_i - \hat{\beta}_1 - \hat{\beta}_2 X_{2i} - \cdots - \hat{\beta}_k X_{ki}\big)^2$$

Differentiating this expression partially with respect to each of the k unknowns, setting the resulting equations equal to zero, and rearranging, we obtain the following k normal equations in the k unknowns:

$$\sum Y_i = n\hat{\beta}_1 + \hat{\beta}_2 \sum X_{2i} + \hat{\beta}_3 \sum X_{3i} + \cdots + \hat{\beta}_k \sum X_{ki}$$

$$\sum Y_i X_{2i} = \hat{\beta}_1 \sum X_{2i} + \hat{\beta}_2 \sum X_{2i}^2 + \hat{\beta}_3 \sum X_{2i} X_{3i} + \cdots + \hat{\beta}_k \sum X_{2i} X_{ki}$$

$$\sum Y_i X_{3i} = \hat{\beta}_1 \sum X_{3i} + \hat{\beta}_2 \sum X_{2i} X_{3i} + \hat{\beta}_3 \sum X_{3i}^2 + \cdots + \hat{\beta}_k \sum X_{3i} X_{ki}$$

$$\vdots$$

$$\sum Y_i X_{ki} = \hat{\beta}_1 \sum X_{ki} + \hat{\beta}_2 \sum X_{2i} X_{ki} + \hat{\beta}_3 \sum X_{3i} X_{ki} + \cdots + \hat{\beta}_k \sum X_{ki}^2$$

Or, switching to small letters (deviation form), these equations can be expressed as

$$\sum y_i x_{2i} = \hat{\beta}_2 \sum x_{2i}^2 + \hat{\beta}_3 \sum x_{2i} x_{3i} + \cdots + \hat{\beta}_k \sum x_{2i} x_{ki}$$

$$\sum y_i x_{3i} = \hat{\beta}_2 \sum x_{2i} x_{3i} + \hat{\beta}_3 \sum x_{3i}^2 + \cdots + \hat{\beta}_k \sum x_{3i} x_{ki}$$

$$\vdots$$

$$\sum y_i x_{ki} = \hat{\beta}_2 \sum x_{2i} x_{ki} + \hat{\beta}_3 \sum x_{3i} x_{ki} + \cdots + \hat{\beta}_k \sum x_{ki}^2$$

It should further be noted that the k-variable model also satisfies these equations:

$$\sum \hat{u}_i = 0$$

$$\sum \hat{u}_i X_{2i} = \sum \hat{u}_i X_{3i} = \cdots = \sum \hat{u}_i X_{ki} = 0$$
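In matrix notation the k normal equations collapse to $(X'X)\hat{\beta} = X'Y$. A minimal sketch (simulated data; the sizes and coefficients are illustrative assumptions) that solves them directly and confirms the orthogonality conditions:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 100, 4  # n observations, k unknowns (intercept plus three slopes)
X = np.column_stack([np.ones(n)] + [rng.normal(size=n) for _ in range(k - 1)])
Y = X @ np.array([1.0, 2.0, -0.5, 0.3]) + rng.normal(size=n)  # simulated data

# Solve the k normal equations (X'X) beta_hat = X'Y directly.
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)

# The residuals satisfy all k orthogonality conditions:
# sum(u) = 0 and sum(u * X_j) = 0 for every regressor X_j.
u = Y - X @ beta_hat
print(np.allclose(X.T @ u, 0, atol=1e-8))  # True
```

Solving via `np.linalg.solve` mirrors the normal-equation derivation above; in practice `np.linalg.lstsq` is preferred for numerical stability, and both give the same estimates here.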

### 7A.2 EQUALITY BETWEEN THE COEFFICIENTS OF PGNP IN (7.3.5) AND (7.6.2)

Letting $Y$ = CM, $X_2$ = PGNP, and $X_3$ = FLR and using the deviation form, write

$$y_i = b_{13} x_{3i} + \hat{u}_{1i} \tag{1}$$

$$x_{2i} = b_{23} x_{3i} + \hat{u}_{2i} \tag{2}$$

Now regress $\hat{u}_1$ on $\hat{u}_2$ to obtain:

$$a_1 = \frac{\sum \hat{u}_{1i} \hat{u}_{2i}}{\sum \hat{u}_{2i}^2} \tag{3}$$

Gujarati, *Basic Econometrics*, Fourth Edition. Part One: Single-Equation Regression Models; Chapter 7: Multiple Regression Analysis: The Problem of Estimation. © The McGraw-Hill Companies, 2004.

Note that because the $u$'s are residuals, their mean values are zero. Using (1) and (2), we can write (3) as

$$a_1 = \frac{\sum (y_i - b_{13} x_{3i})(x_{2i} - b_{23} x_{3i})}{\sum (x_{2i} - b_{23} x_{3i})^2} \tag{4}$$

Expand the preceding expression, and note that

$$b_{23} = \frac{\sum x_{2i} x_{3i}}{\sum x_{3i}^2} \tag{5}$$

and

$$b_{13} = \frac{\sum y_i x_{3i}}{\sum x_{3i}^2} \tag{6}$$

Making these substitutions into (4), we get

$$a_1 = \frac{\sum y_i x_{2i} \sum x_{3i}^2 - \sum y_i x_{3i} \sum x_{2i} x_{3i}}{\sum x_{2i}^2 \sum x_{3i}^2 - \big(\sum x_{2i} x_{3i}\big)^2} = \hat{\beta}_2 \qquad (7.4.7)$$

which is the coefficient of PGNP in the multiple regression, thereby establishing the equality between the coefficients in (7.3.5) and (7.6.2).
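This equality can be checked numerically. The sketch below (simulated data; variable names mirror the derivation, the coefficients are illustrative assumptions) regresses $y$ and $x_2$ each on $x_3$, regresses the first set of residuals on the second, and compares the slope $a_1$ with $\hat{\beta}_2$ computed from Eq. (7.4.7):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
x3 = rng.normal(size=n)
x2 = 0.6 * x3 + rng.normal(size=n)          # x2 correlated with x3
y = 2.0 * x2 - 1.5 * x3 + rng.normal(size=n)  # simulated data

# Work in deviation form, as in the derivation.
y, x2, x3 = y - y.mean(), x2 - x2.mean(), x3 - x3.mean()

# (1) and (2): regress y and x2 on x3, keep the residuals.
b13 = (y @ x3) / (x3 @ x3)
b23 = (x2 @ x3) / (x3 @ x3)
u1 = y - b13 * x3
u2 = x2 - b23 * x3

# (3): slope from regressing u1 on u2.
a1 = (u1 @ u2) / (u2 @ u2)

# beta_2 from the multiple regression of y on x2 and x3, Eq. (7.4.7).
beta2 = ((y @ x2) * (x3 @ x3) - (y @ x3) * (x2 @ x3)) / \
        ((x2 @ x2) * (x3 @ x3) - (x2 @ x3) ** 2)

print(np.isclose(a1, beta2))  # True: the two coefficients coincide
```

This is the Frisch–Waugh result the appendix is illustrating: partialling $x_3$ out of both $y$ and $x_2$ and regressing the residuals reproduces the multiple-regression slope exactly.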