## Summary And Conclusions

1. This chapter introduced the simplest possible multiple linear regression model, namely, the three-variable regression model. It is understood that the term linear refers to linearity in the parameters and not necessarily in the variables.

2. Although a three-variable regression model is in many ways an extension of the two-variable model, it involves some new concepts, such as partial regression coefficients, partial correlation coefficients, the multiple correlation coefficient, adjusted and unadjusted (for degrees of freedom) R², multicollinearity, and specification bias.
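
To make these ideas concrete, here is a minimal sketch (in Python with NumPy, using made-up data rather than any dataset from the text) of estimating a three-variable regression by OLS and computing both the unadjusted and the degrees-of-freedom-adjusted R²:

```python
import numpy as np

# Illustrative synthetic data (not from the text):
# Y depends on two regressors, X2 and X3, plus a disturbance.
rng = np.random.default_rng(0)
n = 50
X2 = rng.uniform(0, 10, n)
X3 = rng.uniform(0, 5, n)
Y = 2.0 + 1.5 * X2 - 0.8 * X3 + rng.normal(0, 1, n)

# Design matrix with intercept: Y_i = b1 + b2*X2_i + b3*X3_i + u_i
X = np.column_stack([np.ones(n), X2, X3])
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)   # OLS estimates (b1, b2, b3)

# Unadjusted and adjusted R^2
resid = Y - X @ beta
rss = resid @ resid                       # residual sum of squares
tss = ((Y - Y.mean()) ** 2).sum()         # total sum of squares
k = X.shape[1]                            # number of parameters (3)
r2 = 1 - rss / tss
r2_adj = 1 - (rss / (n - k)) / (tss / (n - 1))
print(beta, r2, r2_adj)
```

With more than one parameter, the adjusted R² is always below the unadjusted R², reflecting the degrees-of-freedom penalty.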

3. This chapter also considered the functional forms of the multiple regression model, such as the Cobb-Douglas production function and the polynomial regression model.
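
The Cobb-Douglas function Y = β1·X2^β2·X3^β3·e^u is nonlinear in the variables but, after taking logarithms, linear in the parameters and hence estimable by OLS. A sketch with invented labor and capital figures (no data from the text):

```python
import numpy as np

# Cobb-Douglas: Y = b1 * X2**b2 * X3**b3 * exp(u).
# Taking logs: ln Y = ln b1 + b2 ln X2 + b3 ln X3 + u,
# which is linear in the parameters.
rng = np.random.default_rng(1)
n = 100
labor = rng.uniform(50, 200, n)      # hypothetical X2 (labor input)
capital = rng.uniform(20, 100, n)    # hypothetical X3 (capital input)
output = 1.2 * labor**0.7 * capital**0.3 * np.exp(rng.normal(0, 0.05, n))

# Regress ln(output) on ln(labor) and ln(capital) with an intercept.
X = np.column_stack([np.ones(n), np.log(labor), np.log(capital)])
beta, *_ = np.linalg.lstsq(X, np.log(output), rcond=None)
b1, b2, b3 = np.exp(beta[0]), beta[1], beta[2]
# b2 and b3 are the output elasticities; b2 + b3 measures returns to scale.
print(b1, b2, b3)
```

The estimated elasticities recover the values used to generate the data (0.7 and 0.3), and their sum near 1 indicates roughly constant returns to scale in this simulated example.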

4. Although R² and adjusted R² are overall measures of how well the chosen model fits a given set of data, their importance should not be overstated. What is critical is the underlying theoretical expectation about the model in terms of the a priori signs of the coefficients of the variables entering the model and, as shown in the following chapter, their statistical significance.
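
One reason not to over-rely on R² is that it can never decrease when a regressor is added, however irrelevant that regressor may be; adjusted R² imposes a degrees-of-freedom penalty for each added parameter. A small illustration with simulated data (all variable names and numbers here are invented):

```python
import numpy as np

def fit_r2(X, Y):
    """OLS fit; return (R^2, adjusted R^2)."""
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ beta
    rss = resid @ resid
    tss = ((Y - Y.mean()) ** 2).sum()
    n, k = X.shape
    r2 = 1 - rss / tss
    r2_adj = 1 - (rss / (n - k)) / (tss / (n - 1))
    return r2, r2_adj

rng = np.random.default_rng(2)
n = 30
x = rng.uniform(0, 10, n)
y = 1.0 + 2.0 * x + rng.normal(0, 2, n)
junk = rng.normal(size=n)            # regressor unrelated to y

X_small = np.column_stack([np.ones(n), x])
X_big = np.column_stack([np.ones(n), x, junk])
r2_s, adj_s = fit_r2(X_small, y)
r2_b, adj_b = fit_r2(X_big, y)
# R^2 never falls when a regressor is added; adjusted R^2 can fall,
# which is why it is the better guide when comparing models.
print(r2_s, r2_b, adj_s, adj_b)
```

A high R² obtained by piling on regressors is therefore no substitute for coefficients that carry the theoretically expected signs and are statistically significant.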

5. The results presented in this chapter can be easily generalized to a multiple linear regression model involving any number of regressors, but the algebra becomes very tedious. This tedium can be avoided by resorting to matrix algebra. For the interested reader, the extension to the k-variable regression model using matrix algebra is presented in Appendix C, which is optional; the general reader can follow the remainder of the text without knowing much matrix algebra.

## Exercises

### Questions

7.1. Consider the data in Table 7.5.

TABLE 7.5