F = \frac{R^2/(k-1)}{(1-R^2)/(n-k)}  \qquad (8.5.13)


If F > F_α(k − 1, n − k), reject H₀; otherwise you may accept H₀, where F_α(k − 1, n − k) is the critical F value at the α level of significance with (k − 1) numerator df and (n − k) denominator df. Alternatively, if the p value of the F obtained from (8.5.13) is sufficiently low, reject H₀.
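This decision rule is easy to mechanize. The following is a minimal Python sketch, assuming scipy is available; the function name overall_f_from_r2 is our own label, not something from the text.

```python
from scipy.stats import f

def overall_f_from_r2(r2, n, k):
    """Overall-significance F statistic, computed from R-squared as in (8.5.13).

    r2 : coefficient of determination of the fitted model
    n  : number of observations
    k  : number of parameters, including the intercept
    """
    f_stat = (r2 / (k - 1)) / ((1 - r2) / (n - k))
    p_value = f.sf(f_stat, k - 1, n - k)   # P(F > f_stat) under H0: all slope coefficients are zero
    return f_stat, p_value
```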

Before moving on, return to Example 7.5 in Chapter 7. From regression (7.10.7) we observe that RGDP (relative per capita GDP) and RGDP squared explain only about 5.3 percent of the variation in GDPG (GDP growth rate) in a sample of 119 countries. This R² of 0.053 seems a "low" value. Is it really statistically different from zero? How do we find that out?

Recall our earlier discussion in "An Important Relationship between R2 and F" about the relationship between R² and the F value as given in (8.5.11) or (8.5.12) for the specific case of two regressors. As noted, if R² is zero, then F is zero ipso facto, which will be the case if the regressors have no impact whatsoever on the regressand. Therefore, if we insert R² = 0.053 into formula (8.5.12), we obtain

F = \frac{0.053/2}{(1 - 0.053)/116} \approx 3.25

Under the null hypothesis that R² = 0, the preceding F value follows the F distribution with 2 and 116 df in the numerator and denominator, respectively. (Note: There are 119 observations and two regressors.) From the F table we see that this F value is significant at about the 5 percent level; the p value is actually 0.0425. Therefore, we can reject the null hypothesis that the two regressors have no impact on the regressand, notwithstanding the fact that the R² is only 0.053.
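Plugging this example's numbers (R² = 0.053, n = 119, k = 3 parameters including the intercept) into the sketch given after the decision rule above reproduces these figures:

```python
f_stat, p_value = overall_f_from_r2(r2=0.053, n=119, k=3)
print(f"F = {f_stat:.4f}, p = {p_value:.4f}")   # roughly F = 3.25, p = 0.04
```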

This example brings out an important empirical observation that in cross-sectional data involving several observations, one generally obtains low R² because of the diversity of the cross-sectional units. Therefore, one should not be surprised or worried about finding low R²'s in cross-sectional regressions. What is relevant is that the model is correctly specified, that the regressors have the correct (i.e., theoretically expected) signs, and that (hopefully) the regression coefficients are statistically significant. The reader should check that individually both the regressors in (7.10.7) are statistically significant at the 5 percent or better level (i.e., lower than 5 percent).

The "Incremental" or "Marginal" Contribution of an Explanatory Variable

In Chapter 7 we stated that generally we cannot allocate the R² value among the various regressors. In our child mortality example we found that the R² was 0.7077 but we cannot say what part of this value is due to the regressor PGNP and what part is due to female literacy rate (FLR) because of possible correlation between the two regressors in the sample at hand. We can shed more light on this using the analysis of covariance technique.


For our illustrative example we found that individually X₂ (PGNP) and X₃ (FLR) were statistically significant on the basis of (separate) t tests. We have also found that on the basis of the F test collectively both the regressors have a significant effect on the regressand Y (child mortality).

Now suppose we introduce PGNP and FLR sequentially; that is, we first regress child mortality on PGNP and assess its significance and then add FLR to the model to find out whether it contributes anything (of course, the order in which PGNP and FLR enter can be reversed). By contribution we mean whether the addition of the variable to the model increases ESS (and hence R²) "significantly" in relation to the RSS. This contribution may appropriately be called the incremental, or marginal, contribution of an explanatory variable.

The topic of incremental contribution is an important one in practice. In most empirical investigations the researcher may not be completely sure whether it is worth adding an X variable to the model knowing that several other X variables are already present in the model. One does not wish to include variable(s) that contribute very little toward ESS. By the same token, one does not want to exclude variable(s) that substantially increase ESS. But how does one decide whether an X variable significantly reduces RSS? The analysis of variance technique can be easily extended to answer this question.
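As a concrete illustration of this idea, here is a sketch of the incremental F test in Python using statsmodels; the file name child_mortality.csv and the column names CM, PGNP, and FLR are placeholders for whatever data set is at hand, not references taken from the text.

```python
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import f

# Hypothetical data set; the file and column names are placeholders.
data = pd.read_csv("child_mortality.csv")            # assumed columns: CM, PGNP, FLR

restricted   = smf.ols("CM ~ PGNP", data=data).fit()          # old model
unrestricted = smf.ols("CM ~ PGNP + FLR", data=data).fit()    # model with the added regressor

m = restricted.df_resid - unrestricted.df_resid       # number of regressors added (here 1)
f_stat = ((restricted.ssr - unrestricted.ssr) / m) / (unrestricted.ssr / unrestricted.df_resid)
p_value = f.sf(f_stat, m, unrestricted.df_resid)
print(f"incremental F = {f_stat:.4f}, p = {p_value:.4g}")
```

A small p value says that adding FLR reduces RSS (equivalently, raises ESS) by more than chance alone would allow; the same comparison can also be obtained with statsmodels' anova_lm applied to the two fitted models.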

Suppose we first regress child mortality on PGNP and obtain the following regression:

As these results show, PGNP has a significant effect on CM. The ANOVA table corresponding to the preceding regression is given in Table 8.5.

Assuming the disturbances u_i are normally distributed, and under the hypothesis that PGNP has no effect on CM, we obtain the F value of

F = \frac{60,449.5}{4,890.7822} \approx 12.36  \qquad (8.5.15)

TABLE 8.5 ANOVA TABLE FOR REGRESSION (8.5.14)

Source of variation          SS           df    MSS
ESS (due to regression)       60,449.5     1    60,449.5
RSS (due to residuals)       303,228.5    62    4,890.7822
Total                        363,678      63


which follows the F distribution with 1 and 62 df. This F value is highly significant, as the computed p value is 0.0008. Thus, as before, we reject the hypothesis that PGNP has no effect on CM. Incidentally, note that t² = (−3.5156)² = 12.3594, which is approximately the same as the F value of (8.5.15), where the t value is obtained from (8.5.14). But this should not be surprising in view of the fact that the square of the t statistic with n df is equal to the F value with 1 df in the numerator and n df in the denominator, a relationship first established in Chapter 5. Note that in the present example, n = 62, the residual df of regression (8.5.14).
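That t²–F correspondence is easy to verify numerically; a small sketch, again assuming scipy is available:

```python
from scipy.stats import t, f

t_stat, df = -3.5156, 62                  # slope t statistic and residual df from (8.5.14)
p_t = 2 * t.sf(abs(t_stat), df)           # two-sided p value of the t test
p_f = f.sf(t_stat**2, 1, df)              # p value of F = t^2 with (1, df) df
print(t_stat**2, p_t, p_f)                # 12.359..., and the two p values agree (about 0.0008)
```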

Having run the regression (8.5.14), let us suppose we decide to add FLR to the model and obtain the multiple regression (8.2.1). The questions we want to answer are:

1. What is the marginal, or incremental, contribution of FLR, knowing that PGNP is already in the model and that it is significantly related to CM?

2. Is the incremental contribution of FLR statistically significant?

3. What is the criterion for adding variables to the model?

The preceding questions can be answered by the ANOVA technique. To see this, let us construct Table 8.6. In this table X₂ refers to PGNP and X₃ refers to FLR.

To assess the incremental contribution of X₃ after allowing for the contribution of X₂, we form

TABLE 8.6 ANOVA TABLE TO ASSESS INCREMENTAL CONTRIBUTION OF A VARIABLE(S)

