## Example

GDP GROWTH RATE, 1960-1985 AND RELATIVE PER CAPITA GDP, IN 119 DEVELOPING COUNTRIES

As an additional economic example of the polynomial regression model, consider the following regression results20:

where GDPG = GDP growth rate, percent (average for 1960-1985), and RGDP = relative per capita GDP, 1960 (percentage of U.S. GDP per capita, 1960). The adjusted R2 (adj R2) tells us that, after taking into account the number of regressors, the model explains only about 3.6 percent of the variation in GDPG. Even the unadjusted R2 of 0.053 seems low. This might seem a disappointingly low value but, as we shall show in the next chapter, such low R2 values are frequently encountered in cross-sectional data with a large number of observations. Besides, even an apparently low R2 value can be statistically significant (i.e., significantly different from zero), as we will also show in the next chapter.

As this regression shows, GDPG in developing countries increased as RGDP increased, but at a decreasing rate; that is, developing economies were not catching up with advanced economies.21 This example shows how relatively simple econometric models can be used to shed light on important economic phenomena.
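A quadratic specification of this kind can be sketched as follows. The data below are synthetic stand-ins, not the World Bank sample, and the data-generating coefficients are illustrative only; the point is how the fitted coefficients yield the turning point where dGDPG/dRGDP = 0:

```python
import numpy as np

# Synthetic stand-in data for 119 "countries" (the actual World Bank sample is not reproduced here)
rng = np.random.default_rng(0)
rgdp = rng.uniform(0.02, 0.9, size=119)          # relative per capita GDP (fraction of U.S. level)
gdpg = 1.0 + 8.0 * rgdp - 8.0 * rgdp**2 + rng.normal(0.0, 0.5, size=119)

# Fit GDPG = b0 + b1*RGDP + b2*RGDP^2 by least squares
b2, b1, b0 = np.polyfit(rgdp, gdpg, deg=2)       # polyfit returns the highest-degree coefficient first

# dGDPG/dRGDP = b1 + 2*b2*RGDP = 0  =>  turning point at RGDP = -b1/(2*b2)
turning_point = -b1 / (2.0 * b2)
print(f"turning point at RGDP = {turning_point:.3f}")  # true turning point in this DGP is 0.5
```

With a negative coefficient on the squared term, growth rises with RGDP at a decreasing rate and peaks at the turning point, mirroring the interpretation in the text.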

## *7.11 PARTIAL CORRELATION COEFFICIENTS

### Explanation of Simple and Partial Correlation Coefficients

In Chapter 3 we introduced the coefficient of correlation r as a measure of the degree of linear association between two variables. For the three-variable

20Source: The East Asian Economic Miracle: Economic Growth and Public Policy, A World Bank Policy Research Report, Oxford University Press, U.K., 1993, p. 29.

21If you take the derivative of (7.10.7) with respect to RGDP, you will obtain dGDPG/dRGDP, showing that the rate of change of GDPG with respect to RGDP is declining. If you set this derivative to zero, you will get RGDP ≈ 0.5082. Thus, if a country's GDP reaches about 51 percent of the U.S. GDP, the rate of growth of GDPG will crawl to zero.

*Optional.

230 PART ONE: SINGLE-EQUATION REGRESSION MODELS

regression model we can compute three correlation coefficients: r12 (correlation coefficient between Y and X2), r13 (correlation coefficient between Y and X3), and r23 (correlation coefficient between X2 and X3); notice that we are letting the subscript 1 represent Y for notational convenience. These correlation coefficients are called gross or simple correlation coefficients, or correlation coefficients of zero order. They can be computed from the definition of the correlation coefficient given in (3.5.13).
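As a small sketch (with made-up data, not taken from the text), the three zero-order coefficients can be read off the correlation matrix of Y, X2, and X3:

```python
import numpy as np

# Made-up illustrative data in which X3 influences both X2 and Y
rng = np.random.default_rng(42)
x3 = rng.normal(size=200)
x2 = 0.6 * x3 + rng.normal(size=200)
y = 1.0 + 0.5 * x2 + 0.5 * x3 + rng.normal(size=200)

# Rows ordered as 1 = Y, 2 = X2, 3 = X3, matching the subscript convention in the text
R = np.corrcoef(np.vstack([y, x2, x3]))
r12, r13, r23 = R[0, 1], R[0, 2], R[1, 2]
print(f"r12 = {r12:.3f}, r13 = {r13:.3f}, r23 = {r23:.3f}")
```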

But now consider this question: does, say, r12 in fact measure the "true" degree of (linear) association between Y and X2 when a third variable X3 may be associated with both of them? This question is analogous to the following question: suppose the true regression model is (7.1.1) but we omit the variable X3 from the model and simply regress Y on X2, obtaining the slope coefficient of, say, b12. Will this coefficient be equal to the true coefficient β2 if the model (7.1.1) were estimated to begin with? The answer should be apparent from our discussion in Section 7.7. In general, r12 is not likely to reflect the true degree of association between Y and X2 in the presence of X3. As a matter of fact, it is likely to give a false impression of the nature of association between Y and X2, as will be shown shortly. Therefore, what we need is a correlation coefficient that is independent of the influence, if any, of X3 on X2 and Y. Such a correlation coefficient can be obtained and is known, appropriately, as the partial correlation coefficient. Conceptually, it is similar to the partial regression coefficient. We define

r12.3 = partial correlation coefficient between Y and X2, holding X3 constant
r13.2 = partial correlation coefficient between Y and X3, holding X2 constant
r23.1 = partial correlation coefficient between X2 and X3, holding Y constant
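The omitted-variable point made above can be sketched numerically: when X3 is correlated with X2 and is left out, the short-regression slope on X2 drifts away from the true coefficient. The data here are made up for illustration:

```python
import numpy as np

# Made-up data: true model Y = 1 + 0.5*X2 + 0.5*X3, with X2 and X3 correlated
rng = np.random.default_rng(1)
x3 = rng.normal(size=5000)
x2 = 0.6 * x3 + rng.normal(size=5000)
y = 1.0 + 0.5 * x2 + 0.5 * x3 + rng.normal(size=5000)

# Full regression of Y on X2 and X3: the slope on X2 is close to the true 0.5
X_full = np.column_stack([np.ones_like(x2), x2, x3])
beta_full = np.linalg.lstsq(X_full, y, rcond=None)[0]

# Short regression of Y on X2 alone: the slope b12 absorbs part of X3's effect
X_short = np.column_stack([np.ones_like(x2), x2])
beta_short = np.linalg.lstsq(X_short, y, rcond=None)[0]

print(f"full-model slope on X2: {beta_full[1]:.3f}")
print(f"short-model slope b12:  {beta_short[1]:.3f}")
```

The short-regression slope exceeds the true coefficient here because X3 has a positive effect on Y and is positively correlated with X2, exactly the kind of false impression the text warns about.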

These partial correlations can be easily obtained from the simple, or zero-order, correlation coefficients as follows (for proofs, see the exercises)22:

r12.3 = (r12 − r13 r23) / √[(1 − r13²)(1 − r23²)]
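The partial correlation r12.3 can be computed directly from the three zero-order coefficients; a minimal sketch (the function name is ours):

```python
import math

def partial_corr_12_3(r12, r13, r23):
    """r12.3: correlation between Y and X2 after removing the influence of X3,
    computed from the zero-order correlation coefficients."""
    return (r12 - r13 * r23) / math.sqrt((1.0 - r13**2) * (1.0 - r23**2))

# If X3 is uncorrelated with both Y and X2, the partial equals the simple correlation
print(partial_corr_12_3(0.4, 0.0, 0.0))  # 0.4
```

Note the sanity check in the last line: with r13 = r23 = 0 there is no influence of X3 to remove, so r12.3 collapses to r12.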