## Properties of Conditional Expectation and Conditional Variance

1. If f(X) is a function of X, then E[f(X) | X] = f(X); that is, a function of X behaves as a constant in the computation of its expectation conditional on X. Thus, E(X^3 | X) = X^3; this is because, if X is known, X^3 is also known.

2. If f(X) and g(X) are functions of X, then E[f(X)Y + g(X) | X] = f(X)E(Y | X) + g(X). For example, E[XY + cX^2 | X] = XE(Y | X) + cX^2, where c is a constant.
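As a quick numerical illustration of this property, the Python sketch below uses a made-up discrete joint distribution (the values are hypothetical, not from the text) and checks that E[XY + cX^2 | X = x] = xE(Y | X = x) + cx^2 for each value of x:

```python
# Illustration (hypothetical numbers) that functions of X factor out of a
# conditional expectation: E[X*Y + c*X**2 | X = x] = x*E(Y | X = x) + c*x**2.
c = 2.0
joint = {  # p(x, y) for a small made-up joint distribution
    (1, 0): 0.10, (1, 2): 0.20,
    (3, 0): 0.40, (3, 2): 0.30,
}

# Marginal probabilities p(x)
px = {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p

for x in px:
    # Conditional distribution of Y given X = x
    probs = {y: p / px[x] for (xx, y), p in joint.items() if xx == x}
    # Left side: condition on X = x and average X*Y + c*X**2 directly
    lhs = sum(q * (x * y + c * x ** 2) for y, q in probs.items())
    # Right side: pull the functions of X outside the conditional expectation
    e_y_given_x = sum(q * y for y, q in probs.items())
    rhs = x * e_y_given_x + c * x ** 2
    assert abs(lhs - rhs) < 1e-12
```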

3. If X and Y are independent, E(Y | X) = E(Y). That is, if X and Y are independent random variables, the conditional expectation of Y, given X, is the same as the unconditional expectation of Y.

4. The law of iterated expectations. It is interesting to note the following relation between the unconditional expectation of a random variable Y, E(Y), and its conditional expectation based on another random variable X, E(Y | X):

E(Y) = E_X[E(Y | X)]

This is known as the law of iterated expectations, which in the present context states that the marginal, or unconditional, expectation of Y is equal to the expectation of its conditional expectation, the symbol E_X denoting that the expectation is taken over the values of X. Put simply, this law states that if we first obtain E(Y | X) as a function of X and then take its expected value over the distribution of X values, we wind up with E(Y), the unconditional expectation of Y. The reader can verify this using the data given in Example 4.
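The law can also be checked directly on any small discrete joint distribution. The Python sketch below uses a made-up joint PMF (not the data of Example 4) and confirms that averaging E(Y | X) over the distribution of X reproduces E(Y):

```python
# Numerical check of the law of iterated expectations, E(Y) = E_X[E(Y | X)],
# on a small hypothetical discrete joint PMF (illustrative values only).
joint = {  # p(x, y)
    (1, 0): 0.10, (1, 1): 0.20,
    (2, 0): 0.25, (2, 1): 0.15,
    (3, 0): 0.10, (3, 1): 0.20,
}

# Unconditional expectation E(Y)
ey = sum(p * y for (x, y), p in joint.items())

# Marginal p(x) and conditional expectations E(Y | X = x)
px = {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
e_y_given_x = {
    x: sum(p * y for (xx, y), p in joint.items() if xx == x) / px[x]
    for x in px
}

# Average E(Y | X) over the distribution of X values
ey_iterated = sum(px[x] * e_y_given_x[x] for x in px)

assert abs(ey - ey_iterated) < 1e-12
```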


5. If X and Y are independent, then var (Y | X) = var (Y).

6. var (Y) = E[var(Y | X)] + var[E(Y | X)]; that is, the (unconditional) variance of Y is equal to the expectation of the conditional variance of Y plus the variance of the conditional expectation of Y.
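This variance decomposition can likewise be verified numerically. The sketch below computes both sides of var(Y) = E[var(Y | X)] + var[E(Y | X)] for an illustrative, made-up joint PMF:

```python
# Numerical check of var(Y) = E[var(Y | X)] + var[E(Y | X)] on a small
# hypothetical discrete joint PMF (illustrative values only).
joint = {  # p(x, y)
    (1, 0): 0.10, (1, 1): 0.20,
    (2, 0): 0.40, (2, 1): 0.30,
}

# Marginal probabilities p(x)
px = {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p

# Unconditional moments of Y
ey = sum(p * y for (_, y), p in joint.items())
var_y = sum(p * (y - ey) ** 2 for (_, y), p in joint.items())

# Conditional mean E(Y | X = x) and conditional variance var(Y | X = x)
cond_mean, cond_var = {}, {}
for x in px:
    probs = {y: p / px[x] for (xx, y), p in joint.items() if xx == x}
    m = sum(q * y for y, q in probs.items())
    cond_mean[x] = m
    cond_var[x] = sum(q * (y - m) ** 2 for y, q in probs.items())

e_cond_var = sum(px[x] * cond_var[x] for x in px)      # E[var(Y | X)]
e_cm = sum(px[x] * cond_mean[x] for x in px)           # E[E(Y | X)] = E(Y)
var_cond_mean = sum(px[x] * (cond_mean[x] - e_cm) ** 2
                    for x in px)                       # var[E(Y | X)]

assert abs(var_y - (e_cond_var + var_cond_mean)) < 1e-12
```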

### Higher Moments of Probability Distributions

Although the mean, variance, and covariance are the most frequently used summary measures of univariate and multivariate PDFs, we occasionally need to consider higher moments of the PDFs, such as the third and the fourth moments. The third and fourth moments of a univariate PDF f(x) around its mean value (μ) are defined as

Third moment: E(X − μ)^3
Fourth moment: E(X − μ)^4

In general, the rth moment about the mean is defined as

rth moment: E(X − μ)^r

The third and fourth moments of a distribution are often used in studying the "shape" of a probability distribution, in particular, its skewness, S (i.e., lack of symmetry) and kurtosis, K (i.e., tallness or flatness), as shown in Figure A.3.

One measure of skewness is defined as

S = E(X − μ)^3 / σ^3

that is, the third moment about the mean divided by the cube of the standard deviation.
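As a quick illustration, the Python sketch below computes this skewness measure for a small made-up sample with a long right tail, using population-style formulas (dividing by n):

```python
# Sample version of the skewness measure S = E(X - mu)**3 / sigma**3,
# computed on illustrative data (population formulas, dividing by n).
data = [2.0, 3.0, 3.0, 4.0, 10.0]
n = len(data)
mu = sum(data) / n
m2 = sum((x - mu) ** 2 for x in data) / n  # second central moment (variance)
m3 = sum((x - mu) ** 3 for x in data) / n  # third central moment
S = m3 / m2 ** 1.5                         # skewness measure

# The long right tail (the value 10) makes the distribution right-skewed,
# so S comes out positive; a symmetric sample would give S = 0.
assert S > 0
```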

A commonly used measure of kurtosis is given by 