## Jointly Distributed Continuous Random Variables

Section 4.4 introduced joint distributions for discrete random variables. Many of the concepts and results discussed there extend quite naturally to the case of continuous random variables.

### Definitions

Let $X_1, X_2, \ldots, X_K$ be continuous random variables.

(i) Their joint cumulative distribution function, $F_{X_1, X_2, \ldots, X_K}(x_1, x_2, \ldots, x_K)$, expresses the probability that simultaneously $X_1$ is less than $x_1$, $X_2$ is less than $x_2$, and so on; that is,

$$F_{X_1, X_2, \ldots, X_K}(x_1, x_2, \ldots, x_K) = P(X_1 < x_1 \cap X_2 < x_2 \cap \cdots \cap X_K < x_K)$$

(ii) The cumulative distribution functions $F_{X_1}(x_1), F_{X_2}(x_2), \ldots, F_{X_K}(x_K)$ of the individual random variables are called their marginal distribution functions. For any $i$, $F_{X_i}(x_i)$ is the probability that the random variable $X_i$ does not exceed the specific value $x_i$.

(iii) The random variables are independent if and only if

$$F_{X_1, X_2, \ldots, X_K}(x_1, x_2, \ldots, x_K) = F_{X_1}(x_1) F_{X_2}(x_2) \cdots F_{X_K}(x_K)$$

The notion of statistical independence here is precisely the same as in the discrete case: independence of a set of random variables implies that the probability distribution of any one of them is unaffected by the values taken by the others. Thus, for example, the assertion that consecutive daily changes in the price of a share of common stock are independent of one another implies that information about past price changes is of no value in assessing what is likely to happen tomorrow.
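The factorization in (iii) can be checked numerically. The sketch below (not from the text; all variable names are illustrative) draws two independent uniform samples and compares the empirical joint CDF at one point with the product of the empirical marginals:

```python
# A numerical check of the independence factorization
# F(x, y) = F_X(x) * F_Y(y), using two independent uniform samples.
import random

random.seed(1)
n = 200_000
xs = [random.random() for _ in range(n)]
ys = [random.random() for _ in range(n)]

# Evaluate both sides of the factorization at the point (0.3, 0.7).
x0, y0 = 0.3, 0.7
joint = sum(1 for x, y in zip(xs, ys) if x < x0 and y < y0) / n
marginal_x = sum(1 for x in xs if x < x0) / n
marginal_y = sum(1 for y in ys if y < y0) / n

print(round(joint, 2), round(marginal_x * marginal_y, 2))
```

With independent draws the two printed values agree (both near 0.3 × 0.7 = 0.21), up to simulation noise.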

The notion of expectation extends to functions of jointly distributed continuous random variables. As in the case of discrete random variables, an important quantity of this kind is the covariance, which is used in assessing linear association between a pair of random variables.

### Definition

Let $X$ and $Y$ be a pair of continuous random variables, with respective means $\mu_X$ and $\mu_Y$. The expected value of $(X - \mu_X)(Y - \mu_Y)$ is called the covariance between $X$ and $Y$. That is,

$$\operatorname{Cov}(X, Y) = E[(X - \mu_X)(Y - \mu_Y)]$$

An alternative but equivalent expression is

$$\operatorname{Cov}(X, Y) = E(XY) - \mu_X \mu_Y$$

If the random variables X and Y are independent, then the covariance between them is 0. However, the converse is not necessarily true.
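The failure of the converse can be seen with a standard construction (a sketch, not from the text): take $X$ symmetric about 0 and $Y = X^2$, so $Y$ is completely determined by $X$, yet $\operatorname{Cov}(X, Y) = E[X^3] - E[X]E[X^2] = 0$.

```python
# Zero covariance does not imply independence: with X uniform on (-1, 1)
# and Y = X^2, the covariance is 0 although Y is a function of X.
import random

random.seed(2)
n = 200_000
xs = [random.uniform(-1, 1) for _ in range(n)]
ys = [x * x for x in xs]

mean_x = sum(xs) / n
mean_y = sum(ys) / n
cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / n
print(round(cov, 3))  # close to 0 despite complete dependence
```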

The results in Section 4.4 on means and variances of sums and differences of discrete random variables also hold for continuous random variables. For convenience, they are repeated here.

### Sums and Differences of Random Variables

Let $X_1, X_2, \ldots, X_K$ be $K$ random variables with means $\mu_1, \mu_2, \ldots, \mu_K$ and variances $\sigma_1^2, \sigma_2^2, \ldots, \sigma_K^2$. The following properties hold:

(i) The mean of their sum is the sum of their means; that is,

$$E(X_1 + X_2 + \cdots + X_K) = \mu_1 + \mu_2 + \cdots + \mu_K$$

(ii) If the covariance between every pair of these random variables is 0, then the variance of their sum is the sum of their variances; that is,

$$\operatorname{Var}(X_1 + X_2 + \cdots + X_K) = \sigma_1^2 + \sigma_2^2 + \cdots + \sigma_K^2$$

Let $X$ and $Y$ be a pair of random variables with means $\mu_X$ and $\mu_Y$ and variances $\sigma_X^2$ and $\sigma_Y^2$. The following properties hold:

(iii) The mean of their difference is the difference of their means; that is,

$$E(X - Y) = \mu_X - \mu_Y$$

(iv) If the covariance between $X$ and $Y$ is 0, then the variance of their difference is the sum of their variances; that is,

$$\operatorname{Var}(X - Y) = \sigma_X^2 + \sigma_Y^2$$

The results (ii) and (iv) hold only if the covariance between the random variables is zero. More generally, if $X$ and $Y$ are a pair of random variables with variances $\sigma_X^2$ and $\sigma_Y^2$, and covariance $\operatorname{Cov}(X, Y)$, it can be shown that

$$\operatorname{Var}(X + Y) = \sigma_X^2 + \sigma_Y^2 + 2\operatorname{Cov}(X, Y)$$

and

$$\operatorname{Var}(X - Y) = \sigma_X^2 + \sigma_Y^2 - 2\operatorname{Cov}(X, Y)$$
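The general identity for the variance of a sum can be verified numerically. In the sketch below (an illustration, not from the text), correlated draws are built by setting $Y = 0.5X + Z$ with $X$ and $Z$ independent, and the two sides of the identity are compared:

```python
# Check of Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y) on simulated data.
# With the same divisor n on both sides, the identity holds exactly for
# sample moments, not merely approximately.
import random

random.seed(3)
n = 100_000
xs = [random.gauss(0, 1) for _ in range(n)]
zs = [random.gauss(0, 1) for _ in range(n)]
ys = [0.5 * x + z for x, z in zip(xs, zs)]  # correlated with xs

def var(v):
    m = sum(v) / len(v)
    return sum((t - m) ** 2 for t in v) / len(v)

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((p - ma) * (q - mb) for p, q in zip(a, b)) / len(a)

lhs = var([x + y for x, y in zip(xs, ys)])
rhs = var(xs) + var(ys) + 2 * cov(xs, ys)
print(round(lhs, 6), round(rhs, 6))  # the two sides agree
```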

### Example

A contractor is uncertain of the precise total costs for either materials or labor for a project. It is believed that materials costs can be represented by a random variable with mean \$100,000 and standard deviation \$10,000. Labor costs are \$1,500 a day, and the number of days needed to complete the project can be represented by a random variable with mean 80 and standard deviation 12. Assuming that materials and labor costs are independent, what are the mean and standard deviation of total project cost (materials plus labor)?

Let the random variables $X_1$ and $X_2$ denote, respectively, materials and labor costs. Then $X_1$ has mean $\mu_1 = 100{,}000$ and standard deviation $\sigma_1 = 10{,}000$. For the random variable $X_2$,

$$\mu_2 = (1{,}500)(80) = 120{,}000 \quad \text{and} \quad \sigma_2 = (1{,}500)(12) = 18{,}000$$

Since total project cost is $X_1 + X_2$, we have mean cost

$$\mu_1 + \mu_2 = 100{,}000 + 120{,}000 = \$220{,}000$$

and, since $X_1$ and $X_2$ are independent, the variance of their sum is

$$\sigma_1^2 + \sigma_2^2 = (10{,}000)^2 + (18{,}000)^2 = 424{,}000{,}000$$

Taking the square root, we find the standard deviation to be \$20,591.
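The contractor calculation is short enough to reproduce directly (a sketch using the figures from the example; independence is assumed so the variances add):

```python
# Mean and standard deviation of total project cost = materials + labor.
import math

mu1, sigma1 = 100_000, 10_000   # materials cost: mean and sd
mu2 = 1_500 * 80                # labor: $1,500/day times mean of 80 days
sigma2 = 1_500 * 12             # labor: $1,500/day times sd of 12 days

mean_total = mu1 + mu2                    # means add
var_total = sigma1 ** 2 + sigma2 ** 2     # variances add under independence
sd_total = math.sqrt(var_total)
print(mean_total, var_total, round(sd_total))  # 220000 424000000 20591
```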

Like Example 4.7, this example illustrates the reduction in risk that can follow from the diversification of investments. An investor has \$1,000, which can be allocated in any proportions to two alternative investments. The returns per dollar on these investments will be denoted by the random variables $X$ and $Y$. It will be assumed that these random variables have the same mean, $\mu$, and the same variance, $\sigma^2$, and that they are independent of one another. Suppose that the investor chooses to allocate \$a to the first investment, so that \$(1,000 − a) is allocated to the second. We now compare the merits of alternative allocations. The total return on the investment is

$$aX + (1{,}000 - a)Y$$

This random variable has expected value

$$E[aX + (1{,}000 - a)Y] = a\mu + (1{,}000 - a)\mu = 1{,}000\mu$$

This expected return is the same whatever choice of $a$ is made. The variance of the total return is

$$\operatorname{Var}[aX + (1{,}000 - a)Y] = a^2\sigma^2 + (1{,}000 - a)^2\sigma^2$$

Notice that, if either $a = 0$ or $a = 1{,}000$, so that the entire \$1,000 is allocated to just one of the investments, the variance of total return is $1{,}000{,}000\sigma^2$. However, if \$500 is allocated to each investment, so that $a = 500$, the variance of total return is $500{,}000\sigma^2$. (This is the smallest possible value in this example.) Thus, by spreading the investment this way, as compared with allocating everything to just one of the possibilities, the investor can achieve the same expected return but a much smaller variance, that is, a much lower level of risk.
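The dependence of risk on the allocation can be tabulated (a sketch under the example's assumptions of equal variances and independence; the factor below multiplies $\sigma^2$):

```python
# Variance of the total return, in units of sigma^2, as a function of the
# allocation a: a^2 + (1,000 - a)^2. The minimum is at a = 500.
def var_factor(a):
    return a ** 2 + (1_000 - a) ** 2

for a in (0, 250, 500, 750, 1_000):
    print(a, var_factor(a))
# 0 1000000
# 250 625000
# 500 500000
# 750 625000
# 1000 1000000
```

The table makes the symmetry visible: allocations equidistant from \$500 carry the same risk, and the even split halves the variance of the all-in strategy.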