
= σ²I, where I is an n x n identity matrix.

Matrix (C.2.2) [and its representation given in (C.2.3)] is called the variance-covariance matrix of the disturbances u; the elements on the main diagonal of this matrix (running from the upper left corner to the lower right corner) give the variances, and the elements off the main diagonal give the covariances.4 Note that the variance-covariance matrix is symmetric: The elements above and below the main diagonal are reflections of one another.
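To make this structure concrete, the following NumPy sketch (with a hypothetical variance value) builds the variance-covariance matrix in its σ²I form and verifies the properties just described: symmetry, equal variances on the main diagonal, and zero covariances off the diagonal:

```python
import numpy as np

# Hypothetical illustration: under homoscedasticity and no serial correlation,
# the variance-covariance matrix of the disturbances reduces to sigma^2 * I.
sigma2 = 4.0          # assumed common variance of the disturbances
n = 5                 # number of observations (arbitrary choice)

var_cov = sigma2 * np.eye(n)   # sigma^2 on the main diagonal, zeros elsewhere

# Symmetry: elements above and below the main diagonal mirror each other.
assert np.allclose(var_cov, var_cov.T)

print(np.diag(var_cov))   # the (equal) variances on the main diagonal
print(var_cov[0, 1])      # an off-diagonal element: a (zero) covariance
```

Under the classical assumptions every diagonal entry is the same σ², which is exactly what homoscedasticity means in matrix form.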

Assumption 3 states that the n x k matrix X is nonstochastic; that is, it consists of fixed numbers. As noted previously, our regression analysis is conditional regression analysis, conditional upon the fixed values of the X variables.

Assumption 4 states that the X matrix has full column rank equal to k, the number of columns in the matrix. This means that the columns of the X matrix are linearly independent; that is, there is no exact linear relationship among the X variables. In other words, there is no multicollinearity. In scalar notation, this is equivalent to saying that there exists no set of numbers λ1, λ2, ..., λk, not all zero, such that [cf. (7.1.8)]

λ1X1i + λ2X2i + ... + λkXki = 0 (C.2.4)

where X1i = 1 for all i (to allow for the column of 1's in the X matrix). In matrix notation, (C.2.4) can be represented as λ'x = 0 (C.2.5)

where λ' is a 1 x k row vector and x is a k x 1 column vector.
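As a hypothetical numerical illustration of (C.2.4) and (C.2.5), the NumPy sketch below constructs an X matrix whose third column is an exact linear combination of the first two, and verifies that a non-zero λ vector annihilates every observation row; all data values are made up for the example:

```python
import numpy as np

# Hypothetical collinear design: X3 = X1 + 2*X2 exactly, so the non-zero
# vector lam = (1, 2, -1) satisfies lam1*X1i + lam2*X2i + lam3*X3i = 0
# for every observation i, as in (C.2.4).
X1 = np.ones(5)                          # column of 1's (so X1i = 1 for all i)
X2 = np.array([2.0, 4.0, 1.0, 3.0, 5.0])
X3 = X1 + 2.0 * X2                       # exact linear relationship
X = np.column_stack([X1, X2, X3])

lam = np.array([1.0, 2.0, -1.0])
print(X @ lam)                           # zeros: lam'x = 0 holds for each row

# One consequence: X'X is singular, so the OLS normal equations cannot be
# solved by ordinary matrix inversion.
print(np.linalg.det(X.T @ X))            # (numerically) zero
```

The singularity of X'X is precisely why exact multicollinearity makes the least-squares estimator undefined.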

4By definition, the variance of ui = E[ui − E(ui)]^2 and the covariance between ui and uj = E[ui − E(ui)][uj − E(uj)]. But because of the assumption E(ui) = 0 for each i, we have the variance-covariance matrix (C.2.3).

APPENDIX C: THE MATRIX APPROACH TO LINEAR REGRESSION MODEL 931

If an exact linear relationship such as (C.2.4) exists, the variables are said to be collinear. If, on the other hand, (C.2.4) holds true only if λ1 = λ2 = λ3 = ... = 0, then the X variables are said to be linearly independent. An intuitive reason for the no multicollinearity assumption was given in Chapter 7, and we explored this assumption further in Chapter 10.