The concept of linear independence or dependence of vectors is closely linked to the concept of nonsingularity or singularity of matrices discussed in an earlier chapter. We now investigate further the nonsingularity (singularity) properties of matrices by introducing the concept of the rank of a matrix.
Consider any arbitrary matrix A of order m × n. It consists of n column vectors with m elements each, and m row vectors with n elements each. Therefore the n column m-element vectors belong to R^m, whereas the m row n-element vectors belong to R^n. Denote by c the maximum number of linearly independent columns of A, so that c ≤ n. If c is strictly less than n, then there may be more than one subset of the n column vectors that consists of c linearly independent vectors. For example, if n = 5, then it may be that columns 1, 2, and 3 form a linearly independent subset, but so may columns 1, 4, and 5. However, all five column vectors taken together will be linearly dependent. In this example, c = 3. If we take a set of c linearly independent columns and discard the remaining n − c columns, we can form a matrix B of dimension m × c with c linearly independent columns. We proceed by denoting by r the maximum number of linearly independent rows of A. This number r must also be the number of linearly independent rows of B. Since each row of B has c elements, it follows that r ≤ c.
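The argument above can be illustrated numerically. The sketch below (assuming NumPy is available; the matrix entries are hypothetical, chosen to mirror the text's n = 5, c = 3 example) builds a matrix whose last two columns are linear combinations of the first three, extracts the submatrix B of independent columns, and checks that the number of independent rows agrees with the number of independent columns:

```python
import numpy as np

# Hypothetical 4 x 5 matrix A (m = 4, n = 5): columns 4 and 5 are
# linear combinations of columns 1-3, so c = 3 as in the text's example.
c1 = np.array([1.0, 0.0, 0.0, 1.0])
c2 = np.array([0.0, 1.0, 0.0, 1.0])
c3 = np.array([0.0, 0.0, 1.0, 1.0])
c4 = c1 + 2 * c2   # dependent column
c5 = c2 - c3       # dependent column
A = np.column_stack([c1, c2, c3, c4, c5])

# Maximum number of linearly independent columns of A:
print(np.linalg.matrix_rank(A))    # 3

# Discard the n - c dependent columns to form B (dimension m x c):
B = A[:, :3]
print(np.linalg.matrix_rank(B))    # 3

# The number of linearly independent rows agrees with the number of
# linearly independent columns (row rank equals column rank):
print(np.linalg.matrix_rank(A.T))  # 3
```

Note that `matrix_rank` computes the rank via the singular value decomposition, which is numerically robust; counting independent columns by Gaussian elimination would give the same answer here.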