This verifies the property.

Inverses and Their Properties

For a given matrix A, the transpose A' is always derivable. On the other hand, its inverse matrix, another type of "derived" matrix, may or may not exist. The inverse of matrix A, denoted by A^{-1}, is defined only if A is a square matrix, in which case the inverse is the matrix that satisfies the condition

AA^{-1} = A^{-1}A = I    (4.12)

That is, whether A is pre- or postmultiplied by A^{-1}, the product will be the same identity matrix. This is another exception to the rule that matrix multiplication is not commutative.
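The defining condition (4.12) can be checked numerically. The sketch below (using NumPy; the particular matrix A is an arbitrary illustration, not from the text) confirms that pre- and postmultiplication by the inverse both return the same identity matrix.

```python
# Hedged sketch (assumption: NumPy). Verify A A^{-1} = A^{-1} A = I
# for one arbitrarily chosen nonsingular square matrix.
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])      # an arbitrary nonsingular 2 x 2 matrix
A_inv = np.linalg.inv(A)
I = np.eye(2)

# Pre- and postmultiplication both yield the identity matrix.
print(np.allclose(A @ A_inv, I))   # True
print(np.allclose(A_inv @ A, I))   # True
```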

The following points are worth noting:

1. Not every square matrix has an inverse—squareness is a necessary condition, but not a sufficient condition, for the existence of an inverse. If a square matrix A has an inverse, A is said to be nonsingular; if A possesses no inverse, it is called a singular matrix.
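Point 1 can be illustrated concretely (the matrix S below is an invented example, not from the text): a square matrix whose rows are linearly dependent has no inverse, and NumPy signals this with a LinAlgError.

```python
# Hedged sketch (assumption: NumPy). S is square, yet singular: its second
# row is half its first, so no inverse exists and np.linalg.inv must fail.
import numpy as np

S = np.array([[2.0, 4.0],
              [1.0, 2.0]])

print(abs(np.linalg.det(S)) < 1e-12)   # True: a zero determinant marks singularity
try:
    np.linalg.inv(S)
except np.linalg.LinAlgError as err:
    print("singular:", err)
```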

2. If A^{-1} does exist, then the matrix A can be regarded as the inverse of A^{-1}, just as A^{-1} is the inverse of A. In short, A and A^{-1} are inverses of each other.

3. If A is n × n, then A^{-1} must also be n × n; otherwise it cannot be conformable for both pre- and postmultiplication. The identity matrix produced by the multiplication will also be n × n.
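The squareness requirement is also visible in practice (a NumPy sketch with an invented 2 × 3 matrix): no matrix is conformable with a non-square R for both pre- and postmultiplication, and np.linalg.inv rejects such an array outright.

```python
# Hedged sketch (assumption: NumPy). A 2 x 3 matrix cannot have a two-sided
# inverse, so np.linalg.inv refuses it before any computation begins.
import numpy as np

R = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])   # 2 x 3: not square

try:
    np.linalg.inv(R)
except np.linalg.LinAlgError as err:
    print("not invertible:", err)
```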

4. If an inverse exists, then it is unique. To prove its uniqueness, let us suppose that B has been found to be an inverse for A, so that

AB = BA = I

Now assume that there is another matrix C such that AC = CA = I. By premultiplying both sides of AB = I by C, we find that

CAB = CI = C    [by (4.8)]

Since CA = I by assumption, the preceding equation is reducible to

IB = B = C

That is, B and C must be one and the same inverse matrix. For this reason, we can speak of the (as against an) inverse of A.
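The uniqueness argument amounts to a single associative chain; written out in one line (a restatement of the steps above, not an addition to them):

```latex
C = CI = C(AB) = (CA)B = IB = B
```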

5. The two parts of condition (4.12), namely AA^{-1} = I and A^{-1}A = I, actually imply each other, so that satisfying either equation is sufficient to establish the inverse relationship between A and A^{-1}. To prove this, we should show that if AA^{-1} = I, and if there is a matrix B such that BA = I, then B = A^{-1} (so that BA = I must in effect be the equation A^{-1}A = I). Let us postmultiply both sides of the given equation BA = I by A^{-1}; then

(BA)A^{-1} = IA^{-1}
B(AA^{-1}) = A^{-1}    [associative law]
BI = A^{-1}

Therefore, as required,

B = A^{-1}

Analogously, it can be demonstrated that, if A^{-1}A = I, then the only matrix C which yields CA^{-1} = I is C = A.
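Point 5 can also be seen numerically (a NumPy sketch with an invented nonsingular A): if we construct a matrix B using only the one-sided condition BA = I, the other condition AB = I then holds automatically.

```python
# Hedged sketch (assumption: NumPy). Find B from BA = I alone and observe
# that AB = I follows for free. Transposing BA = I gives A^T B^T = I,
# which is a standard linear solve.
import numpy as np

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])       # an arbitrary nonsingular 2 x 2 matrix
I = np.eye(2)

B = np.linalg.solve(A.T, I).T    # imposes only B @ A = I

print(np.allclose(B @ A, I))     # True: the condition we imposed
print(np.allclose(A @ B, I))     # True: the condition we get for free
```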

