Matrix Arithmetic
In this note we explore matrix arithmetic for its own sake. The author introduces it in Chapter
Four using linear transformations. While his approach is quite rigorous, matrix arithmetic can be
studied after Chapter One. This note assumes that Chapter
One has been completed.
1. Let $A = \begin{pmatrix} 1 & 9 & 7 \\ 0 & 1 & 5 \end{pmatrix}$ and $B = \begin{pmatrix} 1 & 9 \\ 0 & 1 \\ 7 & 5 \end{pmatrix}$. Since the size of $A$ is $2 \times 3$ and that of $B$ is $3 \times 2$, $A \neq B$. Do note, however, that the entries of $A$ and $B$ are the same.
2. Find all values of $x$ and $y$ so that $\begin{pmatrix} x^2 & y - x \\ 2 & y^2 \end{pmatrix} = \begin{pmatrix} 1 & x - y \\ x + 1 & 1 \end{pmatrix}$. We see that the size of each matrix is $2 \times 2$. So we set the corresponding entries equal:
$$x^2 = 1 \qquad 2 = x + 1 \qquad y - x = x - y \qquad y^2 = 1$$
From $x^2 = 1$ and $y^2 = 1$, we see that $x = \pm 1$ and $y = \pm 1$. From $2 = x + 1$, we get that $x$ must be $1$. From $y - x = x - y$, we get that $2y = 2x$ and so $x = y$. Thus $y$ is also $1$.
As for the second question, we have been doing this for quite a while now: Adding, subtracting,
multiplying, and dividing (when possible) real numbers. So how can we add and subtract two
matrices? Eventually we will multiply matrices, but for now we consider another multiplication.
Here are the definitions.
Definition 2 Let $A = (a_{ij})$ and $B = (b_{ij})$ be $m \times n$ matrices. We define their sum, denoted by $A + B$, and their difference, denoted by $A - B$, to be the respective matrices $(a_{ij} + b_{ij})$ and $(a_{ij} - b_{ij})$. We define scalar multiplication by: for any $r \in \mathbb{R}$, $rA$ is the matrix $(r a_{ij})$.
These definitions should appear quite natural: When two matrices have the same size, we just
add or subtract their corresponding entries, and for the scalar multiplication, we just multiply each
entry by the scalar. Here are some examples.
Example 2
Let
$$A = \begin{pmatrix} 2 & 3 \\ -1 & 2 \end{pmatrix}, \quad B = \begin{pmatrix} -1 & 2 \\ 6 & -2 \end{pmatrix}, \quad \text{and} \quad C = \begin{pmatrix} 1 & 2 & 3 \\ -1 & -2 & -3 \end{pmatrix}.$$
Compute each of the following, if possible.
1. $A + B$. Since $A$ and $B$ are both $2 \times 2$ matrices, we can add them. Here we go:
$$A + B = \begin{pmatrix} 2 & 3 \\ -1 & 2 \end{pmatrix} + \begin{pmatrix} -1 & 2 \\ 6 & -2 \end{pmatrix} = \begin{pmatrix} 2 + (-1) & 3 + 2 \\ -1 + 6 & 2 + (-2) \end{pmatrix} = \begin{pmatrix} 1 & 5 \\ 5 & 0 \end{pmatrix}.$$
2. $B - A$. Since $A$ and $B$ are both $2 \times 2$ matrices, we can subtract them. Here we go:
$$B - A = \begin{pmatrix} -1 & 2 \\ 6 & -2 \end{pmatrix} - \begin{pmatrix} 2 & 3 \\ -1 & 2 \end{pmatrix} = \begin{pmatrix} -1 - 2 & 2 - 3 \\ 6 - (-1) & -2 - 2 \end{pmatrix} = \begin{pmatrix} -3 & -1 \\ 7 & -4 \end{pmatrix}.$$
3. $B + C$. No can do. $B$ and $C$ have different sizes: $B$ is $2 \times 2$ and $C$ is $2 \times 3$.
4. $4C$. We just multiply each entry of $C$ by $4$:
$$4C = \begin{pmatrix} 4(1) & 4(2) & 4(3) \\ 4(-1) & 4(-2) & 4(-3) \end{pmatrix} = \begin{pmatrix} 4 & 8 & 12 \\ -4 & -8 & -12 \end{pmatrix}.$$
5. $2A - 3B$. These matrices have the same size, so we'll do the scalar multiplication first and then the subtraction. Here we go:
$$2A - 3B = \begin{pmatrix} 4 & 6 \\ -2 & 4 \end{pmatrix} - \begin{pmatrix} -3 & 6 \\ 18 & -6 \end{pmatrix} = \begin{pmatrix} 7 & 0 \\ -20 & 10 \end{pmatrix}.$$
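The notes don't include any code, but the arithmetic in Example 2 is easy to check numerically. Here is a short supplementary sketch using NumPy (an assumption, not part of the original notes): entrywise `+`, `-`, and scalar `*` on arrays match Definition 2 exactly.

```python
import numpy as np

# The matrices of Example 2.
A = np.array([[2, 3], [-1, 2]])
B = np.array([[-1, 2], [6, -2]])
C = np.array([[1, 2, 3], [-1, -2, -3]])

print(A + B)          # entrywise sum
print(B - A)          # entrywise difference
print(4 * C)          # every entry scaled by 4
print(2 * A - 3 * B)  # scalar multiplication first, then subtraction
```

NumPy will happily refuse `B + C` with a shape error, mirroring the "different sizes" rule in part 3.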
Matrix arithmetic has some of the same properties as real number arithmetic.
Properties of Matrix Arithmetic
Let $A$, $B$, and $C$ be $m \times n$ matrices and $r, s \in \mathbb{R}$.
1. A + B = B + A
2. A + (B + C) = (A + B) + C
3. r(A + B) = rA + rB
4. (r + s)A = rA + sA
5. (rs)A = r(sA)
Definition 3
We take a row vector $\begin{pmatrix} a_1 & a_2 & \cdots & a_p \end{pmatrix}$ with $p$ entries and a column vector $\begin{pmatrix} b_1 \\ b_2 \\ \vdots \\ b_p \end{pmatrix}$ with $p$ entries and define their product, denoted by $\begin{pmatrix} a_1 & a_2 & \cdots & a_p \end{pmatrix} \begin{pmatrix} b_1 \\ b_2 \\ \vdots \\ b_p \end{pmatrix}$, to be the real number $a_1 b_1 + a_2 b_2 + \cdots + a_p b_p$. Notice that we're just taking the sum of the products of the corresponding entries and that we may view a real number as a $1 \times 1$ matrix. Let's do a couple of examples.
Example 3
Multiply!
1. $\begin{pmatrix} 2 & 3 & 4 \end{pmatrix} \begin{pmatrix} 3 \\ 4 \\ 5 \end{pmatrix} = 2(3) + 3(4) + 4(5) = 38$.
2. $\begin{pmatrix} 1 & 2 & -2 & 3 \end{pmatrix} \begin{pmatrix} 2 \\ 2 \\ -1 \\ -2 \end{pmatrix} = 1(2) + 2(2) + (-2)(-1) + 3(-2) = 2$.
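For 1-D arrays, NumPy's `@` operator computes exactly this sum of products of corresponding entries. A supplementary sketch (NumPy is an assumption; it is not part of these notes):

```python
import numpy as np

# Row-vector times column-vector, as in Definition 3.
row = np.array([2, 3, 4])
col = np.array([3, 4, 5])
print(row @ col)   # 2(3) + 3(4) + 4(5) = 38

row2 = np.array([1, 2, -2, 3])
col2 = np.array([2, 2, -1, -2])
print(row2 @ col2)  # 2 + 4 + 2 - 6 = 2
```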
Now we'll multiply a general matrix by a column. After all, we can think of a matrix as several row vectors of the same size put together. To do such a multiplication, the number of entries in each row must equal the number of entries in the column, and then we multiply each row of the matrix by the column.
Definition 4 Let $A$ be an $m \times p$ matrix and $b$ a $p \times 1$ column vector. We define their product, denoted by $Ab$, to be the $m \times 1$ column vector whose $i$-th entry, $1 \le i \le m$, is the product of the $i$-th row of $A$ and $b$. Here are a couple of examples.
Example 4
Multiply the matrix by the column.
1. $\begin{pmatrix} 1 & 2 & 3 \\ -2 & 1 & 2 \end{pmatrix} \begin{pmatrix} -1 \\ -2 \\ 3 \end{pmatrix} = \begin{pmatrix} 1(-1) + 2(-2) + 3(3) \\ -2(-1) + 1(-2) + 2(3) \end{pmatrix} = \begin{pmatrix} 4 \\ 6 \end{pmatrix}.$
2. $\begin{pmatrix} 2 & -2 \\ 0 & 3 \\ -1 & 4 \end{pmatrix} \begin{pmatrix} 5 \\ -1 \end{pmatrix} = \begin{pmatrix} 2(5) + (-2)(-1) \\ 0(5) + 3(-1) \\ -1(5) + 4(-1) \end{pmatrix} = \begin{pmatrix} 12 \\ -3 \\ -9 \end{pmatrix}.$
Now we can extend this multiplication to appropriately sized arbitrary matrices. We can think of a matrix as several column vectors of the same size put together. To multiply a row by a column, we must be sure that they have the same number of entries. This means that the number of columns of our first matrix must equal the number of rows of the second.
Definition 5
Let $A$ be an $m \times p$ matrix and $B$ a $p \times n$ matrix. We define their product, denoted by $AB$, to be the $m \times n$ matrix whose $ij$-entry, $1 \le i \le m$ and $1 \le j \le n$, is the product of the $i$-th row of $A$ and the $j$-th column of $B$. Here are a few examples.
Example 5
Multiply, if possible.
1. Let $A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$ and $B = \begin{pmatrix} 4 & -3 \\ -2 & 1 \end{pmatrix}$. Since both matrices are $2 \times 2$, we can find $AB$ and $BA$. Note that the size of both products is $2 \times 2$. Here we go:
$$AB = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \begin{pmatrix} 4 & -3 \\ -2 & 1 \end{pmatrix} = \begin{pmatrix} \text{1st row of } A \text{ times 1st column of } B & \text{1st row of } A \text{ times 2nd column of } B \\ \text{2nd row of } A \text{ times 1st column of } B & \text{2nd row of } A \text{ times 2nd column of } B \end{pmatrix}$$
$$= \begin{pmatrix} 1(4) + 2(-2) & 1(-3) + 2(1) \\ 3(4) + 4(-2) & 3(-3) + 4(1) \end{pmatrix} = \begin{pmatrix} 0 & -1 \\ 4 & -5 \end{pmatrix}$$
and
$$BA = \begin{pmatrix} 4 & -3 \\ -2 & 1 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} = \begin{pmatrix} 4(1) + (-3)(3) & 4(2) + (-3)(4) \\ -2(1) + 1(3) & -2(2) + 1(4) \end{pmatrix} = \begin{pmatrix} -5 & -4 \\ 1 & 0 \end{pmatrix}.$$
Did you notice what just happened? We have that $AB \neq BA$! Yes, it's true: Matrix multiplication is not commutative.
2. Can we multiply $\begin{pmatrix} 2 & 2 & 9 \\ -1 & 0 & 8 \end{pmatrix} \begin{pmatrix} 1 & 2 & 3 \\ 5 & 2 & 3 \end{pmatrix}$?
No. The first matrix is $2 \times 3$ and the second is also $2 \times 3$. The number of columns of the first is not the same as the number of rows of the second.
3. Can we multiply $\begin{pmatrix} 2 & 2 & 9 \\ -1 & 0 & 8 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 5 & 2 \\ -1 & 3 \end{pmatrix}$?
Yes, the first is $2 \times 3$ and the second is $3 \times 2$, so their product is $2 \times 2$. Here we go:
$$\begin{pmatrix} 2 & 2 & 9 \\ -1 & 0 & 8 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 5 & 2 \\ -1 & 3 \end{pmatrix} = \begin{pmatrix} 2(1) + 2(5) + 9(-1) & 2(2) + 2(2) + 9(3) \\ -1(1) + 0(5) + 8(-1) & -1(2) + 0(2) + 8(3) \end{pmatrix} = \begin{pmatrix} 3 & 35 \\ -9 & 22 \end{pmatrix}.$$
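In NumPy, `@` between 2-D arrays is exactly the row-times-column product of Definition 5, so both the noncommutativity in part 1 and the rectangular product in part 3 can be checked in a few lines (a supplementary sketch; NumPy is not part of the notes):

```python
import numpy as np

# Part 1: AB and BA are both defined but are different matrices.
A = np.array([[1, 2], [3, 4]])
B = np.array([[4, -3], [-2, 1]])
print(A @ B)
print(B @ A)
print((A @ B == B @ A).all())   # False: multiplication is not commutative

# Part 3: a 2x3 times a 3x2 gives a 2x2.
M = np.array([[2, 2, 9], [-1, 0, 8]])
N = np.array([[1, 2], [5, 2], [-1, 3]])
print(M @ N)
```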
Now we state some nice, and natural, properties.
Properties of Matrix Multiplication: Let $A$, $B$, and $C$ be matrices of the appropriate sizes and $r \in \mathbb{R}$. Then
1. $A(BC) = (AB)C$
2. $A(B + C) = AB + AC$
3. $(A + B)C = AC + BC$
Let's prove the distributive law $A(B + C) = AB + AC$. Write the $i$-th row of $A$ as $\begin{pmatrix} a_1 & a_2 & \cdots & a_p \end{pmatrix}$ and the $j$-th columns of $B$ and $C$ as $\begin{pmatrix} b_1 \\ b_2 \\ \vdots \\ b_p \end{pmatrix}$ and $\begin{pmatrix} c_1 \\ c_2 \\ \vdots \\ c_p \end{pmatrix}$, respectively. Then the $j$-th column of $B + C$ is $\begin{pmatrix} b_1 + c_1 \\ b_2 + c_2 \\ \vdots \\ b_p + c_p \end{pmatrix}$. So the $ij$-entry of $A(B + C)$ is the product of the $i$-th row of $A$ and the $j$-th column of $B + C$. Multiplying and then using the usual properties of real number arithmetic, we have
$$\begin{pmatrix} a_1 & a_2 & \cdots & a_p \end{pmatrix} \begin{pmatrix} b_1 + c_1 \\ b_2 + c_2 \\ \vdots \\ b_p + c_p \end{pmatrix} = a_1(b_1 + c_1) + a_2(b_2 + c_2) + \cdots + a_p(b_p + c_p)$$
$$= a_1 b_1 + a_1 c_1 + a_2 b_2 + a_2 c_2 + \cdots + a_p b_p + a_p c_p = (a_1 b_1 + a_2 b_2 + \cdots + a_p b_p) + (a_1 c_1 + a_2 c_2 + \cdots + a_p c_p).$$
Notice that the expressions in parentheses are the products of the $i$-th row of $A$ with each of the $j$-th columns of $B$ and $C$, and that the sum of these two is the $ij$-entry of $AB + AC$. So we're done.
Now we will prove that last statement, about this mysterious identity matrix. We need a definition first: The main diagonal of a matrix $A$ consists of the entries of the form $a_{ii}$. Let $M = (m_{ij})$ be an $n \times n$ matrix. Let $I$ be the $n \times n$ matrix whose main diagonal entries are all $1$'s and all of its other entries $0$'s, i.e.,
$$I = \begin{pmatrix} 1 & 0 & 0 & \cdots & 0 \\ 0 & 1 & 0 & \cdots & 0 \\ 0 & 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & 1 \end{pmatrix}.$$
Since $M$ and $I$ are both $n \times n$, their products $IM$ and $MI$ are both $n \times n$. Notice that the $i$-th row of $I$ is the row vector whose $i$-th entry is $1$ and all others $0$'s. So when we multiply this $i$-th row of $I$ by the $j$-th column of $M$, the only entry in the column that gets multiplied by the $1$ is $m_{ij}$. Thus $IM = M$. Now notice that the $j$-th column of $I$ is the column vector whose $j$-th entry is a $1$ and all others $0$'s. So when we multiply the $i$-th row of $M$ by the $j$-th column of $I$, the only entry in the row that gets multiplied by the $1$ is $m_{ij}$. Thus $MI = M$. The proof that $I$ is unique is quite similar to that of the zero matrix. And we're done.
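A quick numerical illustration of $IM = MI = M$ (a supplementary NumPy sketch, not part of the notes; the particular matrix $M$ below is arbitrary):

```python
import numpy as np

# Any square matrix is unchanged by the identity on either side.
M = np.array([[2, -1, 0], [1, 0, 1], [1, 2, -2]])
I = np.eye(3, dtype=int)   # 1's on the main diagonal, 0's elsewhere
print((I @ M == M).all() and (M @ I == M).all())   # True
```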
We have
$$A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{pmatrix}, \quad x = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix}, \quad \text{and} \quad b = \begin{pmatrix} b_1 \\ b_2 \\ \vdots \\ b_m \end{pmatrix}.$$
So our matrix equation $Ax = b$ represents the system of linear equations, which is a much more concise way of writing a system. It also provides a more convenient way of determining whether or not $\begin{pmatrix} u_1 \\ u_2 \\ \vdots \\ u_n \end{pmatrix}$ is a solution to the system: Just check whether or not $Au = b$. Let's do an example.
Example 6
Consider the linear system
$$\begin{aligned} 2x - y \quad &= 0 \\ x \quad + z &= 4 \\ x + 2y - 2z &= -1. \end{aligned}$$
1. Write it as a matrix equation. So following the above, we let $A$ be the matrix of coefficients, $x$ the column vector of the variables, and $b$ the column vector of the constants. We have
$$A = \begin{pmatrix} 2 & -1 & 0 \\ 1 & 0 & 1 \\ 1 & 2 & -2 \end{pmatrix}, \quad x = \begin{pmatrix} x \\ y \\ z \end{pmatrix}, \quad b = \begin{pmatrix} 0 \\ 4 \\ -1 \end{pmatrix},$$
and the equation is
$$\begin{pmatrix} 2 & -1 & 0 \\ 1 & 0 & 1 \\ 1 & 2 & -2 \end{pmatrix} \begin{pmatrix} x \\ y \\ z \end{pmatrix} = \begin{pmatrix} 0 \\ 4 \\ -1 \end{pmatrix}.$$
2. Determine whether or not $\begin{pmatrix} 1 \\ 2 \\ 3 \end{pmatrix}$ is a solution. Let $u = \begin{pmatrix} 1 \\ 2 \\ 3 \end{pmatrix}$. Then multiplying, we have
$$Au = \begin{pmatrix} 2 & -1 & 0 \\ 1 & 0 & 1 \\ 1 & 2 & -2 \end{pmatrix} \begin{pmatrix} 1 \\ 2 \\ 3 \end{pmatrix} = \begin{pmatrix} 2(1) - 1(2) + 0(3) \\ 1(1) + 0(2) + 1(3) \\ 1(1) + 2(2) - 2(3) \end{pmatrix} = \begin{pmatrix} 0 \\ 4 \\ -1 \end{pmatrix} = b.$$
So $\begin{pmatrix} 1 \\ 2 \\ 3 \end{pmatrix}$ is a solution to the system.
3. Determine whether or not $\begin{pmatrix} 2 \\ 4 \\ 2 \end{pmatrix}$ is a solution. Let $v = \begin{pmatrix} 2 \\ 4 \\ 2 \end{pmatrix}$. Then multiplying, we have
$$Av = \begin{pmatrix} 2 & -1 & 0 \\ 1 & 0 & 1 \\ 1 & 2 & -2 \end{pmatrix} \begin{pmatrix} 2 \\ 4 \\ 2 \end{pmatrix} = \begin{pmatrix} 2(2) - 1(4) + 0(2) \\ 1(2) + 0(4) + 1(2) \\ 1(2) + 2(4) - 2(2) \end{pmatrix} = \begin{pmatrix} 0 \\ 4 \\ 6 \end{pmatrix} \neq b.$$
So $\begin{pmatrix} 2 \\ 4 \\ 2 \end{pmatrix}$ is not a solution to the system.
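The "just check whether $Au = b$" test of Example 6 is one line per candidate in NumPy (a supplementary sketch, not part of the notes, using the coefficient matrix and constants of Example 6):

```python
import numpy as np

# The system of Example 6 in matrix form.
A = np.array([[2, -1, 0], [1, 0, 1], [1, 2, -2]])
b = np.array([0, 4, -1])

u = np.array([1, 2, 3])
v = np.array([2, 4, 2])
print((A @ u == b).all())   # True: u is a solution
print((A @ v == b).all())   # False: v is not
```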
Invertible Matrices
We know that every nonzero real number $x$ has a multiplicative inverse, namely $x^{-1} = 1/x$, as $x x^{-1} = 1$. Is there an analogous inverse for matrices? That is, for any nonzero square matrix $A$, is there a matrix $B$ such that $AB = BA = I$, where $I$ is the appropriate identity matrix? Consider the specific nonzero matrix $A = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}$. Let's try to find its $B$. So let $B = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$. Then $AB = \begin{pmatrix} a & b \\ 0 & 0 \end{pmatrix}$. Oh. The matrix $AB$ has a row of $0$'s, so it can never be the identity, which is $\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$. So the answer to our question is no. Here, then, is a definition:
Definition 6
We will say that an $n \times n$ matrix $A$ is invertible provided there is an $n \times n$ matrix $B$ for which $AB = BA = I$. This $B$ is called an inverse of $A$.
Notice that $II = I$, so there is at least one invertible matrix for each possible size. Are there more? Why, yes, there are. Thanks for asking. Here's one now.
Example 7
Let $A = \begin{pmatrix} 2 & 1 \\ 1 & 1 \end{pmatrix}$ and $B = \begin{pmatrix} 1 & -1 \\ -1 & 2 \end{pmatrix}$. Multiplying, we get that
$$AB = \begin{pmatrix} 2(1) + 1(-1) & 2(-1) + 1(2) \\ 1(1) + 1(-1) & 1(-1) + 1(2) \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$$
and
$$BA = \begin{pmatrix} 1(2) - 1(1) & 1(1) - 1(1) \\ -1(2) + 2(1) & -1(1) + 2(1) \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}.$$
So $A$ is invertible. Perhaps you've noticed that $B$ is also invertible.
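Verifying that a claimed inverse really works is mechanical, so it is a good thing to automate. A supplementary NumPy sketch (not part of the notes), with the matrices of Example 7:

```python
import numpy as np

# Check AB = BA = I for the pair from Example 7.
A = np.array([[2, 1], [1, 1]])
B = np.array([[1, -1], [-1, 2]])
I = np.eye(2, dtype=int)

print((A @ B == I).all())   # True
print((B @ A == I).all())   # True: so A and B are inverses of each other
```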
As you may have guessed, there are lots of invertible matrices. But there are also lots of matrices that are not invertible. Before developing a method to determine whether or not a matrix is invertible, let's state and prove a few results about invertible matrices.
Proposition 1
Let $A$ be an $n \times n$ invertible matrix. If $B$ and $C$ are two inverses for $A$, then $B = C$. In other words, the inverse of a square matrix, if it exists, is unique.
Proof
Let $A$, $B$, and $C$ be $n \times n$ matrices. Assume that $A$ is invertible and $B$ and $C$ are its inverses. So we have that $AB = BA = AC = CA = I$. Now $B = BI = B(AC) = (BA)C = IC = C$. Notice that we used the associativity of matrix multiplication here.
Now we can refer to the inverse of a square matrix $A$, and so we will write its inverse as $A^{-1}$ and read it as "$A$ inverse." In this case we have $AA^{-1} = A^{-1}A = I$. Next we prove a theorem about the inverses of matrices.
Proposition 2
Let $A$ and $B$ be $n \times n$ invertible matrices. Then
1. $A^{-1}$ is invertible and $(A^{-1})^{-1} = A$, and
2. $AB$ is invertible and $(AB)^{-1} = B^{-1}A^{-1}$.
Note that we reverse the order of the inverses in the product. This is due to the fact that matrix multiplication is not commutative.
Proof
Let $A$ and $B$ be $n \times n$ invertible matrices. Then $AA^{-1} = A^{-1}A = I$. So $A$ satisfies the definition for $A^{-1}$ being invertible. Thus $(A^{-1})^{-1} = A$.
To show that $AB$ is invertible, we will just multiply, taking full advantage of the fact that matrix multiplication is associative:
$$(AB)(B^{-1}A^{-1}) = A(BB^{-1})A^{-1} = AIA^{-1} = AA^{-1} = I$$
and
$$(B^{-1}A^{-1})(AB) = B^{-1}(A^{-1}A)B = B^{-1}IB = B^{-1}B = I.$$
If $A$ is invertible and $u$ is a solution to $Ax = b$, then multiplying both sides of $Au = b$ on the left by $A^{-1}$ gives
$$A^{-1}(Au) = A^{-1}b$$
$$(A^{-1}A)u = A^{-1}b$$
$$Iu = A^{-1}b$$
$$u = A^{-1}b.$$
$$\left(\begin{array}{cc|cc} 1 & 1 & 0 & 1 \\ 2 & -1 & 1 & 0 \end{array}\right) \to \left(\begin{array}{cc|cc} 1 & 1 & 0 & 1 \\ 0 & -3 & 1 & -2 \end{array}\right) \to \left(\begin{array}{cc|cc} 1 & 1 & 0 & 1 \\ 0 & 1 & -1/3 & 2/3 \end{array}\right) \to \left(\begin{array}{cc|cc} 1 & 0 & 1/3 & 1/3 \\ 0 & 1 & -1/3 & 2/3 \end{array}\right)$$
So we see that the reduced echelon form of $A$ is the identity. Thus $A$ is invertible and $A^{-1} = \begin{pmatrix} 1/3 & 1/3 \\ -1/3 & 2/3 \end{pmatrix}$. We can rewrite this inverse a bit more nicely by factoring out the $1/3$: $A^{-1} = \frac{1}{3}\begin{pmatrix} 1 & 1 \\ -1 & 2 \end{pmatrix}$.
(b) Let $A = \begin{pmatrix} 1 & 0 & 2 \\ 1 & -1 & 2 \\ 2 & 2 & 1 \end{pmatrix}$. We form the giant augmented matrix $(A|I)$ and row reduce:
$$\left(\begin{array}{ccc|ccc} 1 & 0 & 2 & 1 & 0 & 0 \\ 1 & -1 & 2 & 0 & 1 & 0 \\ 2 & 2 & 1 & 0 & 0 & 1 \end{array}\right) \to \left(\begin{array}{ccc|ccc} 1 & 0 & 2 & 1 & 0 & 0 \\ 0 & 1 & 0 & 1 & -1 & 0 \\ 0 & 2 & -3 & -2 & 0 & 1 \end{array}\right) \to \left(\begin{array}{ccc|ccc} 1 & 0 & 2 & 1 & 0 & 0 \\ 0 & 1 & 0 & 1 & -1 & 0 \\ 0 & 0 & -3 & -4 & 2 & 1 \end{array}\right)$$
$$\to \left(\begin{array}{ccc|ccc} 1 & 0 & 2 & 1 & 0 & 0 \\ 0 & 1 & 0 & 1 & -1 & 0 \\ 0 & 0 & 1 & 4/3 & -2/3 & -1/3 \end{array}\right) \to \left(\begin{array}{ccc|ccc} 1 & 0 & 0 & -5/3 & 4/3 & 2/3 \\ 0 & 1 & 0 & 1 & -1 & 0 \\ 0 & 0 & 1 & 4/3 & -2/3 & -1/3 \end{array}\right).$$
So $A$ is invertible and, factoring out the $1/3$,
$$A^{-1} = \frac{1}{3}\begin{pmatrix} -5 & 4 & 2 \\ 3 & -3 & 0 \\ 4 & -2 & -1 \end{pmatrix}.$$
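Row reducing $(A|I)$ by hand is exactly what a numerical inverse routine automates, so the hand computation in part (b) can be cross-checked. A supplementary NumPy sketch (not part of the notes), using the entries of the matrix in part (b):

```python
import numpy as np

# The matrix from part (b); floats so linalg.inv can be used.
A = np.array([[1.0, 0, 2], [1, -1, 2], [2, 2, 1]])
A_inv = np.linalg.inv(A)

# Multiplying by 3 recovers the factored form (1/3) * (integer matrix).
print(np.round(3 * A_inv))
print(np.allclose(A @ A_inv, np.eye(3)))   # True
```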
(c) Let $B = \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}$. We form the giant augmented matrix $(B|I)$ and row reduce:
$$\left(\begin{array}{cc|cc} 1 & 1 & 1 & 0 \\ 1 & 1 & 0 & 1 \end{array}\right) \to \left(\begin{array}{cc|cc} 1 & 1 & 1 & 0 \\ 0 & 0 & -1 & 1 \end{array}\right).$$
The left side cannot be reduced to the identity, so $B$ is not invertible.
inverse we've already found. The matrix equation for this system is $Ax = b$ where $x = \begin{pmatrix} x \\ y \\ z \end{pmatrix}$ and $b = \begin{pmatrix} 3 \\ 3 \\ 6 \end{pmatrix}$. Then
$$x = A^{-1}b = \frac{1}{3}\begin{pmatrix} -5 & 4 & 2 \\ 3 & -3 & 0 \\ 4 & -2 & -1 \end{pmatrix} \begin{pmatrix} 3 \\ 3 \\ 6 \end{pmatrix} = \begin{pmatrix} 3 \\ 0 \\ 0 \end{pmatrix}.$$
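Solving a square system via $x = A^{-1}b$ translates directly to code. A supplementary NumPy sketch (not part of the notes), with the matrix and constants used just above:

```python
import numpy as np

A = np.array([[1.0, 0, 2], [1, -1, 2], [2, 2, 1]])
b = np.array([3.0, 3, 6])

# x = A^{-1} b, mirroring the hand computation.
x = np.linalg.inv(A) @ b
print(x)
```

In practice `np.linalg.solve(A, b)` is preferred over forming the inverse explicitly, since it is cheaper and numerically better behaved; both give the same answer here.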
The following theorem tells us that if the product of two square matrices is the identity, then they are in fact inverses of each other.
Theorem 2
Let $A$ and $B$ be $n \times n$ matrices and $I$ the $n \times n$ identity matrix. If $AB = I$, then $A$ and $B$ are invertible and $A^{-1} = B$.
Proof
Let $A$ and $B$ be $n \times n$ matrices, $I$ the $n \times n$ identity matrix, and $\mathbf{0}$ the zero vector in $\mathbb{R}^n$. Assume $AB = I$. We will use the IMT Part I to prove that $B$ is invertible first. Consider the matrix equation $Bx = \mathbf{0}$ and let $u \in \mathbb{R}^n$ be a solution. So we have
$$Bu = \mathbf{0}$$
$$A(Bu) = A\mathbf{0}$$
$$(AB)u = \mathbf{0}$$
$$Iu = \mathbf{0}$$
$$u = \mathbf{0}.$$
The only solution to $Bx = \mathbf{0}$ is $x = \mathbf{0}$. Hence by the IMT Part I, $B$ is invertible. So $B^{-1}$ exists. Then multiplying both sides of $AB = I$ on the right by $B^{-1}$ gives us that $A = B^{-1}$. Since $B^{-1}$ is invertible, $A$ is too, and $A^{-1} = (B^{-1})^{-1} = B$.
Transposes
We finish this section off with what's called the transpose of a matrix. Here's the definition.
Definition 7
Let $A = (a_{ij})$ be an $m \times n$ matrix. The transpose of $A$, denoted by $A^T$, is the matrix whose $i$-th column is the $i$-th row of $A$, or equivalently, whose $j$-th row is the $j$-th column of $A$. Notice that $A^T$ is an $n \times m$ matrix. We will write $A^T = (a^T_{ji})$ where $a^T_{ji} = a_{ij}$. Here are a couple of examples.
Example 9
1. Let $A = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{pmatrix}$. The first row of $A$ is $\begin{pmatrix} 1 & 2 & 3 \end{pmatrix}$ and the second row is $\begin{pmatrix} 4 & 5 & 6 \end{pmatrix}$. So these become the columns of $A^T$, that is, $A^T = \begin{pmatrix} 1 & 4 \\ 2 & 5 \\ 3 & 6 \end{pmatrix}$. Alternatively, we see that the columns of $A$ are $\begin{pmatrix} 1 \\ 4 \end{pmatrix}$, $\begin{pmatrix} 2 \\ 5 \end{pmatrix}$, and $\begin{pmatrix} 3 \\ 6 \end{pmatrix}$. So these become the rows of $A^T$, as we can see above.
2. Let $B = \begin{pmatrix} 1 & 0 & 2 \\ 4 & 1 & 9 \\ 2 & 3 & 0 \end{pmatrix}$. We make the rows of $B$ the columns of $B^T$. Doing so, we get $B^T = \begin{pmatrix} 1 & 4 & 2 \\ 0 & 1 & 3 \\ 2 & 9 & 0 \end{pmatrix}$.
Note that the main diagonals of a matrix and its transpose are the same.
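In NumPy the transpose is the `.T` attribute, which makes both Example 9 and the main-diagonal remark easy to check (a supplementary sketch; NumPy is not part of the notes):

```python
import numpy as np

# Rows become columns under the transpose.
A = np.array([[1, 2, 3], [4, 5, 6]])
print(A.T)   # a 3x2 matrix

# The main diagonal is unchanged by transposing.
B = np.array([[1, 0, 2], [4, 1, 9], [2, 3, 0]])
print(np.diag(B), np.diag(B.T))
```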
Proposition 3
Let $A$ and $B$ be appropriately sized matrices and $r \in \mathbb{R}$. Then
1. (AT )T = A
2. (A + B)T = AT + B T
3. (rA)T = rAT
4. (AB)T = B T AT
Proof
The first three properties seem perfectly natural. Some time when you get bored, you should try to prove them. But what about that multiplication one? Does that seem natural? Maybe. Given how the inverse of the product of matrices works, maybe this is fine. Let's prove it. Let $A$ be an $m \times p$ matrix and $B$ a $p \times n$ matrix. Then $AB$ is an $m \times n$ matrix. This means that $(AB)^T$ is an $n \times m$ matrix. Then $B^T$ is an $n \times p$ matrix and $A^T$ is a $p \times m$ matrix. So multiplying $B^T A^T$ makes sense and its size is also $n \times m$. But what about their corresponding entries? The $ji$-entry of $(AB)^T$ is the $ij$-entry of $AB$, which is the $i$-th row of $A$ times the $j$-th column of $B$. For simplicity, let $\begin{pmatrix} a_1 & a_2 & \cdots & a_p \end{pmatrix}$ be the $i$-th row of $A$ and $\begin{pmatrix} b_1 \\ b_2 \\ \vdots \\ b_p \end{pmatrix}$ the $j$-th column of $B$. Then the $ij$-entry of $AB$ is $a_1 b_1 + a_2 b_2 + \cdots + a_p b_p$, but this is also equal to $b_1 a_1 + b_2 a_2 + \cdots + b_p a_p$, which is the product of $\begin{pmatrix} b_1 & b_2 & \cdots & b_p \end{pmatrix}$ and $\begin{pmatrix} a_1 \\ a_2 \\ \vdots \\ a_p \end{pmatrix}$. This is exactly the product of the $j$-th row of $B^T$ and the $i$-th column of $A^T$, which is the $ji$-entry of $B^T A^T$. Thus the $ji$-entry of $(AB)^T$ is the $ji$-entry of $B^T A^T$. Therefore $(AB)^T = B^T A^T$.
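The order reversal in $(AB)^T = B^T A^T$ is easy to see numerically, including why the naive $A^T B^T$ cannot work for rectangular matrices. A supplementary NumPy sketch (not part of the notes):

```python
import numpy as np

# Spot-check (AB)^T = B^T A^T with small random integer matrices.
rng = np.random.default_rng(1)
A = rng.integers(-4, 4, size=(2, 3))
B = rng.integers(-4, 4, size=(3, 4))

print(((A @ B).T == B.T @ A.T).all())   # True
# Note the reversal: A.T is 3x2 and B.T is 4x3, so A.T @ B.T
# is not even defined here.
```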
1. Let $A = \begin{pmatrix} 1 & 2 & 3 \\ 1 & 1 & 0 \end{pmatrix}$, $B = \begin{pmatrix} 3 & 4 \\ 5 & 1 \\ 1 & 1 \end{pmatrix}$, $C = \begin{pmatrix} 4 & 1 & 2 \\ 1 & 5 & 1 \end{pmatrix}$, $D = \begin{pmatrix} 1 & 0 & 1 \\ 0 & 2 & 1 \end{pmatrix}$, $E = \begin{pmatrix} 1 & 1 \\ 2 & 3 \\ 0 & 1 \end{pmatrix}$, $F = \begin{pmatrix} 2 \\ 3 \end{pmatrix}$, and $G = \begin{pmatrix} 2 & 1 \end{pmatrix}$. Compute each of the following and simplify, whenever possible. If a computation is not possible, state why.
(a) $3C - 4D$
(b) $A - (D + 2C)$
(c) $A - E$
(d) $AE$
(e) $3BC - 4BD$
(f) $CB + D$
(g) $GC$
(h) $FG$
2. Illustrate the associativity of matrix multiplication by multiplying $(AB)C$ and $A(BC)$ where $A$, $B$, and $C$ are the matrices above.
3. Let $A$ be an $n \times n$ matrix. Let $m \in \mathbb{N}$. As you would expect, we define $A^m$ to be the product of $A$ with itself $m$ times. Notice that this makes sense as matrix multiplication is associative.
(a) Compute $A^4$ for $A = \begin{pmatrix} 1 & 2 \\ 1 & 1 \end{pmatrix}$.
(b) Provide a counter-example to the statement: For any $2 \times 2$ matrices $A$ and $B$, $(AB)^2 = A^2 B^2$.
4. Prove that for all $m \times n$ matrices $A$, $B$, and $C$, if $A + C = B + C$, then $A = B$.
5. Let $O$ be the $m \times n$ zero matrix. Prove that for any $m \times n$ matrix $A$, $-A = (-1)A$ and $0A = O$.
6. Let $O$ be the $m \times n$ zero matrix. Prove that for any $r \in \mathbb{R}$ and $m \times n$ matrix $A$, if $rA = O$, then $r = 0$ or $A = O$.
7. We have seen an example of two $2 \times 2$ matrices $A$ and $B$ for which $AB \neq BA$, showing us that matrix multiplication is not commutative. However, it's more than just not commutative, it's soooooo not commutative. Doing the following problems will illustrate what we mean by this.
(a) Find two matrices $A$ and $B$ for which $AB$ and $BA$ are defined, but have different sizes.
(b) Find two matrices $A$ and $B$ for which $AB$ is defined, but $BA$ is not.
8. Let $O$ be the $2 \times 2$ zero matrix and $I$ the $2 \times 2$ identity matrix. Provide a counter-example to each of the following statements:
(a) For any $2 \times 2$ matrices $A$ and $B$, if $AB = O$, then $A = O$ or $B = O$.
(b) For any $2 \times 2$ matrices $A$, $B$, and $C$, if $AB = AC$, then $B = C$.
(c) For any $2 \times 2$ matrix $A$, if $A^2 = A$, then $A = O$ or $A = I$.
9. Suppose that we have a homogeneous system of linear equations whose matrix equation is given by $Ax = \mathbf{0}$, where $A$ is the $m \times n$ matrix of coefficients, $x$ is the column matrix of the $n$ variables, and $\mathbf{0}$ is the column matrix of the $m$ zeroes. Use the properties of matrix arithmetic to show that for any solutions $u$ and $v$ to the system and $r \in \mathbb{R}$, $u + v$ and $ru$ are also solutions.
10. Consider the system of linear equations:
$$\begin{aligned} x_1 + x_2 + x_3 + x_4 &= 3 \\ x_1 - x_2 + x_3 + x_4 &= 5 \\ x_2 - x_3 - x_4 &= -4 \\ x_1 + x_2 - x_3 - x_4 &= -3. \end{aligned}$$
(a) Write it as a matrix equation.
(b) Use matrix multiplication to determine whether or not $u = \begin{pmatrix} 1 \\ 1 \\ 1 \\ 2 \end{pmatrix}$ and $v = \begin{pmatrix} 1 \\ 0 \\ 2 \\ 3 \end{pmatrix}$ are solutions to the system.
11. Determine whether or not each of the following matrices is invertible. If so, find its inverse.
(a) $A = \begin{pmatrix} 1 & 2 \\ 4 & 7 \end{pmatrix}$
(b) $B = \begin{pmatrix} 2 & 1 & 1 \\ 2 & 1 & 2 \\ 1 & 1 & 1 \end{pmatrix}$
(c) $C = \begin{pmatrix} 4 & 2 & 2 & 0 \\ 1 & 3 & 3 & 8 \\ 6 & 3 & 5 & 5 \\ 0 & 2 & 3 & 3 \end{pmatrix}$
(d) $D^T D$ where $D = \begin{pmatrix} 1 & 0 & 1 \\ 0 & 2 & 1 \end{pmatrix}$
12. Solve the systems of linear equations using the inverse of the coefficient matrix.
(a)
$$\begin{aligned} 2x + y - z &= 2 \\ 2x - y + 2z &= 1 \\ x + y - z &= 3 \end{aligned}$$
(b)
$$\begin{aligned} x_1 + 2x_2 + x_3 &= 2 \\ x_2 - x_3 + x_4 &= -2 \\ 2x_1 + 4x_2 + 3x_3 &= 0 \\ x_1 + 2x_2 - x_3 + 2x_4 &= -4 \end{aligned}$$
13. Provide a counter-example to the statement: For any $n \times n$ invertible matrices $A$ and $B$, $A + B$ is invertible.
14. Find an example of a $2 \times 2$ nonidentity matrix whose inverse is its transpose.
15. Using the matrices in Problem #1, compute each of the following, if possible.
(a) $3A - 2E^T$
(b) $A^T B$
(c) $D^T(F + G^T)$
(d) $3A^T - 2E$
(e) $CC^T + FG$
(f) $(F^T + G)D$