This next theorem shows that this is what is needed in order to have matrix multiplication correspond to composition of linear transformations.
Theorem 18.1.12 Let $T : \mathbb{R}^n \to \mathbb{R}^m$ and $S : \mathbb{R}^m \to \mathbb{R}^p$ and suppose both $T$ and $S$ are linear. Then $S \circ T : \mathbb{R}^n \to \mathbb{R}^p$ is also linear and
$$[S \circ T] = [S]\,[T]$$
where matrix multiplication is defined in Definition 18.1.11.
Proof: By definition,
$$
\begin{aligned}
\sum_j [S \circ T]_{ij}\, x_j \;\equiv\; (S \circ T(x))_i &= (S(T(x)))_i = ([S](Tx))_i \\
&= \sum_k [S]_{ik}\,(Tx)_k = \sum_k [S]_{ik}\,([T]x)_k \\
&= \sum_k [S]_{ik} \sum_j [T]_{kj}\, x_j = \sum_j \Big( \sum_k [S]_{ik}\,[T]_{kj} \Big) x_j
\end{aligned}
$$
It follows, since $x$ is completely arbitrary (for example, let $x$ be the $j$-th standard basis vector $e_j$), that for each $i$ and for each $j$,
$$[S \circ T]_{ij} = \sum_k [S]_{ik}\,[T]_{kj}$$
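As an informal check of Theorem 18.1.12, the following Python/NumPy sketch (the particular matrices are chosen only for illustration) builds the matrix of the composition $S \circ T$ column by column, by applying $S \circ T$ to the standard basis vectors, and compares it with the product $[S][T]$.
\begin{verbatim}
import numpy as np

# Matrices of two linear maps (chosen here purely for illustration):
# T : R^3 -> R^2 with matrix T_mat, and S : R^2 -> R^4 with matrix S_mat.
T_mat = np.array([[1., -1., 2.],
                  [3., -2., 1.]])            # 2 x 3
S_mat = np.array([[1., 0.],
                  [2., 1.],
                  [0., -1.],
                  [1., 1.]])                 # 4 x 2

def T(x):
    return T_mat @ x                         # T(x) = [T]x

def S(y):
    return S_mat @ y                         # S(y) = [S]y

# The j-th column of the matrix of S∘T is (S∘T)(e_j), where e_j is the
# j-th standard basis vector of R^3.
n = T_mat.shape[1]
comp_mat = np.column_stack([S(T(e)) for e in np.eye(n)])

# Theorem 18.1.12: the matrix of the composition equals the product [S][T].
assert np.allclose(comp_mat, S_mat @ T_mat)
print(comp_mat)
\end{verbatim}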
Here is something you must understand about matrix multiplication. For matrices $A$ and $B$, in order to form the product $AB$, the number of columns of $A$ must equal the number of rows of $B$.
$$(m \times n)(n \times p) = m \times p, \qquad (m \times n)(k \times p) = \text{nonsense} \qquad (18.1)$$
The two outside numbers give the size of the product and the middle two numbers must match. You must have the same number of columns on the left as you have rows on the right.
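A minimal sketch of the size rule in (18.1), again in Python/NumPy (the particular shapes are arbitrary illustrations): multiplying an $m \times n$ array by an $n \times p$ array gives an $m \times p$ array, while mismatched inner dimensions are rejected.
\begin{verbatim}
import numpy as np

A = np.ones((2, 3))        # an m x n array with m = 2, n = 3
B = np.ones((3, 4))        # an n x p array with p = 4
print((A @ B).shape)       # (2, 4): the two outside numbers give the size

C = np.ones((5, 4))        # a k x p array with k = 5, and 5 != 3
try:
    A @ C                  # columns of A (3) != rows of C (5): "nonsense"
except ValueError as err:
    print("shape mismatch:", err)
\end{verbatim}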
Example 18.1.13 Let $A = \begin{pmatrix} 1 & -1 & 2 \\ 3 & -2 & 1 \end{pmatrix}$ and $B = \begin{pmatrix} 2 & 3 \\ -1 & 1 \\ 0 & 3 \end{pmatrix}$. Then find $AB$. After this, find $BA$.
Consider first $AB$. It is the product of a $2 \times 3$ and a $3 \times 2$ matrix and so it is a $2 \times 2$ matrix. The top left corner is the dot product of the top row of $A$ and the first column of $B$, and so forth. Be sure you can show the following:
$$AB = \begin{pmatrix} 3 & 8 \\ 8 & 10 \end{pmatrix}, \qquad BA = \begin{pmatrix} 11 & -8 & 7 \\ 2 & -1 & -1 \\ 9 & -6 & 3 \end{pmatrix}.$$
Note this shows that matrix multiplication is not commutative. Indeed, it can result in matrices of different size when you interchange the order. Here is a perplexing little observation. If you add the entries on the main diagonal of both matrices in the above, you get the same number, 13. This is the diagonal from upper left to lower right. You might wonder whether this always happens or if this is just a fluke. In fact, it will always happen. You should try and show this.
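If you want to check the arithmetic in Example 18.1.13 and the diagonal-sum observation numerically, here is a short Python/NumPy sketch (a check only, not part of the text's development):
\begin{verbatim}
import numpy as np

A = np.array([[1, -1, 2],
              [3, -2, 1]])
B = np.array([[2, 3],
              [-1, 1],
              [0, 3]])

AB = A @ B    # 2 x 2
BA = B @ A    # 3 x 3: a different size, so AB and BA cannot be equal
print(AB)
print(BA)

# The sums of the main-diagonal entries (the traces) agree:
print(np.trace(AB), np.trace(BA))    # both print 13
\end{verbatim}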