Now what if, instead of numbers, the entries A, B, C, D, E, F, G are matrices of a size such
that the multiplications and additions needed in the above formula all make sense? Would the
formula be true in this case?
Suppose A is a matrix of the form
\[
A = \begin{pmatrix} A_{11} & \cdots & A_{1m} \\ \vdots & & \vdots \\ A_{r1} & \cdots & A_{rm} \end{pmatrix} \tag{1.1}
\]
where A_{ij} is an s_{i} × p_{j} matrix, where s_{i} is constant for j = 1,⋅⋅⋅,m for each i = 1,⋅⋅⋅,r. Such a
matrix is called a block matrix, also a partitioned matrix. How do you get the block A_{ij}?
Here is how for A an m × n matrix:
\[
A_{ij} = \begin{pmatrix} 0 & \cdots & I_{s_i \times s_i} & \cdots & 0 \end{pmatrix} A \begin{pmatrix} 0 \\ I_{p_j \times p_j} \\ 0 \end{pmatrix} \tag{1.2}
\]
In the block column matrix on the right, you need to have c_{j} − 1 rows of zeros above the
small p_{j} × p_{j} identity matrix, where the columns of A involved in A_{ij} are c_{j},⋅⋅⋅,c_{j} + p_{j} − 1,
and in the block row matrix on the left, you need to have r_{i} − 1 columns of zeros to the left of
the s_{i} × s_{i} identity matrix, where the rows of A involved in A_{ij} are r_{i},⋅⋅⋅,r_{i} + s_{i} − 1. An
important observation to make is that the matrix on the right specifies the columns to use in the
block and the one on the left specifies the rows. Thus the block A_{ij}, in this case, is a
matrix of size s_{i} × p_{j}. There is no overlap between the blocks of A. Thus the
n × n identity matrix corresponding to multiplication on the right of A is of the
form
\[
\begin{pmatrix} I_{p_1 \times p_1} & & 0 \\ & \ddots & \\ 0 & & I_{p_m \times p_m} \end{pmatrix},
\]
where these little identity matrices don’t overlap. A similar conclusion follows from
consideration of the matrices I_{si×si}. Note that in (1.2), the matrix on the right is a block
column matrix for the above block diagonal matrix, and the matrix on the left in (1.2) is
a block row matrix taken from a similar block diagonal matrix consisting of the
I_{si×si}.
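The block-extraction mechanism above can be sketched numerically. The sketch below is illustrative only: the matrix A, the partition sizes s and p, and the chosen block indices are assumptions, not data from the text. It forms the block row matrix (zeros with an identity inset) and the block column matrix described above and checks that their product with A picks out the expected submatrix.

```python
import numpy as np

# Hypothetical example: a 4 x 5 matrix A partitioned with row block
# sizes s = (2, 2) and column block sizes p = (2, 3).
A = np.arange(20.0).reshape(4, 5)
s, p = [2, 2], [2, 3]
i, j = 1, 0  # extract the block A_{ij} with i = 1, j = 0 (0-indexed)

# Block row matrix on the left: s_i x 4, with the s_i x s_i identity
# preceded by (sum of earlier row block sizes) columns of zeros.
left = np.zeros((s[i], A.shape[0]))
left[:, sum(s[:i]):sum(s[:i]) + s[i]] = np.eye(s[i])

# Block column matrix on the right: 5 x p_j, with the p_j x p_j identity
# preceded by (sum of earlier column block sizes) rows of zeros.
right = np.zeros((A.shape[1], p[j]))
right[sum(p[:j]):sum(p[:j]) + p[j], :] = np.eye(p[j])

# The left factor selects rows, the right factor selects columns.
block = left @ A @ right
assert np.array_equal(block, A[2:4, 0:2])
```

The left factor alone gives the chosen rows of A, and the right factor alone gives the chosen columns, which is the observation made in the text about which factor specifies rows and which specifies columns.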
Next consider the question of multiplication of two block matrices. Let B be a block
matrix of the form
\[
\begin{pmatrix} B_{11} & \cdots & B_{1p} \\ \vdots & & \vdots \\ B_{q1} & \cdots & B_{qp} \end{pmatrix} \tag{1.3}
\]
and let A be a block matrix of the form
\[
\begin{pmatrix} A_{11} & \cdots & A_{1n} \\ \vdots & & \vdots \\ A_{p1} & \cdots & A_{pn} \end{pmatrix} \tag{1.4}
\]
such that for all i,j, it makes sense to multiply B_{is}A_{sj} for all s ∈ {1,⋅⋅⋅,p}. (That is, the two
matrices B_{is} and A_{sj} are conformable.) Suppose also that for fixed i,j, the product B_{is}A_{sj} is the
same size for each s, so that it makes sense to write ∑_{s}B_{is}A_{sj}.
The following theorem says essentially that when you take the product of two matrices,
you can partition both matrices, formally multiply the blocks to get another block matrix and
this one will be BA partitioned. Before presenting this theorem, here is a simple lemma which
is really a special case of the theorem.
Lemma A.2.1 Consider the following product.
\[
\begin{pmatrix} 0 \\ I \\ 0 \end{pmatrix} \begin{pmatrix} 0 & I & 0 \end{pmatrix}
\]
where the first is n × r and the second is r × n. The small identity matrix I is an r × r matrix, and there are l zero rows above I and l zero columns to the left of I in the right matrix. Then the product of these matrices is a block matrix of the form
\[
\begin{pmatrix} 0 & 0 & 0 \\ 0 & I & 0 \\ 0 & 0 & 0 \end{pmatrix}.
\]
Proof: From the definition of matrix multiplication, the product is
\[
\begin{pmatrix} \begin{pmatrix} 0 \\ I \\ 0 \end{pmatrix} 0 & \cdots & \begin{pmatrix} 0 \\ I \\ 0 \end{pmatrix} e_1 & \cdots & \begin{pmatrix} 0 \\ I \\ 0 \end{pmatrix} e_r & \cdots & \begin{pmatrix} 0 \\ I \\ 0 \end{pmatrix} 0 \end{pmatrix}
\]
which yields the claimed result. In the formula, e_{j} refers to the column vector of length r
which has a 1 in the j^{th} position. This proves the lemma. ■
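The lemma can also be checked numerically. In this sketch the dimensions are assumptions chosen for illustration: n = 7, r = 2, with l = 3 zero rows above I in the left factor and l = 3 zero columns to the left of I in the right factor.

```python
import numpy as np

n, r, l = 7, 2, 3  # assumed sizes, matching the lemma's hypotheses

# Left factor (0; I; 0): an n x r matrix with l zero rows above I.
left = np.zeros((n, r))
left[l:l + r, :] = np.eye(r)

# Right factor (0  I  0): an r x n matrix with l zero columns left of I.
right = np.zeros((r, n))
right[:, l:l + r] = np.eye(r)

# The product is n x n, zero everywhere except an r x r identity
# sitting in rows l..l+r-1 and columns l..l+r-1.
product = left @ right
expected = np.zeros((n, n))
expected[l:l + r, l:l + r] = np.eye(r)
assert np.array_equal(product, expected)
```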
Theorem A.2.2 Let B be a q × p block matrix as in (1.3) and let A be a p × n block matrix as in (1.4) such that B_{is} is conformable with A_{sj} and each product B_{is}A_{sj} for s = 1,⋅⋅⋅,p is of the same size, so that they can be added. Then BA can be obtained as a block matrix such that the ij^{th} block is of the form
\[
\sum_{s} B_{is} A_{sj}. \tag{1.5}
\]
Proof: From (1.2),
\[
B_{is} A_{sj} = \begin{pmatrix} 0 & I_{r_i \times r_i} & 0 \end{pmatrix} B \begin{pmatrix} 0 \\ I_{p_s \times p_s} \\ 0 \end{pmatrix} \begin{pmatrix} 0 & I_{p_s \times p_s} & 0 \end{pmatrix} A \begin{pmatrix} 0 \\ I_{q_j \times q_j} \\ 0 \end{pmatrix}
\]
where here it is assumed B_{is} is r_{i} × p_{s} and A_{sj} is p_{s} × q_{j}. The product involves the s^{th} block
in the i^{th} row of blocks for B and the s^{th} block in the j^{th} column of A. Thus there are the
same number of rows above the I_{p_s×p_s} as there are columns to the left of I_{p_s×p_s} in those two
inside matrices. Then from Lemma A.2.1, the sum over s of the two inside matrices is the block diagonal identity matrix, and so
\[
\sum_{s} B_{is} A_{sj} = \begin{pmatrix} 0 & I_{r_i \times r_i} & 0 \end{pmatrix} B \left( \sum_{s} \begin{pmatrix} 0 \\ I_{p_s \times p_s} \\ 0 \end{pmatrix} \begin{pmatrix} 0 & I_{p_s \times p_s} & 0 \end{pmatrix} \right) A \begin{pmatrix} 0 \\ I_{q_j \times q_j} \\ 0 \end{pmatrix} = \begin{pmatrix} 0 & I_{r_i \times r_i} & 0 \end{pmatrix} BA \begin{pmatrix} 0 \\ I_{q_j \times q_j} \\ 0 \end{pmatrix}
\]
which equals the ij^{th} block of BA. Hence the ij^{th} block of BA equals the formal
multiplication according to matrix multiplication, ∑_{s}B_{is}A_{sj}.
This proves the theorem. ■
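The theorem can be illustrated with a numerical sketch: multiply block by block according to the formula ∑_{s}B_{is}A_{sj}, then compare with the conventional product BA. The matrices and the partition sizes below are assumptions chosen for illustration, not data from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed block sizes: row blocks of B, shared inner blocks, column blocks of A.
r_sizes, p_sizes, q_sizes = [2, 3], [1, 2, 2], [2, 2]
B = rng.standard_normal((sum(r_sizes), sum(p_sizes)))  # 5 x 5
A = rng.standard_normal((sum(p_sizes), sum(q_sizes)))  # 5 x 4

def block(M, row_sizes, col_sizes, i, j):
    """Return the (i, j) block of M under the given row/column partition."""
    r0, c0 = sum(row_sizes[:i]), sum(col_sizes[:j])
    return M[r0:r0 + row_sizes[i], c0:c0 + col_sizes[j]]

# Assemble BA block by block: the (i, j) block is sum_s B_{is} A_{sj}.
rows = []
for i in range(len(r_sizes)):
    row = []
    for j in range(len(q_sizes)):
        row.append(sum(block(B, r_sizes, p_sizes, i, s) @
                       block(A, p_sizes, q_sizes, s, j)
                       for s in range(len(p_sizes))))
    rows.append(np.hstack(row))
blockwise = np.vstack(rows)

# The block-by-block product agrees with the conventional product.
assert np.allclose(blockwise, B @ A)
```

Note that the same inner partition p_sizes is used for the columns of B and the rows of A; this is exactly the conformability hypothesis of the theorem.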
Example A.2.3 Multiply the following pair of partitioned matrices using the above theorem by multiplying the blocks as described above and then in the conventional manner.