
3.5. BLOCK MULTIPLICATION OF MATRICES

where $A_{ij}$ is an $s_i \times p_j$ matrix, and $s_i$ is constant for $j = 1, \cdots, m$ for each $i = 1, \cdots, r$. Such a matrix is called a block matrix, also a partitioned matrix. How do you get the block $A_{ij}$? Here is how for $A$ an $m \times n$ matrix:

$$
\overbrace{\begin{pmatrix} 0 & I_{s_i \times s_i} & 0 \end{pmatrix}}^{s_i \times m} \, A \, \overbrace{\begin{pmatrix} 0 \\ I_{p_j \times p_j} \\ 0 \end{pmatrix}}^{n \times p_j}. \tag{3.18}
$$

In the block column matrix on the right, you need to have $c_j - 1$ rows of zeros above the small $p_j \times p_j$ identity matrix, where the columns of $A$ involved in $A_{ij}$ are $c_j, \cdots, c_j + p_j - 1$, and in the block row matrix on the left, you need to have $r_i - 1$ columns of zeros to the left of the $s_i \times s_i$ identity matrix, where the rows of $A$ involved in $A_{ij}$ are $r_i, \cdots, r_i + s_i - 1$. An important observation to make is that the matrix on the right specifies the columns to use in the block and the one on the left specifies the rows. Thus the block $A_{ij}$ in this case is a matrix of size $s_i \times p_j$. There is no overlap between the blocks of $A$. Thus the $n \times n$ identity matrix corresponding to multiplication on the right of $A$ is of the form

$$
\begin{pmatrix} I_{p_1 \times p_1} & & 0 \\ & \ddots & \\ 0 & & I_{p_m \times p_m} \end{pmatrix}
$$

where these little identity matrices don't overlap. A similar conclusion follows from consideration of the matrices $I_{s_i \times s_i}$. Note that in (3.18) the matrix on the right is a block column matrix for the above block diagonal matrix, and the matrix on the left in (3.18) is a block row matrix taken from a similar block diagonal matrix consisting of the $I_{s_i \times s_i}$.
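To make (3.18) concrete, here is a minimal NumPy sketch; the matrix sizes and the chosen block are illustrative assumptions, not taken from the text. It builds the block row matrix and the block column matrix and checks that their product with $A$ picks out exactly the submatrix $A_{ij}$.

```python
import numpy as np

# A minimal sketch of equation (3.18): extract a block of A by multiplying
# with a block row matrix ( 0  I  0 ) on the left and a block column matrix
# ( 0 ; I ; 0 ) on the right.  All sizes below are illustrative assumptions.
m, n = 6, 7
A = np.arange(m * n, dtype=float).reshape(m, n)

s_i, p_j = 4, 4      # size of the desired s_i x p_j block
row_start = 2        # r_i - 1 in the text: number of leading zero columns on the left
col_start = 3        # c_j - 1 in the text: number of leading zero rows on the right

# Left factor: s_i x m, zeros followed by the s_i x s_i identity.
left = np.zeros((s_i, m))
left[:, row_start:row_start + s_i] = np.eye(s_i)

# Right factor: n x p_j, zeros followed by the p_j x p_j identity.
right = np.zeros((n, p_j))
right[col_start:col_start + p_j, :] = np.eye(p_j)

block = left @ A @ right
# The product agrees with ordinary slicing of A.
assert np.array_equal(block, A[row_start:row_start + s_i,
                               col_start:col_start + p_j])
print(block)
```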

Next consider the question of multiplication of two block matrices. Let $B$ be a block matrix of the form

$$
\begin{pmatrix} B_{11} & \cdots & B_{1p} \\ \vdots & \ddots & \vdots \\ B_{r1} & \cdots & B_{rp} \end{pmatrix} \tag{3.19}
$$

and let $A$ be a block matrix of the form

$$
\begin{pmatrix} A_{11} & \cdots & A_{1m} \\ \vdots & \ddots & \vdots \\ A_{p1} & \cdots & A_{pm} \end{pmatrix} \tag{3.20}
$$

and suppose that for all $i, j$ it makes sense to multiply $B_{is} A_{sj}$ for all $s \in \{1, \cdots, p\}$ (that is, the two matrices $B_{is}$ and $A_{sj}$ are conformable) and that for fixed $i, j$, the product $B_{is} A_{sj}$ is the same size for each $s$, so that it makes sense to write $\sum_s B_{is} A_{sj}$.
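Before stating the result, a quick numerical sanity check may be helpful. The following NumPy sketch (the partition sizes are illustrative assumptions) verifies on a random example that $\sum_s B_{is} A_{sj}$ reproduces the corresponding block of $BA$.

```python
import numpy as np

# A minimal numerical check of block multiplication: the (i, j) block of BA
# equals sum over s of B_is A_sj.  Partition sizes are illustrative assumptions.
rng = np.random.default_rng(0)
B = rng.standard_normal((5, 7))
A = rng.standard_normal((7, 6))

row_sizes = [2, 3]   # block rows of B   (r = 2)
mid_sizes = [3, 4]   # block columns of B = block rows of A   (p = 2)
col_sizes = [4, 2]   # block columns of A   (m = 2)

def split(sizes):
    # Convert a list of block sizes into (start, stop) index pairs.
    starts = np.cumsum([0] + sizes[:-1])
    return [(int(a), int(a + s)) for a, s in zip(starts, sizes)]

rows, mids, cols = split(row_sizes), split(mid_sizes), split(col_sizes)

for (r0, r1) in rows:
    for (c0, c1) in cols:
        # Formal block product: sum over the middle index s.
        block_sum = sum(B[r0:r1, m0:m1] @ A[m0:m1, c0:c1] for (m0, m1) in mids)
        # Compare with the corresponding block of the ordinary product BA.
        assert np.allclose(block_sum, (B @ A)[r0:r1, c0:c1])
print("block products match BA")
```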

The following theorem says essentially that when you take the product of two matrices, you can do it in two ways. One way is to simply multiply them, forming $BA$. The other way is to partition both matrices, formally multiply the blocks to get another block matrix, and this one will be $BA$ partitioned. Before presenting this theorem, here is a simple lemma which is really a special case of the theorem.

Lemma 3.5.1 Consider the following product.

$$
\begin{pmatrix} 0 \\ I \\ 0 \end{pmatrix} \begin{pmatrix} 0 & I & 0 \end{pmatrix}
$$
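Assuming conformable sizes, say the left factor $n \times r$ and the right factor $r \times n$ with $I$ the $r \times r$ identity (these sizes are illustrative assumptions), multiplying the blocks out gives

$$
\begin{pmatrix} 0 \\ I \\ 0 \end{pmatrix} \begin{pmatrix} 0 & I & 0 \end{pmatrix} = \begin{pmatrix} 0 & 0 & 0 \\ 0 & I & 0 \\ 0 & 0 & 0 \end{pmatrix},
$$

an $n \times n$ block matrix with the $r \times r$ identity in the block position determined by the two factors and zeros elsewhere. Note that the two factors are of the same type as the block row and block column matrices used in (3.18).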
