$$\Big( \Big( A - \sum_{k,l} (Av_k, w_l)\, w_l \otimes v_k \Big) v_p, w_r \Big) = (Av_p, w_r) - \sum_{k,l} (Av_k, w_l)\, (v_p, v_k)\, (w_l, w_r)$$
$$= (Av_p, w_r) - \sum_{k,l} (Av_k, w_l)\, \delta_{pk} \delta_{rl} = (Av_p, w_r) - (Av_p, w_r) = 0.$$
Letting $B = A - \sum_{k,l} (Av_k, w_l)\, w_l \otimes v_k$, the above shows that $(Bv_p, w_r) = 0$; since $w_r$ is an arbitrary element of the basis for $Y$, it follows that $Bv_p = 0$. Since $v_p$ is an arbitrary element of the basis for $X$, it follows that $B = 0$ as hoped. This has shown that $\{w_j \otimes v_i : i = 1, \cdots, n,\; j = 1, \cdots, m\}$ spans $L(X, Y)$.
It only remains to verify that the $w_j \otimes v_i$ are linearly independent. Suppose then that
$$\sum_{i,j} c_{ij}\, w_j \otimes v_i = 0.$$
Then apply both sides to $v_s$. By definition, this gives
$$0 = \sum_{i,j} c_{ij}\, (v_s, v_i)\, w_j = \sum_{i,j} c_{ij}\, \delta_{si}\, w_j = \sum_j c_{sj}\, w_j.$$
Now the vectors $\{w_1, \cdots, w_m\}$ are linearly independent because they form an orthonormal set, and so the above requires $c_{sj} = 0$ for each $j$. Since $s$ was arbitrary, this shows the linear transformations $\{w_j \otimes v_i\}$ form a linearly independent set. ■
Note this shows that the dimension of $L(X, Y)$ is $nm$. The theorem is also of enormous importance because it shows you can always consider an arbitrary linear transformation as a sum of rank one transformations whose properties are easily understood. The following theorem is also of great interest.
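As a concrete illustration, here is a minimal numerical sketch of this rank one decomposition, assuming $X = \mathbb{R}^n$ and $Y = \mathbb{R}^m$ with the usual inner product; the orthonormal bases below come from QR factorizations of random matrices, and all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 3  # dim X = n, dim Y = m

# An arbitrary linear transformation A : X -> Y and orthonormal bases
A = rng.standard_normal((m, n))
V = np.linalg.qr(rng.standard_normal((n, n)))[0]  # columns v_1,...,v_n: orthonormal basis of X
W = np.linalg.qr(rng.standard_normal((m, m)))[0]  # columns w_1,...,w_m: orthonormal basis of Y

# Rebuild A as the sum of rank one maps (A v_k, w_l) w_l (x) v_k, where
# (w (x) v)(u) = (u, v) w, so the matrix of w (x) v is np.outer(w, v).
B = np.zeros_like(A)
for k in range(n):
    for l in range(m):
        c = W[:, l] @ (A @ V[:, k])          # coefficient (A v_k, w_l)
        B += c * np.outer(W[:, l], V[:, k])  # matrix of w_l (x) v_k

print(np.allclose(A, B))  # True: the rank one maps recover A exactly
```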
Theorem 11.4.5 Let $A = \sum_{i,j} c_{ij}\, w_i \otimes v_j \in L(X, Y)$ where, as before, the vectors $\{w_i\}$ are an orthonormal basis for $Y$ and the vectors $\{v_j\}$ are an orthonormal basis for $X$. Then if the matrix of $A$ has entries $M_{ij}$, it follows that $M_{ij} = c_{ij}$.
Proof: Recall that
$$Av_i \equiv \sum_k M_{ki}\, w_k.$$
Also,
$$Av_i = \sum_{k,j} c_{kj}\, (w_k \otimes v_j)(v_i) = \sum_{k,j} c_{kj}\, (v_i, v_j)\, w_k = \sum_{k,j} c_{kj}\, \delta_{ij}\, w_k = \sum_k c_{ki}\, w_k.$$
Therefore,
$$\sum_k M_{ki}\, w_k = \sum_k c_{ki}\, w_k,$$
and so, since the $w_k$ are linearly independent, $M_{ki} = c_{ki}$ for all $k$. This happens for each $i$. ■
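A small check of this theorem under the same assumptions as the previous sketch: build $A$ from arbitrary coefficients $c_{ij}$ and verify that its matrix with respect to the two bases has exactly those entries. All names are again illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 4, 3
V = np.linalg.qr(rng.standard_normal((n, n)))[0]  # orthonormal basis {v_j} of X
W = np.linalg.qr(rng.standard_normal((m, m)))[0]  # orthonormal basis {w_i} of Y

# Build A = sum_{i,j} c_ij w_i (x) v_j from arbitrary coefficients c_ij
C = rng.standard_normal((m, n))
A = sum(C[i, j] * np.outer(W[:, i], V[:, j])
        for i in range(m) for j in range(n))

# Matrix M of A with respect to these bases: A v_j = sum_i M_ij w_i,
# so M_ij = (A v_j, w_i).
M = np.array([[W[:, i] @ (A @ V[:, j]) for j in range(n)]
              for i in range(m)])

print(np.allclose(M, C))  # True: the matrix entries are exactly the c_ij
```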
11.5 Least Squares
A common problem in experimental work is to find a straight line which approximates as well as possible a collection of points in the plane $\{(x_i, y_i)\}_{i=1}^p$. The usual way of dealing with these problems is by the method of least squares, and it turns out that all these sorts of approximation problems can be reduced to $Ax = b$ where the problem is to find the best $x$ for solving this equation even when there is no solution.
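A minimal sketch of this line-fitting problem, assuming NumPy is available; the data points are made up for illustration. Fitting $y \approx ax + b$ amounts to solving $Az = y$ for $z = (a, b)$ as well as possible, and NumPy's least squares routine returns the $z$ minimizing $\|Az - y\|^2$.

```python
import numpy as np

# Illustrative data points (x_i, y_i); these values are made up.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])

# Design matrix: one column for the slope a, one for the intercept b.
A = np.column_stack([x, np.ones_like(x)])

# np.linalg.lstsq returns the z minimizing ||A z - y||^2,
# even though A z = y has no exact solution here.
z, residual, rank, _ = np.linalg.lstsq(A, y, rcond=None)
a, b = z
print(f"best fit line: y = {a:.3f} x + {b:.3f}")
```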