Definition 11.4.1 Let X and Y be inner product spaces and let x ∈ X and y ∈ Y. Define the tensor product of these two vectors, y ⊗ x, an element of ℒ(X,Y), by

y ⊗ x(u) ≡ y(u,x)_{X}.

This is also called a rank one transformation because the image of this transformation is contained in the span of the vector y.
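To make the definition concrete, here is a minimal numpy sketch in the special case X = ℝⁿ, Y = ℝᵐ with the standard (real) inner product, where the matrix of y ⊗ x is the outer product y xᵀ. The function name `tensor_product` is illustrative, not from the text.

```python
import numpy as np

def tensor_product(y, x):
    """Matrix of the rank one map y ⊗ x : R^len(x) -> R^len(y),
    defined by (y ⊗ x)(u) = y (u, x)."""
    return np.outer(y, x)  # equals y x^T in the real case

x = np.array([1.0, 2.0])
y = np.array([3.0, 0.0, 1.0])
u = np.array([5.0, -1.0])

A = tensor_product(y, x)

# (y ⊗ x)(u) agrees with y times the inner product (u, x)
assert np.allclose(A @ u, y * np.dot(u, x))

# The image lies in span{y}, so for nonzero x and y the rank is one
assert np.linalg.matrix_rank(A) == 1
```

In a complex inner product space the second slot of the inner product is conjugate-linear, so this sketch covers only the real case.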

The verification that this is a linear map is left to you. Be sure to verify this! The following lemma
has some of the most important properties of this linear transformation.

Lemma 11.4.2 Let X, Y, Z be inner product spaces. Then for α a scalar,

w_{l} ⊗ v_{k} = B, this shows that Bv_{p} = 0, since w_{r} is an arbitrary element of the basis for Y. Since v_{p} is an arbitrary element of the basis for X, it follows B = 0 as hoped. This has shown

{w_{j} ⊗ v_{i} : i = 1,···,n, j = 1,···,m}

spans ℒ(X,Y).
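The spanning claim can be checked numerically: expand an arbitrary linear map A in the rank one maps w_{j} ⊗ v_{i} with coefficients (Av_{i}, w_{j}) and verify that the sum reproduces A. This is a sketch, not the proof; the choice of random bases via a QR factorization is an assumption of the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 4                         # dim X = n, dim Y = m (illustrative sizes)
A = rng.standard_normal((m, n))     # an arbitrary linear map X -> Y

# Orthonormal bases for X and Y: columns of orthogonal Q factors.
V = np.linalg.qr(rng.standard_normal((n, n)))[0]   # v_1, ..., v_n
W = np.linalg.qr(rng.standard_normal((m, m)))[0]   # w_1, ..., w_m

# Expand A in the rank one maps w_j ⊗ v_i with coefficients (A v_i, w_j).
B = np.zeros_like(A)
for i in range(n):
    for j in range(m):
        c_ji = W[:, j] @ (A @ V[:, i])             # (A v_i, w_j)
        B += c_ji * np.outer(W[:, j], V[:, i])     # c_ji * (w_j ⊗ v_i)

# The expansion recovers A, so the w_j ⊗ v_i span ℒ(X, Y).
assert np.allclose(A, B)
```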

It only remains to verify the w_{j} ⊗ v_{i} are linearly independent. Suppose then that

∑_{i,j} c_{ij} w_{j} ⊗ v_{i} = 0

Then apply both sides to v_{s}. By definition this gives

0 = ∑_{i,j} c_{ij} w_{j}(v_{s}, v_{i}) = ∑_{j} c_{sj} w_{j}

The w_{j} are linearly independent because they form an orthonormal set, and so the above requires c_{sj} = 0 for each j. Since s was arbitrary, this shows the linear transformations

{w_{j} ⊗ v_{i}}

form a linearly independent set. ■

Note this shows the dimension of ℒ(X,Y) is nm. The theorem is also of enormous importance because it shows you can always consider an arbitrary linear transformation as a sum of rank one transformations whose properties are easily understood. The following theorem is also of great interest.

Theorem 11.4.5 Let A = ∑_{i,j} c_{ij} w_{i} ⊗ v_{j} ∈ ℒ(X,Y) where, as before, the vectors {w_{i}} are an orthonormal basis for Y and the vectors {v_{j}} are an orthonormal basis for X. Then if the matrix of A has entries M_{ij}, it follows that M_{ij} = c_{ij}.

Proof: Recall

Av_{i} ≡ ∑_{k} M_{ki} w_{k}

Also

Av_{i} = ∑_{k,j} c_{kj} w_{k} ⊗ v_{j}(v_{i}) = ∑_{k,j} c_{kj} w_{k}(v_{i}, v_{j})
       = ∑_{k,j} c_{kj} w_{k} δ_{ij} = ∑_{k} c_{ki} w_{k}

Therefore,

∑_{k} M_{ki} w_{k} = ∑_{k} c_{ki} w_{k}

and so M_{ki} = c_{ki} for all k. This happens for each i. ■
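Theorem 11.4.5 can likewise be illustrated numerically: build A = ∑_{i,j} c_{ij} w_{i} ⊗ v_{j} from a coefficient array and check that the matrix of A with respect to these bases, M_{ki} = (Av_{i}, w_{k}), equals the coefficient array. A sketch under the same assumptions as before (real spaces, random orthonormal bases from QR).

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 4                         # dim X = n, dim Y = m (illustrative sizes)
V = np.linalg.qr(rng.standard_normal((n, n)))[0]   # orthonormal basis of X
W = np.linalg.qr(rng.standard_normal((m, m)))[0]   # orthonormal basis of Y
C = rng.standard_normal((m, n))                    # coefficients c_{ij}

# A = sum_{i,j} c_{ij} (w_i ⊗ v_j), as in Theorem 11.4.5.
A = sum(C[i, j] * np.outer(W[:, i], V[:, j])
        for i in range(m) for j in range(n))

# Matrix of A in these bases: M_{ki} = (A v_i, w_k), i.e. M = W^T A V.
M = W.T @ A @ V
assert np.allclose(M, C)            # M_{ij} = c_{ij}
```

The design of the check exploits that ∑_{i,j} c_{ij} w_{i} v_{j}ᵀ = W C Vᵀ, so orthogonality of W and V gives Wᵀ A V = C directly.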