11.4 The Tensor Product Of Two Vectors
Definition 11.4.1 Let X and Y be inner product spaces and let x ∈ X and y ∈ Y. Define the tensor product of these two vectors, y ⊗ x, an element of ℒ(X, Y), by

(y ⊗ x)(u) ≡ ⟨u, x⟩ y.

This is also called a rank one transformation because the image of this transformation is contained in the span of the vector y.
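Over ℝⁿ with the usual dot product, y ⊗ x is just the outer product matrix y xᵀ, so the definition is easy to check numerically. A minimal sketch (NumPy; the vectors are illustrative, and a complex space would need a conjugate on x):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])      # x in X = R^3
y = np.array([4.0, 5.0])           # y in Y = R^2

# Over R, the tensor product y ⊗ x is the outer product matrix y x^T.
T = np.outer(y, x)                 # shape (2, 3): maps R^3 -> R^2

u = np.array([1.0, 1.0, 0.0])
# (y ⊗ x)(u) = <u, x> y
assert np.allclose(T @ u, np.dot(u, x) * y)

# Rank one: the image of T lies in span{y}.
assert np.linalg.matrix_rank(T) == 1
```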
The verification that this is a linear map is left to you. Be sure to verify this! The following lemma
has some of the most important properties of this linear transformation.
Lemma 11.4.2 Let X, Y, Z be inner product spaces. Then for α a scalar,

(α (y ⊗ x))* = ᾱ (x ⊗ y)                                (11.7)
(z ⊗ y₁)(y₂ ⊗ x) = ⟨y₂, y₁⟩ z ⊗ x                       (11.8)

Proof: Let u ∈ X and v ∈ Y. Then

⟨α (y ⊗ x) u, v⟩ = ⟨α ⟨u, x⟩ y, v⟩ = α ⟨u, x⟩ ⟨y, v⟩

and

⟨u, ᾱ (x ⊗ y) v⟩ = ⟨u, ᾱ ⟨v, y⟩ x⟩ = α ⟨y, v⟩ ⟨u, x⟩.

Therefore, this verifies 11.7.
To verify 11.8, let u ∈ X. Then

(z ⊗ y₁)(y₂ ⊗ x)(u) = (z ⊗ y₁)(⟨u, x⟩ y₂) = ⟨u, x⟩ ⟨y₂, y₁⟩ z = ⟨y₂, y₁⟩ (z ⊗ x)(u).

Since the two linear transformations on both sides of 11.8 give the same answer for every u ∈ X, it follows the two transformations are the same. ■
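For real spaces the adjoint is the transpose and conjugation is trivial, so both identities in the lemma can be spot-checked with outer products. A sketch under those assumptions (the random vectors are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
x  = rng.standard_normal(3)    # x in X = R^3
y1 = rng.standard_normal(4)    # y1, y2 in Y = R^4
y2 = rng.standard_normal(4)
z  = rng.standard_normal(2)    # z in Z = R^2
a  = 2.5                       # a real scalar, so conjugation does nothing

# Over R, y ⊗ x is np.outer(y, x), and the adjoint is the transpose.

# 11.7: (a (y1 ⊗ x))* = a (x ⊗ y1)
assert np.allclose((a * np.outer(y1, x)).T, a * np.outer(x, y1))

# 11.8: (z ⊗ y1)(y2 ⊗ x) = <y2, y1> z ⊗ x
assert np.allclose(np.outer(z, y1) @ np.outer(y2, x),
                   np.dot(y2, y1) * np.outer(z, x))
```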
Definition 11.4.3 Let X, Y be two vector spaces. Then define for A, B ∈ ℒ(X, Y) and α ∈ F, new elements of ℒ(X, Y) denoted by A + B and αA as follows.

(A + B)(u) ≡ Au + Bu,    (αA)(u) ≡ α(Au).
Theorem 11.4.4 Let X and Y be finite dimensional inner product spaces. Then ℒ(X, Y) is a vector space with the above definition of what it means to multiply by a scalar and add. Let {v₁, ···, vₙ} be an orthonormal basis for X and {w₁, ···, wₘ} be an orthonormal basis for Y. Then a basis for ℒ(X, Y) is {wⱼ ⊗ vᵢ : i = 1, ···, n, j = 1, ···, m}.
Proof: It is obvious that ℒ(X, Y) is a vector space. It remains to verify the given set is a basis. Consider the following: for A ∈ ℒ(X, Y),

⟨(A − ∑ₖ,ₗ ⟨Avₖ, wₗ⟩ wₗ ⊗ vₖ) vₚ, wᵣ⟩ = ⟨Avₚ, wᵣ⟩ − ∑ₖ,ₗ ⟨Avₖ, wₗ⟩ ⟨vₚ, vₖ⟩ ⟨wₗ, wᵣ⟩
= ⟨Avₚ, wᵣ⟩ − ⟨Avₚ, wᵣ⟩ = 0.

Letting B ≡ A − ∑ₖ,ₗ ⟨Avₖ, wₗ⟩ wₗ ⊗ vₖ, this shows that Bvₚ = 0 since wᵣ is an arbitrary element of the basis for Y. Since vₚ is an arbitrary element of the basis for X, B = 0 as hoped. This has shown

A = ∑ₖ,ₗ ⟨Avₖ, wₗ⟩ wₗ ⊗ vₖ,

so the wⱼ ⊗ vᵢ span ℒ(X, Y).
It only remains to verify the wⱼ ⊗ vᵢ are linearly independent. Suppose then that

∑ᵢ,ⱼ cᵢⱼ wⱼ ⊗ vᵢ = 0.

Then apply both sides to vₛ. By definition, this gives

0 = ∑ᵢ,ⱼ cᵢⱼ ⟨vₛ, vᵢ⟩ wⱼ = ∑ⱼ cₛⱼ wⱼ.

Now the vectors {w₁, ···, wₘ} are independent because they form an orthonormal set, and so the above implies cₛⱼ = 0 for each j. Since s was arbitrary, this shows the linear transformations wⱼ ⊗ vᵢ form a linearly independent set.  ■
Note this shows the dimension of ℒ(X, Y) is nm. The theorem is also of enormous importance because it shows you can always consider an arbitrary linear transformation as a sum of rank one transformations whose properties are easily understood. The following theorem is also of great interest.
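The spanning part of the proof says any A equals ∑ₖ,ₗ ⟨Avₖ, wₗ⟩ wₗ ⊗ vₖ. This is easy to check numerically over ℝ; here orthonormal bases are built from QR factorizations of random matrices, which is a standard device not taken from the text:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 2
A = rng.standard_normal((m, n))          # an arbitrary A in L(R^n, R^m)

# Orthonormal bases: columns of Q from a QR factorization.
V = np.linalg.qr(rng.standard_normal((n, n)))[0]   # columns v_1, ..., v_n
W = np.linalg.qr(rng.standard_normal((m, m)))[0]   # columns w_1, ..., w_m

# A = sum_{k,l} <A v_k, w_l> w_l ⊗ v_k  (outer products over R)
B = sum(np.dot(A @ V[:, k], W[:, l]) * np.outer(W[:, l], V[:, k])
        for k in range(n) for l in range(m))
assert np.allclose(A, B)
```

The assertion holds because W Wᵀ = I and V Vᵀ = I, so B = W (Wᵀ A V) Vᵀ = A.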
Theorem 11.4.5 Let A = ∑ᵢ,ⱼ cᵢⱼ wᵢ ⊗ vⱼ ∈ ℒ(X, Y) where, as before, the vectors {w₁, ···, wₘ} are an orthonormal basis for Y and the vectors {v₁, ···, vₙ} are an orthonormal basis for X. Then if the matrix of A has entries Mᵢⱼ, it follows that Mᵢⱼ = cᵢⱼ.
Proof: The matrix M of A taken with respect to these bases satisfies Avᵢ = ∑ₖ Mₖᵢ wₖ. On the other hand,

Avᵢ = ∑ₖ,ⱼ cₖⱼ (wₖ ⊗ vⱼ)(vᵢ) = ∑ₖ,ⱼ cₖⱼ ⟨vᵢ, vⱼ⟩ wₖ = ∑ₖ cₖᵢ wₖ,

and so Mₖᵢ = cₖᵢ for all k. This happens for each i. ■
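Over ℝ, A = ∑ᵢ,ⱼ cᵢⱼ wᵢ ⊗ vⱼ reads A = W C Vᵀ in matrix form, where the columns of W and V are the two bases, and the matrix of A with respect to those bases is Wᵀ A V. The theorem says this recovers the coefficient array C. A quick check with illustrative random data:

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 3, 2
V = np.linalg.qr(rng.standard_normal((n, n)))[0]   # orthonormal basis of R^n (columns)
W = np.linalg.qr(rng.standard_normal((m, m)))[0]   # orthonormal basis of R^m (columns)

C = rng.standard_normal((m, n))                    # coefficients c_ij

# A = sum_{i,j} c_ij w_i ⊗ v_j
A = sum(C[i, j] * np.outer(W[:, i], V[:, j])
        for i in range(m) for j in range(n))

# Matrix of A with respect to these bases: M_ki = <A v_i, w_k>, i.e. M = W^T A V.
M = W.T @ A @ V
assert np.allclose(M, C)                           # M_ij = c_ij, as in the theorem
```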