Chapter 2 Basic Topological and Algebraic Considerations
It is often the case that one desires to consider the complex numbers denoted as ℂ. However, most of the
time in the first part of the book, we have in mind ℝ. Nevertheless, it is convenient to have the theory
pertain just as well to ℂ^{p}. Therefore, we will denote by F that which could be either ℝ or ℂ. The symbol
|x| will refer to the norm of x ∈ F^{p}, defined as

|x| ≡ (∑_{j=1}^{p}|x_{j}|^{2})^{1∕2} where x ≡ (x_{1},⋅⋅⋅,x_{p}), |a + ib| ≡ √(a^{2} + b^{2})
but we could also use the norm ∥⋅∥_{∞} defined as

∥x∥_{∞} = max {|x_{i}|, i = 1,⋅⋅⋅,p}
I assume the reader has seen most of this before in multivariable calculus.
From linear algebra or calculus, ∥⋅∥ is called a norm if the following conditions are satisfied: ∥x∥ ≥ 0 with ∥x∥ = 0 if and only if x = 0, ∥αx∥ = |α|∥x∥ for every scalar α, and ∥x + y∥ ≤ ∥x∥ + ∥y∥.
Thus the comma goes across +: the inner product is additive in each argument, scalars factor out of the first argument, and conjugates of the scalars come out of the second. The properties 2.4-2.6 are called the inner product axioms.
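For F = ℂ these axioms can be checked on concrete vectors. The following Python sketch uses the convention described above, (x,y) ≡ ∑_{j} x_{j} conj(y_{j}); the function name `inner` is illustrative, not from the text.

```python
def inner(x, y):
    # (x, y) = sum_j x_j * conj(y_j): linear in the first argument,
    # conjugate-linear in the second, matching the convention above
    return sum(xj * complex(yj).conjugate() for xj, yj in zip(x, y))

x = [1 + 2j, 3 - 1j]
y = [2 - 1j, 1j]
a = 2 + 3j

# scalars factor out of the first argument ...
assert abs(inner([a * xj for xj in x], y) - a * inner(x, y)) < 1e-12
# ... and come out conjugated from the second
assert abs(inner(x, [a * yj for yj in y]) - a.conjugate() * inner(x, y)) < 1e-12
# additivity "across the comma" in the first argument
z = [1j, 4.0]
assert abs(inner([xj + zj for xj, zj in zip(x, z)], y)
           - (inner(x, y) + inner(z, y))) < 1e-12
```

Note that for real entries the conjugation does nothing, so this reduces to the usual dot product on ℝ^{p}.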
The Cauchy Schwarz inequality, Theorem 1.10.1, says that |(x,y)| ≤ |x||y|. As for ∥⋅∥_{∞}, the triangle inequality follows directly:

∥x + y∥_{∞} = max {|x_{i} + y_{i}|, i ≤ p} ≤ max {|x_{i}| + |y_{i}|, i ≤ p}
≤ max {|x_{i}|, i ≤ p} + max {|y_{i}|, i ≤ p} = ∥x∥_{∞} + ∥y∥_{∞} ■
Note that the above two norms are equivalent in the sense that
∥x∥_{∞} ≤ |x| ≤ √p ∥x∥_{∞}    (*)
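Both norms, and the equivalence just stated, are easy to verify numerically. A minimal Python sketch follows; the function names are illustrative, not from the text.

```python
import math

def euclidean_norm(x):
    # |x| = sqrt(sum |x_j|^2); abs() handles both real and complex entries
    return math.sqrt(sum(abs(xj) ** 2 for xj in x))

def sup_norm(x):
    # ||x||_inf = max |x_j|
    return max(abs(xj) for xj in x)

x = [3 + 4j, -1.0, 2j]   # a vector in C^3, so p = 3
p = len(x)

# the equivalence: ||x||_inf <= |x| <= sqrt(p) * ||x||_inf
assert sup_norm(x) <= euclidean_norm(x) <= math.sqrt(p) * sup_norm(x)
```

The left inequality holds because the maximum summand is at most the whole sum under the square root; the right one because each of the p summands is at most ∥x∥_{∞}^{2}.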
Thus in what follows, it will not matter which norm is being used. Actually, any two norms are equivalent,
which will be shown later. The significance of the Euclidean norm |⋅| is geometrical. See the problems. The
fundamental result pertaining to the inner product just discussed is the Gram Schmidt process presented
next.
Definition 2.0.2 A set of vectors {v_{1},⋅⋅⋅,v_{k}} is called orthonormal if

(v_{i},v_{j}) = δ_{ij} ≡ { 1 if i = j, 0 if i ≠ j }
A very easy proposition follows from this definition.
Proposition 2.0.3 Suppose {v_{1},⋅⋅⋅,v_{k}} is an orthonormal set of vectors. Then it is linearly independent.
Proof: Suppose ∑_{i=1}^{k}c_{i}v_{i} = 0. Then taking dot products with v_{j},

0 = (0,v_{j}) = ∑_{i=1}^{k} c_{i}(v_{i},v_{j}) = ∑_{i=1}^{k} c_{i}δ_{ij} = c_{j}.
Since j is arbitrary, this shows the set is linearly independent as claimed. ■
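The computation in the proof is also the standard way to read off coefficients with respect to an orthonormal set: (∑_{i}c_{i}v_{i}, v_{j}) = c_{j}. A small Python illustration in ℝ^{2} (the vectors and coefficients are my own example):

```python
import math

# an orthonormal pair in R^2: the standard basis rotated by pi/6
t = math.pi / 6
v1 = [math.cos(t), math.sin(t)]
v2 = [-math.sin(t), math.cos(t)]

def inner(x, y):
    # real dot product suffices here
    return sum(xj * yj for xj, yj in zip(x, y))

c1, c2 = 2.0, -3.0
x = [c1 * a + c2 * b for a, b in zip(v1, v2)]   # x = c1 v1 + c2 v2

# taking the inner product with v_j recovers c_j, as in the proof
assert abs(inner(x, v1) - c1) < 1e-12
assert abs(inner(x, v2) - c2) < 1e-12
```

In particular, if x = 0 then every c_{j} must be 0, which is exactly linear independence.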
It turns out that if X is any subspace of F^{m}, then there exists an orthonormal basis for X. This
follows from the use of the next lemma applied to a basis for X. Recall first that from linear
algebra, every subspace of F^{m} has a basis. If this is not familiar, see the appendix on linear
algebra.
Lemma 2.0.4 Let {x_{1},⋅⋅⋅,x_{n}} be a linearly independent subset of F^{p}, p ≥ n. Then there exist orthonormal vectors {u_{1},⋅⋅⋅,u_{n}} which have the property that for each k ≤ n, span (x_{1},⋅⋅⋅,x_{k}) = span (u_{1},⋅⋅⋅,u_{k}).
Proof: Let u_{1} ≡ x_{1}∕|x_{1}|. Thus for k = 1, span (u_{1}) = span (x_{1}) and {u_{1}} is an orthonormal set. Now suppose for some k < n, u_{1},⋅⋅⋅,u_{k} have been chosen such that they are orthonormal and span (u_{1},⋅⋅⋅,u_{k}) = span (x_{1},⋅⋅⋅,x_{k}). Then define

u_{k+1} ≡ (x_{k+1} − ∑_{j=1}^{k}(x_{k+1},u_{j})u_{j}) ∕ |x_{k+1} − ∑_{j=1}^{k}(x_{k+1},u_{j})u_{j}|,

the denominator being nonzero because x_{k+1} ∉ span (x_{1},⋅⋅⋅,x_{k}) = span (u_{1},⋅⋅⋅,u_{k}). By construction u_{k+1} ∈ span (x_{1},⋅⋅⋅,x_{k+1}), and solving for x_{k+1} shows x_{k+1} ∈ span (u_{1},⋅⋅⋅,u_{k+1}), so span (u_{1},⋅⋅⋅,u_{k+1}) = span (x_{1},⋅⋅⋅,x_{k+1}). Letting C denote the reciprocal of the denominator above, for each l ≤ k,

(u_{k+1},u_{l}) = C ((x_{k+1},u_{l}) − ∑_{j=1}^{k}(x_{k+1},u_{j})δ_{lj}) = C ((x_{k+1},u_{l}) − (x_{k+1},u_{l})) = 0.
The vectors {u_{j}}_{j=1}^{n} generated in this way are therefore orthonormal because each vector has unit length. ■
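The construction in this proof translates directly into code. Here is a minimal Python sketch of the process; the names `gram_schmidt` and `inner` are my own, and the example vectors are chosen for illustration.

```python
import math

def gram_schmidt(xs):
    """Orthonormalize a linearly independent list of vectors (lists of
    real or complex numbers), following the proof: subtract off the
    projections onto the previous u_j, then normalize."""
    def inner(x, y):
        # (x, y) = sum_j x_j * conj(y_j)
        return sum(xj * complex(yj).conjugate() for xj, yj in zip(x, y))

    us = []
    for x in xs:
        # v = x_{k+1} - sum_{j<=k} (x_{k+1}, u_j) u_j
        v = list(x)
        for u in us:
            c = inner(x, u)
            v = [vj - c * uj for vj, uj in zip(v, u)]
        n = math.sqrt(inner(v, v).real)   # nonzero by linear independence
        us.append([vj / n for vj in v])
    return us

us = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]])
# the returned u_k satisfy (u_i, u_j) = delta_ij up to rounding error
```

Note this is the classical (textbook) form of the process; numerically more stable variants subtract the projections from the running vector v instead of from x, but in exact arithmetic the two agree.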
The following lemma is a fairly simple observation about the Gram Schmidt process which
says that if you start with orthonormal vectors, the process will not undo what you already
have.
Lemma 2.0.5 Suppose {w_{1},⋅⋅⋅,w_{r},v_{r+1},⋅⋅⋅,v_{p}} is a linearly independent set of vectors such that {w_{1},⋅⋅⋅,w_{r}} is an orthonormal set of vectors. Then when the Gram Schmidt process is applied to the vectors in the given order, it will not change any of the w_{1},⋅⋅⋅,w_{r}.
Proof: Let {u_{1},⋅⋅⋅,u_{p}} be the orthonormal set delivered by the Gram Schmidt process. Then u_{1} = w_{1} because by definition, u_{1} ≡ w_{1}∕|w_{1}| = w_{1}. Now suppose u_{j} = w_{j} for all j ≤ k ≤ r. Then if k < r, consider the definition of u_{k+1},

u_{k+1} ≡ (w_{k+1} − ∑_{j=1}^{k}(w_{k+1},u_{j})u_{j}) ∕ |w_{k+1} − ∑_{j=1}^{k}(w_{k+1},u_{j})u_{j}|.

By induction, u_{j} = w_{j}, and since the w_{j} are orthonormal, each (w_{k+1},u_{j}) = (w_{k+1},w_{j}) = 0 for j ≤ k. Thus the above reduces to w_{k+1}∕|w_{k+1}| = w_{k+1}. ■
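This lemma can be watched in action numerically. A self-contained Python sketch for the real case (the function name and example vectors are my own) runs the process on an already partly orthonormal list:

```python
import math

def gram_schmidt(xs):
    # real-case sketch of the process from Lemma 2.0.4
    us = []
    for x in xs:
        v = list(x)
        for u in us:
            c = sum(xj * uj for xj, uj in zip(x, u))  # (x, u)
            v = [vj - c * uj for vj, uj in zip(v, u)]
        n = math.sqrt(sum(vj * vj for vj in v))
        us.append([vj / n for vj in v])
    return us

# w_1, w_2 are already orthonormal; v_3 completes an independent set
w1, w2, v3 = [1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [1.0, 2.0, 3.0]
us = gram_schmidt([w1, w2, v3])
assert us[0] == w1 and us[1] == w2   # the w_j come back unchanged
```

As in the proof, each (w_{k+1},u_{j}) vanishes, so the subtraction step does nothing and the normalization divides by 1.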
Note how the argument depended only on the axioms of the inner product. Thus it will hold with no change in any inner product space, not just F^{n}. However, our main interest is the special case. An inner product space is just a vector space which has an inner product, that is, a function satisfying 2.4 - 2.6.
Lemma 2.0.6 Suppose V,W are subspaces of F^{n} and they have orthonormal bases, {v_{1},⋅⋅⋅,v_{r}}, {w_{1},⋅⋅⋅,w_{r}} respectively. Let A map V to W be defined by

A (∑_{k=1}^{r} c_{k}v_{k}) ≡ ∑_{k=1}^{r} c_{k}w_{k}

Then |Av| = |v|. That is, A preserves Euclidean norms.
Proof: This follows right away from a computation. If {u_{1},⋅⋅⋅,u_{r}} is orthonormal, then

|∑_{k=1}^{r} c_{k}u_{k}|^{2} = (∑_{k=1}^{r} c_{k}u_{k}, ∑_{k=1}^{r} c_{k}u_{k})