To save on notation, F will denote a field. In this course, it will be either ℝ or ℂ, the real or the complex
numbers.
Definition A.3.1 Let {x_{1}, ⋯, x_{p}} be vectors in F^{n}. A linear combination is any expression of the form
∑_{i=1}^{p} c_{i}x_{i}
where the c_{i} are scalars. The set of all linear combinations of these vectors is called span(x_{1}, ⋯, x_{p}). If V ⊆ F^{n}, then V is called a subspace if whenever α, β are scalars and u and v are vectors of V, it follows that αu + βv ∈ V. That is, V is “closed under the algebraic operations of vector addition and scalar multiplication”. A linear combination of vectors is said to be trivial if all the scalars in the linear combination equal zero. A set of vectors is said to be linearly independent if the only linear combination of these vectors which equals the zero vector is the trivial linear combination. Thus {x_{1}, ⋯, x_{p}} is called linearly independent if whenever
∑_{k=1}^{p} c_{k}x_{k} = 0
it follows that all the scalars c_{k} equal zero. A set of vectors {x_{1}, ⋯, x_{p}} is called linearly dependent if it is not linearly independent. Thus the set of vectors is linearly dependent if there exist scalars c_{i}, i = 1, ⋯, p, not all zero, such that ∑_{k=1}^{p} c_{k}x_{k} = 0.
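As an illustration (not part of the original text), the definition can be checked numerically with NumPy: a set of vectors is linearly independent exactly when the matrix whose columns are those vectors has rank equal to the number of vectors, since a nontrivial solution of ∑_{k} c_{k}x_{k} = 0 is a nonzero null vector of that matrix.

```python
import numpy as np

def is_linearly_independent(vectors):
    """True iff the vectors are linearly independent: the matrix with these
    vectors as columns has full column rank, so c = 0 is the only solution
    of sum_k c_k x_k = 0."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0])
print(is_linearly_independent([e1, e2]))           # True
print(is_linearly_independent([e1, e2, e1 + e2]))  # False: e1 + e2 is a
                                                   # nontrivial combination
```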
Proposition A.3.2 Let V ⊆ F^{n}. Then V is a subspace if and only if it is a vector space itself with respect to the same operations of scalar multiplication and vector addition.
Proof: Suppose first that V is a subspace. All algebraic properties involving scalar multiplication and vector addition hold for V because they hold for F^{n}. Is 0 ∈ V? Yes it is, because for any v ∈ V, 0v ∈ V and 0v = 0. By assumption, for α a scalar and v ∈ V, αv ∈ V. Therefore, −v = (−1)v ∈ V. Thus V has the additive identity and additive inverses. By assumption, V is closed with respect to the two operations. Thus V is a vector space. Conversely, if V ⊆ F^{n} is a vector space, then by definition, if α, β are scalars and u, v are vectors in V, it follows that αu + βv ∈ V, so V is a subspace. ■
Thus, from the above, subspaces of F^{n} are just subsets of F^{n} which are themselves vector
spaces.
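As a numerical illustration of the subspace property (the helper `in_span` is an assumption of this sketch, not part of the text), one can verify that a linear combination αu + βv of two vectors in a span stays inside that span, by checking the least-squares residual of a membership test.

```python
import numpy as np

def in_span(x, vectors, tol=1e-10):
    """Check whether x lies in span(vectors): solve the least-squares
    problem A c = x and test whether the residual vanishes."""
    A = np.column_stack(vectors)
    c, *_ = np.linalg.lstsq(A, x, rcond=None)
    return bool(np.linalg.norm(A @ c - x) < tol)

# V = span(u1, u2) is a subspace of R^3: alpha*u + beta*v stays in V.
u1 = np.array([1.0, 2.0, 0.0])
u2 = np.array([0.0, 1.0, 1.0])
u = 3 * u1 - u2          # an element of V
v = u1 + 4 * u2          # another element of V
print(in_span(2.5 * u - 1.0 * v, [u1, u2]))       # True
print(in_span(np.array([0.0, 0.0, 1.0]), [u1, u2]))  # False: not in V
```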
Lemma A.3.3 A set of vectors {x_{1}, ⋯, x_{p}} is linearly independent if and only if none of the vectors can be obtained as a linear combination of the others.
Proof: Suppose first that {x_{1}, ⋯, x_{p}} is linearly independent. If x_{k} = ∑_{j≠k} c_{j}x_{j}, then
0 = 1·x_{k} + ∑_{j≠k} (−c_{j})x_{j},
a nontrivial linear combination, contrary to assumption. This shows that if the set is linearly independent, then none of the vectors is a linear combination of the others.
Now suppose no vector is a linear combination of the others. Is {x_{1}, ⋯, x_{p}} linearly independent? If it is not, there exist scalars c_{i}, not all zero, such that
∑_{i=1}^{p} c_{i}x_{i} = 0.
Say c_{k} ≠ 0. Then you can solve for x_{k} as
x_{k} = ∑_{j≠k} (−c_{j}∕c_{k})x_{j},
contrary to assumption. ■
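Lemma A.3.3 can be seen in action numerically (an illustration, not part of the text): for a dependent set, one of the vectors can be recovered as a linear combination of the others by solving a linear least-squares problem.

```python
import numpy as np

# A dependent set: x3 = 2*x1 - x2, so by Lemma A.3.3 the set is
# linearly dependent and x3 is a combination of x1 and x2.
x1 = np.array([1.0, 0.0, 1.0])
x2 = np.array([0.0, 1.0, 1.0])
x3 = 2 * x1 - x2

A = np.column_stack([x1, x2])
coeffs, *_ = np.linalg.lstsq(A, x3, rcond=None)  # solve A c = x3
print(np.allclose(A @ coeffs, x3))   # True: x3 lies in span(x1, x2)
print(coeffs)                        # approximately [2, -1]
```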
The following is called the exchange theorem.
Theorem A.3.4 (Exchange Theorem) Let {x_{1}, ⋯, x_{r}} be a linearly independent set of vectors such that each x_{i} is in span(y_{1}, ⋯, y_{s}). Then r ≤ s.
Proof: Suppose not. Then r > s. By assumption, there exist scalars a_{ji} such that
x_{i} = ∑_{j=1}^{s} a_{ji}y_{j}.
The matrix A whose ji^{th} entry is a_{ji} has more columns than rows. Therefore, by Theorem A.2.10 there exists a nonzero vector b ∈ F^{r} such that Ab = 0, that is, ∑_{i=1}^{r} a_{ji}b_{i} = 0 for each j. Thus
∑_{i=1}^{r} b_{i}x_{i} = ∑_{i=1}^{r} b_{i} ∑_{j=1}^{s} a_{ji}y_{j} = ∑_{j=1}^{s} (∑_{i=1}^{r} a_{ji}b_{i}) y_{j} = 0,
a nontrivial linear combination of the x_{i} which equals the zero vector, contrary to the linear independence of {x_{1}, ⋯, x_{r}}. Hence r ≤ s. ■
A basis of a subspace V is a linearly independent set of vectors whose span equals V. An immediate consequence of the exchange theorem is that any two bases {x_{1}, ⋯, x_{r}} and {y_{1}, ⋯, y_{s}} of V contain the same number of vectors.
Proof: From the exchange theorem, r ≤ s and s ≤ r, so r = s. ■
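The exchange theorem can be illustrated numerically (an illustration, not part of the text): any r = 3 vectors lying in the span of s = 2 vectors cannot be linearly independent, because the matrix with those 3 vectors as columns has rank at most 2.

```python
import numpy as np

rng = np.random.default_rng(0)

# s = 2 spanning vectors y1, y2 in R^4 (columns of Y).
Y = rng.standard_normal((4, 2))
# r = 3 vectors x1, x2, x3, each a combination of y1, y2 with
# coefficients a_ji (columns of X lie in span(y1, y2)).
C = rng.standard_normal((2, 3))
X = Y @ C

# rank(X) <= 2 < 3, so the x_i must be linearly dependent (r <= s).
print(np.linalg.matrix_rank(X))
```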
Definition A.3.10 Let V be a subspace of F^{n}. Then dim(V), read as the dimension of V, is the number of vectors in a basis of V.
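Numerically (an illustration, not part of the text), the dimension of a span can be computed as the rank of the matrix whose columns are the given vectors, since a basis of the span has exactly that many vectors.

```python
import numpy as np

def dim_of_span(vectors):
    """dim span(vectors): the number of vectors in a basis of the span,
    computed as the rank of the matrix with the vectors as columns."""
    return int(np.linalg.matrix_rank(np.column_stack(vectors)))

u1 = np.array([1.0, 0.0, 2.0])
u2 = np.array([0.0, 1.0, 0.0])
u3 = u1 + u2                       # redundant: adds nothing to the span
print(dim_of_span([u1, u2, u3]))   # 2
```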
Of course you should wonder right now whether an arbitrary subspace even has a basis. In fact it does
and this is in the next theorem. First, here is an interesting lemma.
Lemma A.3.11 Suppose v ∉ span(u_{1}, ⋯, u_{k}) and {u_{1}, ⋯, u_{k}} is linearly independent. Then {u_{1}, ⋯, u_{k}, v} is also linearly independent.
Proof: Suppose ∑_{i=1}^{k} c_{i}u_{i} + dv = 0. It is required to verify that each c_{i} = 0 and that d = 0. But if d ≠ 0, then you can solve for v as a linear combination of the vectors {u_{1}, ⋯, u_{k}},
v = ∑_{i=1}^{k} (−c_{i}∕d) u_{i},
contrary to assumption. Therefore, d = 0. But then ∑_{i=1}^{k} c_{i}u_{i} = 0 and the linear independence of {u_{1}, ⋯, u_{k}} implies each c_{i} = 0 also. ■
Theorem A.3.12Let V be a nonzero subspace of F^{n}. Then V has a basis.
Proof: Let v_{1} ∈ V where v_{1} ≠ 0. If span{v_{1}} = V, stop; {v_{1}} is a basis for V. Otherwise, there exists v_{2} ∈ V which is not in span{v_{1}}, and by Lemma A.3.11, {v_{1}, v_{2}} is a larger linearly independent set of vectors. Continuing this way, the process must stop before n + 1 steps because if not, it would be possible to obtain n + 1 linearly independent vectors in F^{n}, which is spanned by the n standard basis vectors, contrary to the exchange theorem. ■
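The proof above is in fact an algorithm, and it can be sketched numerically (an illustration, not part of the text): scan the vectors and adjoin each one that is not in the span of those chosen so far; by Lemma A.3.11 the chosen set stays linearly independent throughout.

```python
import numpy as np

def grow_basis(vectors):
    """Build a basis for span(vectors) following the proof of Theorem
    A.3.12: adjoin a vector whenever it is not in the span of the vectors
    chosen so far (Lemma A.3.11 keeps the set linearly independent)."""
    basis = []
    for v in vectors:
        candidate = basis + [v]
        # v is outside span(basis) exactly when adjoining it raises the
        # rank, i.e. the candidate set has full column rank.
        if np.linalg.matrix_rank(np.column_stack(candidate)) == len(candidate):
            basis = candidate
    return basis

vectors = [np.array([1.0, 0.0, 0.0]),
           np.array([2.0, 0.0, 0.0]),   # in span of the first: skipped
           np.array([0.0, 1.0, 1.0])]
print(len(grow_basis(vectors)))         # 2
```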
In words the following corollary states that any linearly independent set of vectors can be enlarged to
form a basis.
Corollary A.3.13 Let V be a subspace of F^{n} and let {v_{1}, ⋯, v_{r}} be a linearly independent set of vectors in V. Then either it is a basis for V or there exist vectors v_{r+1}, ⋯, v_{s} such that {v_{1}, ⋯, v_{r}, v_{r+1}, ⋯, v_{s}} is a basis for V.
Proof: This follows immediately from the proof of Theorem A.3.12. You do exactly the same argument except you start with {v_{1}, ⋯, v_{r}} rather than {v_{1}}. ■
It is also true that any spanning set of vectors can be restricted to obtain a basis.
Theorem A.3.14 Let V be a subspace of F^{n} and suppose span(u_{1}, ⋯, u_{p}) = V where the u_{i} are nonzero vectors. Then there exist vectors {v_{1}, ⋯, v_{r}} such that {v_{1}, ⋯, v_{r}} ⊆ {u_{1}, ⋯, u_{p}} and {v_{1}, ⋯, v_{r}} is a basis for V.
Proof: Let r be the smallest positive integer with the property that for some set {v_{1}, ⋯, v_{r}} ⊆ {u_{1}, ⋯, u_{p}},
span(v_{1}, ⋯, v_{r}) = V.
Then r ≤ p and it must be the case that {v_{1}, ⋯, v_{r}} is linearly independent, because if it were not so, one of the vectors, say v_{k}, would be a linear combination of the others. But then you could delete this vector from {v_{1}, ⋯, v_{r}} and the resulting list of r − 1 vectors would still span V, contrary to the definition of r. ■
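This deletion argument can also be sketched numerically (an illustration, not part of the text): starting from a spanning set, repeatedly delete any vector whose removal does not lower the rank, i.e. any vector that is a linear combination of the remaining ones, until a basis remains.

```python
import numpy as np

def shrink_to_basis(spanning):
    """Restrict a spanning set to a basis, as in Theorem A.3.14: delete
    any vector whose removal leaves the rank (the dimension of the span)
    unchanged, since such a vector is a combination of the others."""
    vecs = list(spanning)
    target = np.linalg.matrix_rank(np.column_stack(vecs))
    i = 0
    while i < len(vecs):
        rest = vecs[:i] + vecs[i + 1:]
        if rest and np.linalg.matrix_rank(np.column_stack(rest)) == target:
            vecs = rest          # vecs[i] was redundant; delete it
        else:
            i += 1               # vecs[i] is needed; keep it
    return vecs

u = [np.array([1.0, 1.0]),
     np.array([2.0, 2.0]),       # a multiple of the first: redundant
     np.array([0.0, 1.0])]
print(len(shrink_to_basis(u)))   # 2
```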