be vectors in F^{n}. A linear combination is any expression of the form ∑_{i=1}^{p}c_{i}x_{i} where the c_{i} are scalars. The set of all linear combinations of these vectors is called span(x_{1},⋅⋅⋅,x_{p}).
A nonempty set V ⊆ F^{n} is called a subspace if whenever α,β are scalars and u and v are vectors of V, it follows that αu + βv ∈ V. That is, it is “closed under the algebraic operations of vector addition and scalar multiplication”. The empty set is never a subspace by definition. A linear combination of vectors is said to be trivial if all the scalars in the linear combination equal zero. A set of vectors is said to be linearly independent if the only linear combination of these vectors which equals the zero vector is the trivial linear combination. Thus
{x_{1},⋅⋅⋅,x_{p}} is called linearly independent if whenever ∑_{k=1}^{p}c_{k}x_{k} = 0 it follows that all the scalars c_{k} equal zero. A set of vectors, {x_{1},⋅⋅⋅,x_{p}}, is called linearly dependent if it is not linearly independent. Thus the set of vectors is linearly dependent if there exist scalars c_{i}, i = 1,⋅⋅⋅,p, not all zero, such that ∑_{k=1}^{p}c_{k}x_{k} = 0.
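Over F = Q these definitions can be checked mechanically: the vectors are linearly independent exactly when the matrix having them as rows has rank equal to the number of vectors. A minimal sketch in Python, using exact rational arithmetic (the helper names `rank` and `independent` are ours, not notation from the text):

```python
from fractions import Fraction

def rank(rows):
    """Gauss-Jordan elimination over the rationals; returns the rank."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0]) if m else 0):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue                      # no pivot in this column
        m[r], m[piv] = m[piv], m[r]
        pv = m[r][c]
        m[r] = [x / pv for x in m[r]]     # normalize the pivot row
        for i in range(len(m)):
            if i != r and m[i][c] != 0:   # eliminate the column elsewhere
                m[i] = [a - m[i][c] * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def independent(vectors):
    """True when the only combination summing to zero is the trivial one."""
    return rank(vectors) == len(vectors)

print(independent([[1, 0, 1], [0, 1, 1]]))             # True
print(independent([[1, 0, 1], [0, 1, 1], [1, 1, 2]]))  # False: x3 = x1 + x2
```

Row reduction does not change the span of the rows, which is why the rank counts exactly the nontrivial dependence-free directions.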
Proposition 2.6.2 Let V ⊆ F^{n}. Then V is a subspace if and only if it is a vector space itself with respect to the same operations of scalar multiplication and vector addition.
Proof: Suppose first that V is a subspace. All algebraic properties involving scalar multiplication
and vector addition hold for V because these things hold for F^{n}. Is 0 ∈ V ? Yes it is. This is
because 0v ∈ V and 0v = 0. By assumption, for α a scalar and v ∈ V, αv ∈ V. Therefore, −v = (−1)v ∈ V. Thus V has the additive identity and additive inverses. By assumption, V is closed with respect to the two operations. Thus V is a vector space. Conversely, if V ⊆ F^{n} is a vector space, then by definition, if α,β are scalars and u,v vectors in V, it follows that αu + βv ∈ V.
■
Thus, from the above, subspaces of F^{n} are just subsets of F^{n} which are themselves vector
spaces.
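The closure condition is easy to spot-check numerically. A small sketch, using the hypothetical subspace V = {(t,2t) : t ∈ Q} of Q^{2} (this example set and the helper `in_V` are ours, not the text's):

```python
from fractions import Fraction

def in_V(v):
    """Membership in V = {(t, 2t)}: second coordinate is twice the first."""
    return v[1] == 2 * v[0]

u = (Fraction(1), Fraction(2))
w = (Fraction(3), Fraction(6))
a, b = Fraction(5), Fraction(-2)        # arbitrary scalars
combo = (a * u[0] + b * w[0], a * u[1] + b * w[1])
print(in_V(combo))  # True: alpha*u + beta*w stays in V
```

A check with particular scalars is of course only illustrative; the proposition requires closure for all choices of α, β, u, v.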
Lemma 2.6.3 A set of vectors {x_{1},⋅⋅⋅,x_{p}} is linearly independent if and only if none of the vectors can be obtained as a linear combination of the others.
Proof: Suppose first that {x_{1},⋅⋅⋅,x_{p}} is linearly independent. If x_{k} = ∑_{j≠k}c_{j}x_{j}, then
0 = 1x_{k} + ∑_{j≠k}(−c_{j})x_{j},
a nontrivial linear combination, contrary to assumption. This shows that if the set is linearly independent, then none of the vectors is a linear combination of the others.
Now suppose no vector is a linear combination of the others. Is {x_{1},⋅⋅⋅,x_{p}} linearly independent? If it is not, there exist scalars c_{i}, not all zero, such that ∑_{i=1}^{p}c_{i}x_{i} = 0. Say c_{k} ≠ 0. Then you can solve for x_{k} as
x_{k} = ∑_{j≠k}(−c_{j}∕c_{k})x_{j},
contrary to assumption. ■
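The step of solving for x_{k} can be carried out directly once a nontrivial dependence is in hand. A sketch with the concrete dependence 1x_{1} + 1x_{2} + (−1)x_{3} = 0 (the particular vectors are our own illustration, not from the text):

```python
from fractions import Fraction

x = [[1, 0, 1], [0, 1, 1], [1, 1, 2]]          # x3 = x1 + x2
c = [Fraction(1), Fraction(1), Fraction(-1)]   # c1*x1 + c2*x2 + c3*x3 = 0
k = 2                                          # c_k = -1 is nonzero; solve for x_k

# x_k = sum over j != k of (-c_j / c_k) x_j
xk = [sum(-c[j] / c[k] * x[j][t] for j in range(3) if j != k)
      for t in range(3)]
print(xk == x[k])  # True: the formula recovers x3 = x1 + x2
```

Division by c_{k} is the reason the lemma needs some scalar to be nonzero; the trivial combination gives no vector to solve for.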
The following is called the exchange theorem.
Theorem 2.6.4 (Exchange Theorem) Let {x_{1},⋅⋅⋅,x_{r}} be a linearly independent set of vectors such that each x_{i} is in span(y_{1},⋅⋅⋅,y_{s}). Then r ≤ s.
Proof 1: Suppose not. Then r > s. By assumption, there exist scalars a_{ji} such that
x_{i} = ∑_{j=1}^{s}a_{ji}y_{j}.
The matrix A whose ji^{th} entry is a_{ji} has more columns than rows. Therefore, by Theorem 2.5.2 there exists a nonzero vector b ∈ F^{r} such that Ab = 0. Thus
∑_{i=1}^{r}b_{i}x_{i} = ∑_{i=1}^{r}b_{i}∑_{j=1}^{s}a_{ji}y_{j} = ∑_{j=1}^{s}(∑_{i=1}^{r}a_{ji}b_{i})y_{j} = ∑_{j=1}^{s}(Ab)_{j}y_{j} = 0,
a nontrivial linear combination of the x_{i} which equals zero, contrary to the linear independence of {x_{1},⋅⋅⋅,x_{r}}. Hence r ≤ s. ■
Proof 2: Since each x_{i} is in span(y_{1},⋅⋅⋅,y_{s}), there exist scalars c_{1},⋅⋅⋅,c_{s} such that x_{1} = ∑_{i=1}^{s}c_{i}y_{i}. Not all of these scalars can equal zero because if this were the case, it would follow that x_{1} = 0 and so {x_{1},⋅⋅⋅,x_{r}} would not be linearly independent. Indeed, if x_{1} = 0, then 1x_{1} + ∑_{i=2}^{r}0x_{i} = x_{1} = 0 and so there would exist a nontrivial linear combination of the vectors which equals zero. Say c_{k} ≠ 0. Then y_{k} can be solved for as a linear combination of x_{1} and the remaining y_{j}; denote these remaining s − 1 vectors by z_{1},⋅⋅⋅,z_{s−1}. Then span{x_{1},z_{1},⋅⋅⋅,z_{s−1}} = V, where V ≡ span(y_{1},⋅⋅⋅,y_{s}), because if v ∈ V, there exist constants c_{1},⋅⋅⋅,c_{s} such that
v = ∑_{i=1}^{s−1}c_{i}z_{i} + c_{s}y_{k}.
Now replace the y_{k} in the above with a linear combination of the vectors {x_{1},z_{1},⋅⋅⋅,z_{s−1}} to obtain v ∈ span{x_{1},z_{1},⋅⋅⋅,z_{s−1}}. The vector y_{k}, in the list {y_{1},⋅⋅⋅,y_{s}}, has now been replaced with the vector x_{1} and the resulting modified list of vectors has the same span as the original list of vectors {y_{1},⋅⋅⋅,y_{s}}.
Now suppose that r > s and that span{x_{1},⋅⋅⋅,x_{l},z_{1},⋅⋅⋅,z_{p}} = V, where the vectors z_{1},⋅⋅⋅,z_{p} are each taken from the set {y_{1},⋅⋅⋅,y_{s}} and l + p = s. This has now been done for l = 1 above. Then since r > s, it follows that l ≤ s < r and so l + 1 ≤ r. Therefore, x_{l+1} is a vector not in the list {x_{1},⋅⋅⋅,x_{l}} and since span{x_{1},⋅⋅⋅,x_{l},z_{1},⋅⋅⋅,z_{p}} = V, there exist scalars c_{i} and d_{j} such that
x_{l+1} = ∑_{i=1}^{l}c_{i}x_{i} + ∑_{j=1}^{p}d_{j}z_{j}. (2.26)
Now not all the d_{j} can equal zero because if this were so, it would follow that {x_{1},⋅⋅⋅,x_{r}} would be a linearly dependent set because one of the vectors would equal a linear combination of the others. Therefore, 2.26 can be solved for one of the z_{j}, say z_{k}, in terms of x_{l+1} and the other z_{j}, and just as in the above argument, that z_{k} can be replaced with x_{l+1} to obtain a list of l + 1 of the x_{i} and p − 1 of the z_{j} whose span is still V. Continuing this way, since r > s, the z_{j} are eventually exhausted and span{x_{1},⋅⋅⋅,x_{s}} = V. But then x_{s+1} ∈ span{x_{1},⋅⋅⋅,x_{s}}, so by Lemma 2.6.3, {x_{1},⋅⋅⋅,x_{r}} would be linearly dependent, a contradiction. Hence r ≤ s. ■
In particular, any two bases for a subspace V have the same number of vectors: a basis for V is a linearly independent set of vectors whose span equals V, so if {x_{1},⋅⋅⋅,x_{r}} and {y_{1},⋅⋅⋅,y_{s}} are both bases for V, each is contained in the span of the other.
Proof: From the exchange theorem, r ≤ s and s ≤ r. ■
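The mechanism of Proof 1 can be seen in a small concrete instance with r = 3, s = 2 (the particular vectors below are our own, not from the text): three vectors lying in the span of two force a nonzero b with Ab = 0, and hence a nontrivial dependence among the x_{i}.

```python
# x1 = y1, x2 = y2, x3 = y1 + y2, with y1 = (1,0) and y2 = (0,1).
A = [[1, 0, 1],          # column i holds the coefficients a_{ji} of x_i
     [0, 1, 1]]
b = [1, 1, -1]           # a nonzero vector, found by inspection here; A b = 0
Ab = [sum(A[j][i] * b[i] for i in range(3)) for j in range(2)]
print(Ab)                # [0, 0]

x = [[1, 0], [0, 1], [1, 1]]
# Therefore sum_i b_i x_i = 0: a nontrivial dependence, so the x_i
# cannot be linearly independent, exactly as the theorem predicts.
dep = [sum(b[i] * x[i][t] for i in range(3)) for t in range(2)]
print(dep)               # [0, 0]
```

In general b would come from solving the homogeneous system Ab = 0, which Theorem 2.5.2 guarantees has a nonzero solution whenever A has more columns than rows.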
Definition 2.6.10 Let V be a subspace of F^{n}. Then dim(V), read as the dimension of V, is the number of vectors in a basis.
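With coordinates in Q, the dimension of a span can be computed as the rank of the matrix whose rows are the spanning vectors; by the exchange theorem all bases have the same size, so the answer is well defined. A sketch (the `rank` helper is ours, not notation from the text):

```python
from fractions import Fraction

def rank(rows):
    """Gauss-Jordan elimination over Q; the rank is the dimension of the row span."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0]) if m else 0):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        pv = m[r][c]
        m[r] = [v / pv for v in m[r]]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                m[i] = [a - m[i][c] * e for a, e in zip(m[i], m[r])]
        r += 1
    return r

# dim span{(1,0,1), (0,1,1), (1,1,2)} = 2, since the third vector is the
# sum of the first two.
print(rank([[1, 0, 1], [0, 1, 1], [1, 1, 2]]))  # 2
```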
Of course you should wonder right now whether an arbitrary subspace even has
a basis. In fact it does and this is in the next theorem. First, here is an interesting
lemma.
Lemma 2.6.11 Suppose v ∉ span(u_{1},⋅⋅⋅,u_{k}) and {u_{1},⋅⋅⋅,u_{k}} is linearly independent. Then {u_{1},⋅⋅⋅,u_{k},v} is also linearly independent.
Proof: Suppose ∑_{i=1}^{k}c_{i}u_{i} + dv = 0. It is required to verify that each c_{i} = 0 and that d = 0. But if d ≠ 0, then you can solve for v as a linear combination of the vectors {u_{1},⋅⋅⋅,u_{k}},
v = ∑_{i=1}^{k}(−c_{i}∕d)u_{i},
contrary to assumption. Therefore, d = 0. But then ∑_{i=1}^{k}c_{i}u_{i} = 0 and the linear independence of {u_{1},⋅⋅⋅,u_{k}} implies each c_{i} = 0 also. ■
Theorem 2.6.12Let V be a nonzero subspace of F^{n}. Then V has a basis.
Proof: Let v_{1} ∈ V where v_{1} ≠ 0. If span{v_{1}} = V, stop; {v_{1}} is a basis for V.
Otherwise, there exists v_{2} ∈ V which is not in span{v_{1}}. By Lemma 2.6.11, {v_{1},v_{2}} is a larger linearly independent set of vectors. Continuing this way, the process must stop before n + 1 steps because if not, it would be possible to obtain n + 1 linearly independent vectors contrary to the exchange theorem. ■
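This proof is effectively an algorithm: keep adjoining a vector outside the current span, with Lemma 2.6.11 guaranteeing that independence is preserved at each step. A sketch over Q, drawing candidates from a finite pool since code cannot scan all of V (the helper names are ours):

```python
from fractions import Fraction

def rank(rows):
    """Gauss-Jordan elimination over Q; returns the rank."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0]) if m else 0):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        pv = m[r][c]
        m[r] = [v / pv for v in m[r]]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                m[i] = [a - m[i][c] * e for a, e in zip(m[i], m[r])]
        r += 1
    return r

def grow_basis(start, pool):
    """Extend the independent list `start` by pool vectors not in its span."""
    basis = [list(v) for v in start]
    for v in pool:
        if rank(basis + [list(v)]) > rank(basis):  # v not in span(basis)
            basis.append(list(v))                  # Lemma 2.6.11: still independent
    return basis

# Starting from {v1} = {(1,1,0)} and a pool spanning F^3:
b = grow_basis([[1, 1, 0]], [[1, 0, 0], [0, 1, 0], [0, 0, 1]])
print(len(b))  # 3: a basis of F^3 containing (1,1,0)
```

The rank comparison is exactly the membership test v ∈ span(basis): the rank grows precisely when v adds a new direction.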
In words the following corollary states that any linearly independent set of vectors can be
enlarged to form a basis.
Corollary 2.6.13 Let V be a subspace of F^{n} and let {v_{1},⋅⋅⋅,v_{r}} be a linearly independent set of vectors in V. Then either it is a basis for V or there exist vectors v_{r+1},⋅⋅⋅,v_{s} such that {v_{1},⋅⋅⋅,v_{r},v_{r+1},⋅⋅⋅,v_{s}} is a basis for V.
Proof: This follows immediately from the proof of Theorem 2.6.12. You do exactly the same argument except you start with {v_{1},⋅⋅⋅,v_{r}} rather than {v_{1}}. ■
It is also true that any spanning set of vectors can be restricted to obtain a basis.
Theorem 2.6.14 Let V be a subspace of F^{n} and suppose span(u_{1},⋅⋅⋅,u_{p}) = V where the u_{i} are nonzero vectors. Then there exist vectors {v_{1},⋅⋅⋅,v_{r}} such that {v_{1},⋅⋅⋅,v_{r}} ⊆ {u_{1},⋅⋅⋅,u_{p}} and {v_{1},⋅⋅⋅,v_{r}} is a basis for V.
Proof: Let r be the smallest positive integer with the property that for some set {v_{1},⋅⋅⋅,v_{r}} ⊆ {u_{1},⋅⋅⋅,u_{p}},
span(v_{1},⋅⋅⋅,v_{r}) = V.
Then r ≤ p and it must be the case that {v_{1},⋅⋅⋅,v_{r}} is linearly independent because if it were not so, one of the vectors, say v_{k}, would be a linear combination of the others. But then you could delete this vector from {v_{1},⋅⋅⋅,v_{r}} and the resulting list of r − 1 vectors would still span V, contrary to the definition of r. ■
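Conversely to the extension argument of Corollary 2.6.13, a spanning list can be thinned to a basis by keeping only vectors that raise the rank, discarding any that are combinations of those already kept. A sketch over Q (the helper names are ours, not from the text):

```python
from fractions import Fraction

def rank(rows):
    """Gauss-Jordan elimination over Q; returns the rank."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0]) if m else 0):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        pv = m[r][c]
        m[r] = [v / pv for v in m[r]]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                m[i] = [a - m[i][c] * e for a, e in zip(m[i], m[r])]
        r += 1
    return r

def basis_from_spanning(us):
    """Select a linearly independent sublist of `us` with the same span."""
    kept = []
    for u in us:
        if rank(kept + [list(u)]) > rank(kept):  # u adds a new direction
            kept.append(list(u))                 # else u is a combination of kept
    return kept

spanning = [[1, 0, 1], [2, 0, 2], [0, 1, 1], [1, 1, 2]]
print(basis_from_spanning(spanning))  # [[1, 0, 1], [0, 1, 1]]
```

Here (2,0,2) and (1,1,2) are discarded because each is a linear combination of the vectors already kept, mirroring the deletion step in the proof.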