The fundamental idea in linear algebra is the following notion of a linear combination.
Definition 3.1.1 Let x_{1},…,x_{n} be vectors in a vector space V having field of scalars F. A linear combination of these vectors is an expression of the form ∑ _{i=1}^{n}c_{i}x_{i} where each c_{i} ∈ F. The span of x_{1},…,x_{n}, written span(x_{1},…,x_{n}), is the set of all linear combinations of these vectors.
Definition 3.1.2 Let V be a vector space and let S ⊆ V. Then S is linearly independent if, whenever v_{1},…,v_{n} are distinct vectors of S, the only way to have

∑ _{i=1}^{n}c_{i}v_{i} = 0

is to have each c_{i} = 0. More succinctly, if ∑ _{i=1}^{n}c_{i}v_{i} = 0 then each c_{i} = 0. A set S ⊆ V is linearly dependent if it is not linearly independent. That is, there are some distinct vectors v_{1},…,v_{n} of S and scalars c_{i}, not all zero, such that ∑ _{i=1}^{n}c_{i}v_{i} = 0.
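For F = ℝ the definition can be tested numerically: vectors are independent exactly when the matrix having them as columns has rank equal to the number of columns, since a rank deficiency means some nontrivial combination of the columns equals 0. A minimal sketch, assuming numpy is available (not part of the text):

```python
import numpy as np

# Columns are the vectors v_1, v_2, v_3 in R^3; the third is v_1 + v_2.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0]])

# Independent iff no nontrivial combination gives 0, i.e. full column rank.
rank = np.linalg.matrix_rank(A)
print(rank == A.shape[1])  # False: the set is linearly dependent
```

Here v_{1} + v_{2} − v_{3} = 0 is a nontrivial combination equal to zero, so the set is dependent.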
The following is a useful equivalent description of what it means to be independent.
Proposition 3.1.3 A set of vectors S is independent if and only if no vector is a linear combination of the others.
Proof: ⇒ Suppose S is linearly independent. Could you have

v_{k} = ∑ _{i≠k}c_{i}v_{i}

for some v_{k} ∈ S and distinct vectors v_{i} ∈ S? No. This is not possible because if the above holds, then you would have

0 = (−1)v_{k} + ∑ _{i≠k}c_{i}v_{i},

a linear combination of distinct vectors of S which equals 0 but has a nonzero coefficient, namely −1, in contradiction to the assumption that S is linearly independent.
⇐ Suppose now that no vector in S is a linear combination of the others. Suppose ∑ _{i=1}^{n}c_{i}u_{i} = 0 where the u_{i} are distinct vectors of S. It is desired to show that whenever this happens, each c_{i} = 0. Could any of the c_{i} be non zero? No. If c_{k}≠0, then you would have

c_{k}u_{k} = −∑ _{i≠k}c_{i}u_{i}

and so

u_{k} = ∑ _{i≠k}(−c_{i}∕c_{k})u_{i},

showing that one can obtain u_{k} as a linear combination of the other vectors after all. It follows that all c_{i} = 0 and so S is linearly independent. ■
Example 3.1.4 Determine whether the real valued functions defined on ℝ given by the polynomials

x^{2} + 2x + 1, 1, x^{3} + 2x^{2}

are independent.

Suppose

a(x^{2} + 2x + 1) + b ⋅ 1 + c(x^{3} + 2x^{2}) = 0 for all x,

then differentiate both sides to obtain

a(2x + 2) + c(3x^{2} + 4x) = 0.

Now differentiate again.

2a + c(6x + 4) = 0

In the second equation, let x = −1. Then −c = 0 so c = 0. Thus the third equation reduces to 2a = 0, so a = 0, and then the first equation forces b = 0. Hence the three polynomials are independent.
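Independence of polynomials can also be verified symbolically: a polynomial is identically zero exactly when all of its coefficients vanish. A sketch using sympy (not part of the text), with x^{2} + 2x + 1, 1 and x^{3} + 2x^{2} as a concrete trio of polynomials:

```python
import sympy as sp

x, a, b, c = sp.symbols('x a b c')
# A combination of the three polynomials, to be set identically equal to zero.
expr = a*(x**2 + 2*x + 1) + b*1 + c*(x**3 + 2*x**2)
# Identically zero means every coefficient of the polynomial in x is zero.
coeffs = sp.Poly(expr, x).all_coeffs()
sol = sp.solve(coeffs, [a, b, c])
print(sol)  # {a: 0, b: 0, c: 0}: only the trivial combination works
```

The coefficient equations here are c = 0, a + 2c = 0, 2a = 0 and a + b = 0, which have only the trivial solution, confirming independence.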
The main theorem is the following, called the replacement or exchange theorem. It uses the argument of the second half of the above proposition repeatedly.

Theorem 3.1.5 Let {u_{1},…,u_{r}} be a linearly independent set of vectors and suppose each u_{i} is in span(v_{1},…,v_{s}). Then r ≤ s.
Proof: Say r > s and argue for a contradiction. By assumption, u_{1} = ∑ _{i=1}^{s}b_{i}v_{i}. Not all of the b_{i} can equal 0 because if this were so, you would have u_{1} = 0 which would violate the assumption that {u_{1},…,u_{r}} is linearly independent, because

1u_{1} + 0u_{2} + ⋯ + 0u_{r} = 0

since u_{1} = 0. Thus some v_{i}, say v_{i_{1}}, is a linear combination of the vector u_{1} along with the v_{j} for j≠i_{1}. It follows that

span(u_{1},v_{1},…,v̂_{i_{1}},…,v_{s}) = span(v_{1},…,v_{s}),

where the hat indicates that the vector v_{i_{1}} is omitted. In particular, u_{2} is in the span on the left, so

u_{2} = cu_{1} + ∑ _{j≠i_{1}}d_{j}v_{j}.

By the assumption that {u_{1},u_{2}} is linearly independent, not all of the d_{j} can equal 0, since otherwise u_{2} − cu_{1} = 0 would be a nontrivial combination equal to zero. Thus some v_{i_{2}} with i_{2}≠i_{1} is a linear combination of u_{1},u_{2} and the remaining v_{j}, and it can be deleted from the spanning set just as before. Continuing this replacement process, it follows that since r > s, one can eliminate all of the vectors v_{1},…,v_{s} and obtain that u_{s+1},…,u_{r} are all in span(u_{1},…,u_{s}). In particular, u_{s+1} is a linear combination of u_{1},…,u_{s}, which contradicts the linear independence of {u_{1},…,u_{r}} by Proposition 3.1.3. Hence r ≤ s. ■
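As a concrete instance of this count, take s = 2: any three vectors in ℝ^{2} = span(e_{1},e_{2}) must be dependent, and a dependence can be exhibited by solving for one vector in terms of the others. A numerical sketch, assuming numpy and a particular choice of vectors (not from the text):

```python
import numpy as np

# Three vectors in R^2: more vectors than the two spanning vectors e_1, e_2,
# so by the exchange theorem they cannot be linearly independent.
u1 = np.array([1.0, 2.0])
u2 = np.array([3.0, 1.0])
u3 = np.array([4.0, 0.0])

# Express u3 as c1*u1 + c2*u2 by solving a 2x2 linear system.
c = np.linalg.solve(np.column_stack([u1, u2]), u3)
print(np.allclose(c[0]*u1 + c[1]*u2, u3))  # True: u3 is a combination of u1, u2
```

By Proposition 3.1.3, exhibiting one vector as a combination of the others shows the set {u_{1},u_{2},u_{3}} is dependent.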
Next is the definition of dimension and basis of a vector space.
Definition 3.1.6 Let V be a vector space with field of scalars F. A subset S of V is a basis for V means that span(S) = V and S is linearly independent.
The plural of basis is bases; it is pronounced this way to avoid hissing when referring to more than one.
The dimension of a vector space is the number of vectors in a basis. A vector space is finite dimensional if it equals the span of some finite set of vectors.
Lemma 3.1.7 Let S be a linearly independent set of vectors in a vector space V. Suppose v∉span(S). Then S ∪{v} is also linearly independent.

Proof: Suppose

av + ∑ _{i=1}^{n}b_{i}u_{i} = 0

where a,b_{1},…,b_{n} are scalars and the u_{i} are distinct vectors of S. If a≠0, then

v = ∑ _{i=1}^{n}(−b_{i}∕a)u_{i},

contrary to the assumption that v∉span(S). Hence a = 0 and so

∑ _{i=1}^{n}b_{i}u_{i} = 0.

But S is linearly independent and so by assumption each b_{i} = 0. ■
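A quick numerical check of this phenomenon in ℝ^{3}, assuming numpy and a particular choice of S and v (not from the text): a vector outside the span of an independent set can be adjoined without losing independence, which a rank computation makes visible.

```python
import numpy as np

S = [np.array([1.0, 0.0, 0.0]),
     np.array([0.0, 1.0, 0.0])]      # a linearly independent set S
v = np.array([1.0, 2.0, 3.0])        # nonzero third entry, so v is not in span(S)

A_S = np.column_stack(S)
A_Sv = np.column_stack(S + [v])

# If v were in span(S), adjoining it would leave the rank unchanged; here it rises.
print(np.linalg.matrix_rank(A_Sv) > np.linalg.matrix_rank(A_S))   # True
# S union {v} is independent: the three columns have full rank.
print(np.linalg.matrix_rank(A_Sv) == A_Sv.shape[1])               # True
```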
Proposition 3.1.8 Let V be a finite dimensional nonzero vector space with field of scalars F. Then it has a basis and also any two bases have the same number of vectors so the above definition of a basis is well defined.
Proof: Pick u_{1}≠0. If span(u_{1}) = V, then {u_{1}} is a basis. If span(u_{1})≠V, pick u_{2}∉span(u_{1}). By Lemma 3.1.7, {u_{1},u_{2}} is linearly independent. Continue adjoining vectors in this way. Since V is finite dimensional, it is the span of finitely many vectors, say m of them, and so by the exchange theorem no linearly independent set in V can have more than m vectors. Hence the process must stop after finitely many steps with span(u_{1},…,u_{k}) = V, and then {u_{1},…,u_{k}} is a basis.

If {u_{1},…,u_{k}} and {w_{1},…,w_{l}} are two bases, then each is a linearly independent set contained in the span of the other, so the exchange theorem applied both ways gives k ≤ l and l ≤ k. Thus k = l and any two bases have the same number of vectors. ■
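The first half of this proof is effectively an algorithm: keep adjoining a vector not in the current span until the span is everything. For F = ℝ this can be sketched with a rank test; numpy is assumed, and build_basis is a name chosen here for illustration:

```python
import numpy as np

def build_basis(spanning_vectors):
    """Extract a basis of span(spanning_vectors), mimicking the proof."""
    basis = []
    for v in spanning_vectors:
        candidate = basis + [v]
        # v lies outside span(basis) exactly when the rank goes up, and
        # then Lemma 3.1.7 says the enlarged list stays independent.
        if np.linalg.matrix_rank(np.column_stack(candidate)) == len(candidate):
            basis.append(v)
    return basis

vectors = [np.array([1.0, 0.0, 1.0]),
           np.array([2.0, 0.0, 2.0]),   # 2 * first vector: skipped
           np.array([0.0, 1.0, 0.0]),
           np.array([1.0, 1.0, 1.0])]   # first + third: skipped
print(len(build_basis(vectors)))  # 2
```

The two redundant vectors are discarded, leaving a basis of the two dimensional span of the four given vectors.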
As a specific example, consider F^{n} as the vector space. As mentioned above, these are the mappings from

{1,2,…,n} to F

with addition and scalar multiplication defined as

(x + y)(i) = x(i) + y(i), (cx)(i) = c(x(i)).

Also, when referring to vectors in F^{n}, it is customary to denote them as bold faced letters, which is a convention I will begin to observe at this point. It is also more convenient to write these vectors in F^{n} as columns of numbers. Thus, writing x_{i} for the value x(i),

x = (x_{1},…,x_{n})^{T},

the column vector whose i^{th} entry is x_{i}.
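In code, with F = ℝ and n = 3, these componentwise operations are exactly what numpy arrays provide; a minimal sketch (numpy assumed, not part of the text):

```python
import numpy as np

# Vectors in R^3, i.e. functions from {1, 2, 3} into R, stored componentwise.
x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])
c = 2.0

print((x + y).tolist())  # [5.0, 7.0, 9.0]   (x + y)(i) = x(i) + y(i)
print((c * x).tolist())  # [2.0, 4.0, 6.0]   (c x)(i) = c x(i)
```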
The following observation uses these concepts to identify the dimension of F^{n}.
Observation 3.1.9 F^{n} has dimension n. To see this, note that a basis is e_{1},…,e_{n} where

e_{i} = (0,…,0,1,0,…,0)^{T},

the vector in F^{n} which has a 1 in the i^{th} position and a zero everywhere else.

Indeed, note that

(x_{1},…,x_{n})^{T} = ∑ _{i=1}^{n}x_{i}e_{i}

and that if

∑ _{i=1}^{n}x_{i}e_{i} = 0,

then

(x_{1},…,x_{n})^{T} = (0,…,0)^{T},

so each x_{i} is zero. Thus this set of vectors is a spanning set and is linearly independent so it is a basis. There are n of these vectors and so the dimension of F^{n} is indeed n.
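The two defining properties of this basis can be checked at once for, say, n = 4, since the columns of the identity matrix are exactly e_{1},…,e_{n} (numpy assumed, not part of the text):

```python
import numpy as np

n = 4
E = np.eye(n)                        # columns are e_1, ..., e_n
x = np.array([3.0, -1.0, 0.0, 2.0])

# Spanning: x equals the combination sum_i x_i e_i, which is E @ x.
print(np.allclose(E @ x, x))         # True
# Independence: the n columns have full rank n.
print(np.linalg.matrix_rank(E) == n) # True
```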
There is a fundamental observation about linear combinations of vectors in F^{n} which is stated next.

Proof: If the conclusion were not so, then by definition,