Linear Independence and Bases

If we look somewhat more closely at two of the examples described in the previous section, namely Example 4.1.4 and Example 4.1.3, we notice that although they do have many properties in common there is one striking difference between them. This difference lies in the fact that in the former we can find a finite number of elements, $1, x, x^2, \ldots, x^{n-1}$, such that every element can be written as a combination of these with coefficients from $F$, whereas in the latter no such finite set of elements exists.

We now intend to examine, in some detail, vector spaces which can be generated, as was the space in Example 4.1.4, by a finite set of elements.

DEFINITION: If $V$ is a vector space over $F$ and if $v_1, \ldots, v_n \in V$, then any element of the form
\[
\alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_n v_n,
\]
where the $\alpha_i \in F$, is a linear combination over $F$ of $v_1, \ldots, v_n$.

Since we usually are working with some fixed field $F$ we shall often say linear combination rather than linear combination over $F$. Similarly it will be understood that when we say vector space we mean vector space over $F$.
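For instance, in $F^{(3)}$ (the particular vectors and coefficients here are chosen only for illustration) one such linear combination is
\[
2\,(1, 0, 0) + (-1)\,(0, 1, 1) = (2, -1, -1),
\]
a linear combination over $F$ of $(1, 0, 0)$ and $(0, 1, 1)$ with coefficients $\alpha_1 = 2$, $\alpha_2 = -1$.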

DEFINITION: If $S$ is a nonempty subset of the vector space $V$, then $L(S)$, the linear span of $S$, is the set of all linear combinations of finite sets of elements of $S$.
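For example (the set is chosen only to illustrate the definition), if $S = \{(1, 0, 0), (0, 1, 0)\}$ in $F^{(3)}$, then
\[
L(S) = \{\alpha (1, 0, 0) + \beta (0, 1, 0) \mid \alpha, \beta \in F\} = \{(\alpha, \beta, 0) \mid \alpha, \beta \in F\}.
\]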

We put, after all, into $L(S)$ the elements required by the axioms of a vector space, so it is not surprising to find

LEMMA 4.2.1: $L(S)$ is a subspace of $V$.


The proof of each part of the next lemma is straightforward and easy and we leave the proofs as exercises to the reader.

LEMMA 4.2.2: If $S, T$ are subsets of $V$ then

1. $S \subset T$ implies $L(S) \subset L(T)$.

2. $L(S \cup T) = L(S) + L(T)$.

3. $L(L(S)) = L(S)$.
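To illustrate part 2 with a small example (the sets here are chosen only for illustration), take $S = \{(1, 0, 0)\}$ and $T = \{(0, 1, 0)\}$ in $F^{(3)}$; then $L(S) = \{(\alpha, 0, 0)\}$, $L(T) = \{(0, \beta, 0)\}$, and
\[
L(S \cup T) = \{(\alpha, \beta, 0) \mid \alpha, \beta \in F\} = L(S) + L(T).
\]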

DEFINITION: The vector space $V$ is said to be finite-dimensional (over $F$) if there is a finite subset $S$ in $V$ such that $V = L(S)$.

Note that $F^{(n)}$ is finite-dimensional over $F$, for if $S$ consists of the $n$ vectors
\[
(1, 0, \ldots, 0),\ (0, 1, 0, \ldots, 0),\ \ldots,\ (0, 0, \ldots, 0, 1),
\]
then $F^{(n)} = L(S)$.
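In the same spirit, the space of polynomials of degree less than $n$ over $F$ is finite-dimensional over $F$, for it is the linear span of the finite set $\{1, x, x^2, \ldots, x^{n-1}\}$; the space of all polynomials in $x$ over $F$, on the other hand, is not, since the elements of any finite subset have bounded degree and so cannot span it.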

Although we have defined what is meant by a finite-dimensional space we have not, as yet, defined what is meant by the dimension of a space. This will come shortly.

DEFINITION: If $V$ is a vector space and if $v_1, \ldots, v_n$ are in $V$, we say that they are linearly dependent over $F$ if there exist elements $\lambda_1, \ldots, \lambda_n$ in $F$, not all of them $0$, such that
\[
\lambda_1 v_1 + \lambda_2 v_2 + \cdots + \lambda_n v_n = 0.
\]

If the vectors $v_1, \ldots, v_n$ are not linearly dependent over $F$ they are said to be linearly independent over $F$. Here too we shall often contract the phrase ``linearly dependent over $F$'' to ``linearly dependent.'' Note that if $v_1, \ldots, v_n$ are linearly independent then none of them can be $0$, for if $v_1 = 0$, say, then
\[
\alpha v_1 + 0 v_2 + \cdots + 0 v_n = 0
\]
for any $\alpha \neq 0$ in $F$.

In $F^{(3)}$ it is easy to verify that $(1, 0, 0)$, $(0, 1, 0)$, and $(0, 0, 1)$ are linearly independent, while $(1, 1, 0)$, $(3, 1, 3)$, and $(5, 3, 3)$ are linearly dependent.
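One such dependence relation, which is easy to check directly, is
\[
2\,(1, 1, 0) + 1\,(3, 1, 3) + (-1)\,(5, 3, 3) = (0, 0, 0),
\]
and since not all of the coefficients are $0$, the three vectors are indeed linearly dependent.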

We point out that linear dependence is a function not only of the vectors but also of the field. For instance, the field of complex numbers is a vector space over the field of real numbers and it is also a vector space over the field of complex numbers. The elements $1$ and $i$ in it are linearly independent over the reals but are linearly dependent over the complexes, since $i \cdot 1 + (-1) \cdot i = 0$.

The concept of linear dependence is an absolutely basic and ultra-important one. We now look at some of its properties.

LEMMA 4.2.3: If $v_1, \ldots, v_n \in V$ are linearly independent, then every element in their linear span has a unique representation in the form $\lambda_1 v_1 + \cdots + \lambda_n v_n$ with the $\lambda_i \in F$.


The next theorem, although very easy and at first glance of a somewhat technical nature, has as consequences results which form the very foundations of the subject. We shall list some of these as corollaries; the others will appear in the succession of lemmas and theorems that are to follow.

THEOREM 4.2.1: If $v_1, \ldots, v_n$ are in $V$ then either they are linearly independent or some $v_k$ is a linear combination of the preceding ones, $v_1, \ldots, v_{k-1}$.
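For example (the vectors are chosen only to illustrate the statement), the vectors $(1, 0)$, $(0, 1)$, $(1, 1)$ in $F^{(2)}$ are not linearly independent, and the third is a linear combination of the preceding ones:
\[
(1, 1) = 1\,(1, 0) + 1\,(0, 1).
\]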


COROLLARY 1: If $v_1, \ldots, v_n$ in $V$ have $W$ as linear span and if $v_1, \ldots, v_k$ are linearly independent, then we can find a subset of $v_1, \ldots, v_n$ of the form $v_1, \ldots, v_k, v_{i_1}, \ldots, v_{i_r}$ consisting of linearly independent elements whose linear span is also $W$.


COROLLARY 2: If $V$ is a finite-dimensional vector space, then it contains a finite set $v_1, \ldots, v_n$ of linearly independent elements whose linear span is $V$.


DEFINITION: A subset $S$ of a vector space $V$ is called a basis of $V$ if $S$ consists of linearly independent elements (that is, any finite number of elements in $S$ is linearly independent) and $V = L(S)$.

In this terminology we can rephrase Corollary 2 as

COROLLARY 3: If $V$ is a finite-dimensional vector space and if $v_1, \ldots, v_n$ span $V$, then some subset of $v_1, \ldots, v_n$ forms a basis of $V$.

Corollary 3 asserts that a finite-dimensional vector space has a basis containing a finite number of elements $v_1, \ldots, v_n$. Together with Lemma 4.2.3 this tells us that every element in $V$ has a unique representation in the form $\alpha_1 v_1 + \cdots + \alpha_n v_n$ with $\alpha_1, \ldots, \alpha_n$ in $F$.

Let us see some of the heuristic implications of these remarks. Suppose that $V$ is a finite-dimensional vector space over $F$; as we have seen above, $V$ has a basis $v_1, \ldots, v_n$. Thus every element $v \in V$ has a unique representation in the form $v = \alpha_1 v_1 + \cdots + \alpha_n v_n$. Let us map $V$ into $F^{(n)}$ by defining the image of $\alpha_1 v_1 + \cdots + \alpha_n v_n$ to be $(\alpha_1, \ldots, \alpha_n)$. By the uniqueness of representation in this form, the mapping is well defined, one-to-one, and onto; it can be shown to have all the requisite properties of an isomorphism. Thus $V$ is isomorphic to $F^{(n)}$ for some $n$, where in fact $n$ is the number of elements in some basis of $V$ over $F$. If some other basis of $V$ should have $m$ elements, by the same token $V$ would be isomorphic to $F^{(m)}$. Since both $F^{(n)}$ and $F^{(m)}$ would now be isomorphic to $V$, they would be isomorphic to each other.
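As a concrete instance of this mapping (the particular space and basis are chosen only for illustration), take $V$ to be the space of polynomials of degree less than $3$ over $F$ with basis $1, x, x^2$; the element $\alpha_1 + \alpha_2 x + \alpha_3 x^2$ is sent to $(\alpha_1, \alpha_2, \alpha_3) \in F^{(3)}$, and by the uniqueness of the coefficients this correspondence is one-to-one and onto.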

A natural question then arises! Under what conditions on $n$ and $m$ are $F^{(n)}$ and $F^{(m)}$ isomorphic? Our intuition suggests that this can only happen when $n = m$. Why? For one thing, if $F$ should be a field with a finite number of elements (for instance, if $F$ is the integers modulo the prime number $p$), then $F^{(n)}$ has $p^n$ elements whereas $F^{(m)}$ has $p^m$ elements. Isomorphism would imply that they have the same number of elements, and so we would have $n = m$. From another point of view, if $F$ were the field of real numbers, then $F^{(n)}$ (in what may be a rather vague geometric way to the reader) represents real $n$-space, and our geometric feeling tells us that $n$-space is different from $m$-space for $n \neq m$. Thus we might expect that if $F$ is any field then $F^{(n)}$ is isomorphic to $F^{(m)}$ only if $n = m$. Equivalently, from our earlier discussion, we should expect that any two bases of $V$ have the same number of elements. It is towards this goal that we prove the next lemma.
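To make the counting argument concrete: if $F$ has $2$ elements, then $F^{(3)}$ has $2^3 = 8$ elements while $F^{(2)}$ has $2^2 = 4$ elements, so $F^{(3)}$ and $F^{(2)}$ cannot even be put in one-to-one correspondence, let alone be isomorphic.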

LEMMA 4.2.4: If $v_1, \ldots, v_n$ is a basis of $V$ over $F$ and if $w_1, \ldots, w_m$ in $V$ are linearly independent over $F$, then $m \le n$.


This lemma has as consequences (which we list as corollaries) the basic results spelling out the nature of the dimension of a vector space. These corollaries are of the utmost importance in all that follows, not only in this chapter but in the rest of the book, in fact in all of mathematics. The corollaries are all theorems in their own right.

COROLLARY 1: If $V$ is finite-dimensional over $F$ then any two bases of $V$ have the same number of elements.


COROLLARY 2: $F^{(n)}$ is isomorphic to $F^{(m)}$ if and only if $n = m$.


Corollary 2 puts on a firm footing the heuristic remarks made earlier about the possible isomorphism of $F^{(n)}$ and $F^{(m)}$. As we saw in those remarks, $V$ is isomorphic to $F^{(n)}$ for some $n$. By Corollary 2, this $n$ is unique, thus

COROLLARY 3: If $V$ is finite-dimensional over $F$ then $V$ is isomorphic to $F^{(n)}$ for a unique integer $n$; in fact, $n$ is the number of elements in any basis of $V$ over $F$.

DEFINITION: The integer $n$ in Corollary 3 is called the dimension of $V$ over $F$.

The dimension of $V$ over $F$ is thus the number of elements in any basis of $V$ over $F$.

We shall write the dimension of $V$ over $F$ as $\dim V$ or, the occasional time in which we shall want to stress the role of the field $F$, as $\dim_F V$.

COROLLARY 4: Any two finite-dimensional vector spaces over $F$ of the same dimension are isomorphic.


How much freedom do we have in constructing bases of $V$? The next lemma asserts that starting with any linearly independent set of vectors we can ``blow it up'' to a basis of $V$.

LEMMA 4.2.5: If $V$ is finite-dimensional over $F$ and if $u_1, \ldots, u_m \in V$ are linearly independent, then we can find vectors $u_{m+1}, \ldots, u_{m+r}$ in $V$ such that $u_1, \ldots, u_m, u_{m+1}, \ldots, u_{m+r}$ is a basis of $V$.
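For instance (the vectors are chosen only for illustration), the single vector $u_1 = (1, 1, 0)$ in $F^{(3)}$ is linearly independent, and adjoining $u_2 = (0, 1, 0)$ and $u_3 = (0, 0, 1)$ gives the basis $(1, 1, 0)$, $(0, 1, 0)$, $(0, 0, 1)$ of $F^{(3)}$.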


What is the relation of the dimension of a homomorphic image of $V$ to that of $V$? The answer is provided us by

LEMMA 4.2.6: If $V$ is finite-dimensional and if $W$ is a subspace of $V$, then $W$ is finite-dimensional, $\dim W \le \dim V$, and
\[
\dim V/W = \dim V - \dim W.
\]

COROLLARY: If $A$ and $B$ are finite-dimensional subspaces of a vector space $V$, then $A + B$ is finite-dimensional and
\[
\dim (A + B) = \dim A + \dim B - \dim (A \cap B).
\]
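For a quick check of this formula (the subspaces are chosen only for illustration), take $A = \{(\alpha, \beta, 0)\}$ and $B = \{(0, \beta, \gamma)\}$ in $F^{(3)}$; then $A + B = F^{(3)}$ and $A \cap B = \{(0, \beta, 0)\}$, and indeed $3 = 2 + 2 - 1$.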