Linear Independence and Bases
If we look somewhat more closely at two of the examples described in the previous section, namely Example 4.1.4 and Example 4.1.3,
we notice that although they do have many properties in common there is one striking difference between them. This difference lies in the fact that in the former we can find a finite number of elements, $1, x, x^2, \ldots, x^{n-1}$, such that every element can be written as a combination of these with coefficients from $F$, whereas in the latter no such finite set of elements exists.
We now intend to examine, in some detail, vector spaces which can be generated, as was the space in Example 4.1.4, by a finite set of elements.
DEFINITION: If $V$ is a vector space over $F$ and if $v_1, \ldots, v_n \in V$, then any element of the form
$$\alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_n v_n,$$
where the $\alpha_i \in F$, is a linear combination over $F$ of $v_1, \ldots, v_n$.
Since we usually are working with some fixed field $F$ we shall often say linear combination rather than linear combination over $F$. Similarly it will be understood that when we say vector space we mean vector space over $F$.
DEFINITION: If $S$ is a nonempty subset of the vector space $V$, then $L(S)$, the linear span of $S$, is the set of all linear combinations of finite sets of elements of $S$.
We put, after all, into $L(S)$ the elements required by the axioms of a vector space, so it is not surprising to find
LEMMA 4.2.1: $L(S)$ is a subspace of $V$.
Proof: If $v$ and $w$ are in $L(S)$, then
$$v = \lambda_1 s_1 + \cdots + \lambda_n s_n \quad \text{and} \quad w = \mu_1 t_1 + \cdots + \mu_m t_m,$$
where the $\lambda$'s and $\mu$'s are in $F$ and the $s_i$ and $t_j$ are all in $S$. Thus, for $\alpha, \beta \in F$,
$$\alpha v + \beta w = \alpha(\lambda_1 s_1 + \cdots + \lambda_n s_n) + \beta(\mu_1 t_1 + \cdots + \mu_m t_m) = (\alpha\lambda_1)s_1 + \cdots + (\alpha\lambda_n)s_n + (\beta\mu_1)t_1 + \cdots + (\beta\mu_m)t_m,$$
and so is again in $L(S)$. $L(S)$ has been shown to be a subspace of $V$.
The proof of each part of the next lemma is straightforward, and we leave the proofs as exercises for the reader.
LEMMA 4.2.2: If $S, T$ are subsets of $V$, then
1. $S \subset T$ implies $L(S) \subset L(T)$.
2. $L(S \cup T) = L(S) + L(T)$.
3. $L(L(S)) = L(S)$.
DEFINITION: The vector space $V$ is said to be finite-dimensional (over $F$) if there is a finite subset $S$ in $V$ such that $V = L(S)$.
Note that $F^{(n)}$ is finite-dimensional over $F$, for if $S$ consists of the $n$ vectors $(1, 0, \ldots, 0)$, $(0, 1, 0, \ldots, 0)$, \ldots, $(0, 0, \ldots, 0, 1)$, then $F^{(n)} = L(S)$.
Although we have defined what is meant by a finite-dimensional space we have not, as yet, defined what is meant by the dimension of a space. This will come shortly.
DEFINITION: If $V$ is a vector space and if $v_1, \ldots, v_n$ are in $V$, we say that they are linearly dependent over $F$ if there exist elements $\lambda_1, \ldots, \lambda_n$ in $F$, not all of them $0$, such that
$$\lambda_1 v_1 + \lambda_2 v_2 + \cdots + \lambda_n v_n = 0.$$
If the vectors $v_1, \ldots, v_n$ are not linearly dependent over $F$, they are said to be linearly independent over $F$. Here too we shall often contract the phrase ``linearly dependent over $F$'' to ``linearly dependent.'' Note that if $v_1, \ldots, v_n$ are linearly independent then none of them can be $0$, for if $v_1 = 0$, say, then
$$\alpha v_1 + 0 v_2 + \cdots + 0 v_n = 0$$
for any $\alpha \neq 0$ in $F$.
In $F^{(3)}$ it is easy to verify that $(1, 0, 0)$, $(0, 1, 0)$, and $(0, 0, 1)$ are linearly independent while $(1, 1, 0)$, $(3, 1, 3)$, and $(5, 3, 3)$ are linearly dependent.
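Such dependence claims can be checked mechanically. The sketch below is our own illustration, not part of the text: it computes the rank of a list of vectors over the rationals by Gaussian elimination with exact `Fraction` arithmetic, so that a list of $n$ vectors is linearly independent exactly when its rank is $n$.

```python
from fractions import Fraction

def rank(vectors):
    """Rank of a list of vectors over Q, by Gaussian elimination
    with exact Fraction arithmetic (no floating-point error)."""
    rows = [[Fraction(x) for x in v] for v in vectors]
    r = 0                                   # pivots found so far
    for col in range(len(rows[0]) if rows else 0):
        piv = next((i for i in range(r, len(rows)) if rows[i][col] != 0), None)
        if piv is None:
            continue                        # no pivot in this column
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(r + 1, len(rows)):
            f = rows[i][col] / rows[r][col]
            rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

v1, v2, v3 = (1, 1, 0), (3, 1, 3), (5, 3, 3)
assert rank([(1, 0, 0), (0, 1, 0), (0, 0, 1)]) == 3   # independent
assert rank([v1, v2, v3]) == 2                        # dependent
# an explicit dependence: 2*v1 + v2 - v3 = 0
assert all(2 * a + b - c == 0 for a, b, c in zip(v1, v2, v3))
```

Working over `Fraction` rather than floats matters here: rank is discontinuous, and rounding error in floating-point elimination can misreport dependence as independence.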
We point out that linear dependence is a function not only of the vectors but also of the field. For instance, the field of complex numbers is a vector space over the field of real numbers and it is also a vector space over the field of complex numbers. The elements $1$ and $i$ in it are linearly independent over the reals but are linearly dependent over the complexes, since $i \cdot 1 + (-1) \cdot i = 0$.
The concept of linear dependence is an absolutely basic and ultra-important one. We now look at some of its properties.
LEMMA 4.2.3: If $v_1, \ldots, v_n$ are linearly independent, then every element in their linear span has a unique representation in the form $\lambda_1 v_1 + \cdots + \lambda_n v_n$ with the $\lambda_i \in F$.
Proof: By definition, every element in the linear span is of the form $\lambda_1 v_1 + \cdots + \lambda_n v_n$. To show uniqueness we must demonstrate that if
$$\lambda_1 v_1 + \cdots + \lambda_n v_n = \mu_1 v_1 + \cdots + \mu_n v_n,$$
then we certainly have
$$(\lambda_1 - \mu_1)v_1 + \cdots + (\lambda_n - \mu_n)v_n = 0,$$
which by the linear independence of $v_1, \ldots, v_n$ forces $\lambda_1 = \mu_1, \lambda_2 = \mu_2, \ldots, \lambda_n = \mu_n$.
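The unique coefficients can also be recovered concretely: for a linearly independent set, the system $\lambda_1 v_1 + \cdots + \lambda_n v_n = v$ has exactly one solution. The sketch below is our own illustration (the basis and target are made-up examples, not from the text); it finds the coefficients by Gauss-Jordan elimination over the rationals.

```python
from fractions import Fraction

def coordinates(basis, target):
    """Coefficients lam with sum(lam[i] * basis[i]) == target, found by
    Gauss-Jordan elimination. Assumes basis is linearly independent and
    target lies in its span, so the solution exists and is unique."""
    n, m = len(basis), len(basis[0])
    # augmented matrix: columns are the basis vectors, last column the target
    aug = [[Fraction(basis[j][i]) for j in range(n)] + [Fraction(target[i])]
           for i in range(m)]
    for col in range(n):
        # linear independence guarantees a pivot for every column
        piv = next(i for i in range(col, m) if aug[i][col] != 0)
        aug[col], aug[piv] = aug[piv], aug[col]
        aug[col] = [a / aug[col][col] for a in aug[col]]
        for i in range(m):
            if i != col and aug[i][col] != 0:
                f = aug[i][col]
                aug[i] = [a - f * b for a, b in zip(aug[i], aug[col])]
    return [aug[i][n] for i in range(n)]

basis = [(1, 1, 0), (0, 1, 1), (1, 0, 1)]   # linearly independent in Q^3
target = (1, 5, 2)                          # = 2*b0 + 3*b1 - 1*b2
assert coordinates(basis, target) == [2, 3, -1]
```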
The next theorem, although very easy and at first glance of a somewhat technical nature, has as consequences results which form the very foundations of the subject. We shall list some of these as corollaries; the others will appear in the succession of lemmas and theorems that are to follow.
THEOREM 4.2.1: If $v_1, \ldots, v_n$ are in $V$, then either they are linearly independent or some $v_k$ is a linear combination of the preceding ones, $v_1, \ldots, v_{k-1}$.
Proof: If $v_1, \ldots, v_n$ are linearly independent there is, of course, nothing to prove. Suppose then that
$$\alpha_1 v_1 + \cdots + \alpha_n v_n = 0,$$
where not all the $\alpha$'s are $0$. Let $k$ be the largest integer for which $\alpha_k \neq 0$. Since $\alpha_i = 0$ for $i > k$, $\alpha_1 v_1 + \cdots + \alpha_k v_k = 0$ which, since $\alpha_k \neq 0$, implies that
$$v_k = \alpha_k^{-1}(-\alpha_1 v_1 - \cdots - \alpha_{k-1} v_{k-1}) = (-\alpha_k^{-1}\alpha_1)v_1 + \cdots + (-\alpha_k^{-1}\alpha_{k-1})v_{k-1}.$$
Thus $v_k$ is a linear combination of its predecessors.
COROLLARY 1: If $v_1, \ldots, v_n$ in $V$ have $W$ as linear span and if $v_1, \ldots, v_k$ are linearly independent, then we can find a subset of $v_1, \ldots, v_n$ of the form $v_1, \ldots, v_k, v_{i_1}, \ldots, v_{i_r}$ consisting of linearly independent elements whose linear span is also $W$.
Proof: If $v_1, \ldots, v_n$ are linearly independent we are done. If not, weed out from this set the first $v_j$ which is a linear combination of its predecessors. Since $v_1, \ldots, v_k$ are linearly independent, $j > k$. The subset so constructed, $v_1, \ldots, v_k, \ldots, v_{j-1}, v_{j+1}, \ldots, v_n$, has $n - 1$ elements. Clearly its linear span is contained in $W$. However, we claim that it is actually equal to $W$; for, given $w \in W$, $w$ can be written as a linear combination of $v_1, \ldots, v_n$. But in this linear combination we can replace $v_j$ by a linear combination of $v_1, \ldots, v_{j-1}$. That is, $w$ is a linear combination of $v_1, \ldots, v_{j-1}, v_{j+1}, \ldots, v_n$.
Continuing this weeding-out process, we reach a subset $v_1, \ldots, v_k, v_{i_1}, \ldots, v_{i_r}$ whose linear span is still $W$ but in which no element is a linear combination of the preceding ones. By Theorem 4.2.1 the elements $v_1, \ldots, v_k, v_{i_1}, \ldots, v_{i_r}$ must be linearly independent.
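The weeding-out argument is effectively an algorithm. The sketch below is ours, not the text's: it keeps a vector only when it is not a linear combination of the vectors already kept, detecting this by whether adjoining it raises the rank (computed by exact Gaussian elimination).

```python
from fractions import Fraction

def rank(vectors):
    """Rank over Q by Gaussian elimination with exact Fraction arithmetic."""
    rows = [[Fraction(x) for x in v] for v in vectors]
    r = 0
    for col in range(len(rows[0]) if rows else 0):
        piv = next((i for i in range(r, len(rows)) if rows[i][col] != 0), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(r + 1, len(rows)):
            f = rows[i][col] / rows[r][col]
            rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

def weed(vectors):
    """Corollary 1's weeding-out: drop each vector that is a linear
    combination of its predecessors; the linear span is preserved."""
    kept = []                              # kept stays linearly independent
    for v in vectors:
        if rank(kept + [v]) > len(kept):   # v independent of those kept
            kept.append(v)
    return kept

vs = [(1, 0, 0), (0, 1, 0), (1, 1, 0), (0, 0, 1)]  # third = first + second
assert weed(vs) == [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
assert rank(weed(vs)) == rank(vs)   # same dimension of linear span
```

As in the corollary, an initial linearly independent segment is never removed, since none of its members is a combination of predecessors.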
COROLLARY 2: If $V$ is a finite-dimensional vector space, then it contains a finite set $v_1, \ldots, v_n$ of linearly independent elements whose linear span is $V$.
Proof: Since $V$ is finite-dimensional, it is the linear span of a finite number of elements $u_1, \ldots, u_m$. By Corollary 1 we can find a subset of these, denoted by $v_1, \ldots, v_n$, consisting of linearly independent elements whose linear span must also be $V$.
DEFINITION: A subset $S$ of a vector space $V$ is called a basis of $V$ if $S$ consists of linearly independent elements (that is, any finite number of elements in $S$ is linearly independent) and $V = L(S)$.
In this terminology we can rephrase Corollary 2 as
COROLLARY 3: If $V$ is a finite-dimensional vector space and if $u_1, \ldots, u_m$ span $V$, then some subset of $u_1, \ldots, u_m$ forms a basis of $V$.
Corollary 3 asserts that a finite-dimensional vector space has a basis containing a finite number of elements $v_1, \ldots, v_n$. Together with Lemma 4.2.3 this tells us that every element in $V$ has a unique representation in the form $\lambda_1 v_1 + \cdots + \lambda_n v_n$ with $\lambda_1, \ldots, \lambda_n$ in $F$.
Let us see some of the heuristic implications of these remarks. Suppose that $V$ is a finite-dimensional vector space over $F$; as we have seen above, $V$ has a basis $v_1, \ldots, v_n$. Thus every element $v \in V$ has a unique representation in the form $v = \lambda_1 v_1 + \cdots + \lambda_n v_n$. Let us map $V$ into $F^{(n)}$ by defining the image of $\lambda_1 v_1 + \cdots + \lambda_n v_n$ to be $(\lambda_1, \ldots, \lambda_n)$. By the uniqueness of representation in this form, the mapping is well defined, one-to-one, and is onto; it can be shown to have all the requisite properties of an isomorphism. Thus $V$ is isomorphic to $F^{(n)}$ for some $n$, where in fact $n$ is the number of elements in some basis of $V$ over $F$. If some other basis of $V$ should have $m$ elements, by the same token $V$ would be isomorphic to $F^{(m)}$. Since both $F^{(n)}$ and $F^{(m)}$ would now be isomorphic to $V$, they would be isomorphic to each other.
A natural question then arises: under what conditions on $n$ and $m$ are $F^{(n)}$ and $F^{(m)}$ isomorphic? Our intuition suggests that this can only happen when $n = m$. Why? For one thing, if $F$ should be a field with a finite number of elements --- for instance, if $F = J_p$, the integers modulo the prime number $p$ --- then $F^{(n)}$ has $p^n$ elements whereas $F^{(m)}$ has $p^m$ elements. Isomorphism would imply that they have the same number of elements, and so we would have $n = m$. From another point of view, if $F$ were the field of real numbers, then $F^{(n)}$ (in what may be a rather vague geometric way to the reader) represents real $n$-space, and our geometric feeling tells us that $n$-space is different from $m$-space for $n \neq m$. Thus we might expect that if $F$ is any field then $F^{(n)}$ is isomorphic to $F^{(m)}$ only if $n = m$. Equivalently, from our earlier discussion, we should expect that any two bases of $V$ have the same number of elements. It is towards this goal that we prove the next lemma.
LEMMA 4.2.4: If $v_1, \ldots, v_n$ is a basis of $V$ over $F$ and if $w_1, \ldots, w_m$ in $V$ are linearly independent over $F$, then $m \le n$.
Proof: Every vector in $V$, so in particular $w_m$, is a linear combination of $v_1, \ldots, v_n$. Therefore the vectors $w_m, v_1, \ldots, v_n$ are linearly dependent. Moreover, they span $V$ since $v_1, \ldots, v_n$ already do so. Thus some proper subset of these, $w_m, v_{i_1}, \ldots, v_{i_k}$ with $k \le n - 1$, forms a basis of $V$. We have ``traded off'' one $w$, in forming this new basis, for at least one $v_i$. Repeat the procedure with the set $w_{m-1}, w_m, v_{i_1}, \ldots, v_{i_k}$. From this linearly dependent set, by Corollary 1 to Theorem 4.2.1, we can extract a basis of the form $w_{m-1}, w_m, v_{j_1}, \ldots, v_{j_s}$, $s \le n - 2$. Keeping up this procedure we eventually get down to a basis of $V$ of the form $w_2, \ldots, w_{m-1}, w_m, v_\alpha, v_\beta, \ldots$; since $w_1$ is not a linear combination of $w_2, \ldots, w_m$, the above basis must actually include some $v$. To get to this basis we have introduced $m - 1$ $w$'s, each such introduction having cost us at least one $v$, and yet there is a $v$ left. Thus $m - 1 \le n - 1$, and so $m \le n$.
This lemma has as consequences (which we list as corollaries) the basic results spelling out the nature of the dimension of a vector space. These corollaries are of the utmost importance in all that follows, not only in this chapter but in the rest of the book, in fact in all of mathematics. The corollaries are all theorems in their own right.
COROLLARY 1: If $V$ is finite-dimensional over $F$, then any two bases of $V$ have the same number of elements.
Proof: Let $v_1, \ldots, v_n$ be one basis of $V$ over $F$ and let $w_1, \ldots, w_m$ be another. In particular, $w_1, \ldots, w_m$ are linearly independent over $F$, whence, by Lemma 4.2.4, $m \le n$. Now interchange the roles of the $v$'s and $w$'s and we obtain that $n \le m$. Together these say that $n = m$.
COROLLARY 2: $F^{(n)}$ is isomorphic to $F^{(m)}$ if and only if $n = m$.
Proof: $F^{(n)}$ has, as one basis, the set of $n$ vectors
$$(1, 0, \ldots, 0), (0, 1, 0, \ldots, 0), \ldots, (0, 0, \ldots, 0, 1).$$
Likewise $F^{(m)}$ has a basis containing $m$ vectors. An isomorphism maps a basis onto a basis (Problem 4, end of this section); hence, by Corollary 1, $m = n$.
Corollary 2 puts on a firm footing the heuristic remarks made earlier about the possible isomorphism of $F^{(n)}$ and $F^{(m)}$. As we saw in those remarks, $V$ is isomorphic to $F^{(n)}$ for some $n$. By Corollary 2, this $n$ is unique, thus
COROLLARY 3: If $V$ is finite-dimensional over $F$, then $V$ is isomorphic to $F^{(n)}$ for a unique integer $n$; in fact, $n$ is the number of elements in any basis of $V$ over $F$.
DEFINITION: The integer $n$ in Corollary 3 is called the dimension of $V$ over $F$.
The dimension of $V$ over $F$ is thus the number of elements in any basis of $V$ over $F$.
We shall write the dimension of $V$ over $F$ as $\dim V$ or, the occasional time in which we shall want to stress the role of the field $F$, as $\dim_F V$.
COROLLARY 4: Any two finite-dimensional vector spaces over $F$ of the same dimension are isomorphic.
Proof: If this dimension is $n$, then each is isomorphic to $F^{(n)}$, hence they are isomorphic to each other.
How much freedom do we have in constructing bases of $V$? The next lemma asserts that starting with any linearly independent set of vectors we can ``blow it up'' to a basis of $V$.
LEMMA 4.2.5: If $V$ is finite-dimensional over $F$ and if $u_1, \ldots, u_m \in V$ are linearly independent, then we can find vectors $u_{m+1}, \ldots, u_{m+r}$ in $V$ such that $u_1, \ldots, u_m, u_{m+1}, \ldots, u_{m+r}$ is a basis of $V$.
Proof: Since $V$ is finite-dimensional it has a basis; let $v_1, \ldots, v_n$ be a basis of $V$. Since these span $V$, the vectors $u_1, \ldots, u_m, v_1, \ldots, v_n$ also span $V$. By Corollary 1 to Theorem 4.2.1 there is a subset of these of the form $u_1, \ldots, u_m, v_{i_1}, \ldots, v_{i_r}$ which consists of linearly independent elements which span $V$. To prove the lemma merely put $u_{m+1} = v_{i_1}, \ldots, u_{m+r} = v_{i_r}$.
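This proof, too, is constructive: adjoin any spanning set and weed. The sketch below is our own illustration for $V = F^{(n)}$ with $F$ the rationals, using the standard basis as the concrete spanning set; a standard basis vector is appended exactly when it raises the rank.

```python
from fractions import Fraction

def rank(vectors):
    """Rank over Q by Gaussian elimination with exact Fraction arithmetic."""
    rows = [[Fraction(x) for x in v] for v in vectors]
    r = 0
    for col in range(len(rows[0]) if rows else 0):
        piv = next((i for i in range(r, len(rows)) if rows[i][col] != 0), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(r + 1, len(rows)):
            f = rows[i][col] / rows[r][col]
            rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

def extend_to_basis(independent, n):
    """Fill out a linearly independent list to a basis of F^(n): append
    each standard basis vector that is independent of what precedes it
    (Lemma 4.2.5, via the weeding-out of Corollary 1)."""
    basis = list(independent)
    for k in range(n):
        e_k = tuple(int(i == k) for i in range(n))
        if rank(basis + [e_k]) > rank(basis):
            basis.append(e_k)
    return basis

b = extend_to_basis([(1, 1, 1), (1, 2, 3)], 3)
assert b[:2] == [(1, 1, 1), (1, 2, 3)]   # the u's stay at the front
assert len(b) == 3 and rank(b) == 3      # now a basis of Q^3
```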
What is the relation of the dimension of a homomorphic image of $V$ to that of $V$? The answer is provided us by
LEMMA 4.2.6: If $V$ is finite-dimensional and if $W$ is a subspace of $V$, then $W$ is finite-dimensional, $\dim W \le \dim V$, and $\dim V/W = \dim V - \dim W$.
Proof: By Lemma 4.2.4, if $n = \dim V$ then any $n + 1$ elements in $V$ are linearly dependent; in particular, any $n + 1$ elements in $W$ are linearly dependent. Thus we can find a largest set of linearly independent elements in $W$, $w_1, \ldots, w_m$, and $m \le n$. If $w \in W$ then $w, w_1, \ldots, w_m$ is a linearly dependent set, whence
$$\alpha w + \alpha_1 w_1 + \cdots + \alpha_m w_m = 0,$$
and not all of the $\alpha$'s are $0$. If $\alpha = 0$, by the linear independence of the $w_i$ we would get that each $\alpha_i = 0$, a contradiction. Thus $\alpha \neq 0$, and so
$$w = -\alpha^{-1}(\alpha_1 w_1 + \cdots + \alpha_m w_m).$$
Consequently, $w_1, \ldots, w_m$ span $W$; by this, $W$ is finite-dimensional over $F$, and furthermore, it has a basis of $m$ elements, where $m \le n$. From the definition of dimension it then follows that $\dim W \le \dim V$.
Now, let $w_1, \ldots, w_m$ be a basis of $W$. By Lemma 4.2.5, we can fill this out to a basis $w_1, \ldots, w_m, v_1, \ldots, v_r$ of $V$, where $m + r = \dim V$ and $m = \dim W$.
Let $\bar{v}_1, \ldots, \bar{v}_r$ be the images, in $\bar{V} = V/W$, of $v_1, \ldots, v_r$. Since any vector $v \in V$ is of the form
$$v = \alpha_1 w_1 + \cdots + \alpha_m w_m + \beta_1 v_1 + \cdots + \beta_r v_r,$$
then $\bar{v}$, the image of $v$, is of the form
$$\bar{v} = \beta_1 \bar{v}_1 + \cdots + \beta_r \bar{v}_r$$
(since $\bar{w}_1 = \cdots = \bar{w}_m = 0$). Thus $\bar{v}_1, \ldots, \bar{v}_r$ span $V/W$. We claim that they are linearly independent, for if $\beta_1 \bar{v}_1 + \cdots + \beta_r \bar{v}_r = 0$ then $\beta_1 v_1 + \cdots + \beta_r v_r \in W$, and so
$$\beta_1 v_1 + \cdots + \beta_r v_r = \alpha_1 w_1 + \cdots + \alpha_m w_m,$$
which, by the linear independence of the set $w_1, \ldots, w_m, v_1, \ldots, v_r$, forces
$$\beta_1 = \cdots = \beta_r = \alpha_1 = \cdots = \alpha_m = 0.$$
We have shown that $V/W$ has a basis of $r$ elements, and so $\dim V/W = r = \dim V - m = \dim V - \dim W$.
COROLLARY: If $A$ and $B$ are finite-dimensional subspaces of a vector space $V$, then $A + B$ is finite-dimensional and
$$\dim(A + B) = \dim A + \dim B - \dim(A \cap B).$$
Proof: By the result of Problem 13 at the end of Section 4.1,
$$(A + B)/B \approx A/(A \cap B),$$
and since $A$ and $B$ are finite-dimensional we get that
$$\dim(A + B) - \dim B = \dim\big[(A + B)/B\big] = \dim\big[A/(A \cap B)\big] = \dim A - \dim(A \cap B).$$
Transposing yields the result stated in the corollary.
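The dimension formula can be checked on concrete subspaces. In the sketch below (our own example, not the text's), the planes $A$ and $B$ in rational $3$-space are chosen so that their intersection is visible by inspection: $A \cap B$ is the line spanned by $(0, 1, 0)$.

```python
from fractions import Fraction

def rank(vectors):
    """Rank over Q by Gaussian elimination with exact Fraction arithmetic."""
    rows = [[Fraction(x) for x in v] for v in vectors]
    r = 0
    for col in range(len(rows[0]) if rows else 0):
        piv = next((i for i in range(r, len(rows)) if rows[i][col] != 0), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(r + 1, len(rows)):
            f = rows[i][col] / rows[r][col]
            rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

A = [(1, 0, 0), (0, 1, 0)]     # a plane in Q^3
B = [(0, 1, 0), (0, 0, 1)]     # another plane
dim_A, dim_B = rank(A), rank(B)        # both 2
dim_sum = rank(A + B)                  # dim(A + B): rank of all generators
dim_int = 1                            # A cap B = span{(0,1,0)} by construction
assert dim_sum == dim_A + dim_B - dim_int   # 3 == 2 + 2 - 1
```

Note that `rank(A + B)` computes $\dim(A + B)$ correctly because the subspace $A + B$ is, by definition, the linear span of the generators of $A$ together with those of $B$.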