Golovizin V.V. Lectures on Algebra and Geometry. 5

Lectures on Algebra and Geometry. Semester 2.

Lecture 23. The basis of a vector space.

Summary: criterion of linear dependence of a system of non-zero vectors; subsystems of a system of vectors; generating systems of vectors; minimal generating systems and maximal linearly independent systems; the basis of a vector space and four equivalent definitions of it; the dimension of a vector space; finite-dimensional vector spaces and the existence of a basis; completion of an independent system to a basis.

item 1. Criterion for linear dependence of a system of nonzero vectors.

Theorem. A system of non-zero vectors is linearly dependent if and only if there is a vector of the system that is linearly expressed in terms of the previous vectors of this system.

Proof. Let the system $a_1, a_2, \ldots, a_n$ consist of nonzero vectors and be linearly dependent. Consider the system consisting of the single vector $a_1$. Since $a_1 \ne 0$, the system $a_1$ is linearly independent. Attach the vector $a_2$ to it. If the resulting system $a_1, a_2$ is linearly independent, then we add the next vector $a_3$ to it, and so on, continuing until we get a linearly dependent system $a_1, \ldots, a_k$, where $2 \le k \le n$. Such a number $k$ certainly exists, because the original system $a_1, \ldots, a_n$ is linearly dependent by assumption.

So, by construction, we have obtained a linearly dependent system $a_1, \ldots, a_k$ such that the system $a_1, \ldots, a_{k-1}$ is linearly independent.

The system $a_1, \ldots, a_k$ represents the null vector in a non-trivial way, i.e. there is a non-zero set of scalars $\alpha_1, \ldots, \alpha_k$ such that

$$\alpha_1 a_1 + \alpha_2 a_2 + \cdots + \alpha_k a_k = 0,$$

where the scalar $\alpha_k \ne 0$.

Indeed, otherwise, if $\alpha_k = 0$, then we would have a non-trivial representation of the zero vector by the linearly independent system $a_1, \ldots, a_{k-1}$, which is impossible.

Dividing the last equality by the non-zero scalar $\alpha_k$, we can express the vector $a_k$ from it:

$$a_k = -\frac{\alpha_1}{\alpha_k}\, a_1 - \cdots - \frac{\alpha_{k-1}}{\alpha_k}\, a_{k-1},$$

i.e. $a_k$ is linearly expressed in terms of the previous vectors of the system. Conversely, if some vector of the system is linearly expressed in terms of the previous ones, then moving all terms to one side gives a non-trivial representation of the zero vector (the coefficient of that vector is $-1 \ne 0$), so the system is linearly dependent.

The theorem is proved.
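The constructive step of this proof is easy to mirror numerically. Below is a minimal sketch (assuming numpy; the function name is illustrative) that finds the first vector of a system lying in the span of its predecessors, by checking whether appending it raises the rank of the column matrix.

```python
import numpy as np

def first_dependent_index(vectors):
    """Return the 0-based index of the first vector lying in the span of
    its predecessors, or None if the system is linearly independent."""
    A = np.array(vectors, dtype=float).T   # columns a_1, ..., a_n
    rank_prev = 0
    for k in range(A.shape[1]):
        rank_k = np.linalg.matrix_rank(A[:, :k + 1])
        if rank_k == rank_prev:            # rank did not grow: this vector is dependent
            return k
        rank_prev = rank_k
    return None

a1, a2, a3 = [1.0, 0.0], [0.0, 1.0], [2.0, 3.0]   # a3 = 2*a1 + 3*a2
print(first_dependent_index([a1, a2, a3]))        # -> 2
```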

item 2. Subsystems of a system of vectors of a vector space.

Definition. Any non-empty subset of the system of vectors $a_1, a_2, \ldots, a_n$ is called a subsystem of the given system of vectors.

Example. Let $a_1, a_2, \ldots, a_{10}$ be a system of 10 vectors. Then the systems of vectors
$a_1, a_3, a_{10}$;
$a_2, a_4, a_6, a_8$
are subsystems of this system of vectors.

Theorem. If a system of vectors contains a linearly dependent subsystem, then the system of vectors itself is also linearly dependent.

Proof. Let a system of vectors $a_1, a_2, \ldots, a_n$ be given and let, for definiteness, the subsystem $a_1, \ldots, a_k$, where $k \le n$, be linearly dependent. Then it represents the null vector in a non-trivial way:

$$\alpha_1 a_1 + \cdots + \alpha_k a_k = 0,$$

where among the coefficients $\alpha_1, \ldots, \alpha_k$ there is at least one that is not equal to zero. But then the following equality is a non-trivial representation of the null vector:

$$\alpha_1 a_1 + \cdots + \alpha_k a_k + 0 \cdot a_{k+1} + \cdots + 0 \cdot a_n = 0,$$

whence, by definition, the linear dependence of the system $a_1, \ldots, a_n$ follows, q.e.d.

The theorem has been proven.

Corollary. Any subsystem of a linearly independent system of vectors is linearly independent.

Proof. Assume the opposite: let some subsystem of the given system be linearly dependent. Then, by the theorem, the whole system is linearly dependent, which contradicts the hypothesis.

The corollary is proven.

item 3. Systems of columns of the arithmetic vector space of columns.

From the results of the previous section, as a special case, the following theorem follows.

1) A column system is linearly dependent if and only if there is at least one column in the system that is linearly expressed in terms of other columns of the given system.

2) A column system is linearly independent if and only if no column of the system is linearly expressed in terms of other columns of the given system.

3) The system of columns containing the zero column is linearly dependent.

4) A column system containing two equal columns is linearly dependent.

5) A column system containing two proportional columns is linearly dependent.

6) A column system containing a linearly dependent subsystem is linearly dependent.

7) Any subsystem of a linearly independent system of columns is linearly independent.
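Statements 3)–5) can be spot-checked numerically: a system of columns is linearly dependent exactly when the rank of the matrix they form is smaller than the number of columns. A small sketch, assuming numpy (the helper name is illustrative):

```python
import numpy as np

def is_dependent(columns):
    A = np.column_stack(columns)
    return np.linalg.matrix_rank(A) < A.shape[1]   # fewer pivots than columns

c = np.array([1.0, 2.0, 3.0])
print(is_dependent([c, np.zeros(3)]))   # 3): zero column        -> True
print(is_dependent([c, c.copy()]))      # 4): equal columns      -> True
print(is_dependent([c, 2.5 * c]))       # 5): proportional pair  -> True
```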

The only thing that may need to be clarified here is the concept of proportional columns.

Definition. Two non-zero columns $A$ and $B$ are called proportional if there is a scalar $\lambda \in K$ such that $B = \lambda A$, or, componentwise,

$$b_1 = \lambda a_1,\quad b_2 = \lambda a_2,\ \ldots,\ b_n = \lambda a_n.$$

Example. The system of columns

$$\begin{pmatrix} 1 \\ 2 \end{pmatrix},\ \begin{pmatrix} 2 \\ 4 \end{pmatrix},\ \begin{pmatrix} 3 \\ 5 \end{pmatrix}$$

is linearly dependent, since its first two columns are proportional ($\lambda = 2$).
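The definition translates directly into a test: take the scalar $\lambda$ from any non-zero entry of the first column and check all remaining entries. A sketch assuming numpy (the helper name is illustrative):

```python
import numpy as np

def are_proportional(a, b):
    """True iff b = lam * a for a single scalar lam (a, b assumed non-zero)."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    i = np.flatnonzero(a)[0]        # first non-zero entry of a
    lam = b[i] / a[i]               # the only possible scalar
    return np.allclose(b, lam * a)  # check all components at once

print(are_proportional([1, 2], [2, 4]))   # True,  lam = 2
print(are_proportional([1, 2], [3, 5]))   # False
```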

Comment. We already know (see Lecture 21) that the determinant is equal to zero if the system of its columns (rows) is linearly dependent. Later it will be proved that the converse statement is also true: if the determinant is equal to zero, then the system of its columns and the system of its rows are linearly dependent.

item 4. Basis of a vector space.

Definition. A system of vectors $a_1, a_2, \ldots, a_n$ of a vector space V over a field K is called a generating (spanning) system of vectors of this vector space if it represents any of its vectors, i.e. if for every vector $x \in V$ there is a set of scalars $\alpha_1, \ldots, \alpha_n$ such that $x = \alpha_1 a_1 + \cdots + \alpha_n a_n$.

Definition. A system of vectors in a vector space is called a minimal generating system if, when any vector is removed from this system, it ceases to be a generating system.

Comment. From the definition it immediately follows that if a generating system of vectors is not minimal, then it contains at least one vector which can be removed so that the remaining system of vectors is still generating.

Lemma (On a linearly dependent generating system.)

If in a linearly dependent and generating system of vectors one of the vectors is linearly expressed in terms of the others, then it can be removed from the system and the remaining system of vectors will be generating.

Proof. Let the system $a_1, \ldots, a_n$ be linearly dependent and generating, and let one of its vectors be linearly expressed in terms of the other vectors of this system.

For definiteness and for simplicity of notation, we assume that

$$a_n = \beta_1 a_1 + \cdots + \beta_{n-1} a_{n-1}.$$

Since $a_1, \ldots, a_n$ is a generating system, for every vector $x$ there is a set of scalars $\alpha_1, \ldots, \alpha_n$ such that

$$x = \alpha_1 a_1 + \cdots + \alpha_n a_n.$$

Hence we get

$$x = (\alpha_1 + \alpha_n \beta_1)\, a_1 + \cdots + (\alpha_{n-1} + \alpha_n \beta_{n-1})\, a_{n-1},$$

i.e. any vector x is linearly expressed in terms of the vectors of the system $a_1, \ldots, a_{n-1}$, which means that it is a generating system, q.e.d.

Corollary 1. A linearly dependent and generating system of vectors is not minimal.

Proof. Immediately follows from the lemma and the definition of a minimal generating system of vectors.

Corollary 2. The minimal generating system of vectors is linearly independent.

Proof. Assuming the opposite, we arrive at a contradiction with Corollary 1.

Definition. A system of vectors in a vector space is called a maximal linearly independent system if, when any vector is added to this system, it becomes linearly dependent.

Comment. From the definition it immediately follows that if a system is linearly independent but not maximal, then there is a vector whose addition to the system again yields a linearly independent system.

Definition. The basis of a vector space V over a field K is an ordered system of its vectors that represents any vector of the vector space in a unique way.

In other words, a system of vectors $e_1, e_2, \ldots, e_n$ of a vector space V over a field K is called its basis if for every vector $x \in V$ there is exactly one set of scalars $\alpha_1, \ldots, \alpha_n$ such that $x = \alpha_1 e_1 + \cdots + \alpha_n e_n$.

Theorem. (On four equivalent definitions of a basis.)

Let $e_1, e_2, \ldots, e_n$ be an ordered system of vectors of a vector space. Then the following statements are equivalent:

1. The system $e_1, \ldots, e_n$ is a basis.

2. The system $e_1, \ldots, e_n$ is a linearly independent and generating system of vectors.

3. The system $e_1, \ldots, e_n$ is a maximal linearly independent system of vectors.

4. The system $e_1, \ldots, e_n$ is a minimal generating system of vectors.

Proof.

$1 \Rightarrow 2$. Let the system of vectors $e_1, \ldots, e_n$ be a basis. It immediately follows from the definition of a basis that this system is a generating system of vectors of the vector space, so we only need to prove its linear independence.

Assume that this system of vectors is linearly dependent. Then the zero vector has two representations, a trivial and a non-trivial one, which contradicts the definition of a basis.

$2 \Rightarrow 3$. Let the system of vectors $e_1, \ldots, e_n$ be linearly independent and generating. We need to prove that this linearly independent system is maximal.

Assume the opposite: the given linearly independent system is not maximal. Then, by the remark above, there is a vector that can be added to the system so that the resulting system of vectors remains linearly independent. On the other hand, the added vector can be represented as a linear combination of the original system of vectors, because that system is generating.

Thus, in the new, extended system one of the vectors is linearly expressed through the other vectors of the system, so this system is linearly dependent. We have obtained a contradiction.

$3 \Rightarrow 4$. Let the system of vectors $e_1, \ldots, e_n$ be a maximal linearly independent system. Let us prove that it is a minimal generating system.

a) First we prove that it is a generating system.

Note that, being linearly independent, the system $e_1, \ldots, e_n$ does not contain the null vector. Let $x$ be an arbitrary non-zero vector. Add it to the given system of vectors: $e_1, \ldots, e_n, x$. The resulting system of non-zero vectors is linearly dependent, since the original system is maximal linearly independent. Hence, by the criterion of item 1, this system contains a vector that is linearly expressed through the previous ones. In the original linearly independent system $e_1, \ldots, e_n$ none of the vectors can be expressed in terms of the previous ones; therefore only the vector x can be so expressed, i.e. x is linearly expressed in terms of $e_1, \ldots, e_n$. Thus the system $e_1, \ldots, e_n$ represents any non-zero vector. It remains to note that this system obviously also represents the zero vector, i.e. the system $e_1, \ldots, e_n$ is generating.

b) Now let us prove its minimality. Assume the opposite. Then one of the vectors of the system can be removed so that the remaining system is still generating; consequently, the removed vector is linearly expressed in terms of the remaining vectors of the system, which contradicts the linear independence of the original system of vectors.

$4 \Rightarrow 1$. Let the system of vectors $e_1, \ldots, e_n$ be a minimal generating system. Then it represents any vector of the vector space; we need to prove the uniqueness of the representation.

Assume the opposite: some vector x is linearly expressed in terms of the vectors of the given system in two different ways,

$$x = \alpha_1 e_1 + \cdots + \alpha_n e_n = \beta_1 e_1 + \cdots + \beta_n e_n.$$

Subtracting one equality from the other, we get

$$(\alpha_1 - \beta_1)\, e_1 + \cdots + (\alpha_n - \beta_n)\, e_n = 0.$$

By Corollary 2, the system $e_1, \ldots, e_n$ is linearly independent, i.e. it represents the null vector only trivially, so all the coefficients of this linear combination must be zero: $\alpha_i = \beta_i$, $i = 1, \ldots, n$.

Thus, any vector x is linearly expressed in terms of the vectors of the given system in a unique way, q.e.d.

The theorem has been proven.
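For a concrete illustration in $\mathbf{R}^n$: if the basis vectors are placed as the columns of an invertible matrix, the unique set of scalars from the definition of a basis is the solution of a linear system. A minimal sketch, assuming numpy:

```python
import numpy as np

E = np.array([[1.0, 1.0],
              [0.0, 1.0]])      # columns e_1 = (1,0), e_2 = (1,1): a basis of R^2
x = np.array([3.0, 2.0])

alpha = np.linalg.solve(E, x)   # unique solution, since det E != 0
print(alpha)                    # -> [1. 2.]: the coordinates of x in this basis
print(E @ alpha)                # -> [3. 2.]: reconstructs x
```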

item 5. Dimension of a vector space.

Theorem 1. (On the number of vectors in linearly independent and generating systems of vectors.) The number of vectors in any linearly independent system of vectors does not exceed the number of vectors in any generating system of vectors of the same vector space.

Proof. Let $b_1, \ldots, b_k$ be an arbitrary linearly independent system of vectors and $a_1, \ldots, a_m$ an arbitrary generating system. Assume that $k > m$.

Since $a_1, \ldots, a_m$ is a generating system, it represents any vector of the space, including the vector $b_1$. Add $b_1$ to this system. We get a linearly dependent and generating system of vectors $b_1, a_1, \ldots, a_m$. Then there is a vector of this system that is linearly expressed in terms of the previous vectors of this system (it cannot be $b_1$, which has no previous vectors) and, by virtue of the lemma, it can be removed from the system, and the remaining system of vectors will still be generating. After renumbering we obtain the generating system

$$b_1, a_1, \ldots, a_{m-1}.$$

Because this system is generating, it represents the vector $b_2$, and, adding it to the system, we again obtain a linearly dependent and generating system: $b_1, b_2, a_1, \ldots, a_{m-1}$.

Then everything repeats. There is a vector in this system that is linearly expressed in terms of the previous ones, and it cannot be $b_2$, because the original system $b_1, \ldots, b_k$ is linearly independent, so $b_2$ is not expressed linearly in terms of the vector $b_1$. So it can only be one of the vectors $a_1, \ldots, a_{m-1}$. Removing it from the system, we obtain, after renumbering, the system $b_1, b_2, a_1, \ldots, a_{m-2}$, which is again generating. Continuing this process, after $m$ steps we obtain a generating system of vectors $b_1, b_2, \ldots, b_m$, where $m < k$ by our assumption. This means that this system, being a generating one, also represents the vector $b_{m+1}$, i.e. $b_{m+1}$ is a linear combination of $b_1, \ldots, b_m$, which contradicts the linear independence of the system $b_1, \ldots, b_k$.

Theorem 1 is proved.

Theorem 2. (On the number of vectors in a basis.) Any basis of a vector space contains the same number of vectors.

Proof. Let $e_1, \ldots, e_n$ and $f_1, \ldots, f_m$ be two arbitrary bases of a vector space. Any basis is a linearly independent and generating system of vectors.

Because the first system is linearly independent and the second is generating, by Theorem 1 we have $n \le m$.

Similarly, the second system is linearly independent and the first is generating, so $m \le n$. Hence it follows that $n = m$, q.e.d.

Theorem 2 is proved.

This theorem allows us to introduce the following definition.

Definition. The dimension of a vector space V over a field K is the number of vectors in its basis.

Notation: $\dim V$ or $\dim_K V$.

item 6. Existence of a vector space basis.

Definition. A vector space is called finite-dimensional if it has a finite generating system of vectors.

Comment. We will study only finite-dimensional vector spaces. Although we already know quite a lot about the basis of a finite-dimensional vector space, we are not yet sure that such a basis exists at all. All the properties obtained so far were derived under the assumption that a basis exists. The following theorem settles this question.

Theorem. (On the existence of a basis for a finite-dimensional vector space.) Any finite-dimensional vector space has a basis.

Proof. By assumption, there exists a finite generating system of vectors of the given finite-dimensional vector space V:

$$a_1, a_2, \ldots, a_m.$$

We note right away that if the generating system of vectors is empty, i.e. does not contain any vector, then by definition the given vector space is assumed to be the zero space, i.e. $V = \{0\}$. In this case, by definition, the basis of the zero vector space is the empty basis, and its dimension is defined to be zero.

Let now V be a non-zero vector space and $a_1, \ldots, a_m$ a finite generating system of non-zero vectors. If it is linearly independent, then everything is proved, because a linearly independent and generating system of vectors of a vector space is its basis. If the given system of vectors is linearly dependent, then one of the vectors of this system is linearly expressed in terms of the remaining ones, and by virtue of the lemma of item 4 it can be removed from the system, and the remaining system of vectors will still be generating.

We renumber the remaining system of vectors: $a_1, \ldots, a_{m-1}$. Further the reasoning is repeated. If this system is linearly independent, then it is a basis. If not, then again there is a vector in the system that can be removed, and the remaining system will be generating.

Repeating this process, we cannot be left with an empty system of vectors, since in the most extreme case we would end up with a generating system of one non-zero vector, which is linearly independent and, therefore, a basis. Hence, at some step we come to a linearly independent and generating system of vectors, i.e. to a basis.

The theorem has been proven.
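The proof's procedure can be sketched numerically for $K = \mathbf{R}$ (numpy assumed; the function name is illustrative). Keeping a vector only when it enlarges the span of those already kept is equivalent to discarding the vectors that are linearly expressed through the others:

```python
import numpy as np

def extract_basis(columns):
    basis = []
    for c in columns:
        c = np.asarray(c, dtype=float)
        candidate = np.column_stack(basis + [c])
        if np.linalg.matrix_rank(candidate) == len(basis) + 1:
            basis.append(c)                 # c enlarges the span: keep it
    return basis

gen = [[1.0, 0.0], [2.0, 0.0], [0.0, 1.0]]  # generating, but dependent
print(extract_basis(gen))                   # -> [array([1., 0.]), array([0., 1.])]
```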

Lemma. (On systems of vectors in an n-dimensional vector space.) Let $\dim V = n$. Then:

1. Any system of $n + 1$ vectors of V is linearly dependent.

2. Any linearly independent system of $n$ vectors of V is its basis.

Proof. 1) By the condition of the lemma, the number of vectors in a basis equals $n$, and a basis is a generating system, so by Theorem 1 of item 5 the number of vectors in any linearly independent system cannot exceed $n$; hence any system of $n + 1$ vectors is linearly dependent.

2) As follows from what has just been proved, any linearly independent system of $n$ vectors in this vector space is maximal, and hence a basis.

The lemma is proven.

Theorem. (On completion to a basis.) Any linearly independent system of vectors of a vector space can be completed to a basis of this space.

Proof. Let V be a vector space of dimension n and $a_1, \ldots, a_k$ some linearly independent system of its vectors. Then $k \le n$.

If $k = n$, then, by the previous lemma, this system is a basis and there is nothing to prove.

If $k < n$, then this system is not a maximal linearly independent system (otherwise it would be a basis, which is impossible, because $k \ne n$). Therefore, there is a vector $a_{k+1} \in V$ such that the system $a_1, \ldots, a_k, a_{k+1}$ is linearly independent.

If now $k + 1 = n$, then the system $a_1, \ldots, a_{k+1}$ is a basis.

If $k + 1 < n$, everything repeats. The process of extending the system cannot continue indefinitely, because at each step we obtain a linearly independent system of vectors of the space V, and by the previous lemma the number of vectors in such a system cannot exceed the dimension of the space. Consequently, at some step we arrive at a basis of the given space.

The theorem has been proven.
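The completion process admits a direct numerical sketch in $\mathbf{R}^n$ (numpy assumed; the function name is illustrative). Candidate vectors are drawn from the canonical system, a legitimate choice since it spans the whole space:

```python
import numpy as np

def complete_to_basis(independent, n):
    basis = [np.asarray(v, dtype=float) for v in independent]
    for e in np.eye(n):                     # candidates: canonical columns
        if len(basis) == n:
            break                           # n independent vectors: a basis
        if np.linalg.matrix_rank(np.column_stack(basis + [e])) == len(basis) + 1:
            basis.append(e)                 # system stays independent: keep e
    return basis

print(complete_to_basis([[1.0, 2.0, 0.0]], 3))
# -> [array([1., 2., 0.]), array([1., 0., 0.]), array([0., 0., 1.])]
```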

item 7. Example.

1. Let K be an arbitrary field and let $K^n$ be the arithmetic vector space of columns of height n. Then $\dim K^n = n$. To prove this, consider the system of columns $e_1, e_2, \ldots, e_n$ of this space defined below: it is linearly independent and generating, hence a basis of $K^n$ consisting of n columns.


Definition. The basis

$$e_1 = \begin{pmatrix} 1 \\ 0 \\ \vdots \\ 0 \end{pmatrix},\quad e_2 = \begin{pmatrix} 0 \\ 1 \\ \vdots \\ 0 \end{pmatrix},\ \ldots,\ e_n = \begin{pmatrix} 0 \\ 0 \\ \vdots \\ 1 \end{pmatrix}$$

of the arithmetic vector space of columns of height n is called canonical or natural.
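Numerically, the canonical basis of $\mathbf{R}^n$ consists of the columns of the identity matrix, and the coordinates of a column in this basis are simply its entries. A one-line illustration with numpy:

```python
import numpy as np

n = 4
E = np.eye(n)                    # column j is the canonical vector e_{j+1}
x = np.array([7.0, -1.0, 0.0, 3.0])
print(np.allclose(E @ x, x))     # -> True: coordinates equal components
```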

Let V be a vector space over the field R and S a system of vectors from V.

Definition 1. A basis of the system of vectors S is an ordered linearly independent subsystem $b_1, b_2, \ldots, b_r$ of the system S such that any vector of the system S is a linear combination of the vectors $b_1, b_2, \ldots, b_r$.

Definition 2. The rank of the system of vectors S is the number of vectors in a basis of the system S. The rank of the system S is denoted by $r = \operatorname{rank} S$.

If $S = \{0\}$, then the system has no basis, and it is assumed that $\operatorname{rank} S = 0$.

Example 1. Let the system of vectors $a_1 = (1, 2)$, $a_2 = (2, 3)$, $a_3 = (3, 5)$, $a_4 = (1, 3)$ be given. The vectors $a_1, a_2$ form a basis of this system, since they are linearly independent (see Example 3.1) and $a_3 = a_1 + a_2$, $a_4 = 3a_1 - a_2$. The rank of this system of vectors is two.
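Example 1 can be verified with numpy: the rank of the system equals the rank of the matrix whose rows are the given vectors.

```python
import numpy as np

A = np.array([[1, 2], [2, 3], [3, 5], [1, 3]], dtype=float)  # rows a_1..a_4
print(np.linalg.matrix_rank(A))             # -> 2 = rank S
print(np.allclose(A[2], A[0] + A[1]))       # a_3 = a_1 + a_2   -> True
print(np.allclose(A[3], 3*A[0] - A[1]))     # a_4 = 3a_1 - a_2  -> True
```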

Theorem 1 (theorem on bases). Let S be a finite system of vectors from V, $S \ne \{0\}$. Then the following assertions are true.

1° Any linearly independent subsystem of the system S can be completed to a basis of S.

2° The system S has a basis.

3° Any two bases of the system S contain the same number of vectors, i.e. the rank of the system does not depend on the choice of basis.

4° If $r = \operatorname{rank} S$, then any r linearly independent vectors of the system S form a basis of S.

5° If $r = \operatorname{rank} S$, then any $k > r$ vectors of the system S are linearly dependent.

6° Any vector $a \in S$ is uniquely linearly expressed in terms of the basis vectors, i.e., if $b_1, b_2, \ldots, b_r$ is a basis of the system S, then

$$a = \alpha_1 b_1 + \alpha_2 b_2 + \cdots + \alpha_r b_r, \qquad \alpha_1, \alpha_2, \ldots, \alpha_r \in \mathbf{R}, \qquad (1)$$

and this representation is unique.

By virtue of 5°, a basis is a maximal linearly independent subsystem of the system S, and the rank of the system S is the number of vectors in such a subsystem.

The representation of a vector a in the form (1) is called the decomposition of the vector in terms of the basis vectors, and the numbers $\alpha_1, \alpha_2, \ldots, \alpha_r$ are called the coordinates of the vector a in this basis.

Proof. 1° Let $b_1, b_2, \ldots, b_k$ be a linearly independent subsystem of the system S. If each vector of the system S is linearly expressed in terms of the vectors of this subsystem, then by definition it is a basis of the system S.

If there is a vector in the system S that is not linearly expressed in terms of the vectors $b_1, b_2, \ldots, b_k$, then we denote it by $b_{k+1}$. Then the system $b_1, b_2, \ldots, b_k, b_{k+1}$ is linearly independent. If each vector of the system S is linearly expressed in terms of the vectors of this subsystem, then by definition it is a basis of the system S.

If there is a vector in the system S that is not linearly expressed in terms of $b_1, b_2, \ldots, b_k, b_{k+1}$, then we repeat the reasoning. Continuing this process, we either arrive at a basis of the system S, or increase the number of vectors in a linearly independent system by one. Since the system S contains a finite number of vectors, the second alternative cannot continue indefinitely, and at some step we obtain a basis of the system S.

2° Let S be a finite system of vectors and $S \ne \{0\}$. Then the system S contains a vector $b_1 \ne 0$, which forms a linearly independent subsystem of the system S. By part 1°, it can be completed to a basis of the system S. Thus the system S has a basis.

3° Assume that the system S has two bases:

$$b_1, b_2, \ldots, b_r, \qquad (2)$$

$$c_1, c_2, \ldots, c_s. \qquad (3)$$

By the definition of a basis, the system of vectors (2) is linearly independent and (2) ⊆ S. Further, by the definition of a basis, each vector of system (2) is a linear combination of the vectors of system (3). Then, by the main theorem on two systems of vectors, $r \le s$. Similarly, it is proved that $s \le r$. These two inequalities imply $r = s$.

4° Let $r = \operatorname{rank} S$ and let $a_1, a_2, \ldots, a_r$ be a linearly independent subsystem of S. Let us show that it is a basis of the system S. If it were not a basis, then by part 1° it could be completed to a basis, and we would obtain a basis $a_1, \ldots, a_r, a_{r+1}, \ldots, a_{r+t}$ containing more than r vectors, which contradicts what was proved in part 3°.

5° If k vectors $a_1, a_2, \ldots, a_k$ ($k > r$) of the system S were linearly independent, then by part 1° this system of vectors could be completed to a basis, and we would obtain a basis $a_1, \ldots, a_k, a_{k+1}, \ldots, a_{k+t}$ containing more than r vectors. This contradicts what was proved in part 3°.

6° Let $b_1, b_2, \ldots, b_r$ be a basis of the system S. By the definition of a basis, any vector $a \in S$ is a linear combination of the basis vectors:

$$a = \alpha_1 b_1 + \alpha_2 b_2 + \cdots + \alpha_r b_r.$$

To prove the uniqueness of this representation, assume the contrary: there is one more representation,

$$a = \beta_1 b_1 + \beta_2 b_2 + \cdots + \beta_r b_r.$$

Subtracting the equalities term by term, we find

$$0 = (\alpha_1 - \beta_1)\, b_1 + (\alpha_2 - \beta_2)\, b_2 + \cdots + (\alpha_r - \beta_r)\, b_r.$$

Since the basis $b_1, b_2, \ldots, b_r$ is a linearly independent system, all the coefficients $\alpha_i - \beta_i = 0$, $i = 1, 2, \ldots, r$. Therefore $\alpha_i = \beta_i$, $i = 1, 2, \ldots, r$, and the uniqueness is proved.
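The decomposition (1) for Example 1 can be computed explicitly: placing the basis vectors $b_1 = a_1$, $b_2 = a_2$ as columns of a matrix, the coordinates of $a_4$ are the unique solution of a linear system. A sketch assuming numpy:

```python
import numpy as np

B = np.column_stack([[1.0, 2.0], [2.0, 3.0]])  # columns b_1 = a_1, b_2 = a_2
a4 = np.array([1.0, 3.0])

alpha = np.linalg.solve(B, a4)   # unique, since b_1, b_2 are independent
print(alpha)                     # -> [ 3. -1.], i.e. a_4 = 3 b_1 - b_2
```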