Review for Quiz 3

SECTION 4.1

You must know the definition of a vector space and of a subspace of a vector space. Very important are:

     Theorem 1 (conditions for a subspace) on p. 167.
     Theorem 2 (solution subspace) on p. 169. This theorem often gives a very quick way to determine whether a
     subset is a subspace or not. You should also know how to prove Theorem 2.
     Example 2 on p. 167. Note that this example is just a particular case of Theorem 2: it can be derived from Theorem 2 by taking
     A to be a 1 x n matrix. Read the proof of Example 2 and make sure you understand it.

Remember that:

     To prove that a subset W IS NOT a subspace, it is enough to either
     find two vectors in W whose sum is not in W, OR
     find one vector in W that has a scalar multiple not in W.
     To prove instead that a subset W IS a subspace, you need to show that properties (i) and (ii) in Theorem 1 hold for
     ALL vectors in W. Checking the two properties only for a few example vectors in W is not enough.
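If you like to experiment on a computer, here is a small Python/NumPy sketch (not from the textbook; the matrix A is just a made-up example) that spot-checks the closure properties guaranteed by Theorem 2 for the solution set of A x = 0. Keep in mind that such a check on a few sample vectors is NOT a proof, for exactly the reason stated above:

```python
import numpy as np

# Spot-check (not a proof!) that the solution set of A x = 0 is closed
# under addition and scalar multiplication, as Theorem 2 guarantees.
# A is an arbitrary example matrix, chosen to have a nontrivial null space.
A = np.array([[1.0, 2.0, -1.0],
              [2.0, 4.0, -2.0]])

# Two particular solutions of A x = 0, found by inspection.
u = np.array([1.0, 0.0, 1.0])
v = np.array([-2.0, 1.0, 0.0])
assert np.allclose(A @ u, 0) and np.allclose(A @ v, 0)

# Closure under addition and under scalar multiples, for these samples.
assert np.allclose(A @ (u + v), 0)
assert np.allclose(A @ (3.5 * u), 0)
print("closure holds for these sample vectors")
```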

Suggested problems: Problems 1--14 and all the other homework problems.

SECTION 4.2

The definition of linear independence on p. 175 is fundamental. You can check whether a set of vectors is linearly independent or linearly
dependent in several ways.

     One way that always works is to set up a system as in equation (4) on p. 175 and check whether the trivial solution (all
     coefficients zero) is the only solution. If it is, the vectors are linearly independent. See Example 5 on p. 176 or Example 6 on p. 177.
     Another way, which works only for n vectors in R^n, is to use Theorem 2 (p. 178) and check whether the determinant of
     the corresponding matrix is zero. If it is nonzero, the vectors are linearly independent.
     Also remember that any set of more than n vectors in R^n is linearly dependent, so, for example, in Problem 4 on p. 179 we do
     not need to do any work to conclude that the vectors are linearly dependent.
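The two computational tests above can be tried out in Python/NumPy (an illustrative sketch, not from the textbook; the three vectors are made-up examples):

```python
import numpy as np

# Three vectors in R^3, written as the columns of a matrix M, so that the
# system of equation (4), c1 v1 + c2 v2 + c3 v3 = 0, becomes M c = 0.
v1, v2, v3 = [1, 2, 3], [0, 1, 4], [5, 6, 0]
M = np.column_stack([v1, v2, v3]).astype(float)

# For n vectors in R^n: nonzero determinant <=> only the trivial solution
# <=> the vectors are linearly independent.
print(np.linalg.det(M))   # nonzero here (approximately 1.0), so independent

# The rank test works in general, even for m vectors in R^n with m != n:
# the vectors are independent exactly when rank = number of vectors.
assert np.linalg.matrix_rank(M) == 3
```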

Remember the geometrical interpretation of linear dependence and independence.

     Two vectors are linearly dependent if and only if they are parallel.
     Three vectors are linearly dependent if and only if they are coplanar. Can you see then why, for instance, three or more
     vectors in R^2 are always linearly dependent?

Also remember that if a set of vectors is linearly dependent, then it is possible to write one of the vectors in the set as a linear combination
of the others; practice this.
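For example, in Python/NumPy (an illustrative sketch; the vectors are made up) you can recover the coefficients that express one vector of a dependent set as a combination of the others by solving a linear system:

```python
import numpy as np

# w is deliberately built as a combination of u and v, so {u, v, w}
# is a linearly dependent set.
u = np.array([1.0, 0.0, 2.0])
v = np.array([0.0, 1.0, -1.0])
w = 2 * u - 3 * v

# Recover the coefficients by solving [u v] c = w in the least-squares
# sense; an exact solution exists because w lies in span{u, v}.
c, *_ = np.linalg.lstsq(np.column_stack([u, v]), w, rcond=None)
assert np.allclose(np.column_stack([u, v]) @ c, w)
print(c)   # approximately [2, -3]
```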

Suggested problems: All the homework problems (and similar ones) are good practice for the test. Problems 1--8 are also
recommended.

SECTION 4.3

     Definition of basis. A set of vectors is a basis for the vector space V if
          the vectors are linearly independent, and
          the vectors span V.
     To check whether a set of vectors is a basis for R^n, you just need to make sure you have exactly n vectors and that the
     vectors are linearly independent. To find a basis for a subspace of R^n, things are a little more complicated; see the textbook.
     To find a basis for the solution space of a homogeneous linear system, use the algorithm on p. 186.
     Definition of the dimension of a vector space. The dimension of a vector space is the number of vectors in any basis (this
     definition makes sense because any two bases for a vector space consist of the same number of vectors).
     The dimension of R^n is n. To find the dimension of a subspace V, first find a basis and then count the number of vectors in it.
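The textbook's algorithm builds a basis for the solution space by assigning values to the free variables. On a computer, a common numerical alternative (a sketch, NOT the textbook's method) reads an orthonormal basis for the solution space off the singular value decomposition; it also illustrates that the dimension of the solution space is n minus the rank:

```python
import numpy as np

def null_space_basis(A, tol=1e-10):
    # Orthonormal basis for the solution space of A x = 0: the right
    # singular vectors whose singular value is (near) zero.
    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > tol))
    return Vt[rank:].T

# Example matrix (made up): the second row is twice the first, so rank 1.
A = np.array([[1.0, 2.0, 3.0, 4.0],
              [2.0, 4.0, 6.0, 8.0]])
B = null_space_basis(A)

# Dimension of the solution space = number of basis vectors = n - rank.
assert B.shape[1] == 4 - 1
assert np.allclose(A @ B, 0)
```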

Suggested Problems: All the homework problems up to # 26.

SECTION 4.4

     Definition of row and column space.
     Algorithm 1 on p. 190 for a basis for the row space. Note that the rows for the basis are taken from the echelon matrix, and
     not from the original matrix A (unless A is already in echelon form).
     Algorithm 2 on p. 193 for a basis for the column space. This time the columns are taken from the original matrix A; the only
     purpose of reducing A to echelon form is to determine which columns are the pivot columns.
     Definition of row rank, column rank, and rank of a matrix A. (Column rank = dimension of the column space Col(A); row
     rank = dimension of the row space Row(A). Column rank = row rank = rank of A.)
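The fact that column rank = row rank can be illustrated numerically in Python/NumPy (a made-up example matrix, not from the textbook): the rank of A equals the rank of its transpose, since transposing swaps rows and columns.

```python
import numpy as np

# Example matrix: the third row is the sum of the first two, so the
# row space (and hence the column space) has dimension 2.
A = np.array([[1.0, 2.0, 0.0, 3.0],
              [0.0, 1.0, 1.0, 1.0],
              [1.0, 3.0, 1.0, 4.0]])

# Row rank = column rank = rank of A, so rank(A) = rank(A^T).
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T) == 2
```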
Suggested Problems: All the homework problems.