Section 5.1 Linear Independence

Linear independence, or rather its opposite, is related to the idea of redundancy. If there is a linear dependence among a set of vectors, then we don't need all of those vectors to produce the same span.

Definition 5.1.1.

Consider a set of vectors \(V' = \{\bfv_1, \ldots, \bfv_n\}\) in a vector space \(V\text{.}\) When \(n \ge 2\text{,}\) we say that \(V'\) is linearly dependent if, for some \(i\text{,}\) \(1 \le i \le n\text{,}\) \(\bfv_i\) is a linear combination of the other vectors in the set. When \(n=1\text{,}\) the set \(V'\) is linearly dependent if and only if \(\bfv_1 = \bfo\text{.}\)

Example 5.1.2.

Consider the following three vectors in \(\ff_3^3\text{:}\)
\begin{equation*} \bfu = \begin{bmatrix} 0 \\ 2 \\ 1 \end{bmatrix}, \hspace{6pt} \bfv = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}, \hspace{6pt} \bfw = \begin{bmatrix} 1 \\ 0 \\ 2 \end{bmatrix} \text{.} \end{equation*}
The set \(\{\bfu, \bfv, \bfw\}\) is linearly dependent since \(\bfv = 2\bfu + \bfw\text{.}\)
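We can verify this relation directly, remembering that all arithmetic in \(\ff_3\) is carried out modulo \(3\text{:}\)
\begin{equation*} 2\bfu + \bfw = \begin{bmatrix} 0 \\ 4 \\ 2 \end{bmatrix} + \begin{bmatrix} 1 \\ 0 \\ 2 \end{bmatrix} = \begin{bmatrix} 1 \\ 4 \\ 4 \end{bmatrix} = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} = \bfv\text{.} \end{equation*}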
The definition of linear dependence we have given is the intuitive one, but it is not the one most widely used. The following result provides an equivalent definition of linear dependence that is much easier to deploy.

Proposition 5.1.3.

Consider a set of vectors \(V' = \{\bfv_1, \ldots, \bfv_n\}\) in a vector space \(V\text{.}\) The set \(V'\) is linearly dependent if and only if there exist scalars \(c_1, \ldots, c_n\text{,}\) not all of which are zero, such that
\begin{equation*} c_1\bfv_1 + \cdots + c_n\bfv_n = \bfo\text{.} \end{equation*}

Proof.

We will first dispatch the case where \(n=1\text{.}\) If \(n=1\) and \(V'\) is linearly dependent, then \(\bfv_1=\bfo\text{,}\) and the equation \(1\bfv_1 = \bfo\) is a non-trivial linear combination equal to \(\bfo\text{.}\) Conversely, if \(c_1\bfv_1 = \bfo\) for some \(c_1 \neq 0\text{,}\) then by Theorem 2.3.12, Item 7, we must have \(\bfv_1=\bfo\text{,}\) meaning \(V'\) is linearly dependent.
We now consider the case where \(n \ge 2\text{.}\) If \(V'\) is linearly dependent, then some vector in \(V'\text{,}\) call it \(\bfv_j\text{,}\) is a linear combination of the other vectors in \(V'\text{.}\) This means we have
\begin{equation*} \bfv_j = c_1\bfv_1 + \cdots + c_{j-1}\bfv_{j-1} + c_{j+1}\bfv_{j+1} + \cdots + c_n\bfv_n\text{.} \end{equation*}
If we subtract \(\bfv_j\) from both sides, we have
\begin{equation*} \bfo = c_1\bfv_1 + \cdots + c_{j-1}\bfv_{j-1} - \bfv_j + c_{j+1}\bfv_{j+1} + \cdots + c_n\bfv_n\text{.} \end{equation*}
Since we have now written \(\bfo\) as a non-trivial linear combination of the vectors in \(V'\) (that is, a linear combination in which the coefficients are not all zero), we have completed half of the proof.
We now suppose that there is a linear combination of the vectors in \(V'\text{,}\)
\begin{equation*} \bfo = c_1\bfv_1 + \cdots + c_n\bfv_n\text{,} \end{equation*}
where not all of the coefficients are zero. If \(c_j \neq 0\text{,}\) then we can write
\begin{align*} -c_j\bfv_j \amp = c_1\bfv_1 + \cdots + c_{j-1}\bfv_{j-1} + c_{j+1}\bfv_{j+1} + \cdots + c_n\bfv_n\\ \bfv_j \amp = \left(-\frac{c_1}{c_j}\right)\bfv_1 + \cdots + \left(-\frac{c_{j-1}}{c_j}\right)\bfv_{j-1} \\ \amp \hspace{12pt} + \left(-\frac{c_{j+1}}{c_j}\right)\bfv_{j+1} + \cdots + \left(-\frac{c_n}{c_j}\right)\bfv_n\text{.} \end{align*}
Thus we have written \(\bfv_j\) as a linear combination of the other vectors in \(V'\text{,}\) so \(V'\) is linearly dependent.
We will often use this statement in Proposition 5.1.3 as our definition of linear dependence.

Definition 5.1.4.

A set of vectors \(V' = \{\bfv_1, \ldots, \bfv_n\}\) in a vector space \(V\) is linearly independent if it is not linearly dependent.

Note 5.1.5.

In practice, we will think about linear independence in the following way. A set \(V' = \{\bfv_1, \ldots, \bfv_n\}\) is linearly independent if the vector equation
\begin{equation*} x_1\bfv_1 + \cdots + x_n \bfv_n = \bfo \end{equation*}
has only the trivial solution.
Further, when a set \(V'\) is linearly dependent, then we will call a non-trivial linear combination
\begin{equation*} c_1\bfv_1 + \cdots + c_n\bfv_n = \bfo \end{equation*}
a linear dependence relation for the vectors in \(V'\text{.}\)
We will try to make the notions of linear dependence and linear independence more concrete with some examples.

Example 5.1.6.

Consider the set \(V' = \{p_1, p_2\}\) in \(P_2\text{,}\) where
\begin{equation*} p_1 = 1 + t \hspace{6pt} \text{and} \hspace{6pt} p_2 = 3t^2\text{.} \end{equation*}
We can see that the set \(V'\) is linearly independent, because the only way to produce the zero polynomial from the combination \(c_1p_1 + c_2p_2\) is to take \(c_1 = c_2 = 0\text{.}\) This is relatively easy to see in this example, since \(p_1\) and \(p_2\) share no powers of \(t\text{.}\) If the coefficient of \(t^2\) must be zero in the sum \(c_1p_1+c_2p_2\text{,}\) then we must have \(c_2=0\text{.}\) And if the coefficient of \(t\) must be zero in the sum \(c_1p_1 + c_2p_2\text{,}\) then we must have \(c_1=0\text{.}\)
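Written out, the combination in question is
\begin{equation*} c_1p_1 + c_2p_2 = c_1 + c_1t + 3c_2t^2\text{,} \end{equation*}
and this equals the zero polynomial only when \(c_1 = 0\) and \(3c_2 = 0\text{,}\) that is, only when \(c_1 = c_2 = 0\text{.}\)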

Example 5.1.7.

Consider the following vectors in \(\rr^2\text{:}\)
\begin{equation*} \bfv_1 = \begin{bmatrix} 2 \\ 4 \end{bmatrix}, \hspace{6pt} \bfv_2 = \begin{bmatrix} -4 \\ -2 \end{bmatrix}, \hspace{6pt} \bfv_3 = \begin{bmatrix} 5 \\ -5 \end{bmatrix}\text{.} \end{equation*}
We can show that \(\bfv_3 \in \spn\{\bfv_1,\bfv_2\}\) by a familiar matrix reduction:
\begin{equation*} \begin{bmatrix} 2 \amp -4 \amp 5 \\ 4 \amp -2 \amp -5 \end{bmatrix} \sim \begin{bmatrix} 1 \amp 0 \amp -2.5 \\ 0 \amp 1 \amp -2.5 \end{bmatrix}\text{.} \end{equation*}
This shows us that \(\bfv_3 = -2.5 \bfv_1 - 2.5 \bfv_2\text{,}\) which proves that the set \(\{\bfv_1, \bfv_2, \bfv_3\}\) is linearly dependent. Further, we can conclude that the following is a linear dependence relation for the vectors in the set \(\{\bfv_1, \bfv_2, \bfv_3\}\text{:}\)
\begin{equation*} \bfo = 2.5 \bfv_1 +2.5 \bfv_2 + \bfv_3 \text{.} \end{equation*}
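Row reductions like this one are easy to check with software. The following sketch, which assumes the Python library sympy is available, reproduces the reduction above; the variable names are our own.

```python
from sympy import Matrix

# Augmented matrix [v1 v2 | v3] from Example 5.1.7.
A = Matrix([[2, -4,  5],
            [4, -2, -5]])

R, pivots = A.rref()
print(R)       # Matrix([[1, 0, -5/2], [0, 1, -5/2]])
print(pivots)  # (0, 1)
# The last column gives v3 = -5/2*v1 - 5/2*v2, the relation found above.
```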
We will now see that sets of two vectors are particularly nice when it comes to determining linear independence. (This means that there was an easier way to complete Example 5.1.6.)
Consider a set \(V' = \{\bfv, \bfw\}\) in a vector space \(V\text{.}\) If \(V'\) is linearly dependent, then we can write \(\bfv = c\bfw\) or \(\bfw = d\bfv\) for some \(c,d \in \ff\text{.}\) Conversely, if \(V'\) is linearly independent, then we cannot have either \(\bfv = c\bfw\) or \(\bfw = d\bfv\text{.}\) This means that we have a handy characterization of linear dependence for sets of two vectors.

Proposition 5.1.8.

A set \(\{\bfv, \bfw\}\) of two vectors in a vector space \(V\) is linearly dependent if and only if one of the vectors is a scalar multiple of the other.

Example 5.1.9.

If \(\bfv\) and \(\bfw\) are the following vectors in \(\rr^3\text{,}\)
\begin{equation*} \bfv = \begin{bmatrix} 4 \\ 3 \\ -3 \end{bmatrix} \hspace{6pt} \text{and} \hspace{6pt} \bfw = \begin{bmatrix} 2 \\ 3 \\ -1 \end{bmatrix}\text{,} \end{equation*}
then the set \(\{\bfv, \bfw\}\) is linearly independent since neither \(\bfv\) nor \(\bfw\) is a multiple of the other.
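For pairs of vectors this test is simple enough to automate. Here is a minimal sketch, again assuming sympy, built on the fact that a two-column matrix has rank at most \(1\) exactly when its columns are parallel; the function name is our own.

```python
from sympy import Matrix

def is_dependent_pair(v, w):
    """True when {v, w} is linearly dependent, i.e. one vector
    is a scalar multiple of the other (this includes the zero vector)."""
    return Matrix.hstack(v, w).rank() <= 1

v = Matrix([4, 3, -3])
w = Matrix([2, 3, -1])
print(is_dependent_pair(v, w))  # False, so {v, w} is linearly independent
```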
There is one other notable fact that will allow us to determine whether particular sets of vectors are linearly dependent.

Proposition 5.1.10.

Any set of vectors in a vector space \(V\) which contains the zero vector \(\bfo\) is linearly dependent.
A reader may guess that we will occasionally need to figure out whether or not a given set of vectors is linearly independent. As with questions about span, this turns out to be easier when the vector space is \(\ff^n\) for some field \(\ff\text{.}\) For other sorts of vector spaces, we will need different methods.

Proposition 5.1.11.

Let \(A \in M_{m,n}(\ff)\text{.}\) The columns of \(A\) are linearly independent if and only if \(\nll(A) = \{\bfo\}\text{.}\)

Proof.

Let \(\bfv_1, \ldots, \bfv_n \in \ff^m\) be the columns of \(A\text{.}\) Then the vector form of the equation \(A\bfx = \bfo\) is
\begin{equation*} x_1\bfv_1 + \cdots + x_n\bfv_n = \bfo\text{.} \end{equation*}
If the columns of \(A\) are linearly independent, then the only solution to this equation is \(\bfx = \bfo\text{,}\) which means that \(\nll(A) = \{\bfo\}\text{.}\) Alternatively, if the columns of \(A\) are linearly dependent, then \(\nll(A)\) contains a non-zero vector, namely the vector of scalars which provides a linear dependence relation for these vectors.
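For vectors in \(\qq^m\) or \(\rr^m\text{,}\) Proposition 5.1.11 translates directly into a computation: the columns of \(A\) are linearly independent exactly when a basis for \(\nll(A)\) is empty. A sketch assuming sympy, using the vectors of Example 5.1.7:

```python
from sympy import Matrix

# Columns are v1, v2, v3 from Example 5.1.7.
A = Matrix([[2, -4,  5],
            [4, -2, -5]])

basis = A.nullspace()
print(basis)  # [Matrix([[5/2], [5/2], [1]])] -- non-empty, so dependent
# The entries of this null space vector are the coefficients of a linear
# dependence relation: 5/2*v1 + 5/2*v2 + 1*v3 = 0, matching Example 5.1.7.
```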

Note 5.1.12.

The reader may note the slight abuse of terminology in the previous proof. We referred to the columns of a matrix being linearly dependent or independent instead of the set containing the columns of the matrix. We trust that the reader will forgive and overlook this misstep since the meaning is clear and the verbal gymnastics needed to be precise at all times can prove tiresome.
The following proposition provides another test of the linear dependence of a set of vectors.

Proposition 5.1.13.

Let \(V' = \{\bfv_1, \ldots, \bfv_n\}\) be a set of vectors in \(\ff^m\text{.}\) If \(n > m\text{,}\) then \(V'\) is linearly dependent.

Proof.

Let \(V' = \{\bfv_1, \ldots, \bfv_n\}\) be a set of vectors in \(\ff^m\) and let \(A\) be the matrix with the elements of \(V'\) as its columns. Since \(n > m\text{,}\) Corollary 1.3.7 (or, technically, the version of this result generalized to any field \(\ff\)) tells us that the \(m\times n\) linear system represented by \(A\bfx = \bfo\) cannot have a unique solution. Since \(\bfx = \bfo\) is a known solution, the presence of another solution means that the columns of \(A\) must be linearly dependent by Proposition 5.1.11.
Proposition 5.1.11 provides us with a convenient algorithm to determine whether or not a set of vectors in \(\ff^m\) is linearly independent.

Algorithm 5.1.14.

Let \(V' = \{\bfv_1, \ldots, \bfv_n\}\) be a set of vectors in \(\ff^m\text{.}\) To determine whether \(V'\) is linearly independent, form the matrix \(A = [\bfv_1 \cdots \bfv_n]\) and compute its RREF. The set \(V'\) is linearly independent if and only if the RREF of \(A\) has a pivot in every column.

Example 5.1.15.

Consider the following vectors in \(\ff_5^3\text{:}\)
\begin{equation*} \bfv_1 = \begin{bmatrix} 4 \\ 1 \\ 4 \end{bmatrix}, \hspace{6pt} \bfv_2 = \begin{bmatrix} 2 \\ 3 \\ 4 \end{bmatrix}, \hspace{6pt} \bfv_3 = \begin{bmatrix} 1 \\ 3 \\ 1 \end{bmatrix}\text{.} \end{equation*}
The set \(\{\bfv_1, \bfv_2, \bfv_3\}\) is linearly independent because the matrix \(A = [\bfv_1\; \bfv_2\; \bfv_3]\) has \(I_3\) as its RREF.
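Over a finite field such as \(\ff_5\text{,}\) the row reduction must be carried out modulo \(5\text{.}\) The following sketch is a bare-bones row reducer over \(\ff_p\) written for illustration (it is not part of the text, and it assumes Python 3.8 or later for modular inverses via pow); applied to the matrix \(A\) above, it returns \(I_3\text{.}\)

```python
def rref_mod_p(rows, p):
    """Row reduce a matrix (a list of rows of integers) over F_p."""
    M = [[x % p for x in row] for row in rows]
    m, n = len(M), len(M[0])
    r = 0  # next pivot row
    for col in range(n):
        # Find a row at or below r with a nonzero entry in this column.
        pivot = next((i for i in range(r, m) if M[i][col] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        inv = pow(M[r][col], -1, p)  # multiplicative inverse mod p
        M[r] = [(inv * x) % p for x in M[r]]
        for i in range(m):
            if i != r and M[i][col] != 0:
                c = M[i][col]
                M[i] = [(M[i][j] - c * M[r][j]) % p for j in range(n)]
        r += 1
    return M

A = [[4, 2, 1],
     [1, 3, 3],
     [4, 4, 1]]
print(rref_mod_p(A, 5))  # [[1, 0, 0], [0, 1, 0], [0, 0, 1]], i.e. I_3
```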

Example 5.1.16.

We consider the following vectors in \(\rr^4\text{:}\)
\begin{equation*} \bfv_1 = \begin{bmatrix} -3 \\ 4.5 \\ 3.5 \\ -4.5 \end{bmatrix}, \hspace{6pt} \bfv_2 = \begin{bmatrix} 3.5 \\ 5 \\ 4 \\ 2 \end{bmatrix}, \hspace{6pt} \bfv_3 = \begin{bmatrix} -5 \\ 28 \\ 22 \\ -14 \end{bmatrix}, \hspace{6pt} \bfv_4 = \begin{bmatrix} 5 \\ 0 \\ 0.5 \\ -4 \end{bmatrix}\text{.} \end{equation*}
The set \(\{\bfv_1, \bfv_2, \bfv_3, \bfv_4\}\) is linearly dependent, because the matrix \(A\) which has the vectors \(\bfv_i\) as its columns has the following RREF:
\begin{equation*} \begin{bmatrix} 1 \amp 0 \amp 4 \amp 0 \\ 0 \amp 1 \amp 2 \amp 0 \\ 0 \amp 0 \amp 0 \amp 1 \\ 0 \amp 0 \amp 0 \amp 0 \end{bmatrix}\text{.} \end{equation*}
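This RREF can be verified with a short computation; the sketch below assumes sympy and uses exact Rational entries to avoid floating-point issues.

```python
from sympy import Matrix, Rational as Q

v1 = Matrix([-3, Q(9, 2), Q(7, 2), Q(-9, 2)])
v2 = Matrix([Q(7, 2), 5, 4, 2])
v3 = Matrix([-5, 28, 22, -14])
v4 = Matrix([5, 0, Q(1, 2), -4])

A = Matrix.hstack(v1, v2, v3, v4)
R, pivots = A.rref()
print(pivots)  # (0, 1, 3): the third column (index 2) has no pivot
# Reading the free column of R: v3 = 4*v1 + 2*v2, a dependence relation.
```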
Algorithm 5.1.14 only covers the situations when our vectors come from some \(\ff^m\text{.}\) In the case of other vector spaces, we will need to do more work.

Example 5.1.17.

Consider the following three elements of \(P_2\text{:}\)
\begin{equation*} p_1 = 1+t, \hspace{6pt} p_2 = t - t^2, \hspace{6pt} p_3 = 2 + 2t + t^2\text{.} \end{equation*}
To determine whether or not the set \(\{p_1, p_2, p_3\}\) is linearly dependent, we need to return to the definition. Suppose that we have
\begin{equation*} c_1p_1 + c_2p_2 + c_3p_3 = 0 \end{equation*}
for some \(c_1, c_2, c_3 \in \rr\text{.}\) In other words, this linear combination is the zero polynomial, so we have
\begin{equation*} c_1p_1 + c_2p_2 + c_3p_3 = 0 + 0t + 0t^2\text{.} \end{equation*}
For these specific polynomials, this means we have
\begin{align*} c_1(1+t) + c_2(t-t^2) + c_3(2+2t+t^2) \amp = 0 + 0t + 0t^2\\ (c_1+2c_3)1 + (c_1+c_2+2c_3)t + (-c_2+c_3)t^2 \amp = 0+0t+0t^2\text{.} \end{align*}
Since the coefficients of the corresponding powers of \(t\) must be equal on both sides of this equation, we have a linear system to solve:
\begin{align*} c_1+2c_3 \amp = 0\\ c_1+c_2+2c_3 \amp = 0\\ -c_2+c_3 \amp = 0\text{.} \end{align*}
Our variables in this system are \(c_1\text{,}\) \(c_2\text{,}\) and \(c_3\text{,}\) and we solve the system using the techniques from Section 1.3. We find that
\begin{equation*} \begin{bmatrix} 1 \amp 0 \amp 2 \\ 1 \amp 1 \amp 2 \\ 0 \amp -1 \amp 1 \end{bmatrix} \sim \begin{bmatrix} 1 \amp 0 \amp 0 \\ 0 \amp 1 \amp 0 \\ 0 \amp 0 \amp 1 \end{bmatrix}\text{.} \end{equation*}
This shows that the only solution to this linear system is the trivial one: \({c_1=c_2=c_3=0}\text{.}\) That means that the set \(\{p_1,p_2,p_3\}\) is linearly independent.
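The passage from polynomials to a linear system can also be mechanized: encode each polynomial by its coefficient vector with respect to \(\{1, t, t^2\}\) and row reduce. A sketch assuming sympy follows; the helper coeff_vector is our own.

```python
from sympy import Matrix, Poly, symbols

t = symbols('t')
polys = [1 + t, t - t**2, 2 + 2*t + t**2]

def coeff_vector(p, deg=2):
    """Coefficient vector of p with respect to the basis {1, t, t^2}."""
    c = Poly(p, t).all_coeffs()[::-1]  # constant term first
    return Matrix(c + [0] * (deg + 1 - len(c)))

A = Matrix.hstack(*(coeff_vector(p) for p in polys))
R, pivots = A.rref()
print(R)  # the identity matrix I_3, so {p1, p2, p3} is linearly independent
```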
We end this section with two additional results.

Theorem 5.1.18.

Let \(V' = \{\bfv_1, \ldots, \bfv_n\}\) be a set of \(n\) vectors in \(\ff^n\text{.}\) Then \(V'\) spans \(\ff^n\) if and only if \(V'\) is linearly independent, and each of these holds if and only if the RREF of \(A = [\bfv_1 \cdots \bfv_n]\) is \(I_n\text{.}\)

Proof.

Let \(A = [\bfv_1 \cdots \bfv_n ] \in M_n(\ff)\text{.}\) By Theorem 3.4.15, we know that \(V'=\{ \bfv_1, \ldots, \bfv_n \}\) spans \(\ff^n\) if and only if the RREF of \(A\) has a pivot in every row. Further, Algorithm 5.1.14 says that \(V'\) is linearly independent if and only if the RREF of \(A\) has a pivot in every column. Since \(A\) is a square matrix, each of these happens exactly when the RREF of \(A\) is \(I_n\text{.}\)
The following result appears to be little more than a slight restatement of the definition of linear dependence. However, the precise wording used in this theorem turns out to be quite useful in proving some results later in the text.

Theorem 5.1.19.

Let \(\{\bfv_1, \ldots, \bfv_n\}\) be a linearly dependent set of vectors in a vector space \(V\text{,}\) and suppose that \(\bfv_1 \neq \bfo\text{.}\) Then there exists some \(k \ge 2\) such that \(\bfv_k \in \spn\{\bfv_1, \ldots, \bfv_{k-1}\}\text{.}\)

Proof.

Let \(\{\bfv_1, \ldots, \bfv_n\}\) be a linearly dependent set of vectors in a vector space \(V\text{,}\) and suppose that \(\bfv_1 \neq \bfo\text{.}\) Then there exist scalars \(c_1, \ldots, c_n\text{,}\) not all of which are zero, such that
\begin{equation*} \bfo = c_1\bfv_1 + \cdots + c_n\bfv_n \text{.} \end{equation*}
Let \(k\) be the largest index such that \(c_k \neq 0\text{.}\) It must be that \(k \ge 2\text{:}\) if \(k=1\text{,}\) then \(c_1\bfv_1 = \bfo\) with \(c_1 \neq 0\text{,}\) which would force \(\bfv_1 = \bfo\text{,}\) contrary to our assumption. Then
\begin{equation*} c_k\bfv_k = -c_1\bfv_1 - \cdots - c_{k-1}\bfv_{k-1}\text{.} \end{equation*}
Since \(c_k \neq 0\text{,}\) we have
\begin{equation*} \bfv_k = \left(-\frac{c_1}{c_k}\right)\bfv_1 + \cdots + \left(-\frac{c_{k-1}}{c_k}\right)\bfv_{k-1}\text{.} \end{equation*}
This shows that \(\bfv_k \in \spn\{\bfv_1, \ldots, \bfv_{k-1} \}\) and completes the proof.
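The proof is constructive, and for vectors in \(\ff^m\) we can carry it out computationally: take any linear dependence relation (a null space vector), find the largest index with a nonzero coefficient, and solve for that vector. A sketch assuming sympy, with a function name of our own, checked against Example 5.1.7:

```python
from sympy import Matrix

def last_dependent_vector(vectors):
    """For a linearly dependent list of vectors (first vector nonzero),
    return (k, coeffs) with vectors[k] = sum of coeffs[i]*vectors[i], i < k."""
    A = Matrix.hstack(*vectors)
    rel = A.nullspace()[0]  # one linear dependence relation
    k = max(i for i, c in enumerate(rel) if c != 0)
    return k, [-rel[i] / rel[k] for i in range(k)]

v1, v2, v3 = Matrix([2, 4]), Matrix([-4, -2]), Matrix([5, -5])
print(last_dependent_vector([v1, v2, v3]))
# (2, [-5/2, -5/2]): v3 = -5/2*v1 - 5/2*v2, as in Example 5.1.7
```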

Note 5.1.20.

It is important to record that Theorem 5.1.19 doesn't say that in linearly dependent sets every vector is a combination of the vectors with smaller subscripts. We merely have the existence of a vector with that property.

Reading Questions

1.

For each of the following, determine whether the given set of vectors in \(\rr^3\) is linearly dependent or linearly independent. (You should NOT need to do any matrix row reduction to figure this out.) Refer to a fact or theorem from the section when you are giving your answer.
  1. \(\{\bfv_1, \bfv_2\}\) where
    \begin{equation*} \bfv_1 = \begin{bmatrix} -2 \\ 4 \\ -10 \end{bmatrix}, \hspace{6pt} \bfv_2 = \begin{bmatrix} 3 \\ -6 \\ 15 \end{bmatrix} \end{equation*}
  2. \(\{\bfv_1, \bfv_2, \bfv_3\}\) where
    \begin{equation*} \bfv_1 = \begin{bmatrix} 2 \\ -3 \\ 1 \end{bmatrix}, \hspace{6pt} \bfv_2 = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}, \hspace{6pt} \bfv_3 = \begin{bmatrix} -8 \\ -9 \\ 3 \end{bmatrix} \end{equation*}
  3. \(\{\bfv_1, \bfv_2, \bfv_3, \bfv_4\}\) where
    \begin{equation*} \bfv_1 = \begin{bmatrix} 2 \\ -3 \\ 1 \end{bmatrix}, \hspace{6pt} \bfv_2 = \begin{bmatrix} -4 \\ 5 \\ 0 \end{bmatrix}, \hspace{6pt} \bfv_3 = \begin{bmatrix} -8 \\ -9 \\ 3 \end{bmatrix}, \hspace{6pt} \bfv_4 = \begin{bmatrix} 10 \\ 7 \\ -2 \end{bmatrix} \end{equation*}
  4. \(\{\bfv_1, \bfv_2\}\) where
    \begin{equation*} \bfv_1 = \begin{bmatrix} 1 \\ 0 \\ -7 \end{bmatrix}, \hspace{6pt} \bfv_2 = \begin{bmatrix} 3 \\ 2 \\ 5 \end{bmatrix} \end{equation*}

2.

Determine whether the following sets in \(P_3\) are linearly independent. Explain your answers. You should not need to do any calculations.
  1. \(\displaystyle \{1+t^2, t^3\}\)
  2. \(\displaystyle \{1, 2t^2, -7+6t^2\}\)
  3. \(\displaystyle \{2-5t^2, -4+10t^2\}\)

Exercises

1.

For each of the following, determine by inspection (without doing any calculation) whether the given set is linearly dependent or linearly independent. Explain your answers.
  1. \(\{\bfv_1, \bfv_2\}\) in \(\rr^3\) where
    \begin{equation*} \bfv_1 = \begin{bmatrix} 2 \\ -1 \\ 4 \end{bmatrix}, \hspace{12pt} \bfv_2 = \begin{bmatrix} 3 \\ 8 \\ -4 \end{bmatrix} \end{equation*}
  2. \(\{p_1, p_2 \}\) in \(P_2\) where
    \begin{equation*} p_1 = 2t - 4t^2, \hspace{12pt} p_2 = -t + 2t^2 \end{equation*}
  3. \(\{\bfv_1, \bfv_2 \}\) in \(\ff_5^2\) where
    \begin{equation*} \bfv_1 = \begin{bmatrix} 3 \\ 4 \end{bmatrix}, \hspace{12pt} \bfv_2 = \begin{bmatrix} 1 \\ 3 \end{bmatrix} \end{equation*}
  4. \(\{\bfv_1, \bfv_2, \bfv_3 \}\) in \(\ff_7^2\) where
    \begin{equation*} \bfv_1 = \begin{bmatrix} 1 \\ 4 \end{bmatrix}, \hspace{6pt} \bfv_2 = \begin{bmatrix} 0 \\ 4 \end{bmatrix}, \hspace{6pt} \bfv_3 = \begin{bmatrix} 5 \\ 1 \end{bmatrix} \end{equation*}
  5. \(\{\bfv_1, \bfv_2, \bfv_3\}\) in \(\rr^3\) where
    \begin{equation*} \bfv_1 = \begin{bmatrix} 1 \\ -1 \\ 3 \end{bmatrix}, \hspace{6pt} \bfv_2 = \begin{bmatrix} 3 \\ -3 \\ 9 \end{bmatrix}, \hspace{6pt} \bfv_3 = \begin{bmatrix} 0 \\ -2 \\ -4 \end{bmatrix} \end{equation*}

2.

Determine the value(s) of \(c\text{,}\) if any, that will make the set \(\{\bfv_1, \bfv_2, \bfv_3\}\) linearly independent in \(\rr^3\text{.}\)
  1. \(\displaystyle \bfv_1 = \begin{bmatrix} -3 \\ 1 \\ 3 \end{bmatrix}, \hspace{6pt} \bfv_2 = \begin{bmatrix} 4 \\ -4 \\ -3 \end{bmatrix}, \hspace{6pt} \bfv_3 = \begin{bmatrix} 9 \\ -11 \\ c \end{bmatrix}\)
  2. \(\displaystyle \bfv_1 = \begin{bmatrix} -1 \\ 3 \\ 2 \end{bmatrix}, \hspace{6pt} \bfv_2 = \begin{bmatrix} 2 \\ -6 \\ -4 \end{bmatrix}, \hspace{6pt} \bfv_3 = \begin{bmatrix} 2 \\ -2 \\ c \end{bmatrix}\)

3.

Determine the value(s) of \(c\text{,}\) if any, that will make the set \(\{\bfv_1, \bfv_2, \bfv_3\}\) linearly dependent in \(\rr^3\text{,}\) if
\begin{equation*} \bfv_1 = \begin{bmatrix} 1 \\ -2 \\ 3 \end{bmatrix}, \hspace{6pt} \bfv_2 = \begin{bmatrix} 3 \\ -4 \\ -4 \end{bmatrix}, \hspace{6pt} \bfv_3 = \begin{bmatrix} -2 \\ 0 \\ c \end{bmatrix}\text{.} \end{equation*}
Answer.
The set is linearly dependent if and only if \(c = 20\text{.}\)

5.

Determine whether the following statements are true or false. Justify your answer either way.
  1. If \(V' = \{\bfv_1, \bfv_2\}\) is a subset of a vector space \(V\) and \(\bfv_2\) is not a scalar multiple of \(\bfv_1\text{,}\) then \(V'\) is linearly independent.
  2. If \(V' = \{\bfv_1, \ldots, \bfv_4\}\) is a subset of a vector space \(V\) and \(\bfv_3\) is not a linear combination of \(\bfv_1\text{,}\) \(\bfv_2\text{,}\) and \(\bfv_4\text{,}\) then \(V'\) is linearly independent.
  3. If \(V' = \{\bfv_1, \ldots, \bfv_4\}\) is a subset of a vector space \(V\) and both \(\{\bfv_1, \bfv_2, \bfv_3\}\) and \(\{\bfv_2, \bfv_3, \bfv_4\}\) are linearly independent, then \(V'\) is linearly independent.
  4. If \(V' = \{\bfv_1, \ldots, \bfv_4\}\) is a subset of a vector space \(V\) and \(V'\) is linearly independent, then \(\{\bfv_1, \bfv_2, \bfv_3\}\) is also linearly independent.

6.

Determine whether or not the following set of vectors is linearly independent in the given vector space.
  1. \(\{\bfv_1, \bfv_2, \bfv_3\}\) in \(\rr^4\) if
    \begin{equation*} \bfv_1 = \begin{bmatrix} 3 \\ -2.5 \\ -5 \\ -1 \end{bmatrix}, \hspace{6pt} \bfv_2 = \begin{bmatrix} 4 \\ 1 \\ -1 \\ -4.5 \end{bmatrix}, \hspace{6pt} \bfv_3 = \begin{bmatrix} 4 \\ -1 \\ 3 \\ 2.5 \end{bmatrix} \end{equation*}
  2. \(\{\bfv_1, \bfv_2, \bfv_3\}\) in \(\ff_5^3\) if
    \begin{equation*} \bfv_1 = \begin{bmatrix} 1 \\ 3 \\ 0 \end{bmatrix}, \hspace{6pt} \bfv_2 = \begin{bmatrix} 2 \\ 1 \\ 4 \end{bmatrix}, \hspace{6pt} \bfv_3 = \begin{bmatrix} 0 \\ 0 \\ 4 \end{bmatrix} \end{equation*}
  3. \(\{\bfv_1, \bfv_2, \bfv_3\}\) in \(\ff_3^3\) if
    \begin{equation*} \bfv_1 = \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}, \hspace{6pt} \bfv_2 = \begin{bmatrix} 0 \\ 1 \\ 1 \end{bmatrix}, \hspace{6pt} \bfv_3 = \begin{bmatrix} 1 \\ 1 \\ 2 \end{bmatrix} \end{equation*}

7.

For each of the following subsets \(\{p_1,p_2,p_3\}\) of \(P_2\text{,}\) determine whether the set is linearly dependent or linearly independent. Explain your answers.
  1. \(p_1 = 3 + 5t^2\text{,}\) \(p_2 = -5-3t+2t^2\text{,}\) \(p_3 = -4-5t-2t^2\)
  2. \(p_1 = 2-t+t^2\text{,}\) \(p_2 = -3+5t-12t^2\text{,}\) \(p_3 = -2-2t+8t^2\)

Writing Exercises

8.
Let \(T:V \to W\) be a linear transformation between vector spaces.
  1. Prove that if \(\{\bfv_1, \ldots, \bfv_n\}\) is a linearly dependent set in \(V\text{,}\) then \(\{T(\bfv_1), \ldots, T(\bfv_n)\}\) is a linearly dependent set in \(W\text{.}\)
  2. Prove that if \(T\) is injective and if \(\{T(\bfv_1), \ldots, T(\bfv_n)\}\) is a linearly dependent set in \(W\text{,}\) then \(\{\bfv_1, \ldots, \bfv_n\}\) is a linearly dependent set in \(V\text{.}\)
9.
Suppose that \(V_1\) and \(V_2\) are subsets of a vector space \(V\text{.}\) Prove that if \(V_1 \subseteq V_2\) and \(V_1\) is linearly dependent, then \(V_2\) is linearly dependent.
Solution.
Suppose that
\begin{equation*} V_1 = \{\bfv_1,\ldots, \bfv_k\} \end{equation*}
and
\begin{equation*} V_2 = \{\bfv_1,\ldots,\bfv_k,\bfv_{k+1},\ldots,\bfv_n\}\text{.} \end{equation*}
Since \(V_1\) is linearly dependent, we know that there exist scalars \(c_1, \ldots, c_k\text{,}\) not all of which are zero, such that
\begin{equation*} c_1\bfv_1 + \cdots + c_k\bfv_k = \bfo\text{.} \end{equation*}
Then it is easy to produce a linear dependence relation for the set \(V_2\text{:}\)
\begin{equation*} c_1\bfv_1 + \cdots + c_k\bfv_k + 0\bfv_{k+1} + \cdots + 0\bfv_n = \bfo\text{.} \end{equation*}
This proves that \(V_2\) is linearly dependent.
10.
Let \(A \in M_n(\ff)\) and let \(T:\ff^n \to \ff^n\) be a linear transformation.
  1. Prove that \(\nll(A) = \{\bfo\}\) if and only if \(\col(A) = \ff^n\text{.}\)
  2. Prove that \(T\) is injective if and only if it is surjective.
Solution.
  1. If \(A \in M_n(\ff)\text{,}\) then \(\nll(A) = \{\bfo\}\) if and only if the RREF of \(A\) is \(I_n\text{.}\) It is always true that the null space is \(\{\bfo \}\) if and only if we have a pivot in each column of the RREF; when \(A\) is square, this means that the RREF must be \(I_n\text{.}\)
    We also know that \(\col(A) = \ff^n\) if and only if the RREF of \(A\) is \(I_n\text{.}\) It is always true that the column space is \(\ff^n\) if and only if we have a pivot in each row of the RREF; when \(A\) is square, this means that the RREF must be \(I_n\text{.}\)
    Putting these two paragraphs together, we conclude that \(\nll(A) = \{\bfo\}\) if and only if \(\col(A) = \ff^n\text{.}\)
  2. If \(T\) is a linear transformation \(\ff^n \to \ff^n\text{,}\) then by Theorem 3.2.2 there is a matrix \(B \in M_n(\ff)\) such that \(T\) is multiplication by \(B\text{.}\) Then the kernel of \(T\) is the null space of \(B\) and the range of \(T\) is the column space of \(B\text{.}\) So, \(T\) is injective if and only if \(\nll(B) = \{\bfo\}\text{,}\) and \(T\) is surjective if and only if \(\col(B) = \ff^n\text{.}\) Then the result from part a completes the argument.
11.
Let \(V\) be a vector space over \(\qq\) and let \(V' = \{\bfv_1, \ldots, \bfv_n\}\) be a subset of \(V\text{.}\) Prove that \(V'\) is linearly dependent if and only if there exist integers \(c_1, \ldots, c_n\text{,}\) not all of which are zero, such that
\begin{equation*} c_1\bfv_1 + \cdots + c_n\bfv_n = \bfo\text{.} \end{equation*}
12.
Let \(A \in M_{m,n}(\ff)\) and suppose that \(\nll(A) \neq \{\bfo\}\text{.}\) Prove that the set of vectors that spans \(\nll(A)\text{,}\) produced by solving \(A\bfx = \bfo\) using the techniques of Section 1.3, is linearly independent.