
Section 5.2 Basis of a Vector Space

We have previously examined when a set of vectors spans a vector space. In this section, we will learn how to work with the most efficient spanning set possible.

Subsection 5.2.1 The Definition of a Basis

We begin with the notion of finite- and infinite-dimensional vector spaces.

Definition 5.2.1.

A vector space \(V\) is finite-dimensional if there is a finite set of vectors that spans \(V\text{.}\) A vector space is infinite-dimensional if it is not finite-dimensional.
We recall that linear independence was introduced in Section 5.1 as a way to eliminate redundancy. We pick up on this idea in the next definition.

Definition 5.2.2.

Let \(V\) be a finite-dimensional vector space. Then a set \(B = \{\bfv_1, \ldots, \bfv_n \}\) is a basis for \(V\) if \(B\) is a linearly independent set and if \(V = \spn\{\bfv_1,\ldots, \bfv_n\}\text{.}\)

Note 5.2.3.

The notion of a basis exists for infinite-dimensional vector spaces, but since the overwhelming majority of our work will be with finite-dimensional spaces, we have only given the definition in that setting.

Example 5.2.4.

We recall that \(\bfe_i\) is the vector in \(\ff^n\) with \(1\) in the \(i\)th coordinate and zeros elsewhere. Then the set \(E = \{\bfe_1, \ldots, \bfe_n\}\) is a basis for \(\ff^n\text{.}\) If we form the \(n\times n\) matrix with these vectors as columns, we see that it is the \(n\times n\) identity matrix. Since there is a pivot in every column, \(E\) is linearly independent according to Algorithm 5.1.14. Then Corollary 5.1.18 tells us that \(E\) also spans \(\ff^n\text{.}\) This proves that \(E\) is a basis for \(\ff^n\text{.}\)
We call this basis the standard basis for \(\ff^n\text{.}\)

Example 5.2.5.

We now consider the subset \(B = \{1, t, t^2\}\) of the vector space \(P_2\text{.}\) Since any vector in \(P_2\) can be written as \(a(1)+b(t)+c(t^2)\text{,}\) it is clear that \(B\) spans \(P_2\text{.}\) It is also true that \(B\) is linearly independent: the set \(\{1, t\}\) is linearly independent since neither vector is a scalar multiple of the other. And then since \(t^2\) is not a linear combination of \(1\) and \(t\text{,}\) we conclude that \(B\) is linearly independent by (the contrapositive of) the Linear Dependence Lemma (Theorem 5.1.19). This proves that \(B\) is a basis for \(P_2\text{.}\)
The analogous basis for \(P_n\text{,}\) \(\{1, t, \ldots, t^n \}\text{,}\) is often called the standard basis for \(P_n\text{.}\)

Example 5.2.6.

Consider the following matrix \(A \in M_{3,5}(\rr)\text{:}\)
\begin{equation*} A = \begin{bmatrix} 3 \amp -2 \amp -4 \amp -4 \amp 3 \\ 1 \amp -2 \amp 1 \amp 1 \amp 2 \\ 0 \amp 0 \amp 4 \amp 0 \amp -4 \end{bmatrix}\text{.} \end{equation*}
We will find a basis for \(\nll(A)\text{.}\)
Following the procedure we first encountered in Example 3.4.5, we start by finding the RREF of \(A\text{:}\)
\begin{equation*} A \sim \begin{bmatrix} 1 \amp 0 \amp 0 \amp -5/2 \amp -2 \\ 0 \amp 1 \amp 0 \amp -7/4 \amp -5/2 \\ 0 \amp 0 \amp 1 \amp 0 \amp -1 \end{bmatrix}\text{.} \end{equation*}
We see that \(x_4\) and \(x_5\) are free variables, and that any vector \(\bfx\) in \(\nll(A)\) can be written as
\begin{equation*} \bfx = \begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \\ x_5 \end{bmatrix} = \begin{bmatrix} (5/2)x_4 + 2x_5 \\ (7/4)x_4 + (5/2)x_5 \\ x_5 \\ x_4 \\ x_5 \end{bmatrix} = x_4 \begin{bmatrix} 5/2 \\ 7/4 \\ 0 \\ 1 \\ 0 \end{bmatrix} + x_5 \begin{bmatrix} 2 \\ 5/2 \\ 1 \\ 0 \\ 1 \end{bmatrix}\text{.} \end{equation*}
If we label the vectors
\begin{equation*} \bfv_1 = \begin{bmatrix} 5/2 \\ 7/4 \\ 0 \\ 1 \\ 0 \end{bmatrix} \hspace{6pt} \text{and} \hspace{6pt} \bfv_2 = \begin{bmatrix} 2 \\ 5/2 \\ 1 \\ 0 \\ 1 \end{bmatrix}\text{,} \end{equation*}
then we can see that \(\nll(A) = \spn\{\bfv_1, \bfv_2 \}\text{.}\) Further, we see that \(\{\bfv_1, \bfv_2 \}\) is linearly independent (neither vector is a scalar multiple of the other), so \(\{ \bfv_1, \bfv_2 \}\) is a basis for \(\nll(A)\text{.}\)
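This computation is easy to automate for matrices with rational entries. Below is a minimal sketch using Python's sympy library (a tooling choice for illustration; the text itself assumes no software), whose nullspace method carries out exactly the free-variable procedure above.

from sympy import Matrix

# The matrix A from this example; sympy uses exact rational arithmetic.
A = Matrix([[3, -2, -4, -4, 3],
            [1, -2, 1, 1, 2],
            [0, 0, 4, 0, -4]])

# rref() returns the reduced row echelon form and the pivot columns.
R, pivots = A.rref()
print(pivots)                    # (0, 1, 2), so x4 and x5 are free

# nullspace() produces one basis vector per free variable, just as
# we found by hand above.
for v in A.nullspace():
    print(v.T)                   # [5/2, 7/4, 0, 1, 0] and [2, 5/2, 1, 0, 1]
    assert A * v == Matrix.zeros(3, 1)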

Note 5.2.7.

What we observed in Example 5.2.6 is true more generally. Since the method we use to find a spanning set for \(\nll(A)\) always produces a linearly independent set (see Exercise 5.1.12), this method will always produce a basis for \(\nll(A)\text{.}\)
Here is an example where we are looking at whether a set of two vectors is a basis.

Example 5.2.8.

It turns out that it is fairly easy to tell whether a set of two vectors in \(\rr^2\) forms a basis for \(\rr^2\text{.}\) Since linear independence is easy to check with two vectors (is either vector a scalar multiple of the other?), we can focus on this characteristic. This means that the set \(\{\bfv_1, \bfv_2 \}\text{,}\) where
\begin{equation*} \bfv_1 = \begin{bmatrix} 1 \\ -3 \end{bmatrix} \hspace{6pt} \text{and} \hspace{6pt} \bfv_2 = \begin{bmatrix} -2 \\ 5 \end{bmatrix} \end{equation*}
is a basis for \(\rr^2\text{.}\) Neither vector is a scalar multiple of the other, so the set is linearly independent. And then Corollary 5.1.18 tells us that this set must also span \(\rr^2\text{.}\) (We could also easily see this by row reducing the matrix \([\bfv_1\; \bfv_2]\text{.}\))
On the other hand, the set \(W' = \{ \bfw_1, \bfw_2 \}\text{,}\) where
\begin{equation*} \bfw_1 = \begin{bmatrix} 2 \\ -1 \end{bmatrix} \hspace{6pt} \text{and} \hspace{6pt} \bfw_2 = \begin{bmatrix} -4 \\ 2 \end{bmatrix} \end{equation*}
is not a basis for \(\rr^2\text{.}\) Since \(\bfw_2 = -2\bfw_1\text{,}\) \(W'\) is not linearly independent, so it cannot be a basis.
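The scalar-multiple test for a pair of vectors is simple enough to state in code. Here is a small Python sketch (purely illustrative, using exact fractions for integer entries) that reproduces the two verdicts of this example.

from fractions import Fraction

def scalar_multiple(v, w):
    # True when w == c*v for some scalar c; assumes integer entries.
    for i, vi in enumerate(v):
        if vi != 0:
            c = Fraction(w[i], vi)
            return all(Fraction(wj) == c * vj for vj, wj in zip(v, w))
    return all(wj == 0 for wj in w)   # v is the zero vector

def independent_pair(v, w):
    # Two vectors are linearly independent exactly when neither is
    # a scalar multiple of the other.
    return not (scalar_multiple(v, w) or scalar_multiple(w, v))

print(independent_pair([1, -3], [-2, 5]))   # True: a basis for R^2
print(independent_pair([2, -1], [-4, 2]))   # False, since w2 = -2*w1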
Putting some facts together, there is a fairly straightforward condition for when a set of vectors in \(\ff^n\) is a basis for that space.

Proposition 5.2.9.

Let \(V' = \{\bfv_1, \ldots, \bfv_m \}\) be a set of \(m\) vectors in \(\ff^m\text{,}\) and let \(A = [\bfv_1 \cdots \bfv_m]\text{.}\) Then \(V'\) is a basis for \(\ff^m\) if and only if the RREF of \(A\) is the identity matrix \(I_m\text{.}\)

Proof.

From Theorem 3.4.15 we know that the set \(V' = \{\bfv_1, \ldots, \bfv_m \}\) spans \(\ff^m\) if and only if the RREF of \(A=[\bfv_1 \cdots \bfv_m]\) has a pivot in every row. Additionally, Algorithm 5.1.14 tells us that \(V'\) is linearly independent if and only if the RREF of \(A\) has a pivot in each column. The only way a matrix in RREF can have a pivot in every row and every column is if that RREF is the identity matrix.
We put this proposition into action in the following example.

Example 5.2.10.

Let \(A \in M_3(\ff_5)\) be the following matrix:
\begin{equation*} A = \begin{bmatrix} 3 \amp 4 \amp 4 \\ 3 \amp 0 \amp 1 \\ 4 \amp 3 \amp 4 \end{bmatrix}\text{.} \end{equation*}
We will label column \(i\) in \(A\) as the vector \(\bfv_i \in \ff_5^3\text{.}\)
Since the RREF of \(A\) is
\begin{equation*} \begin{bmatrix} 1 \amp 0 \amp 2 \\ 0 \amp 1 \amp 2 \\ 0 \amp 0 \amp 0 \end{bmatrix}\text{,} \end{equation*}
the set \(\{\bfv_1,\bfv_2,\bfv_3\}\) is not a basis for \(\ff_5^3\text{.}\)
On the other hand, if \(B \in M_3(\ff_5)\) is the matrix
\begin{equation*} B = \begin{bmatrix} 1 \amp 1 \amp 0 \\ 1 \amp 2 \amp 3 \\ 0 \amp 3 \amp 0 \end{bmatrix}\text{,} \end{equation*}
then the columns of \(B\) form a basis for \(\ff_5^3\) since the RREF of \(B\) is \(I_3\text{.}\)
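Row reduction over \(\ff_5\) is not built into most numerical libraries, but it is short to write by hand. The following Python sketch (illustrative only) implements RREF mod a prime \(p\) and applies the criterion of Proposition 5.2.9 to the two matrices above.

# Row reduction over the field F_p (p prime), written from scratch.
def rref_mod_p(rows, p):
    # Return the RREF of a matrix (list of lists of ints) mod p.
    M = [[x % p for x in row] for row in rows]
    m, n = len(M), len(M[0])
    r = 0                                # current pivot row
    for c in range(n):                   # sweep the columns left to right
        pivot = next((i for i in range(r, m) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        inv = pow(M[r][c], -1, p)        # multiplicative inverse mod p
        M[r] = [(x * inv) % p for x in M[r]]
        for i in range(m):
            if i != r and M[i][c] != 0:
                M[i] = [(a - M[i][c] * b) % p for a, b in zip(M[i], M[r])]
        r += 1
    return M

A = [[3, 4, 4], [3, 0, 1], [4, 3, 4]]
B = [[1, 1, 0], [1, 2, 3], [0, 3, 0]]
I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(rref_mod_p(A, 5) == I3)   # False: not a basis for F_5^3
print(rref_mod_p(B, 5) == I3)   # True: the columns form a basis
print(rref_mod_p(A, 5))         # [[1, 0, 2], [0, 1, 2], [0, 0, 0]]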

Subsection 5.2.2 The Properties of a Basis

Having a basis is a powerful tool. In particular, it guarantees a uniqueness that is quite useful.

Theorem 5.2.11. The Unique Representation Theorem.

Let \(V\) be a finite-dimensional vector space. Then \(V' = \{\bfv_1, \ldots, \bfv_n \}\) is a basis for \(V\) if and only if every vector \(\bfv \in V\) can be written uniquely as a linear combination of the vectors in \(V'\text{.}\)

Proof.

We will prove the forward direction of this biconditional statement directly. Suppose that \(V' = \{\bfv_1, \ldots, \bfv_n \}\) is a basis of \(V\text{.}\) Since \(V = \spn(V')\text{,}\) every vector in \(V\) can be written as a linear combination of the vectors in \(V'\text{.}\) Let \(\bfv\) be a vector in \(V\text{,}\) and suppose that \(\bfv\) can be written as a linear combination of the vectors in \(V'\) in two ways:
\begin{equation*} \bfv = \sum_{i=1}^n a_i\bfv_i \hspace{6pt} \text{and} \hspace{6pt} \bfv = \sum_{i=1}^n b_i\bfv_i \text{.} \end{equation*}
We want to show that \(a_i = b_i\) for each \(i\text{,}\) \(1 \le i \le n\text{.}\) Since both of these representations are equal to \(\bfv\text{,}\) they are equal to each other, so we have
\begin{equation*} \bfo = \sum_{i=1}^n a_i\bfv_i - \sum_{i=1}^n b_i\bfv_i = \sum_{i=1}^n (a_i-b_i)\bfv_i\text{.} \end{equation*}
Since \(V'\) is a linearly independent set (since we are assuming it is a basis), it must be that \(a_i-b_i=0\) for each \(i\text{.}\) Therefore, \(a_i=b_i\) and the representation of \(\bfv\) is unique.
For the other direction, we suppose that every element of \(V\) can be uniquely represented as a linear combination of the vectors in \(V'\text{.}\) Since every element of \(V\) can be represented as a linear combination of the vectors in \(V'\text{,}\) we see that \(V'\) spans \(V\text{.}\) Since every element in \(V\) can be represented uniquely as a linear combination of the vectors in \(V'\text{,}\) and since \(\bfo \in V\) can be represented as the trivial linear combination of the vectors in \(V'\text{,}\) this means that \(V'\) is linearly independent. (The trivial linear combination of vectors in \(V'\) is the only way to obtain \(\bfo\) as a linear combination of the vectors in \(V'\text{.}\)) Since \(V'\) is linearly independent and spans \(V\text{,}\) this proves that \(V'\) is a basis for \(V\text{.}\)
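In \(\ff^n\text{,}\) finding the unique coefficients guaranteed by this theorem amounts to solving the linear system with augmented matrix \([\bfv_1 \cdots \bfv_n \mid \bfv]\text{.}\) A brief sympy sketch (illustrative only), using the basis of \(\rr^2\) from Example 5.2.8 and an arbitrarily chosen vector \(\bfv\text{:}\)

from sympy import Matrix

v1 = Matrix([1, -3]); v2 = Matrix([-2, 5]); v = Matrix([4, 1])

A = Matrix.hstack(v1, v2)     # columns are the basis vectors
coeffs = A.solve(v)           # unique, since the columns form a basis
print(coeffs.T)               # [-22, -13]
assert coeffs[0]*v1 + coeffs[1]*v2 == v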
This next result shows us how to trim a spanning set down until we reach a basis.

Theorem 5.2.12.

Let \(V\) be a nonzero vector space, and suppose that the finite set \(B\) spans \(V\text{.}\) Then some subset of \(B\) is a basis for \(V\text{.}\)

Proof.

We suppose that \(B = \{\bfv_1, \ldots, \bfv_n \}\text{.}\) If \(B\) is linearly dependent, then by the Linear Dependence Lemma (Theorem 5.1.19), there exists a vector \(\bfv_k \in B\) such that \(\bfv_k\) can be written as a linear combination of the vectors \(\bfv_1, \ldots, \bfv_{k-1}\text{.}\) We suppose this combination is
\begin{equation} \bfv_k = a_1\bfv_1 + \cdots + a_{k-1}\bfv_{k-1}\text{.}\tag{5.1} \end{equation}
Now suppose \(\bfv\) is a vector in \(V\text{.}\) Since \(B\) spans \(V\text{,}\) we can write
\begin{equation} \bfv = c_1 \bfv_1 + \cdots + c_{k-1}\bfv_{k-1} + c_k\bfv_k + c_{k+1}\bfv_{k+1} + \cdots + c_n \bfv_n\text{.}\tag{5.2} \end{equation}
Using (5.1), we can substitute this expression in for \(\bfv_k\) in (5.2) and, once the algebraic dust settles, we will have written \(\bfv\) as a combination of the vectors in \(B - \{\bfv_k\}\text{.}\) This shows that \(\spn(B) = \spn(B - \{\bfv_k\})\text{.}\) (Since \(B - \{\bfv_k\} \subseteq B\text{,}\) it is true that \(\spn(B - \{\bfv_k\}) \subseteq \spn(B)\text{.}\) The argument thus far in this proof has established the other subset containment.)
If \(B\) is linearly independent, then it is already a basis for \(V\text{.}\) If it is linearly dependent, then we can remove a vector according to the above procedure to obtain a set \(B_1 = B - \{\bfw\}\) which still spans \(V\text{.}\) As long as there are two or more vectors in the spanning set, we can repeat this process until we are left with a linearly independent set and thus a basis. If the spanning set is eventually reduced to a single vector, that vector will be nonzero since \(V\) is nonzero, and therefore that set will be linearly independent and therefore a basis.
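In \(\ff^n\text{,}\) the trimming procedure in this proof can be imitated directly: walk through the spanning set and keep a vector only if it is not a linear combination of the vectors kept so far. A short sympy sketch (rank comparisons play the role of the Linear Dependence Lemma; the example vectors are made up for illustration):

from sympy import Matrix

def trim_to_basis(vectors):
    # Keep each vector only if it is not a linear combination of the
    # vectors already kept; the survivors span the same subspace and
    # are linearly independent, hence a basis of the span.
    kept = []
    for v in vectors:
        candidate = kept + [v]
        if Matrix.hstack(*candidate).rank() == len(candidate):
            kept.append(v)
    return kept

vs = [Matrix([1, 0, 1]), Matrix([2, 0, 2]),     # second = 2 * first
      Matrix([0, 1, 0]), Matrix([1, 1, 1])]     # fourth = first + third
print(len(trim_to_basis(vs)))                   # 2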

Corollary 5.2.13.

Every nonzero finite-dimensional vector space has a basis.

Proof.

Since a finite-dimensional vector space by definition has a finite spanning set \(B\text{,}\) Theorem 5.2.12 tells us that a subset of \(B\) will be a basis for the vector space.
While the proof of Theorem 5.2.12 provides a way to trim a spanning set down to a basis, it does not offer a practical method for this process. The following algorithm provides such a method for certain vector spaces.

Algorithm 5.2.14.

Let \(\{\bfv_1, \ldots, \bfv_n \}\) be a set of vectors in \(\ff^m\text{.}\) To find a basis for \(\spn\{\bfv_1, \ldots, \bfv_n \}\text{,}\) form the matrix \(A = [\bfv_1 \cdots \bfv_n]\) and find its RREF. The columns of \(A\) which correspond to the pivot columns in the RREF of \(A\) form a basis for \(\spn\{\bfv_1, \ldots, \bfv_n \}\text{.}\)

Proof.

We form the matrix \(A = [\bfv_1 \cdots \bfv_n]\text{.}\) If \(A\) is already in RREF, then the non-pivot columns are linear combinations of the pivot columns that precede them (when reading from left to right). So, those can be discarded and the pivot columns will be a basis, according to Theorem 5.2.12.
We will complete the proof with a reminder about the effect of elementary row operations on the columns of a matrix. If a column \(\bfv_k\) of \(A\) is a linear combination of the columns that precede it, then
\begin{equation*} \bfv_k = \sum_{i=1}^{k-1} c_i\bfv_i \end{equation*}
for some scalars \(c_i\text{.}\) This means that the column vector \([c_i]\) is a solution to the linear system represented by the augmented matrix \([\bfv_1 \cdots \bfv_{k-1} \mid \bfv_k ]\text{.}\) One of the earliest facts we learned about elementary row operations is that they preserve the solution sets of linear systems, so the same vector \([c_i]\) will be a solution to the linear system represented by the RREF of \([\bfv_1 \cdots \bfv_{k-1} \mid \bfv_k ]\text{.}\) This proves that the relationships between the columns of a matrix are the same as the relationships between the columns of the RREF of that matrix.
So, if \(A\) is not in RREF, we can find the RREF of \(A\text{,}\) call it \(C\text{.}\) The non-pivot columns of \(C\) indicate that the corresponding columns of \(A\) should not be included in the basis. In other words, the pivot columns of \(C\) indicate that the corresponding columns of \(A\) are the ones that should remain to form the basis.

Note 5.2.15.

We emphasize here that the pivot columns in the reduced matrix do not provide the vectors for the basis! The pivot columns merely provide the instructions for which of the original vectors should be kept to form the basis.

Example 5.2.16.

Consider the following matrix \(A \in M_{4,5}(\ff_5)\text{:}\)
\begin{equation*} A = \begin{bmatrix} 3 \amp 2 \amp 3 \amp 3 \amp 3 \\ 0 \amp 0 \amp 1 \amp 4 \amp 0 \\ 3 \amp 2 \amp 0 \amp 1 \amp 2 \\ 0 \amp 0 \amp 2 \amp 3 \amp 1 \end{bmatrix}\text{.} \end{equation*}
We will find a basis for \(\col(A)\) using Algorithm 5.2.14. When we put \(A\) into RREF, we find
\begin{equation*} A \sim \begin{bmatrix} 1 \amp 4 \amp 0 \amp 2 \amp 0 \\ 0 \amp 0 \amp 1 \amp 4 \amp 0 \\ 0 \amp 0 \amp 0 \amp 0 \amp 1 \\ 0 \amp 0 \amp 0 \amp 0 \amp 0 \end{bmatrix}\text{.} \end{equation*}
The pivots are in columns 1, 3, and 5, so a basis for \(\col(A)\) is \(\{ \bfv_1, \bfv_3, \bfv_5 \}\text{,}\) where
\begin{equation*} \bfv_1 = \begin{bmatrix} 3 \\ 0 \\ 3 \\ 0 \end{bmatrix}, \hspace{6pt} \bfv_3 = \begin{bmatrix} 3 \\ 1 \\ 0 \\ 2 \end{bmatrix}, \hspace{6pt} \bfv_5 = \begin{bmatrix} 3 \\ 0 \\ 2 \\ 1 \end{bmatrix}\text{.} \end{equation*}
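To carry out Algorithm 5.2.14 by machine over \(\ff_5\text{,}\) we can reuse the rref_mod_p sketch given after Example 5.2.10 and then read off the pivot columns. Note that, as Note 5.2.15 warns, the basis vectors are taken from \(A\) itself, not from its RREF. (Illustrative Python again.)

def pivot_columns(R):
    # Column index of the leading 1 in each nonzero row of an RREF.
    pivots = []
    for row in R:
        for j, x in enumerate(row):
            if x != 0:
                pivots.append(j)
                break
    return pivots

A = [[3, 2, 3, 3, 3],
     [0, 0, 1, 4, 0],
     [3, 2, 0, 1, 2],
     [0, 0, 2, 3, 1]]
R = rref_mod_p(A, 5)             # the function from the earlier sketch
cols = pivot_columns(R)          # [0, 2, 4]: pivots in columns 1, 3, and 5
basis = [[row[j] for row in A] for j in cols]    # columns of A, not of R!
print(basis)                     # [[3, 0, 3, 0], [3, 1, 0, 2], [3, 0, 2, 1]]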
We arrive at the end of this section with two helpful perspectives on a basis. A basis can be formed by trimming a spanning set down until it is linearly independent. Thus, a basis is a spanning set that is as small as possible. On the other hand, a linearly independent set can always be enlarged until it spans. Therefore, a basis is also a linearly independent set that is as large as possible.

Subsection 5.2.3 Reading Questions

1.

Consider the set \(V' = \{\bfv_1, \bfv_2, \bfv_3, \bfv_4\}\) in \(\rr^3\) where
\begin{equation*} \bfv_1 = \begin{bmatrix} -3 \\ 1 \\ 4 \end{bmatrix}, \hspace{6pt} \bfv_2 = \begin{bmatrix} 6 \\ 2 \\ 1 \end{bmatrix}, \hspace{6pt} \bfv_3 = \begin{bmatrix} 0 \\ 4 \\ -1 \end{bmatrix}, \hspace{6pt} \bfv_4 = \begin{bmatrix} 2 \\ -2 \\ 7 \end{bmatrix}\text{.} \end{equation*}
Find a basis for \(\spn(V')\text{.}\) Follow Example 5.2.16 and explain your answer.

2.

Determine whether or not the set \(\{ \bfv_1, \bfv_2, \bfv_3 \}\) forms a basis for \(\ff_7^3\text{,}\) where
\begin{equation*} \bfv_1 = \begin{bmatrix} 4 \\ 1 \\ 0 \end{bmatrix}, \hspace{6pt} \bfv_2 = \begin{bmatrix} 5 \\ 0 \\ 0 \end{bmatrix}, \hspace{6pt} \bfv_3 = \begin{bmatrix} 5 \\ 6 \\ 2 \end{bmatrix}\text{.} \end{equation*}
Explain your answer.

Subsection 5.2.4 Exercises

1.

For each of the following, determine whether the given set of vectors forms a basis for the indicated vector space.
  1. \(\{\bfv_1, \bfv_2, \bfv_3\}\) in \(\rr^3\) if
    \begin{equation*} \bfv_1 = \begin{bmatrix} -2 \\ 0.5 \\ -1 \end{bmatrix}, \hspace{6pt} \bfv_2 = \begin{bmatrix} -4.5 \\ -2.5 \\ 4.5 \end{bmatrix}, \hspace{6pt} \bfv_3 = \begin{bmatrix} 4.5 \\ 1.5 \\ -4.5 \end{bmatrix} \end{equation*}
  2. \(\{\bfv_1, \bfv_2 \}\) in \(\ff_5^2\) if
    \begin{equation*} \bfv_1 = \begin{bmatrix} 1 \\ 4 \end{bmatrix}, \hspace{6pt} \bfv_2 = \begin{bmatrix} 3 \\ 2 \end{bmatrix} \end{equation*}
  3. \(\{p_1, p_2, p_3\}\) in \(P_2\) if
    \begin{equation*} p_1 = 4 + 2t + 4t^2, \hspace{6pt} p_2 = -3+4t, \hspace{6pt} p_3 = 2-2t-4t^2 \end{equation*}

2.

For each of the following, determine whether the given set of vectors forms a basis for the indicated vector space.
  1. \(\{\bfv_1, \bfv_2, \bfv_3, \bfv_4\}\) in \(\rr^4\) if
    \begin{equation*} \bfv_1 = \begin{bmatrix} 0 \\ -2 \\ 1 \\ -3 \end{bmatrix}, \hspace{6pt} \bfv_2 = \begin{bmatrix} -2 \\ 2 \\ 1 \\ -2 \end{bmatrix}, \hspace{6pt} \bfv_3 = \begin{bmatrix} -4 \\ -2 \\ 5 \\ -13 \end{bmatrix}, \hspace{6pt} \bfv_4 = \begin{bmatrix} 3 \\ 0 \\ 1 \\ 0 \end{bmatrix} \end{equation*}
  2. \(\{\bfv_1, \bfv_2, \bfv_3 \}\) in \(\ff_3^3\) if
    \begin{equation*} \bfv_1 = \begin{bmatrix} 1 \\ 0 \\ 2 \end{bmatrix}, \hspace{6pt} \bfv_2 = \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}, \hspace{6pt} \bfv_3 = \begin{bmatrix} 0 \\ 1 \\ 1 \end{bmatrix} \end{equation*}
  3. \(\{p_1, p_2, p_3 \}\) in \(P_2\) if
    \begin{equation*} p_1 = 3+3t-4t^2, \hspace{6pt} p_2 = 3-3t, \hspace{6pt} p_3 = 12 + 6t -12t^2 \end{equation*}

3.

For each matrix \(A\text{,}\) find a basis for \(\nll(A)\) and \(\col(A)\text{.}\)
  1. \(A \in M_{3,4}(\rr)\text{,}\)
    \begin{equation*} A = \begin{bmatrix} -3 \amp 2 \amp -2 \amp 0.5 \\ 2.5 \amp 5 \amp 15 \amp 5 \\ -5 \amp -1.5 \amp -13 \amp 0.5 \end{bmatrix} \end{equation*}
  2. \(A \in M_{4,6}(\ff_5)\text{,}\)
    \begin{equation*} A = \begin{bmatrix} 3 \amp 2 \amp 0 \amp 4 \amp 3 \amp 4 \\ 3 \amp 3 \amp 3 \amp 1 \amp 3 \amp 1 \\ 4 \amp 0 \amp 2 \amp 0 \amp 3 \amp 3 \\ 4 \amp 4 \amp 1 \amp 2 \amp 0 \amp 2 \end{bmatrix} \end{equation*}

4.

Produce a matrix \(A \in M_{3,4}(\ff_5)\) which has two vectors in a basis for \(\nll(A)\) and two vectors in a basis for \(\col(A)\text{.}\)
Answer.
Any matrix that has two pivot columns and two non-pivot columns will work as an answer here. This is one of many such matrices:
\begin{equation*} \begin{bmatrix} 1 \amp 2 \amp 0 \amp 3 \\ 0 \amp 0 \amp 1 \amp 3 \\ 0 \amp 0 \amp 0 \amp 0 \end{bmatrix}\text{.} \end{equation*}
Of course, any matrix row equivalent to a matrix like this one would also work as an answer.

5.

Find a basis for \(\spn\{\bfv_1, \bfv_2, \bfv_3, \bfv_4, \bfv_5 \} \subseteq \rr^4\text{,}\) if
\begin{equation*} \bfv_1 = \begin{bmatrix} -2 \\ -3 \\ -2 \\ 1 \end{bmatrix}, \hspace{6pt} \bfv_2 = \begin{bmatrix} -2 \\ 0 \\ -1 \\ -1 \end{bmatrix}, \hspace{6pt} \bfv_3 = \begin{bmatrix} 2 \\ 9 \\ 10 \\ 1 \end{bmatrix}, \hspace{6pt} \bfv_4 = \begin{bmatrix} 3 \\ 2 \\ -4 \\ 9 \end{bmatrix}, \hspace{6pt} \bfv_5 = \begin{bmatrix} 12 \\ -16 \\ 5 \\ 1 \end{bmatrix}\text{.} \end{equation*}

6.

Find a basis for \(\spn\{\bfv_1, \bfv_2, \bfv_3, \bfv_4, \bfv_5 \} \subseteq \ff_5^4\text{,}\) if
\begin{equation*} \bfv_1 = \begin{bmatrix} 0 \\ 0 \\ 2 \\ 3 \end{bmatrix}, \hspace{6pt} \bfv_2 = \begin{bmatrix} 3 \\ 1 \\ 3 \\ 0 \end{bmatrix}, \hspace{6pt} \bfv_3 = \begin{bmatrix} 2 \\ 0 \\ 0 \\ 2 \end{bmatrix}, \hspace{6pt} \bfv_4 = \begin{bmatrix} 1 \\ 2 \\ 4 \\ 2 \end{bmatrix}, \hspace{6pt} \bfv_5 = \begin{bmatrix} 1 \\ 4 \\ 3 \\ 3 \end{bmatrix}\text{.} \end{equation*}
Answer.
A basis is \(\{\bfv_1, \bfv_2, \bfv_3 \}\text{.}\)

7.

Let \(V\) be the vector space of all functions \(\rr \to \rr\text{.}\) Find a basis for the subspace \(H\text{,}\) if
\begin{equation*} H = \spn\{\sin(t), \sin(2t), \sin(t)\cos(t) \}\text{.} \end{equation*}
Answer.
We will argue using the (contrapositive of the) Linear Dependence Lemma. Since \(\sin(t)\) is not the zero function, then \(\{\sin(t)\}\) is a linearly independent set. Also, since \(\sin(2t)\) is not a scalar multiple of \(\sin(t)\) (this can be verified by comparing graphs of the two functions), the set \(\{\sin(t), \sin(2t)\}\) is also linearly independent. However, there is a trig identity which says that \(\sin(2t) = 2\sin(t)\cos(t)\text{,}\) meaning that the set \(\{\sin(t), \sin(2t), \sin(t)\cos(t) \}\) is linearly dependent (as one vector is a multiple of another within this set). Therefore, one basis for \(H\) is \(\{\sin(t), \sin(2t)\}\text{.}\)
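For readers who want a symbolic check of the double-angle identity used here, a one-line sympy verification (a tooling choice only, not part of the exercise):

from sympy import symbols, sin, cos, simplify

t = symbols('t')
print(simplify(sin(2*t) - 2*sin(t)*cos(t)))   # prints 0: sin(2t) = 2 sin(t) cos(t)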

8.

Find a matrix \(A \in M_2(\rr)\) such that
\begin{equation*} A \begin{bmatrix} 1 \\ 3 \end{bmatrix} = \begin{bmatrix} 2 \\ 0 \end{bmatrix}, \hspace{6pt} A \begin{bmatrix} -1 \\ -2 \end{bmatrix} = \begin{bmatrix} 1 \\ 1 \end{bmatrix}\text{.} \end{equation*}
Is \(A\) unique? Explain.

9.

Suppose that \(T:\rr^3 \to \rr^2\) is a linear transformation which satisfies the following:
\begin{equation*} T \left( \begin{bmatrix} 1 \\ 1 \\ 3 \end{bmatrix} \right) = \begin{bmatrix} 2 \\ -1 \end{bmatrix}, \hspace{6pt} T \left( \begin{bmatrix} -1 \\ 0 \\ 2 \end{bmatrix} \right) = \begin{bmatrix} 1 \\ 0 \end{bmatrix}, \hspace{6pt} T \left( \begin{bmatrix} 3 \\ -2 \\ 0 \end{bmatrix} \right) = \begin{bmatrix} 3 \\ -2 \end{bmatrix}\text{.} \end{equation*}
Calculate
\begin{equation*} T \left( \begin{bmatrix} -1 \\ 4 \\ 6 \end{bmatrix} \right)\text{.} \end{equation*}

10.

Find a subset of the following set which is a basis for \(P_2\text{:}\)
\begin{equation*} \{t-1, t^2-2t, t^2-2, t^2+1 \}\text{.} \end{equation*}

Writing Exercises

11.
Let \(T: V\to W\) be a linear transformation between vector spaces, and let \(B\) be a basis for \(V\text{.}\)
  1. Produce an example to show that \(T(B)\) does not need to be a basis of \(W\text{.}\)
  2. Suppose that \(T\) is injective. Must \(T(B)\) be a basis for \(W\text{?}\) If so, prove it. If not, produce a counter-example.
  3. Suppose that \(T\) is surjective. Must \(T(B)\) be a basis for \(W\text{?}\) If so, prove it. If not, produce a counter-example.
12.
Suppose that \(\{\bfv_1, \ldots, \bfv_n \}\) is a basis for a vector space \(V\text{.}\) Prove that
\begin{equation*} \{ \bfv_1 + \bfv_2, \bfv_2 + \bfv_3, \ldots, \bfv_{n-1} + \bfv_n, \bfv_n \} \end{equation*}
is also a basis for \(V\text{.}\)
Solution.
We let \(B = \{ \bfv_1 + \bfv_2, \bfv_2 + \bfv_3, \ldots, \bfv_{n-1} + \bfv_n, \bfv_n \}\text{.}\) We will first show that \(B\) is linearly independent.
We first suppose that \(c_1, \ldots, c_n\) are scalars such that
\begin{equation*} c_1(\bfv_1 + \bfv_2) + \cdots + c_{n-1}(\bfv_{n-1} + \bfv_n) + c_n\bfv_n = \bfo\text{.} \end{equation*}
Rearranging this equation, we see that it is equivalent to
\begin{equation*} c_1\bfv_1 + (c_1+c_2)\bfv_2 + \cdots + (c_{n-1} + c_n)\bfv_n = \bfo\text{.} \end{equation*}
However, since we were given that \(B' = \{\bfv_1, \ldots, \bfv_n \}\) is a basis for \(V\text{,}\) this means that the coefficients in this last equation must all be zero, since \(B'\) is linearly independent. This means that \(c_1 = 0\text{,}\) and then since we must also have \(c_1+c_2 = 0\text{,}\) we have \(c_2 = 0\text{,}\) and so on. The result is that \(c_i = 0\) for all \(i\text{,}\) \(i=1,\ldots,n\text{.}\) This proves that \(B\) is linearly independent.
We will now show that \(B\) spans \(V\text{.}\) Let \(\bfv \in V\text{.}\) We want to argue that \(\bfv\) can be written as a linear combination of the vectors in \(B\text{.}\) Since \(B'\) is a basis for \(V\text{,}\) there exist scalars \(d_1,\ldots, d_n\) such that
\begin{equation} \bfv = d_1\bfv_1 + \cdots + d_n\bfv_n\text{.}\tag{5.3} \end{equation}
We want to argue that we can always find scalars \(c_1, \ldots, c_n\) such that
\begin{equation*} \bfv = c_1(\bfv_1 + \bfv_2) + \cdots + c_n\bfv_n\text{.} \end{equation*}
This equation can be rewritten as
\begin{equation} \bfv = c_1\bfv_1 + (c_1+c_2)\bfv_2 + \cdots + (c_{n-1}+c_n)\bfv_n\text{,}\tag{5.4} \end{equation}
and by The Unique Representation Theorem (Theorem 5.2.11), we know that the coefficients on the right sides of (5.3) and (5.4) must be equal. Immediately we see that \(c_1 = d_1\) and then since we must have \(c_1 + c_2 = d_2\text{,}\) we conclude \(c_2 = d_2 - d_1\text{.}\) We can continue on in this way, eventually producing an expression for each \(c_i\) in terms of the \(d_i\) coefficients.
This proves that \(B\) spans \(V\) which concludes the proof that \(B\) is a basis of \(V\text{.}\)
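The recursion at the end of this argument, \(c_1 = d_1\) and \(c_i = d_i - c_{i-1}\) for \(i \ge 2\text{,}\) is concrete enough to test numerically. A small sympy sketch in \(\rr^4\text{,}\) with the standard basis playing the role of \(\{\bfv_1, \ldots, \bfv_4\}\) and made-up coordinates \(d_i\text{:}\)

from sympy import Matrix

n = 4
e = [Matrix.eye(n).col(j) for j in range(n)]          # v_i = standard basis
new = [e[i] + e[i+1] for i in range(n-1)] + [e[n-1]]  # v_i + v_{i+1}, and v_n
d = [3, -1, 4, 2]                                     # arbitrary coordinates

c = [d[0]]                                            # c_1 = d_1
for i in range(1, n):
    c.append(d[i] - c[-1])                            # c_i = d_i - c_{i-1}

v = sum((d[i]*e[i] for i in range(n)), Matrix.zeros(n, 1))
assert sum((c[i]*new[i] for i in range(n)), Matrix.zeros(n, 1)) == v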
13.
Prove or disprove: Every basis of \(P_2\) must contain a polynomial of degree 2, a polynomial of degree 1, and a constant polynomial.
14.
Write down a basis for \(M_2(\rr)\text{.}\) Prove that your set is a basis. (There is no need to prove that \(M_2(\rr)\) is a vector space as this was covered in Example 2.3.10.)