Section 5.3 Dimension

In this section we will define the dimension of a vector space, finally delivering on the promise made in the introduction to this chapter to describe an intrinsic quality of vector spaces that allows a comparison between spaces.

Subsection 5.3.1 The Dimension of a Vector Space

We are on the threshold of the definition of dimension. We will first present a result that connects (for a finite-dimensional space) the number of vectors needed for a spanning set to the concept of linear independence.

Lemma 5.3.1.

Suppose that \(V = \spn\{\bfv_1, \ldots, \bfv_m\}\) and that \(\{\bfw_1, \ldots, \bfw_n\}\) is a linearly independent set of vectors in \(V\text{.}\) Then \(n \le m\text{.}\)

Proof.

Since \(\bfw_j \in \spn\{\bfv_1, \ldots, \bfv_m\}\) for each \(j=1, \ldots, n\text{,}\) there exist scalars \(c_{ij} \in \ff\) such that
\begin{equation*} \bfw_j = c_{1j}\bfv_1 + \cdots + c_{mj}\bfv_m\text{.} \end{equation*}
We now consider the homogeneous linear system represented by the equation \(A\bfx = \bfo\text{,}\) where \(A\) is the \(m\times n\) matrix \(A = [c_{ij}]\text{.}\) If \(\bfx\) is a solution to this system, then we have
\begin{equation*} 0 = c_{i1}x_1 + \cdots + c_{in}x_n \end{equation*}
for all \(i = 1,\ldots, m\text{.}\) This means that we have
\begin{align*} \bfo \amp = \sum_{i=1}^m \left(\sum_{j=1}^n c_{ij}x_j \right) \bfv_i\\ \amp = \sum_{j=1}^n x_j \left(\sum_{i=1}^m c_{ij} \bfv_i \right)\\ \amp = \sum_{j=1}^n x_j \bfw_j\text{.} \end{align*}
Since \(\{\bfw_1, \ldots, \bfw_n \}\) is a linearly independent set, we must have \(x_j=0\) for all \(j\text{,}\) meaning that \(\bfx = \bfo\) is the only solution to the equation \(A\bfx = \bfo\text{.}\) By Corollary 1.3.7 (or, rather, the corresponding corollary to Theorem 2.2.3), this means that \(m \ge n\text{,}\) as desired.
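As a quick illustration of the lemma, note that \(\rr^2 = \spn\{\bfe_1, \bfe_2\}\text{,}\) so any linearly independent set in \(\rr^2\) contains at most two vectors. For example, the set
\begin{equation*} \left\{ \begin{bmatrix} 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \end{bmatrix}, \begin{bmatrix} 1 \\ 1 \end{bmatrix} \right\} \end{equation*}
must be linearly dependent. (Indeed, the third vector is the sum of the first two.)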
We will now use this lemma to prove a result related to dimension.

Theorem 5.3.2.

If, for every natural number \(n\text{,}\) the vector space \(V\) contains a linearly independent set of \(n\) vectors, then \(V\) is infinite-dimensional.

Proof.

We will prove the contrapositive. If \(V\) is finite-dimensional, then there exists a set \(V' = \{\bfv_1, \ldots, \bfv_m\}\) such that \(V = \spn(V')\text{.}\) By Lemma 5.3.1, for any \(n > m\text{,}\) \(V\) cannot contain a linearly independent subset of \(n\) vectors. Thus, there exists a natural number \(n\) such that \(V\) does not contain a linearly independent subset of that size. This completes the proof of the contrapositive.
This theorem allows us to present our first example of an infinite-dimensional vector space.

Example 5.3.3.

Let \(P\) be the vector space of all polynomials with real coefficients. (We do not restrict the degree of polynomials in \(P\text{.}\)) For any \(n \ge 1\text{,}\) \(P\) contains the linearly independent set
\begin{equation*} \{ 1, t, \ldots, t^n \}\text{.} \end{equation*}
Therefore, by Theorem 5.3.2, \(P\) is infinite-dimensional.
We now come to the bedrock result of this section, the result that makes the definition of dimension possible.

Theorem 5.3.4.

Let \(V\) be a finite-dimensional vector space. If \(\{\bfv_1, \ldots, \bfv_m \}\) and \(\{\bfw_1, \ldots, \bfw_n \}\) are both bases of \(V\text{,}\) then \(m=n\text{.}\)

Proof.

Since \(V \subseteq \spn\{\bfv_1, \ldots, \bfv_m \}\) and \(\{ \bfw_1, \ldots, \bfw_n \}\) is a linearly independent set, Lemma 5.3.1 implies that \(m \ge n\text{.}\) However, since \({V \subseteq \spn\{\bfw_1, \ldots, \bfw_n \}}\) and \(\{ \bfv_1, \ldots, \bfv_m \}\) is a linearly independent set, Lemma 5.3.1 also implies that \(n \ge m\text{.}\) Therefore, \(m=n\text{.}\)
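For example, \(\{\bfe_1, \bfe_2\}\) and
\begin{equation*} \left\{ \begin{bmatrix} 1 \\ 1 \end{bmatrix}, \begin{bmatrix} 1 \\ -1 \end{bmatrix} \right\} \end{equation*}
are two different bases of \(\rr^2\text{,}\) and each contains exactly two vectors.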
Even though a vector space may have a huge number of bases, all of those bases have the same size. This is a number intrinsic to the vector space, not to any specific basis of that vector space. This is what we mean by the dimension of a vector space.

Definition 5.3.5.

Let \(V\) be a finite-dimensional vector space. If \(V \neq \{\bfo\}\text{,}\) then the dimension of \(V\text{,}\) written \(\dim(V)\text{,}\) is the size of any basis of \(V\text{.}\) If \(\dim(V)=n\text{,}\) we say that \(V\) is \(n\)-dimensional.
If \(V = \{\bfo\}\text{,}\) then we define the dimension of \(V\) to be 0.
Two of the families of vector spaces we frequently discuss have easy-to-determine dimensions, as the next two examples illustrate.

Example 5.3.6.

Since \(\{\bfe_1, \ldots, \bfe_n \}\) is a basis for \(\ff^n\text{,}\) we have \(\dim(\ff^n)=n\text{.}\)

Example 5.3.7.

Since \(\{1, t, \ldots, t^n\}\) is a basis for \(P_n\text{,}\) we have \(\dim(P_n)=n+1\text{.}\)
The proofs of the next two results are a consequence of Lemma 5.3.1 and will appear in the exercises.
We will now begin to discuss dimension as a tool to compare vector spaces. Linear transformations are the main way we relate vector spaces to each other, so these next results rely on that machinery.

Theorem 5.3.10.

Let \(V\) and \(W\) be vector spaces over \(\ff\text{.}\) Suppose that \(V' = \{\bfv_1, \ldots, \bfv_n\}\) is a basis for \(V\) and that \(\bfw_1, \ldots, \bfw_n \in W\text{.}\) Then there exists a unique \(T \in L(V,W)\) such that \(T(\bfv_i)=\bfw_i\) for each \(i\text{.}\)

Proof.

Given \(\bfv \in V\text{,}\) there exists a unique linear combination
\begin{equation*} \bfv = \sum_{i=1}^n c_i \bfv_i \end{equation*}
by Theorem 5.2.11. We define the function \(T\) by
\begin{equation*} T(\bfv) = \sum_{i=1}^n c_i \bfw_i\text{.} \end{equation*}
In words, we send a vector \(\bfv\) to the linear combination of the \(\bfw_i\) vectors using the same weights as those needed to form \(\bfv\) from the basis \(V'\text{.}\) This gives \(T(\bfv_i)=\bfw_i\) for each \(i\text{,}\) so we need only show that \(T\) is a linear transformation and that it is unique.
Suppose that \(\bfu, \bfv \in V\) with
\begin{equation*} \bfv = \sum_{i=1}^n c_i \bfv_i \hspace{6pt} \text{and} \hspace{6pt} \bfu = \sum_{i=1}^n d_i \bfv_i\text{.} \end{equation*}
Then we have
\begin{align*} T(\bfu + \bfv) \amp = T \left( \sum_{i=1}^n (c_i+d_i)\bfv_i \right)\\ \amp = \sum_{i=1}^n (c_i+d_i)\bfw_i\\ \amp = \sum_{i=1}^n c_i\bfw_i + \sum_{i=1}^n d_i\bfw_i\\ \amp = T(\bfu) + T(\bfv)\text{.} \end{align*}
Now we let \(\bfv \in V\) and \(d \in \ff\text{.}\) Then, if
\begin{equation*} \bfv = \sum_{i=1}^n c_i \bfv_i\text{,} \end{equation*}
we have
\begin{align*} T(d\bfv) \amp = T\left( \sum_{i=1}^n (dc_i) \bfv_i \right)\\ \amp = \sum_{i=1}^n (dc_i) \bfw_i\\ \amp = d\sum_{i=1}^n c_i \bfw_i\\ \amp = dT(\bfv)\text{.} \end{align*}
We will complete the proof by justifying the claim that \(T\) is unique. Suppose that \(T' \in L(V,W)\) with \(T'(\bfv_i)=\bfw_i\) for each \(i\text{.}\) Then, if \(\bfv \in V\) with
\begin{equation*} \bfv = \sum_{i=1}^n c_i \bfv_i\text{,} \end{equation*}
we have
\begin{equation*} T'(\bfv) = T'\left( \sum_{i=1}^n c_i \bfv_i \right) = \sum_{i=1}^n c_i T'(\bfv_i) = \sum_{i=1}^n c_i \bfw_i\text{.} \end{equation*}
This shows that \(T'(\bfv)=T(\bfv)\) for every \(\bfv \in V\text{,}\) so \(T'=T\) and \(T\) is unique.
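To see the theorem in action, let \(V = \rr^2\) with basis \(\{\bfe_1, \bfe_2\}\text{,}\) let \(W = P_1\text{,}\) and choose \(\bfw_1 = 1\) and \(\bfw_2 = t\text{.}\) The unique linear transformation \(T \in L(\rr^2, P_1)\) with \(T(\bfe_1) = 1\) and \(T(\bfe_2) = t\) is given by
\begin{equation*} T\left( \begin{bmatrix} a \\ b \end{bmatrix} \right) = a + bt\text{.} \end{equation*}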
The notion of alikeness that we use in linear algebra is isomorphism: two vector spaces are essentially the same when they are isomorphic. The reader may wish to consult Definition 3.1.15 for a refresher.

Theorem 5.3.11.

Let \(V\) and \(W\) be vector spaces, let \(B = \{\bfv_1, \ldots, \bfv_n \}\) be a basis for \(V\text{,}\) and let \(T \in L(V,W)\text{.}\) Then \(T\) is an isomorphism if and only if \(T(B) = \{T(\bfv_1), \ldots, T(\bfv_n)\}\) is a basis for \(W\text{.}\)

Proof.

We first suppose that \(T\) is an isomorphism. We want to show that \(T(B)\) is a basis for \(W\text{,}\) so we begin with linear independence. Suppose that \(c_1, \ldots, c_n \in \ff\) are such that
\begin{equation*} \bfo = \sum_{i=1}^n c_i T(\bfv_i)\text{.} \end{equation*}
Then we have
\begin{equation*} \bfo = \sum_{i=1}^n T(c_i\bfv_i) = T\left( \sum_{i=1}^n c_i\bfv_i \right)\text{.} \end{equation*}
Since \(T\) is injective, by Theorem 3.4.7 we must have
\begin{equation*} \bfo = \sum_{i=1}^n c_i\bfv_i\text{.} \end{equation*}
But since \(B\) is a linearly independent set, we have \(c_i=0\) for all \(i\text{.}\) This proves that \(T(B)\) is linearly independent.
We now prove that \(T(B)\) spans \(W\text{.}\) Let \(\bfw \in W\text{.}\) Since \(T\) is surjective, there exists \(\bfv \in V\) such that \(T(\bfv)=\bfw\text{.}\) Since \(B\) is a basis for \(V\text{,}\) there exist scalars \(c_i \in \ff\) such that
\begin{equation*} \bfv = \sum_{i=1}^n c_i\bfv_i\text{.} \end{equation*}
Then
\begin{equation*} \bfw = T(\bfv) = T\left( \sum_{i=1}^n c_i\bfv_i \right) = \sum_{i=1}^n c_iT(\bfv_i)\text{.} \end{equation*}
This proves that \(W = \spn(T(B))\text{,}\) so \(T(B)\) is a basis for \(W\text{.}\)
We now need to prove the other implication, so we assume that \(T(B)\) is a basis for \(W\) and show that \(T\) is an isomorphism. To show that \(T\) is injective, suppose that \(\bfv \in V\) satisfies \(T(\bfv)=\bfo\text{.}\) Since \(B\) is a basis for \(V\text{,}\) there exist scalars \(c_i \in \ff\) such that
\begin{equation*} \bfv = \sum_{i=1}^n c_i\bfv_i\text{,} \end{equation*}
so
\begin{equation*} \bfo = T(\bfv) = T\left( \sum_{i=1}^n c_i\bfv_i \right) = \sum_{i=1}^n c_iT(\bfv_i)\text{.} \end{equation*}
But since \(T(B)\) is a linearly independent set by assumption, this implies that \(c_i=0\) for all \(i\text{.}\) This means that \(\bfv = \bfo\text{,}\) so \(T\) is injective.
To prove that \(T\) is surjective, we assume that \(\bfw \in W\text{.}\) Since \(T(B)\) spans \(W\text{,}\) we have
\begin{equation*} \bfw = \sum_{i=1}^n d_iT(\bfv_i) \end{equation*}
for some \(d_i \in \ff\text{.}\) We claim that if
\begin{equation*} \bfv = \sum_{i=1}^n d_i \bfv_i\text{,} \end{equation*}
then \(T(\bfv) = \bfw\text{.}\) Here is the justification:
\begin{equation*} T(\bfv) = T\left( \sum_{i=1}^n d_i \bfv_i \right) = \sum_{i=1}^n d_i T(\bfv_i) = \bfw\text{.} \end{equation*}
This proves that \(T\) is surjective and is thus an isomorphism.
When we view dimension as an intrinsic quality of a vector space that allows comparison between spaces, we find something surprising about vector spaces with the same dimension. They are essentially the same!

Theorem 5.3.13.

Let \(V\) and \(W\) be finite-dimensional vector spaces over \(\ff\text{.}\) Then \(V\) and \(W\) are isomorphic if and only if \(\dim(V) = \dim(W)\text{.}\)

Proof.

Suppose that \(\dim(V) = \dim(W) = n\text{.}\) Let \(\{\bfv_1, \ldots, \bfv_n\}\) be a basis for \(V\) and let \(\{\bfw_1, \ldots, \bfw_n \}\) be a basis for \(W\text{.}\) By Theorem 5.3.10, we can find \(T \in L(V,W)\) such that \(T(\bfv_i) = \bfw_i\) for each \(i\text{,}\) \(1 \le i \le n\text{.}\) Then by Theorem 5.3.11, \(T\) is an isomorphism.
To prove the claim in the other direction, suppose that \(T \in L(V,W)\) is an isomorphism. If \(\{\bfv_1, \ldots, \bfv_n \}\) is a basis for \(V\text{,}\) then \(\{T(\bfv_1), \ldots, T(\bfv_n) \}\) is a basis for \(W\) by Theorem 5.3.11. Thus \(\dim(V) = \dim(W)\text{.}\)
Here is an immediate consequence of this result.

Example 5.3.14.

Since \(P_2\) and \(\rr^3\) are both three-dimensional vector spaces over \(\rr\text{,}\) Theorem 5.3.13 tells us that \(\rr^3\) and \(P_2\) are isomorphic.
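Following the proof of Theorem 5.3.13, one explicit isomorphism \(T:\rr^3 \to P_2\) sends the standard basis of \(\rr^3\) to the basis \(\{1, t, t^2\}\) of \(P_2\text{:}\)
\begin{equation*} T\left( \begin{bmatrix} a \\ b \\ c \end{bmatrix} \right) = a + bt + ct^2\text{.} \end{equation*}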

Subsection 5.3.2 Dimension and Subspaces

If we know the dimension of a vector space, then we sometimes have a quicker path to finding a basis for that space. This next result says that if we have a spanning set of the same size as a basis, then it must be a basis.

Theorem 5.3.15.

Let \(V\) be a vector space with \(\dim(V) = n\text{,}\) and let \(B\) be a set of \(n\) vectors in \(V\) such that \(V = \spn(B)\text{.}\) Then \(B\) is a basis for \(V\text{.}\)

Proof.

By The Spanning Set Theorem (Theorem 5.2.12), we know that a subset \(B'\) of \(B\) will be a basis for \(V\text{.}\) But since \(\dim(V)=n\text{,}\) the size of \(B'\) must be \(n\text{.}\) Therefore, \(B'=B\) and \(B\) is a basis for \(V\text{.}\)
What is true in Theorem 5.3.15 for a spanning set is also true for a linearly independent set. To prove that, however, we first need the analog to The Spanning Set Theorem for linearly independent sets.

Theorem 5.3.16.

Let \(V\) be a finite-dimensional vector space. Then any linearly independent set of vectors in \(V\) can be extended to a basis of \(V\text{.}\)

Proof.

Let \(V' = \{\bfv_1, \ldots, \bfv_n \}\) be a linearly independent set of vectors in \(V\text{.}\) If \(V = \spn(V')\text{,}\) then \(V'\) is a basis and we are done. If \(V \neq \spn(V')\text{,}\) then there exists some vector \(\bfv_{n+1} \in V - \spn(V')\text{.}\) By the Linear Dependence Lemma (Theorem 5.1.19), the set \(V_1 = V' \cup \{\bfv_{n+1} \}\) is linearly independent.
We can repeat this process. If \(V = \spn(V_1)\text{,}\) we are done; otherwise, we create \(V_2 = V_1 \cup \{\bfv_{n+2} \}\) in the same fashion that we created \(V_1\text{.}\) We can continue doing this, adding one vector at a time to this set and maintaining linear independence. Eventually we must reach the point where \(V = \spn\{ \bfv_1, \ldots, \bfv_n, \bfv_{n+1}, \ldots, \bfv_{n+k} \}\text{,}\) since otherwise \(V\) would contain arbitrarily large linearly independent sets, and Lemma 5.3.1 would then imply that \(V\) is infinite-dimensional.
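For example, the linearly independent set containing the single vector \(\bfv_1 = \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}\) in \(\rr^3\) can be extended in this way: since \(\bfe_2 \not\in \spn\{\bfv_1\}\) and \(\bfe_3 \not\in \spn\{\bfv_1, \bfe_2\}\text{,}\) the set \(\{\bfv_1, \bfe_2, \bfe_3\}\) is linearly independent, and it spans \(\rr^3\text{,}\) so it is a basis.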
We now have the machinery necessary to state the following theorem. The proof will appear in the exercises.

Theorem 5.3.17.

Let \(V\) be a vector space with \(\dim(V) = n\text{,}\) and let \(B\) be a linearly independent set of \(n\) vectors in \(V\text{.}\) Then \(B\) is a basis for \(V\text{.}\)
The final result of this section collects some facts about dimension and subspaces which we will use in some of the sections that follow.

Theorem 5.3.18.

Let \(V\) be a finite-dimensional vector space and let \(U\) be a subspace of \(V\text{.}\) Then

  1. \(U\) is finite-dimensional;
  2. \(\dim(U) \le \dim(V)\text{;}\) and
  3. \(\dim(U) = \dim(V)\) if and only if \(U = V\text{.}\)

Proof.

We will prove these facts in order. If the subspace \(U\) is \(\{\bfo \}\text{,}\) then we have nothing to prove. If not, then there is some non-zero vector \(\bfu_1 \in U\text{.}\) If \(U = \spn\{\bfu_1\}\text{,}\) we are done; if not, then there exists \(\bfu_2 \in U - \spn\{\bfu_1\}\text{.}\) By Theorem 5.1.19, the set \(\{\bfu_1, \bfu_2\}\) is linearly independent. We can continue to repeat this process. At each stage we have a linearly independent set \({U_k = \{\bfu_1, \ldots, \bfu_k\}}\text{,}\) and this cannot continue indefinitely, since \(U\) sits inside the finite-dimensional space \(V\) and Lemma 5.3.1 bounds the size of any linearly independent set in \(V\text{.}\) Thus this process must eventually stop when \(U = \spn(U_j)\) for some \(j\text{,}\) and that proves that \(U\) is finite-dimensional.
The space \(U\) is finite-dimensional, so it has a basis \(B\text{.}\) This is a linearly independent set of vectors in \(V\text{,}\) so Theorem 5.3.16 says that \(B\) can be extended to a basis \(B'\) of \(V\text{.}\) This means that \(B'\) will have at least as many vectors in it as \(B\text{,}\) so \(\dim(V) \ge \dim(U)\text{.}\)
If \(U=V\) it is obvious that \(\dim(U)=\dim(V)\text{,}\) so we only need to prove the claim in the other direction. We will prove the contrapositive, so we assume \(U \neq V\text{.}\) Let \(B = \{\bfu_1, \ldots, \bfu_n \}\) be a basis for \(U\text{.}\) Since \(U \neq V\text{,}\) there exists a vector \(\bfv_1 \in V - \spn(B)\text{.}\) By Theorem 5.1.19 the set \(\{\bfu_1, \ldots, \bfu_n, \bfv_1 \}\) is linearly independent in \(V\text{,}\) implying that \(\dim(V) \ge n+1\text{.}\) Therefore \({\dim(U) \neq \dim(V)}\text{.}\)

Example 5.3.19.

We can apply this latest result to the vector space \(\rr^3\text{.}\) The familiar geometric subspaces listed below turn out to be all of the subspaces of \(\rr^3\text{.}\)
  1. The only subspace of dimension 0 in \(\rr^3\) is the zero subspace \(\{\bfo \}\text{.}\)
  2. One-dimensional subspaces of \(\rr^3\) are lines through the origin. These can all be written as the span of a single (non-zero) vector.
  3. Two-dimensional subspaces of \(\rr^3\) are planes through the origin. These are all spanned by sets of two linearly independent vectors.
  4. The only three-dimensional subspace of \(\rr^3\) is \(\rr^3\) itself.

Subsection 5.3.3 Reading Questions

1.

Consider the following vectors in \(\rr^2\text{:}\)
\begin{equation*} \bfv_1 = \begin{bmatrix} 3 \\ 2 \end{bmatrix} \hspace{6pt} \text{and} \hspace{6pt} \bfv_2 = \begin{bmatrix} -2 \\ 1 \end{bmatrix}\text{.} \end{equation*}
By inspection, why is the set \(\{\bfv_1, \bfv_2 \}\) a basis for \(\rr^2\text{?}\) Explain your answer.

2.

Let \(\bfv_1\text{,}\) \(\bfv_2\text{,}\) \(\bfv_3\text{,}\) and \(\bfv_4\) be vectors in \(\rr^3\text{.}\)
  1. The set \(S = \{\bfv_1, \bfv_2, \bfv_3, \bfv_4 \}\) is not a basis for \(\rr^3\text{,}\) and there’s a very short argument why. What is that argument?
  2. Must there be a subset of \(S\) which is a basis of \(\rr^3\text{?}\) Why or why not?

Subsection 5.3.4 Exercises

1.

Find the dimension of the subspace of \(\rr^3\) consisting of all vectors whose first and third coordinates are equal.
Answer.
The dimension is 2.
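Every vector in this subspace has the form
\begin{equation*} \begin{bmatrix} a \\ b \\ a \end{bmatrix} = a\begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} + b\begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}\text{,} \end{equation*}
and the two vectors on the right form a linearly independent set, so they are a basis for the subspace.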

2.

For each of the following sets of vectors in the given vector space, find the dimension of the subspace spanned by that set of vectors.
  1. \(\{\bfv_1, \bfv_2, \bfv_3, \bfv_4\}\) in \(\rr^3\) if
    \begin{equation*} \bfv_1 = \begin{bmatrix} 2 \\ -3 \\ 3 \end{bmatrix}, \hspace{6pt} \bfv_2 = \begin{bmatrix} 3 \\ 5 \\ -3 \end{bmatrix}, \hspace{6pt} \bfv_3 = \begin{bmatrix} 9 \\ -4 \\ 6 \end{bmatrix}, \hspace{6pt} \bfv_4 = \begin{bmatrix} 6.5 \\ -5 \\ 6 \end{bmatrix} \end{equation*}
  2. \(\{\bfv_1, \bfv_2, \bfv_3, \bfv_4\}\) in \(\ff_5^3\) if
    \begin{equation*} \bfv_1 = \begin{bmatrix} 2 \\ 0 \\ 0 \end{bmatrix}, \hspace{6pt} \bfv_2 = \begin{bmatrix} 0 \\ 3 \\ 2 \end{bmatrix}, \hspace{6pt} \bfv_3 = \begin{bmatrix} 4 \\ 2 \\ 3 \end{bmatrix}, \hspace{6pt} \bfv_4 = \begin{bmatrix} 3 \\ 1 \\ 1 \end{bmatrix} \end{equation*}

3.

For each of the following matrices \(A\text{,}\) determine the dimensions of \(\nll(A)\) and \(\col(A)\text{.}\)
  1. \(A \in M_{4,5}(\rr)\text{,}\)
    \begin{equation*} A = \begin{bmatrix} 1 \amp -2.5 \amp 4 \amp 1 \amp 2.5 \\ -2.5 \amp 6 \amp -9.5 \amp -4.5 \amp -8 \\ 5 \amp 3 \amp -11 \amp 3 \amp -5 \\ 2 \amp -1 \amp 0 \amp -4 \amp -5 \end{bmatrix} \end{equation*}
  2. \(A \in M_{2,4}(\ff_5)\text{,}\)
    \begin{equation*} A = \begin{bmatrix} 3 \amp 1 \amp 3 \amp 0 \\ 1 \amp 2 \amp 0 \amp 1 \end{bmatrix} \end{equation*}

4.

Determine whether the following statements are true or false. Justify your answer either way.
  1. If a set \(\{\bfv_1, \ldots, \bfv_m \}\) spans a finite-dimensional space \(V\text{,}\) and if \(V'\) is a set of more than \(m\) vectors in \(V\text{,}\) then \(V'\) is linearly dependent.
  2. The vector space \(\rr^2\) is a subspace of \(\rr^3\text{.}\)
  3. A vector space is infinite-dimensional if it is spanned by an infinite set.
Answer.
  1. This is true by Lemma 5.3.1.
  2. This is false, as \(\rr^2\) is not even a subset of \(\rr^3\text{.}\)
  3. This is false. The vector space \(\rr^2\) is a counter-example, as we know that \(\dim(\rr^2)=2\) but it is spanned by an infinite set (the entire vector space).

5.

Determine whether the following statements are true or false. Justify your answer either way.
  1. If \(\dim(V) = m\text{,}\) then there exists a spanning set of \(m+1\) vectors in \(V\text{.}\)
  2. If every set of \(m\) vectors in \(V\) fails to span \(V\text{,}\) then \(\dim(V) > m\text{.}\)
  3. If \(m \ge 2\) and \(\dim(V) = m\text{,}\) then every set of \(m-1\) non-zero vectors in \(V\) is linearly independent.

6.

The first four Hermite polynomials are \(1\text{,}\) \(2t\text{,}\) \(-2 + 4t^2\text{,}\) and \(-12t+8t^3\text{.}\) Show that the set of these polynomials is a basis for \(P_3\text{.}\)
Answer.
Since \(\dim(P_3)=4\) and this is a set of four polynomials, we only need to argue that this set is linearly independent (see Theorem 5.3.17). If we label the polynomials as \(p_1 = 1\text{,}\) \(p_2 = 2t\text{,}\) \(p_3=-2+4t^2\text{,}\) and \(p_4 = -12t+8t^3\text{,}\) then we can argue that the set \(\{p_1,p_2,p_3,p_4\}\) is linearly independent by the contrapositive of the Linear Dependence Lemma. The set containing only \(p_1\) is linearly independent since \(p_1 \neq 0\text{.}\) Then the set \(\{p_1,p_2\}\) is linearly independent since neither polynomial is a scalar multiple of the other. Then since \(p_3\) cannot be a linear combination of \(p_1\) and \(p_2\) for degree reasons, \(\{p_1,p_2,p_3\}\) is linearly independent. Similarly, since \(p_4\) cannot be a linear combination of \(p_1\text{,}\) \(p_2\text{,}\) and \(p_3\) for degree reasons, \(\{p_1,p_2,p_3,p_4\}\) is linearly independent.

7.

The first four Laguerre polynomials are \(1\text{,}\) \(1-t\text{,}\) \(2-4t+t^2\text{,}\) and \({6-18t+9t^2-t^3}\text{.}\) Show that the set of these polynomials is a basis for \(P_3\text{.}\)

Writing Exercises

8.
Let \(A\) be a matrix.
  1. Prove that \(\dim(\nll(A))\) is the number of non-pivot columns in \(A\text{.}\)
  2. Prove that \(\dim(\col(A))\) is the number of pivot columns of \(A\text{.}\)
Answer.
  1. The null space of a matrix has a basis vector for each column in the RREF which does not contain a pivot. Therefore, the dimension of the null space is the number of non-pivot columns.
  2. By Algorithm 5.2.14, we see that there is a vector in the basis for \(\col(A)\) for each pivot in the RREF of \(A\text{.}\) Therefore, \(\dim(\col(A))\) is the number of pivot columns of \(A\text{.}\)
9.
Let \(V\) be the set of all functions \(\rr\to\rr\text{.}\) Prove that \(V\) is infinite-dimensional.
Answer.
Since \(V\) contains the vector space of all polynomials as a subspace, and since the vector space of all polynomials is infinite-dimensional (see Example 5.3.3), \(V\) must be infinite-dimensional as well.
10.
Suppose that \(T:V\to W\) is a linear transformation between vector spaces and that \(V\) is finite-dimensional. Prove that \(\dim(\range(T)) \le \dim(V)\text{.}\)
11.
Prove that \(\cc^2\) is two-dimensional as a vector space over \(\cc\) but four-dimensional as a vector space over \(\rr\text{.}\)