Subspace and Dimension

When a subset of a vector space is closed under the addition and scalar multiplication of that space, it is itself a vector space. Such a vector space is called a subspace.

Definition 1.9  

Let $V$ be a vector space and $W$ a nonempty subset of $V$. Then $W$ is a subspace of $V$ if and only if

1. ${\bf w}_{1}, {\bf w}_{2}$ in $W$ implies that the sum ${\bf w}_{1} + {\bf w}_{2}$ is also in $W$;

2. ${\bf w}$ in $W$ implies that the scalar multiple $\alpha {\bf w}$ is in $W$ for every scalar $\alpha$.

A subspace is itself a vector space. In other words, it satisfies properties 1 through 9 of a vector space. In the definition of a subspace we checked only the closure properties; the other properties are inherited from the vector space $V$. There is a quick way to create a vector space.

The set of all linear combinations of the vectors ${\bf v}_{1},{\bf v}_{2}, {\bf v}_{3}, \ldots , {\bf v}_{n}$ is written $\langle{\bf v}_{1},{\bf v}_{2}, {\bf v}_{3}, \ldots , {\bf v}_{n}\rangle$. It is called the linear span of the vectors ${\bf v}_{1},{\bf v}_{2}, {\bf v}_{3}, \ldots , {\bf v}_{n}$. This set is a vector space.

Example 1.20  

Show that $\langle{\bf v}_{1},{\bf v}_{2}, {\bf v}_{3}, \ldots , {\bf v}_{n}\rangle$ is a vector space.

Answer Let ${\bf v},{\bf w}$ be elements of $\langle{\bf v}_{1},{\bf v}_{2}, {\bf v}_{3}, \ldots , {\bf v}_{n}\rangle$. Then

\begin{align*}
{\bf v} &= c_{1}{\bf v}_{1} + c_{2}{\bf v}_{2} + c_{3}{\bf v}_{3} + \cdots + c_{n}{\bf v}_{n},\\
{\bf w} &= d_{1}{\bf v}_{1} + d_{2}{\bf v}_{2} + d_{3}{\bf v}_{3} + \cdots + d_{n}{\bf v}_{n}.
\end{align*}

Then

$\displaystyle {\bf v} + {\bf w} = (c_{1}+d_{1}){\bf v}_{1} + (c_{2}+d_{2}){\bf v}_{2} + \cdots + (c_{n}+d_{n}){\bf v}_{n},$

$\displaystyle \alpha {\bf v} = (\alpha c_{1}){\bf v}_{1} + (\alpha c_{2}){\bf v}_{2} + \cdots + (\alpha c_{n}){\bf v}_{n}. $

Thus, ${\bf v} + {\bf w}$ and $\alpha {\bf v}$ are in $\langle{\bf v}_{1},{\bf v}_{2}, {\bf v}_{3}, \ldots , {\bf v}_{n}\rangle$. $ \blacksquare$
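This membership question can also be checked numerically. The following Python/NumPy sketch (the helper name in_span and the rank criterion are our own illustration, not from the text) tests whether ${\bf b} \in \langle{\bf v}_{1},\ldots,{\bf v}_{n}\rangle$: appending ${\bf b}$ as a column does not increase the matrix rank exactly when ${\bf b}$ is a linear combination of the ${\bf v}_{i}$.

\begin{verbatim}
import numpy as np

def in_span(vectors, b, tol=1e-10):
    """True if b is a linear combination of the given vectors."""
    A = np.column_stack(vectors)
    Ab = np.column_stack(vectors + [b])
    return np.linalg.matrix_rank(Ab, tol=tol) == np.linalg.matrix_rank(A, tol=tol)

v1 = np.array([1.0, 2.0, 3.0])
v2 = np.array([0.0, 1.0, 1.0])
print(in_span([v1, v2], 2*v1 - v2))                  # True: a linear combination
print(in_span([v1, v2], np.array([1.0, 0.0, 0.0])))  # False: outside the span
\end{verbatim}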

Example 1.21  

Show that $C[a,b]$ is a subspace of $PC[a,b]$.

Answer Suppose that $f(x),g(x) \in C[a,b]$. Then, since the sum of continuous functions is continuous and a scalar multiple of a continuous function is continuous, we have $f(x) + g(x) \in C[a,b]$ and $\alpha f(x) \in C[a,b]$. $ \blacksquare$

Example 1.22  

Let $U,W$ be subspaces of a vector space $V$. Show that the intersection $U \cap W$ and the sum $U + W$ are subspaces of $V$. Note that

$\displaystyle U \cap W = \{{\bf v} : {\bf v} \in U \mbox{ and } {\bf v} \in W \}, $

$\displaystyle U + W = \{{\bf u} + {\bf w} : {\bf u} \in U, {\bf w} \in W \}. $

Answer Let ${\bf u},{\bf w} \in U \cap W$. Then ${\bf u},{\bf w} \in U$ and ${\bf u},{\bf w} \in W$. Thus, by the closure property of each subspace,

$\displaystyle {\bf u} + {\bf w} \in U \cap W. $

Also, $\alpha {\bf u} \in U$ and $\alpha {\bf u} \in W$. Thus, we have

$\displaystyle \alpha {\bf u} \in U \cap W. $

Therefore, $U \cap W$ is a subspace of $V$.

Suppose that ${\bf u},{\bf w} \in U + W$. Then ${\bf u} = {\bf u}_{1} + {\bf w}_{1}$ and ${\bf w} = {\bf u}_{2} + {\bf w}_{2}$ for some ${\bf u}_{1},{\bf u}_{2} \in U$ and ${\bf w}_{1},{\bf w}_{2} \in W$. Thus,

$\displaystyle {\bf u} + {\bf w} = ({\bf u}_{1} + {\bf w}_{1}) + ({\bf u}_{2} + {\bf w}_{2}) = ({\bf u}_{1} + {\bf u}_{2}) + ({\bf w}_{1} + {\bf w}_{2}) \in U + W, $

$\displaystyle \alpha {\bf u} = \alpha ({\bf u}_{1} + {\bf w}_{1}) = \alpha {\bf u}_{1} + \alpha {\bf w}_{1} \in U + W. $

Therefore, $U + W$ is a subspace of $V$. $ \blacksquare$

For $U + W$, if every element ${\bf v}$ of $U + W$ is expressed uniquely in the form ${\bf v} = {\bf u} + {\bf w}  ({\bf u} \in U, {\bf w} \in W)$, then we say $U + W$ is a direct sum of $U$ and $W$, denoted by $U \oplus W$.
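Whether a sum is direct can be tested by dimensions: the sum is direct exactly when $\dim(U+W) = \dim U + \dim W$, which by Theorem 1.6 below is equivalent to $U \cap W = \{{\bf0}\}$. A minimal NumPy sketch of this test (the helper name is_direct_sum is our own choice), with each subspace given by a basis:

\begin{verbatim}
import numpy as np

def is_direct_sum(U_basis, W_basis, tol=1e-10):
    """True if <U_basis> + <W_basis> is a direct sum.

    Assumes each input list is a basis, so dim = number of vectors."""
    r = np.linalg.matrix_rank(np.column_stack(U_basis + W_basis), tol=tol)
    return r == len(U_basis) + len(W_basis)

e1, e2, e3 = np.eye(3)
print(is_direct_sum([e1], [e2, e3]))      # True:  R^3 = <e1> (+) <e2, e3>
print(is_direct_sum([e1, e2], [e2, e3]))  # False: the intersection contains e2
\end{verbatim}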

$\spadesuit$Basis $\spadesuit$

Definition 1.10  

If a set of vectors $\{{\bf v}_{1},{\bf v}_{2},\ldots,{\bf v}_{n}\}$ has the following two properties, then it is said to be a basis of the vector space $V$.
1. ${\{{\bf v}_{i}\}}_{i=1}^{n}$ are linearly independent.
2. Every vector ${\bf v}$ of $V$ can be expressed as a linear combination of ${\{{\bf v}_{i}\}}_{i=1}^{n}$. That is, $V = \langle{\bf v}_{1},{\bf v}_{2}, \ldots , {\bf v}_{n}\rangle $. In this case, we say that ${\{{\bf v}_{i}\}}_{i=1}^{n}$ spans the vector space.

Note that not every set of linearly independent vectors is a basis. For example, consider the set of vectors $\{{\bf i},{\bf j}\}$ in $R^3$. The set $\{{\bf i},{\bf j}\}$ is linearly independent, but there are no real values $c_{1},c_{2}$ with ${\bf k} = c_{1}{\bf i} + c_{2}{\bf j}$.

Let the largest number of linearly independent vectors in $\{{\bf v}_{1},{\bf v}_{2},\ldots,{\bf v}_{n}\}$ be $r$, and relabel so that $\{{\bf v}_{1}, {\bf v}_{2}, \ldots, {\bf v}_{r}\}$ is linearly independent. Then each of the remaining vectors ${\bf v}_{r+1},\ldots,{\bf v}_{n}$ is linearly dependent on

$\displaystyle \{{\bf v}_{1}, {\bf v}_{2}, \ldots, {\bf v}_{r}\}$

Then by Theorem 1.3, every vector is a linear combination of $\{{\bf v}_{1}, {\bf v}_{2}, \ldots, {\bf v}_{r}\}$. Thus, we have the next theorem.

$\spadesuit$Dimension $\spadesuit$

Theorem 1.5  

Suppose that $V = \langle{\bf v}_{1},{\bf v}_{2}, \ldots , {\bf v}_{n}\rangle $. Choose a maximal linearly independent set $\{{\bf v}_{1}, {\bf v}_{2}, \ldots, {\bf v}_{r}\}$ from $\{{\bf v}_{1},{\bf v}_{2},\ldots,{\bf v}_{n}\}$ (relabeling if necessary). Then we have

$\displaystyle \langle{\bf v}_{1},{\bf v}_{2},\ldots,{\bf v}_{n}\rangle = \langle{\bf v}_{1},{\bf v}_{2},\ldots,{\bf v}_{r}\rangle $

In this case, $\{{\bf v}_{1}, {\bf v}_{2}, \ldots, {\bf v}_{r}\}$ is a basis of $V$, and $r$ is called the dimension of $V$, denoted by $\dim V = r$.
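Theorem 1.5 suggests a procedure for extracting a basis from a spanning set: scan the vectors in order and keep each one that is not a linear combination of those already kept. The following Python/NumPy sketch illustrates the idea (the function name extract_basis and the use of matrix rank as the independence test are our own choices, not from the text); the number of survivors is the dimension of the span.

\begin{verbatim}
import numpy as np

def extract_basis(vectors, tol=1e-10):
    """Keep each vector that raises the rank; the survivors form a basis."""
    basis, rank = [], 0
    for v in vectors:
        r = np.linalg.matrix_rank(np.column_stack(basis + [v]), tol=tol)
        if r > rank:
            basis, rank = basis + [v], r
    return basis

v1, v2 = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
spanning = [v1, v2, v1 + v2, np.array([0.0, 0.0, 1.0])]
print(len(extract_basis(spanning)))   # 3: the dimension of the span
\end{verbatim}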

Example 1.23  

Let $S$ be the set of vectors lying on the plane $x + y + z = 0$. Then

$\displaystyle S = \{(x,y,-x-y) : x,y \in R\}. $

Find the basis of $S$ and dimension of $S$.

Answer We first show that $S = \{(x,y,-x-y) : x,y \in R\}$ is a subspace of $R^3$. Let ${\bf s}_{1},{\bf s}_{2}$ be elements of $S$. Then we can write ${\bf s}_{1} = (x_{1},y_{1},-x_{1}-y_{1}), {\bf s}_{2} = (x_{2},y_{2},-x_{2}-y_{2})$. Thus,

$\displaystyle {\bf s}_{1}+{\bf s}_{2} = (x_{1}+x_{2},y_{1}+y_{2},-(x_{1}+x_{2})-(y_{1}+y_{2})), $

$\displaystyle \alpha{\bf s}_{1} = \alpha(x_{1},y_{1},(-x_{1}-y_{1})) = (\alpha x_{1}, \alpha y_{1}, -\alpha x_{1}-\alpha y_{1}). $

Therefore, $S$ is a subspace of $R^3$.

Next, let ${\bf s}$ be an element of $S$. We express ${\bf s}$ using ${\bf i},{\bf j},{\bf k}$. Since ${\bf s} = (x,y,-x-y)$, we can write

$\displaystyle {\bf s} = x{\bf i} + y{\bf j} -(x+y){\bf k} = x({\bf i} - {\bf k}) + y({\bf j} - {\bf k}). $

Thus, every vector in $S$ can be expressed as a linear combination of ${\bf i} - {\bf k}$ and ${\bf j} - {\bf k}$. Also, ${\bf i} - {\bf k}$ and ${\bf j} - {\bf k}$ are linearly independent. This shows that $\{{\bf i} - {\bf k}, {\bf j} - {\bf k}\}$ is a basis of $S$. Therefore, $\dim S = 2$. $ \blacksquare$
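The computation in Example 1.23 can be replayed numerically. In the sketch below (NumPy, our own illustration), ${\bf i}-{\bf k}$ and ${\bf j}-{\bf k}$ are confirmed independent, and a sample combination is confirmed to satisfy $x + y + z = 0$:

\begin{verbatim}
import numpy as np

b1 = np.array([1.0, 0.0, -1.0])   # i - k
b2 = np.array([0.0, 1.0, -1.0])   # j - k

# Independent: the matrix with columns b1, b2 has rank 2, so dim S = 2.
print(np.linalg.matrix_rank(np.column_stack([b1, b2])))

# A generic element of S is x*b1 + y*b2; its coordinates sum to 0,
# so it lies on the plane x + y + z = 0.
x, y = 2.0, -5.0
s = x*b1 + y*b2
print(s, s.sum())
\end{verbatim}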

Before moving to the next section, we study the dimensions of the sum and intersection of subspaces. The proof is left as an exercise (Problem 7 of the exercises below).

Theorem 1.6  

Let $U,W$ be subspaces of a vector space $V$. Then

$\displaystyle \dim (U + W) = \dim U + \dim W - \dim(U \cap W). $
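The formula can be checked on concrete subspaces. In the following NumPy sketch (our own illustration; the nullity criterion for $\dim(U \cap W)$ is a standard fact not stated in the text), a vector common to $U$ and $W$ satisfies $U{\bf a} = W{\bf b}$, so the solutions $({\bf a},{\bf b})$ of $[U \mid -W]({\bf a},{\bf b})^{T} = {\bf0}$ parameterize the intersection:

\begin{verbatim}
import numpy as np

U = np.column_stack([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])  # xy-plane, dim U = 2
W = np.column_stack([[0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])  # yz-plane, dim W = 2

dim_sum = np.linalg.matrix_rank(np.hstack([U, W]))       # dim(U + W) = 3

# Solutions of [U | -W](a, b)^T = 0 satisfy U a = W b, and so
# parameterize the intersection; its dimension is the nullity of [U | -W].
M = np.hstack([U, -W])
dim_cap = M.shape[1] - np.linalg.matrix_rank(M)          # dim of intersection = 1

print(dim_sum == 2 + 2 - dim_cap)                        # True
\end{verbatim}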

$\spadesuit$Gram-Schmidt Orthonormalization $\spadesuit$

We have learned in 1.2 how to create an orthonormal system from an orthogonal system. In this section, we learn how to create an orthonormal system from linearly independent vectors. Given $m$ vectors ${\bf v}_{1},{\bf v}_{2}, \ldots ,{\bf v}_{m}$, suppose that the following holds:

$\displaystyle ({\bf v}_{i},{\bf v}_{j}) = \left \{ \begin{array}{cl}
0, & i \neq j \\
1, & i = j
\end{array}\right.
= \delta_{ij}. $

Then the $m$ vectors form an orthonormal system. The symbol $\delta_{ij}$ here is called the Kronecker delta. We have seen a few examples of orthonormal systems in 1.2; on close inspection, those examples are all linearly independent.
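In matrix form, the condition $({\bf v}_{i},{\bf v}_{j}) = \delta_{ij}$ says that the Gram matrix $V^{T}V$ is the identity, where the columns of $V$ are the ${\bf v}_{i}$. A quick NumPy check on a sample orthonormal system (the vectors are our own choice):

\begin{verbatim}
import numpy as np

# Columns of V are the vectors v_1, v_2, v_3 of a (sample) orthonormal system.
V = np.column_stack([
    [1.0, 0.0, 0.0],
    np.array([0.0, 1.0, 1.0]) / np.sqrt(2),
    np.array([0.0, 1.0, -1.0]) / np.sqrt(2),
])
# Gram matrix: entry (i, j) is the inner product (v_i, v_j).
print(np.allclose(V.T @ V, np.eye(3)))   # True: (v_i, v_j) = delta_ij
\end{verbatim}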

Theorem 1.7  

If a set of vectors $\{{\bf v}_{1},{\bf v}_{2},\ldots,{\bf v}_{m}\}$ forms an orthonormal system, then it is linearly independent.

Proof Let $c_{1}{\bf v}_{1} + c_{2}{\bf v}_{2} + \cdots + c_{m}{\bf v}_{m} = {\bf0}$ and take the inner product with ${\bf v}_{1}$. Then

\begin{align*}
0 = ({\bf v}_{1},{\bf0}) &= ({\bf v}_{1},c_{1}{\bf v}_{1} + c_{2}{\bf v}_{2} + \cdots + c_{m}{\bf v}_{m})\\
&= ({\bf v}_{1},c_{1}{\bf v}_{1}) + ({\bf v}_{1},c_{2}{\bf v}_{2}) + \cdots + ({\bf v}_{1},c_{m}{\bf v}_{m})\\
&= c_{1}({\bf v}_{1},{\bf v}_{1})\\
&= c_{1}.
\end{align*}

Similarly, the remaining $c_{i}$ are all 0. Thus, $\{{\bf v}_{1},{\bf v}_{2},\ldots,{\bf v}_{m}\}$ is linearly independent. $ \blacksquare$

Conversely, given a set of linearly independent vectors, is it possible to create an orthonormal system?

Suppose that a set of vectors ${\mathbf x}_{1}, {\mathbf x}_{2}, \ldots, {\mathbf x}_{m}$ is linearly independent. Then every ${\mathbf x}_{i} \neq {\bf0}$ (why?). Now let $\displaystyle{\frac{{\mathbf x}_{1}}{\Vert{\mathbf x}_{1}\Vert} = {\bf v}_{1}}$. Then ${\bf v}_{1}$ is a unit vector. Next, in the plane spanned by ${\bf v}_{1}$ and ${\mathbf x}_{2}$, we look for a vector orthogonal to ${\bf v}_{1}$: let ${\bf w}_{2} = c_{1}{\bf v}_{1} + c_{2}{\mathbf x}_{2}$ be such a vector. Then

\begin{align*}
({\bf w}_{2},{\bf v}_{1}) &= (c_{1}{\bf v}_{1} + c_{2}{\mathbf x}_{2},{\bf v}_{1})\\
&= c_{1} + c_{2}({\mathbf x}_{2},{\bf v}_{1})\\
&= 0.
\end{align*}

Thus, $c_{1} = -c_{2}({\mathbf x}_{2},{\bf v}_{1})$. Therefore,

$\displaystyle {\bf w}_{2} = c_{2}({\mathbf x}_{2} - ({\mathbf x}_{2},{\bf v}_{1}){\bf v}_{1}).$

Since ${\mathbf x}_{1},{\mathbf x}_{2}$ are linearly independent, we have ${\mathbf x}_{2} - ({\mathbf x}_{2},{\bf v}_{1}){\bf v}_{1} \neq {\bf0}$. Then for

$\displaystyle {\bf v}_{2} = \frac{{\bf w}_{2}}{\Vert{\bf w}_{2}\Vert} = \frac{{\mathbf x}_{2} - ({\mathbf x}_{2},{\bf v}_{1}){\bf v}_{1}}{\Vert{\mathbf x}_{2} - ({\mathbf x}_{2},{\bf v}_{1}){\bf v}_{1}\Vert} $

we have ${\bf v}_{1} \perp {\bf v}_{2}$ and $\Vert{\bf v}_{1}\Vert = \Vert{\bf v}_{2}\Vert = 1$.

Next we find the unit vector ${\bf v}_{3}$ which is orthogonal to both ${\bf v}_{1}$ and ${\bf v}_{2}$. Consider a linear combination of ${\bf v}_{1},{\bf v}_{2}$, and ${\mathbf x}_{3}$, i.e., ${\bf w}_{3} = d_{1}{\bf v}_{1} + d_{2}{\bf v}_{2} + d_{3}{\mathbf x}_{3}$. Then

$\displaystyle ({\bf w}_{3},{\bf v}_{1}) = ({\bf w}_{3},{\bf v}_{2}) = 0$

implies that

$\displaystyle d_{1} + d_{3}({\mathbf x}_{3},{\bf v}_{1}) = 0,  d_{2} + d_{3}({\mathbf x}_{3},{\bf v}_{2}) = 0. $

Thus,

$\displaystyle {\bf w}_{3} = d_{3}({\mathbf x}_{3} - ({\mathbf x}_{3},{\bf v}_{1}){\bf v}_{1} - ({\mathbf x}_{3},{\bf v}_{2}){\bf v}_{2}). $

Since ${\mathbf x}_{1},{\mathbf x}_{2}, {\mathbf x}_{3}$ are linearly independent, we have

$\displaystyle {\mathbf x}_{3} - ({\mathbf x}_{3},{\bf v}_{1}){\bf v}_{1} - ({\mathbf x}_{3},{\bf v}_{2}){\bf v}_{2} \neq {\bf0}. $

Thus by letting

$\displaystyle {\bf v}_{3} = \frac{{\bf w}_{3}}{\Vert{\bf w}_{3}\Vert} = \frac{{\mathbf x}_{3} - ({\mathbf x}_{3},{\bf v}_{1}){\bf v}_{1} - ({\mathbf x}_{3},{\bf v}_{2}){\bf v}_{2}}{\Vert{\mathbf x}_{3} - ({\mathbf x}_{3},{\bf v}_{1}){\bf v}_{1} - ({\mathbf x}_{3},{\bf v}_{2}){\bf v}_{2}\Vert}, $

we have a unit vector orthogonal to ${\bf v}_{1}$ and ${\bf v}_{2}$. Similarly, we continue this process by letting

$\displaystyle {\bf v}_{j} = \frac{{\bf w}_{j}}{\Vert{\bf w}_{j}\Vert} = \frac{{\mathbf x}_{j} - \sum_{i=1}^{j-1}({\mathbf x}_{j},{\bf v}_{i}){\bf v}_{i}}{\Vert{\mathbf x}_{j} - \sum_{i=1}^{j-1}({\mathbf x}_{j},{\bf v}_{i}){\bf v}_{i}\Vert}, \quad 2 \leq j \leq m, $

we have $\{{\bf v}_{1},{\bf v}_{2},\ldots,{\bf v}_{m}\}$ which forms an orthonormal system. We call this process Gram-Schmidt orthonormalization.
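The process translates directly into code. Below is a minimal Python/NumPy sketch of Gram-Schmidt orthonormalization exactly as derived above: subtract from each ${\mathbf x}_{j}$ its components along the previously constructed ${\bf v}_{i}$, then normalize (the function name gram_schmidt and the tolerance are our own choices):

\begin{verbatim}
import numpy as np

def gram_schmidt(xs, tol=1e-12):
    """Orthonormalize the linearly independent vectors xs."""
    vs = []
    for x in xs:
        w = np.array(x, dtype=float)
        for v in vs:                  # w_j = x_j - sum_i (x_j, v_i) v_i
            w = w - np.dot(x, v) * v
        norm = np.linalg.norm(w)
        if norm < tol:                # occurs only if the xs are dependent
            raise ValueError("vectors are linearly dependent")
        vs.append(w / norm)           # v_j = w_j / ||w_j||
    return vs
\end{verbatim}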

Example 1.24  

For

$\displaystyle {\mathbf x}_{1} = \left(\begin{array}{c} 0 \\ 1 \\ 1 \end{array}\right), \quad {\mathbf x}_{2} = \left(\begin{array}{c} 1 \\ 0 \\ 1 \end{array}\right), \quad {\mathbf x}_{3} = \left(\begin{array}{c} 1 \\ 1 \\ 0 \end{array}\right), $

find an orthonormal system.

Answer

$\displaystyle {\bf v}_{1} = \frac{{\mathbf x}_{1}}{\Vert{\mathbf x}_{1}\Vert} = \frac{1}{\sqrt{2}}\left(\begin{array}{c} 0 \\ 1 \\ 1 \end{array}\right) = \left(\begin{array}{c} 0 \\ \frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} \end{array}\right). $

Next,

$\displaystyle {\mathbf x}_{2} - ({\mathbf x}_{2},{\bf v}_{1}){\bf v}_{1} = \left(\begin{array}{c} 1 \\ 0 \\ 1 \end{array}\right) - \frac{1}{\sqrt{2}}\left(\begin{array}{c} 0 \\ \frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} \end{array}\right) = \left(\begin{array}{r} 1 \\ -\frac{1}{2} \\ \frac{1}{2} \end{array}\right), $

$\displaystyle \Vert{\mathbf x}_{2} - ({\mathbf x}_{2},{\bf v}_{1}){\bf v}_{1}\Vert = \sqrt{1 + (-\frac{1}{2})^2 + (\frac{1}{2})^2} = \sqrt{\frac{3}{2}} $

implies

$\displaystyle {\bf v}_{2} = \sqrt{\frac{2}{3}}\left(\begin{array}{r} 1 \\ -\frac{1}{2} \\ \frac{1}{2} \end{array}\right) = \left(\begin{array}{r} \frac{2}{\sqrt{6}} \\ -\frac{1}{\sqrt{6}} \\ \frac{1}{\sqrt{6}} \end{array}\right). $

Lastly,

$\displaystyle {\mathbf x}_{3} - ({\mathbf x}_{3},{\bf v}_{1}){\bf v}_{1} - ({\mathbf x}_{3},{\bf v}_{2}){\bf v}_{2} = \left(\begin{array}{c} 1 \\ 1 \\ 0 \end{array}\right) - \frac{1}{\sqrt{2}}\left(\begin{array}{c} 0 \\ \frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} \end{array}\right) - \frac{1}{\sqrt{6}}\left(\begin{array}{r} \frac{2}{\sqrt{6}} \\ -\frac{1}{\sqrt{6}} \\ \frac{1}{\sqrt{6}} \end{array}\right) = \left(\begin{array}{r} \frac{2}{3} \\ \frac{2}{3} \\ -\frac{2}{3} \end{array}\right), $

$\displaystyle \Vert{\mathbf x}_{3} - ({\mathbf x}_{3},{\bf v}_{1}){\bf v}_{1} - ({\mathbf x}_{3},{\bf v}_{2}){\bf v}_{2}\Vert = \sqrt{\left(\frac{2}{3}\right)^2 + \left(\frac{2}{3}\right)^2 + \left(-\frac{2}{3}\right)^2} = \frac{2}{\sqrt{3}} $

implies

$\displaystyle {\bf v}_{3} = \frac{\sqrt{3}}{2}\left(\begin{array}{r} \frac{2}{3} \\ \frac{2}{3} \\ -\frac{2}{3} \end{array}\right) = \left(\begin{array}{r} \frac{1}{\sqrt{3}} \\ \frac{1}{\sqrt{3}} \\ -\frac{1}{\sqrt{3}} \end{array}\right). $

The set of vectors ${\bf v}_{1},{\bf v}_{2},{\bf v}_{3}$ forms an orthonormal system. $ \blacksquare$
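Running the gram_schmidt sketch from above on the data of this example reproduces the hand computation (assuming that function is already defined):

\begin{verbatim}
import numpy as np

x1 = np.array([0.0, 1.0, 1.0])
x2 = np.array([1.0, 0.0, 1.0])
x3 = np.array([1.0, 1.0, 0.0])

v1, v2, v3 = gram_schmidt([x1, x2, x3])
print(v1)   # [0, 1/sqrt(2), 1/sqrt(2)]
print(v2)   # [2/sqrt(6), -1/sqrt(6), 1/sqrt(6)]
print(v3)   # [1/sqrt(3), 1/sqrt(3), -1/sqrt(3)]

V = np.column_stack([v1, v2, v3])
print(np.allclose(V.T @ V, np.eye(3)))   # True: an orthonormal system
\end{verbatim}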

Exercise 1-8

1. Determine whether $W = \{(x,y,1) : x,y \in R\}$ is a subspace of the vector space $R^3$.

2. Show that $W = \{(x,y,-3x+2y) : x, y \in R\}$ is a subspace of the vector space $R^3$.

3. Find a basis of the vector space $W = \{(x,y,-3x+2y) : x, y \in R\}$ and the dimension of $W$.

4. Show the following set of vectors is a basis of the vector space $R^3$.

$\displaystyle \{{\bf i} + {\bf j} , {\bf k} , {\bf i } + {\bf k}\} $

5. Find the dimension of the subspace spanned by the following set of vectors.

$\displaystyle \{3, x-2, x+3, x^2+1\}$

6. From the vectors ${\mathbf x}_{1} = (1,1,1), {\mathbf x}_{2} = (0,1,1), {\mathbf x}_{3} = (0,0,1)$ , create an orthonormal system.

7. Let $U,W$ be subspace of a vector space $V$. Show the following dimensional equation holds.

$\displaystyle \dim (U + W) = \dim U + \dim W - \dim(U \cap W). $

8. Show that any set of vectors with more than 4 vectors in 3D vector spaceis linearly dependent.