Back to the class
Problems #3,33 pg.213-215
#3, pg.213: Determine whether the following set is a basis for $\mathbb{R}^{3 \times 1}$:
$$\left\{ \begin{bmatrix} 1 \\ 0 \\ -3\\ \end{bmatrix}, \begin{bmatrix}3 \\ 1 \\ -4 \end{bmatrix}, \begin{bmatrix} -2 \\ -1 \\ 1 \end{bmatrix} \right\}.$$
Solution: We will test these vectors for linear independence using the definition of linear independence: consider the vector equation
$$x_1\begin{bmatrix} 1 \\ 0 \\ -3\\ \end{bmatrix}+x_2 \begin{bmatrix}3 \\ 1 \\ -4 \end{bmatrix}+x_3 \begin{bmatrix} -2 \\ -1 \\ 1 \end{bmatrix}=0$$
We can solve this vector equation by considering its augmented matrix and reducing it to reduced echelon form:
$$\begin{bmatrix} 1&3 & -2 & 0 \\ 0&1&-1&0 \\ -3&-4&1&0 \end{bmatrix} \sim \begin{bmatrix}1&0&1&0\\ 0&1&-1&0\\ 0&0&0&0 \end{bmatrix}.$$
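As a sanity check on this reduction, here is a small Python sketch that row reduces the augmented matrix with exact rational arithmetic. The helper `rref` is illustrative, not from the text:

```python
from fractions import Fraction

def rref(rows):
    """Reduce a matrix (given as a list of rows) to reduced echelon form."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivot_row = 0
    for col in range(len(m[0])):
        # Find a row at or below pivot_row with a nonzero entry in this column.
        pr = next((r for r in range(pivot_row, len(m)) if m[r][col] != 0), None)
        if pr is None:
            continue
        m[pivot_row], m[pr] = m[pr], m[pivot_row]
        # Scale the pivot row so the pivot equals 1.
        piv = m[pivot_row][col]
        m[pivot_row] = [x / piv for x in m[pivot_row]]
        # Eliminate this column from every other row.
        for r in range(len(m)):
            if r != pivot_row and m[r][col] != 0:
                f = m[r][col]
                m[r] = [a - f * b for a, b in zip(m[r], m[pivot_row])]
        pivot_row += 1
    return m

A = [[1, 3, -2, 0], [0, 1, -1, 0], [-3, -4, 1, 0]]
print(rref(A))  # rows [1,0,1,0], [0,1,-1,0], [0,0,0,0], matching the reduction above
```

The zero row in the result signals a free variable, which is exactly what produces the nontrivial solutions found next.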
We can easily see there are nontrivial solutions to this vector equation by rewriting this matrix as a system of equations:
$$\left\{ \begin{array}{ll}
x_1 + x_3 &= 0 \\
x_2 - x_3 &= 0
\end{array} \right.$$
and hence we see that there are infinitely many solutions that can be found by choosing values for the free variable $x_3$:
$$\vec{x} = \begin{bmatrix}x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix}-x_3 \\ x_3 \\ x_3 \end{bmatrix} = x_3 \begin{bmatrix} -1\\1\\1 \end{bmatrix}.$$
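Taking $x_3 = 1$ gives the particular nontrivial solution $(x_1, x_2, x_3) = (-1, 1, 1)$, which we can verify directly (a quick check, with the vectors written as tuples):

```python
# Columns of the coefficient matrix.
v1, v2, v3 = (1, 0, -3), (3, 1, -4), (-2, -1, 1)

# Candidate nontrivial solution from the general solution with x3 = 1.
x1, x2, x3 = -1, 1, 1

combo = tuple(x1*a + x2*b + x3*c for a, b, c in zip(v1, v2, v3))
print(combo)  # (0, 0, 0): a nontrivial dependence relation
```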
This means that the set is linearly dependent (hence not a basis). We can still check whether it spans $\mathbb{R}^{3 \times 1}$, though. Let $\vec{b} = \begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix} \in \mathbb{R}^{3 \times 1}$. We must determine if there are weights $\alpha_1,\alpha_2,\alpha_3$ such that
$$ \alpha_1 \begin{bmatrix} 1 \\ 0 \\ -3\\ \end{bmatrix} +\alpha_2 \begin{bmatrix}3 \\ 1 \\ -4 \end{bmatrix} + \alpha_3 \begin{bmatrix} -2 \\ -1 \\ 1 \end{bmatrix} = \begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix}.$$ We can set this up as an augmented matrix and row reduce to reduced echelon form:
$$\begin{bmatrix}1&3&-2&b_1\\0&1&-1&b_2\\-3&-4&1&b_3 \end{bmatrix} \sim \begin{bmatrix}1 & 0 & 1 & b_1-3b_2 \\ 0&1&-1& b_2 \\ 0&0&0&3b_1-5b_2+b_3 \end{bmatrix}.$$
This matrix is equivalent to the system of equations
$$\left\{ \begin{array}{ll}
\alpha_1 + \alpha_3 = b_1-3b_2 \\
\alpha_2 - \alpha_3 = b_2 \\
0 = 3b_1-5b_2+b_3
\end{array} \right.$$
Hence this system has a solution only when $0=3b_1-5b_2+b_3$ (not every vector in $\mathbb{R}^{3 \times 1}$ satisfies this condition!). Thus these vectors do not span $\mathbb{R}^{3 \times 1}$.
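To make the consistency condition concrete, here is a short Python check (the example vectors $\vec{b}$ are chosen for illustration): $\vec{b} = (1,0,0)$ fails the condition, while $\vec{b} = (1,1,2)$ satisfies it, and in the consistent case setting the free variable $\alpha_3 = 0$ gives $\alpha_1 = b_1 - 3b_2$, $\alpha_2 = b_2$.

```python
v1, v2, v3 = (1, 0, -3), (3, 1, -4), (-2, -1, 1)

def consistent(b1, b2, b3):
    """The system has a solution exactly when 3*b1 - 5*b2 + b3 == 0."""
    return 3*b1 - 5*b2 + b3 == 0

print(consistent(1, 0, 0))  # False: (1, 0, 0) is not in the span
print(consistent(1, 1, 2))  # True

# For the consistent b = (1, 1, 2), take a3 = 0, a1 = b1 - 3*b2, a2 = b2
# and check that the combination reproduces b.
a1, a2, a3 = 1 - 3*1, 1, 0
print(tuple(a1*x + a2*y + a3*z for x, y, z in zip(v1, v2, v3)))  # (1, 1, 2)
```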
#33, pg.215: Consider the polynomials $\vec{p}_1(t)=1+t^2$ and $\vec{p}_2(t)=1-t^2$. Is $\{\vec{p}_1,\vec{p}_2\}$ a linearly independent set in $\mathbb{P}_3$? Why or why not?
Solution: Yes, the set is linearly independent. To see this, solve the vector equation
$$x_1 \vec{p}_1 + x_2 \vec{p}_2 = 0$$
or equivalently
$$x_1 (1+t^2) + x_2 (1-t^2)=0$$
yielding
$$(x_1+x_2) + (x_1-x_2)t^2=0.$$
Equating coefficients yields the following system of equations:
$$\left\{ \begin{array}{ll}
x_1+x_2 = 0 \\
x_1-x_2 = 0
\end{array} \right.$$
which is equivalent to
$$\left\{ \begin{array}{ll}
x_1 = -x_2 \\
x_1 = x_2
\end{array} \right.$$
Combining these gives $x_2 = -x_2$, so $x_2=0$ and hence $x_1=0$: the only solution is the trivial one. Hence the vectors $\vec{p}_1$ and $\vec{p}_2$ are linearly independent.
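The same conclusion can be checked numerically by working with coordinate vectors relative to the standard basis $\{1, t, t^2, t^3\}$ of $\mathbb{P}_3$ (a sketch; the tuple encoding is our own convention, not the text's):

```python
p1 = (1, 0, 1, 0)   # coordinates of 1 + t^2 in the basis {1, t, t^2, t^3}
p2 = (1, 0, -1, 0)  # coordinates of 1 - t^2

def combo(x1, x2):
    """Coordinate vector of x1*p1 + x2*p2."""
    return tuple(x1*a + x2*b for a, b in zip(p1, p2))

# Equating coefficients of 1 and t^2 gives the 2x2 system with matrix
# [[1, 1], [1, -1]]; its determinant is nonzero, so only the trivial
# combination of p1 and p2 is zero.
det = 1*(-1) - 1*1
print(det)           # -2
print(combo(1, 1))   # (2, 0, 0, 0): a nonzero combination, as expected
```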