
Section 2.3 #2: Determine if $\vec{v}=\left[ \begin{array}{r} 2 \\ 1 \end{array} \right]$ is a linear combination of $\vec{u}_1=\left[ \begin{array}{r} 4 \\ -2 \end{array} \right]$ and $\vec{u}_2= \left[ \begin{array}{r} -2 \\ 1 \end{array} \right]$.
Solution: We are being asked whether the following equation has a solution: $c_1 \vec{u}_1 + c_2 \vec{u}_2 = \vec{v}.$ Expanding and simplifying the left-hand side yields $$\left[ \begin{array}{rr} 4c_1 &- 2c_2 \\ -2c_1 &+ c_2 \end{array} \right] = \left[ \begin{array}{r} 2 \\ 1 \end{array} \right].$$ We now begin to put the augmented matrix associated with this system of equations into reduced row echelon form: $$\begin{array}{ll} \left[ \begin{array}{rrr} 4 & -2 & 2 \\ -2 & 1 & 1 \end{array} \right] &\stackrel{r_2^*=r_2+\frac{1}{2}r_1}{\sim} \left[ \begin{array}{rrr} 4 & -2 & 2 \\ 0 & 0 & 2 \end{array}\right], \end{array}$$ from which we see immediately that there is no solution, because the last row encodes the equation $0=2$, which is never true.
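As an optional sanity check (a Python sketch, not part of the assigned solution), we can redo this elimination in exact arithmetic and confirm that the second row reduces to $[0, 0, 2]$, i.e. the system is inconsistent:

```python
from fractions import Fraction as F

# Columns u1, u2 and target v from the problem.
u1, u2, v = [F(4), F(-2)], [F(-2), F(1)], [F(2), F(1)]

# Augmented matrix [u1 u2 | v].
aug = [[u1[0], u2[0], v[0]],
       [u1[1], u2[1], v[1]]]

# Eliminate the (2,1) entry: r2 <- r2 - (a21/a11) r1,
# which here is r2 <- r2 + (1/2) r1, as in the solution above.
m = aug[1][0] / aug[0][0]
row2 = [aug[1][j] - m * aug[0][j] for j in range(3)]

# row2 = [0, 0, 2] encodes 0 = 2, so the system has no solution.
inconsistent = row2[0] == 0 and row2[1] == 0 and row2[2] != 0
```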

Section 2.3 #8: Determine if the vector $\vec{b}=\left[ \begin{array}{l} 4 \\ 8 \\ 12 \end{array} \right]$ is in the span of the columns of the matrix $A = \left[ \begin{array}{lll} 1&2&3 \\ 5&6&7 \\ 9&10&11 \end{array} \right]$.
Solution: Recall that the span of a set of vectors is the set of all linear combinations of vectors in that set. Write $\vec{a}_1 = \left[ \begin{array}{r} 1 \\ 5 \\ 9 \end{array} \right], \vec{a}_2 = \left[ \begin{array}{r} 2 \\ 6\\ 10 \end{array} \right], \vec{a}_3 = \left[ \begin{array}{r} 3 \\ 7 \\ 11 \end{array} \right]$ to denote the columns of $A$. We are being asked whether or not the following equation has a solution: $c_1 \vec{a}_1+c_2 \vec{a}_2 + c_3 \vec{a}_3 = \vec{b}$. We now put the associated augmented matrix into reduced row echelon form: compute $$\begin{array}{ll} \left[ \begin{array}{rrrr} 1 & 2 & 3 & 4 \\ 5 & 6 & 7 & 8 \\ 9 & 10 & 11 & 12 \end{array} \right] &\stackrel{r_2^*=r_2-5r_1, r_3^*=r_3-9r_1}{\sim} \left[ \begin{array}{rrrr} 1 & 2 & 3 & 4 \\ 0 & -4 & -8 & -12 \\ 0 & -8 & -16 & -24 \end{array} \right] \\ &\stackrel{r_2^*=-\frac{1}{4}r_2, r_3^*=-\frac{1}{8}r_3}{\sim} \left[ \begin{array}{rrrr} 1 & 2 & 3 & 4 \\ 0 & 1 & 2 & 3 \\ 0 & 1 & 2 & 3 \end{array} \right] \\ &\stackrel{r_3^*=r_3-r_2}{\sim} \left[ \begin{array}{rrrr} 1 & 2 & 3 & 4 \\ 0 & 1 & 2 & 3 \\ 0 & 0 & 0 & 0 \end{array} \right] \\ &\stackrel{r_1^*=r_1-2r_2}{\sim} \left[ \begin{array}{rrrr} 1 & 0 & -1 & -2 \\ 0 & 1 & 2 & 3 \\ 0 & 0 & 0 & 0 \end{array} \right]. \end{array}$$ This is encoding the system of equations $$\left\{ \begin{array}{rrrr} c_1 & &-c_3 & =-2 \\ & c_2 &+2c_3& =3 \\ & & 0 &= 0 \end{array} \right.$$ From this we see that $c_3$ can be viewed as a free variable with $c_1=-2+c_3$ and $c_2=3-2c_3$. Since there is a free variable, there are in fact many solutions. For example, taking $c_3=0$ yields $c_1=-2$ and $c_2=3$. Taking $c_3=1$ yields $c_1=-1$ and $c_2=1$. Therefore, yes, $\vec{b}$ is in the span of the columns of $A$.
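As an optional check (a Python sketch, not part of the assigned solution), we can plug both particular solutions found above back into the linear combination and confirm each one reproduces $\vec{b}$:

```python
# Columns of A and the target vector b from the problem.
a1, a2, a3 = [1, 5, 9], [2, 6, 10], [3, 7, 11]
b = [4, 8, 12]

def combo(c1, c2, c3):
    # c1*a1 + c2*a2 + c3*a3, computed entrywise.
    return [c1*x + c2*y + c3*z for x, y, z in zip(a1, a2, a3)]

# General solution: c1 = -2 + c3, c2 = 3 - 2*c3, c3 free.
sol_c3_0 = combo(-2, 3, 0)   # c3 = 0
sol_c3_1 = combo(-1, 1, 1)   # c3 = 1
```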

Section 2.3 #10: Show that $\mathbb{R}^{2\times 1} = \mathrm{span} \left\{ \left[ \begin{array}{l} 3 \\ -2 \end{array} \right], \left[ \begin{array}{r} 0 \\ 1 \end{array} \right] \right\}$.
Solution: To say that $\mathbb{R}^{2 \times 1}$ is the span of this set of vectors means that the set of all vectors in $\mathbb{R}^{2 \times 1}$ is equal to the span of this set of vectors (i.e. the set of all linear combinations of those vectors). We will proceed by arguing that these sets are equal (recall a set $X$ is equal to a set $Y$ if and only if both $X \subseteq Y$ and $Y \subseteq X$). Since $\mathrm{span} \left\{ \left[ \begin{array}{l} 3 \\ -2 \end{array} \right], \left[ \begin{array}{r} 0 \\ 1 \end{array} \right] \right\}$ is a set of linear combinations of vectors in $\mathbb{R}^{2 \times 1}$, the following is clear: $$\mathrm{span} \left\{ \left[ \begin{array}{l} 3 \\ -2 \end{array} \right], \left[ \begin{array}{r} 0 \\ 1 \end{array} \right] \right\} \subseteq \mathbb{R}^{2 \times 1}.$$ It remains to show that $\mathbb{R}^{2 \times 1} \subseteq \mathrm{span} \left\{ \left[ \begin{array}{l} 3 \\ -2 \end{array} \right], \left[ \begin{array}{r} 0 \\ 1 \end{array} \right] \right\}$. Let $\left[ \begin{array}{rr} a \\ b \end{array} \right] \in \mathbb{R}^{2 \times 1}$ be an arbitrary vector. It is sufficient to show that this vector lies in $\mathrm{span} \left\{ \left[ \begin{array}{l} 3 \\ -2 \end{array} \right], \left[ \begin{array}{r} 0 \\ 1 \end{array} \right] \right\}$. To show this, we must find a linear combination $$c_1 \left[ \begin{array}{rr} 3 \\ -2 \end{array} \right] + c_2\left[ \begin{array}{rr} 0 \\ 1 \end{array} \right] \in \mathrm{span} \left\{ \left[ \begin{array}{l} 3 \\ -2 \end{array} \right], \left[ \begin{array}{r} 0 \\ 1 \end{array} \right] \right\}$$ which is equal to $\left[ \begin{array}{r} a \\ b \end{array} \right]$. 
In other words, we must solve the following linear system of equations: $$c_1 \left[ \begin{array}{rr} 3 \\ -2 \end{array} \right] + c_2\left[ \begin{array}{rr} 0 \\ 1 \end{array} \right] = \left[ \begin{array}{r} a \\ b \end{array} \right].$$ To finish, we put the associated augmented matrix into reduced echelon form: calculate $$\begin{array}{rr} \left[ \begin{array}{rrr} 3 & 0 & a \\ -2 & 1 & b \end{array} \right] & \stackrel{r_1^*=\frac{1}{3}r_1, r_2^*=-\frac{1}{2}r_2}{\sim} \left[ \begin{array}{rrr} 1 & 0 & \frac{a}{3} \\ 1 & -\frac{1}{2} & -\frac{b}{2} \end{array} \right] \\ &\stackrel{r_2^*=r_2-r_1}{\sim} \left[ \begin{array}{rrr} 1 & 0 & \frac{a}{3} \\ 0 & -\frac{1}{2} & -\frac{b}{2} - \frac{a}{3} \end{array} \right] \\ &\stackrel{r_2^*=-2r_2}{\sim} \left[ \begin{array}{rrr} 1 & 0 & \frac{a}{3} \\ 0 & 1 & b + \frac{2a}{3} \end{array} \right]. \end{array}$$ Therefore we have found the solution $\left[ \begin{array}{r} c_1 \\ c_2 \end{array} \right] = \left[ \begin{array}{r} \frac{a}{3} \\ b+\frac{2a}{3} \end{array} \right],$ completing the proof.
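The closed-form coefficients $c_1=\frac{a}{3}$, $c_2=b+\frac{2a}{3}$ can be spot-checked numerically. The following Python sketch (not part of the assigned solution) verifies the formula in exact rational arithmetic over a grid of sample $(a,b)$ pairs:

```python
from fractions import Fraction as F

def coords(a, b):
    # Coefficients from the solution: c1 = a/3, c2 = b + 2a/3.
    return F(a, 3), F(b) + F(2 * a, 3)

def combo(c1, c2):
    # c1*[3, -2] + c2*[0, 1]
    return [3 * c1, -2 * c1 + c2]

# The combination should recover [a, b] exactly for every sample pair.
all_recovered = all(combo(*coords(a, b)) == [a, b]
                    for a in range(-3, 4) for b in range(-3, 4))
```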

Section 2.3 #30: Is the following set of vectors linearly dependent or linearly independent?
$$\left\{ \left[ \begin{array}{r} 0 \\ 0 \\ 0 \\ 1 \end{array} \right], \left[ \begin{array}{r} 0 \\ 0 \\ 2 \\ 1 \end{array} \right], \left[ \begin{array}{r} 0 \\ 3 \\ 2 \\ 1 \end{array} \right], \left[ \begin{array}{r} 4 \\ 3 \\ 2 \\ 1 \end{array} \right] \right\}$$ Solution: Because of the definition of linear independence, we consider the following vector equation: $$c_1 \left[ \begin{array}{r} 0 \\ 0 \\ 0 \\ 1 \end{array} \right] + c_2 \left[ \begin{array}{r} 0 \\ 0 \\ 2 \\ 1 \end{array} \right] + c_3 \left[ \begin{array}{r} 0 \\ 3 \\ 2 \\ 1 \end{array} \right] + c_4 \left[ \begin{array}{r} 4 \\ 3 \\ 2 \\ 1 \end{array} \right] = \left[ \begin{array}{r} 0 \\ 0 \\ 0 \\ 0 \end{array} \right].$$ We will solve it (notice the trivial solution $c_1=c_2=c_3=c_4=0$ is a solution); if the only solution is the trivial solution, then we conclude the set is independent. If there is a solution that is not the trivial solution, then we will conclude that the set is not independent. To solve it, we will put the associated augmented matrix into reduced echelon form: compute $$\begin{array}{ll} \left[ \begin{array}{rrrrr} 0 & 0 & 0 & 4 & 0 \\ 0 & 0 & 3 & 3 & 0 \\ 0 & 2 & 2 & 2 & 0 \\ 1 & 1 & 1 & 1 & 0 \end{array} \right] &\stackrel{r_1 \leftrightarrow r_4, r_2 \leftrightarrow r_3}{\sim} \left[ \begin{array}{rrrrr} 1 & 1 & 1 & 1 & 0 \\ 0 & 2 & 2 & 2 & 0 \\ 0 & 0 & 3 & 3 & 0 \\ 0 & 0 & 0 & 4 & 0 \end{array} \right] \\ &\stackrel{r_2^*=\frac{1}{2}r_2, r_3^*=\frac{1}{3}r_3,r_4^*=\frac{1}{4}r_4}{\sim} \left[ \begin{array}{rrrrr} 1 & 1 & 1 & 1 & 0 \\ 0 & 1 & 1 & 1 & 0 \\ 0 & 0 & 1 & 1 & 0 \\ 0 & 0 & 0 & 1 & 0 \end{array} \right] \\ &\stackrel{r_1^*=r_1-r_4, r_2^*=r_2-r_4, r_3^*=r_3-r_4}{\sim} \left[ \begin{array}{rrrrr} 1 & 1 & 1 & 0 & 0 \\ 0 & 1 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 & 0 \end{array} \right] \\ &\stackrel{r_2^*=r_2-r_3, r_1^*=r_1-r_3}{\sim} \begin{bmatrix} 1&1&0&0&0 \\ 0&1&0&0&0 \\ 0&0&1&0&0 \\ 0&0&0&1&0 \end{bmatrix} \\ &\stackrel{r_1^*=r_1-r_2}{\sim} 
\begin{bmatrix} 1&0&0&0&0 \\ 0&1&0&0&0 \\ 0&0&1&0&0 \\ 0&0&0&1&0 \\ \end{bmatrix} \end{array}$$ From this we observe that the only solution of the vector equation is the trivial solution $\begin{bmatrix} c_1 \\ c_2 \\ c_3 \\ c_4 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \\ 0 \end{bmatrix}$, and therefore the set of vectors is linearly independent.
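An alternative check (a Python sketch, not part of the assigned solution): a square matrix has independent columns exactly when its determinant is nonzero, so we can compute the determinant of the matrix whose columns are the four vectors:

```python
def det(M):
    # Determinant by cofactor expansion along the first row
    # (fine for tiny matrices like this one).
    if len(M) == 1:
        return M[0][0]
    total = 0
    for j in range(len(M)):
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)
    return total

# Columns are the four vectors from the problem.
A = [[0, 0, 0, 4],
     [0, 0, 3, 3],
     [0, 2, 2, 2],
     [1, 1, 1, 1]]
d = det(A)  # nonzero <=> the columns are linearly independent
```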


In the following two problems: $$A = \left[ \begin{array}{rr} 3 & 0 \\ -1 & 5 \end{array} \right], B = \left[ \begin{array}{rrr} 4 & -2 & 1 \\ 0 & 2 & 3 \end{array} \right], C = \left[ \begin{array}{rr} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{array} \right], D = \left[ \begin{array}{rr} 0 & -3 \\ -2 & 1 \end{array} \right]$$ Section 3.1 #2: Calculate $3D-2A$.
Solution: Calculate $$\begin{array}{ll} 3D - 2A &= 3 \left[ \begin{array}{rr} 0 & -3 \\ -2 & 1 \end{array} \right] - 2 \left[ \begin{array}{rr} 3 & 0 \\ -1 & 5 \end{array} \right] \\ &= \left[ \begin{array}{rr} 0 & -9 \\ -6 & 3 \end{array} \right] + \left[ \begin{array}{rr} -6 & 0 \\ 2 & -10 \end{array} \right] \\ &= \left[ \begin{array}{rr} 0-6 & -9+0 \\ -6+2 & 3-10 \end{array} \right] \\ &= \left[ \begin{array}{rr} -6 & -9 \\ -4 & -7 \end{array} \right]. \end{array}$$
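The entrywise computation above is easy to mirror in code; this Python sketch (not part of the assigned solution) forms $3D-2A$ directly:

```python
A = [[3, 0], [-1, 5]]
D = [[0, -3], [-2, 1]]

# Scalar multiples and the difference are all taken entrywise.
result = [[3 * D[i][j] - 2 * A[i][j] for j in range(2)] for i in range(2)]
```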

Section 3.1 #4: Calculate $C-B^T$.
Solution: First calculate $$B^T = \left[ \begin{array}{rrr} 4 & -2 & 1 \\ 0 & 2 & 3 \end{array} \right]^T = \left[ \begin{array}{rr} 4 & 0 \\ -2 & 2 \\ 1 & 3 \end{array} \right].$$ Now calculate $$\begin{array}{ll} C - B^T &= \left[ \begin{array}{rr} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{array} \right] - \left[ \begin{array}{rr} 4 & 0 \\ -2 & 2 \\ 1 & 3 \end{array} \right] \\ &= \left[ \begin{array}{rrr} 1-4 & 2-0 \\ 3-(-2) & 4-2 \\ 5-1 & 6-3 \end{array} \right] \\ &= \left[ \begin{array}{rrr} -3 & 2 \\ 5 & 2 \\ 4 & 3 \end{array} \right]. \end{array}$$
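Again this can be mirrored in code as a check (a Python sketch, not part of the assigned solution): transpose $B$ by swapping rows and columns, then subtract entrywise:

```python
B = [[4, -2, 1], [0, 2, 3]]
C = [[1, 2], [3, 4], [5, 6]]

# Transpose of B: rows become columns.
BT = [list(col) for col in zip(*B)]
result = [[C[i][j] - BT[i][j] for j in range(2)] for i in range(3)]
```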

Problem A: Determine if the following set of vectors is linearly independent (field: $\mathbb{C}$): $$\left\{ \left[ \begin{array}{ll} 3 \\ -i \\ 2 \end{array} \right], \left[ \begin{array}{ll} 2 \\ i \\ 3 \end{array} \right], \left[ \begin{array}{ll} 1 \\ 1 \\ 1 \end{array} \right] \right\}.$$ Solution: Consider the following vector equation: $$c_1 \left[ \begin{array}{ll} 3 \\ -i \\ 2 \end{array} \right] + c_2 \left[ \begin{array}{ll} 2 \\ i \\ 3 \end{array} \right] + c_3 \left[ \begin{array}{ll} 1 \\ 1 \\ 1 \end{array} \right] = \left[ \begin{array}{ll} 0 \\ 0 \\ 0 \end{array} \right].$$

We now put the associated augmented matrix into reduced row echelon form (recall that $\frac{1}{-i}=i$, since $(-i)(i)=1$): compute $$\begin{array}{ll} \begin{bmatrix} 3 & 2 & 1 & 0 \\ -i & i & 1 & 0 \\ 2 & 3 & 1 & 0 \end{bmatrix} &\stackrel{r_1^*=\frac{1}{3}r_1, r_2^*=\frac{1}{-i}r_2, r_3^*=\frac{1}{2}r_3}{\sim} \begin{bmatrix} 1 & \frac{2}{3} & \frac{1}{3} & 0 \\ 1 & -1 & i & 0 \\ 1 & \frac{3}{2} & \frac{1}{2} & 0 \end{bmatrix} \\ &\stackrel{r_2^*=r_2-r_1, r_3^*=r_3-r_1}{\sim} \begin{bmatrix} 1 & \frac{2}{3} & \frac{1}{3} & 0 \\ 0 & -\frac{5}{3} & i-\frac{1}{3} & 0 \\ 0 & \frac{5}{6} & \frac{1}{6} & 0 \end{bmatrix} \\ &\stackrel{r_3^*=r_3+\frac{1}{2}r_2}{\sim} \begin{bmatrix} 1 & \frac{2}{3} & \frac{1}{3} & 0 \\ 0 & -\frac{5}{3} & i-\frac{1}{3} & 0 \\ 0 & 0 & \frac{i}{2} & 0 \end{bmatrix} \\ &\stackrel{r_3^*=-2ir_3}{\sim} \begin{bmatrix} 1 & \frac{2}{3} & \frac{1}{3} & 0 \\ 0 & -\frac{5}{3} & i-\frac{1}{3} & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \\ &\stackrel{r_1^*=r_1-\frac{1}{3}r_3,r_2^*=r_2-\left(i - \frac{1}{3}\right)r_3}{\sim} \begin{bmatrix} 1 & \frac{2}{3} & 0 & 0 \\ 0 & -\frac{5}{3} & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \\ &\stackrel{r_2^*=-\frac{3}{5}r_2}{\sim} \begin{bmatrix} 1 & \frac{2}{3} & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \\ &\stackrel{r_1^*=r_1-\frac{2}{3}r_2}{\sim} \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \\ \end{array}$$ From this we see that the only solution is $\begin{bmatrix} c_1 \\ c_2 \\ c_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}$ and hence we conclude that the set of vectors in question is independent.
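A quick independent check (a Python sketch, not part of the assigned solution): the three vectors are independent exactly when the $3 \times 3$ matrix having them as columns has nonzero determinant, which we can compute with Python's built-in complex numbers (`1j` plays the role of $i$):

```python
# Columns are the three vectors from the problem; entries are complex.
M = [[3, 2, 1],
     [-1j, 1j, 1],
     [2, 3, 1]]

# 3x3 determinant by the rule of Sarrus.
d = (M[0][0] * M[1][1] * M[2][2]
     + M[0][1] * M[1][2] * M[2][0]
     + M[0][2] * M[1][0] * M[2][1]
     - M[0][2] * M[1][1] * M[2][0]
     - M[0][0] * M[1][2] * M[2][1]
     - M[0][1] * M[1][0] * M[2][2])

# d is nonzero, so the columns are linearly independent over C.
```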

Problem B: Determine if the following set of vectors is linearly independent (field: $\mathbb{Z}_3$): $$\left\{ \left[ \begin{array}{ll} 1 \\ 2 \\ \end{array} \right], \left[ \begin{array}{ll} 2 \\ 1 \end{array} \right] \right\}.$$ Solution: Consider the vector equation $$c_1 \left[ \begin{array}{ll} 1 \\ 2 \\ \end{array} \right] + c_2 \left[ \begin{array}{ll} 2 \\ 1 \end{array} \right] = \begin{bmatrix} 0 \\ 0 \end{bmatrix}.$$

Now we put the associated augmented matrix into reduced row echelon form: $$\begin{bmatrix} 1 & 2 & 0 \\ 2 & 1 & 0 \end{bmatrix} \stackrel{r_2^*=r_2-2r_1}{\sim} \begin{bmatrix} 1 & 2 & 0 \\ 0 & 0 & 0 \end{bmatrix}.$$ This encodes the equation $c_1+2c_2=0$, i.e. $c_1=-2c_2=c_2$ (since $-2=1$ in $\mathbb{Z}_3$). Choosing, for example, $c_2=1$ (and hence $c_1=1$) shows that there is a nontrivial solution to the vector equation, and therefore this set of vectors is not independent.
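The nontrivial solution can be verified directly with modular arithmetic. This Python sketch (not part of the assigned solution) checks that $c_1=c_2=1$ sends the combination to the zero vector in $\mathbb{Z}_3$:

```python
p = 3                      # working over Z_3
u, w = [1, 2], [2, 1]      # the two vectors from the problem

# The nontrivial solution found above: c1 = c2 = 1.
combo = [(1 * u[i] + 1 * w[i]) % p for i in range(2)]

# combo is the zero vector mod 3, so the set is linearly dependent.
```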

Problem C: Determine if the following set of vectors is linearly independent (field: $\mathbb{Z}_5$): $$\left\{ \left[ \begin{array}{ll} 3 \\ 2 \\ 0 \end{array} \right], \left[ \begin{array}{ll} 1 \\ 1 \\ 1 \end{array} \right], \left[ \begin{array}{ll} 1 \\ 2 \\ 1 \end{array} \right] \right\}.$$ Solution: Consider the vector equation $$c_1 \left[ \begin{array}{ll} 3 \\ 2 \\ 0 \end{array} \right] + c_2 \left[ \begin{array}{ll} 1 \\ 1 \\ 1 \end{array} \right] + c_3 \left[ \begin{array}{ll} 1 \\ 2 \\ 1 \end{array} \right] = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}.$$ We now solve the associated augmented matrix: compute $$\begin{array}{ll} \begin{bmatrix} 3 & 1 & 1 & 0 \\ 2 & 1 & 2 & 0 \\ 0 & 1 & 1 & 0 \end{bmatrix} &\stackrel{r_2^*=r_2-4r_1}{\sim} \begin{bmatrix} 3 & 1 & 1 & 0 \\ 0 & 2 & 3 & 0 \\ 0 & 1 & 1 & 0 \end{bmatrix} \\ &\stackrel{r_3^*=r_3-3r_2}{\sim} \begin{bmatrix} 3 & 1 & 1 & 0 \\ 0 & 2 & 3 & 0 \\ 0 & 0 & 2 & 0 \end{bmatrix} \\ &\stackrel{r_3^*=3r_3, r_2^*=3r_2, r_1^*=2r_1}{\sim} \begin{bmatrix} 1 & 2 & 2 & 0 \\ 0 & 1 & 4 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \\ &\stackrel{r_1^*=r_1-2r_3, r_2^*=r_2-4r_3}{\sim} \begin{bmatrix} 1 & 2 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \\ &\stackrel{r_1^*=r_1-2r_2}{\sim} \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}. \end{array}$$ This shows that the only solution to the vector equation is the trivial solution and therefore the set of vectors is linearly independent.
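As a final cross-check (a Python sketch, not part of the assigned solution), independence over $\mathbb{Z}_5$ is equivalent to the determinant of the matrix of columns being nonzero mod 5:

```python
p = 5                      # working over Z_5
M = [[3, 1, 1],            # columns are the three vectors from the problem
     [2, 1, 2],
     [0, 1, 1]]

# 3x3 determinant (cofactor expansion along the first row), reduced mod 5.
d = (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
     - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
     + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0])) % p

# d is nonzero in Z_5, so the columns are linearly independent.
```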