Section 4.3 Linear Systems As Linear Combinations of Vectors
In the previous two sections, our geometric intuition of linear systems depended on the number of variables or unknowns. For example, if there were two unknowns, say \(x\) and \(y\text{,}\) then each equation in the system corresponded to a line in \(\R^2\text{.}\) Solving the system was equivalent to finding all points common to all of the lines. If there were three variables, then each equation corresponded to a plane in \(\R^3\text{.}\) Again, solving the system was tantamount to finding all points at the intersection of the planes. In general, if there are \(n\) variables, then each equation corresponds to an \((n-1)\)-dimensional hyperplane in \(\R^n\text{,}\) and the solution set of the system is the intersection of all the hyperplanes.
In this chapter we will change our geometric perspective on solving linear systems from a subtractive viewpoint (i.e., intersecting hyperplanes) to an additive one (i.e., linear combinations). This new way of understanding linear systems of equations will prove superior in many respects, and it will allow us to understand general solutions to higher-order linear differential equations.
Subsection Matrix Multiplication as Linear Combination
Consider the following square matrix multiplied by a column vector: \begin{equation*} \begin{bmatrix} a_{11} \amp a_{12} \\ a_{21} \amp a_{22} \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} \:\: = \:\: \begin{bmatrix} a_{11}x + a_{12}y \\ a_{21}x + a_{22}y \end{bmatrix} \:\: = \:\: \begin{bmatrix} a_{11} \\ a_{21} \end{bmatrix} x + \begin{bmatrix} a_{12} \\ a_{22} \end{bmatrix} y \end{equation*} In other words, when you multiply a matrix \(A\) by a column vector \(\vec{x}\text{,}\) the result is equivalent to scaling the columns of the matrix by the corresponding entries of the column vector and then summing them. Put more succinctly, \(A\vec{x}\) is a linear combination of the columns of \(A\text{,}\) with weights given by the entries of \(\vec{x}\text{.}\)
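If you would like to check this column-scaling picture numerically, the following short Python sketch (using NumPy, which is not part of this text and is assumed here purely for illustration) multiplies a \(2\times 2\) matrix by a vector in the usual row-by-column way and then rebuilds the same result as a linear combination of the matrix's columns. The particular numbers are arbitrary.

```python
import numpy as np

# An arbitrary 2x2 matrix and the entries of a column vector
A = np.array([[2.0, 5.0],
              [7.0, 1.0]])
x, y = 3.0, -2.0
v = np.array([x, y])

# Standard matrix-vector product (row-by-column)
product = A @ v

# The same result, built as a linear combination of the columns of A
combination = x * A[:, 0] + y * A[:, 1]

print(product)                             # [-4. 19.]
print(combination)                         # [-4. 19.]
print(np.allclose(product, combination))   # True
```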
This allows us to write a linear system in a brand new way: \begin{equation*} \begin{bmatrix} 1 \amp 1 \\ 3 \amp 0 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} 3 \\ 3 \end{bmatrix} \qquad \Longleftrightarrow \qquad x \begin{bmatrix} 1 \\ 3 \end{bmatrix} + y \begin{bmatrix} 1 \\ 0 \end{bmatrix} = \begin{bmatrix} 3 \\ 3 \end{bmatrix} \end{equation*} As you can easily verify, this system has solution \(x=1, y=2\text{.}\) The table below summarizes the three equivalent ways of viewing this system and its solution.
| Linear System | Matrix Equation | Linear Combination Equation |
| --- | --- | --- |
| \(\begin{alignedat}{6} \ell_1: \amp 1x \amp {}+{} \amp 1y \amp {}={} \amp 3 \\ \ell_2: \amp 3x \amp {}+{} \amp 0y \amp {}={} \amp 3 \\ \end{alignedat}\) | \(\underbrace{\begin{bmatrix} 1 \amp 1 \\ 3 \amp 0 \end{bmatrix}}_{A} \underbrace{\begin{bmatrix} x \\ y \end{bmatrix}}_{\vec{x}} = \underbrace{\begin{bmatrix} 3 \\ 3 \end{bmatrix}}_{\vec{w}}\) | \(x \underbrace{\begin{bmatrix} 1 \\ 3 \end{bmatrix}}_{\vec{u}} + y \underbrace{\begin{bmatrix} 1 \\ 0 \end{bmatrix}}_{\vec{v}} = \underbrace{\begin{bmatrix} 3 \\ 3 \end{bmatrix}}_{\vec{w}}\) |
| \(\ell_1 \cap \ell_2 = \left\{ (1,2) \right\}\) | \(\vec{x} = \begin{bmatrix} 1 \\ 2 \end{bmatrix}\) | \(1\vec{u} + 2\vec{v} = \vec{w}\) |
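As an optional numerical check of this example, the sketch below (again assuming NumPy, purely for illustration) solves the matrix equation \(A\vec{x} = \vec{w}\) and then confirms that the resulting weights reproduce \(\vec{w}\) as a linear combination of the columns \(\vec{u}\) and \(\vec{v}\text{.}\)

```python
import numpy as np

# The matrix A and right-hand side w from the example above
A = np.array([[1.0, 1.0],
              [3.0, 0.0]])
w = np.array([3.0, 3.0])

# Solve the matrix equation A x = w
solution = np.linalg.solve(A, w)
print(solution)  # [1. 2.], i.e. x = 1, y = 2

# Read the same solution as weights in a linear combination of the columns u, v
u, v = A[:, 0], A[:, 1]
print(1 * u + 2 * v)  # [3. 3.], which is exactly w
```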