Unique solution for linearly independent or dependent function group?


The teacher said we have solutions due to linear dependence.


The bot stated that linear independence ensures a unique solution.

It depends!

“Linear dependence” just means that some “object”, say A, can be expressed as a linear mix (a mix that uses only addition and scalar multiplication) of some other “objects”, say B and C (we will stop at two objects, but there may be many more, possibly infinitely many).

So

  • A is “linearly dependent on B and C” or
  • A is a “linear combination of B and C”

if there exist values, say x, y, such that we can say:

B \cdot x + C \cdot y = A

Often those symbols stand for real numbers \in \mathbb{R}, or for vectors or matrices of real numbers, and + and \cdot stand for the usual addition and multiplication, but all of that can be generalized.
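To make the definition concrete, here is a minimal NumPy sketch; the vectors are the B and C used in the next section, and the scalars x, y are arbitrary values picked only for illustration:

```python
import numpy as np

# B and C as in the example below; x and y are arbitrary illustrative scalars
B = np.array([-1.0, 1.0])
C = np.array([2.0, 1.0])
x, y = 2.0, 3.0

A = B * x + C * y          # by definition, A is a linear combination of B and C
print(A)                   # [4. 5.]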

Mapping this to the diagram

The question is whether the vector

A = \begin{bmatrix}5 \\ 3\end{bmatrix}

can be expressed as a linear combination of

B = \begin{bmatrix}-1 \\ 1\end{bmatrix} and C = \begin{bmatrix}2 \\ 1\end{bmatrix}.

The answer is yes. In fact, as long as B and C are not parallel (i.e. as long as B is not itself a linear combination of C, i.e. as long as there is no z such that B = z \cdot C), there is exactly one such linear combination.

In this case, x = 1/3 and y = 8/3
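Those values can be checked quickly, e.g. with NumPy (a small sketch assuming the vectors A, B, C exactly as given above):

```python
import numpy as np

# Columns of M are the vectors B and C from above
M = np.array([[-1.0, 2.0],
              [ 1.0, 1.0]])
A = np.array([5.0, 3.0])

x, y = np.linalg.solve(M, A)   # unique solution because B and C are not parallel
print(x, y)                    # 0.333... 2.666...  i.e. x = 1/3, y = 8/3
```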

The bot is answering a related question

If you are looking for a solution (x,y) to a system of equations:

b_1 \cdot x + c_1 \cdot y = a_1
b_2 \cdot x + c_2 \cdot y = a_2

which asks whether the two lines in the plane described by the two equations have a common point (x,y),

which can also be written

\begin{bmatrix}b_1 \\b_2\end{bmatrix} \cdot x + \begin{bmatrix}c_1 \\c_2\end{bmatrix} \cdot y = \begin{bmatrix}a_1 \\a_2\end{bmatrix}

or

B \cdot x + C \cdot y = A

then a unique solution exists if and only if the vectors (here written in row form)

\begin{bmatrix}b_1 & c_1\end{bmatrix}

and

\begin{bmatrix}b_2 & c_2\end{bmatrix}

are linearly independent.
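That criterion is easy to test numerically. Here is a small NumPy sketch (the classify helper is just an illustrative name, not a standard function), using a nonzero determinant as the 2×2 test for linearly independent rows:

```python
import numpy as np

def classify(M, A):
    # Rows of M linearly independent (nonzero determinant in the 2x2 case)
    # -> exactly one solution; otherwise no solution or infinitely many.
    if not np.isclose(np.linalg.det(M), 0.0):
        return "unique solution: " + str(np.linalg.solve(M, A))
    return "rows dependent: no solution or infinitely many"

# The example from above: rows [-1, 2] and [1, 1] are linearly independent
print(classify(np.array([[-1.0, 2.0], [1.0, 1.0]]), np.array([5.0, 3.0])))
```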

The reason is that linear dependence between those row vectors:

\begin{bmatrix}b_2 & c_2\end{bmatrix} = \begin{bmatrix}b_1 & c_1\end{bmatrix} \cdot \mu

would mean we could write

\begin{bmatrix}b_1 \\ b_1 \cdot \mu \end{bmatrix} \cdot x + \begin{bmatrix}c_1 \\ c_1 \cdot \mu \end{bmatrix} \cdot y = \begin{bmatrix}a_1 \\a_2\end{bmatrix}

which (dividing the second equation by \mu, assuming \mu \ne 0) has the same solutions as

\begin{bmatrix}b_1 \\ b_1 \end{bmatrix} \cdot x + \begin{bmatrix}c_1 \\ c_1 \end{bmatrix} \cdot y = \begin{bmatrix}a_1 \\a_2 / \mu \end{bmatrix}

which has either

  • no solutions if a_1 \ne a_2/\mu, as the system of equations then describes two parallel lines in the plane that have no point (x,y) in common, or
  • infinitely many solutions if a_1 = a_2/\mu, in which case the two equations collapse into a single one and the solution set is the line in the plane of all (x,y) that fulfill it (see the sketch after this list).
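Here is a small NumPy sketch of that dichotomy (the numbers are made up; the second row is \mu = 2 times the first), comparing the rank of the coefficient matrix with the rank of the augmented matrix:

```python
import numpy as np

# Linearly dependent rows: second row = mu * first row, with mu = 2
M = np.array([[1.0, 2.0],
              [2.0, 4.0]])

for A in (np.array([3.0, 7.0]),    # a_1 != a_2 / mu -> parallel lines, no solution
          np.array([3.0, 6.0])):   # a_1 == a_2 / mu -> same line, infinitely many
    rank_M   = np.linalg.matrix_rank(M)
    rank_aug = np.linalg.matrix_rank(np.column_stack([M, A]))
    print(A, "-> no solution" if rank_aug > rank_M else "-> infinitely many solutions")
```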

This also applies to the column direction

Linear dependence between the column vectors:

\begin{bmatrix}b_1 \\ b_2\end{bmatrix} \cdot \mu = \begin{bmatrix}c_1 \\ c_2\end{bmatrix}

would mean the system of equations could be written as

\begin{bmatrix}b_1 \\ b_2 \end{bmatrix} \cdot x + \begin{bmatrix}b_1 \cdot \mu \\ b_2 \cdot \mu \end{bmatrix} \cdot y = \begin{bmatrix}a_1 \\a_2\end{bmatrix}

which has the same solutions as

\begin{bmatrix}b_1 \\ b_2 \end{bmatrix} \cdot ( x + y \cdot \mu ) = \begin{bmatrix}a_1 \\a_2 \end{bmatrix}

which can be written as

B \cdot ( x + y \cdot \mu ) = A

which has either

  • no solutions if the vectors A and B are linearly independent (not parallel), because no value of ( x + y \cdot \mu ) can turn a multiple of B into A, or
  • infinitely many solutions if the vectors A and B are linearly dependent (parallel), as ( x + y \cdot \mu ) just needs to equal the constant c for which B \cdot c = A, and the pairs (x,y) satisfying x + y \cdot \mu = c form a line in the plane (see the sketch below).
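And a short NumPy sketch of this column-dependent case (values again made up, with the second column \mu = 2 times the first), where everything hinges on whether A is parallel to B:

```python
import numpy as np

mu = 2.0
B = np.array([1.0, 3.0])
M = np.column_stack([B, mu * B])   # columns B and mu * B are linearly dependent

for A in (np.array([2.0, 6.0]),    # parallel to B     -> infinitely many solutions
          np.array([2.0, 5.0])):   # not parallel to B -> no solution
    if np.linalg.matrix_rank(np.column_stack([M, A])) == np.linalg.matrix_rank(M):
        c = A[0] / B[0]            # the constant with B * c = A
        print(A, "-> infinitely many solutions; any (x, y) with x + y * mu =", c)
    else:
        print(A, "-> no solution")
```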